KR20170087344A - Mobile terminal and method for controlling the same - Google Patents
Mobile terminal and method for controlling the same
- Publication number
- KR20170087344A
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- mobile terminal
- input
- point
- remote
- Prior art date
Classifications
-
- H04M1/72519—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/42—Graphical user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
The present invention relates to a mobile terminal capable of touch input and a control method thereof. A mobile terminal according to the present invention includes: a display unit receiving a user input for setting a point corresponding to a predetermined remote touch object; and a control unit for executing, based on a predetermined touch input being applied to the remote touch object, the control command that would be executed if the predetermined touch input were applied to the point.
Description
The present invention relates to a mobile terminal capable of touch input and a control method thereof.
A terminal can be divided into a mobile/portable terminal and a stationary terminal depending on whether it is movable. A mobile terminal can further be divided into a handheld terminal and a vehicle-mounted terminal depending on whether the user can carry it directly.
The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide electronic game play or multimedia player functions. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, or television programs.
As such functions diversify, the terminal is implemented in the form of a multimedia device having multiple functions such as capturing photos or videos, playing music or video files, playing games, and receiving broadcasts.
On the other hand, as mobile terminals become closely integrated into everyday life, they are often operated with one hand. In that case, it is difficult to reach frequently pressed points or buttons on the screen with the thumb.
The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a mobile terminal, and a method of controlling the same, in which applying a touch input to a remote touch icon produces the same effect as touching the corresponding point directly.
According to an aspect of the present invention, there is provided a mobile terminal including: a display unit receiving a user input for setting a point corresponding to a predetermined remote touch object; and a controller for executing, based on a predetermined touch input being applied to the remote touch object, the control command that would be executed if the predetermined touch input were applied to the point.
In an embodiment, the control unit may output a setting object for setting a point corresponding to the remote touch object based on a predetermined user input being applied.
In another embodiment, the control unit may set the first point to a point corresponding to the remote touch object, based on a drag input to move the setting object to the first point.
In yet another embodiment, the control unit may output the setting object again based on the predetermined user input being applied and, based on a drag input for moving the setting object to a second point being applied, reset the second point as the point corresponding to the remote touch object.
In another embodiment, the control unit may output the remote touch object as an icon indicating a position of a point corresponding to the remote touch object.
In another embodiment, the control unit may output a plurality of remote touch objects having corresponding points on the display unit.
In another embodiment, based on the predetermined touch input being applied to a first remote touch object, which is one of the plurality of remote touch objects, the control unit executes the control command that would be executed if the predetermined touch input were applied to a first point corresponding to the first remote touch object.
In another embodiment, the control unit may output a predetermined image effect to a point corresponding to the remote touch object, based on a predetermined touch input to the remote touch object.
According to another aspect of the present invention, there is also provided a control method comprising: (a) receiving a user input for setting a point corresponding to a predetermined remote touch object; and (b) executing, based on a predetermined touch input being applied to the remote touch object, a control command to be executed when the predetermined touch input is applied to the point.
In the embodiment, the step (a) may include outputting a setting object for setting a point corresponding to the remote touch object based on a predetermined user input being applied.
In another embodiment, the step (a) may include setting the first point as the point corresponding to the remote touch object, based on a drag input for moving the setting object to the first point.
In yet another embodiment, the step (b) may further include outputting the setting object again based on the predetermined user input being applied and, based on a drag input for moving the setting object to a second point, resetting the second point as the point corresponding to the remote touch object.
In another embodiment, the step (b) may include outputting the remote touch object as an icon indicating a position of a point corresponding to the remote touch object.
In yet another embodiment, the step (b) may include outputting, on the display unit, a plurality of remote touch objects each having a corresponding point set.
In yet another embodiment, the step (b) may further include executing, based on the predetermined touch input being applied to a first remote touch object which is one of the plurality of remote touch objects, the control command that would be executed if the predetermined touch input were applied to a first point corresponding to the first remote touch object.
In another embodiment, the step (b) may include outputting a preset image effect to the point corresponding to the remote touch object, based on a predetermined touch input being applied to the remote touch object.
Effects of the mobile terminal and the control method according to the present invention will be described as follows.
According to at least one embodiment of the present invention, there is an advantage that a user can easily touch a position difficult to press with the thumb while holding the mobile phone with one hand.
In addition, according to at least one of the embodiments of the present invention, there is an advantage that a user can easily press a button that is frequently pressed.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 2 is a flowchart illustrating a method of controlling a mobile terminal according to the present invention.
FIG. 3 is a conceptual diagram for explaining an embodiment of disposing a remote touch object.
FIG. 4 is a conceptual diagram for explaining an embodiment of setting a point corresponding to a remote touch object.
FIG. 5 is a conceptual diagram for explaining an embodiment of changing the point set to correspond to a remote touch object.
FIGS. 6 to 8 are conceptual diagrams for explaining embodiments of a remote touch object.
FIG. 9 is a conceptual diagram for explaining an embodiment in which a plurality of remote touch objects are disposed.
FIG. 10 is a conceptual diagram for explaining an embodiment in which, when a touch input is applied to a remote touch object, an image effect is output to the corresponding point.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like or similar components are denoted by the same reference numerals and redundant explanations thereof are omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably for ease of drafting the specification and do not themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art is omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements present.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" or "having" are intended to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, a digital signage, and the like.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention, viewed from different directions.
The
The
The
The
The
The
The
In addition, the
In addition to the operations related to the application program, the
In addition, the
The
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or control of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the various components of the
First, referring to the
The
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
The
The short-
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
On the other hand, for convenience of explanation, the act of an object being recognized as located on the touch screen while in proximity to, but not in contact with, the touch screen is referred to as a "proximity touch," and the act of an object actually contacting the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object vertically faces the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods such as a resistive film type, a capacitive type, an infrared type, and an ultrasonic type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller.
On the other hand, the
On the other hand, the touch sensors and proximity sensors discussed above can be used independently or in combination to sense various types of touches, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
Also, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The
The
In addition to vibration, the
The
The
The signal output from the
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
The
Meanwhile, as described above, the
In addition, the
The
In addition, the
As another example, the
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed
Here, the terminal body can be understood as a concept of referring to the
The
A
In some cases, electronic components may also be mounted on the
As shown, when the
These
The
Meanwhile, the
The
1B and 1C, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the
The
The
In addition, the
The
The touch sensor may be a film having a touch pattern and disposed between the
In this way, the
The first
The
The
The
The first and
In this figure, the
The contents input by the first and
On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap with the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. When the
Meanwhile, the
The
The
And a
The
The
And a second
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the
The
The
The
FIG. 2 is a flowchart illustrating a method of controlling a mobile terminal according to the present invention.
Referring to FIG. 2, a step S210 of receiving a user input for setting a point corresponding to a predetermined remote touch object is performed.
As an embodiment, the remote touch object may be implemented as a soft key or a physical key for executing a remote touch command. The user can set a point on the display unit to correspond to the remote touch object.
In step S220, based on the predetermined touch input being applied to the remote touch object, the control command that would be executed if the predetermined touch input were applied to the point is executed.
As an embodiment, if a touch input is applied to the remote touch icon, the same effect occurs as if the touch input were applied to the first point set to correspond to it.
In a specific embodiment, when an icon of an application is output at the first point, applying the touch input to the remote touch icon has the same effect as applying the touch input at the first point; that is, the application can be executed.
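The redirection just described can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `RemoteTouch` and `dispatch_tap` are assumptions introduced for the sketch.

```python
# Minimal sketch of the remote-touch redirection: a tap applied to the
# remote touch object is replayed at the point it was set to correspond to.
# RemoteTouch and dispatch_tap are illustrative names, not from the patent.

class RemoteTouch:
    """A remote touch object bound to one point on the display."""

    def __init__(self, point):
        self.point = point  # (x, y) coordinate the object stands in for

    def on_tap(self, dispatch_tap):
        # Replaying the tap at the stored point gives the same effect
        # as touching that point directly (e.g. launching the app there).
        return dispatch_tap(self.point)


def dispatch_tap(point):
    # Stand-in for the system's touch dispatch at a screen coordinate.
    return f"tap dispatched at {point}"


remote = RemoteTouch(point=(40, 80))
result = remote.on_tap(dispatch_tap)
```

In use, `dispatch_tap` would be whatever the platform provides for injecting a touch event at a coordinate; here it merely records where the tap landed.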
Hereinafter, specific embodiments will be described.
In an embodiment, the step S210 may include outputting a setting object for setting a point corresponding to the remote touch object based on the predetermined user input being applied.
In still another embodiment, the step S210 may include setting the first point as the point corresponding to the remote touch object, based on a drag input for moving the setting object to the first point.
In another embodiment, the step S220 may further include outputting the setting object again based on the predetermined user input being applied and, based on a drag input for moving the setting object to a second point, resetting the second point as the point corresponding to the remote touch object.
In another embodiment, the step S220 may include outputting the remote touch object as an icon indicating a position of a point corresponding to the remote touch object.
In still another embodiment, the step S220 may include outputting, on the display unit, a plurality of remote touch objects each having a corresponding point set.
In yet another embodiment, the step S220 may include executing, based on the predetermined touch input being applied to a first remote touch object which is one of the plurality of remote touch objects, the control command that would be executed if the predetermined touch input were applied to a first point corresponding to the first remote touch object.
In another embodiment, the step S220 may include outputting a predetermined image effect to the point corresponding to the remote touch object, based on a predetermined touch input being applied to the remote touch object.
Hereinafter, specific embodiments will be described in terms of components.
In an embodiment, the
In another embodiment, the
According to another embodiment, the
In another embodiment, the
In another embodiment, the
According to another embodiment, the
In another embodiment, the
FIG. 3 is a conceptual diagram for explaining an embodiment of disposing a remote touch object.
Referring to FIG. 3, the
The
As an embodiment, a
Accordingly, the
When the setting is completed as described above, the
According to this embodiment, the position of the
As another embodiment, a drag input from the
In another embodiment, a
On the other hand, the
Also, the
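The placement step of FIG. 3 can be sketched as below. The clamping of the icon to the screen bounds is an assumption added for illustration; the class name and screen size are likewise hypothetical.

```python
# Sketch of positioning the remote touch icon by drag (cf. FIG. 3).
# Clamping the icon to stay on-screen is an illustrative assumption.

def clamp(value, low, high):
    return max(low, min(value, high))


class RemoteTouchIcon:
    def __init__(self, position, screen=(360, 640)):
        self.screen = screen      # (width, height) in pixels, assumed
        self.position = position  # current (x, y) of the icon

    def drag_to(self, x, y):
        # Follow the user's drag, but keep the icon inside the screen.
        w, h = self.screen
        self.position = (clamp(x, 0, w), clamp(y, 0, h))


icon = RemoteTouchIcon(position=(180, 320))
icon.drag_to(500, -20)  # drag past the right and top edges
```

After the drag past the edges, the icon settles at the nearest on-screen position rather than leaving the display.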
FIG. 4 is a conceptual diagram for explaining an embodiment of setting a point corresponding to a remote touch object.
Referring to FIG. 4, when the
As an example, an opaque
In another embodiment, the
Accordingly, a
The
When the touch input is applied to the
As an embodiment, the same control command as the touch input is applied to the menu icon being output to the
As another embodiment, when a touch input is applied to the
On the other hand, the
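The point-setting flow of FIG. 4 — a setting object appears, is dragged to the desired location, and releasing it fixes that location as the corresponding point — can be sketched as follows. All names here are illustrative assumptions.

```python
# Sketch of the point-setting flow (cf. FIG. 4): a semi-transparent
# setting object is dragged to the first point; releasing the drag
# fixes that point as the one the remote touch object corresponds to.

class PointSetter:
    def __init__(self, start):
        self.setting_pos = start  # where the setting object appears
        self.point = None         # corresponding point, not yet set

    def drag_to(self, pos):
        # The setting object follows the user's finger.
        self.setting_pos = pos

    def release(self):
        # Releasing the drag sets the current position as the
        # corresponding point of the remote touch object.
        self.point = self.setting_pos
        return self.point


setter = PointSetter(start=(180, 320))
setter.drag_to((60, 90))
corresponding_point = setter.release()
```

Re-running the same flow later (outputting the setting object again and dragging it to a second point) would simply overwrite `point`, matching the reset behavior described for FIG. 5.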
FIG. 5 is a conceptual diagram for explaining an embodiment of changing the point set to correspond to a remote touch object.
Referring to FIG. 5, after the
Then, when a touch input is applied to the
Accordingly, a
After the change is completed, when the touch input is applied to the
As an embodiment, the same control command as the touch input is applied to the menu edit icon being output to the
In another embodiment, when a touch input is applied to the
Meanwhile, the
FIGS. 6 to 8 are conceptual diagrams for explaining embodiments of a remote touch object.
6 to 8, the remote touch object may be output as
Specifically, the
6, when the
7, when the
8, when the
Meanwhile, the
In addition, the
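One way an icon could indicate where its corresponding point lies, as in FIGS. 6 to 8, is a coarse directional glyph. The 4-way choice and the y-down screen coordinate convention are assumptions for this sketch, not details from the patent.

```python
# Sketch of choosing an icon that indicates the position of the
# corresponding point relative to the remote touch object (cf. FIGS. 6-8).
# Assumes screen coordinates with y increasing downward.

def arrow_for(icon_pos, point):
    dx = point[0] - icon_pos[0]
    dy = point[1] - icon_pos[1]
    # Pick the dominant axis, then the sign along it.
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"


# The corresponding point lies far above the icon near the bottom of
# the screen, so the icon should indicate "up".
direction = arrow_for(icon_pos=(180, 600), point=(200, 100))
```

A finer-grained variant could return an angle instead of four directions; the coarse version is enough to show the idea of the icon hinting at the point's location.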
FIG. 9 is a conceptual diagram for explaining an embodiment in which a plurality of remote touch objects are disposed.
Referring to FIG. 9, a first
The first
Accordingly, when a touch input is applied to the first
As another embodiment, a point indicating an approximate position of the
On the other hand, the
FIG. 10 is a conceptual diagram for explaining an embodiment in which, when a touch input is applied to a remote touch object, an image effect is output to the corresponding point.
Referring to FIG. 10, when a touch input is applied to the
As an example, a blinking image effect or a
In this way, the
As another embodiment, when a touch input is applied to the
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, and a floppy disk, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit of the terminal.
Claims (16)
A mobile terminal comprising: a display unit receiving a user input for setting a point corresponding to a predetermined remote touch object; and a control unit for executing, based on a predetermined touch input being applied to the remote touch object, the control command that would be executed if the predetermined touch input were applied to the point.
Wherein,
And outputs a setting object for setting a point corresponding to the remote touch object based on the predetermined user input being applied.
Wherein,
And sets the first point to a point corresponding to the remote touch object based on whether a drag input for moving the setting object to the first point is applied.
Wherein,
Outputs the setting object again based on the predetermined user input being applied and, based on a drag input for moving the setting object to a second point being applied, resets the second point as the point corresponding to the remote touch object.
Wherein,
And outputs the remote touch object as an icon indicating a position of a point corresponding to the remote touch object.
Wherein,
And outputs a plurality of remote touch objects on which the corresponding points are set on the display unit.
Wherein,
Executes, based on a predetermined touch input being applied to a first remote touch object which is one of the plurality of remote touch objects, the control command that would be executed if the predetermined touch input were applied to a first point corresponding to the first remote touch object.
Wherein,
Wherein the predetermined image effect is outputted to a point corresponding to the remote touch object based on that the predetermined touch input is applied to the remote touch object.
A method of controlling a mobile terminal, the method comprising: (a) receiving a user input for setting a point corresponding to a predetermined remote touch object; and (b) executing, based on a predetermined touch input being applied to the remote touch object, a control command to be executed when the predetermined touch input is applied to the point.
The step (a)
And outputting a setting object for setting a point corresponding to the remote touch object based on that a predetermined user input is applied.
The step (a)
And setting the first point to a point corresponding to the remote touch object based on a drag input for moving the setting object to the first point.
The step (b)
Outputting the setting object again based on the predetermined user input being applied and, based on a drag input for moving the setting object to a second point, resetting the second point as the point corresponding to the remote touch object.
The step (b)
And outputting the remote touch object as an icon indicating a position of a point corresponding to the remote touch object.
The step (b)
And outputting a plurality of remote touch objects on which a corresponding point is set on the display unit.
The step (b)
Executing, based on a predetermined touch input being applied to a first remote touch object which is one of the plurality of remote touch objects, the control command that would be executed if the predetermined touch input were applied to a first point corresponding to the first remote touch object.
The step (b)
And outputting a preset image effect to a point corresponding to the remote touch object based on the predetermined touch input being applied to the remote touch object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160007217A KR20170087344A (en) | 2016-01-20 | 2016-01-20 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160007217A KR20170087344A (en) | 2016-01-20 | 2016-01-20 | Mobile terminal and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170087344A true KR20170087344A (en) | 2017-07-28 |
Family
ID=59422385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160007217A KR20170087344A (en) | 2016-01-20 | 2016-01-20 | Mobile terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170087344A (en) |
- 2016-01-20: application KR1020160007217A filed; published as KR20170087344A (en), status unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20170024846A (en) | Mobile terminal and method for controlling the same | |
KR20150142359A (en) | Mobile terminal and method for controlling the same | |
KR20160019145A (en) | Mobile terminal and method for controlling the same | |
KR101510704B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR20150084133A (en) | Mobile terminal and method for controlling the same | |
KR20170021514A (en) | Display apparatus and controlling method thereof | |
KR20170052190A (en) | Terminal device and controlling method thereof | |
KR20160012781A (en) | Mobile terminal and method for controlling the same | |
KR20170019071A (en) | Mobile terminal and method for controlling the same | |
KR20170011240A (en) | Mobile terminal and method for controlling the same | |
KR101830661B1 (en) | Mobile terminal | |
KR20160031336A (en) | Mobile terminal and method for controlling the same | |
KR20150094243A (en) | Mobile terminal and method for controlling the same | |
KR101591329B1 (en) | Mobile terminal and method for controlling the same | |
KR20160077907A (en) | Mobile terminal and method for controlling the same | |
KR101698099B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR101613960B1 (en) | Watch type mobile terminal and control method for the mobile terminal | |
KR20160015720A (en) | Mobile terminal and method for controlling the same | |
KR101641565B1 (en) | Mobile terminal and method for controlling the same | |
KR20170087344A (en) | Mobile terminal and method for controlling the same | |
KR20170068033A (en) | Mobile terminal and method for controlling the same | |
KR20170064765A (en) | Mobile terminal and method for controlling the same | |
KR20180032402A (en) | Mobile terminal | |
KR20170097879A (en) | Mobile terminal and method for controlling the same | |
KR20160013789A (en) | Mobile terminal and control method for the mobile terminal |