KR20150068823A - Mobile terminal - Google Patents
Mobile terminal
- Publication number
- KR20150068823A (Application KR1020130154931A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- display unit
- mobile terminal
- input
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The present invention relates to a mobile terminal that receives user control commands through a touch sensor provided on a display unit.
Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they are movable. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry the terminal directly.
A mobile terminal is a portable electronic device with one or more functions such as voice and video communication, information input/output, and data storage.
As the functions of mobile terminals have diversified, the mobile terminal has come to be implemented as a multimedia player with complex functions such as capturing still images and video, playing music or video files, receiving broadcasts, and running games.
Various new attempts have been made, in terms of both hardware and software, to implement the complex functions of such multimedia devices. One example is a user interface environment that allows the user to search for and select functions easily and conveniently.
However, the user interface currently implemented in mobile terminals has the limitation that it does not distinguish between the user's touch input methods.
One object of the present invention is to propose a mobile terminal that distinguishes between a user's touch input methods and processes them differently.
Another object of the present invention is to provide a mobile terminal with a user interface that improves user convenience.
According to an aspect of the present invention, there is provided a mobile terminal including a display unit that includes a touch sensor and outputs screen information, and a controller that performs different controls by distinguishing a single touch, dragged while one point is touched, from a multi-touch, dragged in parallel in the same direction while a plurality of points are touched.
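The gesture distinction described in this aspect can be sketched in code. This is an illustrative sketch only, not the patent's implementation: the event representation, helper names, and return values are all invented for the example. A drag is modeled as a list of (point_id, direction) pairs, one per touched point.

```python
# Hypothetical sketch: classify a drag as a single touch (one point
# dragged) or a multi-touch (several points dragged in parallel in the
# same direction), as the claim summary describes.

def classify_drag(touch_points):
    """Return "single", "multi", or None for an unhandled gesture."""
    if len(touch_points) == 1:
        return "single"
    if len(touch_points) > 1:
        directions = {direction for _, direction in touch_points}
        if len(directions) == 1:  # all points move the same way
            return "multi"
    return None  # e.g. points moving in different directions

def perform_control(touch_points):
    """Dispatch to a different (placeholder) control per gesture class."""
    kind = classify_drag(touch_points)
    if kind == "single":
        return "single-touch control"
    if kind == "multi":
        return "multi-touch control"
    return "no control"
```

A two-finger drag where both fingers move up would classify as "multi", while the same two fingers moving apart would not.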
According to an embodiment of the present invention, when the multi-touch is applied to the display unit, the controller outputs to the display unit an image whose size gradually changes along the drag input path, and displays within that image a list of the applications previously executed on the terminal together with the screens previously output by each application.
When a touch input is applied to any one of the previously displayed screens, the controller executes the application corresponding to that screen and outputs, in the executed application, the screen to which the touch input was applied.
According to another embodiment of the present invention, the controller distinguishes the drag direction of each of the single touch and the multi-touch to perform different controls.
According to another embodiment of the present invention, when a single-touch or multi-touch input is applied to the display unit while an application having upper lists distinguished from each other and sub-lists belonging to those upper lists is executed, the controller distinguishes the single touch from the multi-touch, switches between the upper lists on the basis of one of them and outputs the switched upper list to the display unit, and switches between the sub-lists on the basis of the other and outputs the switched sub-list to the display unit.
According to the present invention, it is possible to implement a mobile terminal that provides the user with different input methods and executes different controls based on which method the user employs.
Further, the present invention can implement a mobile terminal that performs different controls by distinguishing the drag directions of the single touch and the multi-touch.
In addition, the present invention can implement a mobile terminal that offers greater convenience to the user by extending the input-related user interface.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.
FIGS. 3A and 3B are conceptual diagrams illustrating operations implemented in a mobile terminal according to the present invention.
FIG. 4 is a flowchart for explaining a control method that performs different controls by distinguishing a single touch from a multi-touch.
FIGS. 5A through 7 are conceptual diagrams illustrating a user interface implemented on the basis of the operation described with reference to FIG. 4.
Hereinafter, a mobile terminal according to the present invention will be described in detail with reference to the drawings. In this specification, the same or similar reference numerals are given to the same or similar configurations across different embodiments. As used herein, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. In addition, the suffixes "module" and "part" used for components in the following description are assigned or used interchangeably for ease of drafting the specification, and do not by themselves carry distinct meanings or roles.
The mobile terminals described in this specification include mobile phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, and ultrabooks. However, those skilled in the art will understand that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to a mobile terminal.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
The
Hereinafter, the components will be described in order.
The
The
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to terminals. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.
The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).
For example, the
The broadcast signal and / or broadcast related information received through the
The
The
The
The short-
The
Referring to FIG. 1, an A / V (Audio / Video)
The image frame processed by the
The
The
The
The
The
The
Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the
There may be two or
Also, the
Here, a stereoscopic image means a three-dimensional stereoscopic image, that is, an image that gives the viewer the same sense of gradual depth and realism for objects on a monitor or screen as in real space. Three-dimensional stereoscopic images are implemented using binocular disparity. Binocular disparity means the parallax caused by the separated positions of the two eyes: when the two eyes see different two-dimensional images and those images are transmitted to the brain through the retinas and fused, the depth and realism of the stereoscopic image can be perceived.
The
Examples of the autostereoscopic method include the parallax barrier method, the lenticular method, the integral imaging method, and the switchable lens method. Projection methods include the reflective holographic method and the transmissive holographic method.
Generally, a 3D stereoscopic image consists of a left image (for the left eye) and a right image (for the right eye). Methods of combining the left and right images into a 3D stereoscopic image include: a top-down method, in which the left and right images are arranged one above the other in a single frame; a left-to-right (side-by-side) method, in which they are arranged side by side in a single frame; a checker board method, in which pieces of the left and right images are arranged in tile form; an interlaced method, in which columns or rows of the left and right images are arranged alternately; and a time-sequential (frame-by-frame) method, in which the left and right images are alternately displayed over time.
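Three of the frame-packing methods named above can be sketched on toy images represented as 2D lists of pixels. This is a generic illustration of the packing geometry, not taken from the document; real implementations operate on display buffers.

```python
# Illustrative sketch of stereoscopic frame packing: combine a left and
# a right image (2D lists of pixel values) into one frame.

def pack_side_by_side(left, right):
    """Left-to-right method: the images sit side by side in one frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def pack_top_down(left, right):
    """Top-down method: the left image is stacked above the right image."""
    return left + right

def pack_interlaced_rows(left, right):
    """Interlaced method: rows of the two images alternate."""
    frame = []
    for l_row, r_row in zip(left, right):
        frame.append(l_row)
        frame.append(r_row)
    return frame
```

The time-sequential method, by contrast, would alternate whole frames over time rather than packing both views into one frame.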
In addition, a 3D thumbnail image can be generated by creating a left-image thumbnail and a right-image thumbnail from the left and right images of the original image frame, respectively, and combining them into a single 3D thumbnail image. In general, a thumbnail means a reduced image or a reduced still image. The left-image and right-image thumbnails generated this way are displayed with a horizontal offset corresponding to the disparity between the left and right images, producing a stereoscopic sense of space.
The left and right images necessary for realizing the three-dimensional stereoscopic image can be displayed on the
On the other hand, when a
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
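As a generic, hypothetical illustration of the kind of processing a touch controller performs (the document does not specify an algorithm), the following sketch scans a grid of per-cell capacitance changes and reports the coordinates of the strongest change as the touched position:

```python
# Hypothetical sketch: convert a grid of capacitance deltas, one value
# per sensor cell, into the coordinates of a touch. A threshold filters
# out noise; returns None when nothing is touched.

def locate_touch(delta_grid, threshold=5):
    """Return (row, col) of the largest capacitance change, or None."""
    best = None
    best_val = threshold  # only changes above the threshold count
    for r, row in enumerate(delta_grid):
        for c, val in enumerate(row):
            if val > best_val:
                best_val = val
                best = (r, c)
    return best
```

The controller would then forward these coordinates so the system knows which area of the display was touched, as the paragraph above describes.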
Referring to FIG. 1, a
Examples of the
Hereinafter, for convenience of explanation, the act of recognizing that a pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the proximity touch is made.
The
In the case where the three-
The
The
The stereoscopic
The
The
The
For example, the
As another example, a photosensor may be stacked on the display element. The photosensor is configured to scan the movement of an object close to the touch screen. More specifically, the photosensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the content placed on it using an electrical signal that varies with the amount of light applied to the photodiodes. That is, the photosensor calculates the coordinates of the sensed object from the change in the amount of light, thereby acquiring its position information.
The
The
The
In addition to the vibration, the
The
The
The
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
In addition, the
In addition, if the state of the mobile terminal meets a set condition, the
The
The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing the functions described herein. In some cases, the embodiments described herein may be implemented by the
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.
The software code may be implemented in a software application written in a suitable programming language. The software code is stored in the
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
The disclosed
The body includes a case (frame, housing, cover, etc.) which forms an appearance. In this embodiment, the case may be divided into a
The cases may be formed by injection-molding synthetic resin, or may be formed of a metal such as stainless steel (STS), aluminum (Al), or titanium (Ti).
A first
The
The
The touch sensing means may be formed of a translucent material so that the visual information output from the
A
The first
The sound generated from the first
The
The
In the figure, the
The contents input by the first and / or
The
The
FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.
Referring to FIG. 2B, a
For example, the
A
And a second
An antenna (not shown) for receiving broadcast signals may be additionally disposed on the side of the terminal body, in addition to the antenna used for calls and the like. The antenna constituting a part of the broadcast receiving module 111 (see FIG. 1) may be installed so that it can be pulled out from the terminal body.
The terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the
The extracted location information of the
Hereinafter, an operation of the mobile terminal, a control method thereof, and a user interface according to the present invention will be described.
FIGS. 3A and 3B are conceptual diagrams illustrating operations implemented in the
The
FIG. 3A shows an operation implemented in the
an
Referring to FIG. 5B, the
FIG. 3B illustrates an operation performed by the
(a), an
(b), the controller divides the single touch and the multi-touch, and outputs a processing result different from that when inputting with a single touch to the
As shown in the figure, the
The
The
Conventionally, the multitasking supported by a mobile terminal provides only a function for switching between applications, and does not provide a history of the screen information output by each application. Therefore, to pause the application in use and call up a screen that was previously displayed in another application, the user had to switch from the current application to the other application and then navigate back to the previously displayed screen through detailed operations within the switched application, which was inconvenient.
However, in the present invention, the
For example, in the case of a music application, the present invention outputs an
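The per-application screen history that the description contrasts with conventional multitasking can be sketched as a simple data structure. This is a hypothetical illustration, not the patent's implementation; the class and method names are invented.

```python
# Sketch of a screen-history record: the controller stores each screen
# an application outputs, so a past screen can be recalled directly
# instead of navigating back to it inside the application.

class ScreenHistory:
    def __init__(self):
        self._history = {}  # app name -> list of screens, oldest first

    def record(self, app, screen):
        """Remember a screen the application has output."""
        self._history.setdefault(app, []).append(screen)

    def screens(self, app):
        """Screens previously output by one application."""
        return list(self._history.get(app, []))

    def recall(self, app, index):
        """Jump straight to a past screen of that application."""
        return ("switched to", app, self._history[app][index])
```

With a music application, for instance, both the album view and the now-playing view could be recorded and recalled without re-navigating the app.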
(c), when an input for dragging the
the
For example, if the user touches the
Accordingly, the user can easily input the single touch and the multi-touch into the application, switch between applications through the
Hereinafter, a control method for distinguishing between a single touch and a multi-touch will be described.
FIG. 4 is a flowchart for explaining a control method that performs different controls by distinguishing between a single touch and a multi-touch.
First, when a touch input is applied to the display unit, the touch sensor senses the touch input (S100). Touch inputs include a short touch, in which the touch sensor is touched only briefly; a long touch, in which a point on the touch sensor is held for a predetermined time; a drag, in which the point of contact moves while the touch is maintained; and so on. The present invention is based on the drag input.
When the touch sensor provided on the display unit senses a touch input, the controller distinguishes between a single touch and a multi-touch depending on whether one point is dragged while touched, or a plurality of points are dragged in the same direction while touched (S200).
Further, the controller not only distinguishes between the single touch and the multi-touch, but also performs different controls according to the drag direction of each. For example, a single touch or multi-touch dragged in the longitudinal direction of the terminal is distinguished from one dragged in the width direction, and different controls are executed.
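The direction test described above can be sketched by reducing a drag to its displacement along the terminal's two axes. This is an illustrative sketch; the axis convention and labels are invented for the example.

```python
# Hypothetical sketch: classify a drag by its dominant axis, where dy is
# displacement along the terminal's long (longitudinal) axis and dx is
# displacement along its width.

def drag_direction(dx, dy):
    """Return which of the four drag directions dominates."""
    if abs(dy) >= abs(dx):
        return "longitudinal-up" if dy > 0 else "longitudinal-down"
    return "width-right" if dx > 0 else "width-left"
```

Combined with the single/multi-touch distinction, this yields the separate gesture classes (e.g. a longitudinal multi-touch drag versus a width-direction multi-touch drag) to which different controls are assigned.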
If the sensed drag is a single-touch input, the control set for the single touch is performed (S310); if it is a multi-touch input, the control set for the multi-touch is executed (S320). Different controls are set for the single touch and the multi-touch.
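The S100–S320 flow can be sketched as a small dispatch. This is a sketch under the assumption that each gesture class maps to one preset control; the control names are hypothetical placeholders, not taken from the patent.

```python
# Sketch of the flowchart steps: sense (S100), distinguish (S200),
# then run the control set for the detected gesture (S310 / S320).

CONTROLS = {
    "single": "control set for the single touch (S310)",
    "multi": "control set for the multi-touch (S320)",
}

def handle_touch(num_points, same_direction=True):
    # S100: the touch sensor has sensed a drag input.
    # S200: distinguish single touch from multi-touch.
    if num_points == 1:
        kind = "single"
    elif num_points > 1 and same_direction:
        kind = "multi"
    else:
        return None  # not a gesture this flow handles
    # S310 / S320: execute the control preset for that gesture.
    return CONTROLS[kind]
```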
According to this flow, the controller distinguishes the single touch from the multi-touch and executes different controls; the different controls executed by the controller are described below.
FIGS. 5A to 7 are conceptual diagrams illustrating a user interface implemented on the basis of the operation described with reference to FIG. 4.
FIGS. 5A and 5B are conceptual diagrams showing a user interface in which a single touch and a multi-touch are distinguished in a music application to perform different controls.
5A, when a multi-touch input is dragged from the lower end to the upper end of the terminal 200, a
(b), a multi-touch input is dragged from the upper end of the terminal 200 toward the lower end, and a
From FIG. 5A, it can be seen that the controller (not shown) distinguishes the drag direction of the multi-touch and performs different controls.
Referring to FIG. 5B, when a single-touch input is dragged from the lower end of the terminal 200 toward the upper end, a
(d2),
From FIG. 5B, it can be seen that the controller distinguishes the drag direction of the single touch and executes different controls. Comparing FIGS. 5A and 5B, it can be seen that the controller distinguishes between the single touch and the multi-touch and performs different controls.
FIG. 6 is a conceptual diagram for explaining a
When a single-touch or multi-touch input is applied to the
For example, in a music application, the upper list may be a music album or music folder, and the sub-list is the music belonging to that album or folder. As another example, in a gallery application, the upper list may be a photo album or photo folder, and the sub-list is the photos or videos belonging to that album or folder. As another example, in a web browser application, the upper list may be a window of the web browser, and the sub-list may be the tabs belonging to that window.
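The upper-list/sub-list model above can be sketched with the music-application example: albums as the upper list and tracks as sub-lists. This is an illustrative sketch; which gesture drives which switch is left open here, since the description assigns one gesture to each level without fixing which.

```python
# Sketch of two-level list navigation: one operation switches between
# upper lists (e.g. albums, driven by one gesture class), the other
# switches within the current sub-list (e.g. tracks, the other class).

class TwoLevelBrowser:
    def __init__(self, upper_lists):
        # upper_lists: {upper-list name: [sub-list items]}
        self.names = list(upper_lists)
        self.items = upper_lists
        self.upper = 0  # index of the current upper list
        self.sub = 0    # index within its sub-list

    def current(self):
        name = self.names[self.upper]
        return name, self.items[name][self.sub]

    def switch_upper(self, step=1):
        """e.g. multi-touch drag: move to another album."""
        self.upper = (self.upper + step) % len(self.names)
        self.sub = 0
        return self.current()

    def switch_sub(self, step=1):
        """e.g. single-touch drag: move between tracks in the album."""
        name = self.names[self.upper]
        self.sub = (self.sub + step) % len(self.items[name])
        return self.current()
```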
FIG. 6 shows the controller distinguishing between a single touch and a multi-touch in a music list and switching between an upper list and a sub-list accordingly.
(a), the controller switches the
(b), the control unit switches the
Accordingly, as described with reference to FIGS. 5A, 5B and 6, the present invention not only distinguishes between a single touch and a multi-touch, but also distinguishes their drag directions to perform different controls, and the result is visually outputted to the
The different controls that may be implemented by the present invention are not limited to those described above. They may be applied wherever it is necessary to provide a user interface that performs different controls through distinct input methods for the screens displayed on the
7 is a conceptual diagram for explaining an example of providing a user interface with improved convenience.
(a), a user interface for switching a frame of a moving picture being reproduced in the
(b), the present invention can be applied to a user interface for switching frames of a moving picture being reproduced by the
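The frame-switching interface described in this passage is truncated in the source, so the following is purely illustrative: a sketch of a video frame scrubber in which two gesture classes step through the frames by different amounts. The step sizes are invented for the example.

```python
# Hypothetical sketch: scrub through a video's frames, with the step
# size chosen by gesture class (step values are illustrative only).

STEP_BY_GESTURE = {"single": 1, "multi": 10}

def scrub(current_frame, total_frames, gesture, direction):
    """Move the playback position by the step assigned to the gesture.

    direction is +1 (forward) or -1 (backward); the result is clamped
    to the valid frame range [0, total_frames - 1].
    """
    step = STEP_BY_GESTURE[gesture] * direction
    return max(0, min(total_frames - 1, current_frame + step))
```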
Although not shown in FIG. 7, the present invention can provide a mobile terminal with enhanced user convenience by assigning a distinct user interface to the single touch and the multi-touch dragged in the longitudinal direction of the terminal 200, and another to the single touch and the multi-touch dragged in the width direction.
The mobile terminal described above is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
Claims (5)
A mobile terminal comprising: a display unit including a touch sensor and outputting screen information; and a controller that, when a drag input is applied to the display unit, performs different controls by distinguishing a single touch, dragged while one point is touched, from a multi-touch, dragged in the same direction while a plurality of points are touched.
Wherein, when the multi-touch is applied to the display unit, the controller outputs to the display unit an image whose size gradually changes along the drag input path, and displays within that image a list of the applications previously executed on the terminal together with the screens previously output by each application.
Wherein, when a touch input is applied to any one of the previously displayed screens, the controller executes the application corresponding to the screen to which the touch input is applied and outputs, in the executed application, the screen to which the touch input was applied.
Wherein the controller distinguishes the drag direction of each of the single touch and the multi-touch to perform different controls.
Wherein, when a single-touch or multi-touch input is applied to the display unit while an application having upper lists distinguished from each other and sub-lists belonging to the upper lists is executed,
the controller distinguishes the single touch from the multi-touch, switches the upper lists on the basis of one of the single touch and the multi-touch and outputs the switched upper list to the display unit, and switches the sub-lists on the basis of the other and outputs the switched sub-list to the display unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130154931A KR20150068823A (en) | 2013-12-12 | 2013-12-12 | Mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150068823A true KR20150068823A (en) | 2015-06-22 |
Family
ID=53516176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130154931A KR20150068823A (en) | 2013-12-12 | 2013-12-12 | Mobile terminal |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150068823A (en) |
- 2013-12-12: Priority application KR1020130154931A filed; published as KR20150068823A (not active — application discontinued)
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017090823A1 (en) * | 2015-11-27 | 2017-06-01 | 엘지전자 주식회사 | Rollable mobile terminal and control method therefor |
KR20170062327A (en) * | 2015-11-27 | 2017-06-07 | 엘지전자 주식회사 | Rollable mobile terminal and control method thereof |
US10424272B2 (en) | 2015-11-27 | 2019-09-24 | Lg Electronics Inc. | Rollable mobile terminal and control method thereof |
WO2017104860A1 (en) * | 2015-12-16 | 2017-06-22 | 엘지전자 주식회사 | Rollable mobile terminal |
US10627931B2 (en) | 2015-12-16 | 2020-04-21 | Lg Electronics Inc. | Rollable mobile terminal |
WO2017111192A1 (en) * | 2015-12-24 | 2017-06-29 | 엘지전자 주식회사 | Rollable mobile terminal and control method therefor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102080746B1 (en) | Mobile terminal and control method thereof | |
KR102127925B1 (en) | Mobile terminal and control method thereof | |
KR101748668B1 (en) | Mobile twrminal and 3d image controlling method thereof | |
KR101917690B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR102130797B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR20140113156A (en) | Mobile terminal and control method thereof | |
KR102080743B1 (en) | Mobile terminal and control method thereof | |
KR102124801B1 (en) | Mobile terminal and control method thereof | |
KR101988262B1 (en) | Mobile terminal and control method thereof | |
KR20140109719A (en) | Mobile terminal and control method thereof | |
KR20180079879A (en) | Mobile terminal and method for controlling the same | |
KR102037928B1 (en) | Mobile terminal | |
KR20150055448A (en) | Mobile terminal and control method thereof | |
KR102105461B1 (en) | Mobile terminal and control method thereof | |
KR20180103866A (en) | Mobile terminal and control method thereof | |
KR20120122314A (en) | Mobile terminal and control method for the same | |
KR20150068823A (en) | Mobile terminal | |
KR20140109718A (en) | Mobile terminal and control method thereof | |
US8941648B2 (en) | Mobile terminal and control method thereof | |
KR102026639B1 (en) | Mobile terminal and control method thereof | |
KR20140085039A (en) | Control apparatus of mobile terminal and method thereof | |
KR20140102569A (en) | Mobile terminal and control method thereof | |
KR20140122559A (en) | Mobile terminal and control method thereof | |
KR101850391B1 (en) | Mobile terminal and control method thereof | |
KR20150071498A (en) | Mobile terminal and method for controlling the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |