KR20130019943A - Apparatus and method for processing touch input - Google Patents

Info

Publication number
KR20130019943A
Authority
KR
South Korea
Prior art keywords
touch
function
flicking
input
touch input
Prior art date
Application number
KR1020110082253A
Other languages
Korean (ko)
Inventor
Kang Ki-dong (강기동)
Original Assignee
Hyundai Motor Company
Kia Motors Corporation
Priority date
Filing date
Publication date
Application filed by Hyundai Motor Company and Kia Motors Corporation
Priority to KR1020110082253A
Publication of KR20130019943A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412: Digitisers structurally integrated in a display
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a touch input processing apparatus and method, and aims to provide an apparatus and method that distinguish and recognize a single touch input and a multi-touch input and perform different functions according to the result.
To this end, the present invention provides a touch input processing apparatus comprising: touch sensing means for sensing a single flicking touch and a multi-flicking touch on a touch screen; control means for controlling function processing means to perform a first function corresponding to the single flicking touch when the touch sensing means senses a single flicking touch, and to perform a second function corresponding to the multi-flicking touch when the touch sensing means senses a multi-flicking touch; and the function processing means for performing the first function or the second function under the control of the control means.

Description

Touch input processing device and its method {APPARATUS AND METHOD FOR PROCESSING TOUCH INPUT}

The present invention relates to a touch input processing apparatus and a method thereof, and more particularly, to a touch input processing apparatus and method that distinguish and recognize a single touch input and a multi-touch input and perform different functions according to the result.

In general, a touch screen is an input device that forms an interface between an information communication device employing any of various displays and a user, enabling direct interaction when the user touches the screen. Because it can be used easily by anyone of any age simply by touching it with an input tool such as a finger or a touch pen, the touch screen is used in various devices such as ATMs (Automated Teller Machines), PDAs (Personal Digital Assistants), and mobile phones, and is widely used in many fields such as banking, government offices, tourism, and transportation.

However, a conventional touch input processing apparatus that processes a user's input through such a touch screen can process a single touch input but cannot process a multi-touch input.

That is, the conventional touch input processing apparatus has a problem in that, when the user drags with one finger, the screen can be moved only one page at a time, so the user cannot move several pages at once.

To solve the above problems of the prior art, an object of the present invention is to provide a touch input processing apparatus and method that distinguish and recognize a single touch input and a multi-touch input and perform different functions according to the result.

The objects of the present invention are not limited to the above-mentioned objects, and other objects and advantages of the present invention which are not mentioned can be understood by the following description, and will be more clearly understood by the embodiments of the present invention. It will also be readily apparent that the objects and advantages of the invention may be realized and attained by means of the instrumentalities and combinations particularly pointed out in the appended claims.

In accordance with an aspect of the present invention, there is provided a touch input processing apparatus comprising: touch sensing means for sensing a single flicking touch and a multi-flicking touch on a touch screen; control means for controlling function processing means to perform a first function corresponding to the single flicking touch when the touch sensing means senses a single flicking touch, and to perform a second function corresponding to the multi-flicking touch when the touch sensing means senses a multi-flicking touch; and the function processing means for performing the first function or the second function under the control of the control means.

In accordance with another aspect of the present invention, there is provided a touch input processing method comprising: detecting a single flicking touch on a touch screen; performing a first function corresponding to the sensed single flicking touch; detecting a multi-flicking touch on the touch screen; and performing a second function corresponding to the sensed multi-flicking touch.

As described above, the present invention distinguishes and recognizes a single touch input and a multi touch input, and performs different functions according to the result, thereby improving user convenience.

FIG. 1 is a block diagram of an embodiment of a mobile terminal to which the present invention is applied;
FIG. 2 is a front view of an embodiment of a mobile terminal to which the present invention is applied;
FIG. 3 is a front view of an embodiment of a vehicle-mounted terminal to which the present invention is applied;
FIG. 4 is an example of a play list composed of a plurality of pages in a music playback application;
FIG. 5 is an example of a page being scrolled according to a single touch input in the touch input processing apparatus according to the present invention;
FIG. 6 is an example of a page being scrolled according to a multi-touch input in the touch input processing apparatus according to the present invention;
FIG. 7 is an example of a method of setting a bookmark page that can be moved to directly through multi-touch in the touch input processing apparatus according to the present invention;
FIG. 8 is an example in which different functions are performed according to different touch inputs in the touch input processing apparatus according to the present invention;
FIG. 9 is an example of a touch pattern learning and recognition correction process in the touch input processing apparatus according to the present invention; and
FIG. 10 is a flowchart illustrating a touch input processing method according to the present invention.

The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, so that those skilled in the art may easily practice the invention. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.

The present invention can be applied not only to mobile terminals such as mobile phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), and tablet computers, but also to fixed/mounted devices such as an in-vehicle AVN terminal.

FIG. 1 is a block diagram of an embodiment of a mobile terminal to which the present invention is applied.

As shown in FIG. 1, the mobile terminal 100 to which the present invention is applied may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

First, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal such as traffic information (for example, TPEG information), and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal; a representative example is the Global Positioning System (GPS) module. Using the location information received through the position information module 115, the current location may be displayed on a map through the display unit 151 described below, or driving direction, traveling speed, and route guidance may be provided.

Next, the A/V input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a vehicle black-box recording mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the usage environment, implementing a multi-channel black-box function that simultaneously captures images in two or more directions.

The microphone 122 receives an external sound signal in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. In this way, a destination or departure point for a route search may be input by voice. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving an external sound signal.

Next, the user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

Next, the output unit 150 is used to generate an output related to visual, auditory or tactile senses, and may include a display unit 151 and a sound output module 152.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a navigation mode, a user interface (UI) or graphical user interface (GUI) related to driving, such as a map, speed, direction, and distance indications for the current location, destination, and route, is displayed. When the mobile terminal 100 is in a black-box mode or a photographing mode, the captured image, a UI, or a GUI is displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, or a dual display on which different images are displayed depending on the viewing direction (e.g., a map visible from the driver's seat while a broadcast screen is visible from the front passenger's seat).

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 was touched.

Hereinafter, for convenience of explanation, the act of positioning a pointer over the touch screen so that it is recognized without contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen is the position at which the pointer vertically corresponds to the touch screen.

In addition, the touch screen may detect a touch signal at two or more points at the same time. A touch applied to two or more touch points at the same time will be referred to as "multi-touch".
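The single-touch/multi-touch distinction above can be sketched as a simple classifier over the touch points reported at the same instant. This is a hypothetical illustration only; the patent does not prescribe an implementation, and the function name and representation of touch points are assumptions:

```python
def classify_touch(touch_points):
    """Classify a set of simultaneous touch points.

    touch_points: list of (x, y) coordinates reported at the same instant.
    Returns "single" for exactly one touch point, "multi" for two or more
    (the "multi-touch" of the description above), and None for no touch.
    Hypothetical sketch; not part of the claimed embodiment.
    """
    if not touch_points:
        return None
    return "single" if len(touch_points) == 1 else "multi"
```

For example, `classify_touch([(10, 20)])` yields `"single"`, while two simultaneous points yield `"multi"`.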

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a multimedia file playing mode, a broadcast receiving mode, or the like. The sound output module 152 may also output a sound signal related to a function (eg, a warning sound, a notification sound, a route guidance voice, etc.) performed by the mobile terminal 100. The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The memory 160 may store programs for the processing and control of the controller 180 and may temporarily store input/output data (for example, a phone book, map information, audio, still images, and video). The memory 160 may also store the frequency of use of each piece of data (for example, frequently used destinations and the frequency of use of each multimedia file). In addition, the memory 160 may store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits internal data to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may be a path through which power from the cradle is supplied to the mobile terminal 100, or a path through which various command signals input by the user from the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal; for example, it performs related control and processing for data communication, route search, black-box recording, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. The touch input processing device according to the present invention is applied to the controller 180.

The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each software module may perform one or more of the functions and operations described herein. Software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

Hereinafter, a touch input processing apparatus according to the present invention applied to the controller 180 will be described in detail.

The touch input processing device according to the present invention includes a touch detector, a controller, and a function processor.

The touch detector detects a user's touches of a first type (single) and a second type (multi) on the touch screen.

If a touch input of the first type is detected through the touch detector while a predetermined application is running, the controller causes the application to perform a first function corresponding to the first type; if a touch input of the second type is detected through the touch detector, the controller causes the application to perform a second function corresponding to the second type. Here, a touch input of the first type refers to an input in which one touch point occurs at a time, and a touch input of the second type refers to an input in which two or more touch points occur at the same time.
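The controller/function-processor split described above can be sketched as a small dispatcher. The class and parameter names below are illustrative assumptions, and `first_fn`/`second_fn` stand in for whatever application-defined first and second functions are configured; the patent does not specify this structure:

```python
class TouchInputProcessor:
    """Minimal sketch of the controller dispatching to the function processor.

    first_fn and second_fn are placeholders for the application-defined
    first and second functions (e.g., different scrolling behaviors).
    Hypothetical illustration; not the claimed implementation.
    """

    def __init__(self, first_fn, second_fn):
        self.first_fn = first_fn
        self.second_fn = second_fn

    def on_flick(self, touch_point_count, direction):
        # One simultaneous touch point -> first-type input -> first function;
        # two or more -> second-type input -> second function.
        if touch_point_count <= 0:
            return None
        if touch_point_count == 1:
            return self.first_fn(direction)
        return self.second_fn(direction)
```

Usage might look like `TouchInputProcessor(scroll_one_page, scroll_many_pages).on_flick(2, "left")`, which would invoke the second function.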

Here, when the application includes a plurality of menu pages and at least one of the plurality of menu pages is displayed on the touch screen, the controller displays a page scrolled from the displayed page by a first unit page when a touch input of the first type is detected, and displays a page scrolled by a second unit page, different from the first unit page, when a touch input of the second type is detected.
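The unit-page scrolling just described can be sketched as follows. The function name and the particular second unit of 3 pages are assumptions for illustration; the patent only requires that the two units differ:

```python
def scroll_page(current_page, total_pages, direction, touch_points,
                first_unit=1, second_unit=3):
    """Return the page index displayed after a flick.

    A single-touch flick scrolls by first_unit pages; a multi-touch flick
    scrolls by second_unit pages (second_unit=3 is an arbitrary example).
    direction is +1 (forward) or -1 (backward). The result is clamped to
    the valid range [0, total_pages - 1]. Hypothetical sketch only.
    """
    unit = first_unit if touch_points == 1 else second_unit
    target = current_page + direction * unit
    return max(0, min(total_pages - 1, target))
```

For example, from page 0 of a 10-page list, a single-touch flick forward lands on page 1, while a two-finger flick lands on page 3.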

In addition, when the application includes a plurality of menu pages arranged in a predetermined form, at least one of which is assigned a predetermined attribute, and at least one of the menu pages is displayed on the touch screen, the controller scrolls by the first unit page from the displayed page when a touch input of the first type is detected; when a touch input of the second type is detected, the controller displays, among the pages to which the predetermined attribute is assigned, the page closest to the displayed page in the direction corresponding to the direction of the second-type touch input.
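Locating the closest attributed page in the flick direction (e.g., a bookmark page, as in FIG. 7) can be sketched as below. The function name and the use of page indices are illustrative assumptions:

```python
def nearest_bookmarked_page(current, bookmarks, direction):
    """Find the closest page carrying the predetermined attribute.

    bookmarks: iterable of page indices assigned the attribute (e.g., set
    as bookmark pages). direction: +1 searches forward, -1 backward.
    Returns None when no attributed page lies in that direction.
    Hypothetical sketch of the behavior described above.
    """
    if direction > 0:
        candidates = [p for p in bookmarks if p > current]
        return min(candidates) if candidates else None
    candidates = [p for p in bookmarks if p < current]
    return max(candidates) if candidates else None
```

For example, from page 2 with pages 0, 5, and 8 bookmarked, a forward multi-flick would jump directly to page 5.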

FIG. 2 is a front view of an embodiment of a mobile terminal to which the present invention is applied.

As shown in FIG. 2, the mobile terminal 100 to which the present invention is applied has a terminal body having a bar shape. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (casing, housing, cover, and the like) forming its appearance. The case may be formed by injection-molding synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The terminal body may include a display unit 151, an audio output unit 152, a camera 121, a user input unit 130, a microphone 122, an interface 170, and the like.

The display unit 151 occupies most of the main surface of the front case. The audio output unit 152 and the camera 121 may be disposed in a region adjacent to one end of the display unit 151, and the user input unit 131 and the microphone 122 may be disposed in a region adjacent to the other end. The user input unit 130 and the interface 170 may be disposed on the side surfaces of the front case and/or the rear case.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100 and may include a plurality of manipulation units. The manipulation units may also be referred to as manipulating portions, and any tactile manner in which the user operates them while having a tactile feel may be employed.

The contents input by the manipulation units can be set in various ways. For example, the first manipulation unit may receive commands such as start, end, and scroll, and the second manipulation unit may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 into a touch recognition mode.

Various types of visual information can be displayed on the display unit 151. This information can be displayed in the form of letters, numbers, symbols, graphics, or icons.

For inputting such information, at least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement and thereby implemented as a keypad. Such a keypad may be referred to as a so-called "virtual keypad".

FIG. 2 illustrates receiving a touch applied to the virtual keypad through the front of the terminal body. The display unit 151 may operate as a single whole area or may be divided into a plurality of areas. In the latter case, the plurality of areas can be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. The output window 151a and the input window 151b are areas allocated for outputting and inputting information, respectively. The virtual keypad 151c, on which numbers for inputting an address and the like are displayed, is output on the input window 151b. When the virtual keypad 151c is touched, numbers corresponding to the touched keys may be displayed in one area of the output window 151a. When the first manipulation unit 131 is manipulated, an operation such as interrupting a currently running function or powering off may be performed.

In addition to the input methods disclosed in the above embodiments, the display unit 151 or the touch pad may be configured to receive a touch input by scrolling. By scrolling the display unit 151 or the touch pad, the user may move a cursor or a pointer located on an object displayed on the display unit 151, for example, an icon. Further, when a finger is moved on the display unit 151 or the touch pad, the path along which the finger moves may be visually displayed on the display unit 151. This is useful for editing an image displayed on the display unit 151.

On the other hand, an arrow or finger graphic for pointing at a specific object or selecting a menu on the display unit 151 is commonly called a pointer or a cursor. However, the term pointer also often refers to a finger or a stylus pen used for a touch operation. Therefore, to clearly distinguish the two, this specification refers to the graphic displayed on the display unit 151 as a cursor, and refers to a physical means capable of performing a touch, a proximity touch, or a gesture, such as a finger or a stylus pen, as a pointer.

FIG. 3 is a front view of an embodiment of a vehicle-mounted terminal to which the present invention is applied.

As shown in FIG. 3, the vehicle-mounted terminal to which the present invention is applied includes a fixed vehicle-mounted terminal that is embedded in a vehicle to provide a navigation screen and a multimedia playback screen, as well as a vehicle-mounted terminal that is separately attached to the vehicle to provide the same.

Such a vehicle-mounted terminal also includes a touch screen to provide various interfaces according to a touch method, and provides all functions of the mobile terminal.

Hereinafter, the touch input processing apparatus according to the present invention provides a touch user interface in which different functions are performed through different touch input patterns. More specifically, while a specific function is being performed, the touch input processing apparatus recognizes different commands depending on whether a single touch input or a multi-touch input is detected through the touch screen, and performs the operation corresponding to the recognized touch input.

As an example of operations according to different touch inputs, a form in which the touch user interface according to the present embodiment is applied to scrolling of a menu composed of a plurality of pages will be described with reference to FIGS. 4 to 6.

FIG. 4 shows an example of a play list structure composed of a plurality of pages in a music reproduction application.

Referring to FIG. 4, the play list 300 includes 13 pages 301 to 313 in total, and the pages are disposed adjacent to each other in the horizontal direction. For example, page 1 (301) is located on the left side of page 2 (302), and page 3 (303) is located on the right side. Here, it is assumed that bookmarks are set on pages 1 and 12, and that one page corresponds to the full screen size of the touch screen 151.

In the menu configuration as illustrated in FIG. 4, page switching according to a general touch input will be described with reference to FIG. 5.

FIG. 5 illustrates an example in which a page is scrolled according to a single touch input in the touch input processing apparatus according to the present invention.

First, when page 2 (302) is displayed on the touch screen as shown in (a) of FIG. 5, the user may, in order to change the currently displayed page to the next page (i.e., page 3), select the page move button 420 located at the top of the page as a touch input through the pointer 430. Accordingly, page 3 (303) is displayed on the touch screen as shown in (c) of FIG. 5.

Instead of touching the move button, as shown in (b) of FIG. 5, the user may bring the pointer 430 having one touch point (here, the user's finger) into contact with a point on the page displayed on the touch screen, drag it to the left, and then release the contact, whereby page 3 (303) is displayed as shown in (c) of FIG. 5.

Hereinafter, for convenience, a touch input for releasing the contact state after moving in one direction while the pointer is in contact with the touch screen is referred to as a “flicking touch”.
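The flicking touch defined above can be sketched as a small classifier over the touch-down and touch-up coordinates of one pointer. This is an illustrative sketch only, not from the patent text; the function name and the MIN_DISTANCE threshold are assumptions.

```python
# Minimal sketch of classifying a "flicking touch": the pointer touches
# down, moves in one dominant direction, and releases. The threshold and
# names are illustrative assumptions, not part of the patent.

MIN_DISTANCE = 30  # minimum travel in pixels to count as a flick, not a tap

def classify_flick(down, up):
    """Return 'left', 'right', 'up', 'down', or None given the
    touch-down and touch-up coordinates (x, y) of one pointer."""
    dx = up[0] - down[0]
    dy = up[1] - down[1]
    if max(abs(dx), abs(dy)) < MIN_DISTANCE:
        return None  # movement too short: treat as a tap
    if abs(dx) >= abs(dy):  # horizontal component dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # screen y grows downward

print(classify_flick((200, 100), (80, 110)))   # long leftward drag -> left
print(classify_flick((100, 100), (105, 102)))  # tiny movement -> None
```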

Of course, when moving to the previous page, the user may select the previous page move button 410 or apply a flicking touch input in the right direction.

Since the page moving method described above with reference to FIG. 5 scrolls by one page per touch input, moving to a specific page, for example a bookmarked page, inconveniently requires as many touch inputs as there are pages between the current page and the bookmarked page. To solve this problem, in the present embodiment, when a flicking input through multi-touch is applied in an arbitrary direction, the nearest bookmarked page in that direction may be immediately displayed. This will be described with reference to FIG. 6.

FIG. 6 illustrates an example in which a page is scrolled according to a multi-touch input in the touch input processing apparatus according to the present invention.

In FIG. 6, it is assumed that a menu configuration as shown in FIG. 4 is applied.

First, in a state in which page 2 (302) is displayed as shown in (a) of FIG. 6, the user may apply a flicking touch input in the left or downward direction as a multi-touch using two or more pointers (here, two fingers 510 and 520).

Accordingly, as shown in (b) of FIG. 6, page 12 (312), which is the bookmarked page located on the right side of page 2 (302), is displayed. At this time, when a flicking touch input is applied again in the right or upward direction by multi-touch as shown in (c) of FIG. 6, the bookmarked page located on the left side of page 12 (312), that is, page 1 (301), may be displayed.
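The bookmark jump just described can be sketched as a search for the nearest bookmarked page in the flick direction, wrapping around the 13-page list of FIG. 4. All names are assumptions for illustration.

```python
# Illustrative sketch (not the patent's implementation): a multi-touch
# flick displays the nearest bookmarked page in the flick direction.
# A leftward flick moves toward later pages, a rightward flick toward
# earlier pages, wrapping around the list.

PAGE_COUNT = 13
BOOKMARKS = {1, 12}  # pages 1 and 12 are bookmarked, as assumed in FIG. 4

def next_bookmark(current, direction):
    """Return the nearest bookmarked page in the given flick direction."""
    step = 1 if direction == "left" else -1
    page = current
    for _ in range(PAGE_COUNT):
        page = (page - 1 + step) % PAGE_COUNT + 1  # wrap within 1..13
        if page in BOOKMARKS:
            return page
    return current  # no bookmarks set: stay on the current page

print(next_bookmark(2, "left"))    # from page 2, leftward flick -> page 12
print(next_bookmark(12, "right"))  # from page 12, rightward flick -> page 1
```

This reproduces the example above: a left multi-flick on page 2 jumps to page 12, and a subsequent right multi-flick jumps to page 1.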

Hereinafter, a method of setting a bookmark page that can be directly moved through multi-touch will be described with reference to FIG. 7.

First, referring to FIG. 7A, a menu screen may be displayed through which, according to a predetermined command input through the user input unit 130 and/or the touch screen 151, a bookmark may be set on at least some of the plurality of pages provided by the corresponding application. The user may select the box corresponding to a page for which a bookmark is to be set to put it in a checked state, or select an already checked box to release the bookmark setting.

Meanwhile, when a multi-touch input is detected, a predetermined number of pages may be scrolled instead of moving to a page on which a bookmark is set in advance. For example, if the preset number of pages is 5, the pages are scrolled by one page in the flicking direction for a general flick, but when multi-touch flicking is input, the pages are scrolled by five pages in the flicking direction. In detail, if multi-touch flicking to the left is input on page 1, page 6 may be immediately displayed. When the multi-touch input corresponds to this operation, the setting menu for setting the scroll unit may be configured as shown in FIG. 7B.
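The fixed-unit variant above can be sketched in a few lines: single flicking scrolls one page, multi-touch flicking scrolls a preset number of pages, clamped to the list bounds. Names and the clamping behavior at the ends are assumptions.

```python
# Sketch of scroll-unit behavior under assumed names: a single flick moves
# one page, a multi-touch flick moves MULTI_SCROLL_UNIT pages (settable via
# the menu of FIG. 7B). Results are clamped to the valid page range.

PAGE_COUNT = 13
MULTI_SCROLL_UNIT = 5  # preset number of pages for multi-touch flicking

def scroll(current, direction, multi_touch):
    unit = MULTI_SCROLL_UNIT if multi_touch else 1
    step = unit if direction == "left" else -unit  # left flick -> next pages
    return min(max(current + step, 1), PAGE_COUNT)  # clamp to 1..13

print(scroll(1, "left", multi_touch=True))   # page 1 -> page 6
print(scroll(1, "left", multi_touch=False))  # page 1 -> page 2
```

This matches the example in the text: multi-touch flicking left on page 1 immediately displays page 6.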

Referring to FIG. 7B, the user may select one of the check boxes corresponding to the prepared page counts of 2, 5, and 10, or set a desired number through manual input. Manual input may be performed through the virtual keypad 151c of FIG. 2 or the like.

Meanwhile, according to another exemplary embodiment of the present invention, a separate function may be performed, instead of changing the unit or degree of the same function, according to different types of touch inputs. This will be described with reference to FIG. 8.

FIG. 8 illustrates an example in which different functions are performed according to different touch inputs in the touch input processing apparatus according to the present invention.

First, referring to (a) of FIG. 8, a route guidance function is executed on the touch screen 151 of the mobile terminal to display a map of the vicinity and the current location 701. At this time, when the user inputs a flicking touch to the right through the pointer 710 having one contact point, the map may scroll to the right as shown in (b) of FIG. 8. Of course, when the flicking touch is input to the left, the map will scroll to the left, and when it is input downward, the map will scroll downward.

In contrast, when the user inputs multi-touch flicking in the right direction by simultaneously using the two pointers 710 and 720 as shown in (c) of FIG. 8, the magnification of the map may be reduced as shown in (d) of FIG. 8. Conversely, if multi-touch flicking is input in the left direction, the magnification of the map may be enlarged.
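The map behavior of FIG. 8 can be sketched as one dispatch on the pointer count: one pointer scrolls, two pointers change the magnification. The MapState type, offsets, and zoom factors are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of FIG. 8: a single-pointer flick scrolls the map, while a
# multi-touch flick changes the magnification. MapState and the concrete
# scroll/zoom amounts are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class MapState:
    x: int = 0         # horizontal scroll offset
    zoom: float = 1.0  # map magnification

def handle_flick(state, direction, pointer_count):
    if pointer_count == 1:             # single flick: scroll the map
        state.x += 100 if direction == "right" else -100
    else:                              # multi-touch flick: change magnification
        if direction == "right":
            state.zoom /= 2            # rightward multi-flick reduces the map
        elif direction == "left":
            state.zoom *= 2            # leftward multi-flick enlarges the map
    return state

m = MapState()
handle_flick(m, "right", pointer_count=1)  # map scrolls right
handle_flick(m, "right", pointer_count=2)  # magnification is reduced
print(m)  # MapState(x=100, zoom=0.5)
```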

The page configurations, numbers of pages, types of application/list, directions of the flicking touch, and setting screens shown in FIGS. 4 to 8 are exemplary, and the present invention is not limited thereto; it may also be applied to flicking touch inputs in the vertical or diagonal direction.

On the other hand, all the touch inputs described above are assumed to be input in a straight line in one direction. However, when the mobile terminal is implemented in the form of a vehicle navigation device, it is generally operated by the driver and is generally disposed in the center of the vehicle rather than in front of the seat. Therefore, given the structure of the human body, it is not easy for the user to apply a touch input precisely in the horizontal or vertical direction of the touch screen. To solve this problem, another embodiment of the present invention proposes learning the user's touch pattern and applying it to the recognition of the user's touch input. This will be described with reference to FIG. 9.

FIG. 9 illustrates an example of a touch pattern learning and recognition correction process in the touch input processing device according to the present invention.

When the driver's seat is located on the left side of the vehicle and the navigation device is located in the center of the vehicle, the navigation device is positioned to the driver's right. In this case, even if the user intends to apply a horizontal touch input, because of the structure of the human body or the user's habit, the trajectory of the touch through the actual pointer 800 becomes similar to that of FIG. 9A. A vertical touch input likewise takes a form similar to that of FIG. 9B.

Accordingly, the controller 180 may receive touch inputs in the horizontal/vertical directions in advance through a predetermined setting menu, or may cumulatively store and average the user's touch patterns while general operations are performed, so as to generate the user's touch pattern information as shown in (c) of FIG. 9. Thereafter, when a touch input is detected on the touch screen, the controller 180 may recognize the command intended by the user more accurately by referring to the stored touch pattern, even though the touch input is not exactly horizontal or vertical.
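One way the accumulation and averaging described above could work is to record the angle of each flick per intended direction, average the samples, and later match a new flick to the nearest learned angle instead of the ideal axes. This is a sketch under assumed names, not the patent's implementation; note that naively averaging angles breaks near the 0°/360° wrap, where a real implementation would average unit vectors instead.

```python
# Sketch (assumed names) of touch-pattern learning: accumulate observed
# flick angles per intended direction, average them, and classify a new
# flick by the closest learned average angle.

import math

class TouchPatternLearner:
    def __init__(self):
        self.samples = {}  # direction -> list of observed angles (degrees)

    def learn(self, direction, dx, dy):
        """Record one flick (displacement dx, dy) for an intended direction."""
        angle = math.degrees(math.atan2(dy, dx)) % 360
        self.samples.setdefault(direction, []).append(angle)

    def recognize(self, dx, dy):
        """Return the learned direction whose average angle is closest."""
        angle = math.degrees(math.atan2(dy, dx)) % 360
        def distance(d):
            avg = sum(self.samples[d]) / len(self.samples[d])
            diff = abs(angle - avg) % 360
            return min(diff, 360 - diff)  # circular angular distance
        return min(self.samples, key=distance)

learner = TouchPatternLearner()
learner.learn("right", 100, 25)   # this user's "horizontal" flicks slant down
learner.learn("right", 100, 20)
learner.learn("down", -20, 100)   # this user's "vertical" flicks slant left
print(learner.recognize(90, 22))  # slanted flick still recognized as "right"
```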

FIG. 10 is a flowchart illustrating a touch input processing method according to the present invention.

First, the touch detector detects a user's single flicking touch or multi-flicking touch on the touch screen (901).

Thereafter, the controller determines the type of touch (902).

As a result of the determination (902), if a single flicking touch is input, the controller controls the function processor to perform a first function corresponding to the single flicking touch in the application (903).

As a result of the determination (902), if a multi-flicking touch is input, the controller controls the function processor to perform a second function corresponding to the multi-flicking touch in the application (904).
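The flow of FIG. 10 reduces to a single branch on the touch type. The sketch below uses assumed function names; the first and second functions stand for whatever the application binds to single and multi-flicking (one-page scroll and bookmark jump, in the earlier example).

```python
# The flowchart of FIG. 10 sketched as code (names are assumptions):
# detect the touch (901), determine its type (902), then perform the
# first function for a single flick (903) or the second for a multi-flick (904).

def process_touch(pointer_count, first_function, second_function):
    if pointer_count == 1:        # step 902: single flicking touch
        return first_function()   # step 903: first function
    return second_function()      # step 904: multi-flicking -> second function

result = process_touch(
    pointer_count=2,
    first_function=lambda: "scroll one page",
    second_function=lambda: "jump to bookmarked page",
)
print(result)  # jump to bookmarked page
```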

Meanwhile, the method of the present invention as described above can be written as a computer program, and the code and code segments constituting the program can easily be deduced by a computer programmer in the field. In addition, the written program is stored in a computer-readable recording medium (information storage medium) and is read and executed by a computer to implement the method of the present invention. The recording medium includes all types of computer-readable recording media.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. The present invention is not limited to the drawings.

130: user input unit 180: control unit

Claims (5)

A touch input processing device comprising:
touch sensing means for sensing a single flicking touch and a multi-flicking touch on a touch screen;
control means for controlling function processing means to perform a first function corresponding to the single flicking touch when the touch sensing means senses the single flicking touch, and to perform a second function corresponding to the multi-flicking touch when the touch sensing means senses the multi-flicking touch; and
the function processing means for performing the first function or the second function according to the control of the control means.
The device of claim 1,
wherein the second function is a function of displaying a preset page corresponding to the input direction of the multi-flicking touch.
The device according to claim 1 or 2,
wherein the control means cumulatively stores and averages the single flicking touch and the multi-flicking touch sensed by the touch sensing means to increase the accuracy of recognition.
A touch input processing method comprising:
detecting a single flicking touch on a touch screen;
performing a first function corresponding to the sensed single flicking touch;
detecting a multi-flicking touch on the touch screen; and
performing a second function corresponding to the sensed multi-flicking touch.
The method of claim 4, further comprising:
accumulating and storing the sensed single flicking touch and multi-flicking touch to generate pattern information.
KR1020110082253A 2011-08-18 2011-08-18 Apparatus and method for processing touch input KR20130019943A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110082253A KR20130019943A (en) 2011-08-18 2011-08-18 Apparatus and method for processing touch input


Publications (1)

Publication Number Publication Date
KR20130019943A true KR20130019943A (en) 2013-02-27

Family

ID=47897973

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110082253A KR20130019943A (en) 2011-08-18 2011-08-18 Apparatus and method for processing touch input

Country Status (1)

Country Link
KR (1) KR20130019943A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101510013B1 (en) * 2013-12-18 2015-04-07 현대자동차주식회사 Multi handling system and method using touch pad



Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application