KR20100100413A - Touch based interface device, method, mobile device and touch pad using the same - Google Patents

Touch based interface device, method, mobile device and touch pad using the same

Info

Publication number
KR20100100413A
KR20100100413A
Authority
KR
South Korea
Prior art keywords
touch
point
input means
based interface
zoom
Prior art date
Application number
KR1020090019296A
Other languages
Korean (ko)
Inventor
임창영
황성재
Original Assignee
한국과학기술원 (KAIST)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술원 (KAIST)
Priority to KR1020090019296A priority Critical patent/KR20100100413A/en
Publication of KR20100100413A publication Critical patent/KR20100100413A/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch based interface device and an input method are provided.

The touch-based interface device according to the present invention recognizes a single-point gesture and generates an image command for a two-dimensional or higher-dimensional image from that gesture, so that a plurality of commands can be performed in a very simple manner. For example, a user can zoom in, zoom out, rotate, and move just as easily as on a multi-touch panel, even in one-handed situations, so that complex commands that required several steps in the prior art can be performed quickly. In addition, because a virtual finger is displayed at the position that would be used in actual multi-touch, the user can understand the interaction intuitively, and the main part of the multi-touch functionality can be implemented without expensive multi-touch hardware, yielding a cost reduction.

Description

Touch based interface device, method, mobile device and touch pad using the same

The present invention relates to a touch-based interface device and method, and to a mobile device and touch pad using the same. More particularly, it relates to a touch-based interface device, method, mobile device, and touch pad that generate a virtual finger according to the position of the user's finger in the limited situation in which the user operates a mobile device with one hand, so that the user can handle the device as if using two hands, enabling multi-touch functionality.

Touch-based technology refers to user interface devices that detect a touch position on a screen or touch pad and perform specific processing through stored software. Touch panels are generally divided by sensing method: resistive panels, in which a transparent plastic plate is coated with a resistive material and the position is detected from the change in resistance upon contact; surface acoustic wave (SAW) panels, which recognize the touch point from the interference of sound waves; capacitive overlay panels, which recognize the touch point by analyzing high-frequency waveforms; and infrared beam panels, which recognize the touch point by placing infrared emitting and receiving elements facing each other along the top, bottom, left, and right edges of the panel. Among these, the resistive (pressure) type determines the touch position from the change in resistance at the contact point when an input means such as a fingertip or another object touches the grid, and most portable mobile devices use this approach.

As recent technological developments have made more complex and varied operations possible on mobile devices, interfaces for performing them have inevitably become necessary. Accordingly, multi-touch input methods that extract a plurality of touch points to enable various commands have emerged. However, existing multi-touch panels overlook the fact that most mobile device operations are performed with one hand, forcing the inconvenience of using two hands or two fingers. Moreover, even on large displays such as multi-touch tabletop interfaces, there are limits to handling objects of certain sizes.

In addition, a multi-touch panel operated with two fingers or both hands cannot zoom in or out on a very small object because of the size of the fingers: once two fingers meet, it is practically impossible to zoom in further on an object smaller than a fingertip, which is an inherent limitation of conventional multi-touch panels. Conversely, on a large-area touch panel it is practically impossible to zoom in, in a single gesture, beyond the span of both hands. This too is an intrinsic limitation of multi-touch panels, in which a single user must touch with both hands at the same time.

Therefore, a new touch panel input method and device capable of performing complicated operations simply with one hand is needed, but no method for doing so effectively has been disclosed to date.

Therefore, the present invention has been proposed in view of the above-described problems of the prior art. The first object of the present invention is to provide a touch-based interface device, and devices using it, that allow complex commands to be performed more simply in one-handed situations.

The second object of the present invention is to provide a touch-based interface method, and methods using it, that enable the main multi-touch commands even on hardware that does not support multi-touch.

To achieve the first object, the present invention provides a device that recognizes a single-point movement and generates, through that single-point movement, an image command for a two-dimensional or higher-dimensional image, where the image command may be a zoom-in, zoom-out, or rotation command.

In one embodiment of the present invention, the touch-based interface device includes a position detector for detecting the touch position of the actual input means; a calculator that generates a variable point corresponding to the touch position of the actual input means and computes its position; and a command unit that generates an image command signal according to the movement of the actual input means and the corresponding movement of the variable point. The variable point is generated at the point symmetric to the input position of the actual input means about a predetermined center point; the center point may be set by the user or may be the center of the touch panel screen. In another embodiment of the present invention, a plurality of variable points may be generated.
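
As an illustration, the symmetric generation rule amounts to a point reflection through the center point. The following is a minimal Python sketch; the function name and coordinate convention are illustrative assumptions, not part of the patent.

```python
def generate_variable_point(touch, center):
    """Return the point symmetric to `touch` about `center` (point reflection)."""
    tx, ty = touch
    cx, cy = center
    # The variable point sits at the same distance from the center point,
    # on the opposite side of the actual touch position.
    return (2 * cx - tx, 2 * cy - ty)

# Example: with the center point at (50, 50), a touch at (30, 40)
# produces a variable point at (70, 60).
print(generate_variable_point((30, 40), (50, 50)))  # (70, 60)
```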

In one embodiment of the present invention, the movement of the variable point follows a preset method, which may be symmetric to the movement of the actual input means. The actual input means may be one finger. In an embodiment of the present disclosure, the touch-based interface device may further include a display unit that displays a virtual finger on the screen at the variable point.

The variable point may be generated when a preset condition is satisfied. In one embodiment of the present invention, the condition is the touch time of the actual input means, and the variable point is generated when the touch time exceeds a predetermined time. In one embodiment of the present invention, the image command may be a zoom-in, zoom-out, or rotation command.

As an aspect of a specific configuration, the present invention provides a mobile device and a touch pad input device including the touch-based interface device described above.

To achieve the second object, the present invention comprises the steps of recognizing a single-point gesture, and generating an image command for a two-dimensional or higher-dimensional image according to the recognized single-point gesture.

The touch-based interface input method may include detecting the touch position of an actual input means; generating a variable point at a position corresponding to the actual input means; and moving the variable point in response to the movement of the user input means according to a preset profile, thereby generating an image command signal from the change in position between the user input means and the variable point. The variable point may be generated at the point symmetric to the touch position of the actual input means about a preset center point, and the movement of the variable point may be symmetric to the movement of the actual input means about that center point. In addition, a virtual finger may be generated at the position corresponding to the variable point, and the variable point is generated when a preset condition is satisfied. In one embodiment of the present invention, the condition is the touch time of the actual input means, and the variable point is generated when that touch time exceeds a preset time.

In one embodiment of the present invention, the image command signal may be a zoom-in or zoom-out command or a rotation command.

As an aspect for achieving the second object, the present invention provides a mobile device and touch pad input method using the touch-based interface input method.

The touch-based interface input method and apparatus of the present invention, configured and operating as above, can perform a plurality of commands in a very simple manner. For example, a user can zoom in, zoom out, rotate, and move as easily as on a multi-touch panel even in one-handed situations, so that complex commands that required several steps in the prior art can be performed quickly. This will greatly expand the scope of touch-based technology, particularly for mobile devices such as mobile phones, where one-handed input is very common. In addition, the structural limitations of using two hands or two fingers (for example, the zoom range being capped by the span of two fingers, or, on large touch screens, enlargement being limited by the width of the human hands) can all be overcome. Furthermore, when the variable point is generated at the position that would be used in actual multi-touch and is moved and displayed in response to the actual finger movement, the user can intuitively understand two-dimensional image change commands (zoom-in, zoom-out, rotation) without using two hands or two fingers (that is, without multi-point gestures), and the main part of the multi-touch functionality can be realized without expensive multi-touch hardware, yielding a cost reduction.

To solve the above problems, the present invention provides, in a touch-based interface device, a device that recognizes a single-point motion and generates an image command for a two-dimensional or higher-dimensional image through that single-point motion. That is, the present invention provides an apparatus that generates an image command for a two-dimensional or higher-dimensional image even through a single-point gesture (i.e., movement of one touch point, such as one finger), where the image command is in particular zoom-in, zoom-out, rotation, or the like. On a conventional touch panel, a single-point gesture such as one finger's movement performs only a predetermined, limited set of commands (for example, click, scroll, or enlarging or reducing a preset range of the screen), and it was difficult to perform commands such as zoom-in, zoom-out, and rotation on two-dimensional or higher-dimensional images in response to a single-point gesture. The present inventors solved this problem by generating a variable point on the screen corresponding to the single point: the variable point is a coordinate value corresponding to the user's actual touch point, and it moves in correspondence with the movement of the actual touch point. The corresponding movement of the variable point is described in more detail below.

As a solution, the present invention provides a position detector for detecting the touch position of the actual input means; a calculator that generates a variable point corresponding to the touch position of the actual input means and computes its position; and a command unit that generates an image command signal according to the movement of the actual input means and the corresponding movement of the variable point. In other words, the present invention provides a new touch-based interface input method, similar to using two hands, by generating a variable point at a point corresponding to the position of the user's finger (the actual input means) and moving it in correspondence with the finger's movement. The touch-based interface input method according to the present invention is applicable not only to mobile devices with touch screens but to any device with a touch function, including laptops with touch-based interface devices such as touch pads, all of which belong to the scope of the present invention.

If implemented on a touch pad, for example, a user can freely enlarge or reduce a window on the screen.

In one embodiment of the present invention, a virtual finger is drawn at the variable point on the screen, so that the user can use the touch panel or touch pad more intuitively.

The variable point may be generated at a point corresponding to the touch point of the actual input means; in the present invention, the inventors chose to generate the variable point at the point symmetric to the input position of the actual input means about a predetermined center point.

In particular, in an embodiment of the present invention, the center point may be set by the user as well as preset in advance.

FIGS. 1 and 2 are diagrams illustrating center point setting methods and variable point motion methods according to embodiments of the present invention.

Referring to FIG. 1, a variable point is generated, and moves, at the point symmetric to the actual input (touch) point about the center of the screen. That is, when the user's finger slides toward the screen center, the variable point also approaches the screen center in response; conversely, when the user's finger moves away from the screen center, the variable point also moves away from it. In this case, however, the target of a 2D image command such as zoom-in, zoom-out, or rotation can only be the entire screen 100, so a specific region cannot be chosen. To solve this problem, the present inventor proposes a method of directly setting the target of the image command, as shown in FIG. 2.

FIG. 2 is a view for explaining a center point setting method according to another embodiment of the present invention.

Referring to FIG. 2, the user first sets the center point of the target area 100a to be enlarged. The setting method may vary: it can be done, for example, through a double touch or a touch held for a predetermined time, and any method that can be distinguished from a general touch command can serve as a center point setting command and belongs to the scope of the present invention. That is, when the user sets an arbitrary point as the center point and then touches the input means to the touch panel under a predetermined condition, a variable point is generated at the point symmetric along the X-Y axes about that center point. As described above, the center point can be set in various ways (for example, a discrete double touch), and letting the user set the center point has the same effect as the conventional multi-touch method of using two fingers (that is, the same effect as the user freely setting the initial spread of the two fingers). Furthermore, unlike the prior art, in which a single zoom-in command is possible only within the maximum reach of two fingers, the technique according to the present invention has the advantage of allowing a single zoom-in command over the entire range in which the actual input means can move.

In the method described above, when the user moves the actual input means after setting the center point, the variable point moves symmetrically about the center point (that is, when the actual input means approaches the center point, the variable point also approaches it, and conversely, when the actual input means moves away, the variable point also moves away). However, the symmetrical movement is just one example, and the variable point may move in any other manner the user desires.

In addition, the present invention generates an image such as a virtual finger at the variable point in order to make the position and movement of the variable point intuitive to the user. The virtual finger moves in correspondence with the movement of the variable point, so the user obtains the visual effect of actually using two fingers. However, any representation that makes the variable point intuitive to the user is within the scope of the present invention; the invention is not limited to an image in the shape of a virtual finger.

Furthermore, the inventors conceived the present invention from the insight that if a user of a one-handed mobile device could freely generate a variable point on the screen under conditions of the user's choosing, embodied as an image such as a virtual finger, the range of use of conventional mobile devices could be broadened considerably.

In the present invention, a touch command that performs a conventional command such as scrolling must be distinguished from a command that generates a virtual finger at a variable point, so the present invention proposes touch time as the variable point generation condition. However, a virtual finger may also be generated under various other conditions (for example, a double touch), and generating a variable point under any such condition is within the scope of the present invention.

Hereinafter, the present invention will be described in detail with reference to an embodiment in which a virtual finger is generated at a variable point.

FIG. 3 is a block diagram illustrating a new touch-based interface input device according to an embodiment of the present invention.

Referring to FIG. 3, to solve the problem that conventional multi-touch technology, which relies solely on a plurality of actual input means such as the user's fingers, makes it quite complicated to perform all operations with one hand, the present invention generates a virtual finger, a virtual input means, at a variable point, and performs commands such as zoom-in and zoom-out through the coordinates and movement of the virtual finger.

That is, the present invention focuses on minimizing the user's input means when operating a mobile device or the like with one hand, and sets the touch holding time of the user input means as the condition for generating the virtual finger, the virtual input means. Thus, when the user touches the touch panel or touch pad with one hand, the touch time is measured, and when a predetermined set time elapses, a variable point is generated, together with a virtual finger, a virtual input means that makes the variable point intuitive to the user.

Accordingly, the touch-based interface input device according to the present invention includes a finger position detector 100 for detecting the touch position of the real user's finger and a time calculator 200 for measuring the touch time. It further includes an operation unit 300 that, when the touch time exceeds a predetermined time, computes the coordinates and movement of the virtual finger (variable point) corresponding to the actual input means, a display unit 400 that displays the virtual finger according to the computed information, and a command unit (not shown) that performs commands such as screen zoom-in and zoom-out according to the virtual finger coordinates and movement from the operation unit.
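
A rough sketch of how these units might fit together follows. The class and method names and the 0.8-second dwell threshold are assumptions for illustration; the patent names only the functional units.

```python
import time

class VirtualFingerDevice:
    """Illustrative wiring of FIG. 3: position detector 100 feeds on_touch(),
    the elapsed touch time plays the role of time calculator 200, the
    mirroring is operation unit 300, and render()/command() stand in for
    display unit 400 and the command unit."""

    def __init__(self, center, hold_time=0.8):  # hold_time: assumed value
        self.center = center            # preset or user-set center point
        self.hold_time = hold_time
        self.touch_start = None
        self.variable_point = None

    def on_touch(self, pos):
        now = time.monotonic()
        if self.touch_start is None:
            self.touch_start = now      # start measuring the touch time
        dwell = now - self.touch_start
        if self.variable_point is not None or dwell >= self.hold_time:
            # Generate or update the variable point symmetric about the center.
            cx, cy = self.center
            self.variable_point = (2 * cx - pos[0], 2 * cy - pos[1])
            self.render(self.variable_point)
            self.command(pos, self.variable_point)

    def on_release(self):
        self.touch_start = None
        self.variable_point = None      # leave virtual finger mode

    def render(self, vp):
        pass  # display unit: draw the virtual finger at the variable point

    def command(self, finger, vp):
        pass  # command unit: derive zoom/rotation from the two positions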

According to the present invention, when the position of the actual user's finger is detected, the position and movement of the virtual finger are computed and displayed on the screen. In particular, the present invention uses the information about the changing positions of the actual user input means and the virtual finger to change the actual displayed image, for example by zoom-in, zoom-out, or rotation, as described in more detail below.

Zoom in and zoom out functions

The present invention uses the distance between the virtual finger and the real user finger to zoom the display screen in and out.

That is, when the user finger position (x, y) is detected by the detector of the touch-based interface input device according to the present invention, and the corresponding virtual finger position (x', y') is computed, the distance between the user finger position and the virtual finger position is calculated by the following equation:

$d = \sqrt{(x - x')^{2} + (y - y')^{2}}$  (Equation 1)

That is, according to the present invention, as the position of the actual user finger changes, the position of the virtual finger changes with it, so the distance between the virtual finger and the actual finger changes; the screen is zoomed in or out according to this change in distance.
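
For example, Equation 1 and a simple zoom mapping might look as follows in Python. The linear ratio d_now / d_start used for the zoom factor is an illustrative assumption; the patent specifies only that zoom follows the change in distance.

```python
import math

def mirror(p, c):
    """Variable point: reflection of the finger position p about center c."""
    return (2 * c[0] - p[0], 2 * c[1] - p[1])

def distance(p, q):
    """Equation 1: Euclidean distance between finger (x, y) and
    virtual finger (x', y')."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def zoom_factor(finger_start, finger_now, center):
    """Map the change in finger / virtual-finger separation to a zoom level."""
    d_start = distance(finger_start, mirror(finger_start, center))
    d_now = distance(finger_now, mirror(finger_now, center))
    return d_now / d_start  # > 1 means zoom-in, < 1 means zoom-out

# Dragging from (30, 50) toward the center (50, 50) halves the separation,
# giving a zoom factor of 0.5 (zoom-out).
print(zoom_factor((30, 50), (40, 50), (50, 50)))  # 0.5
```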

FIG. 4 is a diagram illustrating the zoom-in and zoom-out functions using a one-hand touch on a touch panel according to an exemplary embodiment of the present invention.

Referring to FIG. 4, when the user first presses the first position 301 for a predetermined time or longer, the touch time is measured, and the virtual finger 302 is generated and positioned at the place corresponding to the first position. The position at which the virtual finger 302 is generated may be adjusted appropriately to the actual touch panel display size. For example, the virtual finger may be generated at the point symmetric to the detected position of the actual user input means about the center of the object to be zoomed or the center of the screen, or at the variable point position symmetric to the actual input position about an arbitrary preset point. This has the advantage of letting the user determine the point to zoom in or out on and the center of rotation of the object.

Thereafter, when the user moves (i.e., drags or slides) the finger in a specific direction 303 from the first position 301, the virtual finger moves in the symmetric direction. If the distance between the actual user finger and the virtual finger shortens, the entire display screen is zoomed out in proportion to the shortened length; conversely, when the distance between the actual finger and the virtual finger lengthens, the screen is zoomed in.

Another embodiment of the present invention zooms in or out on a specific object within the screen, rather than on the entire screen.

FIG. 5 is a diagram illustrating the zoom-in and zoom-out functions according to another exemplary embodiment of the present invention, and FIG. 6 is a photograph showing a virtual finger as actually implemented.

Referring to FIG. 5, in this exemplary embodiment the user touches a point 304 inside a specific picture on the display screen. When the user's touch time passes a predetermined time, a virtual finger corresponding to the user's finger is generated on the screen, positioned within that specific area. If the user moves the finger from the point 304 in one direction 305, the specific area is zoomed in; in the opposite direction, it is zoomed out. Therefore, when a user presses an area displaying an object such as a picture for a predetermined time or longer, a virtual finger is generated in the object area, and the object itself can be zoomed in or out.
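
One possible realization of this object-targeted behavior is to hit-test the touch against object bounds and scale only the hit object. The bounding-box representation below is an assumption for illustration; the patent does not specify how objects are represented.

```python
def find_target(objects, touch):
    """Return the first object whose bounding box contains the touch, else None.
    Each object is assumed to carry a 'bbox' of (x, y, width, height)."""
    for obj in objects:
        x, y, w, h = obj["bbox"]
        if x <= touch[0] <= x + w and y <= touch[1] <= y + h:
            return obj
    return None

def zoom_object(obj, factor):
    """Scale the object's bounding box about its own center."""
    x, y, w, h = obj["bbox"]
    cx, cy = x + w / 2, y + h / 2
    w2, h2 = w * factor, h * factor
    obj["bbox"] = (cx - w2 / 2, cy - h2 / 2, w2, h2)

photo = {"bbox": (10, 10, 100, 80)}
zoom_object(photo, 1.5)
print(photo["bbox"])  # (-15.0, -10.0, 150.0, 120.0)
```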

Rotation function

According to the present invention, when the finger is rotated in a predetermined direction 306, the object in the specific region is rotated, where the rotation angle θ of the object at time t is calculated according to Equation 2 below.

$\theta_{t} = \operatorname{atan2}(y_{t} - y_{c},\ x_{t} - x_{c}) - \operatorname{atan2}(y_{t-1} - y_{c},\ x_{t-1} - x_{c})$  (Equation 2)

where $(x_{c}, y_{c})$ is the center of the object and $(x_{t}, y_{t})$, $(x_{t-1}, y_{t-1})$ are the touch positions at times $t$ and $t-1$.

FIG. 7 is a view showing the rotation of an object according to the present invention, and FIG. 8 is a view for explaining the principle of the rotation function according to another embodiment of the present invention.

Referring to FIGS. 7 and 8, the user has changed the touch position from the first position 401 (time t-1) to the second position 402 (time t) in one cycle. In this case, the rotation angle θ of the object is the change in angle formed by the lines connecting the center 403 of the object to the first position 401 and to the second position 402. That is, the object on the screen is rotated by the change angle calculated according to Equation 2, and the direction of rotation is determined by the sign of the rotation angle.
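
A sketch of Equation 2 in code, under the reconstruction above: the signed angle swept about the object center between times t-1 and t. The atan2 formulation and the wrap-around handling are implementation choices, not specified in the patent.

```python
import math

def rotation_angle(center, prev_pos, cur_pos):
    """Signed change in the angle of the touch point about the object
    center between time t-1 (prev_pos) and time t (cur_pos)."""
    a_prev = math.atan2(prev_pos[1] - center[1], prev_pos[0] - center[0])
    a_cur = math.atan2(cur_pos[1] - center[1], cur_pos[0] - center[0])
    theta = a_cur - a_prev
    # Wrap into [-pi, pi) so the object rotates the short way around.
    return (theta + math.pi) % (2 * math.pi) - math.pi

# Moving from the 3 o'clock position (1, 0) to the 12 o'clock position (0, 1)
# about the center (0, 0) gives +90 degrees; the sign sets the direction.
print(math.degrees(rotation_angle((0, 0), (1, 0), (0, 1))))  # 90.0
```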

FIG. 9 is a flowchart illustrating a touch panel input method according to an exemplary embodiment of the present invention.

Referring to FIG. 9, the touch panel input method according to the present invention removes the inconvenience of the prior art, in which the user of a multi-touch mobile terminal has to hold the terminal in one hand and issue object resizing and rotation commands with both hands: when the touch panel is pressed for a predetermined time, a virtual finger appears at the position corresponding to the actual touch position, and various image commands such as zoom-in and zoom-out are performed using the virtual finger.

The touch panel input method according to an embodiment of the present invention starts by detecting the user's actual touch position. It is then determined whether this is the first detected touch. If it is, a virtual finger is generated once the touch has been held at a fixed position for a predetermined time or longer. If it is not the first touch (for example, when the user is continuously moving the finger on the screen), the virtual finger is simply repositioned to match the user's touch position. That is, to shorten entry into virtual finger mode, the present invention does not wait the preset time again for consecutive virtual finger commands; the virtual finger command mode is maintained without the waiting time, sparing the user repeated waits for virtual finger mode.
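
The branch FIG. 9 describes (wait on the first touch, skip the wait while the mode is active) might be sketched as follows; the state keys, threshold, and mirror helper are illustrative assumptions.

```python
def mirror(p, c):
    return (2 * c[0] - p[0], 2 * c[1] - p[1])

def handle_touch(state, pos, now, hold_time=0.8):
    """One step of the FIG. 9 flow for a touch event at position pos."""
    if state.get("start") is None:            # first detected touch
        state["start"] = now
        state["mode"] = False
    if not state["mode"] and now - state["start"] >= hold_time:
        state["mode"] = True                  # dwell satisfied: enter mode once
    if state["mode"]:
        # Continuing touch: reposition the virtual finger immediately,
        # with no further waiting time.
        state["virtual"] = mirror(pos, state["center"])
    return state

def handle_release(state):
    state["start"] = None
    state["mode"] = False                     # mode ends when the touch ends
    state.pop("virtual", None)
    return state
```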

Subsequently, screen image change commands (zoom-in, zoom-out, rotation) using the virtual finger are performed as described above. When the user releases the touch, the virtual finger command mode ends.

According to the present invention, after the virtual finger is generated and positioned on the screen, the distance and rotation angle between the virtual finger and the actual finger are calculated as the actual finger moves, and commands such as zoom-in, zoom-out, and rotation are performed with the calculated information. Through this process, users of single-touch panel devices, in particular mobile terminals, can reduce the number of steps required to resize and rotate an object, and multi-touch-style commands can be performed even on a conventional single-touch terminal.

Furthermore, the image command according to the present invention can be applied not only to two-dimensional images but also to higher-dimensional images (for example, three-dimensional images). FIG. 10 shows an application of the present invention to three-dimensional images.

Referring to FIG. 10, a virtual finger is generated at the variable point position corresponding to the position 901 touched by the user. In this case, virtual fingers (variable points) may be generated at multiple positions, such as the virtual finger first position 902 and the virtual finger second position 903, and different commands may be performed according to the position of the virtual finger. Three-dimensional enlargement and reduction of a three-dimensional image is thus also possible and also belongs to the scope of the present invention; in this case, a plurality of variable points are generated.

Those skilled in the art will appreciate that the present invention may be embodied in other specific forms and operations without departing from its technical spirit or essential features. The embodiments described above are therefore to be understood in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims below rather than by the above description, and all modifications derived from the meaning of the claims and their equivalents are included in the scope of the present invention.

FIG. 1 is a diagram illustrating a center point setting method and a variable point motion method according to an embodiment of the present invention.

FIG. 2 is a diagram illustrating a center point setting method and a variable point motion method according to another embodiment of the present invention.

FIG. 3 is a block diagram illustrating a new touch-based interface input device according to the present invention.

FIG. 4 is a diagram illustrating the zoom-in and zoom-out functions using one-hand touch of a touch-based interface input device according to an embodiment of the present invention.

FIG. 5 is a diagram illustrating the zoom-in and zoom-out functions according to another embodiment of the present invention.

FIG. 6 is a photograph illustrating the zoom-in and zoom-out functions of a virtual finger as actually implemented, according to another embodiment of the present invention.

FIG. 7 is a view showing the rotation of an object according to the present invention.

FIG. 8 is a view for explaining the principle of the rotation function according to another embodiment of the present invention.

FIG. 9 is a flowchart illustrating a touch-based interface input method according to an embodiment of the present invention.

FIG. 10 is a view showing an application of the present invention to three-dimensional images.

Claims (26)

1. A touch-based interface device that recognizes a single point movement and generates an image command for an image of two or more dimensions through the single point movement.

2. The device of claim 1, wherein the image command is a zoom-in, zoom-out, or rotation command.

3. The device of claim 1, comprising: a position detector for detecting a touch position of an actual input means; a calculator configured to generate a variable point corresponding to the touch position of the actual input means and calculate its position; and a command unit generating an image command signal according to the movement of the actual input means and the corresponding movement of the variable point.

4. The device of claim 3, wherein the variable point is generated at a point symmetric to the input position of the actual input means about a predetermined center point.

5. The device of claim 4, wherein the center point is set by the user.

6. The device of claim 4, wherein a plurality of variable points are formed.

7. The device of claim 3, wherein the movement of the variable point is performed according to a preset method.

8. The device of claim 3, wherein the movement is performed in a manner symmetric to the movement of the actual input means.

9. The device of claim 3, wherein the actual input means is one finger.

10. The device of claim 3, further comprising a display unit for displaying a virtual finger on the screen at the variable point.

11. The device of claim 3, wherein the variable point generation is performed when a preset condition is satisfied.

12. The device of claim 11, wherein the condition is a touch time of the actual input means, and a variable point is generated when the touch time exceeds a predetermined time.

13. The device of claim 3, wherein the image command is a zoom-in or zoom-out command.

14. The device of claim 12, wherein the image command is a rotation command.

15. A mobile device comprising the touch-based interface device according to any one of claims 1 to 14.

16. A touch pad input device comprising the touch-based interface device according to any one of claims 1 to 15.

17. A touch-based interface method comprising: recognizing a single point gesture; and generating an image command for an image of two or more dimensions according to the recognized single point gesture.

18. The method of claim 17, comprising: detecting a touch position of an actual input means; generating a variable point at a position corresponding to the actual input means; and moving the variable point in response to the movement of the user input means according to a preset profile, thereby generating an image command signal from a change in position between the user input means and the variable point.

19. The method of claim 18, wherein the variable point is generated at a point symmetric to the touch position of the actual input means about a preset center point, and the movement of the variable point is symmetric to the movement of the actual input means about the center point.

20. The method of claim 18, further comprising generating a virtual finger at a position corresponding to the variable point.

21. The method of claim 20, wherein the variable point generation is performed when a preset condition is satisfied.

22. The method of claim 21, wherein the condition is a touch time of the actual input means, and the variable point is generated when the touch time of the actual input means exceeds a predetermined time.

23. The method of claim 17, wherein the image command signal is a zoom-in or zoom-out command.

24. The method of claim 17, wherein the image command signal is a rotation command.

25. A mobile device interface method using the touch-based input method of claim 17.

26. A touch pad interface method using the touch-based input method of claim 17.
KR1020090019296A 2009-03-06 2009-03-06 Touch based interface device, method, mobile device and touch pad using the same KR20100100413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090019296A KR20100100413A (en) 2009-03-06 2009-03-06 Touch based interface device, method, mobile device and touch pad using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090019296A KR20100100413A (en) 2009-03-06 2009-03-06 Touch based interface device, method, mobile device and touch pad using the same

Publications (1)

Publication Number Publication Date
KR20100100413A true KR20100100413A (en) 2010-09-15

Family

ID=43006491

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090019296A KR20100100413A (en) 2009-03-06 2009-03-06 Touch based interface device, method, mobile device and touch pad using the same

Country Status (1)

Country Link
KR (1) KR20100100413A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101251578B1 (en) * 2010-11-16 2013-04-08 (주)파트론 Optical input device using side directional light source
WO2013081413A1 (en) * 2011-12-02 2013-06-06 (주)지티텔레콤 Method for operating scene on touch screen
JP2014534544A (en) * 2011-12-02 2014-12-18 ジーティーテレコム Screen operation method on touch screen

Similar Documents

Publication Publication Date Title
KR101019128B1 (en) Input method and tools for touch panel, and mobile devices using the same
US20180059928A1 (en) Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US8823749B2 (en) User interface methods providing continuous zoom functionality
Malik et al. Visual touchpad: a two-handed gestural input device
US20070097151A1 (en) Behind-screen zoom for handheld computing devices
US9348458B2 (en) Gestures for touch sensitive input devices
JP5456529B2 (en) Method and computer system for manipulating graphical user interface objects
US20140098049A1 (en) Systems and methods for touch-based input on ultrasound devices
US20130082928A1 (en) Keyboard-based multi-touch input system using a displayed representation of a users hand
US20130257734A1 (en) Use of a sensor to enable touch and type modes for hands of a user via a keyboard
US20110102570A1 (en) Vision based pointing device emulation
TW201109994A (en) Method for controlling the display of a touch screen, user interface of the touch screen, and electronics using the same
US20140149945A1 (en) Electronic device and method for zooming in image
TW201501019A (en) Electronic device and judgment method for multi-window touch control instructions
KR20100136578A (en) Means for touch input and stylus pen, touch screen device and control method using the same
US20130106707A1 (en) Method and device for gesture determination
TW201319921A (en) Method for screen control and method for screen display on a touch screen
KR101056088B1 (en) Touch panel input device, method and mobile device using same
WO2017101340A1 (en) Method and device for adjusting video window by means of multi-point touch control
KR20100100413A (en) Touch based interface device, method, mobile device and touch pad using the same
CN105183353B (en) Multi-touch input method for touch equipment
KR20110006251A (en) Input method and tools for touch panel, and mobile devices using the same
Lei et al. The multiple-touch user interface revolution
KR20100106638A (en) Touch based interface device, method and mobile device and touch pad using the same
KR101436587B1 (en) Method for providing user interface using two point touch, and apparatus therefor

Legal Events

Date Code Title Description
N231 Notification of change of applicant
WITN Withdrawal due to no request for examination