KR20140117067A - Method for processing input by touch motion - Google Patents
- Publication number
- KR20140117067A
- Authority
- KR
- South Korea
- Prior art keywords
- virtual plane
- touch
- motion
- setting
- gui
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
Description
An embodiment according to the concept of the present invention relates to a method of processing an input using a touch motion, and more particularly to a method of detecting the touch motion using a virtual plane and processing input according to a detection result.
Electronic devices such as smart televisions, internet televisions, connected televisions, and open hybrid televisions are mainly controlled using a remote controller.
The electronic device can be easily controlled using the remote controller, but as the electronic device becomes more complicated, a remote controller having a more complicated structure is required. In particular, as the number of buttons of the remote controller increases, it becomes inconvenient for the user to memorize the function of each button.
In addition, electronic devices controlled through motion recognition can be controlled by predefined operations, such as special hand shapes, flexion of the joints, hand movements, and the like. However, the user has the inconvenience of memorizing the above-described predefined operations.
According to an aspect of the present invention, there is provided a method of detecting touch motion using a virtual surface and processing input according to a detection result.
A method of processing an input using a touch motion according to an exemplary embodiment of the present invention includes a step in which a virtual surface touch detection module senses a touch motion on a virtual surface corresponding to a control target display, and a step in which a graphical user interface (GUI) control module controls a GUI of the control target display according to the detection result.
According to an embodiment, the virtual surface touch detection module may sense the touch motion based on a depth of the touch motion with respect to the virtual surface.
According to an embodiment, the GUI control module can control the GUI differently according to the depth.
According to an embodiment, the virtual surface may be a curved surface or a spherical surface.
According to an embodiment, the GUI control module can control the GUI of the control target display differently according to the detected pattern of the touch motion.
According to an embodiment, each of the first coordinates of the first coordinate system of the virtual surface may correspond one-to-one to each of the second coordinates of the second coordinate system of the controlled display.
According to an embodiment, when a touch motion corresponding to any one of the first coordinates is sensed, the GUI control module may select the GUI located at the second coordinate corresponding to the one first coordinate.
According to an embodiment, the method may further include, before the step of sensing the touch motion, setting, by a virtual plane setting module, the virtual plane according to a virtual plane setting motion of the user.
According to an embodiment, the step of setting the virtual plane may set the virtual plane by setting at least two of the corners of the virtual plane according to the virtual plane setting motion.
According to an embodiment, the step of setting the virtual plane may set the virtual plane using the touch motion of the user.
According to an embodiment, the step of setting the virtual plane may set the virtual plane according to the virtual plane setting motion corresponding to the input trajectory of the user.
The method according to the embodiment of the present invention has an effect that a graphical user interface (GUI) can be easily controlled through a touch motion that touches a virtual surface.
Further, there is an effect that the virtual plane can be set as desired by the user through the virtual plane setting motion.
BRIEF DESCRIPTION OF THE DRAWINGS In order to more fully understand the drawings cited in the detailed description of the present invention, a brief description of each drawing is provided.
FIG. 1 is a conceptual diagram of a touch-motion input system according to an embodiment of the present invention.
FIG. 2 is a view for explaining a method of receiving a touch motion using the virtual plane shown in FIG. 1.
FIG. 3 is a block diagram of an embodiment of the touch-motion input device shown in FIG. 1.
FIG. 4 is a view for explaining an embodiment of a method of setting the virtual plane shown in FIG. 1.
FIG. 5 is a view for explaining another embodiment of the method of setting the virtual plane shown in FIG. 1.
FIG. 6 is a view for explaining still another embodiment of the method of setting the virtual plane shown in FIG. 1.
FIG. 7 is a flowchart of a method of processing an input using a touch motion according to an embodiment of the present invention.
Specific structural and functional descriptions of the embodiments of the present invention disclosed herein are provided only for the purpose of illustrating embodiments of the inventive concept. Embodiments according to the concept of the present invention may be embodied in various forms and should not be construed as limited to the embodiments set forth herein.
The embodiments according to the concept of the present invention are subject to various changes and may take various forms, and therefore specific embodiments are illustrated in the drawings and described in detail herein. It is to be understood, however, that the embodiments according to the concept of the present invention are not intended to be limited to the particular forms disclosed, but include all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of the invention according to the concept of the present invention, a first element may be termed a second element, and a second element may likewise be termed a first element.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions describing the relationship between elements, such as "between" and "directly between" or "adjacent to" and "directly adjacent to", should be interpreted in the same manner.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. The singular forms include the plural forms unless the context clearly dictates otherwise. As used herein, the terms "comprises" and/or "having" specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and are not to be interpreted in an idealized or overly formal sense unless expressly defined herein.
As used herein, a module may refer to a functional or structural combination of hardware for performing the method according to an embodiment of the present invention, or of software capable of driving the hardware.
Accordingly, the module may mean a logical unit of program code and hardware resources capable of executing the program code, and does not necessarily mean physically connected code or a single kind of hardware.
BEST MODE FOR CARRYING OUT THE INVENTION Hereinafter, the present invention will be described in detail with reference to preferred embodiments thereof and the accompanying drawings.
FIG. 1 is a conceptual diagram of a touch-motion input system according to an embodiment of the present invention.
Referring to FIG. 1, a touch-motion input system 10 includes a control target display 100, a touch-motion input device 200, and a virtual plane 300.
The touch-motion input system 10 processes an input of the user by using a touch motion on the virtual plane 300.
The control target display 100 displays a graphical user interface (GUI) 110.
The touch-motion input device 200 senses a touch motion of the user input to the virtual plane 300 and controls the GUI 110 of the control target display 100 according to the sensing result.
According to an embodiment, the touch-motion input device 200 may be implemented as a device separate from the control target display 100 or may be embedded in the control target display 100.
The virtual plane 300 is a virtual surface set in space corresponding to the control target display 100 and includes at least one touch point TP.
The touch point TP may refer to a point or a coordinate at which the user's touch motion is input (or detected) on the virtual plane 300.
Each of the coordinates included in the coordinate system of the virtual plane 300 may correspond one-to-one to a coordinate included in the coordinate system of the control target display 100.
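This one-to-one correspondence can be illustrated with a minimal sketch; the function name, the normalized [0, 1] plane coordinates, and the display resolution below are assumptions for illustration only and are not part of the disclosure.

```python
# Illustrative sketch: map a touch point given in normalized virtual-plane
# coordinates (u, v) in [0, 1] x [0, 1] to a pixel on the control target display.
def plane_to_display(u: float, v: float, display_w: int, display_h: int) -> tuple[int, int]:
    x = min(int(u * display_w), display_w - 1)  # clamp so u = 1.0 stays on-screen
    y = min(int(v * display_h), display_h - 1)
    return x, y

# Example: the centre of the virtual plane maps to the centre of a 1920x1080 display.
assert plane_to_display(0.5, 0.5, 1920, 1080) == (960, 540)
```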
The touch motion will be described in detail with reference to FIG. 2.
FIG. 2 is a view for explaining a method of receiving a touch motion using the virtual plane shown in FIG. 1.
Referring to FIGS. 1 and 2, when the input of the user, for example, an input using the hand, passes the predetermined depth d or more in the rear direction of the virtual plane 300, the touch-motion input device 200 senses the input as a touch motion.
That is, the touch motion may mean an input of the user that has passed a certain depth d or more in the rear direction of the virtual plane 300.
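The depth test described above may be sketched as follows; the millimetre units, the value of the threshold d, and the function name are illustrative assumptions only, not part of the disclosure.

```python
# Illustrative sketch: an input counts as a touch motion once the hand has moved
# at least d_mm past the virtual plane in the rear direction (toward the display).
def is_touch(hand_depth_mm: float, plane_depth_mm: float, d_mm: float = 30.0) -> bool:
    return hand_depth_mm >= plane_depth_mm + d_mm

# Example: with the plane at 600 mm and d = 30 mm, a hand at 640 mm is a touch,
# while a hand at 610 mm has not yet passed the threshold.
assert is_touch(640.0, 600.0)
assert not is_touch(610.0, 600.0)
```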
FIG. 3 is a block diagram of an embodiment of the touch-motion input device shown in FIG. 1.
Referring to FIGS. 1 to 3, the touch-motion input device 200 includes a virtual surface touch detection module, a GUI control module, and a virtual plane setting module.
Each of the virtual surface touch detection module, the GUI control module, and the virtual plane setting module may be implemented as hardware, as software capable of driving the hardware, or as a combination thereof.
The virtual surface touch detection module senses a touch motion of the user input to the virtual plane 300 corresponding to the control target display 100.
The virtual surface touch detection module may sense the touch motion based on a depth of the touch motion with respect to the virtual plane 300.
For example, the virtual surface touch detection module may determine that a touch motion is input when the user's input passes the predetermined depth d or more in the rear direction of the virtual plane 300.
The virtual surface touch detection module may also detect a pattern of the touch motion, for example, the number of times the touch motion is input to the same touch point TP.
In accordance with another embodiment, the virtual surface touch detection module may sense the touch point TP at which the touch motion is input on the virtual plane 300.
The GUI control module controls the GUI 110 of the control target display 100 according to the detection result of the virtual surface touch detection module.
The GUI control module may control the GUI 110 differently according to the detected pattern of the touch motion.
For example, when the user's touch motion is repeatedly input to the same touch point TP twice, the GUI control module may control the GUI 110 differently from the case in which the touch motion is input only once.
The GUI control module may also control the GUI 110 differently according to the depth of the touch motion.
According to another embodiment, when a touch motion corresponding to any one of the first coordinates of the virtual plane 300 is sensed, the GUI control module may select the GUI 110 located at the second coordinate of the control target display 100 corresponding to the one first coordinate.
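For illustration only, one way such pattern- and depth-dependent control might be dispatched is sketched below; the specific actions ("highlight", "select", "execute") and the depth threshold are assumptions, not part of the disclosure.

```python
# Illustrative sketch: choose a GUI action from the detected touch pattern
# (number of touches at the same touch point) and the depth of the touch motion.
def control_gui(touch_count: int, depth_mm: float, shallow_mm: float = 50.0) -> str:
    if touch_count >= 2:
        return "execute"      # repeated touch at the same touch point
    if depth_mm < shallow_mm:
        return "highlight"    # shallow touch: hover-like feedback
    return "select"           # deeper touch: select the GUI element

# Example: a single deep touch selects, a repeated touch executes.
assert control_gui(touch_count=1, depth_mm=80.0) == "select"
assert control_gui(touch_count=2, depth_mm=80.0) == "execute"
```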
The virtual plane setting module sets the virtual plane 300 according to a virtual plane setting motion of the user.
According to an embodiment, the virtual plane setting module may set the virtual plane 300 before the touch motion is sensed.
The virtual plane setting motion refers to various motions that a user can take to set or define the virtual plane 300.
FIG. 4 is a view for explaining an embodiment of a method of setting the virtual plane shown in FIG. 1. FIG. 5 is a view for explaining another embodiment of the method of setting the virtual plane shown in FIG. 1. FIG. 6 is a view for explaining still another embodiment of the method of setting the virtual plane shown in FIG. 1.
Referring to FIG. 4, a user may set or define the virtual plane 300 by taking a virtual plane setting motion that designates at least two of the corners of the virtual plane 300.
In this case, the vertical width and the horizontal width of the virtual plane 300 may be determined according to the corners designated by the virtual plane setting motion.
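A minimal sketch of setting a rectangular virtual plane from two diagonally opposite corners follows; the axis-aligned representation, the field names, and the averaging of the two depths are illustrative assumptions only.

```python
# Illustrative sketch: build an axis-aligned virtual plane from two corner
# positions (x, y, depth) designated by the virtual plane setting motion.
from dataclasses import dataclass

@dataclass
class VirtualPlane:
    left: float
    top: float
    right: float
    bottom: float
    depth_mm: float  # distance of the plane from the sensor

def plane_from_corners(c1: tuple[float, float, float],
                       c2: tuple[float, float, float]) -> VirtualPlane:
    (x1, y1, z1), (x2, y2, z2) = c1, c2
    return VirtualPlane(left=min(x1, x2), top=min(y1, y2),
                        right=max(x1, x2), bottom=max(y1, y2),
                        depth_mm=(z1 + z2) / 2.0)  # place the plane between the two depths

# Example: two corner gestures define a plane 400 mm wide and 300 mm tall.
plane = plane_from_corners((0.0, 0.0, 600.0), (400.0, 300.0, 620.0))
```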
Referring to FIG. 5, the virtual plane setting motion may have an input trajectory TR, and the virtual plane setting module may set the virtual plane 300 according to the input trajectory TR.
In this case, the virtual plane 300 may be set as a curved surface or a spherical surface corresponding to the input trajectory TR.
Referring to FIG. 6, the virtual plane setting motion may be performed using a touch motion of the user.
Each of the touch points TP1 to TP4 is a point at which a touch motion of the user is sensed, and the virtual plane setting module may set the virtual plane 300 using the touch points TP1 to TP4.
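For illustration, a flat virtual plane could be fitted through the sensed touch points by least squares; the use of a least-squares fit and the helper name below are assumptions, not part of the disclosure.

```python
# Illustrative sketch: fit a plane n . x = d through the 3-D positions of the
# touch points TP1..TP4 so that later touch depths can be measured against it.
import numpy as np

def fit_plane(points: np.ndarray) -> tuple[np.ndarray, float]:
    centroid = points.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, float(normal @ centroid)

# Example: four roughly coplanar touch points (x, y, depth in mm) define the plane.
tps = np.array([[0, 0, 600], [400, 0, 605], [400, 300, 602], [0, 300, 598]], dtype=float)
normal, offset = fit_plane(tps)
```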
FIG. 7 is a flowchart of a method of processing an input using a touch motion according to an embodiment of the present invention.
Referring to FIGS. 1 to 7, the virtual surface touch detection module senses a touch motion on the virtual plane 300 corresponding to the control target display 100 (S10).
According to an embodiment, before step S10, the virtual plane setting module may set the virtual plane 300 according to the virtual plane setting motion of the user.
The GUI control module then controls the GUI 110 of the control target display 100 according to the detection result.
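The overall flow of FIG. 7 may be sketched end to end as follows; the plane representation, the depth threshold, and the normalized output coordinates are illustrative assumptions only, not part of the disclosure.

```python
# Illustrative sketch of the flow of FIG. 7: with the virtual plane already set,
# sense whether a hand sample is a touch motion and, if so, return the normalized
# coordinate at which the GUI should be controlled.
def process_hand_sample(x, y, depth_mm, plane, d_mm=30.0):
    if depth_mm < plane["depth_mm"] + d_mm:
        return None                                   # no touch motion sensed
    u = (x - plane["left"]) / (plane["right"] - plane["left"])
    v = (y - plane["top"]) / (plane["bottom"] - plane["top"])
    return u, v                                       # GUI at (u, v) is controlled

plane = {"left": 0.0, "top": 0.0, "right": 400.0, "bottom": 300.0, "depth_mm": 600.0}
assert process_hand_sample(200.0, 150.0, 640.0, plane) == (0.5, 0.5)
assert process_hand_sample(200.0, 150.0, 610.0, plane) is None
```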
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.
10: Touch-motion input system
100: Control target display
110: graphical user interface (GUI)
200: Touch-motion input device
300: virtual plane
Claims (11)
A method of processing an input using a touch motion, the method comprising: sensing, by a virtual surface touch detection module, a touch motion on a virtual surface corresponding to a control target display; and controlling, by a graphical user interface (GUI) control module, a GUI of the control target display according to a detection result.
Wherein the touch motion is sensed based on a depth of the touch motion with respect to the virtual surface.
And controlling the GUI differently according to the depth.
Wherein the virtual surface is a curved surface or a spherical surface.
Wherein the GUI of the display is controlled differently according to the detected pattern of touch motions.
Wherein each of the first coordinates of the first coordinate system of the virtual surface corresponds to one of the second coordinates of the second coordinate system of the control target display.
When a touch motion corresponding to any one of the first coordinates is sensed, a GUI located at the second coordinate corresponding to the one first coordinate is selected.
Further comprising setting, by a virtual plane setting module, the virtual plane according to a virtual plane setting motion of the user before the sensing of the touch motion.
And setting the virtual plane by setting at least two of the corners of the virtual plane according to the virtual plane setting motion.
Wherein the virtual plane is set using the touch motion of the user.
And setting the virtual plane according to the virtual plane setting motion corresponding to a user's input trajectory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130031942A KR20140117067A (en) | 2013-03-26 | 2013-03-26 | Method for processing input by touch motion |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140117067A (en) | 2014-10-07 |
Family
ID=51990534
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130031942A KR20140117067A (en) | 2013-03-26 | 2013-03-26 | Method for processing input by touch motion |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140117067A (en) |
- 2013-03-26: KR application KR1020130031942A filed; published as KR20140117067A (en); status: not active (Application Discontinuation)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10761610B2 (en) | Vehicle systems and methods for interaction detection | |
US7411575B2 (en) | Gesture recognition method and touch system incorporating the same | |
EP2437147B1 (en) | Information processing device, information processing method, and program | |
US9996160B2 (en) | Method and apparatus for gesture detection and display control | |
US20130343607A1 (en) | Method for touchless control of a device | |
US9916043B2 (en) | Information processing apparatus for recognizing user operation based on an image | |
US20140317576A1 (en) | Method and system for responding to user's selection gesture of object displayed in three dimensions | |
US10156938B2 (en) | Information processing apparatus, method for controlling the same, and storage medium | |
WO2014106219A1 (en) | User centric interface for interaction with visual display that recognizes user intentions | |
US11054896B1 (en) | Displaying virtual interaction objects to a user on a reference plane | |
US20120007826A1 (en) | Touch-controlled electric apparatus and control method thereof | |
CN104317398A (en) | Gesture control method, wearable equipment and electronic equipment | |
US20120013556A1 (en) | Gesture detecting method based on proximity-sensing | |
CN105759955B (en) | Input device | |
US9122346B2 (en) | Methods for input-output calibration and image rendering | |
US11500453B2 (en) | Information processing apparatus | |
CN104978018B (en) | Touch system and touch method | |
JP2014123316A (en) | Information processing system, information processing device, detection device, information processing method, detection method, and computer program | |
JP2013109538A (en) | Input method and device | |
KR20140117067A (en) | Method for processing input by touch motion | |
TWI444875B (en) | Multi-touch input apparatus and its interface method using data fusion of a single touch sensor pad and imaging sensor | |
EP3059664A1 (en) | A method for controlling a device by gestures and a system for controlling a device by gestures | |
US10175825B2 (en) | Information processing apparatus, information processing method, and program for determining contact on the basis of a change in color of an image | |
KR20140066378A (en) | Display apparatus and method of controlling the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E601 | Decision to refuse application |