KR20140117067A - Method for processing input by touch motion - Google Patents

Method for processing input by touch motion

Info

Publication number
KR20140117067A
Authority
KR
South Korea
Prior art keywords
virtual plane
touch
motion
setting
gui
Prior art date
Application number
KR1020130031942A
Other languages
Korean (ko)
Inventor
김상식
정성관
이원겸
박준석
신용철
Original Assignee
한국과학기술원 (Korea Advanced Institute of Science and Technology, KAIST)
Priority date
Filing date
Publication date
Application filed by 한국과학기술원
Priority to KR1020130031942A
Publication of KR20140117067A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

A method for processing an input using a touch motion comprises: detecting, by a virtual surface touch sensing module, a touch motion on a virtual surface corresponding to a display to be controlled; and controlling, by a graphical user interface (GUI) control module, a GUI of the display to be controlled according to the detection result.

Description

METHOD FOR PROCESSING INPUT BY TOUCH MOTION

An embodiment according to the concept of the present invention relates to a method of processing an input using a touch motion, and more particularly to a method of detecting the touch motion using a virtual plane and processing input according to a detection result.

Electronic devices such as smart televisions, internet televisions, connected televisions, and open hybrid televisions are typically controlled using a remote controller.

A remote controller makes such an electronic device easy to control, but as the device becomes more complex, a correspondingly more complex remote controller is required. In particular, as the number of buttons on the remote controller increases, memorizing the function of each button becomes inconvenient.

In addition, electronic devices controlled through motion recognition are operated by predefined operations, such as special hand shapes, flexion of the joints, and hand movements. The user therefore has the inconvenience of memorizing these predefined operations.

According to an aspect of the present invention, there is provided a method of detecting touch motion using a virtual surface and processing input according to a detection result.

A method of processing an input using a touch motion according to an exemplary embodiment of the present invention includes sensing, by a virtual surface touch sensing module, a touch motion on a virtual surface corresponding to a control target display, and controlling, by a graphical user interface (GUI) control module, a GUI of the control target display according to the detection result.

According to an embodiment, the virtual surface touch sensing module may sense the touch motion based on the depth of the touch motion through the virtual surface.

According to an embodiment, the GUI control module can control the GUI differently according to the depth.

According to an embodiment, the virtual surface may be a curved surface or a spherical surface.

According to an embodiment, the GUI control module may control the GUI of the control target display differently according to the detected pattern of the touch motion.

According to an embodiment, each of the first coordinates of the first coordinate system of the virtual surface may correspond one-to-one to each of the second coordinates of the second coordinate system of the controlled display.

According to an embodiment, when a touch motion corresponding to any one of the first coordinates is sensed, the GUI control module may select the GUI located at the second coordinate corresponding to that first coordinate.

According to an embodiment, the method may further include, before the sensing of the touch motion, setting, by a virtual plane setting module, the virtual plane according to a virtual plane setting motion of the user.

According to an embodiment, the step of setting the virtual plane may set the virtual plane by setting at least two of the corners of the virtual plane according to the virtual plane setting motion.

According to an embodiment, the step of setting the virtual plane may set the virtual plane using the touch motion of the user.

According to an embodiment, the step of setting the virtual plane may set the virtual plane according to the virtual plane setting motion corresponding to the input trajectory of the user.

The method according to the embodiment of the present invention has an effect that a graphical user interface (GUI) can be easily controlled through a touch motion that touches a virtual surface.

Further, there is an effect that the virtual plane can be set as desired by the user through the virtual plane setting motion.

In order to more fully understand the drawings recited in the detailed description of the present invention, a brief description of each drawing is provided below.
FIG. 1 is a conceptual diagram of a touch-motion input system according to an embodiment of the present invention.
FIG. 2 is a view for explaining a method of receiving a touch motion using the virtual plane shown in FIG. 1.
FIG. 3 is a block diagram of an embodiment of the touch-motion input device shown in FIG. 1.
FIG. 4 is a view for explaining an embodiment of a method of setting the virtual plane shown in FIG. 1.
FIG. 5 is a view for explaining another embodiment of the method of setting the virtual plane shown in FIG. 1.
FIG. 6 is a view for explaining still another embodiment of the method of setting the virtual plane shown in FIG. 1.
FIG. 7 is a flowchart of a method of processing input using touch motion according to an embodiment of the present invention.

Specific structural and functional descriptions of the embodiments disclosed herein are provided merely to illustrate embodiments of the inventive concept. The embodiments may be implemented in various forms and should not be construed as being limited to those set forth herein.

Since embodiments according to the inventive concept may be variously modified and may take various forms, specific embodiments are illustrated in the drawings and described in detail herein. This is not intended, however, to limit the embodiments to the particular forms disclosed; rather, they include all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Terms such as first and/or second may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of the inventive concept, a first element may be termed a second element, and similarly a second element may be termed a first element.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other expressions describing the relationship between elements, such as "between" versus "directly between" or "adjacent to" versus "directly adjacent to", should be interpreted in the same manner.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. The singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises" and "having", when used in this specification, specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

As used herein, a module may refer to a functional or structural combination of hardware for performing the method according to an embodiment of the present invention, or of software capable of driving such hardware.

Accordingly, a module may refer to a logical unit of program code together with the hardware resources capable of executing that code, and does not necessarily refer to physically connected code or to a single kind of hardware.

BEST MODE FOR CARRYING OUT THE INVENTION Hereinafter, the present invention will be described in detail with reference to preferred embodiments thereof and the accompanying drawings.

FIG. 1 is a conceptual diagram of a touch-motion input system according to an embodiment of the present invention.

Referring to FIG. 1, a touch motion input system 10 may include a display 100, a touch motion input device 200, and a virtual surface 300.

The touch-motion input system 10 may be implemented as any of various types of systems that include the display 100. In particular, the touch-motion input system 10 may be implemented as a system including a TV (e.g., a smart television, internet television, connected television, or open hybrid television).

The display 100 may be connected to various devices to display a screen processed by the devices. Display 100 may display various types of graphical user interfaces (e.g., 110).

The touch motion input device 200 may sense a touch motion input by a user through a touch point TP on the virtual surface 300 and may control the GUI 110 corresponding to the touch point TP.

According to an embodiment, the touch-motion input device 200 may include a 3D image sensor (e.g., a time of flight (TOF) image sensor) for sensing the user's touch motion.

The virtual plane 300 does not exist in a physical form, but may be a virtual region set or defined by the touch motion input device 200 in order to receive a touch motion. Depending on the embodiment, the virtual plane 300 may be set or defined in a planar, curved, or spherical shape.

The touch point TP may refer to a point or coordinate at which the user's touch motion is input (or detected) on the virtual surface 300.

Each of the coordinates included in the coordinate system of the virtual plane 300 may correspond to each of the coordinates included in the coordinate system of the display 100 in a one-to-one correspondence. That is, a specific point of the virtual plane 300 may match a specific point of the display 100.
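The one-to-one correspondence between virtual-plane coordinates and display coordinates can be pictured as a simple scaling. The following Python snippet is a minimal sketch under the assumption of a flat, rectangular virtual plane parallel to a rectangular display; the patent does not prescribe any particular mapping formula, and the function name, parameters, and dimensions here are hypothetical.

```python
# Minimal sketch (assumed linear mapping): map a touch point (u, v) on the virtual
# plane, measured in metres from its top-left corner, to a display pixel (x, y).
def plane_to_display(u: float, v: float,
                     plane_width: float, plane_height: float,
                     display_width: int, display_height: int) -> tuple[int, int]:
    x = int(round(u / plane_width * (display_width - 1)))
    y = int(round(v / plane_height * (display_height - 1)))
    return x, y

# Example: the centre of a 0.6 m x 0.4 m virtual plane maps to the centre
# of a 1920 x 1080 display.
print(plane_to_display(0.3, 0.2, 0.6, 0.4, 1920, 1080))  # (960, 540)
```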

The touch motion will be described in detail with reference to FIG. 2.

FIG. 2 is a view for explaining a method of receiving a touch motion using the virtual plane shown in FIG.

Referring to FIGS. 1 and 2, when an input of the user, for example an input using the hand, passes a predetermined depth d or more in the rearward direction of the virtual plane 300, the touch-motion input device 200 can recognize the input as a touch motion.

That is, a touch motion may refer to a user input that has passed a certain depth d or more in the rearward direction of the virtual plane 300.
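As a rough illustration of this depth test, the sketch below assumes the virtual plane is parallel to the display and that the 3D image sensor reports the distance from the user's hand to the display; the variable names and numeric values are assumptions chosen only to show the comparison against the predetermined depth d.

```python
# Sketch only: distances in metres, values assumed for illustration.
PLANE_TO_DISPLAY = 1.50   # distance between the virtual plane and the display
TOUCH_DEPTH_D = 0.03      # predetermined depth d beyond the virtual plane

def is_touch_motion(hand_to_display: float) -> bool:
    """True when the hand has passed depth d or more behind the virtual plane."""
    depth_through_plane = PLANE_TO_DISPLAY - hand_to_display
    return depth_through_plane >= TOUCH_DEPTH_D

print(is_touch_motion(1.49))  # False: only 1 cm past the plane
print(is_touch_motion(1.45))  # True: 5 cm past the plane, beyond d = 3 cm
```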

FIG. 3 is a block diagram of an embodiment of the touch-motion input device shown in FIG. 1.

Referring to FIGS. 1 to 3, the touch-motion input device 200 may include a memory 205, a virtual surface touch sensing module 210, a GUI control module 220, and a virtual plane setting module 230.

Each of the virtual surface touch sensing module 210, the GUI control module 220, and the virtual plane setting module 230 may be functionally and logically separated from the others; this does not mean that each component must be implemented as a separate physical device or written as separate code.

The memory 205 may store data necessary for operation of the virtual surface touch detection module 210, the GUI control module 220, and the virtual plane setting module 230, respectively.

The virtual surface touch sensing module 210 may sense the user's touch motion with respect to the virtual surface 300 corresponding to the display 100.

The virtual surface touch sensing module 210 may include a 3D image sensor (not shown).

For example, when the distance between the user's finger sensed by the 3D image sensor (not shown) and the display 100 becomes less than the distance between the virtual surface 300 and the display 100, the virtual surface touch sensing module 210 may transmit information about the touch point TP (for example, coordinate information of the touch point TP) to the GUI control module 220.

According to another embodiment, the virtual surface touch sensing module 210 may also provide the GUI control module 220 with information about the depth d of the user's finger through the virtual surface 300, in addition to the information about the touch point TP.

Based on the information about the touch point TP transmitted from the virtual surface touch sensing module 210, the GUI control module 220 may transmit to the display 100 a signal for controlling the GUI 110 corresponding to the touch point TP. For example, the GUI control module 220 may send a signal to the display 100 to select the GUI 110.

The GUI control module 220 may transmit to the display 100 a signal for controlling the GUI 110 differently according to the pattern of the user's touch motion sensed by the virtual surface touch sensing module 210.

For example, when the user's touch motion is input twice at the same touch point TP, the GUI control module 220 may transmit to the display 100 a signal for double-clicking the GUI 110.

When the touch point TP moves continuously while the user's touch motion remains detected, the GUI control module 220 may transmit to the display 100 a signal for dragging the GUI 110.

According to another embodiment, the GUI control module 220 may provide the display 100 with a signal for controlling the GUI 110 differently according to the information about the depth d transmitted from the virtual surface touch sensing module 210.
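A rough sketch of how such a GUI control module might dispatch the behaviours just described (select, double-click, drag, and depth-dependent control) is given below. The event structure, the 0.4-second double-tap window, the 5 cm depth threshold, and the command strings are assumptions for illustration only; the patent leaves these choices open.

```python
# Hedged sketch of a GUI control module's dispatch logic; all thresholds are assumed.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: int        # display coordinate matched to the touch point TP
    y: int
    depth: float  # depth d through the virtual surface, in metres
    time: float   # timestamp, in seconds

class GuiControlModule:
    DOUBLE_TAP_WINDOW = 0.4  # seconds (assumed)
    HARD_PRESS_DEPTH = 0.05  # metres (assumed)

    def __init__(self) -> None:
        self.last_tap = None  # previous touch-down event, if any

    def on_touch_down(self, ev: TouchEvent) -> str:
        """A touch motion is newly detected at a touch point."""
        if (self.last_tap is not None
                and ev.time - self.last_tap.time <= self.DOUBLE_TAP_WINDOW
                and (ev.x, ev.y) == (self.last_tap.x, self.last_tap.y)):
            self.last_tap = None
            return f"double-click GUI at ({ev.x}, {ev.y})"
        self.last_tap = ev
        # Depth-dependent control: a deeper push maps to a different command.
        action = "hard-select" if ev.depth >= self.HARD_PRESS_DEPTH else "select"
        return f"{action} GUI at ({ev.x}, {ev.y})"

    def on_touch_move(self, ev: TouchEvent) -> str:
        """The touch point moves while the touch motion remains detected."""
        return f"drag GUI to ({ev.x}, {ev.y})"
```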

The virtual plane setting module 230 can detect the virtual plane setting motion of the user and set or define the virtual plane 300 according to the virtual plane setting motion.

According to an embodiment, the virtual plane setting module 230 may include an image sensor for sensing the virtual plane setting motion of the user. According to another embodiment, the virtual plane setting module 230 may share an image sensor with the virtual surface touch sensing module 210.

The virtual plane setting motion refers to any of various motions that a user can take to set or define the virtual plane 300. The virtual plane setting motion will be described in detail with reference to FIGS. 4 to 6.

FIG. 4 is a view for explaining an embodiment of a method of setting the virtual plane shown in FIG. 1. FIG. 5 is a view for explaining another embodiment of the method of setting the virtual plane shown in FIG. 1. FIG. 6 is a view for explaining still another embodiment of the method of setting the virtual plane shown in FIG. 1.

Referring to FIG. 4, a user may set or define the virtual plane 300 through a virtual plane setting motion that sets at least two of the corners of the virtual plane 300. For example, the two corners may be positioned diagonally to each other.

In this case, the vertical extent of the virtual plane 300 may be set by the thumb of the left hand and the little finger of the right hand, and the horizontal extent of the virtual plane 300 may be set by the thumb of the left hand and the thumb of the right hand.

Referring to FIG. 5, the virtual plane setting motion may have an input trajectory TR, and the virtual plane setting module 230 may set or define the virtual plane 300 using the input trajectory TR.

In this case, the virtual plane setting module 230 can set the virtual plane 300 having the start point and the end point of the input trajectory TR as the corners.

Referring to FIG. 6, the virtual plane setting motion may be performed using a touch motion.

Each of the touch points TP1 to TP4 is a point at which the user's touch motion is sensed, and the virtual plane setting module 230 may set or define the virtual plane 300 having the touch points TP1 to TP4 as its corners. According to an embodiment, the virtual surface 300 may be set using only two touch points, and the scope of the present invention is not limited by the number of touch points used to set or define the virtual surface 300.
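The three plane-setting embodiments above (two diagonal corners, the start and end of an input trajectory, or several touch points) can all be reduced to taking the bounding rectangle of the defining points. The sketch below assumes the sensor reports each point as an (x, y, z) position and that the resulting virtual plane is a flat, axis-aligned rectangle at a single depth; the patent also allows curved and spherical surfaces, which this illustration does not cover, and the names used are hypothetical.

```python
# Minimal sketch under the assumptions stated above; names are illustrative only.
def set_virtual_plane(points: list[tuple[float, float, float]]) -> dict[str, float]:
    """Define a rectangular virtual plane as the bounding box of the given points."""
    if len(points) < 2:
        raise ValueError("at least two points are needed to define the plane")
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    return {
        "left": min(xs), "right": max(xs),
        "bottom": min(ys), "top": max(ys),
        "depth": sum(zs) / len(zs),  # plane depth taken as the mean point depth
    }

# Two diagonally opposite corners, as in the setting motion of FIG. 4:
print(set_virtual_plane([(-0.3, -0.2, 1.5), (0.3, 0.2, 1.5)]))
```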

FIG. 7 is a flowchart of a method of processing input using touch motion according to an embodiment of the present invention.

Referring to FIGS. 1 to 7, the virtual surface touch sensing module 210 may sense a touch motion with respect to the virtual surface 300 (S10).

According to an embodiment, before step S10, the virtual plane setting module 230 may set or define the virtual plane 300 according to the virtual plane setting motion.

The GUI control module 220 generates a signal for controlling the GUI 110 displayed on the display 100 according to the detection result, and the GUI 110 can be controlled by the signal (S12).

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

10: Touch-motion input system
100: Control target display
110: graphical user interface (GUI)
200: Touch-motion input device
300: virtual surface

Claims (11)

1. A method of processing an input using a touch motion, the method comprising:
sensing, by a virtual surface touch sensing module, a touch motion with respect to a virtual surface corresponding to a control target display; and
controlling, by a graphical user interface (GUI) control module, a GUI of the control target display according to a detection result.
2. The method of claim 1, wherein the virtual surface touch sensing module senses the touch motion based on a depth of the touch motion through the virtual surface.
3. The method of claim 2, wherein the GUI control module controls the GUI differently according to the depth.
4. The method of claim 1, wherein the virtual surface is a curved surface or a spherical surface.
5. The method of claim 1, wherein the GUI control module controls the GUI of the control target display differently according to a detected pattern of the touch motion.
6. The method of claim 1, wherein each of first coordinates of a first coordinate system of the virtual surface corresponds one-to-one to each of second coordinates of a second coordinate system of the control target display.
7. The method of claim 6, wherein, when a touch motion corresponding to any one of the first coordinates is sensed, the GUI control module selects a GUI located at the second coordinate corresponding to the one of the first coordinates.
8. The method of claim 1, further comprising, before the sensing of the touch motion, setting, by a virtual plane setting module, the virtual plane according to a virtual plane setting motion of a user.
9. The method of claim 8, wherein the setting of the virtual plane sets the virtual plane by setting at least two of corners of the virtual plane according to the virtual plane setting motion.
10. The method of claim 8, wherein the setting of the virtual plane sets the virtual plane using a touch motion of the user.
11. The method of claim 8, wherein the setting of the virtual plane sets the virtual plane according to the virtual plane setting motion corresponding to an input trajectory of the user.
KR1020130031942A 2013-03-26 2013-03-26 Method for processing input by touch motion KR20140117067A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130031942A KR20140117067A (en) 2013-03-26 2013-03-26 Method for processing input by touch motion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130031942A KR20140117067A (en) 2013-03-26 2013-03-26 Method for processing input by touch motion

Publications (1)

Publication Number Publication Date
KR20140117067A true KR20140117067A (en) 2014-10-07

Family

ID=51990534

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130031942A KR20140117067A (en) 2013-03-26 2013-03-26 Method for processing input by touch motion

Country Status (1)

Country Link
KR (1) KR20140117067A (en)


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application