KR20140112323A - Directional gesture recognition method, apparatus thereof, and medium storing program source thereof - Google Patents

Directional gesture recognition method, apparatus thereof, and medium storing program source thereof

Info

Publication number
KR20140112323A
Authority
KR
South Korea
Prior art keywords
directional gesture
function corresponding
angle
executable
gesture recognition
Prior art date
Application number
KR1020130026944A
Other languages
Korean (ko)
Inventor
조정호
김문수
김두욱
박태건
Original Assignee
삼성전자주식회사
Priority date
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR1020130026944A priority Critical patent/KR20140112323A/en
Publication of KR20140112323A publication Critical patent/KR20140112323A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a directional gesture recognition method, an apparatus thereof, and a recording medium storing a program source for the method. According to an embodiment of the present invention, a directional gesture recognition method comprises the steps of: determining, while performing the function corresponding to the angle at which a directional gesture is made with reference to a first table in which a function is mapped to each angle range, whether a function corresponding to one of the angle ranges cannot be performed; and, when it is determined that a non-executable function exists among the functions, performing the function corresponding to the angle at which a directional gesture is made with reference to a second table in which angle ranges different from those of the first table are mapped to the functions. According to the present invention, user convenience can be improved by adjusting the directional gesture recognition range depending on the circumstances.

Description

DIRECTIONAL GESTURE RECOGNITION METHOD, APPARATUS THEREOF, AND MEDIUM STORING PROGRAM SOURCE THEREOF

The present invention relates to a directional gesture recognition method and apparatus, and a recording medium storing a program source for the method.

In recent years, owing to the development of multimedia and communication technologies, various terminals such as smartphones have become widespread. These terminals are equipped with a variety of applications that provide convenience to the user, and, for ease of use, such terminals are typically equipped with a touch screen.

On such a terminal, the user can perform a specific function through various operations such as tapping or swiping the touch screen. As a method of performing different operations according to a directional gesture such as a swipe, the terminal refers, for example, to a table in which a function is mapped to each angle range, as shown in FIG. 1(a), and performs the function corresponding to the angle of the gesture. This is described in more detail below.

For example, as shown in FIG. 1(b), when the angle α made by the directional gesture is 230°, the terminal refers to the table of FIG. 1(a) and performs the "flip down" function corresponding to 230°. Likewise, as shown in FIG. 1(c), when the angle α made by the directional gesture is 200°, the terminal refers to the table of FIG. 1(a) and performs the "flip left" (right-to-left) function corresponding to 200°.
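To make this table-driven dispatch concrete, the sketch below shows one way such a first table could be represented and consulted. It is only an illustration: the range boundaries and function names are assumptions based on FIG. 1 and the e-book example, not the patent's actual implementation.

```python
# Illustrative "first table": each entry maps an angular range [low, high)
# in degrees to the function performed for a gesture at that angle.
# Boundaries and function names are assumptions based on FIG. 1.
FIRST_TABLE = [
    ((315, 360), "flip_right"),
    ((0, 45),    "flip_right"),
    ((45, 135),  "flip_up"),
    ((135, 225), "flip_left"),
    ((225, 315), "flip_down"),
]

def lookup(table, angle):
    """Return the function mapped to the angular range containing `angle`."""
    angle %= 360
    for (low, high), function in table:
        if low <= angle < high:
            return function
    return None

print(lookup(FIRST_TABLE, 230))  # flip_down, as in FIG. 1(b)
print(lookup(FIRST_TABLE, 200))  # flip_left, as in FIG. 1(c)
```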

However, depending on the type of application or on a specific screen, there may be no function to perform for a directional gesture in a particular direction. For example, when only the left and right flip functions are assigned for turning pages while reading an electronic book, no function is assigned to the angular ranges 45°≤α<135° and 225°≤α<315°, and recognizing directional gestures in those ranges may be wasteful.

Accordingly, the present invention provides a method for adjusting the recognition range of a directional gesture to which a specific function is assigned.

Other objects of the present invention can be understood from the following embodiments.

A directional gesture recognition method according to an embodiment of the present invention includes: determining, while performing a function corresponding to the angle at which a directional gesture is made with reference to a first table in which a function is mapped to each angle range, whether a function corresponding to any one of the angle ranges cannot be executed; and, when it is determined that one of the functions cannot be executed, performing the function corresponding to the angle at which a directional gesture is made with reference to a second table in which angle ranges different from those of the first table are mapped to the functions.

Meanwhile, a directional gesture recognition apparatus according to an embodiment of the present invention includes: a memory unit storing a first table in which a function is mapped to each angle range and a second table in which angle ranges different from those of the first table are mapped to the functions; a sensor unit that senses a directional gesture and calculates the angle of the sensed gesture; and a control unit that determines, while performing the function corresponding to the angle at which the directional gesture is made with reference to the first table, whether a function corresponding to any one of the angle ranges cannot be executed, and that, when such a non-executable function exists, performs the function corresponding to the angle at which the directional gesture is made with reference to the second table.

Meanwhile, a recording medium according to an embodiment of the present invention records a program that performs the steps of: determining, while performing the function corresponding to the angle at which a directional gesture is made with reference to a first table in which a function is mapped to each angle range, whether a function corresponding to any one of the angle ranges cannot be executed; and, when it is determined that one of the functions cannot be executed, performing the function corresponding to the angle at which a directional gesture is made with reference to a second table in which angle ranges different from those of the first table are mapped to the functions.

According to the present invention, user convenience can be improved by adjusting the directional gesture recognition range according to the situation.

FIG. 1 is an exemplary diagram for explaining a method of performing a function according to a directional gesture;
FIG. 2 is a flowchart illustrating a directional gesture recognition method according to an embodiment of the present invention;
FIGS. 3 and 4 are diagrams for explaining a directional gesture recognition method according to an embodiment of the present invention;
FIG. 5 is an exemplary diagram for explaining a process of obtaining the angle at which a directional gesture is made using an infrared sensor; and
FIG. 6 is a block diagram of a directional gesture recognition apparatus to which embodiments of the present invention are applied.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In the following description of the present invention, detailed descriptions of known functions and configurations are omitted where they would obscure the subject matter of the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.

FIG. 2 is a flowchart illustrating a directional gesture recognition method according to an embodiment of the present invention.

In step 201, the terminal performs the function corresponding to the angle at which a directional gesture is made with reference to the first table, and then proceeds to step 203. Here, the first table is a table in which a function is mapped to each angle range; for example, as shown in FIG. 3(a), when a directional gesture falling within a specific angle range is recognized, the function assigned to that range is performed. A different first table may be assigned to each running application or to each screen being displayed on the terminal.

In step 203, the terminal determines whether the function corresponding to any one of the angle ranges defined in the first table cannot be executed. For example, if only the 'flip left' and 'flip right' operations are assigned as functions that can be performed while an electronic book is displayed, the terminal can determine that the 'flip up' and 'flip down' functions cannot be performed. That is, the terminal can determine whether a function corresponding to any one of the angle ranges defined in the first table cannot be executed on the currently displayed screen. Alternatively, the terminal can make this determination for an application as a whole when the application is executed, for example by analyzing the application's source code to check whether a function is assigned to each angle range on each of the screens the application can provide.
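As a rough sketch of the determination in step 203, the terminal could compare the functions referenced by the first table against the set of functions available on the current screen. The structure below continues the earlier sketch and is an assumption rather than the patent's implementation; in particular, `available_functions` stands in for whatever the terminal derives from the displayed screen or from analyzing the application.

```python
def find_unexecutable(table, available_functions):
    """Return the functions in `table` that the current screen cannot perform."""
    mapped = {function for _range, function in table}
    return mapped - set(available_functions)

# E-book example: only left/right page flips are assigned on the current screen.
unexecutable = find_unexecutable(FIRST_TABLE, {"flip_left", "flip_right"})
print(unexecutable)  # {'flip_up', 'flip_down'} -> step 203 branches to step 205
```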

If it is determined in step 203 that a function corresponding to one of the angle ranges defined in the first table cannot be executed, the terminal proceeds to step 205 and performs the function corresponding to the angle at which the directional gesture is made with reference to the second table. Here, the second table is a table in which angle ranges different from those of the first table are mapped to the same functions. For example, the second table may be a table in which the angle range mapped to the non-executable function is narrower than in the first table, and the angle range mapped to an executable function is correspondingly wider than in the first table.

For example, as shown in FIG. 3(b), the angle range mapped to the non-executable function (70°≤α<110°) is narrower than the corresponding range in the first table (45°≤α<135°). Likewise, as shown in FIG. 3(b), the angle range mapped to the executable function (0°≤α<70°, 290°≤α<360°) is wider than the corresponding range in the first table (0°≤α<45°, 315°≤α<360°).

That is, according to the embodiment of the present invention, as shown in FIG. 3(c), widening the angle range mapped to an executable function widens the range over which the corresponding directional gesture is recognized, which improves usability.

As shown in FIG. 4(a), when the angle at which the directional gesture is made (230°) is close to the vertical direction on a particular screen, screen switching is not performed according to the conventional art, as shown in FIG. 4(b). According to the embodiment of the present invention, however, the directional gesture may be recognized as a leftward page-turning operation, so that screen switching is performed as shown in FIG. 4(c).
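Continuing the earlier sketch, a second table matching the ranges of FIG. 3(b) could look like the following. The [250°, 290°) range for the downward flip is an assumption made by symmetry; the description states only the 70°–110° range explicitly.

```python
# Illustrative "second table" for the e-book case of FIG. 3(b): the ranges mapped
# to the non-executable up/down flips shrink, and the ranges mapped to the
# executable left/right flips widen accordingly.
SECOND_TABLE = [
    ((290, 360), "flip_right"),
    ((0, 70),    "flip_right"),
    ((70, 110),  "flip_up"),     # narrowed from [45, 135)
    ((110, 250), "flip_left"),   # widened from [135, 225)
    ((250, 290), "flip_down"),   # narrowed from [225, 315) (assumed by symmetry)
]

# The near-vertical 230-degree gesture of FIG. 4 now falls inside [110, 250)
# and is recognised as a page-turning "flip_left" instead of the unavailable
# "flip_down".
print(lookup(SECOND_TABLE, 230))  # flip_left
```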

On the other hand, while functions are being performed according to directional gestures with reference to the second table, the screen may be switched, the application may be terminated, or a new application may be executed, in which case the functions corresponding to all angle ranges may become executable again.

To this end, in step 207, the terminal determines whether the functions corresponding to all of the angle ranges defined in the first table are executable. This determination can be made whenever the currently displayed screen is switched, the currently running application is terminated, or a new application is executed.

If it is determined in step 207 that the functions corresponding to all of the angle ranges defined in the first table are executable, the terminal returns to step 201 and again performs, with reference to the first table, the function corresponding to the angle at which a directional gesture is made.
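Putting steps 201 to 207 together, the overall control flow might be sketched as follows, again continuing the earlier sketches; the function and variable names are assumptions.

```python
def handle_gesture(angle, available_functions):
    """Sketch of the step 201-207 loop: choose a table, then dispatch the gesture.

    `available_functions` is re-evaluated whenever the displayed screen is
    switched, the running application terminates, or a new application starts.
    """
    every_function = {function for _range, function in FIRST_TABLE}
    if every_function <= set(available_functions):
        table = FIRST_TABLE    # steps 201/207: every mapped function is executable
    else:
        table = SECOND_TABLE   # step 205: fall back to the adjusted angle ranges
    function = lookup(table, angle)
    return function if function in available_functions else None

print(handle_gesture(230, {"flip_left", "flip_right"}))                          # flip_left
print(handle_gesture(230, {"flip_left", "flip_right", "flip_up", "flip_down"}))  # flip_down
```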

Meanwhile, the angle at which the directional gesture is made can be obtained in various ways. For example, when an infrared sensor is used as shown in FIG. 5, the terminal can determine the angle α at which the directional gesture is made based on the sensing information obtained from each of the regions 502, 504, 506, and 508, which detect gestures in different directions. The angle may also be obtained using any other sensor capable of sensing a directional gesture, using a camera mounted on the terminal, or, according to an embodiment, from the sensing information obtained on the touch screen.
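For the touch-screen case mentioned above, one straightforward way to obtain the angle is from the start and end coordinates of the swipe. The sketch below is an assumption illustrating that idea; it does not describe the infrared-sensor processing of FIG. 5.

```python
import math

def gesture_angle(x0, y0, x1, y1):
    """Angle of a swipe from (x0, y0) to (x1, y1), in degrees in [0, 360).

    Screen coordinates typically grow downward, so the vertical displacement
    is negated to keep 90 degrees pointing up, as in the figures.
    """
    return math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360

print(gesture_angle(100, 300, 300, 300))  # 0.0    -> rightward swipe
print(gesture_angle(300, 100, 100, 250))  # ~216.9 -> down-and-left swipe
```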

In the foregoing, a directional gesture recognition method according to an embodiment of the present invention has been described with reference to FIGS. 1 to 5. Hereinafter, a directional gesture recognition apparatus according to an embodiment of the present invention will be described with reference to FIG. 6.

FIG. 6 is a block diagram illustrating a directional gesture recognition apparatus to which embodiments of the present invention are applied. Referring to FIG. 6, the apparatus includes a control unit 610, a sensor unit 620, and a memory unit 630.

The control unit 610 determines, while performing the function corresponding to the angle at which a directional gesture is made with reference to the first table, whether a function corresponding to any one of the angle ranges cannot be executed, and, when such a non-executable function exists, performs the function corresponding to the angle at which the directional gesture is made with reference to the second table. The control unit 610 can make this determination for the currently displayed screen, or for an application as a whole when the application is executed. The control unit 610 also determines, while performing functions with reference to the second table, whether the functions corresponding to all of the angle ranges defined in the first table are executable, and, when they are, returns to the first table to perform the function corresponding to the angle at which a directional gesture is made.

The sensor unit 620 senses a directional gesture and calculates the angle of the sensed gesture. The sensor unit 620 can calculate the angle of the directional gesture using an infrared sensor, or using any other sensor capable of sensing a directional gesture. According to an embodiment, the directional gesture recognition apparatus may also be equipped with a camera capable of sensing a directional gesture.

The memory unit 630 stores the first table, in which a function is mapped to each angle range, and the second table, in which angle ranges different from those of the first table are mapped to the functions. The second table may be a table in which the angle range mapped to the non-executable function is narrower than in the first table, and in which the angle range mapped to an executable function is wider than in the first table. The directional gesture may include at least one of a swipe, a flick, a drag, and a pinch operation.
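Purely as an organisational sketch of FIG. 6 (the class and method names are assumptions, and `lookup` is the helper from the earlier sketches), the three units could be wired together as follows:

```python
class MemoryUnit:
    """Memory unit 630: stores the first and second tables."""
    def __init__(self, first_table, second_table):
        self.first_table = first_table
        self.second_table = second_table

class SensorUnit:
    """Sensor unit 620: senses a directional gesture and computes its angle."""
    def read_angle(self):
        raise NotImplementedError  # e.g. infrared sensor, camera, or touch screen

class Controller:
    """Control unit 610: selects a table and performs the mapped function."""
    def __init__(self, memory, sensor):
        self.memory = memory
        self.sensor = sensor

    def on_gesture(self, available_functions):
        angle = self.sensor.read_angle()
        every_function = {f for _rng, f in self.memory.first_table}
        table = (self.memory.first_table
                 if every_function <= set(available_functions)
                 else self.memory.second_table)
        return lookup(table, angle)
```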

The embodiments of the invention described above may be implemented in any of a variety of ways. For example, embodiments of the present invention may be implemented using hardware, software, or a combination thereof. When implemented in software, the embodiments may be realized as software running on one or more processors using any of various operating systems or platforms. Additionally, such software may be written in any of a number of suitable programming languages, and may be compiled into executable machine code or into intermediate code that runs in a framework or virtual machine.

Also, when embodiments of the present invention are implemented on one or more processors, the one or more programs for carrying out the methods of the various embodiments discussed above may be stored on a processor-readable medium (e.g., a memory, a floppy disk, a hard disk, a compact disc, an optical disc, a magnetic tape, or the like).

Claims (15)

1. A directional gesture recognition method comprising:
determining, while performing a function corresponding to the angle at which a directional gesture is made with reference to a first table in which a function is mapped to each angle range, whether a function corresponding to any one of the angle ranges cannot be executed; and
when it is determined that one of the functions cannot be executed, performing the function corresponding to the angle at which a directional gesture is made with reference to a second table in which angle ranges different from those of the first table are mapped to the functions.
2. The method of claim 1, wherein the second table is a table in which the angle range mapped to the non-executable function is narrower than in the first table.
3. The method of claim 2, wherein the second table is a table in which the angle range mapped to an executable function is wider than in the first table.
4. The method of claim 1, wherein the directional gesture includes at least one of a swipe, a flick, a drag, and a pinch operation.
5. The method of claim 1, wherein determining whether a function corresponding to any one of the angle ranges cannot be executed comprises determining whether a function corresponding to any one of the angle ranges cannot be executed on the currently displayed screen.
6. The method of claim 1, wherein determining whether a function corresponding to any one of the angle ranges cannot be executed comprises determining, when an application is executed, whether a function corresponding to any one of the angle ranges cannot be executed in the application.
7. The method of claim 1, further comprising, after performing the function corresponding to the angle at which the directional gesture is made with reference to the second table:
determining whether the functions corresponding to all of the angle ranges defined in the first table are executable; and
when it is determined that the functions corresponding to all of the angle ranges are executable, performing the function corresponding to the angle at which a directional gesture is made with reference to the first table.
8. A directional gesture recognition apparatus comprising:
a memory unit storing a first table in which a function is mapped to each angle range and a second table in which angle ranges different from those of the first table are mapped to the functions;
a sensor unit configured to sense a directional gesture and calculate the angle of the sensed directional gesture; and
a control unit configured to determine, while performing a function corresponding to the angle at which the directional gesture is made with reference to the first table, whether a function corresponding to any one of the angle ranges cannot be executed, and, when it is determined that one of the functions cannot be executed, to perform the function corresponding to the angle at which the directional gesture is made with reference to the second table.
9. The apparatus of claim 8, wherein the second table is a table in which the angle range mapped to the non-executable function is narrower than in the first table.
10. The apparatus of claim 9, wherein the second table is a table in which the angle range mapped to an executable function is wider than in the first table.
11. The apparatus of claim 8, wherein the directional gesture includes at least one of a swipe, a flick, a drag, and a pinch operation.
12. The apparatus of claim 8, wherein the control unit determines whether a function corresponding to any one of the angle ranges cannot be executed on the currently displayed screen.
13. The apparatus of claim 8, wherein the control unit determines, when an application is executed, whether a function corresponding to any one of the angle ranges cannot be executed in the application.
14. The apparatus of claim 8, wherein the control unit determines, while performing a function corresponding to the angle at which the directional gesture is made with reference to the second table, whether the functions corresponding to all of the angle ranges defined in the first table are executable, and, when it is determined that the functions corresponding to all of the angle ranges are executable, performs the function corresponding to the angle at which the directional gesture is made with reference to the first table.
15. A recording medium on which a program for performing the method according to any one of claims 1 to 7 is recorded.
KR1020130026944A 2013-03-13 2013-03-13 Directional gesture recognition method, apparatus thereof, and medium storing program source thereof KR20140112323A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130026944A KR20140112323A (en) 2013-03-13 2013-03-13 Directional gesture recognition method, apparatus thereof, and medium storing program source thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130026944A KR20140112323A (en) 2013-03-13 2013-03-13 Directional gesture recognition method, apparatus thereof, and medium storing program source thereof

Publications (1)

Publication Number Publication Date
KR20140112323A true KR20140112323A (en) 2014-09-23

Family

ID=51757347

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130026944A KR20140112323A (en) 2013-03-13 2013-03-13 Directional gesture recognition method, apparatus thereof, and medium storing program source thereof

Country Status (1)

Country Link
KR (1) KR20140112323A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination