KR20140109062A - Method and apparatus for gesture recognition - Google Patents

Method and apparatus for gesture recognition Download PDF

Info

Publication number
KR20140109062A
KR20140109062A
Authority
KR
South Korea
Prior art keywords
input
touch screen
screen display
input means
displayed
Prior art date
Application number
KR1020130023258A
Other languages
Korean (ko)
Inventor
정회상
Original Assignee
주식회사 인프라웨어 (Infraware Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 인프라웨어 (Infraware Co., Ltd.)
Priority to KR1020130023258A
Publication of KR20140109062A

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G06F3/04855Interaction with scrollbars

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a gesture recognition method and apparatus, and more particularly, to a gesture recognition method and apparatus using an input tool on a touch display.
A gesture recognition method in a touch screen display, in accordance with an embodiment of the present invention, includes the steps of detecting input coordinates of an interactable input means; receiving a button input of the input means; and, while the detected input coordinates are maintained in the area of a first object displayed on the touch screen display and the button input is received, changing the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
The present invention provides a gesture recognition method and apparatus capable of more easily controlling an inserted object when using an input means in an electronic device employing a touch-based input method.

Description

METHOD AND APPARATUS FOR GESTURE RECOGNITION

The present invention relates to a gesture recognition method and apparatus, and more particularly, to a gesture recognition method and apparatus using an input tool on a touch display.

The input method of electronic devices is gradually shifting from button or keypad input to touch-based input. The touch-based input method provides more varied input than buttons or keypads. In electronic devices adopting a touch-based input method, basic operations such as launching and selecting an application are easy, but the creation of electronic documents is still limited to the input of text and the insertion of simple figures or photographs.

In electronic devices employing a touch-based input method, various touch gestures and supplementary input means such as a stylus pen have been introduced to supplement an input method whose precision and degree of freedom are lower than those of a keyboard or mouse. However, while a stylus pen is useful for handwriting input such as writing or drawing, it remains merely a selection tool when editing objects such as an input figure or character.

Therefore, it has been difficult to edit figures and characters easily with touch-based input means. In addition, when using a stylus pen, a double tap (tapping twice in rapid succession) during web surfing or document viewing performs only a simple zoom in/zoom out and cannot finely control the degree of zoom, so a way to better utilize the stylus pen was needed.

Accordingly, it is an object of the present invention to provide a gesture recognition method and apparatus capable of more easily controlling an inserted object when using an input means in an electronic device employing a touch-based input method.

Another object of the present invention is to provide a gesture recognition method and apparatus capable of easily zooming in and out of a display without using a finger when using an input means in an electronic device adopting a touch-based input method.

Another object of the present invention is to provide a gesture recognition method and apparatus capable of providing an indicator that more efficiently displays the interval between an input means and a touch screen display in an electronic device employing a touch-based input method.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a method of recognizing a gesture in a touch screen display, comprising the steps of: detecting input coordinates of an interactable input means; receiving a button input of the input means; and, while the detected input coordinates are maintained in the area of a first object displayed on the touch screen display and the button input is received, changing the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.

According to another aspect of the present invention, the first object includes a selected object and the second object includes a non-selected object.

According to still another aspect of the present invention, the step of changing the displayed size includes changing the size of the selected object or the non-selected object.

According to another aspect of the present invention, the first object includes a zoom control object, and the second object includes a content display object.

According to another aspect of the present invention, the interactive input means is a stylus pen.

According to another aspect of the present invention, the method further comprises displaying an indicator corresponding to the distance between the touch screen display and the input means while the input of the input means continues in the area of the first object and the received button input continues.

According to another aspect of the present invention, an indicator is displayed on a touch screen display at a position where the input of the input means is applied.

According to another aspect of the present invention, the step of changing the displayed size comprises the steps of: determining the distance between the touch screen display and the input means as a reference distance when the input coordinates are detected in the area of the first object displayed on the touch screen display and the button input is received; and changing the size at which the first object, or a second object different from the first object, is displayed according to the difference between the reference distance and the changed distance.

According to another aspect of the present invention, the condition that the detected input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received is satisfied when the button input is received after the input of the input means has entered the area of the first object.

According to an aspect of the present invention, there is provided a gesture recognition apparatus including: a detection unit for detecting input coordinates of an interactable input means; a button input receiving unit for receiving a button input of the input means; and a processing unit for changing the size at which a first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means while the detected input coordinates continue in the area of the first object displayed on the touch screen display and the button input is received.

According to another aspect of the present invention, the first object includes a selected object and the second object includes a non-selected object.

According to another aspect of the present invention, the processing unit changes the size of the selected object or the non-selected object.

According to another aspect of the present invention, the first object includes a zoom control object, and the second object includes an object of the content display area.

According to another aspect of the present invention, the interactive input means is a stylus pen.

According to another aspect of the present invention, the apparatus further comprises an indicator display unit for displaying an indicator corresponding to the distance between the touch screen display and the input means when the input coordinates persist in the area of the first object displayed on the touch screen display.

According to another aspect of the present invention, an indicator is displayed on a touch screen display at a position where the input of the input means is applied.

According to another aspect of the present invention, the processing unit determines the distance between the touch screen display and the input means as a reference distance when the input coordinates continue in the area of the first object displayed on the touch screen display and a button input is received, and changes the size at which the first object, or a second object different from the first object, is displayed according to the difference between the reference distance and the changed distance.

According to another aspect of the present invention, the condition that the input coordinates continue in the area of the first object displayed on the touch screen display and the button input is received is satisfied when the button input is received after the input of the input means has entered the area of the first object.

The embodiments of the present invention have at least the following effects.

That is, it is possible to provide a gesture recognition method and apparatus that can more easily control an inserted object when using an input means in an electronic device employing a touch-based input method.

It is also possible to provide a gesture recognition method and apparatus capable of easily zooming in and out of a display without using a finger when using an input means in an electronic device employing a touch-based input method.

It is also possible to provide a gesture recognition method and apparatus capable of providing an indicator capable of more efficiently displaying an interval between an input means and a touch screen display in an electronic device employing a touch-based input method.

The effects of the present invention are not limited to those exemplified above; further effects are described throughout the specification.

FIG. 1 is a block diagram of a gesture recognition apparatus according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 3A to 3C are schematic views illustrating enlargement and reduction operations of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 4A to 4C are schematic views illustrating zoom-in and zoom-out operations of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 5A and 5B are schematic views illustrating an indicator of a gesture recognition apparatus according to an embodiment of the present invention.
FIG. 6 is a flowchart of a gesture recognition method according to an embodiment of the present invention.

The present invention is not limited to the embodiments disclosed below, but may be embodied in various forms without departing from the spirit and scope of the invention. The embodiments are provided to fully disclose the scope of the invention to a person skilled in the art, and the invention is defined only by the scope of the claims.

In the present specification, when one element 'transmits' data or a signal to another element, the element can transmit the data or signal directly to the other element, or the data or signal can be transmitted to the other element through at least one intermediate element.

In this specification, an electronic document means information transmitted, received, or stored in electronic form by an information processing system, and it can be used in electronic form or printed. The electronic document includes not only documents created using a document creation program such as Microsoft Office, Hancom Office of Hangul and Computer, or Adobe Acrobat, but also structured documents such as HTML, XML, and SGML, and may further include moving images or images that are created in electronic form and transmitted, received, or stored. The electronic document may also include documents generated by various memo programs implemented on an electronic device with a touch screen display. In addition, the electronic document includes all documents that store input, such as text and graphics, on an electronic device with a touch screen display.

FIG. 1 is a block diagram of a gesture recognition apparatus according to an embodiment of the present invention. Referring to FIG. 1, a gesture recognition apparatus 100 according to an exemplary embodiment of the present invention includes a detection unit 110, a button input receiving unit 120, a processing unit 130, and a display unit 140. The gesture recognition apparatus 100 is an apparatus including a touch screen display, and is configured to perform a function corresponding to a gesture input by a finger or by an interactable input means.

The interactable input means is a means for inputting a position of contact with the touch screen display, and may include any input means capable of interacting with the touch screen display. The interactable input means will be described later with reference to FIG. 2.

Further, in this specification, unless otherwise mentioned, an input of the interactable input means refers to an input generated while the input means and the touch screen display are not in contact. That is, a distinction is drawn between an input signal generated by the input means contacting the touch screen display and an input signal generated by the input means hovering at a certain distance above the touch screen display; in the following, the input is described as being made while the input means and the touch screen display are not in contact.

A gesture refers to a series of motions, or a change in input coordinates, input by an interactable input means on a touch screen device. A gesture may include one or more inputs, such as movement, and may involve a button input. In particular, in this specification, a gesture may include a variation in height, that is, the distance between the input means and the touch screen display, in addition to two-dimensional coordinate movement such as left, right, up, and down on the touch screen display.

The gesture recognition apparatus 100 may be a device or terminal including at least a microprocessor, a memory, and a communication module, and may be a portable terminal. When the gesture recognition apparatus 100 is a portable terminal, the portable terminal may be a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like, and may also be used as an accessory device. In FIG. 1, the gesture recognition apparatus 100 is assumed to be a smartphone for convenience of explanation, but the present invention is not limited thereto; the gesture recognition apparatus 100 may be any of the terminals listed above.

The detection unit 110 of the gesture recognition apparatus 100 acquires the coordinates input from the interactable input means and transmits them to the processing unit 130. For example, the detection unit 110 of the gesture recognition apparatus 100 according to an exemplary embodiment of the present invention detects a change in the mutual capacitance formed in a sensing cell where a driving line and a sensing line cross each other in the touch screen display.

When there is no conductive object in proximity to a sensing cell in the gesture recognition apparatus 100 according to the embodiment of the present invention, the mutual capacitance generated in the sensing cell does not change; when a conductive object approaches, a change in mutual capacitance occurs. The detection unit 110 of the gesture recognition apparatus 100 can obtain the input coordinates and transmit them to the processing unit 130 by detecting the position of the mutual capacitance change.
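As a non-authoritative illustration of this detection step (the grid values, threshold, and cell pitch below are assumptions for the sketch, not taken from the patent), locating the position of the strongest mutual-capacitance change in a sensor grid can be pictured as follows:

```python
# Illustrative sketch only: the patent does not specify this algorithm.
# cap_delta is an assumed grid of mutual-capacitance changes, one value per
# sensing cell (driving line x sensing line); the threshold and cell pitch
# are made-up parameters for the example.

def detect_input_coordinates(cap_delta, threshold=5.0, cell_pitch_mm=4.0):
    """Return the (x, y) position in mm of the strongest capacitance change,
    or None when no cell exceeds the threshold (no conductive object near)."""
    best = None
    for row_idx, row in enumerate(cap_delta):
        for col_idx, delta in enumerate(row):
            if delta >= threshold and (best is None or delta > best[0]):
                best = (delta, col_idx, row_idx)
    if best is None:
        return None
    _, col, row = best
    return (col * cell_pitch_mm, row * cell_pitch_mm)
```

A real touch controller would interpolate between neighboring cells for sub-cell accuracy; the coarse maximum here only illustrates the idea of mapping a capacitance change to input coordinates.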

In the present specification, position detection through a change in mutual capacitance is described by way of example for the gesture recognition apparatus 100, but the method of detecting the coordinate input is not limited thereto and may be performed by various methods.

The button input receiving unit 120 of the gesture recognition apparatus 100 receives the input of a button from the interactable input means. For example, the input means of the gesture recognition apparatus 100 according to an embodiment of the present invention may be configured to include at least one button. The user can press the button once or keep pressing it, and a predetermined operation can be performed depending on whether the button is pressed or the button input is continued. The button input receiving unit 120 of the gesture recognition apparatus 100 according to an exemplary embodiment of the present invention receives the input of the button and transmits it to the processing unit 130.

The processing unit 130 of the gesture recognition apparatus 100 according to an embodiment of the present invention changes the size of the object according to the distance between the touch screen display and the input means based on the input coordinates of the input means and the button input.

In this specification, an object refers to any kind of object that can be displayed on a device. That is, anything that can be displayed distinguishably on the display, so that its shape, form, color, luminance, contrast, contour, etc. are distinguished from one or more of its surroundings, may be defined as an object of the present invention. The object thus includes everything that is distinguishable within the image displayed by the touch screen display, and may include, for example, an object in an image, a background, a person, text, or an icon. An object may also be referred to herein as an area, a thing, an item, or the like. The area of an object means an area determined based on the boundary line of the object.

The processing unit 130 of the gesture recognition apparatus 100 performs various operations of the gesture recognition apparatus 100, which may be a mobile phone, a smartphone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like.

The processing unit 130 of the gesture recognition apparatus 100 according to an embodiment of the present invention can determine whether the input coordinates of the input means are within the area of an object displayed on the touch screen display. The touch screen display of the gesture recognition apparatus 100 according to an embodiment of the present invention may display an object. The processing unit 130 of the gesture recognition apparatus 100 can determine whether the current input position of the input means is within the area of the displayed object based on the input coordinates from the detection unit 110 described above.

The area of the displayed object refers to the area determined based on the boundary line surrounding the object. Determining the area based on the object's boundary means that the area of the object includes at least the region defined by the boundary of the object, and may also include the region surrounding the object.

In addition, the processing unit 130 of the gesture recognition apparatus 100 can determine whether the input coordinates are within the area of the object displayed on the touch screen display and, at the same time, whether there is a button input of the input means. In the gesture recognition apparatus 100 according to the present invention, the input of a button includes various button inputs, such as a continuous input of the button, that is, the user continuously pressing the button, or rapidly pressing the button twice.
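The two conditions just described, input coordinates inside the object's area and a simultaneous button input, can be sketched as a simple hit test. This is a hypothetical sketch: the rectangle representation and the margin value are assumptions for the example, not specified by the patent.

```python
# Hypothetical sketch of the two-condition check described above; the patent
# does not prescribe a rectangle representation or a specific margin.

def in_object_area(x, y, obj_rect, margin=0.0):
    """obj_rect = (left, top, right, bottom); the object's area may include
    a surrounding margin, as the description allows."""
    left, top, right, bottom = obj_rect
    return (left - margin <= x <= right + margin and
            top - margin <= y <= bottom + margin)

def gesture_armed(x, y, obj_rect, button_pressed, margin=2.0):
    """The resize gesture is armed only when the hover coordinate lies in the
    object's area AND a button input of the input means is present."""
    return button_pressed and in_object_area(x, y, obj_rect, margin)
```

Only when `gesture_armed` holds would subsequent changes in the hover distance be interpreted as a resize gesture.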

The processing unit 130 of the gesture recognition apparatus 100 determines whether the input of the input means continues within the area of the object displayed on the touch screen display; when the button input is received under that condition, the size at which the object is displayed can be changed according to the distance between the touch screen display and the input means.

Further, when the input coordinates continue in the area of the object displayed on the touch screen display and the button input is received, the processing unit 130 of the gesture recognition apparatus 100 determines the distance between the touch screen display and the input means as a reference distance, and changes the size at which the object is displayed according to the difference between the reference distance and the changed distance. In the gesture recognition apparatus 100 according to an exemplary embodiment of the present invention, the input must exist within the area of an object and the button input must be received at the same time before the distance between the input means and the touch screen display is determined as the reference distance, which makes natural control of the object possible.
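One way to picture the reference-distance mechanism is the following sketch. It is made under assumptions: the patent does not specify a linear mapping, the sensitivity parameter, or the clamping value used here.

```python
# Illustrative only: the mapping from distance difference to scale factor is
# an assumption; the patent only states that the size changes according to
# the difference between the reference distance and the changed distance.

def scale_from_distance(reference_d, current_d, sensitivity=1.0):
    """Scale factor > 1 when the pen is lifted above the reference distance,
    < 1 when it approaches the screen; clamped to stay positive."""
    return max(0.1, 1.0 + sensitivity * (current_d - reference_d))
```

At the moment the button input is received inside the object's area, `reference_d` is latched; subsequent hover samples then resize the object relative to that latched value, which is what makes the control feel natural as described above.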

It is also possible that the condition that the input coordinates persist in the area of the first object displayed on the touch screen display and the button input is received is satisfied when the button input is received after the input of the input means has entered the area of the first object.

A specific configuration in which the gesture recognition apparatus 100 changes the size at which an object is displayed according to the distance between the touch screen display and the input means will be described later with reference to FIGS. 3A to 4C.

Further, the processing unit 130 of the gesture recognition apparatus 100 may also change the size at which an object displayed on the touch screen display is displayed according to the distance between the touch screen display and the input means when the input of the input means is received after the object has been selected.

Changing the size at which an object is displayed can mean image processing that makes the displayed size smaller or larger. That is, the processing unit 130 of the gesture recognition apparatus 100 can adjust the degree of enlargement or reduction of the display according to the distance between the touch screen display and the input means.

In addition to changing the size at which the object is displayed, it is also possible to change the size of the object itself through image processing. For example, changing the size of the object may mean enlarging or reducing a displayed element, or changing the size of a graphic or font when editing an electronic document.

In the present specification, image processing or image correction refers to processing an input image through the processing unit 130 or a processor according to a purpose, and includes both analog signal processing and digital signal processing. Hereinafter, for convenience of description, image processing refers to digital image processing on an image, but it is not limited thereto and can be interpreted broadly. Further, image processing in this specification may include at least one of point processing, area processing, geometric processing, and frame processing. Point processing is performed on a pixel-by-pixel basis based on the position of each pixel. Area processing can change a pixel value based on the original value of the pixel and neighboring pixel values; geometric processing can change the position or arrangement of pixels; and frame processing may change pixel values based on operations on two or more images.

The gesture recognition apparatus 100 according to an embodiment of the present invention can thus change the size at which an object is displayed on the touch screen display, or the size of the object itself, according to the distance between the input means and the touch screen display.

The display unit 140 of the gesture recognition apparatus 100 receives an image from the processing unit 130 and displays it. The display unit 140 may be a touch screen display, but is not limited thereto within the scope of achieving the objects of the present invention.

FIG. 2 is a schematic diagram of a gesture recognition apparatus according to an embodiment of the present invention. FIG. 2 shows a touch screen display 210, a first object 220, an area 230 of the first object, a second object 240, an input means 250, and an input coordinate indicator 260.

The touch screen display 210 displays a document or an image. Hereinafter, the touch screen display 210 is shown as displaying an editable graphic in an electronic document, but it is not limited thereto; a web page, a photograph, an icon, and the like can be displayed by another application.

The first object 220 is shown on the touch screen display 210, and the area 230 of the first object is shown surrounding the perimeter of the first object 220. In FIG. 2, the first object 220 is shown as a rectangle for convenience of explanation, but it is not limited thereto. The area 230 of the first object may always be displayed on the touch screen display 210, or may be displayed only when the first object 220 is selected.

The second object 240 is shown on the touch screen display 210 as an object different from the first object 220. In the following, the second object 240 is illustrated to allow comparison of its relative size with the first object 220. That is, when the gesture recognition apparatus 200 according to an embodiment of the present invention changes the size of the first object 220, the size of the second object 240 does not change. When the gesture recognition apparatus 200 enlarges or reduces the screen, the displayed sizes of both the first object 220 and the second object 240 may vary.

The input means 250 is a means for applying an input at a contact position with the touch screen display 210, and includes the input tip 252 and the button 254. Hereinafter, the input means 250 may be an active stylus pen or a passive stylus pen. The input means 250 may also be an input device including a transmitting/receiving function such as Bluetooth.

The input tip 252 of the input means 250 may be a configuration through which the input means 250 and the touch screen display 210 interact. For example, when the input tip is proximate to a sensing cell of the touch screen display 210, the input tip may apply an input to the touch screen display 210 by partially blocking the electric field and thereby changing the mutual capacitance of the sensing cell.

The distance between the input tip of the input means 250 and the point at which a line extending from the input tip in the direction of the input means 250 intersects the touch screen display 210 may be determined as the distance d. When the distance d becomes zero, the input means 250 is in contact with the touch screen display 210. When the distance d is equal to or greater than a certain distance, the mutual capacitance of the sensing cell may not be changed by the input tip; in that case, it can be regarded that there is no input.
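As an illustration only (not part of the disclosure), the three input states described above — contact at d = 0, hover input while d is below the maximum sensing distance, and no input beyond it — can be sketched in Python; the 1 cm maximum sensing distance is an assumed value:

```python
def classify_pen_state(d, max_sense_cm=1.0):
    """Classify the pen input state from the distance d (in cm) between
    the input tip and the touch screen display.

    max_sense_cm is an assumed maximum distance beyond which the input
    tip no longer changes the mutual capacitance of a sensing cell.
    """
    if d <= 0:
        return "contact"   # distance zero: the pen touches the display
    if d < max_sense_cm:
        return "hover"     # capacitance still changes: hover input
    return "no_input"      # too far: regarded as no input

# Example: a pen held 0.4 cm above the display is a hover input.
print(classify_pen_state(0.4))  # hover
```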

Since the mutual capacitance of the sensing cell changes according to the distance d between the input means 250 and the touch screen display 210, it is possible to control the interaction more accurately as the distance d changes by correcting the current or voltage value resulting from the mutual capacitance.

That is, the change in the displayed size may be constant for a constant distance difference between 0 and the maximum recognizable distance. For example, if the distance d is controlled within the range of 0 to 1 cm, each 0.1 cm may correspond to a constant 10% of enlargement or reduction, so that 0.5 cm corresponds to 50% and 1 cm to 100%; the maximum values of zoom in and zoom out may be arbitrarily settable.
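For illustration, the linear distance-to-zoom mapping of this numerical example can be sketched as follows; the function name and the 0 to 1 cm range are assumptions for the example, not part of the disclosure:

```python
def zoom_percent(d_cm, max_cm=1.0, max_zoom_pct=100.0):
    """Map the pen-to-display distance d (0..max_cm) linearly to a zoom
    change in percent: 0 cm -> 0 %, max_cm -> max_zoom_pct.

    With the defaults, each 0.1 cm corresponds to a 10 % step, so
    0.5 cm gives 50 % and 1 cm gives 100 % of enlargement.
    """
    d_cm = max(0.0, min(d_cm, max_cm))   # clamp to the usable range
    return d_cm / max_cm * max_zoom_pct

print(zoom_percent(0.5))  # 50.0
print(zoom_percent(1.0))  # 100.0
```

The maximum zoom value is a parameter here, matching the statement that it may be arbitrarily settable.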

A button input, distinct from the input to the touch screen display 210, may be applied through the button 254 of the input means 250. The button 254 of the input means 250 is referred to herein as a button, but it is not limited thereto; it may be a configuration that recognizes as an input a state of being covered by a finger, such as a proximity sensor, or any other configuration that can provide an input separate from the touch screen display 210. Hereinafter, the button 254 of the input means 250 is shown blank to indicate that the button input is not applied, and the button 254 may be shown filled to indicate that the button input is applied.

The input coordinate indicator 260 is an indicator displayed at the point where a line extending from the input tip in the direction of the input means 250 intersects the touch screen display 210. The input coordinate indicator 260 may also indicate the distance between the input means 250 and the touch screen display 210. A configuration for displaying the distance between the input means 250 and the touch screen display 210 will be described later with reference to FIGS. 5A and 5B.

Hereinafter, the operation of changing the size, or the size at which an object is displayed, on the touch screen display 210 according to the distance between the input means 250 and the touch screen display will be described in detail.

FIGS. 3A to 3C are schematic views illustrating enlargement and reduction operations of a gesture recognition apparatus according to an embodiment of the present invention. For convenience of description, the structures that are the same as those of the gesture recognition apparatus 200 shown in FIG. 2 will not be described again.

FIG. 3A shows a case where the input of the input means 250 is detected in an area within the first object 220 on the touch screen display 210 and the button input of the input means 250 is detected. The gesture recognition apparatus 200 according to an embodiment of the present invention detects the input of the input means 250 and the button input of the input means 250 within the area of the first object 220. The distance d changes according to the movement of the input means 250, and as the distance d changes, the size of the first object 220, or the size at which the first object 220 is displayed, is changed.

FIG. 3B shows a case where, with the input of the input means 250 detected in an area within the first object 220 on the touch screen display 210 and the button input of the input means 250 detected, the distance d is reduced to d1. When the distance d is reduced to d1 by the movement of the input means 250, the size of the first object 220, or the size at which it is displayed, is reduced. In the gesture recognition apparatus 200, the original area of the first object 220 may be displayed as maintained. If the area 230 of the first object 220 changed along with the changed size of the first object 220, the input of the input means 250 could easily slip out of the area 230 of the first object 220. If the original region of the first object 220 is maintained, however, the predictability for the user is increased, and the gesture can be controlled more easily while the size, or the displayed size, of the first object 220 is changed. In FIGS. 3A to 3C, the first object 220 is selected and both the size of the first object 220 and the size at which it is displayed are changed; the second object 240 is a non-selected object, so its size does not change, although the displayed size of the second object 240 may be reduced when the screen is reduced.
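The resizing behavior described above — scaling the displayed size by the ratio of the current distance to the distance at which the gesture began, while the selectable area stays at its original size — can be sketched as follows. This is an illustrative assumption, not the claimed implementation; the function name and units are invented for the example:

```python
def resize_object(width, height, d, d_ref):
    """Scale an object's displayed size by the ratio of the current
    distance d to the reference distance d_ref captured when the
    gesture began; returns the new (width, height).

    The selectable area (region 230) is deliberately NOT updated here,
    so the input does not slip out of the object's region as d changes.
    """
    scale = d / d_ref
    return width * scale, height * scale

# Moving the pen from d_ref = 0.6 cm down to d = 0.3 cm halves the object.
print(resize_object(100, 80, 0.3, 0.6))  # (50.0, 40.0)
```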

FIG. 3C shows a case where, with the input of the input means 250 detected in an area within the first object 220 on the touch screen display 210 and the button input of the input means 250 detected, the distance d is increased to d2. When the distance d is increased to d2 by the movement of the input means 250, the size of the first object 220, or the size at which it is displayed, is enlarged. The original area of the first object 220 may be displayed as maintained.

FIGS. 4A to 4C are schematic views illustrating a zoom-in and zoom-out operation of the gesture recognition apparatus according to an embodiment of the present invention. The touch screen display 310, the input means 350, and the area 330 of the first object correspond to the touch screen display 210, the input means 250, and the area 230 of the first object described above, so redundant description will be omitted. FIG. 4A shows a case where an input of the input means 350 is detected in the area 330 of the first object 320 on the touch screen display 310 and a button input of the input means 350 is detected.

In FIG. 4A, the first object 320 is a zoom control object. While the input of the input means 350 is maintained on the zoom control object, similarly to the first object 220 described above with reference to FIG. 3, the size of the second object 340, or the size at which the second object 340 is displayed, is changed.

The second object 340, as an object different from the first object 320, may be an object of a content display area that displays, for example, a web page or an electronic document. The content display area may include a scroll bar 360 for controlling the content display position.

The gesture recognition apparatus according to an embodiment of the present invention detects the input of the input means 350 and the button input of the input means 350 in the first object area 330. The distance d changes according to the movement of the input means 350, and as the distance d changes, the displayed size of the second object 340 is changed.

FIG. 4B shows a case where, with the input of the input means 350 detected in the first object area 330 on the touch screen display and the button input of the input means 350 detected, the distance d is reduced to d1. When the distance d is reduced to d1 by the movement of the input means 350, the second object 342 may be displayed smaller. The first object 320, as a zoom control object, may not be resized. Also, as the displayed size of the second object 342 decreases, the scroll bar 362 of the content display area displaying the second object 342 may change to correspond to the size of the second object 342. As the displayed size of the second object 342 decreases, the screen of the gesture recognition apparatus is zoomed out, and a larger number of second objects 342 can be displayed.

FIG. 4C shows a case where, with the input of the input means 350 detected in the first object area 330 on the touch screen display and the button input of the input means 350 detected, the distance d is increased to d2. When the distance d is increased to d2 by the movement of the input means 350, the displayed size of the second object 344 is increased. Also, as the size of the second object 344 increases, more space is required to display it, and the scroll bar 364 of the content display area may change in size to correspond to the space for displaying the second object 344. As the size of the second object 344 increases, the screen of the gesture recognition apparatus is zoomed in, and the second object 344 can be displayed in more detail.

FIGS. 5A and 5B are schematic diagrams illustrating an indicator of a gesture recognition apparatus according to an embodiment of the present invention. The touch screen display 410, the first object 420, the area 430 of the first object, the second object 440, and the input means 450 correspond to the touch screen display 210, the first object 220, the first object region 230, the second object 240, and the input means 250 described above, so duplicated description will be omitted. FIG. 5A shows a case where an input of the input means 450 is detected in an area within the first object 420 on the touch screen display 410 and a button input of the input means 450 is detected.

FIG. 5A is a schematic view of a gesture recognition apparatus 400 according to an embodiment of the present invention, and parts (b), (c), and (d) of FIG. 5A illustrate the gesture recognition apparatus 400 including an input coordinate indicator 460. In FIG. 5A, the size of the input coordinate indicator 460 may be exaggerated for convenience of explanation, and its size and shape may differ according to the embodiment.

The input coordinate indicator 460 in the gesture recognition apparatus 400 according to an exemplary embodiment of the present invention may be displayed differently according to the distance between the input means 450 and the touch screen display 410. The user can recognize the distance between the input means 450 and the touch screen display 410 from the display of the indicator 460.

Part (b) of FIG. 5A shows, for example, a case where the distance d is the maximum distance at which the input means 450 can be recognized as an input. A point 462 appears in the input coordinate indicator 460, and the size of the point 462 may be small relative to the size of the indicator 460. Part (c) of FIG. 5A may be a case where the input means 450 is closer than in part (b), and the size of the point 464 in the input coordinate indicator 460 may be relatively larger than in (b). In part (d) of FIG. 5A, the input means 450 may be closer still, or in contact with the touch screen display 410, and the point 466 may be substantially the same size as the input coordinate indicator 460. Further, the distance d is not limited to being represented by the size of the point 466 in the indicator 460; it can also be represented by a change in the color, saturation, brightness, and the like of the indicator 460.
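The indicator behavior of parts (b) to (d) — a point that grows as the pen approaches and fills the indicator at contact — can be sketched as follows; the pixel sizes are assumed values for illustration only:

```python
def indicator_point_radius(d, max_d=1.0, indicator_px=20.0, min_px=2.0):
    """Radius of the point drawn inside the input coordinate indicator.

    At the maximum recognizable distance the point is small (min_px);
    at contact (d = 0) it fills the indicator, as in FIG. 5A (b)-(d).
    All pixel values are assumed for the sketch.
    """
    d = max(0.0, min(d, max_d))                      # clamp the distance
    return max(min_px, indicator_px * (1.0 - d / max_d))

print(indicator_point_radius(1.0))  # 2.0  (far: smallest point)
print(indicator_point_radius(0.5))  # 10.0 (closer: larger point)
print(indicator_point_radius(0.0))  # 20.0 (contact: fills indicator)
```

A change of color, saturation, or brightness with d could be implemented with the same clamped linear mapping.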

In the gesture recognition apparatus 400 according to an exemplary embodiment of the present invention, since the size and displayed size of objects vary according to the distance between the input means 450 and the touch screen display 410, the gesture can be controlled more efficiently when the user can recognize the change in distance through the input coordinate indicator 460 described above.

FIG. 5B is a schematic diagram of a gesture recognition apparatus 400 according to an embodiment of the present invention and includes a zoom scroll 470. The zoom scroll 470 may include a zoom level indicator 472. The zoom level indicator 472 in the gesture recognition apparatus 400 according to an embodiment of the present invention can move within the zoom scroll corresponding to the distance between the input means 450 and the touch screen display 410. From the position of the zoom level indicator 472 in the zoom scroll 470, the user can recognize the zoom level resulting from the distance, and changes in distance, between the input means 450 and the touch screen display 410.
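The position of the zoom level indicator 472 within the zoom scroll 470 can likewise be sketched as a simple proportional mapping; this is an illustrative assumption, and the 200-pixel track length is invented for the example:

```python
def zoom_level_position(d, max_d=1.0, track_px=200):
    """Pixel offset of the zoom level indicator 472 within the zoom
    scroll 470, proportional to the pen-to-display distance d.
    track_px is an assumed track length for the sketch."""
    d = max(0.0, min(d, max_d))          # clamp to the recognizable range
    return round(d / max_d * track_px)

print(zoom_level_position(0.25))  # 50
print(zoom_level_position(1.0))   # 200
```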

FIG. 6 is a flowchart of a gesture recognition method according to an embodiment of the present invention. First, the input coordinates of an input means capable of interacting with the touch screen display are detected (S100). In applying the gesture recognition method in the touch screen display according to an embodiment of the present invention, an electronic document editing program window may be displayed on the touch screen display.

Next, a button input of the input means is received (S200). In the gesture recognition method of the present invention, the button input may include various button inputs, such as a continuation of the button input, that is, the user continuously pressing the button, or rapidly pressing the button twice.

In the gesture recognition method according to an embodiment of the present invention, it can be determined whether the input coordinates are within the area of an object displayed on the touch screen display and, at the same time, whether there is a button input of the input means.

When the detected input coordinates persist in the area of a first object displayed on the touch screen display and the button input is received, the size at which the first object, or a second object different from the first object, is displayed may be changed (S300).

The one or more objects displayed in the electronic document editing program window may be, for example, one or more graphics. When there is more than one graphic, each graphic may be a separate object. Each of the one or more objects on the touch screen display is displayed independently and identifiably.

The distance between the touch screen display and the input means changes according to the movement of the input means, and as the distance changes, the size, or the displayed size, of the first object or the second object can be changed.
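Steps S100 to S300 can be sketched as a single update function, assuming a hypothetical event structure carrying the coordinates, the distance, and the button state. This is an illustrative sketch under those assumptions, not the patented implementation:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned area of the first object; kept at its original size
    while the gesture runs, as described for FIG. 3B."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def gesture_step(event, state):
    """One iteration of the method of FIG. 6.

    event: hypothetical dict with 'x', 'y' (input coordinates),
    'd' (pen-to-display distance) and 'button' (True while held).
    S100: detect the input coordinates; S200: receive the button input;
    S300: change the displayed size while both conditions persist.
    """
    obj = state["object"]
    inside = obj["region"].contains(event["x"], event["y"])   # S100
    if event["button"] and inside:                            # S200
        if state["d_ref"] is None:
            state["d_ref"] = event["d"]   # first frame: reference distance
        else:
            obj["scale"] = event["d"] / state["d_ref"]        # S300
    else:
        state["d_ref"] = None             # button released or area left
    return state

# Lowering the pen from 0.8 cm to 0.4 cm halves the displayed size.
state = {"object": {"region": Region(0, 0, 100, 100), "scale": 1.0},
         "d_ref": None}
state = gesture_step({"x": 50, "y": 50, "d": 0.8, "button": True}, state)
state = gesture_step({"x": 50, "y": 50, "d": 0.4, "button": True}, state)
print(state["object"]["scale"])  # 0.5
```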

Each block of the accompanying block diagrams and each combination of steps of the flowcharts may be implemented in firmware, software, or hardware, or by algorithms or computer program instructions. These algorithms or computer program instructions may be loaded into a processor of a general purpose computer, special purpose computer, or other programmable digital signal processing device, so that the instructions executed by the processor of the computer or other programmable data processing apparatus generate means for performing the functions described in each block of the block diagrams or each step of the flowcharts. These algorithms or computer program instructions may also be stored in a computer-usable or computer-readable memory capable of directing a computer or other programmable data processing apparatus to implement a function in a particular manner, so that the instructions stored in the computer-usable or computer-readable memory produce an article of manufacture containing instruction means for performing the function described in each block of the block diagrams or each step of the flowcharts. The computer program instructions may also be loaded onto a computer or other programmable data processing equipment so that a series of operational steps is performed on the computer or other programmable data processing equipment to produce a computer-executed process, so that the instructions executed on the computer or other programmable data processing equipment provide steps for executing the functions described in each block of the block diagrams and each step of the flowcharts.

Also, each block or each step may represent a module, segment, or portion of code that includes one or more executable instructions for executing the specified logical function (s). It should also be noted that in some alternative embodiments, the functions mentioned in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or the blocks or steps may sometimes be performed in reverse order according to the corresponding function.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the present invention is not limited to the disclosed exemplary embodiments, but various changes and modifications may be made without departing from the spirit and scope of the invention. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas falling within the scope of the same shall be construed as falling within the scope of the present invention.

100, 200, 300, 400: gesture recognition apparatus
110: detection unit
120: button input receiving unit
130: processing unit
140: display unit
210, 310, 410: touch screen display
220, 320, 420: first object
230, 330, 430: first object area
240, 340, 342, 344, 440: second object
250, 350, 450: input means
252, 352, 452: input means tip
254, 354, 454: input means button
260, 262, 264, 266, 460: input coordinate indicator
360, 362, 364: scroll bar
462, 464, 466: point
470: zoom scroll
472: zoom level indicator

Claims (18)

A gesture recognition method in a touch screen display,
Detecting input coordinates of an interactable input means;
Receiving a button input of the input means;
And changing, when the detected input coordinates are maintained in an area of a first object displayed on the touch screen display and the button input is received, a size at which the first object, or a second object different from the first object, is displayed according to a distance between the touch screen display and the input means.
The method according to claim 1,
Wherein the first object comprises a selected object and the second object comprises a non-selected object.
3. The method of claim 2,
Further comprising changing a size of the selected object or the non-selected object.
The method according to claim 1,
The first object including a zoom control object,
Wherein the second object comprises a content display object.
The method according to claim 1,
Characterized in that the interaction-enabling input means is a stylus pen.
The method according to claim 1,
Further comprising displaying an indicator corresponding to the distance between the touch screen display and the input means while the input of the input means continues in the area of the first object and the received button input continues.
The method according to claim 6,
Wherein the indicator is displayed on the touch screen display at a location where the input of the input means is applied.
The method according to claim 1,
Wherein changing the displayed size comprises:
Determining the distance between the touch screen display and the input means as a reference distance when the detected input coordinates are within the area of the first object displayed on the touch screen display and the button input is received; and changing the size at which the first object, or the second object different from the first object, is displayed according to a difference between the reference distance and the changed distance.
The method according to claim 1,
Wherein, when the detected input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received, the area of the first object is maintained after the input of the input means enters the area of the first object.
A gesture recognition apparatus comprising:
A detection unit for detecting input coordinates of an input means capable of interaction;
A button input receiving unit for receiving a button input of the input means;
And a processing unit for changing, when the input coordinates are maintained in an area of a first object displayed on the touch screen display and the button input is received, a size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
11. The method of claim 10,
Wherein the first object comprises a selected object and the second object comprises a non-selected object.
12. The method of claim 11,
Wherein the processing unit changes the size of the selected object or the non-selected object.
11. The method of claim 10,
The first object including a zoom control object,
Wherein the second object includes an object of the content display area.
11. The method of claim 10,
Characterized in that the interaction-enabling input means is a stylus pen.
11. The method of claim 10,
Further comprising an indicator display unit for displaying an indicator corresponding to the distance between the touch screen display and the input means when the input coordinates persist in the area of the first object displayed on the touch screen display and the button input is received.
16. The method of claim 15,
Wherein the indicator is displayed on the touch screen display at a position where the input of the input means is applied.
11. The method of claim 10,
Wherein,
Wherein the processing unit determines the distance between the touch screen display and the input means as a reference distance when the input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received, and changes the size at which the first object, or the second object different from the first object, is displayed according to a difference between the reference distance and the changed distance.
11. The method of claim 10,
Wherein, when the input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received, the gesture recognition apparatus maintains the area of the first object after the input of the input means enters the area of the first object.
KR1020130023258A 2013-03-05 2013-03-05 Method and apparatus for gesture recognition KR20140109062A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130023258A KR20140109062A (en) 2013-03-05 2013-03-05 Method and apparatus for gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130023258A KR20140109062A (en) 2013-03-05 2013-03-05 Method and apparatus for gesture recognition

Publications (1)

Publication Number Publication Date
KR20140109062A true KR20140109062A (en) 2014-09-15

Family

ID=51755864

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130023258A KR20140109062A (en) 2013-03-05 2013-03-05 Method and apparatus for gesture recognition

Country Status (1)

Country Link
KR (1) KR20140109062A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021025457A1 (en) * 2019-08-06 2021-02-11 삼성전자 주식회사 Electronic device for recognizing stylus pen and method for operating same
US11874978B2 (en) 2019-08-06 2024-01-16 Samsung Electronics Co., Ltd Electronic device for recognizing stylus pen and method of operating the same

Similar Documents

Publication Publication Date Title
EP3404520B1 (en) Method of displaying information by using touch input in mobile terminal
US20140125615A1 (en) Input device, information terminal, input control method, and input control program
US20140129990A1 (en) Interactive input system having a 3d input space
US20100295806A1 (en) Display control apparatus, display control method, and computer program
JP5808712B2 (en) Video display device
US20050156946A1 (en) Image display method, image display program, and information device
KR102393295B1 (en) Apparatus and method for styling a content
US10013156B2 (en) Information processing apparatus, information processing method, and computer-readable recording medium
US20180121076A1 (en) Drawing processing method, drawing program, and drawing device
US20140173532A1 (en) Display control apparatus, display control method, and storage medium
US11003340B2 (en) Display device
US20140289672A1 (en) Graph display apparatus, graph display method and storage medium having stored thereon graph display program
US20150268828A1 (en) Information processing device and computer program
WO2013136594A1 (en) Plan display device and plan display program
CN109656435B (en) Display control device and recording medium
US9292185B2 (en) Display device and display method
US20130162562A1 (en) Information processing device and non-transitory recording medium storing program
EP2827237B1 (en) Zoom control of screen image in electronic device
CN103914228A (en) Mobile terminal and touch screen operating method thereof
KR101505806B1 (en) Method and apparatus for activating and controlling a pointer on a touch-screen display
KR20140109062A (en) Method and apparatus for gesture recognition
JP6722239B2 (en) Information processing device, input method, and program
JP2015032261A (en) Display device and control method
CN108932054B (en) Display device, display method, and non-transitory recording medium
US10949078B2 (en) Display apparatus, display method, and non-transitory computer-readable recording medium

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination