KR20140109062A - Method and apparatus for gesture recognition - Google Patents
- Publication number
- KR20140109062A (application number KR1020130023258A)
- Authority
- KR
- South Korea
- Prior art keywords
- input
- touch screen
- screen display
- input means
- displayed
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a gesture recognition method and apparatus, and more particularly, to a gesture recognition method and apparatus using an input tool on a touch display.
A gesture recognition method in a touch screen display, in accordance with an embodiment of the present invention, includes detecting input coordinates of an interactable input means, receiving a button input of the input means, and, while the detected input coordinates are maintained in the area of a first object displayed on the touch screen display and the button input is received, changing the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
The present invention provides a gesture recognition method and apparatus capable of more easily controlling an inserted object when using an input means in an electronic device employing a touch-based input method.
Description
The present invention relates to a gesture recognition method and apparatus, and more particularly, to a gesture recognition method and apparatus using an input tool on a touch display.
The input method of electronic devices is gradually shifting from button or keypad input to touch-based input. The touch-based input method provides more varied input than a button or keypad. In electronic devices adopting a touch-based input method, basic operations such as running and selecting an application are easy, but creating an electronic document is still largely limited to entering text and inserting simple figures or photographs.
In an electronic device employing a touch-based input method, various types of touch gestures or supplementary input means such as a stylus pen have been introduced to supplement an input method whose precision and degree of freedom are lower than those of a keyboard or mouse. However, while a stylus pen is useful for handwriting input such as writing or drawing, it remains merely a selection tool for editing objects such as an inserted figure or character.
Therefore, it has been difficult to edit figures and characters easily when using touch-based input means. In addition, when using a stylus pen, a double tap (tapping twice in rapid succession) during web surfing or document viewing performs only a simple zoom in/zoom out and cannot control the degree of zoom finely, so a way to utilize the stylus pen for fine zoom control was needed.
Accordingly, it is an object of the present invention to provide a gesture recognition method and apparatus capable of more easily controlling an inserted object when using an input means in an electronic device employing a touch-based input method.
Another object of the present invention is to provide a gesture recognition method and apparatus capable of easily zooming in and out of a display without using a finger when using an input means in an electronic device adopting a touch-based input method.
Another object of the present invention is to provide a gesture recognition method and apparatus capable of providing an indicator that more efficiently displays the interval between an input means and a touch screen display in an electronic device employing a touch-based input method.
The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.
According to an aspect of the present invention, there is provided a method of recognizing a gesture in a touch screen display, comprising: detecting input coordinates of an interactable input means; receiving a button input of the input means; and, when the detected input coordinates are maintained in the area of a first object displayed on the touch screen display and the button input is received, changing the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
According to another aspect of the present invention, the first object includes a selected object and the second object includes a non-selected object.
According to still another aspect of the present invention, the method further includes changing the size of the selected object or the non-selected object.
According to another aspect of the present invention, the first object includes a zoom control object, and the second object includes a content display object.
According to another aspect of the present invention, the interactive input means is a stylus pen.
According to another aspect of the present invention, the method further comprises displaying an indicator corresponding to the distance between the touch screen display and the input means while the input of the input means continues in the area of the first object and the received button input continues.
According to another aspect of the present invention, an indicator is displayed on a touch screen display at a position where the input of the input means is applied.
According to another aspect of the present invention, changing the displayed size comprises: determining the distance between the touch screen display and the input means as a reference distance when the detected input coordinates are within the area of the first object displayed on the touch screen display and the button input is received; and changing the size at which the first object, or the second object different from the first object, is displayed according to the difference between the reference distance and the changed distance.
According to another aspect of the present invention, the condition that the detected input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received means that the button input is received after the input of the input means has entered the area of the first object.
According to an aspect of the present invention, there is provided a gesture recognition apparatus including: a detection unit for detecting input coordinates of an interactable input means; a button input receiving unit for receiving a button input of the input means; and a processing unit for changing, when the input coordinates are maintained in the area of a first object displayed on the touch screen display and the button input is received, the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
According to another aspect of the present invention, the first object includes a selected object and the second object includes a non-selected object.
According to another aspect of the present invention, the processing unit changes the size of the selected object or the non-selected object.
According to another aspect of the present invention, the first object includes a zoom control object, and the second object includes an object of the content display area.
According to another aspect of the present invention, the interactive input means is a stylus pen.
According to another aspect of the present invention, the apparatus further comprises an indicator display unit for displaying an indicator corresponding to the distance between the touch screen display and the input means while the input coordinates persist in the area of the first object displayed on the touch screen display.
According to another aspect of the present invention, an indicator is displayed on a touch screen display at a position where the input of the input means is applied.
According to another aspect of the present invention, the processing unit determines the distance between the touch screen display and the input means as a reference distance when the input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received, and changes the size at which the first object, or the second object different from the first object, is displayed according to the difference between the reference distance and the changed distance.
According to another aspect of the present invention, the condition that the input coordinates are maintained in the area of the first object displayed on the touch screen display and the button input is received means that the button input is received after the input of the input means has entered the area of the first object.
The embodiments of the present invention have at least the following effects.
That is, it is possible to provide a gesture recognition method and apparatus that can more easily control an inserted object when using an input means in an electronic device employing a touch-based input method.
It is also possible to provide a gesture recognition method and apparatus capable of easily zooming in and out of a display without using a finger when using an input means in an electronic device employing a touch-based input method.
It is also possible to provide a gesture recognition method and apparatus capable of providing an indicator capable of more efficiently displaying an interval between an input means and a touch screen display in an electronic device employing a touch-based input method.
The effects according to the present invention are not limited by the contents exemplified above, and more various effects are included in the specification.
FIG. 1 is a block diagram of a gesture recognition apparatus according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 3A to 3C are schematic views illustrating enlargement and reduction operations of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 4A to 4C are schematic views illustrating zoom-in and zoom-out operations of a gesture recognition apparatus according to an embodiment of the present invention.
FIGS. 5A and 5B are schematic diagrams illustrating an indicator of a gesture recognition apparatus according to an embodiment of the present invention.
FIG. 6 is a flowchart of a gesture recognition method according to an embodiment of the present invention.
The present invention may, however, be embodied in various forms and is not limited to the embodiments disclosed below. The embodiments are provided to fully disclose the scope of the invention to a person skilled in the art, and the invention is defined only by the scope of the claims.
In the present specification, when one element 'transmits' data or a signal to another element, the element can transmit the data or signal directly to the other element, or the data or signal can be transmitted to the other element through at least one intermediate element.
In this specification, an electronic document means information transmitted, received, or stored in electronic form by an information processing system, and can be used in electronic form or printed. The electronic document includes not only documents created using a document creation program such as Microsoft Office, Hancom Office by Hangul and Computer, or Adobe Acrobat, but also structured electronic documents such as HTML, XML, and SGML, and may also include moving images or still images that are created in electronic form and transmitted, received, or stored. The electronic document may also include documents generated in various memo programs implemented on an electronic device with a touch screen display. In addition, the electronic document includes all documents that store input, such as text or graphics, to an electronic device with a touch screen display.
FIG. 1 is a block diagram of a gesture recognition apparatus according to an embodiment of the present invention. Referring to FIG. 1, a
The interactable input means refers to a means for inputting a position with respect to the touch screen display and may include any input means capable of interacting with the touch screen display. The interactable input means will be described later with reference to FIG. 2.
Further, in this specification, an input of the interactable input means refers, unless otherwise mentioned, to an input generated in a state in which the input means and the touch screen display are not in contact. That is, in the following, an input signal generated by the input means contacting the touch screen display is distinguished from an input signal generated by the input means being at a certain distance above the touch screen display, and unless otherwise noted the input is made in a state where the input means and the touch screen display are not in contact.
A gesture refers to a series of motions, or a change in input coordinates, input by an interactable input means on a touch screen device. A gesture may include one or more inputs and movements, and may involve a button input. In particular, in this specification, a gesture may include a variation in height, such as the distance between the input means and the touch screen display, in addition to two-dimensional coordinate movement (left, right, up, and down) on the touch screen display.
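As an illustrative sketch (not part of the disclosed embodiments), such a gesture sample — two-dimensional coordinates, a hover distance, and a button state — could be modeled as follows; all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GestureSample:
    """One sample of an interactable input means (e.g. a stylus) over the display."""
    x: float          # 2D input coordinate on the touch screen display
    y: float
    distance: float   # hover distance d between the input tip and the display surface
    button: bool      # True while the button of the input means is pressed

    @property
    def hovering(self) -> bool:
        # Per the description, a hover input occurs while the input means
        # is near, but not in contact with, the touch screen display.
        return self.distance > 0.0

# A gesture is then a series of such samples whose coordinates,
# distance, or button state change over time.
gesture = [GestureSample(10, 20, 0.8, False), GestureSample(10, 20, 0.5, True)]
```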
The
The
In the case where there is no conductive object in proximity to the sensing cell in the
In the present specification, position detection through the change of mutual capacitance is exemplarily described in the
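As a hedged illustration of hover sensing of this kind, a hover distance might be estimated from a measured mutual-capacitance change via a calibration table; the specification only states that the capacitance varies with distance, so the table values and interpolation below are invented for the sketch:

```python
# Hypothetical calibration table: mutual-capacitance change (fF) measured
# at known hover distances (cm). The values are invented for illustration;
# the patent only states that mutual capacitance varies with the distance d.
CAL = [(0.0, 100.0), (0.25, 60.0), (0.5, 35.0), (0.75, 18.0), (1.0, 8.0)]

def estimate_distance(delta_c: float) -> float:
    """Estimate the hover distance by piecewise-linear interpolation of CAL."""
    for (d0, c0), (d1, c1) in zip(CAL, CAL[1:]):
        if c1 <= delta_c <= c0:          # capacitance decreases with distance
            t = (c0 - delta_c) / (c0 - c1)
            return d0 + t * (d1 - d0)
    # Out of range: clamp to the nearest calibrated endpoint.
    return CAL[0][0] if delta_c > CAL[0][1] else CAL[-1][0]
```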
The button
The
In this specification, an object refers to any kind of object that can be displayed on a device. That is, anything that can be displayed separately on the display so that its shape, color, luminance, contrast, contour, or the like is distinguished from its surroundings may be defined as an object of the present invention. The object thus includes everything that is distinguishable within the image displayed by the touch screen display, and may include, for example, objects in the image, a background, a person, text, or an icon. An object may also be referred to herein as an area or an item. The area of an object means an area determined based on the boundary line of the object.
The
The
The area of the displayed object refers to the area determined based on the boundary line surrounding the object. Determination of an area based on an object's boundary means that the area of the object includes at least an area defined by the boundary of the object and may also include the surrounding area of the object.
In addition, the
The
Further, when the input coordinates are continued in the area of the object displayed on the touch screen display and the button input is received, the
It is also possible that the input coordinates persist in the area of the first object displayed on the touch screen display and the button input is received if the input of the input means is in the area of the first object and then the button input is received .
A specific configuration in which the
Further, the
Changing the size at which an object is displayed can mean image processing to make the displayed size smaller or larger. That is, the
In addition to changing the size at which an object is displayed, it is also possible to change the size of the object itself through image processing. For example, changing the size of the object may mean enlarging or reducing a display, or changing the size of a graphic or font when editing an electronic document.
In the present specification, image processing or image correction refers to processing an input image through a
The
The
FIG. 2 is a schematic diagram of a gesture recognition apparatus according to an embodiment of the present invention. FIG. 2 shows a
The
The
The
The input means 250 is means for inputting a contact position with the
The
The distance between the input tip of the input means 250 and the portion extending in the direction of the input means 250 and intersecting the
The mutual electrostatic capacitance of the sensing cell is changed according to the distance d that extends in the direction of the input means 250 and the input means 250 and intersects with the
That is, the change in the displayed size may be constant for a constant distance difference between 0 and the maximum recognizable distance. For example, if the distance d is controlled within the range of 0 to 1 cm, the displayed size can be increased or decreased at a constant rate of 10% per 0.1 cm, giving 50% at 0.5 cm and 100% at 1 cm, and the maximum values of zoom in and zoom out may be arbitrarily settable.
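A minimal sketch of this constant-rate mapping, assuming the 0 to 1 cm range and 10% per 0.1 cm figures given above (the function name and parameters are hypothetical):

```python
def zoom_percent(d_cm: float, max_d: float = 1.0, max_zoom: float = 100.0) -> float:
    """Map a hover distance d (in cm) to a zoom change in percent.

    Constant-rate mapping as described: 10% per 0.1 cm, i.e. 50% at
    0.5 cm and 100% at 1 cm, with the maximum arbitrarily settable.
    """
    d = min(max(d_cm, 0.0), max_d)   # clamp to the controllable range
    return max_zoom * d / max_d
```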
A button input different from the input to the
The input coordinate
Hereinafter, the operation of changing the size of the object or the size of the object displayed on the
FIGS. 3A to 3C are schematic views illustrating enlargement and reduction operations of a gesture recognition apparatus according to an embodiment of the present invention. For the sake of convenience of description, the same structures as those shown in the
FIG. 3A shows a case where the input of the input means 250 is detected in an area in the
FIG. 3B shows that when the input of the input means 250 is detected in an area within the
FIG. 3C shows that when the input of the input means 250 is detected in an area within the
FIGS. 4A to 4C are schematic views illustrating zoom-in and zoom-out operations of the gesture recognition apparatus according to an embodiment of the present invention. The touch screen display 310, the input means 350 and the
In FIG. 4A, the
The second object 340 may be an object of the content display area displaying a different object from the
The gesture recognition apparatus according to an embodiment of the present invention detects the input of the input means 350 and the button input of the input means 350 in the
FIG. 4B shows a case where the input of the input means 350 is detected in the
FIG. 4C shows a case where the input of the input means 350 is detected in the
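The reference-distance behavior described for FIGS. 4A to 4C — the distance at the moment the button input is received becomes the reference, and the displayed size changes with the difference from it — can be sketched as follows. The direction of the mapping (closer means larger) and the rate are assumptions, since the specification leaves them unspecified:

```python
def scale_factor(d_ref: float, d_now: float, rate: float = 1.0) -> float:
    """Scale factor from the difference between the reference distance
    (captured when the button input is received) and the current distance.

    Assumed direction: moving the input means away from the display
    (d_now > d_ref) zooms out; moving it closer zooms in. The 0.1
    floor simply prevents a zero or negative scale in the sketch.
    """
    return max(0.1, 1.0 - rate * (d_now - d_ref))
```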
FIGS. 5A and 5B are schematic diagrams illustrating an indicator of a gesture recognition apparatus according to an embodiment of the present invention. The
FIG. 5A is a schematic view of a
The input coordinate
5 (b) shows a case where the distance d is the maximum distance that can be recognized as the input by the input means 450, for example. A
In the
FIG. 5B is a schematic diagram of a
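The multi-level indicator of FIGS. 5A and 5B can be sketched as a simple quantization of the hover distance; the number of levels used here (three) is an assumption, as the specification does not fix it:

```python
def indicator_level(d: float, d_max: float, n_levels: int = 3) -> int:
    """Quantize the hover distance into an indicator level (1..n_levels).

    d_max is the maximum distance recognizable as an input; distances
    beyond it clamp to the top level.
    """
    d = min(max(d, 0.0), d_max)
    if d == d_max:
        return n_levels
    return int(d / d_max * n_levels) + 1
```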
FIG. 6 is a flowchart of a gesture recognition method according to an embodiment of the present invention. First, the input coordinates of the interactable input means on the touch screen display are detected (S100). In applying the gesture recognition method in the touch screen display according to an embodiment of the present invention, an electronic document editing program window may be displayed on the touch screen display.
Further, a button input of the input means is received (S200). In the gesture recognition method of the present invention, the button input may include various button inputs, such as a continuation of the button input (i.e., the user continuously pressing the button) or rapidly pressing the button twice.
In the gesture recognition method according to an embodiment of the present invention, it can be determined whether the input coordinates are within the area of the object displayed on the touch screen display and at the same time, there is a button input of the input means.
When the detected input coordinates continue in the area of the first object displayed on the touch screen display and a button input is received, the size at which the first object, or a second object different from the first object, is displayed may be changed (S300).
One or more objects displayed in the electronic document editing program window may be, for example, one or more graphics. Each of the one or more objects on the touch screen display is displayed independently and identifiably.
The distance between the touch screen display and the input means changes according to the movement of the input means, and the size at which the first object or the second object is displayed can be changed as the distance changes.
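Steps S100 to S300 can be combined into a single hedged sketch of the recognition loop; the sample format, area test, and scaling callback are all hypothetical interfaces, and the closer-means-larger mapping is an assumption:

```python
def recognize(samples, first_object_area, apply_scale):
    """Sketch of the S100-S300 flow: detect input coordinates (S100),
    receive a button input (S200), and change the displayed size while
    the coordinates stay in the first object's area (S300).

    `samples` yields (x, y, distance, button) tuples; `first_object_area`
    tests containment of the coordinates; `apply_scale` receives a scale
    factor for the first or second object.
    """
    d_ref = None
    for x, y, d, button in samples:
        inside = first_object_area(x, y)
        if inside and button and d_ref is None:
            d_ref = d                    # button received in the area: fix the reference
        elif not (inside and button):
            d_ref = None                 # gesture ends when either condition lapses
        if d_ref is not None:
            apply_scale(1.0 + (d_ref - d))   # assumed mapping: closer = larger
```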
Each block of the accompanying block diagrams and combinations of steps of the flowcharts may be implemented in firmware, software, or hardware, or by algorithms or computer program instructions. These algorithms or computer program instructions may be loaded onto a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, so that the instructions executed by the processor create means for performing the functions described in each block of the block diagrams or each step of the flowcharts. These algorithms or computer program instructions may also be stored in a computer-usable or computer-readable memory capable of directing a computer or other programmable data processing apparatus to implement a function in a particular manner, such that the instructions stored in the computer-usable or computer-readable memory produce an article of manufacture containing instruction means for performing the functions described in each block of the block diagrams or each step of the flowcharts. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus so that a series of operational steps are performed on the apparatus to produce a computer-executed process, such that the instructions executed on the apparatus provide steps for executing the functions described in each block of the block diagrams and each step of the flowcharts.
Also, each block or each step may represent a module, segment, or portion of code that includes one or more executable instructions for executing the specified logical function(s). It should also be noted that in some alternative embodiments, the functions noted in the blocks or steps may occur out of order. For example, two blocks or steps shown in succession may in fact be performed substantially concurrently, or may sometimes be performed in reverse order, depending on the corresponding function.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the present invention is not limited to the disclosed exemplary embodiments, but various changes and modifications may be made without departing from the spirit and scope of the invention. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas falling within the scope of the same shall be construed as falling within the scope of the present invention.
100, 200, 300, 400: Gesture recognition device (400)
110:
120: Button input receiver
130:
140:
210, 310, 410: Touch screen display
220, 320, 420: the first object
230, 330, and 430: a first object area
240, 340, 342, 344, 440: the second object
250, 350, 450: input means
252, 352, 452: input means tip
252, 354, 454: input means buttons
260, 262, 264, 266, 360, 460: Input coordinate indicator
360, 362, 364: scroll bar
462, 464, 466:
470: Zoom scroll
472: Level indicator
Claims (18)
Detecting input coordinates of an interactable input means;
Receiving a button input of the input means;
Wherein, when the detected input coordinates are maintained in the area of a first object displayed on a touch screen display and the button input is received, the method further comprises changing the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
Wherein the first object comprises a selected object and the second object comprises a non-selected object.
Further comprising changing a size of the selected object or the non-selected object.
The first object including a zoom control object,
Wherein the second object comprises a content display object.
Characterized in that the interaction-enabling input means is a stylus pen.
Further comprising displaying an indicator corresponding to the distance between the touch screen display and the input means while the input of the input means continues in the area of the first object and the received button input continues.
Wherein the indicator is displayed on the touch screen display at a location where the input of the input means is applied.
Wherein changing the displayed size comprises:
Determining the distance between the touch screen display and the input means as a reference distance when the detected input coordinates are within the area of a first object displayed on the touch screen display and the button input is received; and changing the size at which the first object, or the second object different from the first object, is displayed according to the difference between the reference distance and the changed distance.
Wherein the condition that the detected input coordinates persist in the area of a first object displayed on a touch screen display and the button input is received means that the button input is received after the input of the input means has entered the area of the first object.
A button input receiving unit for receiving a button input of the input means;
And a processing unit for changing, when the input coordinates are maintained in the area of a first object displayed on a touch screen display and the button input is received, the size at which the first object, or a second object different from the first object, is displayed according to the distance between the touch screen display and the input means.
Wherein the first object comprises a selected object and the second object comprises a non-selected object.
Wherein the processing unit changes the size of the selected object or the non-selected object.
Wherein the first object comprises a zoom control object,
Wherein the second object includes an object of the content display area.
Wherein the input means capable of interaction is a stylus pen.
Further comprising an indicator display unit for displaying an indicator corresponding to the distance between the touch screen display and the input means when the input coordinates are maintained within the area of the first object displayed on the touch screen display and the button input is received.
Wherein the indicator is displayed on the touch screen display at a position where the input of the input means is applied.
Wherein the processor determines the distance between the touch screen display and the input means as a reference distance when the input coordinates are maintained within the area of the first object displayed on the touch screen display and the button input is received, and changes a size at which the first object, or a second object different from the first object, is displayed according to a difference between the reference distance and a subsequently changed distance.
Wherein the input coordinates are maintained within an area of a first object displayed on the touch screen display, and the button input is received after the input of the input means enters the area of the first object.
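The apparatus claims mirror the method claims as three cooperating units: a button input receiving unit, a processing unit that resizes objects by hover distance, and an indicator display unit that draws an indicator at the input position. The sketch below shows one way those units could fit together; all names and the linear distance-to-scale mapping are assumptions for illustration, not the patent's implementation.

```python
class GestureRecognitionApparatus:
    """Illustrative composition of the claimed units: button input
    receiving unit, processing unit, and indicator display unit."""

    def __init__(self, display):
        self.display = display              # target for indicator drawing
        self.button_pressed = False
        self.reference_distance = None

    def receive_button_input(self, pressed):
        """Button input receiving unit: records the pen button state;
        releasing the button discards the reference distance."""
        self.button_pressed = pressed
        if not pressed:
            self.reference_distance = None

    def process_hover(self, x, y, distance, first_object):
        """Processing unit: while the pen stays inside the first
        object's area with the button held, resizes the object by the
        difference from the reference distance."""
        if not (self.button_pressed and first_object.contains(x, y)):
            return
        if self.reference_distance is None:
            self.reference_distance = distance
        first_object.scale = 1.0 + (distance - self.reference_distance) * 0.05
        self.show_indicator(x, y, distance)

    def show_indicator(self, x, y, distance):
        """Indicator display unit: draws an indicator at the input
        position, sized to reflect the current hover distance."""
        self.display.draw_indicator(x, y, radius=max(2.0, distance))
```

On an Android-style stack, `receive_button_input` would map to the stylus button state of a hover event and `distance` to the reported hover distance, but the claims are not limited to any platform.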
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130023258A KR20140109062A (en) | 2013-03-05 | 2013-03-05 | Method and apparatus for gesture recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130023258A KR20140109062A (en) | 2013-03-05 | 2013-03-05 | Method and apparatus for gesture recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140109062A true KR20140109062A (en) | 2014-09-15 |
Family
ID=51755864
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130023258A KR20140109062A (en) | 2013-03-05 | 2013-03-05 | Method and apparatus for gesture recognition |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140109062A (en) |
2013
- 2013-03-05 KR KR1020130023258A patent/KR20140109062A/en not_active Application Discontinuation
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021025457A1 (en) * | 2019-08-06 | 2021-02-11 | 삼성전자 주식회사 | Electronic device for recognizing stylus pen and method for operating same |
US11874978B2 (en) | 2019-08-06 | 2024-01-16 | Samsung Electronics Co., Ltd | Electronic device for recognizing stylus pen and method of operating the same |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3404520B1 (en) | Method of displaying information by using touch input in mobile terminal | |
US20140125615A1 (en) | Input device, information terminal, input control method, and input control program | |
US20140129990A1 (en) | Interactive input system having a 3d input space | |
US20100295806A1 (en) | Display control apparatus, display control method, and computer program | |
JP5808712B2 (en) | Video display device | |
US20050156946A1 (en) | Image display method, image display program, and information device | |
KR102393295B1 (en) | Apparatus and method for styling a content | |
US10013156B2 (en) | Information processing apparatus, information processing method, and computer-readable recording medium | |
US20180121076A1 (en) | Drawing processing method, drawing program, and drawing device | |
US20140173532A1 (en) | Display control apparatus, display control method, and storage medium | |
US11003340B2 (en) | Display device | |
US20140289672A1 (en) | Graph display apparatus, graph display method and storage medium having stored thereon graph display program | |
US20150268828A1 (en) | Information processing device and computer program | |
WO2013136594A1 (en) | Plan display device and plan display program | |
CN109656435B (en) | Display control device and recording medium | |
US9292185B2 (en) | Display device and display method | |
US20130162562A1 (en) | Information processing device and non-transitory recording medium storing program | |
EP2827237B1 (en) | Zoom control of screen image in electronic device | |
CN103914228A (en) | Mobile terminal and touch screen operating method thereof | |
KR101505806B1 (en) | Method and apparatus for activating and controlling a pointer on a touch-screen display | |
KR20140109062A (en) | Method and apparatus for gesture recognition | |
JP6722239B2 (en) | Information processing device, input method, and program | |
JP2015032261A (en) | Display device and control method | |
CN108932054B (en) | Display device, display method, and non-transitory recording medium | |
US10949078B2 (en) | Display apparatus, display method, and non-transitory computer-readable recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |