KR20140122682A - Method for processing touch event where touch area rotates and device for the same - Google Patents

Method for processing touch event where touch area rotates and device for the same

Info

Publication number
KR20140122682A
Authority
KR
South Korea
Prior art keywords
touch
value
region
determining
touch event
Prior art date
Application number
KR20140042615A
Other languages
Korean (ko)
Inventor
강회식
소병철
장선웅
윤일현
Original Assignee
주식회사 지니틱스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 지니틱스 filed Critical 주식회사 지니틱스
Priority to PCT/KR2014/003116 priority Critical patent/WO2014168431A1/en
Priority to CN201480020923.1A priority patent/CN105308540A/en
Publication of KR20140122682A publication Critical patent/KR20140122682A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form

Abstract

The present invention discloses a method for processing, as one intended user input gesture, a touch event in which a finger is rotated while in contact with a touch-sensitive surface.


Description

TECHNICAL FIELD [0001] The present invention relates to a touch event processing method and a device for the same.

The present invention relates to a method of processing a touch event in which a touch tool is brought into contact with a touch sensitive surface of a touch input apparatus.

Touch input devices can be used in various user devices. To date they have been used in devices that provide a display screen and a touch input pad, such as smartphones, PDAs, laptops, and tablets. In the future, a touch input device may also be used in a user device that has a very small display screen and touch input pad, such as a wristwatch.

Using a finger as the input means of the touch input device is convenient because no separate tool such as a stylus pen is needed.

The tip of a stylus pen is thin enough to allow precise input. A finger, however, produces a large contact surface on the touch-sensitive surface of the touch input device, so when the total area of the touch-sensitive surface is relatively small it becomes difficult both to perform a user input gesture with the finger and to recognize that gesture correctly. A wristwatch-sized touch input device, for example, may suffer from this problem. There is therefore a need for a new type of touch input technology that can accept efficient user input even when the touch-sensitive surface of the touch input device is narrow.

The present invention provides a new processing technique for processing a touch event generated in a touch input device. Specifically, it is intended to provide a technique for accurately conveying a user's input intention even on a touch-sensitive surface having a small area.

The touch event processing method provided according to the first aspect of the present invention can be performed when a finger is held in contact with the touch-sensitive surface and a rubbing gesture is made in a rotational form (by comparison, a gesture that progresses in a parallel, translational form while the finger remains in contact may be referred to as a "drag").

In the first aspect of the present invention, the contact surface between the finger and the touch-sensitive surface is modeled as a figure for which a major axis and a minor axis can be defined. For example, if the contact surface is modeled as an ellipse or a rectangle, parameters corresponding to the major axis and the minor axis can be generated. Those skilled in the art will understand that modeling a shape so that its major and minor axes can be defined is a technique separate from the ideas provided by the present invention and is readily achievable with the current state of the art.

The direction of the major or minor axis thus defined forms an angle on the touch-sensitive surface. When the finger is rotated in one direction (e.g., clockwise or counterclockwise), the direction of the major or minor axis changes. According to the first aspect of the present invention, it is determined that a user input has been received when the change in the direction of the axis satisfies a predetermined condition.

According to a second aspect of the present invention, there is provided a method of determining whether to execute a predetermined processing step when a touch event is generated by a touch tool on the touch-sensitive surface of a touch input device. The method includes: determining a first region of the touch-sensitive surface that is determined to have been touched by the touch event at a first time during the duration of the touch event, and calculating a first value relating to the angle of a first axis of the first region; determining a second region of the touch-sensitive surface that is determined to have been touched by the touch event at a second time during the duration of the touch event, and calculating a second value relating to the angle of a second axis of the second region; and, if the first value and the second value differ by a predetermined threshold or more, executing the predetermined processing step, and otherwise determining not to execute the predetermined processing step.
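As a rough sketch (not the patent's implementation; the threshold value and the per-time angle lookup are assumptions for illustration), the determinations above reduce to comparing two axis angles against a threshold:

```python
def process_touch_samples(angle_at, t1, t2, threshold_deg=30.0):
    """Steps S110-S130 in miniature: compare the touch region's axis
    angle at two times during one touch event and decide whether the
    predetermined processing step should run. `angle_at` maps a sample
    time to the modeled region's axis angle in degrees (hypothetical
    interface; the patent leaves the data representation open)."""
    first_value = angle_at[t1]   # S110: angle of the first region
    second_value = angle_at[t2]  # S120: angle of the second region
    # S130: execute only if the angle changed by at least the threshold
    return abs(second_value - first_value) >= threshold_deg
```

For example, an angle change from 10° to 55° exceeds a 30° threshold and would trigger the processing step, while a 10° change would not.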

The predetermined processing step may include causing a change in the display state of a display device adapted to display an image in a fixed application window, such that a first image displayed in the fixed application window at the first time is rotated by a predetermined angle with respect to the fixed application window.

The method may further include calculating a first center position, which is the center position of the first region, and a second center position, which is the center position of the second region. The predetermined processing step may then be executed when the first value and the second value differ by the predetermined threshold or more and the distance between the first center position and the second center position is less than or equal to a predetermined threshold distance.

At this time, the first region may be modeled as a rectangle or an ellipse, and the first axis may represent the major axis or minor axis of the rectangle or the ellipse.

At this time, the second region may be modeled as a rectangle or an ellipse, and the second axis may represent the major axis or minor axis of the rectangle or the ellipse.

According to a third aspect of the present invention, there is provided a user device including a touch input device having a touch-sensitive surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor. The program includes instructions for: determining a first region of the touch-sensitive surface that is determined to have been touched by a touch event generated by a touch tool, at a first time during the duration of the touch event, and calculating a first value relating to the angle of a first axis of the first region; determining a second region of the touch-sensitive surface that is determined to have been touched by the touch event at a second time during the duration of the touch event, and calculating a second value relating to the angle of a second axis of the second region; and, if the first value and the second value differ by a predetermined threshold or more, executing a predetermined processing step, and otherwise determining not to execute the predetermined processing step.

According to a fourth aspect of the present invention, there may be provided a computer-readable medium including a program that causes a user device, which includes a touch input device having a touch-sensitive surface, a processor, and a memory, to execute: determining a first region of the touch-sensitive surface that is determined to have been touched by a touch event at a first time during the duration of the touch event, and calculating a first value relating to the angle of a first axis of the first region; determining a second region of the touch-sensitive surface that is determined to have been touched by the touch event at a second time during the duration of the touch event, and calculating a second value relating to the angle of a second axis of the second region; and, if the first value and the second value differ by a predetermined threshold or more, executing a predetermined processing step, and otherwise determining not to execute the predetermined processing step. The program is stored in the memory and configured to be executed by the processor.

According to the present invention, a new processing technique for handling touch events generated in a touch input device can be provided. Specifically, a user's input intention can be conveyed accurately even on a touch-sensitive surface having a small area. Even when only a narrow touch-sensitive surface is provided, on which multi-touch is particularly difficult, various user inputs can be performed using the present invention. Likewise, even in the single-touch case, various user inputs can be performed on a touch-sensitive surface so narrow that dragging is difficult.

FIG. 1 shows an example of a user device capable of performing a touch event processing method according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.
FIG. 3 shows an example of determining, through modeling, the directionality of a touch region created by a touch event according to an embodiment of the present invention.
FIG. 4 is a diagram for explaining the progress of a touch event that changes its angle in an embodiment of the present invention.
FIG. 5 shows an example of the image processing that follows when a touch event according to FIG. 4 occurs.
FIG. 6 is a diagram for explaining the progress of a touch event that changes its angle in another embodiment of the present invention.
FIG. 7 is a diagram for explaining the progress of a touch event that changes its angle in yet another embodiment of the present invention.
FIG. 8 is a flowchart illustrating a method according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. However, the present invention is not limited to the embodiments described herein, but may be implemented in various other forms. The terminology used herein is for the purpose of understanding the embodiments and is not intended to limit the scope of the present invention. In addition, the singular forms used below include plural forms unless the phrases expressly have the opposite meaning.

FIG. 1 shows an example of a user device capable of performing a touch event processing method according to an embodiment of the present invention.

The user device 100 includes a memory 110, a control chip 120, an external port 130, a power unit 140, and an input/output subsystem 150, as well as other functional units not shown.

The control chip 120 may include a memory control unit 121 for controlling the memory 110, a processor 122, and a peripheral device interface unit 123. The power unit 140 may supply power to all power-consuming elements included in the user device 100. The input/output subsystem 150 may include a touch input device control unit 151 that controls the touch input device 10, a display device control unit 152 that controls the display device 20, and another input/output device control unit 153 that controls the other input/output devices 30. The external port 130 may refer to a physical/logical port for connecting the user device 100 to an external device.

The touch input device 10, the display device 20, and the other input/output devices 30 may be integrally installed in the user device 100, or may be provided separately from the user device 100 and connected to it through a network.

2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.

The operating principle of some embodiments of the capacitive touch input device is disclosed in Korean Patent Laid-Open Nos. KR 10-2011-0076059 and KR 10-2011-0126026, the contents of which are incorporated herein by reference.

FIG. 2(a) shows a touch-sensitive surface 2 included in a capacitive touch input device (hereinafter, "touch input device"). The touch-sensitive surface 2 is the surface actually provided to receive touch input and may be covered with a cover or the like to prevent external contaminants from entering the user device. Embodiments of the touch-sensitive surface 2 are shown in the above-mentioned prior patent documents.

The touch-sensitive surface 2 may include a plurality of touch nodes 21. Each touch node can be connected to a touch IC of the touch input device, and the touch IC can measure the electrostatic capacitance value of each touch node. Distinct touch nodes may refer to basic units whose capacitance values the touch IC can measure separately from one another, and each may have a constant area. For example, in one example of the self-capacitance type, each touch node may be provided by an electrode clearly distinct from the others, while in one example of the mutual-capacitance type a touch node may be defined as the intersection region of a driving electrode and a sensing electrode. The idea of the invention does not depend on the specific way the touch node is implemented.

It can be assumed that the touch tool contacts the area occupied by the dotted circle 200 over the plurality of touch nodes 21 shown in FIG. 2(a). The amount of change in capacitance caused by this contact can then be calculated for each of the touch nodes 201 to 204 shown in FIG. 2(b). The amount of change in the capacitance of each touch node may be proportional to the area of the contact surface between that touch node and the touch tool. For example, the capacitance change of touch node 204 is the largest, and the change decreases in the order of touch node 203, touch node 202, and touch node 201.

FIG. 3(a) shows an example in which a touch tool such as a finger touches the touch-sensitive surface 2 shown in FIG. 2. In the following description, a portion contacted by a touch event as shown in FIG. 3(a) may be referred to as a "touch region", a "first region", or a "second region".

FIG. 3(b) shows an example of the capacitance change values of the touch nodes when a touch event as shown in FIG. 3(a) occurs. The capacitance change of the six touch nodes 204 is large, the capacitance change of the two touch nodes 202 is small, and the capacitances of the remaining touch nodes 201 are unchanged.

FIG. 3(c) conceptually illustrates an example of modeling the touch region of FIG. 3(a) as an ellipse; the touch nodes are omitted from the illustration. In FIG. 3(a) the contact surface between the finger and the touch-sensitive surface 2 has a shape close to an ellipse, but the touch region shown in FIG. 3(b) (the set of touch nodes judged to have a capacitance change) may have a non-elliptical shape. Nevertheless, an elliptical region as shown in FIG. 3(c) can be determined from the capacitance change values of the touch nodes in FIG. 3(b), and the major and minor axes of that elliptical region can then be defined.
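The patent does not specify how the ellipse is fitted to the capacitance change values. One plausible sketch, offered here purely for illustration, estimates the axis direction from the weighted second moments of the capacitance-change grid (the equivalent-ellipse technique from image moments):

```python
import math

def touch_orientation(deltas):
    """Estimate the major-axis angle (degrees) of a touch region from a
    2-D grid of per-node capacitance change values, via weighted second
    moments of the equivalent ellipse. This is an assumed fitting
    method, not one taken from the patent."""
    total = float(sum(sum(row) for row in deltas))
    if total == 0.0:
        return None  # no capacitance change: no touch detected
    pts = [(x, y, d) for y, row in enumerate(deltas) for x, d in enumerate(row)]
    cx = sum(x * d for x, y, d in pts) / total  # weighted centroid x
    cy = sum(y * d for x, y, d in pts) / total  # weighted centroid y
    # central second moments of the capacitance-change distribution
    mxx = sum((x - cx) ** 2 * d for x, y, d in pts) / total
    myy = sum((y - cy) ** 2 * d for x, y, d in pts) / total
    mxy = sum((x - cx) * (y - cy) * d for x, y, d in pts) / total
    # orientation of the major axis of the equivalent ellipse
    return math.degrees(0.5 * math.atan2(2.0 * mxy, mxx - myy))
```

The returned angle could serve as the "first value" or "second value" of steps S110 and S120; a diagonal band of capacitance changes yields roughly 45°, a horizontal band 0°.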

FIG. 3 (d) conceptually illustrates an example of modeling a touch region according to FIG. 3 (a) as a rectangle. It is possible to define the directions of the long and short axes of the rectangle similarly to FIG. 3 (c).

In FIGS. 3(c) and 3(d), the touch region is modeled as an ellipse or a rectangle in order to define a major axis and a minor axis. The model need not be limited to these shapes: for example, if the touch tool contacting the touch-sensitive surface 2 is not a finger but an object with an elongated triangular cross-section, the touch region may be modeled as such a triangle. In the present invention, the modeled figure may be any shape for which a major and/or minor axis can be defined; indeed, it suffices to be able to determine the value of some parameter indicating the directionality of the touch region, even if that parameter is not strictly a major or minor axis.

FIG. 4 illustrates how a single touch event is defined and how it is subsequently processed according to an embodiment of the present invention.

FIG. 4(a) shows the time from the start to the end of one touch event. The instant Ts at which the touch tool contacts the touch-sensitive surface 2 can be defined as the start of the touch event, and the instant Te at which the touch tool is separated from the touch-sensitive surface 2 as its end. One touch event therefore exists from time Ts to time Te, and this period can be defined as the duration 55 of the touch event. FIG. 4(a) shows a case where the finger comes into contact with the touch-sensitive surface 2 at time Ts and is separated from it at time Te. At the first time T1, the direction in which the finger points is inclined counterclockwise from the vertical line 81 by the angle θ1°; at the second time T2 it is inclined by the angle θ2°. Since it is assumed that the finger does not leave the touch-sensitive surface 2 during the duration 55 of the touch event, the finger can be regarded as having rotated clockwise between the first time T1 and the second time T2.

In one embodiment of the present invention, a first region of the touch-sensitive surface 2, determined to have been touched by the touch event at the first time T1 of the duration 55 of the touch event, can be determined (step S110). The purpose of determining the first region is to calculate a first value relating to the angle θ1° of the major and/or minor axis of the first region, that is, the angle of the direction in which the first region points.

Next, a second region of the touch-sensitive surface 2, determined to have been touched by the touch event at the second time T2 of the duration 55 of the touch event, may be determined (step S120). The purpose of determining the second region is, as illustrated in FIGS. 4(c) and 4(e), to calculate a second value relating to the angle θ2° of the major and/or minor axis of the second region.

At this time, the second time T2 may be later than the first time T1.

Then, if the first value and the second value differ by a predetermined threshold or more, a predetermined processing step may be executed; otherwise it may be determined not to execute the predetermined processing step (step S130). Here, the first value and the second value may be, for example, the angle θ1° of the direction of the first region and the angle θ2° of the direction of the second region. This step amounts to judging that a predetermined user input has been performed when the finger touching the touch-sensitive surface 2 has rotated sufficiently clockwise or counterclockwise while maintaining contact. The predetermined processing step is then executed in response to that user input.

In the case of FIG. 4, it can be assumed that the difference (θ2° − θ1°) is larger than the predetermined threshold value. Accordingly, it is determined that the user command has been reliably input, and a subsequent process, such as rotating the screen displayed to the user in a certain direction, can be started.

Hereinafter, the meaning of the predetermined processing step will be described with reference to FIGS. 4 and 5.

FIG. 5(a) shows an example of a first image 91 displayed on a user screen according to an embodiment of the present invention. The dotted line of the first image 91 represents its virtual boundary.

FIG. 5(b) shows the relative arrangement of a fixed application window 95 and the first image 91 provided according to an embodiment of the present invention. The fixed application window 95 may be a region printed in advance on the synthetic resin or glass that forms the case of the screen display part of the user device, distinguishing its inside from its outside. Alternatively, the fixed application window 95 may be a fixed display area defined and displayed in software on the screen display unit of the user device. In FIG. 5(b) the first image 91 is arranged upright in the fixed application window 95, and as a result a screen as shown in FIG. 5(c) can be displayed.

FIG. 5(d) shows the first image 91 rotated 90° clockwise within the fixed application window 95, and as a result a screen as shown in FIG. 5(e) can be displayed.

FIG. 5(f) shows the first image 91 rotated 40° clockwise within the fixed application window 95, and as a result a screen as shown in FIG. 5(g) can be displayed.

The predetermined processing step may be a step of causing a change in the display state of the display device adapted to display an image in the fixed application window 95, such that the first image 91 displayed at the first time is rotated by a predetermined angle (e.g., 40° or 90°) with respect to the fixed application window 95. Therefore, when a touch event as shown in FIG. 4 occurs, the screen displayed at the first time T1 as in FIG. 5(c) can change, after the second time T2, into the screen shown in FIG. 5(e) or FIG. 5(g).
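The rotation itself can be sketched minimally, assuming the first image is represented by points in window coordinates with y growing downward (all names here are illustrative, not taken from the patent):

```python
import math

def rotate_about_window_center(point, window_center, angle_deg):
    """Rotate one point of the first image clockwise by angle_deg around
    the fixed application window's center. With screen coordinates
    (y grows downward), the standard rotation matrix produces a
    visually clockwise rotation."""
    a = math.radians(angle_deg)
    dx = point[0] - window_center[0]
    dy = point[1] - window_center[1]
    return (window_center[0] + dx * math.cos(a) - dy * math.sin(a),
            window_center[1] + dx * math.sin(a) + dy * math.cos(a))
```

Applying this with angle_deg = 90 to every point of the first image 91 would yield the arrangement of FIG. 5(d); with angle_deg = 40, that of FIG. 5(f).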

FIG. 6 illustrates a case in which, unlike FIG. 4, the finger touching the touch-sensitive surface 2 is not rotated sufficiently. In the case of FIG. 6, it can be assumed that the first value and the second value do not differ by the predetermined threshold or more. In this case, the screen displayed at the first time T1 as in FIG. 5(c) can still maintain that display state even after the second time T2.

7 is a diagram for explaining a method of processing another type of touch event according to another embodiment of the present invention.

7 (a) shows the duration 55 of the touch event.

7 (b) shows a case where the finger moves while maintaining the contact state from the left portion to the right portion of the touch-sensitive surface 2.

FIG. 7(c) is a conceptual diagram of the first region and the second region, each modeled as an ellipse.

Referring to FIGS. 7(a) and 7(b), the finger starts to contact the touch-sensitive surface 2 at time Ts and leaves it at time Te. At the first time T1 the finger is in contact with the first region on the left side of the touch-sensitive surface 2, and at the second time T2 it is in contact with the second region on the right side. During this process the finger is dragged from the first region to the second region while maintaining contact.

At this time, the same steps as described for FIG. 4 can be executed. However, comparing the touch event of FIG. 7 with that of FIG. 4: in FIG. 4 the distance between the first center position (x1, y1) of the first region at the first time T1 and the second center position (x2, y2) of the second region at the second time T2 is very small, whereas in FIG. 7 that distance is large.

Depending on the embodiment, the difference between FIG. 4 and FIG. 7 may or may not be distinguished.

In one embodiment of the present invention, a step S140 of calculating the first center position (x1, y1), which is the center position of the first region, and the second center position (x2, y2), which is the center position of the second region, may further be included. Step S130 may then execute the predetermined processing step only when the first value and the second value differ by the predetermined threshold or more and the distance between the first center position (x1, y1) and the second center position (x2, y2) is less than or equal to a predetermined threshold distance.
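Steps S130 and S140 combined can be sketched as follows; the threshold values are illustrative assumptions, not taken from the patent:

```python
import math

def is_rotation_gesture(theta1, theta2, c1, c2,
                        angle_threshold_deg=30.0,
                        distance_threshold=5.0):
    """Steps S130 + S140: treat the touch event as a rotation gesture
    only when the axis angle changed by at least the angle threshold
    AND the region's center barely moved, which rules out an ordinary
    drag like the one in FIG. 7. Angles are in degrees; c1 and c2 are
    (x, y) center positions in arbitrary surface units."""
    angle_changed = abs(theta2 - theta1) >= angle_threshold_deg
    center_dist = math.hypot(c2[0] - c1[0], c2[1] - c1[1])
    return angle_changed and center_dist <= distance_threshold
```

A 50° angle change with the center moving only half a unit passes; the same angle change with the center moving 18 units (a drag) does not.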

Alternatively, in another embodiment of the present invention, the predetermined processing step may be executed even if the distance between the first center position (x1, y1) and the second center position (x2, y2) is greater than or equal to the predetermined threshold distance. In this case, even when a touch event as shown in FIG. 7 occurs, the first image 91 may be rotated as shown in FIG. 5; the rotation is then attributed not to the drag operation of FIG. 7 itself but to the change in the direction of the touch region.

In yet another embodiment of the present invention, the method may further include calculating a first width of the first region and a second width of the second region (step S150). Step S130 may then be executed only when the first width and the second width are both larger than a predetermined critical width. This is because the first region and the second region may be very small when a touch event is generated accidentally by a touch tool such as a finger.
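The step S150 guard might look like this, measuring region size as a simple touch-node count; the critical value is an assumption for illustration:

```python
def passes_width_gate(first_width, second_width, critical_width=4):
    """Step S150: allow the angle comparison (S130) to run only when
    both touch regions exceed a minimum size, filtering out accidental
    grazing touches that produce very small regions."""
    return first_width > critical_width and second_width > critical_width
```

Only if this gate passes would the rotation decision of step S130 be evaluated at all.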

FIG. 8 is a flowchart illustrating a method according to an embodiment of the present invention described in FIG.

According to another embodiment of the present invention, a user device 100 may be provided that includes a touch input device 10 having a touch-sensitive surface 2, a processor 122, a memory 110, and a program stored in the memory 110 and configured to be executed by the processor 122.

At this time, the program may include instructions for executing the above-described steps S110, S120, and S130, and may further include instructions for performing step S140 described above.

Meanwhile, according to another embodiment of the present invention, a computer-readable medium may be provided that includes a program causing the user device 100, which includes the touch input device 10 having the touch-sensitive surface 2, the processor 122, and the memory 110, to execute steps S110, S120, and S130. The program is stored in the memory 110 and configured to be executed by the processor 122, and may further include instructions for executing step S140 described above.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the essential characteristics thereof. The contents of each claim in the claims may be combined with other claims without departing from the scope of the claims.

Claims (10)

A method of determining whether to execute a predetermined processing step when a touch event by a touch tool occurs on a touch-sensitive surface of a touch input device, the method comprising:
determining a first region of the touch-sensitive surface that is determined to have been touched by the touch event at a first time during the duration of the touch event, and calculating a first value relating to the angle of a first axis of the first region;
determining a second region of the touch-sensitive surface that is determined to have been touched by the touch event at a second time during the duration of the touch event, and calculating a second value relating to the angle of a second axis of the second region; and
executing the predetermined processing step if the first value and the second value differ by a predetermined threshold or more, and otherwise determining not to execute the predetermined processing step,
A touch event processing method.
The method according to claim 1,
wherein the predetermined processing step causes a change in the display state of a display device adapted to display an image in a fixed application window, such that a first image displayed in the fixed application window at the first time is rotated by a predetermined angle with respect to the fixed application window.
The method according to claim 1,
further comprising calculating a first center position that is the center position of the first region and a second center position that is the center position of the second region,
wherein the predetermined processing step is executed when the first value and the second value differ by the predetermined threshold or more and the distance between the first center position and the second center position is less than or equal to a predetermined threshold distance,
A touch event processing method.
The touch event processing method according to claim 1, wherein the first region is modeled as a rectangle or an ellipse, and the first axis represents a major axis or a minor axis of the rectangle or the ellipse.

The touch event processing method according to claim 4, wherein the second region is modeled as a rectangle or an ellipse, and the second axis represents a major axis or a minor axis of the rectangle or the ellipse.

A user device comprising a touch input device having a touch-sensitive surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor,
The program includes:
determining a first region of the touch-sensitive surface that is determined to have been touched by a touch event generated by a touch tool, at a first time during the duration of the touch event, and calculating a first value relating to the angle of a first axis of the first region;
determining a second region of the touch-sensitive surface that is determined to have been touched by the touch event at a second time during the duration of the touch event, and calculating a second value relating to the angle of a second axis of the second region; and
executing a predetermined processing step if the first value and the second value differ by a predetermined threshold or more, and otherwise determining not to execute the predetermined processing step,
the program comprising instructions for performing the above steps,
User device.
The user device according to claim 6,
further comprising a display device,
wherein the predetermined processing step causes a change in the display state of the display device adapted to display an image in a fixed application window, such that a first image displayed in the fixed application window at the first time is rotated and displayed by a predetermined angle with respect to the fixed application window,
User device.
The method according to claim 6,
The program includes:
Further comprising instructions for performing a step of calculating a first center position which is a center position of the first area and a second center position which is a center position of the second area,
Wherein the program is adapted to execute the predetermined processing step when the first value and the second value differ by more than the predetermined threshold value and the distance between the first center position and the second center position is less than or equal to a predetermined critical distance,
User device.
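The combined condition of claim 8 — the axis angle changing by at least the threshold while the region's center stays within a critical distance — can be sketched as follows. The function name and the two threshold values are illustrative assumptions, not values from the patent.

```python
import math

def is_rotation_gesture(angle1, angle2, center1, center2,
                        angle_threshold=math.radians(20),
                        max_center_shift=15.0):
    """Decide whether two snapshots of one touch region form a
    rotation gesture: the axis angle must change by at least
    `angle_threshold` while the region's center moves no more than
    `max_center_shift` (in sensor-grid units)."""
    d_angle = abs(angle2 - angle1)
    # Axis orientations are ambiguous by pi; use the smaller
    # equivalent angular difference.
    d_angle = min(d_angle, math.pi - d_angle)
    dx = center2[0] - center1[0]
    dy = center2[1] - center1[1]
    shift = math.hypot(dx, dy)
    return d_angle >= angle_threshold and shift <= max_center_shift
```

Requiring a small center shift is what distinguishes a finger rotating in place from an ordinary drag, which also changes the contact region's orientation but moves its center.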
9. A computer-readable medium for a user device including a touch input device having a touch-sensitive surface, a processor, and a memory, the medium storing a program for performing the steps of:
Detecting, at a first time within the duration of a touch event generated by a touch tool on the touch-sensitive surface, a first region of the touch-sensitive surface determined to have been touched by the touch event, and calculating a first value for the angle between a first axis of the first region and a reference direction;
Determining, at a second time within the duration of the touch event, a second region of the touch-sensitive surface determined to have been touched by the touch event, and calculating a second value for the angle between a second axis of the second region and the reference direction; And
Determining whether the difference between the first value and the second value is greater than or equal to a predetermined threshold, executing a predetermined processing step if it is, and otherwise determining not to execute the predetermined processing step,
The program being stored in the memory, configured to be executed by the processor, and comprising instructions for performing the above steps.
Computer-readable medium.
10. The computer-readable medium of claim 9,
The program includes:
Further comprising instructions for performing a step of calculating a first center position, which is the center position of the first region, and a second center position, which is the center position of the second region,
Wherein the program is adapted to execute the predetermined processing step when the first value and the second value differ by more than the predetermined threshold value and the distance between the first center position and the second center position is less than or equal to a predetermined critical distance.
KR20140042615A 2013-04-10 2014-04-09 Method for processing touch event where touch area rotates and device for the same KR20140122682A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/KR2014/003116 WO2014168431A1 (en) 2013-04-10 2014-04-10 Method for processing touch event and apparatus for same
CN201480020923.1A CN105308540A (en) 2013-04-10 2014-04-10 Method for processing touch event and apparatus for same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20130039553 2013-04-10
KR1020130039553 2013-04-10

Publications (1)

Publication Number Publication Date
KR20140122682A true KR20140122682A (en) 2014-10-20

Family

ID=51993693

Family Applications (3)

Application Number Title Priority Date Filing Date
KR20140042615A KR20140122682A (en) 2013-04-10 2014-04-09 Method for processing touch event where touch area rotates and device for the same
KR1020140042616A KR101661606B1 (en) 2013-04-10 2014-04-09 Method for processing touch event when a touch point is rotating respectively to other touch point
KR1020140043052A KR102191321B1 (en) 2013-04-10 2014-04-10 Method for processing touch event and device for the same

Family Applications After (2)

Application Number Title Priority Date Filing Date
KR1020140042616A KR101661606B1 (en) 2013-04-10 2014-04-09 Method for processing touch event when a touch point is rotating respectively to other touch point
KR1020140043052A KR102191321B1 (en) 2013-04-10 2014-04-10 Method for processing touch event and device for the same

Country Status (2)

Country Link
KR (3) KR20140122682A (en)
CN (1) CN105308540A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250022B (en) * 2016-07-29 2019-07-09 努比亚技术有限公司 Content selection method of adjustment, device and terminal
CN106020712B (en) * 2016-07-29 2020-03-27 青岛海信移动通信技术股份有限公司 Touch gesture recognition method and device
CN106569723A (en) * 2016-10-28 2017-04-19 努比亚技术有限公司 Device and method for controlling cursor movement
US11922008B2 (en) 2021-08-09 2024-03-05 Samsung Electronics Co., Ltd. Electronic device processing input of stylus pen and method for operating the same

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5684136B2 (en) * 2008-10-28 2015-03-11 サーク・コーポレーション Multi-contact area rotation gesture recognition method
US8957918B2 (en) * 2009-11-03 2015-02-17 Qualcomm Incorporated Methods for implementing multi-touch gestures on a single-touch touch surface
EP2378403A1 (en) * 2010-04-19 2011-10-19 Tyco Electronics Services GmbH Method and device for determining a user's touch gesture
CN101917548A (en) * 2010-08-11 2010-12-15 无锡中星微电子有限公司 Image pickup device and method for adaptively adjusting picture
KR101095851B1 (en) * 2010-11-10 2011-12-21 채상우 Touch screen apparatus and method for controlling touch screen
KR101718893B1 (en) * 2010-12-24 2017-04-05 삼성전자주식회사 Method and apparatus for providing touch interface
TWI478041B (en) * 2011-05-17 2015-03-21 Elan Microelectronics Corp Method of identifying palm area of a touch panel and a updating method thereof

Also Published As

Publication number Publication date
KR102191321B1 (en) 2020-12-16
KR20140122687A (en) 2014-10-20
KR101661606B1 (en) 2016-09-30
CN105308540A (en) 2016-02-03
KR20140122683A (en) 2014-10-20

Similar Documents

Publication Publication Date Title
AU2018282404B2 (en) Touch-sensitive button
US10379727B2 (en) Moving an object by drag operation on a touch panel
JP5716503B2 (en) Information processing apparatus, information processing method, and computer program
KR101892315B1 (en) Touch event anticipation in a computing device
JP5716502B2 (en) Information processing apparatus, information processing method, and computer program
CN107741824B (en) Detection of gesture orientation on repositionable touch surface
US20110234522A1 (en) Touch sensing method and system using the same
JP5738707B2 (en) Touch panel
US20150193037A1 (en) Input Apparatus
CN104007932A (en) Touch point recognition method and device
TW201327310A (en) Multi-surface touch sensor device with mode of operation selection
JP6410537B2 (en) Information processing apparatus, control method therefor, program, and storage medium
KR20140122682A (en) Method for processing touch event where touch area rotates and device for the same
KR102198596B1 (en) Disambiguation of indirect input
US9367169B2 (en) Method, circuit, and system for hover and gesture detection with a touch screen
JP2014109883A (en) Electronic apparatus and method of controlling the same
JP6255321B2 (en) Information processing apparatus, fingertip operation identification method and program
US9317167B2 (en) Touch control system and signal processing method thereof
KR101835952B1 (en) Apparatus and method for controlling scroll of screen
JP2013246481A (en) Operation input device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application