WO2014168431A1 - Touch event processing method and apparatus therefor - Google Patents

Touch event processing method and apparatus therefor

Info

Publication number
WO2014168431A1
WO2014168431A1 (PCT/KR2014/003116)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
value
area
center position
touch event
Prior art date
Application number
PCT/KR2014/003116
Other languages
English (en)
Korean (ko)
Inventor
강회식
소병철
장선웅
Original Assignee
주식회사 지니틱스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020140042616A (granted as KR101661606B1)
Application filed by 주식회사 지니틱스 filed Critical 주식회사 지니틱스
Priority to CN201480020923.1A (published as CN105308540A)
Publication of WO2014168431A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • the present invention relates to a method of processing a touch event generated by a touch tool contacting a touch sensing surface of a touch input device.
  • the touch input device can be used in various user devices. To date, it has been used in devices such as smartphones, PDAs, laptops, and tablets that provide display screens and touch input pads. In the future, a touch input device may be used for a user device having a very small display screen and a touch input pad, such as a watch worn on a wrist.
  • the stylus pen has a thin pen tip that allows precise input.
  • in the case of a finger, since the contact surface between the finger and the touch sensing surface of the touch input device is large, it is difficult to perform a user input gesture with the finger when the total area of the provided touch sensing surface is relatively small, and finger gestures may not be recognized properly. For example, this problem may occur in a wristwatch-type touch input device. Accordingly, there is a need for a new type of touch input technology capable of accepting efficient user input even when the touch sensing surface of the touch input device is narrow.
  • the present invention aims to provide a new technology for processing a touch event generated in the touch input device. Specifically, it aims to provide a technology capable of accurately conveying a user's input intention even on a touch sensing surface having a small area.
  • the thumb is thicker than the other fingers. That is, the contact area when touching with the thumb is larger than the contact area when touching with another finger. Therefore, in the present invention, when a single touch input occurs, it is determined that the contact was made with the thumb if the contact area is wide, and with another finger if it is narrow.
  • the criterion for being wide or narrow may be determined based on a predetermined threshold.
  • however, since the size of the finger varies from person to person, the contact area may also vary from person to person. Therefore, it may be necessary to set the above threshold value differently for each user.
  • pre-calibration can be carried out.
  • an input area registration step using a thumb and an input area registration step using another finger can be performed.
  • These two steps allow you to set a threshold that distinguishes between input by the thumb of a particular user and input by another finger.
  • the above registration steps may be repeatedly performed several times, and the threshold value may be determined by statistical values of repeatedly inputted contact areas.
  • the pre-calibration process described above may be executed each time the user uses the user device according to the present invention; in another embodiment, however, the threshold value determined by executing the pre-calibration process once may be stored for each user, so that the stored threshold value can later be selected and used when that user operates the device. A sketch of such a calibration follows.
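The calibration just described can be expressed compactly; the sketch below is illustrative only (the function names and the midpoint statistic are assumptions, since the text requires only that the threshold be derived from statistics of the repeated registration inputs):

```python
from statistics import mean

def calibrate_threshold(thumb_areas, finger_areas):
    """Derive a per-user contact-area threshold from the two registration steps.

    thumb_areas / finger_areas: contact areas collected while the user
    repeatedly registers a thumb touch and an other-finger touch. Taking the
    midpoint of the two means is one simple choice of statistic.
    """
    return (mean(thumb_areas) + mean(finger_areas)) / 2.0

def is_thumb(contact_area, threshold):
    """Classify a single touch: a wide contact is taken as the thumb,
    a narrow one as another finger."""
    return contact_area >= threshold

# Example: five registration touches per finger type for one user. The
# resulting threshold could be stored and reloaded for that user later.
threshold = calibrate_threshold([9.2, 8.7, 9.5, 9.0, 8.9],
                                [4.1, 4.4, 3.9, 4.2, 4.0])
print(is_thumb(8.8, threshold))  # True -> e.g. activate the first function
```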
  • the touch input by the thumb may be used as a gesture for activating the first function of the user device, and the touch input by another finger may be used as a gesture for activating the second function of the user device.
  • according to the second idea of the present invention, a new single-touch input gesture is defined and used. For example, after the user touches the surface of the touch input panel with one fingertip, the contact surface may be rotated about its center while the touch is maintained. That is, instead of dragging the contact portion of the finger, the contact portion itself can be rotated.
  • the user device can recognize that the contact surface has rotated because the contact surface made by the finger is not circular. It can be modeled as an ellipse; since an ellipse has a long axis and a short axis, when the elliptical contact surface rotates in place, the long axis and the short axis rotate together.
  • in the method according to an aspect of the present invention, by analyzing the shape of the contact surface from the time of touch contact to the time of release, it is possible to determine whether the long-axis or short-axis angle of the contact surface has rotated during that period.
  • the rotation angle and/or rotational angular velocity may be used as touch input gesture parameters to define a new gesture.
  • when the contact surface occupied by the single touch input rotates as described above, it is determined that a gesture has been made, and the gesture may be used to activate a specific function of the user device.
  • A figure to help understand the second idea is shown in FIG. 17.
  • in the third idea of the present invention, a new multi-touch input gesture is defined and used. This method deals with the case where touch input is made simultaneously at two spaced points. In this case, it may be determined whether the first touch point rotates relative to the second touch point while maintaining a constant distance from it.
  • both touch points may be moving on the touch panel, or one of them may remain fixed at a specific point on the touch panel.
  • for example, the first touch point may be moved around the second touch point the way a compass is turned.
  • the gesture may be used as a gesture for activating a specific function of the user device.
  • A figure to help understand the third idea of the invention is shown in FIG. 18.
  • in the fourth idea of the present invention, a new multi-touch input gesture is defined and used. This method also deals with the case where touch input is made simultaneously at two spaced points.
  • a center point between the first touch point and the second touch point is set.
  • the gesture may be used as a gesture for activating a specific function of the user device.
  • FIG. 19 is a view to help understand the fourth idea of the present invention.
  • in the fifth idea of the present invention, a new multi-touch input gesture is defined and used. This method deals with the case where touch input is made simultaneously at two spaced points.
  • the first touch point and the second touch point may rotate about each other while remaining in contact with the touch panel, the way binary stars orbit one another.
  • an angle may be calculated between a first line, connecting the first touch point and the second touch point at the start of the rotation, and a second line, connecting them at the end.
  • the rotational angular velocity from the start of this rotation to its completion may also be measured. This angle and/or rotational angular velocity may be used as parameters for gesture determination, as in the sketch below. FIG. 20 is a view to help understand the fifth idea of the present invention.
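A minimal sketch of computing these two parameters from sampled touch points follows; the names, the sampling interface, and the angle normalization step are assumptions, not taken from the text:

```python
import math

def line_angle(p1, p2):
    """Angle (degrees) of the line running from the second to the first touch point."""
    return math.degrees(math.atan2(p1[1] - p2[1], p1[0] - p2[0]))

def rotation_parameters(t_start, p1_start, p2_start, t_end, p1_end, p2_end):
    """Rotation angle and mean angular velocity of the line joining two touch
    points, between the first and last samples of a two-point touch."""
    delta = line_angle(p1_end, p2_end) - line_angle(p1_start, p2_start)
    # Normalize to (-180, 180] so a small physical turn is not read as ~360 deg.
    delta = (delta + 180.0) % 360.0 - 180.0
    velocity = delta / (t_end - t_start)  # degrees per second
    return delta, velocity

# First finger sweeps around a second finger held at (50, 10) for 0.5 s.
angle, speed = rotation_parameters(0.0, (10, 10), (50, 10),
                                   0.5, (10, 30), (50, 10))
print(round(angle, 1), round(speed, 1))  # -26.6 -53.1
```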
  • a new multi-touch input gesture is defined and used. This method deals with the case where the touch input is made simultaneously at two spaced points.
  • a gesture may be defined by distinguishing between a first case, in which one of the first touch input and the second touch input is made by the thumb, and a second case, in which both touch inputs are made without the thumb.
  • the first contact surface contacted by the thumb and the second contact surface contacted by the other finger have different characteristics.
  • the first contact surface is wider than the second contact surface.
  • the inclination of the long axis of the first contact surface is different from the inclination of the long axis of the second contact surface by a certain degree or more.
  • the aspect ratio of the first contact surface, viewed as an ellipse, is larger than the aspect ratio of the second contact surface viewed as an ellipse. Details are given in the detailed description below.
  • the gestures presented in the second to fifth ideas of the present invention described above include a technical element of 'rotation'.
  • since the user device according to the present invention can further detect the rotation angle and the rotational angular velocity, the gesture can be refined using these two parameters. For example, it may be determined only whether the rotation angle is greater than or equal to a critical angle, or the specific rotation angle itself may additionally be determined when it exceeds the critical angle. Alternatively, it may be determined whether the rotational angular velocity exceeds a threshold regardless of the rotation angle, and, where it does, the specific angular velocity may additionally be determined. The determination results for the rotation angle and the rotational angular velocity may also be combined, and depending on the result it may be decided that different gestures have been made, as in the sketch below.
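One way to combine the two determinations is sketched below; the gesture labels and threshold values are placeholders, since the text only states that the comparisons may be made separately or together:

```python
def classify_rotation_gesture(angle_deg, angular_velocity_dps,
                              angle_threshold=30.0, velocity_threshold=90.0):
    """Map rotation angle and rotational angular velocity to distinct gestures."""
    wide = abs(angle_deg) >= angle_threshold                 # critical-angle test
    fast = abs(angular_velocity_dps) >= velocity_threshold   # velocity test
    if wide and fast:
        return "gesture_A"  # large, quick rotation
    if wide:
        return "gesture_B"  # large, slow rotation
    if fast:
        return "gesture_C"  # small but quick rotation
    return None             # below both thresholds: no gesture recognized

print(classify_rotation_gesture(45.0, 120.0))  # gesture_A
print(classify_rotation_gesture(45.0, 10.0))   # gesture_B
```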
  • a first function of the above two functions is to rotate the entire display screen clockwise or counterclockwise by a specific angle.
  • the specific angle may be, for example, 90 °.
  • the specific angle may be the actual rotation angle described in the second to fifth ideas as described above.
  • in a conventional approach, the user device itself is tilted to rotate the display screen, using a geomagnetic sensor or the like; the present invention differs in this respect.
  • the second of the above two functions may be triggering direct transmission of data from the user device that received the gesture command to another user device.
  • the touch event processing method provided according to the first aspect of the present invention may be performed when the finger makes a rubbing gesture of rotational form while remaining in contact with the touch sensing surface (by comparison, gestures that translate across the surface while remaining in contact may be referred to as 'drag').
  • the contact surface between the finger and the touch-sensitive surface is modeled into a figure shape that can define a long axis and a short axis.
  • when the contact surface is modeled as an elliptical or rectangular shape, parameters corresponding to the long and short axes may be generated.
  • modeling the contact surface as a figure with a definable long axis and short axis is a technique separate from the inventive concept itself and is well within the current state of the art.
  • the direction of the long axis or short axis defined as described above forms an angle with respect to a reference line of the touch sensing surface.
  • when the finger rotates, the direction of the long axis or short axis changes.
  • when the change in the direction of the axis satisfies a predetermined condition, it is determined that a user input has been made.
  • a method of determining whether to execute a predetermined processing step may be provided.
  • the method may include: at a first time of the duration of the touch event, determining a first area of the touch sensing surface determined to have been touched by the touch event, and calculating a first value regarding the angle at which a first axis of the first area is directed; at a second time of the duration of the touch event, determining a second area of the touch sensing surface determined to have been touched by the touch event, and calculating a second value regarding the angle at which a second axis of the second area is directed; and, if the difference between the first value and the second value is greater than or equal to a predetermined threshold, executing the predetermined processing step, and otherwise determining not to execute it.
  • the predetermined processing step causes a change in the display state of a display device configured to display an image in a stationary application window: the first image, displayed in the fixed application window at the first time, may be rotated by a predetermined angle with respect to the fixed application window and redisplayed.
  • the method may further include calculating a first center position, which is a center position of the first region, and a second center position, which is a center position of the second region.
  • the predetermined processing step may be executed if the difference between the first value and the second value is greater than or equal to the predetermined threshold value and the distance between the first center position and the second center position is less than or equal to a predetermined threshold distance.
  • the first region may be modeled as a rectangle or an ellipse, and the first axis may represent a long axis or a short axis of the rectangle or the ellipse.
  • the second region may be modeled as a rectangle or an ellipse, and the second axis may represent a long axis or a short axis of the rectangle or the ellipse.
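Taken together, the steps above admit a compact sketch. The function name and threshold values below are illustrative assumptions, and the center-distance test is the optional refinement that separates an in-place rotation from an ordinary drag:

```python
import math

def should_execute(first_value, second_value, first_center, second_center,
                   angle_threshold=30.0, distance_threshold=15.0):
    """Decide whether the predetermined processing step should run.

    first_value / second_value: angles (degrees) of the first and second
    axes of the modeled regions at times T1 and T2.
    """
    rotated_enough = abs(second_value - first_value) >= angle_threshold
    stayed_in_place = math.dist(first_center, second_center) <= distance_threshold
    return rotated_enough and stayed_in_place

# Finger rubbed in place: the axis turned 40 deg, the center barely moved.
print(should_execute(-20.0, 20.0, (100, 100), (102, 102)))  # True
# Plain drag as in FIG. 7: the center moved far, so nothing is executed.
print(should_execute(-20.0, 20.0, (100, 100), (180, 100)))  # False
```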
  • a user device including a touch input device having a touch sensing surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor can be provided.
  • the program determines a first region of the touch sensing surface that is determined to be touched by the touch event at a first time of a duration of the touch event generated by the touch tool with respect to the touch sensing surface.
  • a program may cause a user device including a touch input device having a touch sensing surface, a processor, and a memory to perform the following: at a first time of the duration of a touch event generated by a touch tool on the touch sensing surface, determining a first area of the touch sensing surface determined to have been touched by the touch event, and calculating a first value regarding the angle at which a first axis of the first area is directed; and, at a second time of the duration of the touch event, determining a second area of the touch sensing surface determined to have been touched by the touch event, and calculating a second value regarding the angle at which a second axis of the second area is directed.
  • a computer readable medium may be provided that includes a program for performing the same. At this time, the program is stored in the memory and is configured to be executed by the processor.
  • the touch event processing method provided according to the fifth aspect of the present invention may be performed when two fingers make a rubbing gesture of rotational form while remaining in contact with the touch sensing surface (by comparison, gestures that translate across the surface while remaining in contact may be referred to as 'drag').
  • a value may be calculated for the angle that the straight line connecting the points touched by the two fingers forms with respect to a reference line of the fixed application window.
  • if the change of this angle between a first time point and a second time point of the period during which the two-finger touch is maintained exceeds the threshold value, it may be determined that a meaningful user input has been made.
  • a method may be provided for determining whether to execute a predetermined processing step when a touch event by a first touch tool and a second touch tool occurs on the touch sensing surface of a touch input device.
  • the method may include determining a first point indicated by the first area of the touch sensing surface determined to have been touched by the first touch tool at a first time of the touch event, and a second point indicated by the second area determined to have been touched by the second touch tool.
  • the predetermined processing step causes a change in the display state of a display device configured to display an image in a stationary application window: the first image, displayed in the fixed application window at the first time, may be rotated by a predetermined angle with respect to the fixed application window at a time point after the second time.
  • if the difference between the first value and the second value is greater than or equal to the predetermined threshold value, either a predetermined first processing step or a predetermined second processing step is executed; in the determining step, if the difference between the area occupied by the first area and the area occupied by the second area is larger than a predetermined area threshold value, the first processing step is executed, and otherwise the second processing step is executed.
  • similarly, if the difference between the first value and the second value is equal to or greater than the predetermined threshold value, either the predetermined first processing step or the predetermined second processing step is executed; when the distance between the first point and the third point is larger than a predetermined distance threshold value, the first processing step may be executed in the determining step, and otherwise the second processing step may be executed. One plausible combination of these conditions is sketched below.
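The sketch below merges the two selection variants just described (area difference, and first-to-third-point distance) into a single decision; this combination, the names, and the threshold values are assumptions for illustration:

```python
import math

def select_processing_step(first_value, second_value, area1, area2,
                           point1, point3, angle_threshold=30.0,
                           area_threshold=3.0, distance_threshold=10.0):
    """Return which processing step to run for a two-touch rotation, or None.

    first_value / second_value: angle of the line joining the two touch
    points against the reference line, at the first and second times.
    area1 / area2: areas of the two contact regions.
    point1 / point3: positions of the first touch at the two times.
    """
    if abs(second_value - first_value) < angle_threshold:
        return None  # rotation too small: not a meaningful user input
    if abs(area1 - area2) > area_threshold:
        return "first_processing_step"  # unequal areas, e.g. thumb involved
    if math.dist(point1, point3) > distance_threshold:
        return "first_processing_step"  # both fingers moved
    return "second_processing_step"     # one finger stayed as the axis

print(select_processing_step(0.0, 45.0, 9.0, 4.0, (10, 10), (12, 11)))
```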
  • a user device may be provided that includes a touch input device having a touch sensing surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor.
  • the program may determine a first area of the touch sensing surface determined to have been touched by the first touch tool at a first time of the duration of a touch event generated on the touch sensing surface.
  • the user device may further include a display device, wherein the predetermined processing step causes a change in the display state of the display device configured to display an image in a stationary application window, including a process of rotating the first image, displayed in the fixed application window at the first time, by a predetermined angle with respect to the fixed application window and displaying it.
  • a computer-readable medium may be provided that causes a user device including a touch input device having a touch sensing surface, a processor, and a memory to perform the steps described above, beginning at a first time of the duration of a touch event generated by a touch tool on the touch sensing surface.
  • a new processing technology for processing a touch event generated in a touch input device can be provided.
  • a technology for accurately transmitting a user's input intention can be provided even in a touch sensing surface having a small area.
  • various user inputs can be performed using the present invention.
  • FIG. 1 illustrates an example of a user device capable of performing a touch event processing method according to an embodiment of the present invention.
  • FIG. 2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.
  • FIG. 3 illustrates an example of determining, through modeling, the directionality of a touch area produced by a touch event in an embodiment of the present invention.
  • FIG. 4 is a view for explaining a single touch event in which the angle varies, in an embodiment of the present invention.
  • FIG. 5 illustrates an example of the image processing that follows when the touch event according to FIG. 4 occurs.
  • FIG. 6 is a diagram for describing a single touch event in which the angle varies, in another embodiment of the present invention.
  • FIG. 7 is a diagram for describing a single touch event in which the angle varies, according to yet another embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a method according to an embodiment of the present invention described with reference to FIG. 4.
  • FIG. 9 is a view for explaining a touch event in which two touch points rotate with respect to each other while maintaining contact, in another embodiment of the present invention.
  • FIG. 10 illustrates an example in which one touch point of the two continues to maintain substantially the same position as a special example of FIG. 9.
  • FIG. 11 is a flowchart illustrating a method according to an embodiment of the present invention described with reference to FIG. 9.
  • FIG. 12 is a diagram for defining an input object in one embodiment of the present invention.
  • FIG. 13 is a diagram for defining a slope of an input object in one embodiment of the present invention.
  • FIG. 14 is a view for explaining a contact area of an input object implemented according to an embodiment of the present invention.
  • FIG. 15 illustrates a slope of an input object implemented according to an embodiment of the present invention.
  • FIG. 16 illustrates an aspect ratio of an input object implemented according to an embodiment of the present invention.
  • FIGS. 17 to 20 illustrate a touch input method in various aspects of the present invention.
  • FIG. 21 shows an example of a user device to which the present invention can be applied.
  • FIG. 1 illustrates an example of a user device capable of performing a touch event processing method according to an embodiment of the present invention.
  • the user device 100 may include a memory 110, a control chip 120, an external port 130, a power unit 140, an input/output subsystem 150, and other functional units not shown.
  • the control chip 120 may include a memory controller 121 that controls the memory 110, a processor 122, and a peripheral device interface 123.
  • the power unit 140 may provide power to all power consumption devices included in the user device 100.
  • the input/output subsystem 150 includes a touch input device controller 151 that controls the touch input device 10, a display device controller 152 that controls the display device 20, and another input/output device controller 153 that controls the other input/output devices 30.
  • the external port 130 may refer to a physical / logical port for connecting the user device 100 to an external device.
  • the touch input device 10, the display device 20, and the other input/output device 30 may be installed integrally with the user device 100, or may be provided separately and connected to the user device 100 through the external port 130.
  • FIG. 2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.
  • the touch sensing surface 2 is actually a surface provided to receive a touch input, and may be covered with a cover or the like to prevent external pollutants from entering the user device.
  • An embodiment of the touch sensing surface 2 is shown in the above-mentioned prior patent document.
  • the touch sensing surface 2 may include a plurality of touch nodes 21.
  • Each touch node may be connected to a touch IC of the touch input device, and the touch IC may measure the capacitance value of each touch node.
  • a touch node is the basic unit for which the touch IC can measure a capacitance value separately from the other nodes, and each node has a fixed area.
  • each touch node may be provided by electrodes that are clearly distinguished from each other.
  • each touch node may be defined as an intersection region of the driving electrode and the sensing electrode. The spirit of the invention may not depend on the specific method of implementing the touch node.
  • a touch tool contacts a region occupied by the dotted circle 200 in the plurality of touch nodes 21 illustrated in FIG. 2A.
  • the amount of change in capacitance due to each contact may be calculated.
  • the amount of change in capacitance for each touch node may exhibit a tendency proportional to the area of the contact surface between each touch node and the touch tool.
  • the capacitance change amount of the touch node 204 is the largest, and the capacitance change amount may decrease in the order of the touch node 203, the touch node 202, and the touch node 201.
  • FIG. 3A illustrates an example in which a touch tool such as a finger touches the touch sensing surface 2 shown in FIG. 2.
  • a part where contact is made by a touch event as shown in FIG. 3A may be referred to as a 'touch area', 'first area', or 'second area'.
  • FIG. 3B illustrates an example of the capacitance change value of each touch node in a situation in which a touch event as shown in FIG. 3A occurs.
  • the change in capacitance is large in the six touch nodes 204, the change in capacitance is small in the two touch nodes 202, and there is no change in capacitance in the remaining touch nodes 201.
  • FIG. 3 (c) conceptually illustrates an example of modeling the touch area of FIG. 3 (a) in an elliptical shape.
  • the touch nodes are omitted.
  • the contact surface between the finger and the touch sensing surface 2 has a shape close to an ellipse, but the touch area (the set of touch nodes determined to have a change in capacitance) shown in FIG. 3B may have a shape other than elliptical.
  • the elliptical region shown in FIG. 3C can be determined based on the change values of capacitances of the touch nodes. In this case, the long axis and the short axis of the elliptical region may be defined.
  • FIG. 3D conceptually illustrates an example of modeling the touch area of FIG. 3A as a rectangle. As in FIG. 3C, directions of the long and short axes of the rectangle can be defined.
  • the touch area is modeled as an oval or a rectangle in order to define a long axis and a short axis.
  • the touch area may be modeled in another form.
  • if the touch tool contacting the touch sensing surface 2 is not a finger but has, for example, a shape elongated to one side like a triangle, the touch area may be modeled as a correspondingly elongated triangular shape.
  • the shape of the figure to be modeled may be any shape for which a long axis and/or a short axis can be defined; one way to derive such a model from the node capacitance values is sketched below.
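The text leaves the modeling technique open, noting only that it is within the state of the art. One common approach, shown purely as an illustrative sketch, treats the per-node capacitance changes as weights and derives the equivalent ellipse from image moments:

```python
import math

def model_touch_ellipse(deltas):
    """Fit an equivalent ellipse to a grid of capacitance changes.

    deltas: 2D list where deltas[row][col] is the capacitance change at that
    touch node. Returns (center_x, center_y, major_axis_angle_deg) computed
    from weighted first- and second-order moments, a standard way to obtain
    a blob's center and the orientation of its long axis.
    """
    total = cx = cy = 0.0
    for y, row in enumerate(deltas):
        for x, d in enumerate(row):
            total += d
            cx += d * x
            cy += d * y
    cx, cy = cx / total, cy / total
    mxx = myy = mxy = 0.0  # central second moments
    for y, row in enumerate(deltas):
        for x, d in enumerate(row):
            mxx += d * (x - cx) ** 2
            myy += d * (y - cy) ** 2
            mxy += d * (x - cx) * (y - cy)
    angle = 0.5 * math.atan2(2.0 * mxy, mxx - myy)  # long-axis orientation
    return cx, cy, math.degrees(angle)

# A tilted, elongated contact patch like the one in FIG. 3(b):
grid = [[0, 1, 4, 0],
        [0, 4, 8, 4],
        [4, 8, 4, 0],
        [0, 4, 1, 0]]
print(model_touch_ellipse(grid))
```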
  • FIG. 4 is a diagram for describing a method of defining a touch event and a subsequent processing method according to the touch event when one touch event occurs according to an embodiment of the present invention.
  • FIG. 4 (a) shows the time from the start of one touch event to the end of the touch event.
  • the moment Ts at which the touch tool touches the touch sensing surface 2 can be defined as the occurrence of a touch event. Then, while the contact is continued, it is possible to define the moment Te away from the touch sensing surface 2 by the touch tool as the end point of the touch event. Therefore, it can be defined that one touch event exists from the time Ts to the time Te, and this time period can be defined as the duration 55 of the touch event.
  • the finger starts to contact the touch sensing surface 2 at time Ts, and the finger is separated from the touch sensing surface 2 at time Te.
  • the direction in which the finger points at the first time T1 is inclined counterclockwise from the vertical line 81 by an angle θ1°, and at the second time T2 it is inclined clockwise from the vertical line 81 by an angle θ2°.
  • the finger can therefore be regarded as having been rotated clockwise between the first sensing time T1 and the second sensing time T2.
  • the first region of the touch sensing surface 2 determined to have been touched by the touch event at the first time T1 of the duration 55 of the touch event may be determined.
  • determining the first area may mean calculating a first value regarding the angle (θ1°) of the long axis and/or short axis of the first area, that is, the angle θ1° of the direction the first area faces, as illustrated in FIGS. 3C and 3D.
  • at the second time T2, a second area of the touch sensing surface 2 determined to have been touched by the touch event may be determined (step S120).
  • determining the second area may mean calculating a second value regarding the angle (θ2°) of the long axis and/or short axis of the second area, that is, the angle θ2° of the direction the second area faces, as illustrated in FIGS. 3C and 3D.
  • the second time T2 may be a time later than the first time T1.
  • the first value and the second value may be, for example, the angle θ1° of the direction in which the first region faces and the angle θ2° of the direction in which the second region faces.
  • This step may be a step of determining that a predetermined user input has been performed when the finger in contact with the touch sensing surface 2 is sufficiently rotated clockwise or counterclockwise while maintaining the contact state. Therefore, the predetermined processing step is executed in response to the user input.
  • FIG. 5A illustrates an example of a first image 91 displayed on a user screen according to an embodiment of the present invention.
  • the dotted line portion of the first image 91 represents a virtual boundary of the first image 91.
  • FIG. 5B illustrates a relative arrangement relationship between the stationary application window 95 and the first image 91 provided according to an embodiment of the present invention.
  • the fixed application window 95 may mean a portion that is previously printed on a synthetic resin or glass constituting a case of the screen display unit of the user device to distinguish between inside and outside.
  • the fixed application window 95 may mean a fixed display area defined and displayed in software in the screen display unit of the user device.
  • in FIG. 5B, the first image 91 is disposed upright in the fixed application window 95, and as a result a screen as shown in FIG. 5C may be displayed.
  • in FIG. 5D, the first image 91 is rotated 90° clockwise in the fixed application window 95, and as a result a screen as shown in FIG. 5E may be displayed.
  • in FIG. 5F, the first image 91 is rotated 40° clockwise in the fixed application window 95, and as a result a screen as shown in FIG. 5G may be displayed.
  • the predetermined processing step causes a change in the display state of the display device configured to display an image in the fixed application window 95: the first image 91, displayed in the fixed application window 95 at the first time T1, may be rotated by a predetermined angle (e.g., 40° or 90°) with respect to the fixed application window 95 and redisplayed. Therefore, when the touch event as shown in FIG. 4 occurs, the screen displayed as in FIG. 5C at the first time T1 can change to the display state of FIG. 5E or FIG. 5G after the second time T2.
  • FIG. 6 illustrates a state in which a finger in contact with the touch sensing surface 2 is not sufficiently rotated unlike FIG. 4.
  • the first value and the second value do not differ by more than a predetermined threshold value.
  • the screen displayed as shown in FIG. 5C at the first time T1 can still maintain the display state as shown in FIG. 5C after the second time T2.
  • FIG. 7 is a diagram for describing a method of processing another type of touch event according to a modified embodiment of the present invention.
  • FIG. 7A illustrates the duration 55 of the touch event.
  • FIG. 7B illustrates a case in which the finger moves while maintaining the contact state from the left side to the right side of the touch sensing surface 2.
  • FIG. 7C is a conceptual diagram of ellipsoidal modeling of the first region and the second region.
  • the finger starts to contact the touch sensing surface 2 at the time Ts and is separated from the touch sensing surface 2 at the time Te.
  • at the first time T1, the finger is in contact with the first region on the left side of the touch sensing surface 2, and at the second time T2 it is in contact with the second region on the right side of the touch sensing surface 2.
  • when the touch event of FIG. 7 is compared with the touch event of FIG. 4: in FIG. 4, the distance between the first center position (x1, y1) of the first area at the first time T1 and the second center position (x2, y2) of the second area at the second time T2 is very small, whereas in FIG. 7 this distance is large.
  • the touch events of FIG. 4 and FIG. 7 may or may not be processed separately.
  • the method may further include calculating the first center position (x1, y1), which is the center position of the first region, and the second center position (x2, y2), which is the center position of the second region (step S140).
  • the predetermined processing step may be arranged to be executed if the first value and the second value differ by the predetermined threshold value or more, and the distance between the first center position (x1, y1) and the second center position (x2, y2) is less than or equal to a predetermined threshold distance.
  • in this arrangement, the first image 91 rotates not because of the drag operation of FIG. 7 but only because of the change in axis direction as in FIG. 4.
  • the method may further include calculating a first area of the first area and a second area of the second area (S150).
  • step S130 may be performed only when the first area value and the second area value are larger than a predetermined threshold area.
  • when the touch event is generated accidentally by a touch tool such as a finger, the area values of the first region and the second region may be very small; this condition filters out such mistakes, as in the small check below.
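The filter of step S150 can be as small as the following check (the threshold value is an illustrative assumption):

```python
def is_intentional(first_area_value, second_area_value, threshold_area=2.0):
    """Run the rotation test (step S130) only when both modeled regions are
    large enough to be a deliberate contact rather than a grazing touch."""
    return (first_area_value > threshold_area
            and second_area_value > threshold_area)

print(is_intentional(6.5, 7.0))  # True: proceed to step S130
print(is_intentional(0.8, 7.0))  # False: likely an accidental touch
```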
  • FIG. 8 is a flowchart illustrating a method according to an embodiment of the present invention described with reference to FIG. 4.
  • a user device 100 may be provided that includes the touch input device 10 having the touch sensing surface 2, the processor 122, the memory 110, and a program stored in the memory 110 and configured to be executed by the processor 122.
  • the program may include instructions for executing the above-described steps S110, S120, and S130, and may further include instructions for executing step S140 described above.
  • FIG. 9 is a diagram illustrating a method of defining a touch event and a subsequent processing method according to the touch event when one touch event occurs according to an embodiment of the present invention.
  • the touch event may be defined by touching two touch tools, that is, the first touch tool and the second touch tool, together with the touch sensing surface.
  • the figure shows a case in which, at the second time T2, the inclination is an angle θ2° counterclockwise from the vertical line 81.
  • the fingers 61 and 62 can be regarded as having rotated clockwise with respect to each other between the first time T1 and the second time T2 while maintaining contact with the touch sensing surface 2.
  • a first point of the touch sensing surface 2 determined to have been touched by the first touch tool (e.g., first finger) 61 is determined, and a first value may be calculated for the first angle θ1° that the line 71 connecting the two touch points forms with the vertical line 81 (step S210).
  • the second time T2 may be a time later than the first time T1.
  • the first value and the second value may be, for example, the first angle θ1° and the second angle θ2° themselves.
  • step S230 may determine that a predetermined user input has been performed when the two fingers in contact with the touch sensing surface 2 have rotated sufficiently relative to each other, clockwise or counterclockwise, while maintaining the contact state. The predetermined processing step is then executed in response to the user input.
  • the predetermined processing step of step S230 causes a change in the display state of the display device configured to display an image in the fixed application window 95: the first image 91, displayed in the fixed application window 95 at the first time T1, is rotated by a predetermined angle (e.g., 40° or 90°) with respect to the fixed application window 95 and displayed after the second time T2. Therefore, when the touch event as shown in FIG. 9 occurs, the screen displayed as in FIG. 5C at the first time T1 can change to the display state of FIG. 5E or FIG. 5G after the second time T2.
  • FIG. 10 shows a special example of the embodiment described with reference to FIG. 9.
  • the method of processing the touch event illustrated in FIG. 9 does not vary depending on the distance between the second point (x21, y21) and the fourth point (x41, y41).
  • FIG. 10 illustrates a special case in which the distance between the second point (x21, y21) and the fourth point (x41, y41) is smaller than the predetermined distance threshold.
  • the touch event of FIG. 10 particularly assumes an operation of rotating the finger 61 in a clockwise direction using the finger 62 as the rotation axis among the two fingers.
  • when the point where one finger touches acts as the central axis, as shown in FIG. 10, the predetermined second processing step is performed; if it does not, the predetermined first processing step is performed.
  • the first processing step only needs to be different from the second processing step.
  • this processing may be performed independently of the difference between the relative size of the first area, determined to have been touched by the first touch tool (e.g., first finger) 61 of the touch sensing surface 2, and that of the second area, determined to have been touched by the second touch tool (e.g., second finger) 62.
  • alternatively, the case where the difference between the first area and the second area is larger than the predetermined area threshold and the case where it is not may be distinguished from each other.
  • one of two fingers may be, for example, a thumb and the other is an index finger.
  • the area where the thumb contacts is generally larger than the area where the index finger contacts.
  • one of the two fingers may be the second finger (the index finger) and the other the third finger (the middle finger).
  • the areas contacted by the second finger and the third finger are similar to each other.
  • for the touch event shown in FIG. 9, when one of the areas contacted by the two touch tools is larger than the other by more than the predetermined area threshold, the predetermined second processing step is performed; if the difference is smaller than the predetermined area threshold, the predetermined first processing step is performed.
  • the above embodiment thus provides a technique for identifying whether a thumb is among the two fingers when a rotational operation is performed with two fingers in contact, so that a different subsequent process can be performed depending on whether it is included.
  • FIG. 11 is a flowchart illustrating a method according to an embodiment of the present invention described with reference to FIG. 9.
  • a user device 100 may be provided that includes the touch input device 10 having the touch sensing surface 2, the processor 122, the memory 110, and a program stored in the memory 110 and configured to be executed by the processor 122.
  • the program may include instructions for executing the above-described steps S210, S220, and S230.
  • a computer-readable medium may be provided that includes a program causing the user device 100, which includes the touch input device 10 having the touch sensing surface 2, the processor 122, and the memory 110, to perform the steps described above.
  • An embodiment of the present invention provides an input gesture processing method of a touch input device that interprets one touch input lasting from a first time point to a second time point, the method comprising: recognizing an input object of the touch input; calculating a first slope of the input object at the first time point; and calculating a second slope of the input object at the second time point, wherein the difference between the first slope and the second slope is used as a criterion for determining an input gesture.
  • the “input object” may refer to a figure represented by the surface where contact is made.
  • the regions 11 and 12 in the touch panel 900 may be regarded as different input objects.
  • the input object may be defined in various ways.
  • one input object may be modeled as an ellipse, whereby long and short axes may be defined, and the slope of the long axis may be defined.
  • the input object may be represented by five parameters: the center coordinates (x, y) of the modeled ellipse, the length of the major axis, the length of the minor axis, and the slope of the major axis. A sketch of such a representation follows.
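As a small sketch of this five-parameter representation (the structure and field names are assumptions; the text only lists the parameters):

```python
from dataclasses import dataclass

@dataclass
class InputObject:
    """Five-parameter ellipse model of one contact surface."""
    center_x: float
    center_y: float
    major_axis: float  # length of the long axis
    minor_axis: float  # length of the short axis
    slope_deg: float   # inclination of the long axis against the reference line

# The same input object sampled at the first and second time points:
obj_t1 = InputObject(120.0, 80.0, 14.0, 7.0, 62.0)
obj_t2 = InputObject(121.0, 81.0, 14.0, 7.0, 17.0)
print(obj_t2.slope_deg - obj_t1.slope_deg)  # slope difference decides the gesture
```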
  • the first time point t1 may be defined as any time point belonging to the first time period 21 in one touch input continuously made during the first time period 21.
  • the first time point t1 may or may not be the first time point S of the first time period 21.
  • the second time point t2 may be defined as any other time point belonging to the first time period 21. However, the second time point t2 is later than the first time point t1.
  • the second time point t2 may or may not be the last time point E of the first time period 21.
  • the slope of the one input object will be described with reference to FIG. 4.
  • the first slope refers to the slope of the input object at the first time point T1 and the second slope refers to the slope of the input object at the second time point T2.
  • the input object may be modeled as an ellipse, and the inclination may be a slope of a long axis or a short axis of the ellipse.
  • the x-axis or the y-axis of the touch panel 900 may be used as a reference line.
  • the finger may be rotated while maintaining the contact state.
  • the input object 11 formed by the finger may be presented as reference numeral 31 at the first time point T1, and as shown by reference numeral 32 at the second time point T2.
  • suppose the reference line described above is the x-axis of the touch panel 900, the first slope is θ1, and the second slope is θ2.
  • the difference between the first slope and the second slope is θ1-θ2, and it may be determined that a specific gesture has occurred based on a result of comparing this value with a predetermined threshold value.
  • if the difference satisfies the threshold condition, the touch input is mapped to the first gesture, and if not, to the second gesture.
  • for example, if the difference is 45° or more, it may be recognized as a command to rotate the screen to the right.
  • if the difference is less than 45°, it may be recognized as a command to rotate the screen to the left.
  • alternatively, it may be recognized as a command to rotate the screen continuously, that is, in real time, in proportion to the difference value, as soon as the difference value deviates from the interval of -10° to +10°. A sketch of this mapping follows.
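The sketch below maps the slope difference to the commands named in these examples. The signed reading of the 45° rule (positive difference to the right, negative to the left) and the merging of the snap and continuous variants are assumptions made for illustration:

```python
def map_slope_difference(diff_deg, snap_threshold=45.0, dead_zone=10.0):
    """Turn the first/second slope difference into a screen-rotation command."""
    if diff_deg >= snap_threshold:
        return ("rotate_right", None)
    if diff_deg <= -snap_threshold:
        return ("rotate_left", None)
    if abs(diff_deg) > dead_zone:
        # Outside the -10..+10 deg dead zone: rotate continuously, in real
        # time, in proportion to the difference value.
        return ("rotate_continuous", diff_deg)
    return (None, None)

print(map_slope_difference(52.0))   # ('rotate_right', None)
print(map_slope_difference(-18.0))  # ('rotate_continuous', -18.0)
```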
  • the multi-touch input gesture processing method includes recognizing the difference between a first area of a first input object and a second area of a second input object, and using whether this difference is greater or smaller than a threshold as the criterion for gesture determination.
  • Human fingers can be largely divided into thumb and other fingers. Touching with the thumb may have different input characteristics than touching with the rest of the fingers.
  • the contact area 41 when contacting the thumb may be larger than the contact area 42 when contacting the index finger.
  • the characteristic of the difference between the contact area 41 of the thumb and the contact area 42 of the index finger differs from the characteristic of the difference between the contact area 42 of the index finger and the contact area 43 of the middle finger, the case where the multi-touch is performed simultaneously with the index and middle fingers.
  • the difference value of the contact area is larger than the predetermined threshold value, it may be determined that the multi-touch is made with two or more fingers including the thumb. On the contrary, if it is smaller than the threshold value, it may be determined that the multi-touch is performed by two other fingers not including the thumb. Using these results, it is possible to determine whether a thumb is used. As a result, different gestures can be recognized depending on whether the thumb is used.
  • the multi-touch input gesture processing method may instead recognize the difference between a first slope of a first input object and a second slope of a second input object, and use whether this difference is greater or smaller than a threshold as the basis for gesture determination. For example, referring to FIG. 15, when the thumb and the index finger simultaneously multi-touch (300), the relative displacement (θ3-θ4) of the long axes of the input objects made by the two fingers may differ from the relative displacement (θ5-θ6) of the long axes when the index and middle fingers simultaneously multi-touch (400).
  • when the relative displacement is greater than the predetermined threshold value, it may be determined that the multi-touch was made with two or more fingers including the thumb; conversely, if it is smaller than the threshold value, it may be determined that the multi-touch was performed by two fingers not including the thumb. For example, if the relative displacement is greater than 20°, it may be determined that fingers including the thumb were simultaneously multi-touched, and if it is smaller than 20°, that two fingers not including the thumb were. Using these results, it is possible to determine whether a thumb was used, and, as a result, different gestures can be recognized depending on whether the thumb was used. The idea behind this method is that the fingers other than the thumb lie almost parallel to one another, while the thumb extends in a different direction from the other four fingers.
  • the multi-touch input gesture processing method may also recognize the difference between a first aspect ratio of a first input object and a second aspect ratio of a second input object, and use whether this difference is greater or smaller than a threshold as the basis for gesture determination. For example, referring to FIG. 16, when the thumb and the index finger simultaneously multi-touch (300), the characteristic of the difference between the aspect ratios of the input objects made by the two fingers may differ from that when the index finger and the middle finger simultaneously multi-touch (400).
  • if the difference between the aspect ratios is greater than the predetermined threshold value, it may be determined that two or more fingers including the thumb were simultaneously multi-touched; conversely, if it is smaller than the threshold value, it can be determined that two fingers not including the thumb were simultaneously multi-touched. A sketch combining these three cues follows.
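A sketch combining the three cues (contact area, long-axis slope, aspect ratio) is shown below. The 20° slope threshold follows the example in the text; the other thresholds, and the idea of OR-ing the cues rather than using each alone, are assumptions:

```python
import math

def includes_thumb(ellipse_a, ellipse_b, area_threshold=30.0,
                   slope_threshold=20.0, aspect_threshold=0.5):
    """Each ellipse is a (major_axis, minor_axis, slope_deg) tuple for one
    of the two simultaneous contacts. Returns True if the cues suggest that
    one of the two fingers is the thumb."""
    def area(e):
        return math.pi * (e[0] / 2.0) * (e[1] / 2.0)
    def aspect(e):
        return e[0] / e[1]
    if abs(area(ellipse_a) - area(ellipse_b)) > area_threshold:
        return True  # area cue (FIG. 14): the thumb contact is much bigger
    if abs(ellipse_a[2] - ellipse_b[2]) > slope_threshold:
        return True  # slope cue (FIG. 15): the thumb points another way
    return abs(aspect(ellipse_a) - aspect(ellipse_b)) > aspect_threshold  # FIG. 16

# Thumb + index: larger, differently inclined contact vs. a slim one.
print(includes_thumb((14, 9, 35.0), (9, 5, 80.0)))  # True
# Index + middle: similar sizes, nearly parallel inclinations.
print(includes_thumb((9, 5, 78.0), (9, 5, 82.0)))   # False
```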
  • this is how the aspect ratio of one input object is defined. The definition applies not only to touching with the fingertip but also to a touch using two or three segments of the finger: for example, when a touch input contacts the entire lower (palm-side) surface of two or three segments of the index finger, the input object may be interpreted as having a long bar shape. Such a long bar shape may be modeled by approximating an ellipse with a very high aspect ratio.
  • the touch input according to the embodiments of the present invention described above may be applied to a very small wrist watch.
  • FIGS. 21(a) and 21(b) show the shape of a watch: reference numeral 1001 denotes a wrist strap, reference numeral 1002 denotes the bezel of the wristwatch, and reference numeral 1003 denotes the screen display unit of the wristwatch.
  • the touch input panel of the wristwatch may be overlaid on the screen display unit 1003.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method is disclosed for processing a touch event as an intentional user input gesture made by rotating a finger that is in contact with a touch sensing surface.
PCT/KR2014/003116 2013-04-10 2014-04-10 Touch event processing method and apparatus therefor WO2014168431A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201480020923.1A CN105308540A (zh) 2013-04-10 2014-04-10 Touch event processing method and apparatus therefor

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR10-2013-0039553 2013-04-10
KR20130039553 2013-04-10
KR10-2014-0042616 2014-04-09
KR1020140042616A KR101661606B1 (ko) 2013-04-10 2014-04-09 Touch event processing method in which touch points rotate relative to each other
KR10-2014-0042615 2014-04-09
KR20140042615A KR20140122682A (ko) 2013-04-10 2014-04-09 Touch event processing method in which the contact surface rotates, and apparatus therefor

Publications (1)

Publication Number Publication Date
WO2014168431A1 (fr)

Family

ID=51689771

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/003116 WO2014168431A1 (fr) Touch event processing method and apparatus therefor

Country Status (1)

Country Link
WO (1) WO2014168431A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100214235A1 (en) * 2007-01-03 2010-08-26 Motorola , Inc. Electronic Device and Method of Touch Screen Input Detection
US20110074544A1 (en) * 2009-09-29 2011-03-31 Tyco Electronics Corporation Method and apparatus for detecting simultaneous touch events on a bending-wave touchscreen
US20110102464A1 (en) * 2009-11-03 2011-05-05 Sri Venkatesh Godavari Methods for implementing multi-touch gestures on a single-touch touch surface
EP2378403A1 (fr) * 2010-04-19 2011-10-19 Tyco Electronics Services GmbH Procédé et dispositif pour déterminer un geste tactile d'un utilisateur
WO2012064128A2 (fr) * 2010-11-10 2012-05-18 Chae Sang-Woo Appareil à écran tactile et procédé permettant de commander cet appareil

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105549852A (zh) * 2016-02-03 2016-05-04 广东欧珀移动通信有限公司 Picture rotation method and apparatus
CN112181263A (zh) * 2019-07-02 2021-01-05 北京奇虎科技有限公司 Drawing operation response method and apparatus for a touch screen, and computing device
CN112181263B (zh) * 2019-07-02 2024-04-09 三六零科技集团有限公司 Drawing operation response method and apparatus for a touch screen, and computing device

Similar Documents

Publication Publication Date Title
WO2015023136A1 Method and apparatus for recognizing a grip state in an electronic device
WO2015084111A1 User input processing device using a limited number of magnetic field sensors
WO2017119745A1 Electronic device and control method therefor
WO2012064128A2 Touch screen apparatus and method for controlling the same
WO2013089392A1 Foldable display device and display method thereof
WO2014073926A1 Remote control device, display device, and control method therefor
WO2011046270A1 Multi-touch input control system
WO2016093526A1 Touch screen panel
TWI502474B Operation method of user interface and electronic device
WO2020153810A1 Device control method and electronic device
US9235287B2 Touch panel apparatus and touch panel detection method
WO2016129923A1 Display device, display method, and computer-readable recording medium
WO2014029170A1 Touch control method for a capacitive and electromagnetic dual-mode touch screen and handheld electronic device
WO2019160348A1 Electronic device for acquiring user input in a submerged state using a pressure sensor, and method for controlling the electronic device
WO2017173841A1 Touch-screen-based method for controlling automatic scrolling of an electronic book, and mobile terminal
WO2019151642A1 Apparatus and method for displaying guide information for changing fingerprint state
WO2014168431A1 Touch event processing method and apparatus therefor
TWI575429B Operation mode switching method for a capacitive touch panel module
WO2017191895A1 Display method for a touch input device
WO2019199086A1 Electronic device and control method for the electronic device
KR20140122687A (ko) Touch event processing method and apparatus therefor
WO2017131251A1 Display device and method for processing touch input thereof
WO2013115440A1 Apparatus and method for inputting letters by contact movement on a multi-touch screen
WO2015194705A1 Mobile terminal and control method therefor
WO2020171610A1 Electronic device for identifying the coordinates of an external object touching a touch sensor

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480020923.1

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14782432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14782432

Country of ref document: EP

Kind code of ref document: A1