CN108733302B - Gesture triggering method - Google Patents
- Publication number: CN108733302B
- Application number: CN201810497373.2A
- Authority
- CN
- China
- Prior art keywords
- gesture
- periphery
- processor
- touch
- control instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention provides a gesture triggering method for a window system that includes an operation area. The method comprises the following steps: detecting that a first gesture contacts a periphery of the operation area; detecting whether a second gesture contacts the periphery within a predetermined time after the first gesture leaves the operation area from the periphery; and, when no second gesture is detected within the predetermined time, generating a first control instruction corresponding to the first gesture.
Description
This application is a divisional application of Chinese invention patent application No. 201410118128.8, filed on March 27, 2014, and entitled "Method for preventing false triggering of an edge sliding gesture".
Technical Field
The present invention relates to a gesture triggering method, and more particularly, to a method for preventing false triggering of an edge sliding gesture.
Background
In a known touch system, a touch pad, for example, generally has a touch surface and a processing unit. When a user moves a finger on the touch surface, the processing unit calculates the position of the finger relative to the touch surface and generates a displacement signal. The processing unit then outputs the displacement signal to a host to correspondingly control the displacement of the host's cursor.
With the popularization of touch systems, the displacement signal generated by the movement of an object relative to the touch surface can be used not only to control cursor displacement but also to implement touch gestures. That is, the user can perform different functions through different touch gestures, such as printing screen data, scrolling a window, zooming the screen, calling a menu, or executing other applications, thereby improving the user experience.
The edge sliding gesture is a common touch gesture, and a user can move a finger from the edge of the touch surface to the center of the touch surface to trigger the edge sliding gesture. For example, in the eighth version of the Microsoft Windows operating system (Microsoft Windows 8), the user may call out the application menu through the edge swipe gesture; in Google Android operating system (Google Android), the user may call the drop-down menu through the edge swipe gesture.
Fig. 1 shows a schematic diagram of triggering an edge swipe gesture, in which a user moves a finger 8 in a touch area 9 of a touch system to generate a displacement signal. When the finger 8 enters the touch area 9 from outside the touch area 9, as shown by the trace 8a in Fig. 1, the touch system triggers the edge sliding gesture. However, under certain operations it is preferable to avoid false triggering of the edge swipe gesture.
Disclosure of Invention
In view of the above, the present invention provides a method for preventing an edge sliding gesture from being triggered by mistake and a gesture triggering method.
The present invention provides a method for preventing false triggering of an edge sliding gesture, which can determine whether to prevent triggering of the edge sliding gesture according to a time difference and/or a position difference between an object leaving from a periphery and entering a touch surface.
Another objective of the present invention is to provide a method for preventing false triggering of an edge sliding gesture and a gesture triggering method, which can provide better user experience.
To achieve the above objective, the present invention provides a gesture triggering method, which is suitable for a window system including an operation area. The method comprises the following steps: detecting that a first gesture contacts a periphery of the operation area; detecting whether a second gesture contacts the periphery within a predetermined time after the first gesture leaves the operation area from the periphery; and generating a first control instruction corresponding to the first gesture when the second gesture is not detected within the predetermined time; wherein the first gesture is earlier than the second gesture.
In one embodiment, the processor determines whether to trigger the edge swipe gesture according to a time difference between the first time and the second time.
In one embodiment, the processor determines whether to trigger the edge sliding gesture according to a time difference between the first time and the second time and a distance between the first position and the second position.
In one embodiment, the processor determines whether to trigger the edge sliding gesture according to the count stop signal when determining that the object enters the touch surface from the periphery.
In one embodiment, the processor determines whether to trigger the edge sliding gesture according to the counting stop signal and the distance between the positions where the object leaves from the periphery and enters the touch surface.
In one embodiment, according to whether a second gesture contacts the periphery within a predetermined time after the first gesture leaves the operation area from the periphery, the processor determines whether to generate the first control instruction corresponding to the first gesture or to combine the first gesture and the second gesture to generate a combined control instruction.
In one embodiment, according to whether another gesture contacts the periphery within a predetermined time before the edge sliding gesture contacts the periphery, the processor determines whether to generate a peripheral slide control command or to perform gesture combination and stop generating the peripheral slide control command.
The gesture triggering method of the embodiment of the invention can determine whether to trigger the edge sliding gesture by recording the time difference of the object leaving from the periphery and entering the touch surface. In addition, the touch system can record the position difference of the object which leaves from the periphery and enters the touch surface to determine whether to trigger the edge sliding gesture, so that the accuracy of preventing the edge sliding gesture from being triggered by mistake is improved.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 shows a schematic diagram of a trigger edge swipe gesture;
FIG. 2A is a schematic diagram of a touch system for preventing false triggering of an edge sliding gesture according to a first embodiment of the present invention;
FIG. 2B is a flowchart illustrating a method for preventing false triggering of an edge swipe gesture according to a first embodiment of the present invention;
FIG. 3 shows the trigger conditions for an edge swipe gesture in the second implementation of the first embodiment of the present invention;
FIG. 4A is a schematic diagram illustrating an object operating on a circular touch surface;
FIG. 4B is a schematic diagram illustrating an object operating on a rectangular touch surface;
FIG. 5 is a block diagram of a touch system for preventing false triggering of an edge sliding gesture according to a second embodiment of the present invention;
FIG. 6 is a flowchart illustrating a method for preventing false triggering of an edge swipe gesture according to a second embodiment of the present invention;
FIG. 7 is a flowchart illustrating a gesture triggering method according to a third embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method for triggering an edge swipe gesture according to a fourth embodiment of the present invention.
Description of the reference numerals
1 touch system
10 touch control surface
101-104 edge
10a periphery
12 sensor
14 processor
16 counter
2, 8 finger
8a trace
9 touch area
d1-d4 distance
F detection frame
P1, P2 position
S11-S44 step
S_start count start signal
S_stop count stop signal.
Detailed Description
In order that the above recited and other objects, features and advantages of the present invention may be more readily understood, a more particular description of the invention is rendered below with reference to the appended drawings. In the description of the present invention, the same components are denoted by the same reference numerals.
Fig. 2A is a schematic diagram of a touch system 1 for preventing a false triggering of an edge sliding gesture according to a first embodiment of the invention. The touch system 1 includes a touch surface 10, a sensor 12, and a processor 14. The sensor 12 is electrically connected to the processor 14. A user can touch or approach the touch surface 10 with an object 2 (here, illustrated as a finger), and the processor 14 can calculate the position or position change of the object 2 relative to the touch surface 10 according to the detection frames F generated by the sensor 12 continuously detecting the object 2. A cursor (not shown) on the display device can move correspondingly according to the position or the position change.
The touch system 1 of the present embodiment may be a capacitive touch screen, and therefore, the touch system 1 may be directly disposed on the display device, but the invention is not limited thereto. In other embodiments, the touch system 1 may be a touch pad, a navigation device, a mobile phone or a computer system. In addition, a device capable of detecting the contact of the user's finger with the screen or directly calculating the coordinates of the cursor can be applied to the present invention, such as finger navigation (finger navigation), mouse or optical touch panel, but not limited to the capacitive touch screen. It should be noted that, if the touch system 1 does not have a display function and corresponds to a display device, such as a touch pad, the touch system 1 preferably has the same shape as the display device, but is not limited thereto.
Referring to fig. 2A, the touch surface 10 is used for the object 2 to operate thereon. Since the touch system 1 of the present embodiment is described by taking a capacitive touch screen as an example, the touch surface 10 preferably corresponds to a display device so that a user can view the position of the object 2 corresponding to the cursor through the display device in real time. The touch surface 10 may be the surface of a suitable object.
The sensor 12 is used for continuously outputting the detection frames F related to the touch surface 10. It can be understood that, since the touch surface 10 has a periphery 10a, the boundary of the detection frame F may correspond to the periphery 10 a. In this embodiment, the sensor 12 is disposed under the touch surface 10, as shown in fig. 2A, but is not limited thereto. The relative positions of the sensor 12 and the touch surface 10 may depend on the application.
It should be noted that the sensor 12 may be a capacitive touch sensor, wherein the capacitive touch sensor has a plurality of detection units. When the object 2 contacts the touch surface 10, the capacitance variation amount may be generated correspondingly to the detection units under the object 2 and around the object 2, and then the sensor 12 may output a detection frame F, but the present invention is not limited thereto. In other embodiments, the sensor 12 may be a resistive or optical touch sensor.
The sensing principle and structure of the capacitive, resistive and optical touch sensors are known, and therefore are not described herein, and the present invention processes the detection frame output by the sensor 12 and determines whether to trigger the edge sliding gesture. In addition, the material of the object 2 is not particularly limited, and depends on the type of the sensor 12. For example, when the sensor 12 is a capacitive touch sensor, the object 2 is preferably a finger or a capacitive stylus. When the sensor 12 is an optical touch sensor, the object 2 preferably has a light-shielding property.
The processor 14, for example, can be a Digital Signal Processor (DSP) or other processing device capable of processing the detection frame F, and records a first information of the object 2 leaving the touch surface 10 from the periphery 10a and a second information of the object 2 entering the touch surface 10 from the periphery 10a according to the detection frame F, so as to determine whether to trigger an edge sliding gesture. In this embodiment, the processor 14 is implemented in hardware. In other embodiments, the processor 14 may be integrated into software, such as an operating system or a predetermined program, or implemented in firmware.
Fig. 2B is a flowchart of a method for preventing false triggering of an edge sliding gesture according to the first embodiment of the present invention. The method is applied to a touch system including a touch surface and comprises the following steps: recording first information when detecting that a first gesture ends at the periphery of the touch surface (step S11); recording second information when detecting that a second gesture begins at the periphery of the touch surface (step S12); and determining, with a processor, whether to prevent triggering the edge sliding gesture according to the first information and the second information (step S13); wherein the first gesture is earlier in time than the second gesture.
It should be noted that the processor 14 calculates the first gesture and the second gesture based on the detection frames F of the object 2 continuously output by the sensor 12. That is, the first gesture and the second gesture represent the position change (i.e., the track) of the object 2 moving on the touch surface 10. For example, in this embodiment, the first gesture ending at the periphery 10a of the touch surface 10 represents that the object 2 leaves the touch surface 10 from the periphery 10a, and the second gesture starting from the periphery 10a represents that the object 2 enters the touch surface 10 from the periphery 10a, but the invention is not limited thereto. The method by which the processor 14 determines, according to the detection frame F, whether the object 2 leaves from the periphery 10a or enters the touch surface 10 is known, and thus is not repeated herein.
Referring to fig. 2A and 2B, the following describes an embodiment of the present invention.
Step S11: first, after the touch system 1 starts to operate (i.e., is initialized), the user can move the object 2 on the touch surface 10, and the sensor 12 continuously outputs the related detection frames F to the processor 14. When the processor 14 detects that the first gesture ends at the periphery 10a of the touch surface 10 according to the detection frame F, first information is recorded.
Step S12: then, when the processor 14 detects that a second gesture starts from the periphery 10a of the touch surface 10 according to the detection frame F, second information is recorded.
It is understood that the touch system 1 or the processor 14 may further include a storage unit (not shown) for recording the first information and the second information, and the processor 14 may directly access the storage unit at any time. In this embodiment, the storage unit records only one set of first information and one set of second information. For example, when the object is detected leaving the touch surface 10 from the periphery 10a, the processor 14 records the first information and overwrites the first information recorded the previous time the object left the touch surface 10 from the periphery 10a; similarly, when the object is detected entering the touch surface 10 from the periphery 10a, the processor 14 records the second information and overwrites the second information recorded at the previous entry. That is, the processor 14 keeps the latest first information and second information in the storage unit, but the invention is not limited thereto, and this may be determined according to the characteristics and capacity of the storage unit. In another embodiment, the processor 14 records only the first information in the storage unit and, when the second information is detected, calculates with it directly without recording it.
Step S13: finally, the processor 14 may decide whether to prevent triggering of an edge swipe gesture based on the first information and the second information.
In a first implementation, the first information includes a first time and the second information includes a second time. When the time difference between the first time and the second time is less than a time threshold, the processor 14 does not trigger the edge swipe gesture; when the time difference exceeds the time threshold, the processor 14 triggers the edge swipe gesture. Thereby, the processor 14 decides whether to trigger the edge slide gesture according to the time difference. For example, a time threshold of 500 milliseconds may be pre-stored before the touch system 1 leaves the factory, and the processor 14 then determines whether to prevent triggering the edge sliding gesture according to the comparison of the time difference against the time threshold.
It should be noted that the time threshold may be determined according to the size of the touch surface 10, the application of the touch system 1, or the requirement of a preset program executed by the touch system 1, and is not limited to a fixed value.
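The time-difference check described above can be sketched as a small state holder. This is a minimal illustration only: the class name, method names, and the 500 ms default are assumptions, not part of the patent's disclosure.

```python
class EdgeGestureFilter:
    def __init__(self, time_threshold=0.5):
        self.time_threshold = time_threshold  # seconds; example value
        self.first_time = None  # first information: time of leaving

    def on_leave_periphery(self, now):
        # Record when the first gesture ends at the periphery (first information).
        self.first_time = now

    def on_enter_periphery(self, now):
        # On re-entry (second information), trigger the edge swipe only
        # when the time difference exceeds the threshold.
        if self.first_time is None:
            return True  # no recent leave event: trigger normally
        return (now - self.first_time) > self.time_threshold
```

For example, leaving at t = 0 s and re-entering at t = 0.2 s suppresses the gesture, while re-entering at t = 0.8 s triggers it.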
In a second implementation, the first information and the second information respectively include the first time and the second time as before; the first information may further include a first position and the second information a second position, and the processor 14 may calculate a distance between the first position and the second position. When the time difference is less than the time threshold and the distance is less than a distance threshold, the processor 14 does not trigger the edge swipe gesture; the processor 14 triggers the edge swipe gesture when the time difference exceeds the time threshold or the distance exceeds the distance threshold.
It can be appreciated that, since this variant must consider both the time difference and the distance to determine whether to prevent the edge sliding gesture from being triggered, its determination condition is stricter than that of the first implementation, as shown in Fig. 3.
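The combined time-and-distance condition above reduces to a single boolean expression. The function name and the threshold values below are illustrative assumptions.

```python
def should_trigger_edge_swipe(time_diff, distance,
                              time_threshold=0.5, distance_threshold=50.0):
    # Suppress the gesture only when BOTH values stay below their
    # thresholds; if either the time difference or the distance exceeds
    # its threshold, the edge swipe is triggered as usual.
    return time_diff > time_threshold or distance > distance_threshold
```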
It should be noted that the manner of calculating the distance between the first position and the second position may be preset in the processor 14 before the touch system 1 is shipped. For example, referring to Fig. 4A, the finger 2 leaves the touch surface 10 from the periphery 10a at a position P1 along the dotted line, and later enters the touch surface 10 from the periphery 10a at a position P2. The processor 14 can calculate, from the detection frame F, the pixel distance d1 or d2 between the positions P1 and P2, wherein the pixel distance d1 represents the distance between the positions P1 and P2 along the periphery 10a, and d2 represents the straight-line distance between them.
Furthermore, if the touch surface 10 is not circular, for example, referring to Fig. 4B, the touch surface 10 is rectangular and the periphery 10a has at least two edges; four edges 101, 102, 103 and 104 are shown here. When the object 2 leaves and enters the touch surface 10 from the same edge (e.g., the edge 103), the processor 14 may calculate the distance d3; when the object 2 leaves and enters the touch surface 10 from two adjacent edges (e.g., the edges 101 and 104), the processor 14 may calculate the distance d4. Thus, when the periphery 10a includes at least two edges, the distance may be a pixel distance relative to the same edge or to two adjacent edges.
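The two distance measures discussed here, along the periphery versus straight-line, can be sketched as follows for a rectangular touch surface. The top-left origin, the pixel coordinate convention, and all names are assumptions for illustration.

```python
import math

def straight_line_distance(p1, p2):
    """Straight-line (chord) distance between two peripheral points,
    analogous to d2 in Fig. 4A."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def perimeter_distance(p1, p2, width, height):
    """Pixel distance along the border of a width x height rectangle
    between two points lying on that border (the shorter way around),
    analogous to d1/d3/d4 in Figs. 4A and 4B."""
    def arc_length(p):
        # Distance from the top-left corner, walking clockwise.
        x, y = p
        if y == 0:                 # top edge
            return x
        if x == width:             # right edge
            return width + y
        if y == height:            # bottom edge
            return width + height + (width - x)
        return 2 * width + height + (height - y)  # left edge (x == 0)

    perimeter = 2 * (width + height)
    d = abs(arc_length(p1) - arc_length(p2))
    return min(d, perimeter - d)
```

For instance, on a 100 x 50 surface, two points 20 pixels apart on the same edge (as with d3) and two points straddling a corner on adjacent edges (as with d4) both yield a perimeter distance of 20.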
Fig. 5 is a block diagram of a touch system 1 for preventing false triggering of an edge sliding gesture according to a second embodiment of the invention; its schematic diagram can likewise be represented by Fig. 2A. The touch system 1 includes a touch surface 10, a sensor 12, a processor 14, and a counter 16. The sensor 12 and the counter 16 are each electrically connected to the processor 14. Similarly, the user can touch the touch surface 10 with an object 2, and the processor 14 can calculate the position or position change of the object 2 relative to the touch surface 10 according to the detection frames F generated by the sensor 12 continuously detecting the object 2.
Similar to the touch system 1 of the first embodiment, the touch surface 10 has a periphery 10a, and the sensor 12 is used for continuously outputting the detection frames F related to the touch surface 10, so that the description thereof is omitted.
The processor 14 is configured to determine, according to the detection frame F, whether the object 2 leaves from or enters the touch surface 10 through the periphery 10a, and to send a count start signal S_start to the counter 16 when determining that the object 2 leaves the touch surface 10 from the periphery 10a.
The counter 16 starts counting upon receiving the count start signal S_start, and sends a count stop signal S_stop to the processor 14 when the count reaches a preset count. When the counter 16 sends the count stop signal S_stop, or when the processor 14 determines that the object 2 enters the touch surface 10 from the periphery 10a, the processor 14 resets the counter 16 to zero. Therefore, when the processor 14 determines that the object 2 enters the touch surface 10 from the periphery 10a, it can determine from the count stop signal S_stop whether to trigger the edge sliding gesture.
Referring to fig. 2A, 5 and 6, the following describes an embodiment of the present invention; FIG. 6 is a flowchart illustrating a method for preventing false triggering of an edge sliding gesture according to a second embodiment of the present invention.
Step S21: First, after the touch system 1 starts to operate (i.e., is initialized), the user can move the object 2 on the touch surface 10, and the sensor 12 continuously outputs the related detection frames F to the processor 14. When the processor 14 determines that the object 2 leaves the touch surface 10 from the periphery 10a, it sends a count start signal S_start to the counter 16.
Step S22: The counter 16 starts counting after receiving the count start signal S_start.
In the first implementation, before the counter 16 stops counting (i.e., while the count has not exceeded the preset count), the processor 14 has not yet received the count stop signal S_stop from the counter 16. Therefore, when the processor 14 determines that the object 2 enters the touch surface 10 from the periphery 10a without having received the count stop signal S_stop, the edge sliding gesture is not triggered, as shown in steps S23, S25 and S29. At the same time, the processor 14 resets the counter 16 to zero.
In the second implementation, after the counter 16 stops counting (i.e., the count has exceeded the preset count), the processor 14 has already received the count stop signal S_stop from the counter 16. Therefore, when the processor 14 determines that the object 2 enters the touch surface 10 from the periphery 10a and has received the count stop signal S_stop, the edge sliding gesture is triggered, as shown in steps S23 and S24. At the same time, the processor 14 resets the counter 16 to zero.
In other words, when the processor 14 detects that an object has left the touch surface 10 from the periphery 10a and the count stop signal S_stop has not yet been received, triggering of the edge sliding gesture continues to be prevented; when the processor 14 detects that an object has left the touch surface 10 from the periphery 10a and has received the count stop signal S_stop, the function of preventing the triggering of the edge sliding gesture ends.
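The counter-based flow above can be sketched as follows. The preset count, the class and method names, and the tick granularity are illustrative assumptions; returning True on re-entry means the edge swipe may be triggered.

```python
class EdgeSwipeCounter:
    """Counter-based sketch: a counter starts when the object leaves
    from the periphery; the edge swipe is triggered on re-entry only if
    the counter already reached its preset count (S_stop was issued)."""

    def __init__(self, preset_count=500):
        self.preset_count = preset_count
        self.count = None          # None: counter not running

    def on_leave_periphery(self):   # processor issues S_start
        self.count = 0

    def tick(self, n=1):            # counter advances each time unit
        if self.count is not None:
            self.count = min(self.count + n, self.preset_count)

    def stop_signal_received(self): # S_stop issued once preset reached
        return self.count is not None and self.count >= self.preset_count

    def on_enter_periphery(self):
        # Trigger normally if no leave was pending or S_stop was issued;
        # in either case the counter is reset to zero (cleared here).
        triggered = self.count is None or self.stop_signal_received()
        self.count = None
        return triggered
```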
As in the first embodiment, the processor 14 may also record the positions where the object 2 leaves and enters the touch surface 10 from the periphery 10a. Therefore, the processor 14 can also determine whether to trigger the edge sliding gesture according to both the count stop signal S_stop and the distance between the positions (i.e., the position difference), as in steps S26-S29.
For example, in the first implementation manner of the second embodiment, when the processor 14 determines that the object 2 enters the touch surface 10 from the periphery 10a, the processor 14 has not received the count stop signal S_stop sent by the counter 16. In this case, if the distance is less than the distance threshold, the processor 14 does not trigger the edge sliding gesture, as shown in steps S27 and S29. However, if the processor 14 has not received the count stop signal S_stop but the distance exceeds the distance threshold, the processor 14 still triggers the edge sliding gesture, as shown in steps S27 and S28.
In addition, when the processor 14 detects that an object has left the touch surface 10 from the periphery 10a and has already received the count stop signal S_stop, the distance is not calculated again.
As mentioned above, the touch system 1 or the processor 14 further includes a storage unit for recording the position, and the processor 14 can directly access the storage unit at any time to calculate the distance.
Similarly, the calculation of the distance is described in the first embodiment of the present invention. Furthermore, when the periphery 10a includes at least two edges, the distance may be a pixel distance with respect to the same edge or two adjacent edges.
In this embodiment, the processor 14 sends the count start signal S_start to the counter 16 when determining that the object 2 leaves the touch surface 10 from the periphery 10a. In other embodiments, the processor 14 may send the count start signal S_start only when the touch system 1 is executing a predetermined program.
For example, the touch system 1 may be a mobile electronic device such as a tablet computer, a smart phone, or a handheld game console. When a user browses a web page or edits a document on the mobile electronic device, the user's finger or stylus is unlikely to perform drastic operations on the touch surface that would falsely trigger the edge sliding gesture. In this case, even if the processor 14 detects that the finger or stylus leaves the touch surface from the periphery, it does not send the count start signal S_start to the counter 16. However, if the mobile electronic device is executing a game program, the processor 14 sends the count start signal S_start to the counter 16 when it detects that the finger or stylus leaves the touch surface from the periphery, so that the mobile electronic device can prevent the user's drastic operations from falsely triggering the edge sliding gesture. In other words, whether to execute the method for preventing false triggering of the edge sliding gesture can be determined according to the currently executing program.
In this embodiment, the touch surface 10 of the touch system 1 is provided for the finger operation of the user, so the touch surface 10 can be defined as a touch operation area. In one embodiment, the window system may also be adapted to the present invention, for example, but not limited to, the user may operate the cursor operation area of the window system through a mouse or other navigation device.
Please refer to fig. 2A, 4A and 7 at the same time; fig. 7 is a flowchart illustrating a method for triggering a gesture according to a third embodiment of the present invention. The present embodiment can be used to confirm that the gesture leaving the periphery of the touch operation area is not accidentally caused, and therefore, the confirmation is performed by waiting for a predetermined time.
Step S31: First, when the finger 2 operates on the touch surface 10 and moves outward, the sensor 12 can detect that a first gesture contacts the periphery 10a of the touch surface 10 (i.e., the operation area), for example when the finger 2 moves from the touch surface 10 to the position P1 in Fig. 4A. It can be understood that, in this embodiment, the trajectory of the finger 2 before the position P1 of the periphery 10a may be defined as the first gesture, while the trajectory of the finger 2 after the position P2 of the periphery 10a may be defined as the second gesture.
Step S32: After the first gesture leaves the operation area from the periphery 10a, the sensor 12 detects whether a second gesture contacts the periphery 10a within a predetermined time, wherein the processor 14 may determine whether the second gesture occurs within the predetermined time by a counter (e.g., the counter 16 of the second embodiment of the present invention) or another timing method. In addition, the predetermined time may be a preset value (e.g., 500 ms) or may be adjusted by the user.
Step S33: Then, when the second gesture is not detected within the predetermined time, a first control instruction corresponding to the first gesture is generated, wherein the first control instruction can be used to execute an action or a movement corresponding to the first gesture. For example, the action corresponding to the first gesture may adjust the brightness or volume, or turn the page of a picture; the movement corresponding to the first gesture may output the trajectory of the first gesture. In addition, the finger 2 may not have completed the first gesture when leaving the operation area from the periphery 10a in step S31; in that case, in step S33, even when the second gesture is not detected within the predetermined time, the first control instruction corresponding to the first gesture is of course not generated.
That is, the first gesture of this embodiment does not immediately generate the corresponding first control instruction when it contacts the periphery 10a and ends. The processor 14 must wait and confirm that no related second gesture enters the operation area within the predetermined time before generating the first control instruction. Furthermore, a second gesture detected only after the processor 14 generates the first control instruction (i.e., after the predetermined time) may be used, for example, to trigger a peripheral slide command.
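The wait-and-confirm behavior of steps S31 through S34 can be sketched as a small state machine. The 500 ms default and all names are illustrative assumptions, not the patent's own implementation.

```python
class GestureTrigger:
    """Defer the first control instruction until the predetermined time
    elapses with no second gesture; a second gesture inside the window
    combines with the first, outside it stands alone."""

    PREDETERMINED_TIME = 0.5  # seconds; example default, user-adjustable

    def __init__(self):
        self.pending_since = None  # time the first gesture left the periphery

    def first_gesture_left(self, now):
        # Step S31: record the leave time instead of firing at once.
        self.pending_since = now

    def poll(self, now):
        # Called periodically; returns 'first' once the wait has elapsed
        # with no second gesture (step S33), otherwise None.
        if self.pending_since is not None and \
                now - self.pending_since > self.PREDETERMINED_TIME:
            self.pending_since = None
            return 'first'
        return None

    def second_gesture_entered(self, now):
        # Step S34: a second gesture within the window combines with the
        # first; after the window it is treated as a separate gesture.
        if self.pending_since is not None and \
                now - self.pending_since <= self.PREDETERMINED_TIME:
            self.pending_since = None
            return 'combined'
        return 'second'
```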
Step S34: However, if the second gesture is detected within the predetermined time, the first gesture and the second gesture may be combined to generate a combined control instruction. The combined control instruction may be used to perform an action or movement according to the first gesture combined with the second gesture. It will be appreciated that the second gesture is not used to trigger a peripheral slide instruction when the gesture combination is performed.
For example, assuming the operating region is a browser window, the first control instruction may trigger a "previous page" action and the combined control instruction may trigger a "refresh" action, but the invention is not limited thereto and the mapping may depend on the application.
As described in the first embodiment of the present invention, the processor 14 may also record a first position (e.g., the position P1) where the first gesture leaves the periphery 10a and a second position (e.g., the position P2) where the second gesture enters the periphery 10a, and determine the triggered instruction accordingly. The processor 14 may also decide, based on whether a distance between the first position and the second position (e.g., the distance d1 or d2) exceeds a distance threshold, whether to generate the first control instruction corresponding to the first gesture, the combined control instruction, or a second control instruction corresponding to the second gesture. For example, the processor 14 may generate the first control instruction and the second control instruction sequentially when the distance exceeds the distance threshold; and when the distance is less than the distance threshold, the processor 14 may combine the first gesture and the second gesture to generate the combined control instruction.
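The distance test just described can be sketched as follows. The threshold value, the coordinate representation, and the instruction names are assumptions for illustration; the patent does not specify concrete values.

```python
import math

# Illustrative sketch of the distance-threshold decision described above.
DISTANCE_THRESHOLD = 30.0  # hypothetical units, e.g. touch-sensor points


def classify_by_distance(p1, p2):
    """p1: where the first gesture left the periphery; p2: where the
    second gesture entered it. Returns the instruction(s) to generate."""
    distance = math.dist(p1, p2)  # Euclidean distance between positions
    if distance > DISTANCE_THRESHOLD:
        # Far apart: treat as two independent gestures, generated in order.
        return ["first_control_instruction", "second_control_instruction"]
    # Close together: merge into a single combined control instruction.
    return ["combined_control_instruction"]
```

The rationale is that two contacts near the same spot on the periphery likely belong to one continuous user action, whereas widely separated contacts are independent gestures.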
Please refer to FIGS. 2A, 4A and 8 together; FIG. 8 is a flowchart illustrating a method for triggering an edge swipe gesture according to a fourth embodiment of the present invention. The present embodiment can also be used to confirm that a gesture at the periphery of the touch operating region did not occur accidentally, the gesture being confirmed against other gestures within a predetermined time.
Step S41: first, the sensor 12 detects an edge swipe gesture contacting the periphery 10a of the operating region (i.e., the touch surface 10); for example, the finger 2 shown in FIG. 4A moves from the position P2 on the periphery 10a toward the inside of the touch surface 10. It should be noted that the edge swipe gesture of this embodiment is a gesture that enters the operating region from the periphery 10a or slides along the periphery 10a, and the edge swipe gesture may correspond to functions such as a pull-down menu, volume adjustment, or screen brightness adjustment, but is not limited thereto and may be determined according to the applications of a touch system or a window system.
Step S42: next, the processor 14 determines whether any other gesture contacted the periphery 10a within a predetermined time before the edge swipe gesture contacts the periphery 10a.
Steps S43–S44: when no other gesture has contacted the periphery 10a within the predetermined time before the edge swipe gesture contacts the periphery 10a, the processor 14 generates a peripheral slide control instruction. Otherwise, if another gesture is detected contacting the periphery 10a within the predetermined time, the processor 14 may output a displacement signal according to the edge swipe gesture and does not generate the peripheral slide control instruction.
The difference between this embodiment and the third embodiment of the present invention is that the processor 14 of this embodiment preferably includes a temporary storage unit or buffer unit for recording whether another gesture has contacted the periphery 10a within the predetermined time. For example, when the sensor 12 detects a previous gesture contacting the periphery 10a, the previous gesture may be stored in the buffer unit of the processor 14 for a period of time, and when that period exceeds the predetermined time, the information of the previous gesture is cleared. Thus, in step S43, when the processor 14 does not find any record related to a previous gesture in the buffer unit, the peripheral slide control instruction can be triggered according to the edge swipe gesture.
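A minimal sketch of such a buffer unit with expiry follows, assuming a monotonic clock and illustrative class and method names (the patent does not specify an implementation):

```python
import time

# Sketch of the buffer unit described above: a previous gesture contacting
# the periphery is kept for the predetermined time, then cleared. All names
# here are illustrative, not taken from the patent text.
PREDETERMINED_TIME = 0.5  # seconds; the patent gives 500 ms as an example


class GestureBuffer:
    def __init__(self, now=time.monotonic):
        self._now = now        # injectable clock, useful for testing
        self._entry = None     # (gesture, timestamp) or None

    def record(self, gesture):
        """Store the gesture that just contacted the periphery."""
        self._entry = (gesture, self._now())

    def recent_gesture(self):
        """Return the stored gesture if still within the predetermined
        time; otherwise clear it and return None (step S43's lookup)."""
        if self._entry is None:
            return None
        gesture, t = self._entry
        if self._now() - t > PREDETERMINED_TIME:
            self._entry = None  # expired: clear the previous gesture
            return None
        return gesture
```

In use, a `None` result from `recent_gesture()` would correspond to generating the peripheral slide control instruction, while a stored gesture would correspond to outputting a displacement signal instead.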
As described in the first embodiment of the present invention, the processor 14 may also record a first position of the edge swipe gesture relative to the periphery 10a and, upon detecting a previous gesture contacting the periphery 10a within the predetermined time, record a second position of the previous gesture relative to the periphery. The processor 14 may further determine whether to generate the peripheral slide control instruction according to the distance between the first position and the second position. For example, the processor 14 may generate the peripheral slide control instruction when the distance exceeds a distance threshold; and when the distance is less than the distance threshold, the processor 14 may combine the edge swipe gesture and the previous gesture into a continuous gesture and not generate the peripheral slide control instruction.
As described above, the known touch system has no function for preventing false triggering of the edge swipe gesture. Therefore, the present invention provides a method for preventing false triggering of an edge swipe gesture, which can determine whether an edge swipe gesture was falsely triggered according to the time difference and/or position difference between the object leaving the periphery and entering the touch surface, thereby providing a better user experience.
Although the present invention has been disclosed in the context of the foregoing embodiments, it is not intended to be limited thereto, and various changes and modifications can be made by one skilled in the art without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention is subject to the scope defined by the appended claims.
Claims (9)
1. A gesture-triggered method for a windowing system comprising an operating region, the method comprising:
detecting that a first gesture contacts a periphery of the operating region;
detecting whether a second gesture contacts the periphery within a predetermined time after the first gesture leaves the operating region from the periphery, wherein the second gesture is a gesture entering the operating region from the periphery;
when the second gesture is not detected within the predetermined time, generating a first control instruction corresponding to the first gesture; and
when the second gesture is detected within the predetermined time, combining the first gesture and the second gesture to generate a combined control instruction, so as to prevent an edge swipe gesture from being falsely triggered when a gesture leaves and enters the periphery of the operating region, wherein the edge swipe gesture is a gesture entering the operating region from the periphery.
2. The method of claim 1, wherein the first control instruction is to perform an action or movement corresponding to the first gesture.
3. The method of claim 2, wherein,
the action is to adjust screen brightness, adjust volume, or turn a page, and
the movement is to output a trajectory of the first gesture.
4. The method of claim 1, wherein the combined control instruction is to perform an action or movement corresponding to a combination of the first gesture and the second gesture.
5. The method of claim 1, further comprising:
recording a first location where the first gesture departs from the periphery; and
when the second gesture is detected within the predetermined time, recording that the second gesture enters a second location of the periphery.
6. The method of claim 5, further comprising:
when the distance between the first position and the second position exceeds a distance threshold, sequentially generating a first control instruction corresponding to the first gesture and a second control instruction corresponding to the second gesture; and
combining the first gesture and the second gesture to generate a combined control instruction when the distance between the first position and the second position is less than the distance threshold.
7. The method of claim 1, wherein the operating region is a cursor operating region or a touch operating region of the window system.
8. The method of claim 1, wherein the predetermined time is adjustable.
9. The method of claim 1, wherein the first control instruction corresponding to the first gesture is not generated within the predetermined time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810497373.2A CN108733302B (en) | 2014-03-27 | 2014-03-27 | Gesture triggering method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810497373.2A CN108733302B (en) | 2014-03-27 | 2014-03-27 | Gesture triggering method |
CN201410118128.8A CN104951213B (en) | 2014-03-27 | 2014-03-27 | The method for preventing false triggering boundary slip gesture |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410118128.8A Division CN104951213B (en) | 2014-03-27 | 2014-03-27 | The method for preventing false triggering boundary slip gesture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108733302A CN108733302A (en) | 2018-11-02 |
CN108733302B true CN108733302B (en) | 2020-11-06 |
Family
ID=54165897
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810539945.9A Active CN108874284B (en) | 2014-03-27 | 2014-03-27 | Gesture triggering method |
CN201410118128.8A Active CN104951213B (en) | 2014-03-27 | 2014-03-27 | The method for preventing false triggering boundary slip gesture |
CN201810497373.2A Active CN108733302B (en) | 2014-03-27 | 2014-03-27 | Gesture triggering method |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810539945.9A Active CN108874284B (en) | 2014-03-27 | 2014-03-27 | Gesture triggering method |
CN201410118128.8A Active CN104951213B (en) | 2014-03-27 | 2014-03-27 | The method for preventing false triggering boundary slip gesture |
Country Status (1)
Country | Link |
---|---|
CN (3) | CN108874284B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105468194B (en) * | 2015-11-17 | 2018-09-18 | 小米科技有限责任公司 | Touch-control response method and device |
CN105511776B (en) * | 2015-11-24 | 2018-12-21 | 努比亚技术有限公司 | A kind of mobile terminal and its control method |
CN106527933B (en) * | 2016-10-31 | 2020-09-01 | 努比亚技术有限公司 | Control method and device for edge gesture of mobile terminal |
CN106791005A (en) * | 2016-11-28 | 2017-05-31 | 努比亚技术有限公司 | Mobile terminal and edge gesture false-touch prevention method |
CN108762557A (en) * | 2018-05-22 | 2018-11-06 | 北京集创北方科技股份有限公司 | A kind of touch detecting method and computer readable storage medium |
CN109697012A (en) * | 2018-12-25 | 2019-04-30 | 华勤通讯技术有限公司 | Control method, smartwatch and the storage medium of smartwatch |
CN109756627B (en) * | 2018-12-29 | 2021-09-07 | 努比亚技术有限公司 | Light sensation control method and device and computer readable storage medium |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007103631A2 (en) * | 2006-03-03 | 2007-09-13 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US8547347B2 (en) * | 2008-09-26 | 2013-10-01 | Htc Corporation | Method for generating multiple windows frames, electronic device thereof, and computer program product using the method |
US20130088450A1 (en) * | 2010-04-09 | 2013-04-11 | Sony Computer Entertainment Inc. | Information processing system, operation input device, information processing device, information processing method, program, and information storage medium |
TWI436247B (en) * | 2010-12-31 | 2014-05-01 | Acer Inc | Method for moving objects and electronic apparatus using the same |
TWI446236B (en) * | 2011-01-04 | 2014-07-21 | Sentelic Corp | An electronic device and a control method thereof |
US20120304107A1 (en) * | 2011-05-27 | 2012-11-29 | Jennifer Nan | Edge gesture |
CN102819331B (en) * | 2011-06-07 | 2016-03-02 | 联想(北京)有限公司 | Mobile terminal and touch inputting method thereof |
US9395852B2 (en) * | 2012-05-07 | 2016-07-19 | Cirque Corporation | Method for distinguishing between edge swipe gestures that enter a touch sensor from an edge and other similar but non-edge swipe actions |
TWI464647B (en) * | 2012-09-10 | 2014-12-11 | Elan Microelectronics Corp | Touch device and gesture identifying method thereof |
TWI475440B (en) * | 2012-09-10 | 2015-03-01 | Elan Microelectronics Corp | Touch device and gesture identifying method thereof |
CN102929528A (en) * | 2012-09-27 | 2013-02-13 | 鸿富锦精密工业(深圳)有限公司 | Device with picture switching function and picture switching method |
-
2014
- 2014-03-27 CN CN201810539945.9A patent/CN108874284B/en active Active
- 2014-03-27 CN CN201410118128.8A patent/CN104951213B/en active Active
- 2014-03-27 CN CN201810497373.2A patent/CN108733302B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN108874284B (en) | 2020-11-06 |
CN108874284A (en) | 2018-11-23 |
CN104951213B (en) | 2018-06-22 |
CN104951213A (en) | 2015-09-30 |
CN108733302A (en) | 2018-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI514248B (en) | Method for preventing from accidentally triggering edge swipe gesture and gesture triggering | |
CN108733302B (en) | Gesture triggering method | |
CN105824559B (en) | False touch recognition and processing method and electronic equipment | |
US9529527B2 (en) | Information processing apparatus and control method, and recording medium | |
US9069386B2 (en) | Gesture recognition device, method, program, and computer-readable medium upon which program is stored | |
US9411418B2 (en) | Display device, display method, and program | |
US20140237422A1 (en) | Interpretation of pressure based gesture | |
JP6177876B2 (en) | Touch panel system | |
JP5805890B2 (en) | Touch panel system | |
EP2839357A1 (en) | Rapid gesture re-engagement | |
WO2015131675A1 (en) | Compensation method for broken slide paths, electronic device and computer storage medium | |
KR20140031254A (en) | Method for selecting an element of a user interface and device implementing such a method | |
JP6410537B2 (en) | Information processing apparatus, control method therefor, program, and storage medium | |
US20160196034A1 (en) | Touchscreen Control Method and Terminal Device | |
EP2634680A1 (en) | Graphical user interface interaction on a touch-sensitive device | |
US10296130B2 (en) | Display control apparatus, display control method, and storage medium storing related program | |
US10394442B2 (en) | Adjustment of user interface elements based on user accuracy and content consumption | |
JP2015141526A (en) | Information processor, information processing method and program | |
JP6151087B2 (en) | Touch panel system | |
US20140320430A1 (en) | Input device | |
US20210286499A1 (en) | Touch position detection system | |
WO2016206438A1 (en) | Touch screen control method and device and mobile terminal | |
EP2750016A1 (en) | Method of operating a graphical user interface and graphical user interface | |
JP6971573B2 (en) | Electronic devices, their control methods and programs | |
KR101706909B1 (en) | Finger Input Devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||