US20100088595A1 - Method of Tracking Touch Inputs - Google Patents

Method of Tracking Touch Inputs

Info

Publication number
US20100088595A1
US20100088595A1 (application US12/244,780)
Authority
US
United States
Prior art keywords
frame
variation
axis
center position
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/244,780
Inventor
Chen-Hsiang Ho
Yu-Min Hsu
Chia-Feng Yang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AU Optronics Corp
Original Assignee
AU Optronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2008-10-03
Filing date: 2008-10-03
Publication date: 2010-04-08
Application filed by AU Optronics Corp
Priority to US12/244,780
Assigned to AU OPTRONICS CORP. Assignors: HO, CHEN-HSIANG; HSU, YU-MIN; YANG, CHIA-FENG (assignment of assignors interest; see document for details)
Priority to TW097150988A (published as TW201015394A)
Priority to CN2009100027681A (published as CN101464749B)
Publication of US20100088595A1
Status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048: Indexing scheme relating to G06F3/048
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

For a multitouch input configuration, tracking touch inputs includes calculating a first center position corresponding to two touch points along a first axis for a first frame, detecting variation of the first center position from the first frame to a second frame, and determining a gesture type according to the variation of the first center position.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to touch input devices, and more particularly, to a method of tracking touch inputs for a multitouch input device.
  • 2. Description of the Prior Art
  • Input devices that interface with computing devices provide means for digitizing and transferring text, images, video, and also commands, according to control by a user. A keyboard may be utilized for transmitting text in a sequence dictated by keystrokes made by the user. A webcam may capture sequences of images, and transfer the images to the computing device for processing and storage. A mouse may be utilized to operate the computing device, allowing the user to point at and click on graphical controls, such as icons, scroll bars, and menus.
  • Touchpads are input devices which detect physical contact, and transfer coordinates thereof to the computing device. For example, if the user taps the touchpad, coordinates corresponding to the center of an area touched by the user, along with duration of the tap, may be transferred to the computing device for controlling the computing device. Likewise, if the user drags his/her finger in a path along the surface of the touchpad, a series of coordinates may be transferred to the computing device, such that the computing device may discern direction of motion of the user's finger, and respond with an appropriate action.
  • Previously, touchpad input devices were limited to tracking contact from one source, such as contact from one finger or a stylus. However, simultaneous tracking of multiple points of contact, known as “multitouch,” is rapidly becoming a feasible technology. Popular commands typically associated with multitouch input devices include zooming and rotating. For example, by contacting the multitouch input device with two fingers, and bringing the two fingers together, the user may control the computing device to zoom out. Likewise, by moving the two fingers apart, the user may control the computing device to zoom in.
  • Please refer to FIG. 5, which is a diagram of a multitouch input captured by a multitouch device. To detect contact, the multitouch device may include an array of sensors, each sensor corresponding to a row and a column. For example, each row of sensors may form a channel along a Y axis, and each column of sensors may form a channel along an X axis. Then, each sensor may generate a signal in response to the contact, and the signal may be read out as a response on the X axis and a response on the Y axis. For a single input, only one response, or cluster of responses, will be detected on each axis. However, as shown in FIG. 5, for multiple inputs, virtual touch positions will be generated in addition to finger touches. In other words, the multitouch input cannot be utilized to distinguish the finger touches from the virtual touch positions.
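  • This ambiguity can be illustrated with a short sketch (an illustration only, not part of the original disclosure; the function name and sample coordinates are invented): given only the per-axis responses, every pairing of an X response with a Y response is a candidate position, so the finger touches and the virtual touch positions look identical.
```python
# Illustrative sketch only; the function name and coordinates are not from the patent.
from itertools import product

def candidate_positions(x_responses, y_responses):
    """Every position consistent with the per-axis responses; real and virtual touches look alike."""
    return list(product(x_responses, y_responses))

# Two fingers at <X1,Y2> and <X2,Y1> activate X channels {X1, X2} and Y channels {Y1, Y2},
# yielding four candidates: the two finger touches plus two virtual touch positions.
print(candidate_positions([1, 3], [2, 4]))  # [(1, 2), (1, 4), (3, 2), (3, 4)]
```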
  • SUMMARY OF THE INVENTION
  • According to one embodiment of the present invention, a method of tracking touch inputs comprises calculating a first center position corresponding to two touch points along a first axis for a first frame, detecting variation of the first center position from the first frame to a second frame, and determining a gesture type according to the variation of the first center position.
  • According to another embodiment of the present invention, a method of tracking touch inputs comprises calculating a first center position corresponding to two touch points along a first axis for a first frame, detecting variation of the first center position from the first frame to a second frame, calculating a second center position corresponding to the two touch points along a second axis for the first frame, detecting variation of the second center position from the first frame to the second frame, and determining a zoom gesture type when the variation of the first center position and the variation of the second center position are both lower than a predetermined threshold.
  • According to the embodiments of the present invention, a touch input tracking device comprises a receiving module, a center point calculation module, and a gesture determination module. The receiving module is for receiving a first frame and a second frame. The center point calculation module is for calculating a first center point and a second center point of two touch points in the first frame and the second frame, the first center point corresponding to a first axis and the second center point corresponding to a second axis. The gesture determination module is for determining a gesture type according to variation of the first center point from the first frame to the second frame, and variation of the second center point from the first frame to the second frame.
  • According to the embodiments of the present invention, a computer system comprises a touch input tracking device, a communication interface, a display, and a processor. The touch input tracking device comprises a receiving module, a center point calculation module, and a gesture determination module. The receiving module is for receiving a first frame and a second frame. The center point calculation module is for calculating a first center point and a second center point of two touch points in the first frame and the second frame, the first center point corresponding to a first axis and the second center point corresponding to a second axis. The gesture determination module is for determining a gesture type according to variation of the first center point from the first frame to the second frame, and variation of the second center point from the first frame to the second frame. The communication interface is for receiving the gesture type from the gesture determination module. The processor is for modifying an image according to the gesture type and driving the display to display the image.
  • These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a flowchart of a process for tracking touch inputs.
  • FIG. 2 is a flowchart of a second process for tracking touch inputs.
  • FIG. 3 is a diagram of a touch input tracking device.
  • FIG. 4 is a diagram of a computer system utilizing the touch input tracking device of FIG. 3.
  • FIG. 5 is a diagram of a multitouch input captured by a multitouch device.
  • FIG. 6 is a diagram illustrating detecting change of position for multiple inputs in a multitouch device through midpoint calculations.
  • FIG. 7 to FIG. 10 are diagrams illustrating detecting change of position for multiple inputs in a multitouch device through component changes.
  • FIG. 11 to FIG. 16 are diagrams illustrating detecting change of position for multiple inputs in a multitouch device through midpoint shifts.
  • FIG. 17 is a diagram of tracking touch inputs according to an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Please refer to FIG. 6, which is a diagram illustrating detecting change of position for multiple inputs in a multitouch device through midpoint calculations. A capacitive sensor array may be utilized to detect touch points made by a first input, labeled “One finger,” and a second input, labeled “Another finger.” Initially, in a previously captured frame, the first input is at a first position <X1,Y2>, and the second input is at a second position <X2,Y1>. After the second input is moved, in a presently captured frame, the second input is at a third position <X4,Y4>, and the first input remains near the first position at a fourth position <X3,Y3>. In each frame, center positions may be calculated. For instance, in the previously captured frame, a first center position <Xc,Yc> may be calculated. The first center position <Xc,Yc> may be calculated as a midpoint of the first position and the second position, e.g. <Xc,Yc>=<(X1+X2)/2,(Y1+Y2)/2>. Likewise, in the presently captured frame, a second center position <Xc′,Yc′> may be calculated. The second center position <Xc′,Yc′> may be calculated as a midpoint of the third position and the fourth position, e.g. <Xc′,Yc′>=<(X3+X4)/2,(Y3+Y4)/2>. Then, utilizing the first center position <Xc,Yc> and the second center position <Xc′,Yc′>, a first variation ΔX and a second variation ΔY from the first center position to the second center position may be calculated. In other words, the first variation ΔX may represent change along the X-axis from the previously captured frame to the presently captured frame of the midpoint between the first input and the second input. Likewise, the second variation ΔY may represent change along the Y-axis from the previously captured frame to the presently captured frame of the midpoint between the first input and the second input. The first variation ΔX may be calculated as ΔX=Xc′−Xc, whereas the second variation ΔY may be calculated as ΔY=Yc′−Yc.
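  • The midpoint and variation calculations above can be summarized in a minimal sketch (illustrative only; the helper names center and center_variation do not appear in the disclosure).
```python
# Illustrative sketch; helper names are not from the patent.
def center(p1, p2):
    """Midpoint of two touch points given as (x, y) pairs."""
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)

def center_variation(prev_points, curr_points):
    """Per-axis change of the midpoint from the previous frame to the present frame."""
    xc, yc = center(*prev_points)     # <Xc, Yc> in the previously captured frame
    xc2, yc2 = center(*curr_points)   # <Xc', Yc'> in the presently captured frame
    return xc2 - xc, yc2 - yc         # (first variation, second variation)

# Example: the first input stays put while the second input moves up and to the right.
dx, dy = center_variation(((1.0, 4.0), (3.0, 2.0)), ((1.0, 4.0), (5.0, 5.0)))
print(dx, dy)  # 1.0 1.5
```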
  • Please refer to FIG. 7 to FIG. 10, which are diagrams illustrating detecting change of position for multiple inputs in a multitouch device through component changes. As shown in FIG. 7, if the first input and the second input are drawn apart along the Y-axis, a first Y-axis difference |Yp| between the first input and the second input may be calculated for a previous frame. Likewise, a first X-axis difference |Xp| between the first input and the second input may be calculated for the previous frame. Then, a second Y-axis difference |Y| and a second X-axis difference |X| may be calculated for a present frame. For the case of drawing the first input and the second input apart along the Y-axis (FIG. 7), the first Y-axis difference |Yp| may be lower than the second Y-axis difference |Y|, whereas the first X-axis difference |Xp| and the second X-axis difference |X| may remain nominally constant or exhibit little variation. For the case of drawing the first input and the second input together along the Y-axis (FIG. 8), the first Y-axis difference |Yp| may be greater than the second Y-axis difference |Y|, whereas the first X-axis difference |Xp| and the second X-axis difference |X| may remain nominally constant or exhibit little variation. For the case of drawing the first input and the second input apart along the X-axis (FIG. 9), the first X-axis difference |Xp| may be lower than the second X-axis difference |X|, whereas the first Y-axis difference |Yp| and the second Y-axis difference |Y| may remain nominally constant or exhibit little variation. For the case of drawing the first input and the second input together along the X-axis (FIG. 10), the first X-axis difference |Xp| may be greater than the second X-axis difference |X|, whereas the first Y-axis difference |Yp| and the second Y-axis difference |Y| may remain nominally constant or exhibit little variation. In all of the above cases for FIG. 7 to FIG. 10, the midpoint may remain nominally constant or exhibit little variation along both the Y-axis and the X-axis. In other words, the first variation ΔX and the second variation ΔY may be close to zero.
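  • The component-difference test just described might be sketched as follows (illustrative only; the threshold eps and the function names are assumptions rather than terms from the disclosure).
```python
# Illustrative sketch; eps and the function names are assumptions.
def separation(p1, p2):
    """Per-axis absolute differences between the two touch points."""
    return abs(p1[0] - p2[0]), abs(p1[1] - p2[1])

def spread_or_pinch(prev_points, curr_points, eps=2.0):
    """Compare |Xp|, |Yp| of the previous frame with |X|, |Y| of the present frame."""
    xp, yp = separation(*prev_points)
    x, y = separation(*curr_points)
    if y - yp > eps or x - xp > eps:
        return "drawn apart"      # as in FIG. 7 (Y-axis) or FIG. 9 (X-axis)
    if yp - y > eps or xp - x > eps:
        return "drawn together"   # as in FIG. 8 (Y-axis) or FIG. 10 (X-axis)
    return "little variation"
```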
  • Please refer to FIG. 11 to FIG. 16, which are diagrams illustrating detecting change of position for multiple inputs in a multitouch device through midpoint shifts. As shown in FIG. 11, the first input may remain nominally constant (shown by a circle), whereas the second input may move in clockwise rotation around the first input (shown by an arcing arrow). In this case, the first variation ΔX is positive along the X-axis, and the second variation ΔY is also positive, though to a lesser degree, along the Y-axis. For clockwise rotation as shown in FIG. 12, the second variation ΔY is positive along the Y-axis, and the first variation ΔX is positive along the X-axis. For clockwise rotation as shown in FIG. 13, the second variation ΔY is negative along the Y-axis, and the first variation ΔX is positive along the X-axis. For counter-clockwise rotation as shown in FIG. 14, the second variation ΔY is negative along the Y-axis, and the first variation ΔX is negative along the X-axis. For counter-clockwise rotation as shown in FIG. 15, the second variation ΔY is negative along the Y-axis, and the first variation ΔX is negative along the X-axis. For counter-clockwise rotation as shown in FIG. 16, the second variation ΔY is positive along the Y-axis, and the first variation ΔX is negative along the X-axis.
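  • Read as a sign table, the six cases above reduce to the following lookup (an illustration only; the table and function names are not from the disclosure).
```python
# Illustrative sketch; the table and names are not from the patent.
ROTATION_BY_SIGNS = {
    ("+", "+"): "clockwise (FIG. 11, FIG. 12)",
    ("+", "-"): "clockwise (FIG. 13)",
    ("-", "-"): "counter-clockwise (FIG. 14, FIG. 15)",
    ("-", "+"): "counter-clockwise (FIG. 16)",
}

def sign(value):
    return "+" if value > 0 else "-"

def rotation_case(dx, dy):
    """Look up the rotation direction from the signs of the midpoint variations."""
    return ROTATION_BY_SIGNS[(sign(dx), sign(dy))]
```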
  • In the following, please refer to FIG. 17 in conjunction with FIG. 1 to FIG. 2. FIG. 17 is a diagram of tracking touch inputs according to an embodiment of the present invention. FIG. 1 is a flowchart of a process 10 for tracking touch inputs according to the embodiment of FIG. 17. The process 10 comprises the following steps:
  • Step 100: Calculate a first center position corresponding to two touch points along a first axis for a first frame.
  • Step 102: Detect variation of the first center position from the first frame to a second frame.
  • Step 104: Determine a gesture type according to the variation of the first center position.
  • In the process 10, the first frame may be the previous frame, and the second frame may be the present frame, as described above. In FIG. 17, center vector variation is calculated on an X-Y slide (Step 1700), such as the X-axis and the Y-axis shown in FIG. 7 to FIG. 16. The center vector variation may include the X-axis variation ΔX and the Y-axis variation ΔY, and Step 100 to Step 102 of FIG. 1 may be utilized to calculate, for example, the X-axis variation ΔX by calculating the first center position <Xc,Yc> and the second center position <Xc′,Yc′>, and finding a difference between the second center position and the first center position, e.g. ΔX=Xc′−Xc. Likewise, the Y-axis variation ΔY may be calculated as ΔY=Yc′−Yc. Then, utilizing the X-axis variation ΔX, the gesture type may be determined (Step 104), which is shown as clockwise rotation (Step 1704) or counter-clockwise rotation (Step 1705) in FIG. 17. For example, if the X-axis variation ΔX is greater than a predetermined variation M, clockwise rotation may be determined (Step 1704). On the other hand, if the X-axis variation ΔX is less than the predetermined variation M, counter-clockwise rotation may be determined (Step 1705). Then, the determined rotation, i.e. the clockwise rotation or the counter-clockwise rotation, may be shown on a screen 1709 via a communication interface 1707 and a host computer system 1708.
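  • Step 104 might be sketched as follows (an interpretation only: the symmetric dead band around zero is one possible reading of the predetermined variation M, and the function name is invented); the X-axis variation ΔX can be obtained as in the midpoint sketch above.
```python
# Illustrative sketch; the dead-band reading of M and the function name are assumptions.
def determine_rotation(dx, m=1.0):
    """Map the X-axis center variation to a rotation gesture (process 10, Step 104)."""
    if dx > m:
        return "clockwise rotation"          # Step 1704
    if dx < -m:
        return "counter-clockwise rotation"  # Step 1705
    return None                              # little or no variation: candidate for the zoom branch of FIG. 2
```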
  • Please refer to FIG. 2, which is a flowchart of a second process 20 for tracking touch inputs according to the embodiment of FIG. 17. The second process 20 may be utilized in conjunction with the process 10, and comprises the following steps:
  • Step 200: Calculate a first center position corresponding to two touch points along a first axis for a first frame.
  • Step 202: Detect variation of the first center position from the first frame to a second frame.
  • Step 204: Calculate a second center position corresponding to the two touch points along a second axis for the first frame.
  • Step 206: Detect variation of the second center position from the first frame to the second frame.
  • Step 208: Determine a zoom gesture type when the variation of the first center position and the variation of the second center position are both lower than a predetermined threshold.
  • In the second process 20, the first frame may be the previous frame, and the second frame may be the present frame, as described above. In FIG. 17, center vector variation is calculated on an X-Y slide (Step 1700), such as the X-axis and the Y-axis shown in FIG. 7 to FIG. 16. The center vector variation may include the X-axis variation ΔX and the Y-axis variation ΔY, and Step 200 to Step 206 of FIG. 2 may be utilized to calculate, for example, the X-axis variation ΔX by calculating the first center position <Xc,Yc> and the second center position <Xc′,Yc′>, and finding a difference between the second center position and the first center position, e.g. ΔX=Xc′−Xc. Likewise, the Y-axis variation ΔY may be calculated as ΔY=Yc′−Yc. Then, if little or no variation is detected in the X-axis variation ΔX and the Y-axis variation ΔY, the zoom gesture type may be determined (Step 208; Step 1702 to Step 1703). As shown in FIG. 17, if the first Y-axis difference |Yp| is greater than the second Y-axis difference |Y| by a predetermined variation threshold N, the zoom out gesture is determined (Step 1702). Likewise, if the first X-axis difference |Xp| is greater than the second X-axis difference |X| by a predetermined variation threshold K, the zoom out gesture is determined (Step 1702). On the other hand, if the first Y-axis difference |Yp| is less than the second Y-axis difference |Y| by the predetermined variation threshold N, the zoom in gesture is determined (Step 1703). Likewise, if the first X-axis difference |Xp| is less than the second X-axis difference |X| by the predetermined variation threshold K, the zoom in gesture is determined (Step 1703). Then, the determined zoom gesture, i.e. the zoom in gesture or the zoom out gesture, may be shown on the screen 1709 via the communication interface 1707 and the host computer system 1708.
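  • The zoom branch of process 20 might be sketched as follows (illustrative only: the conditions follow the convention of the Background and the claims, and the parameter names threshold, n, and k stand in for the predetermined threshold and the predetermined variation thresholds N and K).
```python
# Illustrative sketch; parameter names and values are assumptions.
def track_zoom(frame1, frame2, threshold=1.0, n=2.0, k=2.0):
    """Process 20 sketch: report a zoom gesture only when the midpoint barely moves."""
    (x1, y1), (x2, y2) = frame1
    (x3, y3), (x4, y4) = frame2
    dx = (x3 + x4) / 2.0 - (x1 + x2) / 2.0   # Steps 200-202: X-axis center variation
    dy = (y3 + y4) / 2.0 - (y1 + y2) / 2.0   # Steps 204-206: Y-axis center variation
    if abs(dx) >= threshold or abs(dy) >= threshold:
        return None                          # the midpoint shifted, so this is not a zoom gesture
    yp, y = abs(y1 - y2), abs(y3 - y4)       # first / second Y-axis differences |Yp|, |Y|
    xp, x = abs(x1 - x2), abs(x3 - x4)       # first / second X-axis differences |Xp|, |X|
    if yp - y > n or xp - x > k:
        return "zoom out"                    # Step 1702: the touch points were drawn together
    if y - yp > n or x - xp > k:
        return "zoom in"                     # Step 1703: the touch points were drawn apart
    return None
```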
  • Please refer to FIG. 3, which is a diagram of a touch input tracking device 30 that may be utilized to interface with a touch input device 31 for tracking touch inputs and determining gesture type. The touch input tracking device 30 comprises a receiving module 301, a center point calculation module 302, and a gesture determination module 303. The receiving module 301 receives the first frame and the second frame from the touch input device 31. The center point calculation module 302 calculates the first center point and the second center point of two touch points in the first frame and the second frame. The first center point corresponds to a first axis, such as the X-axis, and the second center point corresponds to a second axis, such as the Y-axis. The gesture determination module 303 determines a gesture type according to variation, e.g. the X-axis variation ΔX, of the first center point from the first frame to the second frame, and variation, e.g. the Y-axis variation ΔY, of the second center point from the first frame to the second frame.
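  • The module split of FIG. 3 could look roughly like the following sketch (class and method names are invented for illustration and do not appear in the disclosure).
```python
# Illustrative sketch; class and method names are not from the patent.
class CenterPointCalculationModule:
    """Calculates the first (X-axis) and second (Y-axis) center points of two touch points."""
    def centers(self, frame):
        (x1, y1), (x2, y2) = frame
        return (x1 + x2) / 2.0, (y1 + y2) / 2.0

class GestureDeterminationModule:
    """Determines a gesture type from the frame-to-frame center-point variations."""
    def determine(self, dx, dy, m=1.0):
        if abs(dx) <= m and abs(dy) <= m:
            return "zoom"                        # little variation of both center points
        if dx > m:
            return "clockwise rotation"
        if dx < -m:
            return "counter-clockwise rotation"
        return None

class TouchInputTrackingDevice:
    """Mirrors modules 301-303: receive two frames, compute centers, determine the gesture."""
    def __init__(self):
        self.calc = CenterPointCalculationModule()
        self.gesture = GestureDeterminationModule()

    def track(self, first_frame, second_frame):
        xc, yc = self.calc.centers(first_frame)
        xc2, yc2 = self.calc.centers(second_frame)
        return self.gesture.determine(xc2 - xc, yc2 - yc)
```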
  • Please refer to FIG. 4, which is a diagram of a computer system 40, which may be utilized to interface with the touch input device 31. In addition to the touch input tracking device 30 described above, the computer system 40 further comprises a communication interface 32, a processor 33, and a display 34. The communication interface 32 receives the gesture type from the gesture determination module 303. The processor 33 modifies an image according to the gesture type and drives the display 34 to display the image. The display 34 may display the image before modification, or a modified image resulting from the processor 33 modifying the image.
  • Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention.

Claims (13)

1. A method of tracking touch inputs, the method comprising:
calculating a first center position corresponding to two touch points along a first axis for a first frame;
detecting variation of the first center position from the first frame to a second frame; and
determining a gesture type according to the variation of the first center position.
2. The method of claim 1, wherein determining the gesture type according to the variation of the first center position comprises determining a rotation when the variation of the first center position is greater than a predetermined rotation threshold.
3. The method of claim 1, further comprising:
calculating a second center position corresponding to the two touch points along a second axis for the first frame;
detecting variation of the second center position from the first frame to the second frame; and
determining a rotation direction of the gesture type according to the variation of the first center position and the variation of the second center position.
4. The method of claim 3, wherein determining the rotation direction of the gesture type according to the variation of the first center position and the variation of the second center position comprises determining the rotation direction of the gesture type according to polarities of the variation of the first center position and the variation of the second center position.
5. The method of claim 4, wherein determining the rotation direction of the gesture type according to the variation of the first center position and the variation of the second center position comprises determining clockwise rotation when the variation of the first center position is greater than zero.
6. The method of claim 4, wherein determining the rotation direction of the gesture type according to the variation of the first center position and the variation of the second center position comprises determining counter-clockwise rotation when the variation of the first center position is less than zero.
7. A method of tracking touch inputs, the method comprising:
calculating a first center position corresponding to two touch points along a first axis for a first frame;
detecting variation of the first center position from the first frame to a second frame;
calculating a second center position corresponding to the two touch points along a second axis for the first frame;
detecting variation of the second center position from the first frame to the second frame; and
determining a zoom gesture type when the variation of the first center position and the variation of the second center position are both lower than a predetermined threshold.
8. The method of claim 7, wherein determining the zoom gesture type comprises determining a zoom out gesture when a first distance variation of the two touch points corresponding to the first frame along the first axis is greater than a second distance variation of the two touch points corresponding to the second frame along the first axis by a predetermined zoom threshold.
9. The method of claim 7, wherein determining the zoom gesture type comprises determining a zoom out gesture when a first distance variation of the two touch points corresponding to the first frame along the second axis is greater than a second distance variation of the two touch points corresponding to the second frame along the second axis by a predetermined zoom threshold.
10. The method of claim 7, wherein determining the zoom gesture type comprises determining a zoom in gesture when a first distance variation of the two touch points corresponding to the first frame along the first axis is less than a second distance variation of the two touch points corresponding to the second frame along the first axis by a predetermined zoom threshold.
11. The method of claim 7, wherein determining the zoom gesture type comprises determining a zoom in gesture when a first distance variation of the two touch points corresponding to the first frame along the second axis is less than a second distance variation of the two touch points corresponding to the second frame along the second axis by a predetermined zoom threshold.
12. A touch input tracking device comprising:
a receiving module for receiving a first frame and a second frame;
a center point calculation module for calculating a first center point and a second center point of two touch points in the first frame and the second frame, the first center point corresponding to a first axis and the second center point corresponding to a second axis; and
a gesture determination module for determining a gesture type according to variation of the first center point from the first frame to the second frame, and variation of the second center point from the first frame to the second frame.
13. A computer system comprising:
a touch input tracking device comprising:
a receiving module for receiving a first frame and a second frame;
a center point calculation module for calculating a first center point and a second center point of two touch points in the first frame and the second frame, the first center point corresponding to a first axis and the second center point corresponding to a second axis; and
a gesture determination module for determining a gesture type according to variation of the first center point from the first frame to the second frame, and variation of the second center point from the first frame to the second frame;
a communication interface for receiving the gesture type from the gesture determination module;
a display; and
a processor for modifying an image according to the gesture type and driving the display to display the image.
US12/244,780 2008-10-03 2008-10-03 Method of Tracking Touch Inputs Abandoned US20100088595A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/244,780 US20100088595A1 (en) 2008-10-03 2008-10-03 Method of Tracking Touch Inputs
TW097150988A TW201015394A (en) 2008-10-03 2008-12-26 Method of tracking touch inputs and related touch input tracking device and computer system
CN2009100027681A CN101464749B (en) 2008-10-03 2009-01-22 Method for processing touch control type input signal, its processing apparatus and computer system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/244,780 US20100088595A1 (en) 2008-10-03 2008-10-03 Method of Tracking Touch Inputs

Publications (1)

Publication Number Publication Date
US20100088595A1 true US20100088595A1 (en) 2010-04-08

Family

ID=40805361

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/244,780 Abandoned US20100088595A1 (en) 2008-10-03 2008-10-03 Method of Tracking Touch Inputs

Country Status (3)

Country Link
US (1) US20100088595A1 (en)
CN (1) CN101464749B (en)
TW (1) TW201015394A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101887326B (en) * 2010-06-13 2012-02-22 友达光电股份有限公司 Image scanning system and control method thereof
CN102568403B (en) * 2010-12-24 2014-06-04 联想(北京)有限公司 Electronic instrument and object deletion method thereof
CN102736838B (en) * 2011-03-31 2016-06-22 比亚迪股份有限公司 The recognition methods of multi-point rotating movement and device
TWI456454B (en) * 2012-02-08 2014-10-11 Acer Inc Method for processing touch signal and electronic device using the same
CN103257729B (en) * 2012-02-17 2015-12-02 宏碁股份有限公司 The disposal route of touch signal and electronic installation
CN103885700A (en) * 2012-12-21 2014-06-25 富士康(昆山)电脑接插件有限公司 Touch screen zoom command method
CN105092919B (en) * 2014-05-04 2017-11-10 固纬电子实业股份有限公司 The gear switch method of touch measuring instrument

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080192025A1 (en) * 2007-02-13 2008-08-14 Denny Jaeger Touch input devices for display/sensor screen

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060026536A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060066588A1 (en) * 2004-09-24 2006-03-30 Apple Computer, Inc. System and method for processing raw data of track pad device
US20080168404A1 (en) * 2007-01-07 2008-07-10 Apple Inc. List Scrolling and Document Translation, Scaling, and Rotation on a Touch-Screen Display
US20080309630A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Techniques for reducing jitter for taps
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
US20090184939A1 (en) * 2008-01-23 2009-07-23 N-Trig Ltd. Graphical object manipulation with a touch sensitive screen
US20090322701A1 (en) * 2008-06-30 2009-12-31 Tyco Electronics Corporation Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110007007A1 (en) * 2009-07-13 2011-01-13 Hon Hai Precision Industry Co., Ltd. Touch control method
US20110012927A1 (en) * 2009-07-14 2011-01-20 Hon Hai Precision Industry Co., Ltd. Touch control method
US20120212440A1 (en) * 2009-10-19 2012-08-23 Sharp Kabushiki Kaisha Input motion analysis method and information processing device
US20110265021A1 (en) * 2010-04-23 2011-10-27 Primax Electronics Ltd. Touchpad controlling method and touch device using such method
US8370772B2 (en) * 2010-04-23 2013-02-05 Primax Electronics Ltd. Touchpad controlling method and touch device using such method
US20120249440A1 (en) * 2011-03-31 2012-10-04 Byd Company Limited method of identifying a multi-touch rotation gesture and device using the same
US8743065B2 (en) * 2011-03-31 2014-06-03 Byd Company Limited Method of identifying a multi-touch rotation gesture and device using the same
US20120327126A1 (en) * 2011-06-27 2012-12-27 Nokia Corporation Method and apparatus for causing predefined amounts of zooming in response to a gesture
US8971572B1 (en) 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US9372546B2 (en) 2011-08-12 2016-06-21 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US20140145991A1 (en) * 2012-11-29 2014-05-29 Konica Minolta, Inc. Information processing apparatus installed with touch panel as user interface
JP2014106853A (en) * 2012-11-29 2014-06-09 Konica Minolta Inc Information processor, control method for information processor, and control program for information processor

Also Published As

Publication number Publication date
CN101464749A (en) 2009-06-24
CN101464749B (en) 2012-04-04
TW201015394A (en) 2010-04-16

Similar Documents

Publication Publication Date Title
US20100088595A1 (en) Method of Tracking Touch Inputs
US8970503B2 (en) Gestures for devices having one or more touch sensitive surfaces
EP2641149B1 (en) Gesture recognition
US10061510B2 (en) Gesture multi-function on a physical keyboard
US20090278812A1 (en) Method and apparatus for control of multiple degrees of freedom of a display
US20130154933A1 (en) Force touch mouse
US20080134078A1 (en) Scrolling method and apparatus
JP2010134895A (en) Selective input signal rejection and modification
US20120249599A1 (en) Method of identifying a multi-touch scaling gesture and device using the same
JP6004716B2 (en) Information processing apparatus, control method therefor, and computer program
US9778780B2 (en) Method for providing user interface using multi-point touch and apparatus for same
US20130106707A1 (en) Method and device for gesture determination
US10976864B2 (en) Control method and control device for touch sensor panel
WO2012129975A1 (en) Method of identifying rotation gesture and device using the same
US20120249487A1 (en) Method of identifying a multi-touch shifting gesture and device using the same
TWI581127B (en) Input device and electrical device
GB2527918A (en) Glove touch detection
US20150355769A1 (en) Method for providing user interface using one-point touch and apparatus for same
US20140298275A1 (en) Method for recognizing input gestures
US9256360B2 (en) Single touch process to achieve dual touch user interface
CN105474164A (en) Disambiguation of indirect input
KR101436587B1 (en) Method for providing user interface using two point touch, and apparatus therefor
KR20140086805A (en) Electronic apparatus, method for controlling the same and computer-readable recording medium
TWI603226B (en) Gesture recongnition method for motion sensing detector
TW201528114A (en) Electronic device and touch system, touch method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: AU OPTRONICS CORP.,TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HO, CHEN-HSIANG;HSU, YU-MIN;YANG, CHIA-FENG;REEL/FRAME:021626/0154

Effective date: 20081003

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION