TWI452494B - Method for combining at least two touch signals in a computer system - Google Patents

Info

Publication number: TWI452494B
Application number: TW100101410A
Authority: TW (Taiwan)
Other versions: TW201218036A (Chinese)
Inventor: Taizo Yasutake
Applicant: Primax Electronics Ltd
Priority: US 12/914,649 (granted as US8614664B2)

Description

Method of combining at least two touch signals into a computer system

The present invention relates to a method of combining at least two touch signals into a computer system, and more particularly to a mouse having at least two touch pads.

The recently developed multi-touch sensor provides a wider range of input capabilities, including multi-directional input commands for computer graphics. The intuitive and easy-to-learn multi-finger gesture-based multi-dimensional input significantly improves the productivity of 2D/3D operations over standard input devices such as a keyboard and/or a traditional two-dimensional mouse.

The invention provides a novel design concept that applies a multi-touch sensor to a mouse body for multi-touch and multi-directional navigation and control, thereby providing a new way of operating the user interfaces of traditional two-dimensional applications and three-dimensional computer graphics applications.

One embodiment of the present invention includes a novel mouse hardware design and an interface method for generating multi-touch input commands for any application that recognizes multi-touch messages defined by the operating system. Another embodiment of the present invention includes an interface method that utilizes the data packets of a multi-touch sensor as interface commands for an application that cannot accept multi-touch messages as standard input. However, the above embodiments are not intended to limit the invention.

The interface program for generating multi-finger touch input commands includes a kernel-mode device driver and a user-application-level driver for outputting specific messages to the target application.

The above described features and advantages of the invention will be apparent from the following description.

1. Multi-touch multi-directional mouse and control command generation

FIG. 1 illustrates a first embodiment of a multi-touch and multi-directional mouse. A mouse (or computer mouse) can be any conventional form of pointing device that is commonly used for computer work and detects its own two-dimensional motion relative to its supporting surface. In some embodiments, the mouse includes an item held by the user's hand and one or more buttons. The mouse can also include a scroll wheel.

In some embodiments, the mouse 100 has a deep V-shaped recess 102 or other indentation, and the V-shaped recess 102 or notch has a flat surface 104 for receiving a multi-touch sensing board 106. With the multi-touch sensing board 106 disposed on the surface 104 of the V-shaped recess 102, the user can issue the multi-touch finger gesture commands expected by the interface driver. The multi-touch sensing board 106 can independently detect multi-finger touch actions. Such sensing boards are available in the existing personal computer market; examples include the smart touch-sensitive multi-function remote control (smart pad) from Taiwan Yifa Technology and the touchpad from Synopsys International Technology.

Based on ergonomics, the industrial design of a multi-touch and multi-directional mouse can be very diverse. In some embodiments, the mouse 100 has a deep V-shaped notch or recess 102 in the central region of its mouse body 108. The surface 104 of the recess 102 is flat and has no physical boundary at its left and right ends. The depth and width of the recess 102 should be sufficient to mount a small touchpad (an area of at least about 30 mm x 30 mm). That is to say, the industrial design of the recess on the mouse body allows the user to comfortably place a plurality of fingers and smoothly drag them to the right, left, forward, and backward.

The recess 102 provides an ergonomically comfortable touchpad design that keeps the user from accidentally triggering the touchpad during conventional two-dimensional mouse operation. It is worth noting that the industrial design of the multi-touch mouse is not limited by the appearance of the mouse body or the placement of the touchpad.

FIG. 2 illustrates an embodiment of a multi-touch and multi-directional mouse 200. The mouse 200 has a first multi-touch sensing board 106 contained within a deep V-shaped recess 102 and a second sensing board 202 on the side 204 of the mouse body 108. The second sensing board 202 can be a multi-touch sensing board or a single-touch sensing board.

FIGS. 16 and 17 illustrate other embodiments of multi-touch mice 1600 and 1700. FIG. 16 illustrates a mouse body 1602 having two side extensions 1608, which include a touchpad 1604 and a touchpad 1606, respectively. The touch panels 1604 and 1606 can each be a single-touch panel or a multi-touch panel. In one embodiment, the multi-touch mouse 1600 of FIG. 16 includes two single-point trackpads. FIG. 17 shows a mouse body 1702 having two touch panels: a first touch panel 1706 disposed on top of the mouse body 1702, and a second touch panel 1704 disposed on a side extension 1608 of the mouse body 1702. That is to say, two separate touch panels can be placed at different positions on a mouse body.

The multi-touch and multi-directional mouse shown in FIGS. 1, 2, 16 and 17 has traditional two-dimensional mouse functions and can send multi-touch input data packets to the PC host via a universal serial bus (USB) connector, a Bluetooth connector, or other similar connector. The user touches the surface of the main touchpad in the V-shaped recess and/or the second sensing panel on the side of the mouse body with his fingers. These finger touch actions generate raw data packets containing data related to the touch point coordinates. These data packets are used to generate one of a set of pre-defined touch messages in a personal computer operating system; for example, WM_TOUCH and WM_GESTURE in Microsoft's Windows 7 operating system are common pre-defined touch messages. That is to say, the sensing board 106 (the touch panel) generates touch data including the coordinates of the respective touch points on the touch panel, and these touch points are used to generate touch command messages that can be recognized by a computer application.

In some embodiments in which an interface software algorithm is utilized, the touch points on the first sensing board and the touch points on the second sensing board are collectively processed. For example, the user can use three fingers on the first sensing board to generate a three-finger touch gesture. However, when the mouse body is supported only by the thumb and the little finger, using three fingers on the first sensing board may be less comfortable. Therefore, in some embodiments, the interface software combines a two-finger touch action on the first sensing board with a single-finger touch action on the second sensing board to generate a three-finger touch message. In addition, in some embodiments, the interface software can combine touch actions, and the mapping from the combined finger touch actions to the final multi-touch gesture message can be programmed through the interface driver.
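The combining step described above can be sketched in a few lines of illustrative Python. This is a minimal assumption-laden sketch, not the patent's actual interface software; the function name and point layout are invented for illustration:

```python
# Hypothetical sketch: the interface software merges touch points from
# the primary and secondary sensing boards into one list, so that a
# two-finger action on the first board plus a one-finger action on the
# second board is reported as a single three-finger gesture.

def combine_touches(primary_points, secondary_points):
    """Merge (x, y) touch points from both pads into one touch set."""
    return list(primary_points) + list(secondary_points)

combined = combine_touches([(10, 20), (30, 40)], [(5, 5)])
# Three points total -> the driver would emit a three-finger gesture.
```

The gesture message the driver ultimately emits would be keyed off the length of this combined list.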

FIG. 3 is a top plan view of the multi-touch and multi-directional mouse 200 of FIG. 2. FIG. 3 shows a three-finger touch control input instruction generated by combining a two-finger touch action on the first sensing board 106 with a single-finger touch action on the second sensing board 202. FIG. 4 is a side view of the three-finger touch action illustrated in FIG. 3.

FIG. 5 illustrates another use of the second sensing board 502 to control the touch finger count, in accordance with some embodiments. In this figure, the second sensing board 502 (touchpad) includes two configured zones, namely a front half 502a and a rear half 502b. Touching the front half 502a generates a single-finger touch action, and touching the rear half 502b generates a two-finger touch action. These single-finger and two-finger touch actions are generated by user-programmable interface software. Depending on how the interface software is programmed, the user can perform a two-finger touch-and-drag action on the first sensing board 106 together with a single-finger touch action on the rear half 502b of the second sensing board 502 to generate a four-finger touch gesture.

FIG. 6 illustrates another use of a second touchpad 602 (sensing board) to control the touch finger count, in accordance with some embodiments. As shown, the second touchpad 602 includes four configured zones. Under such settings, the number of "finger touches" generated on the second touch panel 602 varies according to the four positions: front-lower, front-upper, rear-lower, and rear-upper. For example, in some embodiments, touching the front-lower zone of the second touch panel 602 represents a single-finger touch action. Similarly, touching the front-upper zone represents a two-finger touch action; touching the rear-lower zone represents a three-finger touch action; and touching the rear-upper zone represents a four-finger touch action. That is to say, under such a program setting, a two-finger touch-and-drag action on the first sensing board 106 combined with a touch action on the rear-upper zone of the second touch panel 602 (sensing board) allows the user to generate up to a six-finger touch gesture.
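The four-zone scheme can be sketched as a lookup table. This is an illustrative assumption (the zone names and function are not from the patent's firmware), showing how two real fingers plus a rear-upper touch would yield a six-finger gesture:

```python
# Hypothetical sketch of the four-zone second touchpad: each zone of
# the secondary pad contributes a simulated finger count, which is added
# to the number of real fingers on the primary pad.

ZONE_FINGERS = {
    "front_lower": 1,
    "front_upper": 2,
    "rear_lower": 3,
    "rear_upper": 4,
}

def gesture_finger_count(primary_fingers, zone):
    """Total finger count reported to the host: real fingers on the
    primary pad plus the simulated count for the touched zone."""
    return primary_fingers + ZONE_FINGERS[zone]

# Two fingers on the primary pad + rear-upper zone -> six-finger gesture.
count = gesture_finger_count(2, "rear_upper")
```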

1. Mapping the touch point data from the local coordinates on the surface of the multi-touch trackpad to the PC screen coordinates

Touch points on the surface of a mouse trackpad can be mapped to personal computer screen coordinates using at least two mapping methods (or mapping modes). The first method uses the absolute position data of finger touch actions on a multi-touch panel to map absolute coordinates across the entire PC screen area; this is called the whole mapping method. The second method uses the absolute position data of finger touch actions on a multi-touch panel to map absolute coordinates to a small partial mapping area in the PC screen coordinates; this is called the partial mapping method.

Figure 7 illustrates the whole mapping method in detail. The abscissa 702 and the ordinate 704 form the local two-dimensional coordinates on the surface of the sensing board 106. Another set of abscissa and ordinate form the display screen coordinates on the surface of the personal computer screen 714. The absolute position data of the upper-left corner 710 of the sensing board 106 is mapped to the absolute position of the upper-left corner 712 in the display screen coordinates. Similarly, the absolute position data of the lower-left, lower-right, and upper-right corners of the sensing board 106 are mapped to the respective corners of the display screen.

A finger touch action 706 on the sensing board 106 provides raw data for the local X position and the local Y position. The touch data is mapped to the display screen point 708, i.e., to the corresponding X, Y position in the screen coordinates. In the whole mapping mode, if the other engineering capabilities and/or specifications of the touchpad are unchanged, the resolution of the touchpad data is proportional to the size of the touchpad: the larger the touchpad, the higher the resolution of the touch command input on the display screen.
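The whole mapping described above is a simple proportional scaling of the pad surface onto the full screen. A minimal sketch, assuming a 30 mm x 30 mm pad (the dimension mentioned earlier) and an invented function name:

```python
def whole_map(x, y, pad_w, pad_h, screen_w, screen_h):
    """Whole mapping: scale an absolute local touchpad position onto
    absolute coordinates spanning the entire display screen."""
    return (x / pad_w * screen_w, y / pad_h * screen_h)

# The pad's corners land on the screen's corners:
top_left = whole_map(0, 0, 30, 30, 1920, 1080)
bottom_right = whole_map(30, 30, 30, 30, 1920, 1080)
```

Because the scale factor is screen size over pad size, a larger pad yields finer on-screen resolution per unit of sensor resolution, matching the observation above.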

Figures 8A and 8B illustrate the second mapping method, in which the absolute coordinates on the multi-point sensing board 106 (touchpad) are mapped to a partial mapping area in the PC screen coordinates. As shown in the figures, the user can move the partial mapping area by dragging the mouse, and then use the sensing board 106 to generate multi-finger touch commands within the partial mapping area.

Figure 8A illustrates the partial mapping method in detail. In this embodiment, the center point defined by the local X and Y coordinates on the surface of the sensing board 106 is mapped to the center point of a predetermined area 800 in the display screen coordinates. As shown, the predetermined area 800 covers only a portion of the PC screen area. That is to say, the absolute position data of the upper-left corner of the touch panel is reflected at the absolute position of the upper-left corner of the mapping area 800. Similarly, the absolute positions of the lower-left, lower-right, and upper-right corners of the sensing board 106 are reflected at the respective corners of the mapping area 800.

As illustrated in FIG. 8B, the user can move the position of the mapping area 800 by dragging the mouse body 108. Accordingly, in some embodiments, the personal computer host interface program uses the mouse cursor data to determine the location to which the mapping area 800 should be moved. The user can therefore move the mouse cursor to the desired position on the personal computer display screen, and then issue multi-touch commands via the surface of the multi-point sensing board 106 on the mouse. In some embodiments, because the touch data from the touch panel provides higher-resolution input over a smaller mapping area, the partial mapping mode has the technical advantage of recognizing higher-accuracy touch data compared with the whole mapping mode.
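Partial mapping can be sketched as scaling the pad onto a small region whose center tracks the mouse cursor. The function and region size here are illustrative assumptions, not the patent's implementation:

```python
def partial_map(x, y, pad_w, pad_h, cursor_x, cursor_y, region_w, region_h):
    """Partial mapping: scale the pad surface onto a small screen region
    centered at the current mouse cursor position, so the pad's center
    maps to the cursor."""
    rx = cursor_x - region_w / 2 + x / pad_w * region_w
    ry = cursor_y - region_h / 2 + y / pad_h * region_h
    return (rx, ry)

# The pad center lands exactly on the cursor; pad corners land on the
# corners of the 200 x 200 region around it.
center = partial_map(15, 15, 30, 30, 960, 540, 200, 200)
```

Dragging the mouse changes `cursor_x`, `cursor_y`, which moves the whole region, as FIG. 8B describes.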

In some embodiments, even though its sensing area is much smaller than that of a conventional tablet, the multi-point sensing board 106 (touchpad) is defined by the firmware in the mouse as a USB human interface device (USB-HID) digitizer. The firmware provides an absolute local coordinate data set for each finger touch on the surface of the main touchpad, where the main touchpad is defined via the USB system.

2. Generating multi-finger gestures via a multi-touch mouse design with a multi-touch pad

FIG. 9A is a functional block diagram of the basic hardware components and firmware of a multi-touch and multi-directional mouse. In some embodiments, the firmware 900 can be disposed on a printed circuit board to define two independent USB devices to the system: logic device #1, a conventional USB two-dimensional mouse, and logic device #2, a USB-HID digitizer. Logic device #1 captures the mouse sensor data 904 and the mouse button and wheel data 906, and outputs conventional two-dimensional mouse data packets to a personal computer host via a USB connector. Logic device #2 retrieves data from the multi-touch sensor 902 and outputs the data packets as a USB-HID input device defined via the USB system.

Once the firmware on the mouse defines the multi-touch pad as a tablet, the touch signals are received by the kernel driver of a personal computer operating system such as the Windows 7 operating system and converted into touch messages such as WM_TOUCH in the Windows operating system. When the mouse moves, it outputs mouse input data to the firmware, software, or hardware, depending on the mouse input type, to define the change of the mouse coordinates according to the movement of the mouse; the partial mapping area moves along the direction of mouse movement in response to the mouse input data. FIG. 9A also illustrates a block diagram of the firmware functionality in the multi-touch mouse. The firmware intercepts, in real time, the data packets of the local absolute coordinates of the touch points on the multi-touch panel. These coordinates are then mapped to the PC screen coordinates by the firmware. The output data packets from the firmware contain the number of finger touches and the X, Y position data of each touch point in the PC screen coordinates.

3. Generating multi-finger gestures via a multi-touch mouse design with a main multi-touch panel and a digital switch (a switching-signal-based sensor) or a secondary touch panel (touch/no-touch detection only)

FIG. 9B is a functional block diagram of the basic hardware components and firmware of a multi-touch and multi-directional mouse that includes a main multi-touch sensor 902 and a secondary touch sensor 908 (for single-touch detection only, or multi-touch). In some embodiments, the firmware 900 defines two separate USB devices to the system, namely logic device #1 and logic device #2.

FIGS. 10A and 10B illustrate two-finger touch actions that are mapped via different touch data sets. In FIG. 10A, the user places two fingers 1000 on a main sensor 106, and the touch data is then displayed as two touch points 1002 and 1004 on the personal computer screen. In FIG. 10B, the touch data from a first finger 1010 on a main sensor is reflected as a first touch point 1004 on the personal computer screen 714, and the touch state data (touch or no touch) of a secondary sensor touched by the thumb 1008 is used to create a virtual touch point 1006 as a second touch point on the PC screen. The distance between the first touch point 1004 and the second touch point (i.e., the virtual touch point 1006) can be programmed in the firmware. The firmware calculates the absolute local coordinate data of the second touch point from the absolute local coordinate data of the first touch point on the main sensor plus a predetermined small X, Y offset. Once the virtual touch point 1006 is calculated, its coordinates and those of the other touch points are included in a data packet and transmitted to the personal computer host. The PC host uses these touch points to generate a touch command message that can be recognized by the computer application.

FIGS. 11A and 11B illustrate three-finger touch actions that are mapped via different touch data sets. In FIG. 11A, the user places three fingers 1100 on one main sensor 106, and the touch data generated by the three-finger touch actions is displayed as touch points 1102, 1104 and 1106 on the personal computer screen.

In FIG. 11B, the touch data from two fingers 1108 on one main sensor are reflected as two touch points 1114 and 1116 on the personal computer screen, and the touch state data (touch or no touch) of a secondary sensor 1110 touched by the user's thumb is used to create a virtual touch point 1112 to serve as the third touch point on the personal computer screen. The distance between the two actual touch points and the third touch point (i.e., the virtual touch point) can be programmed via the firmware. The firmware calculates the absolute local coordinate data of the third touch point from the absolute local coordinate data of the first and second touch points on the main sensing board plus a predetermined small X, Y offset.

FIGS. 12A and 12B illustrate a transformation gesture generated via a two-finger touch action. In FIG. 12A, the user drags two fingers 1200 along a horizontal or vertical direction on the main touch panel (a multi-touch-capable sensor) to generate a two-finger transformation gesture (two "actual touch" actions), as indicated by reference numerals 1202 and 1204. In FIG. 12B, the user drags one finger 1206 along a horizontal or vertical direction on the main touch panel (a multi-touch-capable sensor) while touching the secondary touch detector with the thumb 1208, to generate the two-finger transformation gesture represented by 1210 and 1212. It is worth noting that this two-finger transformation gesture is composed of an actual touch point from the main touchpad and a virtual second touch point (or virtual touch point) from the secondary touchpad. If the trajectory of the main touch point is horizontal or vertical at each time point, the firmware recognizes the gesture as a transformation gesture and includes the virtual touch point in the coordinates along with the main touch point, as shown in the figure. Using the correlation between these touch points (which the PC host recognizes at any time), the first touch point follows a horizontal or vertical trajectory and produces a transformation-gesture touch command message that can be recognized by the computer application.

Figures 13A and 13B illustrate a stretch/pinch gesture generated via a two-finger touch action. In FIG. 13A, the user expands or contracts two fingers 1300 on the main touchpad (a multi-touch-capable sensor) to generate a two-finger expansion/contraction gesture, as shown by reference numerals 1302 and 1304. In FIG. 13B, the user drags one finger 1308 along a diagonal direction on the main touch panel (a multi-touch-capable sensor) while touching the secondary touchpad with another finger or the thumb 1310, to produce the two-finger expansion/contraction gesture represented by 1304 and 1306. If the main finger trajectory is in a tilted/diagonal direction, the firmware calculates the coordinates by treating the virtual touch point (i.e., gesture 1306) as a stationary pivot point. In other words, the PC host generates a pinch or expand gesture touch command message.

FIGS. 14A and 14B illustrate a rotation gesture generated via a two-finger touch action. In FIG. 14A, the user drags two fingers 1400 on the main touchpad (a multi-touch-capable sensor) to generate a two-finger rotation gesture, as indicated by reference numerals 1402 and 1404. In FIG. 14B, the user drags one finger 1408 on the main touchpad (a multi-touch-capable sensor) to draw a circular trajectory while touching the secondary sensor with another finger or the thumb 1410, to produce the two-finger rotation gesture represented by 1404 and 1406. When the firmware recognizes the trajectory of the main finger as a circular trajectory, it uses the virtual touch point (i.e., gesture 1406) as a dynamic pivot point to calculate the coordinates. In other words, the personal computer host generates a rotation gesture touch command message.
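The three trajectory rules in FIGS. 12B, 13B and 14B (horizontal/vertical drag -> transformation, diagonal drag -> pinch/expand, circular path -> rotation) can be sketched as a simple classifier. This is an illustrative heuristic under assumed thresholds, not the firmware's actual recognition algorithm:

```python
import math

def classify_two_finger_gesture(trajectory):
    """Classify the main finger's trajectory as described in the text:
    a circular path yields a rotation gesture (virtual point as dynamic
    pivot), a horizontal/vertical drag yields a transformation gesture,
    and a diagonal drag yields a pinch/expand gesture (virtual point as
    stationary pivot). Thresholds are illustrative."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    # A roughly constant distance from the path centroid suggests a circle.
    cx = sum(p[0] for p in trajectory) / len(trajectory)
    cy = sum(p[1] for p in trajectory) / len(trajectory)
    radii = [math.hypot(px - cx, py - cy) for px, py in trajectory]
    if len(trajectory) > 3 and max(radii) - min(radii) < 0.1 * max(radii):
        return "rotation"
    if abs(dx) < 1e-6 or abs(dy) < 1e-6:
        return "transformation"
    return "pinch_or_expand"
```

A purely horizontal drag classifies as a transformation, a 45-degree drag as pinch/expand, and four points on a circle as rotation.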

4. Generating multi-finger gestures via a multi-touch mouse design with a single-touch sensor and a digital switch or a secondary touchpad (touch/no-touch detection only)

Multi-touch gestures can be created with a single touchpad and a digital switch (or touch detection sensor). In some embodiments, such as the one disclosed in FIG. 16, the multi-touch mouse is composed of two single-touch detection sensors. This embodiment can be defined as a multi-touch mouse implementing a reduced set of gesture functions.

Referring again to FIG. 9B, a functional block diagram of the basic hardware components and firmware of a multi-touch and multi-directional mouse including two single-touch detection sensors is illustrated. In some embodiments, the firmware 900 defines two separate USB devices to the system, namely logic device #1 and logic device #2. In this embodiment, the mouse is equipped with a touch panel 908 that can only detect a single touch and a digital switch that can only detect touch (ON) or release (OFF). The switch hardware can be a push in/out switch or a touch sensor capable of detecting a touch/no-touch state.

This embodiment can generate up to two-finger gesture commands. FIG. 12B illustrates a set of finger touch actions on the main sensor pad (single-touch position detection only) and the secondary sensor (touch/no-touch states only), whose data can be viewed on the PC screen. The secondary touch data (touch on/off state) is used as a virtual touch point, as described in section 3 above, "Generating multi-finger gestures via a multi-touch mouse design with a main multi-touch panel and a digital switch (a switching-signal-based sensor)."

The generation of the two-finger transformation gesture here is similar to the finger gesture generation example illustrated in FIG. 12B. The generation of the two-finger contraction/expansion gesture here is similar to the examples illustrated in FIGS. 13A and 13B. The generation of the two-finger rotation gesture here is similar to the example illustrated in FIG. 14B.

5. Device driver on the personal computer host

FIG. 15 is a functional block diagram of a multi-touch interface driver of a computer operating system in a personal computer host 1500, which manages the data packets of the multi-touch and multi-directional mouse 200 on behalf of applications and manages the generation of multi-directional control commands. A device driver module in the kernel-mode layer of the operating system captures the raw data of logic device #1 and logic device #2 defined by the mouse firmware. In some embodiments, an input device (such as a mouse) is connected to the computer via a USB connector; other connection types, such as wireless or Bluetooth, are used in other embodiments. To capture USB data packets, operating systems such as Windows provide a built-in kernel-mode driver 1502. The device driver module 1504 in the user-mode layer of the operating system captures the mouse's raw data packets and performs the following two operation steps: (1) compiling the data packets from the USB driver, and (2) generating multi-touch and multi-directional commands.

In the first step, the user-level device driver 1506 compiles the finger touch actions in software, that is, it determines, for each sensing board, the number and local locations of the finger touch points at each time point. In the second step, if the application is ready to receive multi-touch messages as part of its standard interactive input stream, the interface driver generates the multi-touch message associated with the computer operating system's message set, that is, it outputs a predefined multi-touch message (e.g., WM_TOUCH or WM_GESTURE in the Windows 7 operating system) according to the total number of finger touch actions and the fingertip trajectories.
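The driver's two steps can be sketched as follows. The message dictionaries only mirror the names of Windows messages (WM_TOUCH, WM_GESTURE) mentioned above; the dispatch logic and function names are illustrative assumptions, not the actual driver:

```python
# Hypothetical sketch of the user-level driver's two steps:
# step 1 compiles per-pad touch reports, step 2 emits an OS-style
# multi-touch message keyed by the total finger count.

def compile_touches(pad_reports):
    """Step 1: flatten the per-sensing-board reports into a total
    finger count plus the list of touch points."""
    points = [p for report in pad_reports for p in report]
    return len(points), points

def make_message(count, points, gesture=None):
    """Step 2: produce a predefined multi-touch message (modeled after
    WM_GESTURE for recognized gestures, WM_TOUCH otherwise)."""
    if gesture is not None:
        return {"msg": "WM_GESTURE", "gesture": gesture}
    return {"msg": "WM_TOUCH", "count": count, "points": points}

count, pts = compile_touches([[(1, 2), (3, 4)], [(5, 6)]])
msg = make_message(count, pts)
```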

If the application 1512 can receive multi-touch input but cannot recognize multi-touch messages as its standard interface input commands, the user-level device driver 1506 starts an auxiliary (supplemental) interface driver 1508. The auxiliary interface driver outputs to the application 1512 the corrected sensor data, converted into an application-specific input format that the application 1512 can recognize.

For example, if the application 1510 cannot receive multi-touch input and only recognizes the conventional standard input data defined in the operating system, such as mouse/keyboard input in the case of an older Windows operating system, the auxiliary interface driver 1508 converts the data packets of the multi-touch sensing board into a set of conventional standard input data and outputs the simulated traditional standard input messages to the application 1510, so that the application 1510 can execute its interactive instructions.
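The legacy fallback amounts to a gesture-to-conventional-input translation table. A minimal sketch; the specific mappings (e.g. pinch -> Ctrl + wheel) are assumptions for illustration, not the patent's actual conversion table:

```python
# Hypothetical sketch of the auxiliary interface driver's fallback:
# recognized gestures are translated into conventional mouse/keyboard
# events that a legacy application can consume.

def to_legacy_input(gesture):
    """Map a recognized gesture to a list of conventional input events
    (the mappings below are illustrative assumptions)."""
    table = {
        "pinch_in": [("key", "ctrl"), ("wheel", -1)],   # e.g. zoom out
        "pinch_out": [("key", "ctrl"), ("wheel", +1)],  # e.g. zoom in
        "two_finger_drag_up": [("wheel", +1)],          # e.g. scroll up
    }
    return table.get(gesture, [])
```

Unrecognized gestures produce no events, leaving the legacy application's normal input untouched.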

Although the present invention has been disclosed in the above embodiments, it is not intended to limit the present invention, and those skilled in the art can make some modifications and refinements without departing from the spirit and scope of the present invention. The scope of the invention is defined by the scope of the appended claims.

100, 200, 1600, 1700. . . mouse

102. . . Depression

104. . . Flat surface

106, 202, 502, 602, 902, 1110. . . Sensing board

108, 1602, 1702. . . Mouse body

204. . . side

502a. . . First half

502b. . . The second half

702. . . Abscissa

704. . . Ordinate

706, 1202, 1204. . . Touch action

710, 712. . . corner

714. . . Personal computer screen

800. . . Predetermined mapping area

900. . . firmware

904, 906. . . data

908, 1604, 1606, 1704, 1706. . . touchpad

1000, 1008, 1010, 1100, 1108, 1200, 1206, 1208, 1300, 1308, 1310, 1400, 1408, 1410. . . finger

1002, 1004, 1006, 1112, 1114, 1116. . . Touch point

1210, 1212, 1302, 1304, 1306, 1402, 1404, 1406. . . gesture

1608. . . Side extension

Location: X, Y

#1, #2. . . Logic device

1500. . . Multi-touch interface driver

1502. . . Core mode driver

1504. . . Device driver module

1506. . . User level interface driver

1508. . . Auxiliary interface driver

1510, 1512. . . application

Figure 1 depicts a perspective view of a mouse with a touchpad.

FIG. 2 is a perspective view of a multi-touch and multi-directional mouse with two touch panels.

FIG. 3 illustrates a top view of the mouse of FIG. 2, with the user's two fingers on the top multi-touch panel and the user's third finger on one of the side touchpads.

FIG. 4 is a side view of the mouse of FIG. 3.

FIG. 5 illustrates a touchpad on a mouse with two configured zones.

FIG. 6 illustrates a touchpad on a mouse with four configured zones.

FIG. 7 depicts a schematic of the whole mapping method.

FIG. 8A depicts a schematic of the partial mapping method.

FIG. 8B depicts a schematic of partial mapping in which the partial mapping area is being moved.

FIG. 9A depicts a schematic view of a mouse having a multi-point sensing board and its associated firmware.

FIG. 9B is a schematic diagram of a mouse having a multi-point sensing trackpad and a second trackpad, and its associated firmware.

FIG. 10A is a schematic diagram showing a two-finger touch action on a single touch panel on a personal computer screen coordinate.

FIG. 10B is a schematic diagram showing a two-finger touch action on two separate touch panels on a personal computer screen coordinate.

FIG. 11A is a schematic diagram showing a three-finger touch action on a single touch panel on a personal computer screen coordinate.

FIG. 11B is a schematic diagram showing a three-finger touch action on two separate touch panels on a personal computer screen coordinate.

FIG. 12A is a schematic diagram showing a two-finger touch gesture on a single touch panel on a personal computer screen coordinate.

FIG. 12B is a schematic diagram showing a two-finger touch gesture on two separate touch panels on a personal computer screen coordinate.

FIG. 13A is a schematic diagram showing another two-finger touch gesture on a single touch panel on a personal computer screen coordinate.

FIG. 13B is a schematic diagram showing another two-finger touch gesture on two separate touch panels on a personal computer screen coordinate.

FIG. 14A is a schematic diagram showing another two-finger touch gesture on a single touch panel on a personal computer screen coordinate.

FIG. 14B is a schematic diagram showing another two-finger touch gesture on two separate touch panels on a personal computer screen coordinate.

Figure 15 depicts a block diagram of a hardware and software component connected to a multi-touch mouse.

Figure 16 illustrates another embodiment of a computer mouse having two separate trackpads.

Figure 17 depicts yet another embodiment of a computer mouse having two separate trackpads.
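The gestures of Figures 12A through 14B (and claims 12 and 13) distinguish horizontal, vertical, and oblique trajectories of a touch point. The sketch below is an illustrative Python classifier, not part of the patent; the function name, the angle threshold, and the string labels are assumptions made for this example.

```python
import math

def classify_trajectory(points, angle_tol=20.0):
    """Classify a touch point's trajectory from its sampled positions.

    points: list of (x, y) samples over time. Returns "horizontal",
    "vertical", or "oblique" based on the overall movement angle;
    horizontal/vertical trajectories correspond to a change gesture,
    while an oblique trajectory corresponds to a contraction or
    unfolding gesture. The angle tolerance is an illustrative value.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    # Angle of the net displacement, folded into the range 0-90 degrees.
    angle = math.degrees(math.atan2(abs(y1 - y0), abs(x1 - x0)))
    if angle < angle_tol:
        return "horizontal"
    if angle > 90 - angle_tol:
        return "vertical"
    return "oblique"
```

Recognizing a circular trajectory (the rotating gesture of claim 14) would require examining intermediate samples rather than only the endpoints, and is omitted from this sketch.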

200 … mouse

102 … notch

104 … flat surface

106 … first touch panel

108 … mouse body

202 … second touch panel

204 … side
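The global and partial mapping methods of Figures 7 through 8B (and claims 4 and 5) map the touch panel's absolute coordinates onto a screen region that may be smaller than the screen and may be moved by mouse input. The following is a minimal Python sketch of that idea, not taken from the patent; the function names, the region tuple layout, and the clamping behavior are assumptions.

```python
def map_touch_to_screen(touch_x, touch_y, pad_w, pad_h, region):
    """Map an absolute touchpad coordinate into a rectangular screen region.

    region is (left, top, width, height) in screen pixels. When the
    region is smaller than the full screen, this corresponds to the
    partial mapping method; when it spans the screen, to global mapping.
    """
    left, top, width, height = region
    sx = left + (touch_x / pad_w) * width
    sy = top + (touch_y / pad_h) * height
    return sx, sy

def move_region(region, dx, dy, screen_w, screen_h):
    """Shift the mapped region by a mouse displacement, clamped to the screen."""
    left, top, width, height = region
    left = max(0, min(screen_w - width, left + dx))
    top = max(0, min(screen_h - height, top + dy))
    return (left, top, width, height)
```

For example, the center of a 100x100 touchpad lands at the center of the mapped region, and a mouse displacement that would push the region off-screen is clamped at the screen edge.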

Claims (20)

  1. A method for combining at least two touch signals in a computer system, comprising: receiving a first touch signal from a first touch panel, the first touch signal representing a first touch point at which a user touches the first touch panel with a first finger; receiving a second touch signal from a second touch panel, the second touch signal representing the user touching the second touch panel with a second finger while the first touch point on the first touch panel is touched; calculating, when the second touch panel is touched while the first touch point on the first touch panel is touched, an absolute coordinate of a second touch point on the first touch panel, the absolute coordinate of the second touch point being generated by offsetting the absolute coordinate of the first touch point; and generating, from the first touch point and the second touch point, a touch command message recognizable by a computer application, the touch command message defining the absolute coordinate of the first touch point and the absolute coordinate of the second touch point.
  2. The method of claim 1, wherein the absolute coordinate of the second touch point is generated by offsetting an absolute coordinate of the first touch point by a distance.
  3. The method of claim 1, further comprising mapping the first touch point and the second touch point on a computer screen.
  4. The method of claim 1, further comprising mapping the absolute coordinates of the first touch panel to a partial area of the computer screen coordinates, wherein the partial mapping area is smaller than the entire area of the computer screen.
  5. The method of claim 4, further comprising receiving mouse input data from a mouse, wherein the mouse input data defines a coordinate change of the mouse, and moving the partial mapping area on the computer screen coordinates according to the mouse input data.
  6. The method of claim 1, wherein the first touch panel is a multi-touch panel, the method further comprising receiving a third touch signal from the multi-touch panel, wherein the third touch signal defines an absolute coordinate of a third touch point on the first touch panel; and wherein the touch command message also defines the absolute coordinate of the third touch point.
  7. The method of claim 1, wherein the second touch panel comprises a plurality of setting blocks, and the second touch signal indicates that at least one of the setting blocks is touched.
  8. The method of claim 7, further comprising calculating a third touch point on the first touch panel upon receiving a second touch signal indicating that a first setting block is touched, the absolute coordinate of the third touch point being generated by offsetting the absolute coordinate of the first touch point.
  9. The method of claim 7, wherein at least one of the set blocks of the second touch panel represents a multi-finger touch action.
  10. The method of claim 9, wherein the multi-finger touch action represents a two-touch action, a three-touch action, or a four-touch action, and the method further comprises using the second touch signal Calculating a third touch point, a fourth touch point, a fifth touch point or a sixth touch point, each of the touch points having an absolute coordinate offset from the first touch point .
  11. The method of claim 1, further comprising identifying a direction of movement of the first touch point at each time point, thereby calculating a position and a direction of the first touch point at each time point.
  12. The method of claim 1, further comprising identifying a trajectory of the first touch point at each time point, and generating a change gesture touch command message upon recognizing that the first touch point has at least one of a horizontal trajectory and a vertical trajectory.
  13. The method of claim 1, further comprising identifying a trajectory of the first touch point at each time point, and generating at least one of a contraction gesture touch command message and an unfolding gesture touch command message upon recognizing that the first touch point has an oblique trajectory.
  14. The method of claim 1, further comprising identifying a trajectory of the first touch point at each time point, and generating a rotating gesture touch command message upon recognizing that the first touch point has a circular trajectory.
  15. The method of claim 1, further comprising providing the first touch panel and the second touch panel on a computer mouse.
  16. A computer mouse comprising: a mouse body; a first touch panel disposed on the mouse body, the first touch panel for generating a first touch signal, the first touch signal defining an absolute coordinate of a first touch point on the first touch panel; a second touch panel disposed on the mouse body, the second touch panel for generating a second touch signal, the second touch signal indicating whether the second touch panel is touched; and a logic circuit for performing the following operations: receiving the first touch signal from the first touch panel and the second touch signal from the second touch panel, the second touch signal representing that the second touch panel is touched while the first touch point on the first touch panel is touched; calculating, when the second touch panel is touched while the first touch point on the first touch panel is touched, a second touch point on the first touch panel, the second touch point being assigned an absolute coordinate, wherein the absolute coordinate of the second touch point is generated by offsetting the absolute coordinate of the first touch point; and generating a touch command message recognizable by a computer application, the touch command message defining the absolute coordinates of the first touch point and the second touch point.
  17. The computer mouse of claim 16, further comprising a notch on a top surface of the mouse body, wherein the first touch panel is disposed in the notch.
  18. The computer mouse of claim 16, wherein the second touch panel is disposed on one side of the mouse body.
  19. The computer mouse according to claim 16, wherein the first touch panel is a multi-touch touch panel, and the second touch panel is a single touch touch panel.
  20. The computer mouse of claim 16, wherein the second touch panel includes at least two setting blocks for indicating when each of the setting blocks is touched.
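The coordinate-offset scheme of claims 1, 2, and 16 — synthesizing a second absolute touch point from a first touch point when the second panel reports a touch — can be sketched as follows. This is an illustrative Python sketch, not part of the patent; the function name, the message dictionary format, and the 40-unit offset are assumptions made for this example.

```python
# Hypothetical fixed offset, in touchpad coordinate units.
FIXED_OFFSET = (40, 0)

def combine_touch_signals(first_point, second_pad_touched, offset=FIXED_OFFSET):
    """Combine two touch signals into one multi-touch command message.

    first_point: absolute (x, y) of the touch on the first (multi-touch)
    panel. second_pad_touched: boolean signal from the second panel.
    When the second panel is touched, a second touch point is generated
    by offsetting the first point's absolute coordinate, and both points
    are placed in a message a computer application could consume.
    """
    points = [first_point]
    if second_pad_touched:
        x, y = first_point
        dx, dy = offset
        points.append((x + dx, y + dy))
    return {"type": "touch", "points": points}
```

With this scheme a single-touch second panel is enough to produce two-point (e.g. pinch-style) input, since only the first panel needs to report absolute coordinates.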
TW100101410A 2009-11-09 2011-01-14 Method for combining at least two touch signals in a computer system TWI452494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/914,649 US8614664B2 (en) 2009-11-09 2010-10-28 Multi-touch multi-dimensional mouse

Publications (2)

Publication Number Publication Date
TW201218036A TW201218036A (en) 2012-05-01
TWI452494B true TWI452494B (en) 2014-09-11

Family

ID=46086184

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100101410A TWI452494B (en) 2009-11-09 2011-01-14 Method for combining at least two touch signals in a computer system

Country Status (2)

Country Link
CN (1) CN102467261A (en)
TW (1) TWI452494B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472931A (en) * 2012-06-08 2013-12-25 宏景科技股份有限公司 Method for operating simulation touch screen by mouse
TW201433938A (en) 2013-02-19 2014-09-01 Pixart Imaging Inc Virtual navigation apparatus, navigation method, and computer program product thereof
CN104007849B (en) * 2013-02-26 2017-09-22 原相科技股份有限公司 Virtual navigation device and its air navigation aid
US9727231B2 (en) 2014-11-19 2017-08-08 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
CN104656992A (en) * 2015-02-13 2015-05-27 业成光电(深圳)有限公司 Operation method of touch system
CN105320298A (en) * 2015-11-23 2016-02-10 攀枝花学院 Wireless handheld mouse

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050179650A1 (en) * 2004-02-13 2005-08-18 Ludwig Lester F. Extended parameter-set mouse-based user interface device offering offset, warping, and mixed-reference features
US20090184936A1 (en) * 2008-01-22 2009-07-23 Mathematical Inventing - Slicon Valley 3D touchpad
TWM377639U (en) * 2009-11-24 2010-04-01 Sunrex Technology Corp Mouse structure of multi-fingers touch on suface
TWM383156U (en) * 2010-02-09 2010-06-21 Sunrex Technology Corp Improved touch control mouse device

Also Published As

Publication number Publication date
TW201218036A (en) 2012-05-01
CN102467261A (en) 2012-05-23

Similar Documents

Publication Publication Date Title
JP5075867B2 (en) Touch event model
EP1774429B1 (en) Gestures for touch sensitive input devices
US8466934B2 (en) Touchscreen interface
US5748185A (en) Touchpad with scroll and pan regions
US7877707B2 (en) Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US9513798B2 (en) Indirect multi-touch interaction
JP3909230B2 (en) Coordinate input device
US9104308B2 (en) Multi-touch finger registration and its applications
CN103513927B (en) Selectively refuse the touch contact in the fringe region of touch-surface
US9720544B2 (en) Techniques for reducing jitter for taps
AU2008100547A4 (en) Speed/position mode translations
US8462134B2 (en) Multi-finger mouse emulation
JP5237848B2 (en) Gesture recognition method and touch system incorporating the same
JP2007334870A (en) Method and system for mapping position of direct input device
US20050052427A1 (en) Hand gesture interaction with touch surface
KR20100108116A (en) Apparatus and method for recognizing touch gesture
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
JP2006236339A (en) Method for operating graphical user interface and graphical user interface
WO2013018480A1 (en) User interface device comprising touch pad for shrinking and displaying source image within screen capable of touch input, input processing method and program
US20100148995A1 (en) Touch Sensitive Mechanical Keyboard
TWI479369B (en) Computer-storage media and method for virtual touchpad
US8432362B2 (en) Keyboards and methods thereof
JP2011503709A (en) Gesture detection for digitizer
KR101492678B1 (en) Multi-mode touchscreen user interface for a multi-state touchscreen device
US8686959B2 (en) Touch screen multi-control emulator

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees