CN102426491A - Multipoint touch realization method and system for touch screen - Google Patents
- Publication number
- CN102426491A CN102426491A CN2011102223087A CN201110222308A CN102426491A CN 102426491 A CN102426491 A CN 102426491A CN 2011102223087 A CN2011102223087 A CN 2011102223087A CN 201110222308 A CN201110222308 A CN 201110222308A CN 102426491 A CN102426491 A CN 102426491A
- Authority
- CN
- China
- Prior art keywords
- touch
- screen
- gesture
- data
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a multipoint touch realization method and system for a touch screen, belonging to the technical field of human-machine interaction. The multipoint touch realization method comprises the following steps: first, collecting raw touch data; second, recognizing a gesture according to the raw touch data; third, obtaining a keyboard and/or mouse command corresponding to the gesture; and finally, executing the keyboard and/or mouse command. The multipoint touch realization method and system disclosed by the invention are not limited by any operating system or platform, are applicable to any operating system or platform, and have strong generality.
Description
Technical field
The invention belongs to the technical field of human-computer interaction, and specifically relates to a multi-point touch implementation method and system for a touch screen.
Background technology
In recent years, computers have improved significantly in both memory capacity and processor speed, yet human-computer interaction technology has largely stayed where it was, with little real progress. With the launch of Apple's iPhone and Microsoft's Surface computer, however, this situation is changing markedly, and the most attractive feature of the iPhone and the Surface is precisely their distinctive multi-point touch technology.
The multiple-input model promoted by multi-point touch technology completely overturns the traditional single-point-and-click concept: with a pinch of two fingers, a user can efficiently zoom a picture or control the screen. Multi-point touch based on computer vision and pattern recognition may look like a simple extension of traditional single-point input to multi-point input, but it is in fact a revolution in input technology.
In October 2009, the release of the Windows 7 operating system, which supports multi-point touch technology, announced the arrival of the multi-touch era, and the spread of multi-point touch technology has become irresistible. With the continuous development of touch-screen technology, Windows 7 provides a software environment in which touch-screen hardware can be fully exploited, laying the foundation for application software that uses multi-point touch technology.
The multi-point touch gestures currently in common use mainly include Zoom, Rotate, Translate, and Scroll. Because the gesture concept has emerged only recently, earlier operating systems such as Windows XP and Linux do not support multi-point touch, and developing software that supports multi-point touch on them is relatively difficult; in general, a gesture SDK has to be provided by the touch-screen manufacturer.
Summary of the invention
In view of the defects in the prior art, the technical problem to be solved by the present invention is to provide a multi-point touch implementation method and system for a touch screen that is applicable to any operating system.
To solve the above technical problem, the present invention adopts the following technical solution:
A multi-point touch implementation method for a touch screen comprises the following steps:
collecting raw touch data;
performing gesture recognition according to the raw touch data;
obtaining a keyboard and/or mouse command corresponding to the gesture; and
executing the keyboard and/or mouse command.
In the above multi-point touch implementation method for a touch screen, the raw touch data comprises touch-point coordinate information and touch-point state information, and the touch-point state includes press and lift.
In the above multi-point touch implementation method for a touch screen, the correspondence between gestures and keyboard and/or mouse commands is stored in a data table, and the data table is stored in the control chip of the touch screen. After a gesture has been determined, the data table is queried to obtain the keyboard and/or mouse command corresponding to that gesture. The gestures include zoom, rotate, and translate.
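As an illustration only, such a gesture-to-command data table and its query could look like the sketch below; the gesture names, the command encoding, and the `lookup_command` helper are assumptions of this sketch, not definitions from the patent (the key combinations themselves come from the drawing-tool example later in the description).

```python
# Sketch of a gesture-to-command data table. In the patent the table lives in
# the touch screen's control chip; its concrete format is not specified.
GESTURE_COMMAND_TABLE = {
    "zoom_in":    {"type": "keyboard", "keys": ["Shift", "+"]},  # enlarge the object
    "zoom_out":   {"type": "keyboard", "keys": ["-"]},           # shrink the object
    "rotate_cw":  {"type": "keyboard", "keys": ["Ctrl", "K"]},   # rotate 90 degrees clockwise
    "rotate_ccw": {"type": "keyboard", "keys": ["Ctrl", "L"]},   # rotate 90 degrees counterclockwise
}

def lookup_command(gesture: str):
    """Query the data table; return the command descriptor, or None if unmapped."""
    return GESTURE_COMMAND_TABLE.get(gesture)
```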
A multi-point touch implementation system for a touch screen comprises a collection device for collecting raw touch data;
a recognition device for performing gesture recognition according to the raw touch data;
an obtaining device for obtaining the keyboard and/or mouse command corresponding to the gesture; and
an execution device for executing the keyboard and/or mouse command.
The above multi-point touch implementation system for a touch screen further comprises a storage device for storing the correspondence between gestures and keyboard and/or mouse commands in a data table. The obtaining device further comprises a query unit for querying the data table.
By means of keyboard and/or mouse commands, the method and system of the present invention implement multi-point touch functionality in a simple way on operating systems that do not support multi-point touch. They are not restricted by any operating system or platform, are applicable to any operating system or platform, and are highly versatile. Moreover, existing application software that does not support multi-point touch can use multi-point touch functionality directly, or with only minor modifications, which also saves the cost of developing dedicated application software.
Description of drawings
Fig. 1 is a structural block diagram of the multi-point touch implementation system for a touch screen in the embodiment;
Fig. 2 is a flowchart of the multi-point touch implementation method for a touch screen in the embodiment;
Fig. 3 is an example of the data table in the embodiment;
Fig. 4 is a functional schematic of some key combinations of the drawing tool in the Windows operating system in the embodiment.
Embodiment
The present invention is described below with reference to an embodiment and the accompanying drawings.
As shown in Fig. 1, the multi-point touch implementation system for a touch screen in this embodiment comprises a collection device 11, a recognition device 12, an obtaining device 13, an execution device 14, and a storage device 15.
The collection device 11 collects raw touch data. The recognition device 12 performs gesture recognition according to the raw touch data. The storage device 15 stores the correspondence between gestures and keyboard and/or mouse commands in a data table. The obtaining device 13 obtains the keyboard and/or mouse command corresponding to the gesture, and its query unit queries the data table. The execution device 14 executes the keyboard and/or mouse command.
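A minimal sketch of how these devices could be wired together in software is given below; the class and method names are illustrative assumptions and do not correspond to modules actually named in the patent.

```python
class MultiTouchPipeline:
    """Collect raw touch data, recognize the gesture, look up the mapped
    keyboard/mouse command in the data table, and execute it - mirroring
    devices 11-15 of the embodiment."""

    def __init__(self, collector, recognizer, command_table, executor):
        self.collector = collector          # collection device 11
        self.recognizer = recognizer        # recognition device 12
        self.command_table = command_table  # data table held by storage device 15
        self.executor = executor            # execution device 14

    def step(self):
        touches = self.collector.read()               # raw touch data
        gesture = self.recognizer.recognize(touches)  # e.g. "zoom_in"
        if gesture is None:
            return
        # Obtaining device 13: its query unit looks up the data table.
        command = self.command_table.get(gesture)
        if command is not None:
            self.executor.execute(command)            # keyboard and/or mouse command
```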
As shown in Fig. 2, the method for implementing multi-point touch with the system of Fig. 1 comprises the following steps:
(1) The collection device 11 collects raw touch data (201).
The raw touch data mainly comprises the touch-point coordinates and the touch-point state information (such as press or lift). The method of acquiring raw touch data is prior art and is described only briefly here. For example, in a touch screen that uses image sensors for touch positioning, the image sensors continuously photograph the touch screen frame by frame at a fixed time interval while a touch object is being recognized. When a touch object such as a finger touches the display screen, its contact area with the display screen is captured in a frame of the image. The image signal is transmitted to the computer system for image analysis, and the raw touch data is obtained.
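For concreteness, the raw touch data could be represented as one list of touch points per captured frame, each point carrying its coordinates and a press/lift state; the field names below are assumptions of this sketch.

```python
from dataclasses import dataclass
from enum import Enum

class TouchState(Enum):
    PRESS = "press"  # finger in contact with the screen
    LIFT = "lift"    # finger lifted from the screen

@dataclass
class TouchPoint:
    touch_id: int      # distinguishes simultaneous fingers
    x: float           # touch-point coordinates on the screen
    y: float
    state: TouchState  # press or lift, as carried in the raw touch data

# One frame of raw touch data: all touch points captured in a single image frame.
Frame = list[TouchPoint]
```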
(2) The recognition device 12 performs gesture recognition according to the raw touch data (202).
The gestures include a zoom gesture, a rotate gesture, a translate gesture, and so on. Gesture recognition methods are prior art and are not described further here.
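As one possible, purely illustrative recognizer: the change in distance between two pressed touch points across consecutive frames can indicate a zoom gesture, and the change in the angle of the line joining them can indicate a rotate gesture. The sketch below assumes the `TouchPoint`/`Frame` structures above and arbitrary thresholds; it is not a recognition method prescribed by the patent, which leaves recognition to the prior art.

```python
import math

def two_finger_gesture(prev: Frame, curr: Frame, dist_threshold: float = 10.0):
    """Return 'zoom_in', 'zoom_out', 'rotate_cw', 'rotate_ccw', or None."""
    prev_pts = sorted((p for p in prev if p.state is TouchState.PRESS),
                      key=lambda p: p.touch_id)
    curr_pts = sorted((p for p in curr if p.state is TouchState.PRESS),
                      key=lambda p: p.touch_id)
    if len(prev_pts) != 2 or len(curr_pts) != 2:
        return None  # this sketch only handles two-finger gestures
    if [p.touch_id for p in prev_pts] != [p.touch_id for p in curr_pts]:
        return None  # the same two fingers must be present in both frames

    def dist(a, b):
        return math.hypot(a.x - b.x, a.y - b.y)

    def angle(a, b):
        return math.atan2(a.y - b.y, a.x - b.x)

    d_dist = dist(*curr_pts) - dist(*prev_pts)
    d_angle = angle(*curr_pts) - angle(*prev_pts)

    if abs(d_dist) > dist_threshold:
        return "zoom_in" if d_dist > 0 else "zoom_out"
    if abs(d_angle) > math.radians(15):
        # The sign convention depends on the screen coordinate system
        # (y typically grows downward on displays).
        return "rotate_cw" if d_angle > 0 else "rotate_ccw"
    return None
```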
(3) The obtaining device 13 obtains the keyboard and/or mouse command corresponding to the gesture (203).
After the recognition device 12 has identified the gesture of the touch operation performed on the touch screen, the query unit queries the data table, and the obtaining device 13 obtains the keyboard and/or mouse command corresponding to that gesture.
(4) The execution device 14 executes the keyboard and/or mouse command (204).
The above embodiment is illustrated below using the drawing tool in the Windows operating system as an example. In the drawing tool, the functions of some key combinations are shown in Fig. 4.
First, the collection device 11 collects the raw touch data of the touch operation performed by the user on the touch screen, and the recognition device 12 then identifies the gesture of this touch operation from the raw touch data. If it is a zoom-in gesture, the obtaining device 13 queries the data table and obtains the keyboard combination "Shift + '+'" corresponding to the zoom-in operation, and the execution device 14 executes this command to enlarge the operation object. If it is a zoom-out gesture, the obtaining device 13 queries the data table and obtains the keyboard command "-" corresponding to the zoom-out operation, and the execution device 14 executes this command to shrink the operation object. If it is a clockwise-rotation gesture, the obtaining device 13 queries the data table and obtains the keyboard combination "Ctrl+K" corresponding to the clockwise-rotation operation, and the execution device 14 executes this command to rotate the operation object clockwise by 90 degrees. If it is a counterclockwise-rotation gesture, the obtaining device 13 queries the data table and obtains the keyboard combination "Ctrl+L" corresponding to the counterclockwise-rotation operation, and the execution device 14 executes this command to rotate the operation object counterclockwise by 90 degrees.
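A sketch of the execution step for this drawing-tool example is shown below, reusing the data table sketched earlier. The `inject_key_combo` function is a placeholder to be bound to whatever synthetic-input facility the target operating system offers (for instance the Windows SendInput API or XTest on X11); the patent itself does not define such a function.

```python
def inject_key_combo(keys):
    """Placeholder: forward the key combination to the OS input facility.
    A real implementation would wrap a platform API such as SendInput."""
    raise NotImplementedError("bind to the platform's synthetic-input API")

def execute_command(command):
    """Execution device: carry out a keyboard command taken from the data table."""
    if command["type"] == "keyboard":
        inject_key_combo(command["keys"])  # e.g. ["Ctrl", "K"] for a clockwise rotation

# Example flow once a clockwise-rotation gesture has been recognized:
command = lookup_command("rotate_cw")      # -> {"type": "keyboard", "keys": ["Ctrl", "K"]}
if command is not None:
    execute_command(command)               # the drawing tool rotates the object by 90 degrees
```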
As another example, in Word, "Ctrl + mouse wheel up" zooms in on the Word document page, and "Ctrl + mouse wheel down" zooms out. When the recognition device 12 identifies that the user's touch gesture on the touch screen is a zoom-in gesture, the obtaining device 13 queries the data table and obtains the combined keyboard and mouse command "Ctrl + mouse wheel up" corresponding to the zoom-in operation; the execution device 14 executes this command, and the Word document page is enlarged. When the recognition device 12 identifies that the user's touch gesture is a zoom-out gesture, the obtaining device 13 queries the data table and obtains the combined keyboard and mouse command "Ctrl + mouse wheel down" corresponding to the zoom-out operation; the execution device 14 executes this command, and the Word document page is shrunk.
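For this Word example the data table entry has to describe a combined keyboard-and-mouse command rather than a pure key combination. One possible representation is sketched below; the separate per-application table, the record format, and the `inject_wheel_scroll` placeholder are assumptions of the sketch (the patent does not specify how commands are encoded or whether the table differs per application).

```python
# When the target application is Word, the same zoom gestures map to combined
# keyboard + mouse commands instead of pure key combinations.
WORD_GESTURE_COMMANDS = {
    "zoom_in":  {"type": "key_and_wheel", "keys": ["Ctrl"], "wheel": +1},  # Ctrl + wheel up
    "zoom_out": {"type": "key_and_wheel", "keys": ["Ctrl"], "wheel": -1},  # Ctrl + wheel down
}

def inject_wheel_scroll(steps):
    """Placeholder: send a mouse-wheel scroll through the OS input facility."""
    raise NotImplementedError("bind to the platform's synthetic-input API")

def execute_key_and_wheel(command):
    """Sketch only: a real implementation must hold the modifier key down
    while the wheel scroll is injected, then release it."""
    inject_key_combo(command["keys"])
    inject_wheel_scroll(command["wheel"])
```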
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If such changes and modifications fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to include them as well.
Claims (9)
1. A multi-point touch implementation method for a touch screen, characterized by comprising the following steps:
collecting raw touch data;
performing gesture recognition according to the raw touch data;
obtaining a keyboard and/or mouse command corresponding to the gesture;
executing the keyboard and/or mouse command.
2. The multi-point touch implementation method for a touch screen as claimed in claim 1, characterized in that the raw touch data comprises touch-point coordinate information and touch-point state information, and the touch-point state includes press and lift.
3. The multi-point touch implementation method for a touch screen as claimed in claim 1, characterized in that the correspondence between gestures and keyboard and/or mouse commands is stored in a data table.
4. The multi-point touch implementation method for a touch screen as claimed in claim 3, characterized in that the data table is stored in the control chip of the touch screen.
5. The multi-point touch implementation method for a touch screen as claimed in claim 4, characterized in that, after a gesture has been determined, the method queries the data table to obtain the keyboard and/or mouse command corresponding to the gesture.
6. The multi-point touch implementation method for a touch screen as claimed in any one of claims 1 to 5, characterized in that the gestures include zoom, rotate, and translate.
7. A multi-point touch implementation system for a touch screen, characterized by comprising a collection device (11) for collecting raw touch data;
a recognition device (12) for performing gesture recognition according to the raw touch data;
an obtaining device (13) for obtaining the keyboard and/or mouse command corresponding to the gesture; and
an execution device (14) for executing the keyboard and/or mouse command.
8. The multi-point touch implementation system for a touch screen as claimed in claim 7, characterized in that the system further comprises a storage device (15) for storing the correspondence between gestures and keyboard and/or mouse commands in a data table.
9. The multi-point touch implementation system for a touch screen as claimed in claim 8, characterized in that the obtaining device (13) further comprises a query unit for querying the data table.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011102223087A CN102426491A (en) | 2011-05-12 | 2011-08-04 | Multipoint touch realization method and system for touch screen |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110122432 | 2011-05-12 | ||
CN201110122432.6 | 2011-05-12 | ||
CN2011102223087A CN102426491A (en) | 2011-05-12 | 2011-08-04 | Multipoint touch realization method and system for touch screen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102426491A true CN102426491A (en) | 2012-04-25 |
Family
ID=45960485
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011102223087A Pending CN102426491A (en) | 2011-05-12 | 2011-08-04 | Multipoint touch realization method and system for touch screen |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102426491A (en) |
- 2011-08-04: Application CN2011102223087A filed in CN, published as CN102426491A (status: Pending)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101517519A (en) * | 2006-09-29 | 2009-08-26 | Lg电子株式会社 | Method of generating key code in coordinate recognition device and apparatus using the same |
CN101498973A (en) * | 2008-01-30 | 2009-08-05 | 义隆电子股份有限公司 | Touch control interpretation structure and method for executing touch control application program by multi-finger gesture |
CN101667081A (en) * | 2008-09-03 | 2010-03-10 | 安恭爀 | User interface method |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102929389A (en) * | 2012-09-27 | 2013-02-13 | 北京像素软件科技股份有限公司 | Device and method for converting user input information |
CN102929389B (en) * | 2012-09-27 | 2016-03-09 | 北京像素软件科技股份有限公司 | A kind of conversion equipment of user's input information and method |
CN104182163A (en) * | 2013-05-27 | 2014-12-03 | 华为技术有限公司 | Method and device for displaying virtual keyboard |
CN104182163B (en) * | 2013-05-27 | 2018-07-13 | 华为技术有限公司 | A kind of method and device of display dummy keyboard |
CN104850337A (en) * | 2014-02-17 | 2015-08-19 | 苏州浩创信息科技有限公司 | Touch control and gesture recognition device for data information collecting terminal |
CN103809912A (en) * | 2014-03-03 | 2014-05-21 | 欧浦登(福建)光学有限公司 | Tablet personal computer based on multi-touch screen |
CN108475126A (en) * | 2017-05-27 | 2018-08-31 | 深圳市柔宇科技有限公司 | The processing method and touch keyboard of touch operation |
WO2018218392A1 (en) * | 2017-05-27 | 2018-12-06 | 深圳市柔宇科技有限公司 | Touch operation processing method and touch keyboard |
CN109976553A (en) * | 2019-04-26 | 2019-07-05 | 广州视源电子科技股份有限公司 | Operation processing method, device, equipment and medium based on keyboard |
CN111104041A (en) * | 2019-12-24 | 2020-05-05 | 北京东土科技股份有限公司 | Gesture operation recognition method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102426491A (en) | Multipoint touch realization method and system for touch screen | |
US8446389B2 (en) | Techniques for creating a virtual touchscreen | |
US8432301B2 (en) | Gesture-enabled keyboard and associated apparatus and computer-readable storage medium | |
TWI505155B (en) | Touch-control method for capactive and electromagnetic dual-mode touch screen and handheld electronic device | |
CN102197359A (en) | Multi-touch manipulation of application objects | |
CN102253709A (en) | Method and device for determining gestures | |
US20130106707A1 (en) | Method and device for gesture determination | |
US10282087B2 (en) | Multi-touch based drawing input method and apparatus | |
US9778780B2 (en) | Method for providing user interface using multi-point touch and apparatus for same | |
JP2015090670A (en) | Electronic apparatus, method, and program | |
US20140189609A1 (en) | Method for controlling two or three dimensional figure based on touch and apparatus thereof | |
CN102752435A (en) | Mobile phone with cursor and cursor control method thereof | |
JP5634617B1 (en) | Electronic device and processing method | |
US20150355769A1 (en) | Method for providing user interface using one-point touch and apparatus for same | |
CN104407793A (en) | Method and equipment for processing touch signal | |
WO2012059595A1 (en) | Touch detection | |
CN101887332B (en) | Positioning method and positioning device for touch panel | |
CN102662592A (en) | Data output method and data output device | |
CN103616973A (en) | Operation method of touch screen and touch screen device | |
US20130241844A1 (en) | Method of Touch Command Integration and Touch System Using the Same | |
CN103365456A (en) | Gesture operation method | |
TWI478017B (en) | Touch panel device and method for touching the same | |
CN104679312A (en) | Electronic device as well as touch system and touch method of electronic device | |
CN102650926B (en) | Electronic device with touch type screen and display control method of touch type screen | |
CN115867883A (en) | Method and apparatus for receiving user input |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C02 | Deemed withdrawal of patent application after publication (patent law 2001) | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20120425 |