CN103164064A - System and method for carrying out corresponding control when a target gesture is input at any position


Info

Publication number
CN103164064A
CN103164064A (application CN201110421679 / CN2011104216798A); granted as CN103164064B
Authority
CN
China
Prior art keywords
target
gesture
instruction set
track
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011104216798A
Other languages
Chinese (zh)
Other versions
CN103164064B (en)
Inventor
邱全成
刘晓安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rydberg Information Technology (Zhejiang) Co., Ltd.
Original Assignee
Shun Shun Yuan (Shanghai) Technology Co Ltd
Inventec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shun Shun Yuan (Shanghai) Technology Co Ltd and Inventec Corp
Priority to CN201110421679.8A
Publication of CN103164064A
Application granted
Publication of CN103164064B
Legal status: Active (current)
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a system and a method for carrying out corresponding control when a target gesture is input at any position. A full-screen transparent input layer is provided, the gesture track of a target gesture input at any position on the transparent input layer is captured, and the corresponding target instruction set is determined from the captured gesture track, so that customized gestures can be input at any position of a touch-controlled electronic device to trigger the corresponding function.

Description

System and method for inputting a target gesture at an arbitrary position to carry out corresponding control
Technical field
The present invention relates to a system and method for carrying out corresponding control according to a target gesture, and in particular to a system and method for inputting a target gesture at an arbitrary position to carry out corresponding control.
Background technology
Traditionally, the interface between people and computers or machines has consisted mainly of the keyboard and the mouse. With the progress of technology, the ways of controlling electronic devices have become increasingly varied, producing friendlier man-machine interfaces. To let users operate electronic devices more intuitively, operation through a touch screen has become more and more common. In recent years, driven by the flourishing high-tech industry and the rise of information and consumer products, users have become increasingly eager for easy-to-use interfaces, making the touch interface an established trend in product design.
Many technology companies, for example, have released electronic devices controlled through a touch screen. However, in current touch-controlled electronic devices, if a gesture is to be used to operate the device, the user can only input the gesture at a fixed position and cannot input it at a position of his or her own choosing. In other words, the user cannot change the input position used to operate the touch-controlled electronic device. Besides reducing the user's freedom of operation, this means that if the gesture input position overlaps the content being viewed, inputting the gesture easily interferes with browsing and causes inconvenience to the user.
In summary, the prior art has long suffered from the problem that the gesture input position of a touch-controlled electronic device is restricted and cannot be changed, so improved technical means are needed to solve this problem.
Summary of the invention
In view of the prior-art problem that the gesture input position of a touch-controlled electronic device cannot be changed, the present invention discloses a system and a method for inputting a target gesture at an arbitrary position to carry out corresponding control, wherein:
The system disclosed in the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control comprises at least: an input layer providing module, which provides a full-screen transparent input layer; a track capturing module, which captures the gesture track of a target gesture input at an arbitrary position on the transparent input layer; an instruction set setting module, which sets the corresponding target instruction set according to the gesture track, wherein the target instruction set corresponds to an objective function and comprises one or more target instructions; an instruction set judging module, which determines the corresponding target instruction set according to the gesture track; and an instruction set execution module, which executes the target instruction set so as to complete the objective function.
The method disclosed in the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control comprises at least the following steps: providing for setting a target instruction set corresponding to a target gesture, the target instruction set corresponding to an objective function and comprising one or more target instructions; providing a full-screen transparent input layer; capturing the gesture track of the target gesture input at an arbitrary position on the transparent input layer; determining the corresponding target instruction set according to the gesture track; and executing the target instruction set to complete the objective function.
The system and method disclosed in the present invention differ from the prior art in that the present invention provides a full-screen transparent input layer, captures the gesture track of a target gesture input at an arbitrary position on the transparent input layer, and determines the corresponding target instruction set according to the captured gesture track, thereby solving the problem of the prior art and achieving the technical effect of customized gestures.
Description of drawings
Fig. 1 is the architecture diagram of the system of the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control.
Fig. 2a is the flow chart of the method of the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control.
Fig. 2b is a detailed flow chart of the method of the present invention for setting a target gesture and its corresponding target instruction set.
Fig. 2c is a detailed flow chart of the method of the present invention for determining the target instruction set corresponding to a target gesture.
Fig. 3 is a schematic diagram of the setting interface described in the embodiment of the present invention.
[Description of main component symbols]
100 System
110 Input layer providing module
120 Track capturing module
130 Instruction set setting module
150 Instruction set judging module
170 Instruction set execution module
190 Input mode judging module
300 Setting interface
310 Target track display area
320 Predetermined instruction selection field
330 Custom instruction input field
341 Set button
342 Cancel button
350 Set-instruction field
420 Touch screen
460 Storage medium
Step 210: provide for setting the target instruction set corresponding to the target gesture; the target instruction set corresponds to an objective function and comprises one or more target instructions
Step 211: provide a full-screen transparent input layer
Step 213: capture the gesture track of the target gesture input on the transparent input layer
Step 215: determine the setting feature of the gesture track
Step 219: store the setting feature as the index feature corresponding to the target instruction set
Step 230: judge whether the input mode has been entered
Step 250: provide a full-screen transparent input layer
Step 260: capture the gesture track of the target gesture input at an arbitrary position on the transparent input layer
Step 270: determine the corresponding target instruction set according to the gesture track
Step 271: derive the track feature from the gesture track
Step 273: compare the track feature with the index features corresponding to the target instruction sets
Step 277: when the track feature matches an index feature, judge that the target instruction set corresponds to the gesture track
Step 290: execute the target instruction set to complete the objective function
Embodiment
The features and embodiments of the present invention are described in detail below with reference to the drawings, in sufficient detail that anyone skilled in the relevant art can readily understand the technical means applied to solve the technical problem of the present invention, implement them accordingly, and thereby realize the effects achievable by the present invention.
The present invention allows the user to input a self-defined target gesture and to define the target instruction set corresponding to that target gesture; when the user later inputs the target gesture, the corresponding target instruction set is executed so as to complete the objective function.
The target instruction set described in the present invention comprises one or more target instructions. Each target instruction, after being executed, produces a corresponding result; for example, a target instruction may delete a document, open specific software, input a specific character, or execute a specific function of a piece of software, and the present invention is not limited to these. A target instruction set, after being executed, completes the corresponding objective function.
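As an illustration only (not part of the patent text), the relationship between a target instruction set, its target instructions and its objective function could be modeled as in the following minimal Python sketch; the class and function names are assumptions introduced for the example:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TargetInstructionSet:
    """One objective function is completed by executing every target instruction in order."""
    name: str                                    # e.g. "add to favorites"
    instructions: List[Callable[[], None]] = field(default_factory=list)

    def execute(self) -> None:
        for instruction in self.instructions:
            instruction()

# Example from the embodiment: the left-arrow gesture maps to "back twice, then reload".
left_arrow_set = TargetInstructionSet(
    name="back twice and reload",
    instructions=[lambda: print("return to previous page"),
                  lambda: print("return to previous page"),
                  lambda: print("reload the page")],
)
left_arrow_set.execute()
```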
The objective function of the present invention may be a predetermined function provided by the job platform on which the present invention runs, such as storing a document after editing it in a piece of software and then closing that software or closing all software, or even inputting the corresponding characters; the present invention is not limited to these. The objective function may also be a function defined by the user, such as storing a document after editing it in one piece of software and opening or reloading it with another piece of software, or copying a particular document into several directories; the present invention is likewise not limited to these.
The operation of the present invention is first explained with reference to Fig. 1, the architecture diagram of the system of the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control. As shown in Fig. 1, the system 100 of the present invention contains an input layer providing module 110, a track capturing module 120, an instruction set setting module 130, an instruction set judging module 150 and an instruction set execution module 170.
The input layer providing module 110 is responsible for providing a full-screen transparent input layer on the touch screen 420 outside the system 100. The transparent input layer provided by the input layer providing module 110 is invisible to the user and allows the user to input the target gesture at an arbitrary position on the touch screen 420.
The target gesture input on the touch screen 420 is decided arbitrarily by the user; for example, the user may draw a line, a triangle, or an irregular line or shape on the touch screen 420, and any of these drawn lines or figures may serve as the target gesture described in the present invention.
The track capturing module 120 is responsible for capturing the gesture track of a target gesture after it has been input on the transparent input layer, for example a horizontal line, a triangle, a circle, a star, or an irregular shape.
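A minimal sketch of the idea of an invisible full-screen layer that still receives input, using tkinter as a stand-in for the touch screen 420 (an assumption; the patent does not name a toolkit). Mouse dragging stands in for touch input, and the near-zero alpha trick depends on the window manager supporting per-window transparency:

```python
import tkinter as tk

track = []                                   # captured gesture track: a list of (x, y) points

root = tk.Tk()
root.attributes("-fullscreen", True)         # cover the whole screen, like the input layer
root.attributes("-alpha", 0.01)              # almost invisible, but still receives events
root.bind("<ButtonPress-1>", lambda e: track.append((e.x, e.y)))
root.bind("<B1-Motion>", lambda e: track.append((e.x, e.y)))
root.bind("<ButtonRelease-1>", lambda e: root.destroy())   # one stroke, then tear the layer down
root.mainloop()

print(track)                                 # the raw gesture track handed on for feature extraction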
After capturing the gesture track of the target gesture, the track capturing module 120 may determine the features of the captured gesture track, such as the direction, turning points, turning angles and length of the gesture track, but the present invention is not limited to these.
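The following sketch, given only as an illustration under the assumption that a track is a list of (x, y) points, derives the example features named above (direction, turning points, turning angles, length); the 45-degree turning threshold is an arbitrary choice, not a value from the patent:

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def track_features(points: List[Point], angle_threshold: float = 45.0) -> Dict:
    """Derive segment directions, turning points, turning angles and total length from a track."""
    directions: List[float] = []              # heading of each segment, in degrees
    turns: List[Tuple[Point, float]] = []
    length = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        length += math.hypot(dx, dy)
        directions.append(math.degrees(math.atan2(dy, dx)))
    for i in range(1, len(directions)):
        turn = abs(directions[i] - directions[i - 1]) % 360
        turn = min(turn, 360 - turn)
        if turn > angle_threshold:             # a sharp change of heading
            turns.append((points[i], turn))    # (turning point, turning angle)
    return {"directions": directions, "turning_points": turns, "length": length}
```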
It is worth mentioning that if the target gesture is input in the setting interface for setting a target gesture and its corresponding target instruction set, the features determined by the track capturing module 120 are called "setting features" in the present invention; if the target gesture is not input in that setting interface, for example in a general input interface, the features determined by the track capturing module 120 are called "track features" in the present invention.
The instruction set setting module 130 is responsible for allowing the user to input target instructions and/or select target instructions corresponding to the target gesture; the target instructions that are input and selected together form the target instruction set corresponding to the target gesture.
The instruction set setting module 130 is also responsible for storing, in the storage medium 460, the setting feature determined from the gesture track captured by the track capturing module 120 as the index of the corresponding target instruction set, so that the setting feature determined by the track capturing module 120 becomes the index feature of the corresponding target instruction set, thereby completing the association between the gesture track captured by the track capturing module 120 and the corresponding target instruction set.
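A minimal sketch of such persistence, assuming a JSON file stands in for storage medium 460 (file name and layout are assumptions; the patent only states that the index feature and target instruction set are stored):

```python
import json
from pathlib import Path

GESTURE_DB = Path("gesture_index.json")   # hypothetical file standing in for storage medium 460

def store_index_feature(name, index_feature, instructions):
    """Save the setting feature of a newly set gesture as the index feature of its
    target instruction set, alongside the instructions themselves."""
    table = json.loads(GESTURE_DB.read_text()) if GESTURE_DB.exists() else {}
    table[name] = {"index_feature": index_feature, "instructions": instructions}
    GESTURE_DB.write_text(json.dumps(table, ensure_ascii=False, indent=2))

# Example: the five-pointed-star gesture mapped to the "add to favorites" instruction.
store_index_feature(
    "five-pointed star",
    {"directions": ["lower-left", "upper-right", "left", "lower-right", "upper-left"]},
    ["add to favorites"],
)
```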
The instruction set judging module 150 is responsible for determining the corresponding target instruction set according to the gesture track captured by the track capturing module 120. In general, the instruction set judging module 150 compares the track feature determined from the gesture track captured by the track capturing module 120 with the index features corresponding to the target instruction sets, for example by comparing the positional relationships among the turning points, break points and crossing points of the track, or the positional relationships among the coordinate points the track passes through and the order in which it passes through them. When the instruction set judging module 150 finds that the track feature matches an index feature, it judges that the target instruction set corresponding to that index feature corresponds to the gesture track captured by the track capturing module 120.
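As one possible comparison rule only (the patent lists several alternatives and fixes none), the sketch below matches the coarse direction sequence of the captured track against stored index sequences; the function names are assumptions:

```python
from typing import Dict, List, Optional

def quantize(angle_deg: float) -> int:
    """Map a heading in degrees to one of eight coarse directions (45-degree bins)."""
    return int(((angle_deg % 360) + 22.5) // 45) % 8

def matches(track_directions: List[float], index_directions: List[float]) -> bool:
    # One possible rule: the coarse-direction sequences must be identical.
    return [quantize(a) for a in track_directions] == [quantize(a) for a in index_directions]

def judge_instruction_set(track_directions: List[float],
                          index_table: Dict[str, List[float]]) -> Optional[str]:
    """Return the name of the first target instruction set whose index feature
    matches the captured track feature, or None when nothing matches."""
    for name, index_directions in index_table.items():
        if matches(track_directions, index_directions):
            return name
    return None
```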
The instruction set execution module 170 is responsible for executing the target instruction set determined by the instruction set judging module 150, so as to complete the objective function corresponding to the determined target instruction set.
In general, the instruction set execution module 170 executes the target instruction set directly so as to complete the objective function. However, the present invention is not limited to this; for example, the instruction set execution module 170 may also call the application programming interface (API) of software outside the system 100, so that the target instruction set is executed through the software's API. The software providing the API called by the instruction set execution module 170 is installed on the job platform on which the system 100 runs.
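Reusing the TargetInstructionSet sketch above, the two execution paths could be expressed as follows; "external_api" is a hypothetical callable standing in for an external application's programming interface, which the patent does not name:

```python
def execute_instruction_set(instruction_set, external_api=None):
    """Execute every target instruction in order, either directly or by delegating
    each one to an external application's programming interface."""
    for instruction in instruction_set.instructions:
        if external_api is not None:
            external_api(instruction)   # let the external software carry it out
        else:
            instruction()               # direct execution on the job platform
```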
In addition, the system 100 described in the present invention may further comprise an input mode judging module 190, which is responsible for judging whether the system 100 has entered the input mode, so that the input layer providing module 110 provides the transparent input layer for the user to input the target gesture only after the input mode judging module 190 judges that the system 100 has entered the input mode.
The operation of the system and method of the present invention is now explained with an embodiment; please refer to Fig. 2a, the flow chart of the method of the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control. In this embodiment, the present invention runs on a portable device comprising a touch screen 420 and a storage medium 460.
When the user uses the present invention, the instruction set setting module 130 first allows the user to set the target instruction set corresponding to the target gesture (step 210), whereby the target gesture and its corresponding target instruction set are stored in the storage medium 460 of the portable device; after the user has completed this association of target gesture and target instruction set, the user can use the set target gesture to execute the target instruction set. In this embodiment, assume the user's settings include target gestures such as a left arrow, a five-pointed star and an X, where the left arrow controls the browser to execute the instruction "return to previous page" twice and then "reload the page", the five-pointed star controls the browser to execute the instruction "add to favorites", and the X executes the instruction of deleting a document on the job platform on which the present invention runs.
In practical applications, if the present invention as executed on the portable device comprises the input mode judging module 190, then after the present invention starts running, regardless of whether the instruction set setting module 130 has allowed the user to set target gestures and corresponding target instruction sets (step 210), the input mode judging module 190 judges whether the portable device has entered the input mode (step 230). When the user is browsing web pages with a browser, the user may not merely be browsing but may also wish to execute predetermined functions provided by the browser; the input mode judging module 190 therefore judges that the portable device has entered the input mode, and the input layer providing module 110 provides a full-screen transparent input layer on the touch screen 420 (step 250).
If the present invention as executed on the portable device does not comprise the input mode judging module 190, then after the present invention starts running, regardless of whether the instruction set setting module 130 has allowed the user to set target gestures and corresponding target instruction sets (step 210), the input layer providing module 110 continuously provides the full-screen transparent input layer on the touch screen 420 (step 250).
After the input layer providing module 110 provides the full-screen transparent input layer (step 250), if the user wishes to add the web page "www.inv.com" being browsed to the favorites, the user can directly input the five-pointed-star target gesture at any position on the touch screen 420, such as the center, upper left, lower left, upper right or lower right.
When the user inputs the target gesture on the touch screen 420, the track capturing module 120 captures the gesture track of the target gesture input by the user (step 260). Since in this embodiment the target gesture input by the user is a five-pointed star, the track capturing module 120 captures the gesture track of a five-pointed star.
Since the user stops inputting after finishing the five-pointed star, the track capturing module 120 temporarily captures no further gesture track after capturing the star's gesture track. After a period of time, for example 0.5 seconds, the track capturing module 120 therefore confirms that the user has completed the input and begins to determine the track feature from the captured gesture track, that is, the track feature of a five-pointed star, for example five line segments of equal length whose directions are, in order, lower left, upper right, left, lower right and upper left.
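One simple way to realise this "quiet period" rule is to wait until no new track point has arrived for the timeout; a sketch under that assumption (the callback name is hypothetical, and 0.5 seconds is only the example value given in the text):

```python
import time

IDLE_TIMEOUT = 0.5   # seconds without new track points before the input counts as finished

def wait_until_input_complete(last_point_time, poll_interval=0.05):
    """Block until no new point has arrived for IDLE_TIMEOUT seconds.
    'last_point_time' is a hypothetical callable returning the monotonic timestamp
    of the most recently captured track point."""
    while time.monotonic() - last_point_time() < IDLE_TIMEOUT:
        time.sleep(poll_interval)
```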
After the track capturing module 120 captures the gesture track of the target gesture input by the user (step 260), the instruction set judging module 150 determines the corresponding target instruction set according to the gesture track captured by the track capturing module 120 (step 270) and reads that target instruction set. In this embodiment, suppose the instruction set judging module 150 follows the flow of Fig. 2c: after the track capturing module 120 determines the track feature from the captured gesture track (step 271), that track feature is compared with the index features of the target instruction sets stored in the storage medium 460 of the portable device (step 273). Since the storage medium 460 does store a target instruction set whose index feature is the feature determined from the gesture track of a five-pointed star, namely the "add to favorites" instruction, the instruction set judging module 150 finds after comparison that there is an index feature matching the track feature of the five-pointed star, and therefore judges that the target instruction set whose index feature is the track feature of a five-pointed star corresponds to the gesture track captured by the track capturing module 120 (step 277).
After the instruction set judging module 150 has read the target instruction set corresponding to the gesture track captured by the track capturing module 120, the instruction set execution module 170 executes the target instruction set so as to complete the objective function (step 290). In this embodiment, the instruction set execution module 170 can execute the "add to favorites" instruction directly, for example by opening the favorites document of the browser the user is using, writing the address "www.inv.com" of the page being browsed into the document, and then closing the document, thereby completing the instruction of adding to favorites. Alternatively, the instruction set execution module 170 can call, through the API provided by the browser the user is using, the function that opens the bookmark list and adds the address "www.inv.com" of the page being browsed to the bookmark list; after the user presses confirm in the bookmark list, the "add bookmark" instruction is completed.
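For illustration only, the direct-execution path described here (open the favorites document, write the address, close it) could look like the following sketch; the file name is an assumption, and a real browser would normally be driven through its own API instead:

```python
from pathlib import Path

FAVORITES = Path("favorites.txt")   # stand-in for the browser's favorites document

def add_to_favorites(url: str) -> None:
    """Open the favorites document, append the browsed address, and close it again."""
    with FAVORITES.open("a", encoding="utf-8") as f:
        f.write(url + "\n")

add_to_favorites("www.inv.com")
```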
In the above embodiment, suppose the instruction set setting module 130 provides the setting interface 300 shown in Fig. 3, which comprises a target track display area 310, a predetermined instruction selection field 320, a custom instruction input field 330, a set button 341, a cancel button 342 and a set-instruction field 350.
When the instruction set setting module 130 allows the user to set a target gesture and its corresponding target instruction set, the flow of Fig. 2b is followed: the input layer providing module 110 provides a full-screen transparent input layer on the touch screen 420 (step 211). If the present invention as executed on the portable device comprises the input mode judging module 190, the input mode judging module 190 judges that the portable device has entered the input mode when the instruction set setting module 130 provides the setting interface 300, so the input layer providing module 110 likewise provides the full-screen transparent input layer on the touch screen 420 (step 211).
The user can then input a target gesture on the touch screen 420, such as a left arrow, a five-pointed star or an X; the track capturing module 120 captures the gesture track of the target gesture input by the user (step 213) and determines the setting feature of the captured gesture track (step 215). Since the process of capturing the gesture track of the target gesture and determining the setting feature of the gesture track is similar to step 260 described above, it is not described again.
After the track capturing module 120 captures the gesture track of the target gesture input by the user (step 213), the instruction set setting module 130 displays the captured gesture track in the target track display area 310 of the setting interface 300.
When providing the setting interface 300, the instruction set setting module 130 also allows the user to select target instructions in the predetermined instruction selection field 320 and/or input target instructions in the custom instruction input field 330, and to press the set button 341 so that the selected and/or input target instructions are displayed in the set-instruction field 350. The instruction set setting module 130 also allows the user to select a target instruction in the set-instruction field 350 and press the cancel button 342 to delete a target instruction that has already been set.
After the track capturing module 120 has determined the setting feature of the captured gesture track (step 215) and the instruction set setting module 130 has provided for setting the target instruction set, the instruction set setting module 130 takes the setting feature determined by the track capturing module 120 as the index feature of the target instruction set that was set and stores the index feature together with the target instruction set (step 219); the instruction set setting module 130 has thereby allowed the user to complete the setting of the target gesture and its corresponding target instruction set.
In summary, the difference between the present invention and the prior art is the technical means of providing a full-screen transparent input layer, capturing the gesture track of a target gesture input at an arbitrary position on the transparent input layer, and determining the corresponding target instruction set according to the captured gesture track. This technical means solves the prior-art problem that the gesture input position of existing touch-controlled electronic devices cannot be changed, and thereby achieves the technical effect of customized gestures.
Moreover, the method of the present invention for inputting a target gesture at an arbitrary position to carry out corresponding control can be implemented in hardware, in software, or in a combination of hardware and software, and can be realized in a centralized fashion in one computer system or in a distributed fashion in which different components are spread across several interconnected computer systems.
Although embodiments of the present invention are disclosed above, the described content is not intended to directly limit the scope of patent protection of the present invention. Any person with ordinary knowledge in the technical field of the present invention who, without departing from the spirit and scope disclosed in the present invention, makes slight changes or refinements to the form and details of its implementation still falls within the scope of patent protection of the present invention. The scope of patent protection of the present invention remains defined by the appended claims.

Claims (10)

1. A method for inputting a target gesture at an arbitrary position to carry out corresponding control, characterized in that the method comprises at least the following steps:
providing for setting a target instruction set corresponding to a target gesture, the target instruction set corresponding to an objective function and comprising one or more target instructions;
providing a full-screen transparent input layer;
capturing a gesture track of the target gesture input at an arbitrary position on the transparent input layer;
determining the corresponding target instruction set according to the gesture track; and
executing the target instruction set to complete the objective function.
2. The method for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 1, characterized in that the step of determining the corresponding target instruction set according to the gesture track comprises deriving a track feature from the gesture track, comparing the track feature with an index feature corresponding to the target instruction set, and judging that the target instruction set corresponds to the gesture track when the track feature matches the index feature.
3. The method for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 1, characterized in that the step of providing for setting a target instruction set corresponding to a target gesture further comprises the steps of capturing a gesture track of the target gesture input on the transparent input layer, determining a setting feature of the gesture track, and storing the setting feature as the index feature corresponding to the target instruction set.
4. The method for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 1, characterized in that, before the step of providing the full-screen transparent input layer, the method further comprises judging whether an input mode has been entered, and the transparent input layer is provided only after it is judged that the input mode has been entered.
5. A system for inputting a target gesture at an arbitrary position to carry out corresponding control, characterized in that the system comprises at least:
an input layer providing module, which provides a full-screen transparent input layer;
a track capturing module, which captures a gesture track of a target gesture input at an arbitrary position on the transparent input layer;
an instruction set setting module, which sets a corresponding target instruction set according to the gesture track, wherein the target instruction set corresponds to an objective function and comprises one or more target instructions;
an instruction set judging module, which determines the corresponding target instruction set according to the gesture track; and
an instruction set execution module, which executes the target instruction set so as to complete the objective function.
6. The system for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 5, characterized in that the track capturing module determines a setting feature of the gesture track after capturing the gesture track of the target gesture input on the transparent input layer, and the instruction set setting module stores the setting feature as the index feature corresponding to the target instruction set, thereby associating the gesture track with the target instruction set.
7. The system for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 5, characterized in that the instruction set judging module compares the track feature determined by the track capturing module from the gesture track with the index feature corresponding to the target instruction set, and judges that the target instruction set corresponds to the gesture track when the track feature matches the index feature.
8. The system for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 5, characterized in that the system further comprises an input mode judging module, which judges whether the system has entered an input mode, and the input layer providing module provides the transparent input layer only after the input mode judging module judges that the system has entered the input mode.
9. The system for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 5, characterized in that the objective function is a predetermined function provided by the job platform on which the system runs, or a function defined by the user.
10. The system for inputting a target gesture at an arbitrary position to carry out corresponding control as claimed in claim 5, characterized in that the instruction set execution module executes the target instruction set directly, or executes the target instruction set through an application programming interface of external software.
CN201110421679.8A 2011-12-15 2011-12-15 System and method for inputting a target gesture at an arbitrary position to carry out corresponding control Active CN103164064B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110421679.8A CN103164064B (en) 2011-12-15 2011-12-15 System and method for inputting a target gesture at an arbitrary position to carry out corresponding control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110421679.8A CN103164064B (en) 2011-12-15 2011-12-15 System and method for inputting a target gesture at an arbitrary position to carry out corresponding control

Publications (2)

Publication Number Publication Date
CN103164064A true CN103164064A (en) 2013-06-19
CN103164064B CN103164064B (en) 2016-02-10

Family

ID=48587212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110421679.8A Active CN103164064B (en) 2011-12-15 2011-12-15 System and method for inputting a target gesture at an arbitrary position to carry out corresponding control

Country Status (1)

Country Link
CN (1) CN103164064B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702156A (en) * 2013-12-09 2014-04-02 乐视致新电子科技(天津)有限公司 Method and device for user-customizing gesture track
CN104731439A (en) * 2013-12-19 2015-06-24 青岛海信移动通信技术股份有限公司 Gesture packaging and task executing method and device
CN105138169A (en) * 2015-08-26 2015-12-09 苏州市新瑞奇节电科技有限公司 Touch panel control device based on gesture recognition
CN105224212A (en) * 2014-07-01 2016-01-06 中国移动通信集团公司 A kind of terminal screen unlock method, device and terminal
CN105468270A (en) * 2014-08-18 2016-04-06 腾讯科技(深圳)有限公司 Terminal application control method and device
CN105763724A (en) * 2016-02-01 2016-07-13 携程计算机技术(上海)有限公司 Method for processing product ID in mobile terminal and mobile terminal
CN106610773A (en) * 2015-10-22 2017-05-03 阿里巴巴集团控股有限公司 Interface interaction method, device and equipment
CN106681354A (en) * 2016-12-02 2017-05-17 广州亿航智能技术有限公司 Flight control method and flight control device for unmanned aerial vehicles
CN106775427A (en) * 2017-01-18 2017-05-31 百度在线网络技术(北京)有限公司 Method and apparatus for collecting the page
CN107153552A (en) * 2017-06-06 2017-09-12 深圳市乃斯网络科技有限公司 The quick of terminal app exits method and system
CN109189316A (en) * 2018-09-06 2019-01-11 苏州好玩友网络科技有限公司 A kind of multistage command control method, device, touch screen terminal and storage medium
CN111273847A (en) * 2020-01-07 2020-06-12 北京字节跳动网络技术有限公司 Page processing method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101349944A (en) * 2008-09-03 2009-01-21 宏碁股份有限公司 Gesticulation guidance system and method for controlling computer system by touch control gesticulation
US20090217211A1 (en) * 2008-02-27 2009-08-27 Gesturetek, Inc. Enhanced input using recognized gestures
CN101546233A (en) * 2009-05-05 2009-09-30 上海华勤通讯技术有限公司 Identification and operation method of touch screen interface gestures
CN102073428A (en) * 2011-01-07 2011-05-25 中国科学院苏州纳米技术与纳米仿生研究所 Capacitance type flexible and transparent touch screen based on CNT film

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090217211A1 (en) * 2008-02-27 2009-08-27 Gesturetek, Inc. Enhanced input using recognized gestures
CN101349944A (en) * 2008-09-03 2009-01-21 宏碁股份有限公司 Gesticulation guidance system and method for controlling computer system by touch control gesticulation
CN101546233A (en) * 2009-05-05 2009-09-30 上海华勤通讯技术有限公司 Identification and operation method of touch screen interface gestures
CN102073428A (en) * 2011-01-07 2011-05-25 中国科学院苏州纳米技术与纳米仿生研究所 Capacitance type flexible and transparent touch screen based on CNT film

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702156A (en) * 2013-12-09 2014-04-02 乐视致新电子科技(天津)有限公司 Method and device for user-customizing gesture track
CN104731439A (en) * 2013-12-19 2015-06-24 青岛海信移动通信技术股份有限公司 Gesture packaging and task executing method and device
CN105224212A (en) * 2014-07-01 2016-01-06 中国移动通信集团公司 A kind of terminal screen unlock method, device and terminal
CN105468270A (en) * 2014-08-18 2016-04-06 腾讯科技(深圳)有限公司 Terminal application control method and device
CN105138169A (en) * 2015-08-26 2015-12-09 苏州市新瑞奇节电科技有限公司 Touch panel control device based on gesture recognition
CN106610773A (en) * 2015-10-22 2017-05-03 阿里巴巴集团控股有限公司 Interface interaction method, device and equipment
CN105763724A (en) * 2016-02-01 2016-07-13 携程计算机技术(上海)有限公司 Method for processing product ID in mobile terminal and mobile terminal
CN106681354A (en) * 2016-12-02 2017-05-17 广州亿航智能技术有限公司 Flight control method and flight control device for unmanned aerial vehicles
CN106681354B (en) * 2016-12-02 2019-10-08 广州亿航智能技术有限公司 The flight control method and device of unmanned plane
CN110377053A (en) * 2016-12-02 2019-10-25 广州亿航智能技术有限公司 The flight control method and device of unmanned plane
CN106775427A (en) * 2017-01-18 2017-05-31 百度在线网络技术(北京)有限公司 Method and apparatus for collecting the page
CN107153552A (en) * 2017-06-06 2017-09-12 深圳市乃斯网络科技有限公司 The quick of terminal app exits method and system
CN109189316A (en) * 2018-09-06 2019-01-11 苏州好玩友网络科技有限公司 A kind of multistage command control method, device, touch screen terminal and storage medium
CN111273847A (en) * 2020-01-07 2020-06-12 北京字节跳动网络技术有限公司 Page processing method and device, electronic equipment and storage medium
CN111273847B (en) * 2020-01-07 2021-07-20 北京字节跳动网络技术有限公司 Page processing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN103164064B (en) 2016-02-10

Similar Documents

Publication Publication Date Title
CN103164064B (en) 2016-02-10 System and method for inputting a target gesture at an arbitrary position to carry out corresponding control
US10963007B2 (en) Presentation of a virtual keyboard on a multiple display device
US20200110511A1 (en) Method and system for performing copy-paste operations on a device via user gestures
KR102133410B1 (en) Operating Method of Multi-Tasking and Electronic Device supporting the same
US20140075394A1 (en) Method and apparatus to facilitate interoperability of applications in a device
US20130111391A1 (en) Adjusting content to avoid occlusion by a virtual input panel
CN103116453B (en) A kind of operation management method of Drawing Object and operation management device
US20140145945A1 (en) Touch-based input control method
CN105339900A (en) Proxy gesture recognizer
CN101458591A (en) Mobile phone input system with multi-point touch screen hardware structure
US20140096082A1 (en) Display terminal and method for displaying interface windows
WO2014056338A1 (en) Method and device for interaction of list data of mobile terminal
CN112099695B (en) Icon position adjusting method and device and electronic equipment
US20140325400A1 (en) Multi-panel view interface for a browser operating on a computing device
US20150169147A1 (en) Information interchange via selective assembly using single gesture
WO2023045927A1 (en) Object moving method and electronic device
CN114518820A (en) Icon sorting method and device and electronic equipment
CN112764613A (en) Icon arranging method and device, electronic equipment and readable storage medium
CN112269501A (en) Icon moving method and device and electronic equipment
WO2024104157A1 (en) Application interface management method and apparatus, and electronic device and readable storage medium
TWI425438B (en) Device and method for polymorphism button of the stock quoting software on a mobile apparatus
CN113821288A (en) Information display method and device, electronic equipment and storage medium
CN112765500A (en) Information searching method and device
WO2022262722A1 (en) Response method and apparatus of electronic device, and electronic device
KR101381878B1 (en) Method, device, and computer-readable recording medium for realizing touch input using mouse

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191023

Address after: Room 301, floor 3, building 1, No. 500, Ping'an Road, Wutong street, Tongxiang City, Jiaxing City, Zhejiang Province

Patentee after: Rydberg Information Technology (Zhejiang) Co., Ltd.

Address before: Building 2, No. 1295 Yishan Road, Xuhui District, Shanghai 200233

Co-patentee before: Yingda Co., Ltd.

Patentee before: Yingshunyuan (Shanghai) Technology Co., Ltd.

TR01 Transfer of patent right