CN101339453A - Simulated mouse input method based on interactive input apparatus - Google Patents


Publication number
CN101339453A
Authority
CN
China
Prior art keywords
class
size
mouse
time period
fixed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008100301855A
Other languages
Chinese (zh)
Other versions
CN101339453B (en)
Inventor
卢如西
Current Assignee
Guangdong Gaohang Intellectual Property Operation Co ltd
JINGJIANG CHANGYUAN HYDRAULIC MACHINERY CO Ltd
Original Assignee
Vtron Technologies Ltd
Priority date
Filing date
Publication date
Application filed by Vtron Technologies Ltd filed Critical Vtron Technologies Ltd
Priority: CN2008100301855A (granted as CN101339453B)
Publication of CN101339453A
Application granted
Publication of CN101339453B
Legal status: Expired - Fee Related

Landscapes

  • Position Input By Displaying (AREA)

Abstract

A simulated mouse input method based on an interactive input device is provided. The mapping relation between action command sequences and mouse operation actions is predefined according to the habitual actions of operating a mouse. When the action command sequence of a target object in the detection area is received, the mouse operation action corresponding to it is sent to a computer in a computer-recognizable form, so that the computer executes that action. Because an action command sequence, rather than a single action, is mapped to a mouse operation, sequential actions can be combined, which makes operation more flexible, speeds up response, and extends the range of definable actions; the definable actions are therefore more diverse and the achievable functions more complete. Moreover, because the mapping relation is predefined according to the habitual actions of operating a mouse, the simulation of mouse operation is more practical and user-friendly.

Description

Simulated mouse input method based on interactive input equipment
Technical Field
The invention relates to an intelligent input technology based on interactive input equipment, in particular to a mouse simulation input method based on interactive input equipment.
Background
Existing methods for simulating mouse input on interactive input equipment can only realize simple click or drag actions with one finger. They weaken the complete functionality of a real mouse, are inflexible and inconvenient to operate, and offer little personalization.
Chinese patent application No. 200610041804.1 (publication No. CN1811684A) discloses a method that simulates mouse movement, left-button clicks and right-button clicks by dragging a finger on a mobile-phone touch screen. However, this method only simulates part of the functions of mouse operation, and the number of defined actions is small and monotonous.
Disclosure of Invention
In view of the above problems in the prior art, an object of the present invention is to provide a simulated mouse input method based on an interactive input device that improves the completeness of simulated mouse input, is flexible and convenient to operate, and is highly user-friendly.
In order to achieve the purpose, the invention adopts the following technical scheme:
a simulated mouse input method based on interactive input equipment comprises the following steps:
presetting a mapping relation between action instruction sequences and mouse operation actions according to the habitual actions of holding and operating a mouse; when an action instruction sequence of a target object in the detection area of the interactive input equipment is received, sending the mouse operation action corresponding to that sequence to a computer in a computer-recognizable form according to the mapping relation, wherein the mapping relation comprises any one or any combination of the following:
three first-class-sized objects move, representing a moving mouse;
after the three first-class-size target objects are fixed for a preset time period, two first-class-size target objects on the right side leave, and the last remaining first-class-size target object leaves to indicate that a left mouse button is clicked;
after three first-class-size target objects are fixed for a preset time period, two first-class-size target objects on the right side leave, and the remaining first-class-size target object moves to represent that the mouse is moved after a left mouse button is clicked;
after the three first-class-size target objects are fixed for a preset time period, one first-class-size target object on the right side leaves, and finally the two remaining first-class-size target objects leave to represent that the left button of the mouse is double-clicked;
after three first-class-size target objects are fixed for a preset time period, two first-class-size target objects positioned on the left side leave, and finally one first-class-size target object leaves to indicate that a right mouse button is clicked;
after a first-class-size target object is fixed for a preset time period, respectively putting down a first-class-size target object on the left side and the right side of the first-class-size target object, and finally, enabling the three first-class-size target objects to leave to represent that a mouse roller is pressed down;
after the two first-class-size objects are fixed for a preset time period, putting down a third first-class-size object between the two first-class-size objects, wherein the third first-class-size object moves upwards to represent that a mouse wheel rolls upwards;
after the two first-class-size objects are fixed for a preset time period, a third first-class-size object is put down between the two first-class-size objects, and the third first-class-size object moves downwards to show that the mouse wheel rolls downwards.
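The eight mappings above can be sketched as a lookup table. This is a minimal illustrative sketch in Python, not part of the patent; the event names (`3_fixed`, `right2_up`, and so on) and the encoding of a sequence as a tuple are invented for illustration.

```python
# Hypothetical encoding of the predefined mapping between action command
# sequences and mouse operations. The patent specifies the mappings, not
# any particular data representation; these event names are made up.
MAPPING = {
    ("3_down", "move"): "move_mouse",
    ("3_fixed", "right2_up", "last1_up"): "left_click",
    ("3_fixed", "right2_up", "last1_move"): "left_click_then_move",
    ("3_fixed", "right1_up", "last2_up"): "left_double_click",
    ("3_fixed", "left2_up", "last1_up"): "right_click",
    ("1_fixed", "side2_down", "3_up"): "press_wheel",
    ("2_fixed", "mid1_down", "mid_moves_up"): "wheel_up",
    ("2_fixed", "mid1_down", "mid_moves_down"): "wheel_down",
}

def lookup(sequence):
    """Return the mouse action for a received action command sequence,
    or None when the sequence matches no predefined mapping."""
    return MAPPING.get(tuple(sequence))
```

A recognized sequence would then be forwarded to the computer in a computer-recognizable form; an unrecognized one is simply ignored.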
According to the invention, the mapping relation between action instruction sequences and mouse operation actions is preset according to the habitual actions of holding and operating a mouse. When an action instruction sequence of a target object in the detection area is received, the corresponding mouse operation action is sent to the computer in a computer-recognizable form according to the preset mapping relation, and the computer executes it. Because the mapping is designed around the habitual actions of mouse operation, it is closer to the way a real mouse is actually used and is therefore more user-friendly.
Drawings
FIG. 1 is a schematic diagram of the composition of an interactive input device;
fig. 2 is an indicating intention of the operator in the embodiment of the present invention.
Detailed Description
As shown in fig. 1, an interactive input device mainly comprises two parts: a positioning detection system 2 and an information processing system 1 connected to it. The positioning detection system 2 provides the input platform of the interactive input system; its role is comparable to that of human eyes, and it can be implemented with resistive, capacitive, surface-acoustic-wave, infrared or electromagnetic-induction technology, with different implementations chosen according to the specific application or requirements. The information processing system 1 analyzes and processes the information detected by the positioning detection system 2 and carries out the corresponding operations, such as executing an application program or a function option. In the embodiment of the present invention, the information processing system 1 is a computer system, and it is connected to the positioning detection system 2 by a connection cable 3.
The simulated mouse input method based on interactive input equipment of the invention mainly comprises the following steps: first, a mapping relation between action instruction sequences and mouse operation actions is preset according to the habitual actions of holding and operating a mouse; then an action instruction sequence of a target object in the detection area of the interactive input equipment is received; when the sequence matches the mapping relation, the corresponding mouse operation action is sent to the computer in a computer-recognizable form, and the computer executes it.
In the scheme of the invention, the mapping between received action instruction sequences and mouse operation actions follows the habitual actions of holding a mouse; in general, people are used to touching the screen with their hands to perform input operations.
In the following description of the embodiments, a first-class-size target object is a finger, a second-class-size target object is a fist, and a third-class-size target object is a palm; in actual use, other physical objects may be substituted.
In the scheme of the invention, the mapping relation between received action instruction sequences and mouse operation functions includes any one or any combination of the entries in table 1 below. For ease of understanding, the action command sequences in the tables are described in terms of the operator's concrete operations.
TABLE 1  Correspondence between action command sequences and mouse operation functions

Action command sequence | Mouse function represented
Three fingers move | Move the mouse
Three fingers fixed for t1-t2 seconds, then the two right fingers lift, and finally the remaining finger lifts | Click the left mouse button
Three fingers fixed for t1-t2 seconds, then the two right fingers lift, and the remaining finger moves | Move the mouse after clicking the left button
Three fingers fixed for t1-t2 seconds, then one right finger lifts, and finally the remaining two fingers lift | Double-click the left mouse button
Three fingers fixed for t1-t2 seconds, then the two left fingers lift, and finally the remaining finger lifts | Click the right mouse button
One finger fixed for t1-t2 seconds, then one finger is put down on each side of it, and finally the three fingers lift | Press the mouse wheel
Two fingers fixed for t1-t2 seconds, then a third finger is put down between them and moves up | Scroll the mouse wheel up
Two fingers fixed for t1-t2 seconds, then a third finger is put down between them and moves down | Scroll the mouse wheel down
The mapping correspondence in table 1 above realizes the functions of each mouse component: the left button, the right button and the scroll wheel. To further enrich mouse operation, the mapping relation of the present invention may further include any one or any combination of the entries in table 2 below.
TABLE 2  Correspondence between additional action command sequences and mouse operation functions

Action command sequence | Mouse function represented
Three fingers fixed for t1-t2 seconds, then the three fingers lift | Triple-click the left mouse button
One finger fixed for t1-t2 seconds, then one finger is put down on each side of it, then the three fingers move | Move the mouse after pressing the mouse wheel
One finger fixed for t1-t2 seconds, then two fingers are put down on its right, then the three fingers move up | Scroll the wheel up while the left button is pressed
One finger fixed for t1-t2 seconds, then two fingers are put down on its right, then the three fingers move down | Scroll the wheel down while the left button is pressed
One finger fixed for t1-t2 seconds, then two fingers are put down on its left, then the three fingers move up | Scroll the wheel up while the right button is pressed
One finger fixed for t1-t2 seconds, then two fingers are put down on its left, then the three fingers move down | Scroll the wheel down while the right button is pressed
Thus, on the basis of the basic mouse operation functions, the scheme of the invention further realizes operations such as triple-clicking the left button and scrolling the wheel up/down while the left/right button is pressed, making the simulated mouse input more complete.
Here, t1-t2 seconds denotes a short time window with t1 < t2, whose length can be adjusted to the user's requirements. A window is used rather than an exact time point because personal operation habits differ and a precise instant is hard to hit: even for the same operation, different operators' fingers stay in the detection area for different lengths of time. Any stay whose duration falls within the preset window is treated as a valid operation, which avoids the difficulty of being tied to a single time point and makes the operation more user-friendly.
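The window-based validity test described above amounts to a single range check. A minimal sketch, assuming dwell times measured in seconds; the function name is invented.

```python
def is_valid_dwell(duration, t1, t2):
    """A dwell counts as an intentional 'fixed' gesture when its duration
    falls anywhere inside the preset window [t1, t2] (with t1 < t2),
    rather than having to hit one exact time point."""
    assert t1 < t2, "the window must satisfy t1 < t2"
    return t1 <= duration <= t2
```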
In addition, in the simulated mouse input method of the invention, when fingers already in the detection area are moving and an operation is being executed, putting a new finger into the detection area does not affect the operation in progress. For example, while three fingers moving upward cause the computer system to scroll the mouse wheel up, putting in a fourth and a fifth finger during the movement does not interrupt the scroll; the previously started operation continues. This prevents carelessly placed fingers from disturbing the execution of a normal operation command and avoids misjudgment.
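The rule that contacts arriving mid-operation are ignored can be sketched as a small session object. All class, method and field names here are invented for illustration; the patent does not prescribe any data structure.

```python
class GestureSession:
    """Sketch: once a gesture owns the operation in progress, contacts
    that join later are not added to the gesture, so an accidental
    extra finger cannot disturb the running operation."""

    def __init__(self):
        self.active = set()      # contact ids that own the gesture
        self.operation = None    # operation currently being executed

    def start(self, contact_ids, operation):
        self.active = set(contact_ids)
        self.operation = operation

    def on_new_contact(self, contact_id):
        # A finger put down mid-operation does not join the gesture;
        # the previously started operation simply continues.
        return self.operation
```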
As the correspondences in tables 1 and 2 show, the input method uses three fingers to imitate a human hand holding a mouse, and so matches people's everyday mouse habits. This is mainly reflected in two aspects. First, when using a mouse the hand usually forms an arched grip, and the method likewise uses three fingers to imitate that arched grip. Second, in associating action command sequences with mouse operations, that is, in the action design, human operating habits are also taken into account. For example, fixing three fingers for t1-t2 seconds, lifting the two right fingers and finally lifting the remaining finger imitates clicking the left button; fixing three fingers for t1-t2 seconds, lifting the two left fingers and finally lifting the remaining finger imitates clicking the right button; fixing one finger for t1-t2 seconds, putting one finger down on each side of it and then moving the three fingers imitates moving the mouse after pressing the wheel; and so on. These correspondences are designed to imitate how people operate a mouse in daily life, so they match users' habits, are highly user-friendly, and make operation more flexible.
The invention does not restrict which fingers are used. For convenience of operation and closeness to real mouse use, the thumb, index finger and middle finger are recommended for the simulated mouse input, so that the gesture is closer to a true mouse-holding posture, as shown in fig. 2; nevertheless, operators may choose operating fingers according to their own habits. When the target object is not a finger, the operator may likewise choose a corresponding concrete operation mode, which is not described further here.
The simulated mouse input method also increases operation speed and reduces the probability of misjudgment. In particular, the invention distinguishes different mouse operations by combinations of action sequences. For example, fixing three fingers for t1-t2 seconds, lifting the two right fingers and finally lifting the remaining finger represents a left click, while fixing three fingers for t1-t2 seconds, lifting the two left fingers and finally lifting the remaining finger represents a right click; left and right clicks are thus distinguished by whether the last finger lifted is the leftmost or the rightmost one. In the prior art, a right click is represented by a long touch, which takes a long time, is hard to time precisely, is not user-friendly, and hinders efficient operation. By distinguishing left and right clicks through different actions, the scheme of the invention is easy to control and user-friendly, increases operation speed, and reduces misjudgment.
To express every mouse operation function completely and to make the function set easy to extend, the method can map additional action instruction sequences to further operations, enlarging the range of definable actions and thereby extending the basic functions of the mouse. Table 3 below illustrates the correspondence between extended action command sequences and mouse operation actions.
TABLE 3  Correspondence between extended action command sequences and mouse operation actions

Action command sequence | Function represented
One finger fixed for t1-t2 seconds, then a finger is put down on its left | Copy
One finger fixed for t1-t2 seconds, then a finger is put down on its right | Paste
Two fingers fixed for t1-t2 seconds, then both fingers lift | Delete
Two fingers fixed for t1-t2 seconds, then one of the fingers lifts | Insert
A fist fixed for t1-t2 seconds and then withdrawn, then one finger moves left | Undo/go back
A fist fixed for t1-t2 seconds and then withdrawn, then one finger moves right | Redo/go forward
Two fingers fixed for t1-t2 seconds, then another finger is put down on their left | Open
Two fingers fixed for t1-t2 seconds, then another finger is put down between them | New
Two fingers fixed for t1-t2 seconds, then another finger is put down on their right | Save
Two fingers fixed for t1-t2 seconds, then two more fingers are put down between them | Minimize
Two fingers fixed for t1-t2 seconds, then two more fingers are put down on their outer sides | Maximize
Four fingers fixed for t1-t2 seconds, then all lift | Close
Two fingers fixed for t1-t2 seconds, then two more fingers are put down on their right | Print
Three fingers move toward each other | Zoom out
Three fingers move apart | Zoom in
A palm fixed for t1-t2 seconds, then lifted | Refresh
... | ...
Given the correspondences between action command sequences and mouse operation actions listed in tables 1, 2 and 3 above, different implementations can be used to obtain the mouse operation action corresponding to an action command detected in the detection area. Two preferred implementations are described below.
one is as follows: firstly, coding a preset action instruction sequence according to a preset coding mode, and storing the code and the corresponding mouse operation action in a database; when the action of the target object in the detection area is detected, namely the action instruction of the target object in the monitoring area is received, the action instruction is coded according to the preset coding mode, the code is stored in a preset storage area, and the code and the action code already stored in the preset storage area are combined into a code combination sequence; then comparing the code combination sequence with the codes stored in the database, when the codes corresponding to the code combination sequence exist in the database, sending the operation corresponding to the code combination sequence to the computer in a computer-recognizable mode, and executing the corresponding mouse operation action by the computer;
the second step is as follows: writing each defined action command sequence and the corresponding mouse operation action into a sub-function form in advance, taking the action command sequence as a definition condition of the corresponding sub-function, judging an action command of a target object in a detection area by detection, calling the corresponding sub-function when the action command meets the condition defined by the corresponding sub-function, and entering the corresponding operation action.
When detecting and judging target objects, particularly when two or more are present, the invention also sets a preset time threshold as the minimum time difference for judging a sequential placement, so that a tiny difference in placement times is not misread as a deliberate order. When the time difference between two placements is greater than the threshold, the placement is judged valid and sequential; when it is smaller, the placements are judged to be simultaneous. For example, with this threshold in place, the action sequence in which two fingers are fixed for t1-t2 seconds and then the right finger is withdrawn, leaving the left finger, cannot be misjudged as one finger being fixed for t1-t2 seconds, then a second finger being fixed for t1-t2 seconds, then the right finger being withdrawn and the left finger remaining. The threshold thus reduces the probability of misjudgment.
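The threshold test for sequential versus simultaneous placement can be written directly. A minimal sketch; times are assumed to be in seconds and the function name is invented.

```python
def placement_order(time_a, time_b, threshold):
    """Classify two touch-down events as sequential or simultaneous.
    Only a gap larger than the preset threshold counts as a deliberate
    one-after-the-other placement; a smaller gap is treated as a single
    simultaneous placement, absorbing tiny timing jitter."""
    if abs(time_a - time_b) > threshold:
        return "sequential"
    return "simultaneous"
```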
The number of target objects can be determined as follows: the detection system scans the detection area, and when target objects enter it, the number of continuous occluded regions in the area is counted; the number of continuous occluded regions equals the number of target objects. A minimum distance for separating two continuous occluded regions can be set according to the system's scanning resolution and operating habits. When human fingers are the target object, the average spacing between two fingers of one operating hand, measured over many operators, can be used as the minimum distance threshold: if the spacing between two occluded regions is greater than the threshold, they are judged to be two target objects; if smaller, they are judged to be one.
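The counting of continuous occluded regions with a minimum-distance threshold can be sketched over one scan line. The 0/1 sample representation and the sample-count gap unit are assumptions; the patent only specifies the principle.

```python
def count_targets(scan, min_gap):
    """Count target objects on one scan line. `scan` is a list of 0/1
    samples (1 = occluded). Occluded runs whose separation is narrower
    than `min_gap` samples are merged, since two regions closer than
    the minimum finger spacing are taken to be a single object."""
    runs = []                       # (start, end) of each occluded run
    start = None
    for i, v in enumerate(scan + [0]):   # sentinel 0 closes a final run
        if v and start is None:
            start = i
        elif not v and start is not None:
            runs.append((start, i - 1))
            start = None
    count = 0
    prev_end = None
    for s, e in runs:
        if prev_end is None or s - prev_end - 1 >= min_gap:
            count += 1              # far enough from the previous run
        prev_end = e
    return count
```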
The above-described embodiments of the present invention do not limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (10)

1. A simulated mouse input method based on interactive input equipment, characterized by comprising the following steps:
according to the mouse holding operation action habit, the mapping relation between the action instruction sequence and the mouse operation action is preset, and when the action instruction sequence of a target object in the detection area of the interactive input equipment is received, the mouse operation action corresponding to the action instruction sequence is sent to a computer in a computer recognizable mode according to the mapping relation.
2. The interactive input device-based simulated mouse input method according to claim 1, wherein the preset manner of the mapping relationship comprises:
coding a preset action instruction sequence according to a preset coding mode, and storing the codes and mouse operation actions corresponding to the codes into a database; or,
respectively compiling corresponding subfunctions for each preset action instruction sequence and the mouse operation action corresponding to the action instruction sequence, and taking each preset action instruction sequence as a definition condition of the corresponding subfunction.
3. The interactive input device-based simulated mouse input method according to claim 1, wherein the mapping relationship comprises any one or any combination of the following:
three first-class-sized objects move, representing a moving mouse;
after the three first-class-size target objects are fixed for a preset time period, two first-class-size target objects on the right side leave, and the last remaining first-class-size target object leaves to indicate that a left mouse button is clicked;
after three first-class-size target objects are fixed for a preset time period, two first-class-size target objects on the right side leave, and the remaining first-class-size target object moves to represent that the mouse is moved after a left mouse button is clicked;
after the three first-class-size target objects are fixed for a preset time period, one first-class-size target object on the right side leaves, and finally the two remaining first-class-size target objects leave to represent that the left button of the mouse is double-clicked;
after three first-class-size target objects are fixed for a preset time period, two first-class-size target objects positioned on the left side leave, and finally one first-class-size target object leaves to indicate that a right mouse button is clicked;
after a first-class-size target object is fixed for a preset time period, respectively putting down a first-class-size target object on the left side and the right side of the first-class-size target object, and finally, enabling the three first-class-size target objects to leave to represent that a mouse roller is pressed down;
after the two first-class-size objects are fixed for a preset time period, putting down a third first-class-size object between the two first-class-size objects, wherein the third first-class-size object moves upwards to represent that a mouse wheel rolls upwards;
after the two first-class-size objects are fixed for a preset time period, a third first-class-size object is put down between the two first-class-size objects, and the third first-class-size object moves downwards to show that the mouse wheel rolls downwards.
4. The interactive input device-based simulated mouse input method according to claim 1 or 3, wherein the mapping relationship further comprises any one or any combination of the following:
three first-class-size target objects leave after being fixed for a preset time period, representing a triple-click of the left mouse button;
after one first-class-size target object is fixed for a preset time period, one first-class-size target object is put down on each of its left and right sides, and then the three first-class-size target objects move, representing that the mouse is moved after the mouse wheel is pressed down;
after one first-class-size target object is fixed for a preset time period, two first-class-size target objects are put down on its right side, and then the three first-class-size target objects move upwards, representing that the mouse wheel is rolled upwards while the left mouse button is pressed down;
after one first-class-size target object is fixed for a preset time period, two first-class-size target objects are put down on its right side, and then the three first-class-size target objects move downwards, representing that the mouse wheel is rolled downwards while the left mouse button is pressed down;
after one first-class-size target object is fixed for a preset time period, two first-class-size target objects are put down on its left side, and then the three first-class-size target objects move upwards, representing that the mouse wheel is rolled upwards while the right mouse button is pressed down;
after one first-class-size target object is fixed for a preset time period, two first-class-size target objects are put down on its left side, and then the three first-class-size target objects move downwards, representing that the mouse wheel is rolled downwards while the right mouse button is pressed down.
5. The interactive input device-based simulated mouse input method according to claim 1 or 3, wherein the mapping relationship further comprises any one or any combination of the following:
after one first-class-size target object is fixed for a preset time period, another first-class-size target object is put down on its left side, representing copy;
after one first-class-size target object is fixed for a preset time period, another first-class-size target object is put down on its right side, representing paste;
two first-class-size target objects leave after being fixed for a preset time period, representing delete;
after two first-class-size target objects are fixed for a preset time period, either one of them leaves, representing insert;
one second-class-size target object leaves after being fixed for a preset time period, and then one first-class-size target object moves left, representing undo/go back;
one second-class-size target object leaves after being fixed for a preset time period, and then one first-class-size target object moves right, representing redo/go forward;
after two first-class-size target objects are fixed for a preset time period, another first-class-size target object is put down on their left side, representing open;
after two first-class-size target objects are fixed for a preset time period, another first-class-size target object is put down between them, representing new;
after two first-class-size target objects are fixed for a preset time period, another first-class-size target object is put down on their right side, representing save;
after two first-class-size target objects are fixed for a preset time period, two more first-class-size target objects are put down between them, representing minimize;
after two first-class-size target objects are fixed for a preset time period, one first-class-size target object is put down on each of their outer sides, representing maximize;
four first-class-size target objects leave after being fixed for a preset time period, representing close;
after two first-class-size target objects are fixed for a preset time period, two more first-class-size target objects are put down on their right side, representing print;
three first-class-size target objects move toward each other, representing zoom out;
three first-class-size target objects move apart, representing zoom in;
one third-class-size target object leaves after being fixed for a preset time period, representing refresh.
6. The interactive input device-based simulated mouse input method according to claim 4, wherein the mapping relationship further comprises any one or any combination of the following:
after one first-size-class target object is held fixed for a preset time period, a further first-size-class target object is put down on its left side, indicating copy;
after one first-size-class target object is held fixed for a preset time period, a further first-size-class target object is put down on its right side, indicating paste;
two first-size-class target objects leave after being held fixed for a preset time period, indicating delete;
after two first-size-class target objects are held fixed for a preset time period, either one of them leaves, indicating insert;
a second-size-class target object leaves after being held fixed for a preset time period while a first-size-class target object moves left, indicating undo;
a second-size-class target object leaves after being held fixed for a preset time period while a first-size-class target object moves right, indicating redo;
after two first-size-class target objects are held fixed for a preset time period, a further first-size-class target object is put down on their left side, indicating open;
after two first-size-class target objects are held fixed for a preset time period, a further first-size-class target object is put down between them, indicating new;
after two first-size-class target objects are held fixed for a preset time period, a further first-size-class target object is put down on their right side, indicating save;
after two first-size-class target objects are held fixed for a preset time period, two further first-size-class target objects are put down between them, indicating minimize;
after two first-size-class target objects are held fixed for a preset time period, one further first-size-class target object is put down on each of their left and right sides, indicating maximize;
four first-size-class target objects leave after being held fixed for a preset time period, indicating close;
after two first-size-class target objects are held fixed for a preset time period, two further first-size-class target objects are put down on their right side, indicating print;
three first-size-class target objects move toward one another, indicating zoom out;
three first-size-class target objects move away from one another, indicating zoom in;
a third-size-class target object leaves after being held fixed for a preset time period, indicating refresh.
7. The interactive input device-based simulated mouse input method of claim 3, wherein the first-size-class object is a finger, the second-size-class object is a fist, and the third-size-class object is a palm.
8. The interactive input device-based simulated mouse input method of claim 4, wherein the first-size-class object is a finger, the second-size-class object is a fist, and the third-size-class object is a palm.
9. The interactive input device-based simulated mouse input method of claim 5, wherein the first-size-class object is a finger, the second-size-class object is a fist, and the third-size-class object is a palm.
10. The interactive input device-based simulated mouse input method of claim 6, wherein the first-size-class object is a finger, the second-size-class object is a fist, and the third-size-class object is a palm.
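The claims above amount to a predefined lookup from ordered gesture-event sequences to mouse actions, which a dispatcher consults whenever a sequence is detected. A minimal sketch in Python of that idea follows; all identifiers and event names (e.g. `put_down_left`, `GESTURE_TABLE`) are hypothetical illustrations, not terminology from the patent:

```python
# Size classes from claims 7-10: finger, fist, palm.
FINGER, FIST, PALM = "finger", "fist", "palm"

# Each key is a tuple of (size_class, event) pairs in the order detected;
# the value is the simulated mouse action. Entries loosely follow claim 5.
GESTURE_TABLE = {
    ((FINGER, "hold"), (FINGER, "put_down_left")): "copy",
    ((FINGER, "hold"), (FINGER, "put_down_right")): "paste",
    ((FIST, "leave"), (FINGER, "move_left")): "undo",
    ((FIST, "leave"), (FINGER, "move_right")): "redo",
    ((PALM, "hold"), (PALM, "leave")): "refresh",
}

def dispatch(sequence):
    """Return the mouse action mapped to a gesture sequence, or None
    if the sequence has no predefined mapping."""
    return GESTURE_TABLE.get(tuple(sequence))
```

In a real implementation the resulting action would then be sent to the computer in a form it recognizes (e.g. injected input events), as the abstract describes.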
CN2008100301855A 2008-08-15 2008-08-15 Simulated mouse input method based on interactive input apparatus Expired - Fee Related CN101339453B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008100301855A CN101339453B (en) 2008-08-15 2008-08-15 Simulated mouse input method based on interactive input apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008100301855A CN101339453B (en) 2008-08-15 2008-08-15 Simulated mouse input method based on interactive input apparatus

Publications (2)

Publication Number Publication Date
CN101339453A true CN101339453A (en) 2009-01-07
CN101339453B CN101339453B (en) 2012-05-23

Family

ID=40213533

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008100301855A Expired - Fee Related CN101339453B (en) 2008-08-15 2008-08-15 Simulated mouse input method based on interactive input apparatus

Country Status (1)

Country Link
CN (1) CN101339453B (en)


Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9575648B2 (en) 2007-01-07 2017-02-21 Apple Inc. Application programming interfaces for gesture operations
US11954322B2 (en) 2007-01-07 2024-04-09 Apple Inc. Application programming interface for gesture operations
US11449217B2 (en) 2007-01-07 2022-09-20 Apple Inc. Application programming interfaces for gesture operations
US10963142B2 (en) 2007-01-07 2021-03-30 Apple Inc. Application programming interfaces for scrolling
US10613741B2 (en) 2007-01-07 2020-04-07 Apple Inc. Application programming interface for gesture operations
US10175876B2 (en) 2007-01-07 2019-01-08 Apple Inc. Application programming interfaces for gesture operations
US9665265B2 (en) 2007-01-07 2017-05-30 Apple Inc. Application programming interfaces for gesture operations
US10936190B2 (en) 2008-03-04 2021-03-02 Apple Inc. Devices, methods, and user interfaces for processing touch events
US10521109B2 (en) 2008-03-04 2019-12-31 Apple Inc. Touch event model
US11740725B2 (en) 2008-03-04 2023-08-29 Apple Inc. Devices, methods, and user interfaces for processing touch events
US9971502B2 (en) 2008-03-04 2018-05-15 Apple Inc. Touch event model
US9798459B2 (en) 2008-03-04 2017-10-24 Apple Inc. Touch event model for web pages
US9720594B2 (en) 2008-03-04 2017-08-01 Apple Inc. Touch event model
US9690481B2 (en) 2008-03-04 2017-06-27 Apple Inc. Touch event model
US11163440B2 (en) 2009-03-16 2021-11-02 Apple Inc. Event recognition
CN104049975B (en) * 2009-03-16 2017-04-19 苹果公司 Event recognition
US10719225B2 (en) 2009-03-16 2020-07-21 Apple Inc. Event recognition
US9965177B2 (en) 2009-03-16 2018-05-08 Apple Inc. Event recognition
US11755196B2 (en) 2009-03-16 2023-09-12 Apple Inc. Event recognition
CN104049975A (en) * 2009-03-16 2014-09-17 苹果公司 Event recognition
CN101872263B (en) * 2009-04-24 2013-05-22 华硕电脑股份有限公司 Method for determining mouse instructions by trigger points
CN102687101A (en) * 2009-10-12 2012-09-19 拉奥纳克斯株式会社 Multi-touch type input controlling system
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US12061915B2 (en) 2010-01-26 2024-08-13 Apple Inc. Gesture recognizers with delegates for controlling and modifying gesture recognition
US10732997B2 (en) 2010-01-26 2020-08-04 Apple Inc. Gesture recognizers with delegates for controlling and modifying gesture recognition
CN101853076A (en) * 2010-04-29 2010-10-06 郭小卫 Method for acquiring input information by input equipment
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
CN102478959A (en) * 2010-11-28 2012-05-30 蒋霞 Control system and method for electronic device
CN102662530A (en) * 2012-03-20 2012-09-12 北京鸿合盛视数字媒体技术有限公司 Control method of multipoint touch infrared whiteboard in PPT mode
CN103389816A (en) * 2012-05-08 2013-11-13 昆盈企业股份有限公司 Signal sending method of touch input device
CN102929485B (en) * 2012-10-30 2015-11-04 广东欧珀移动通信有限公司 A kind of characters input method and device
CN102929485A (en) * 2012-10-30 2013-02-13 广东欧珀移动通信有限公司 Character input method and device
CN104007999B (en) * 2013-02-21 2020-01-10 西门子公司 Method for controlling an application and related system
CN104007999A (en) * 2013-02-21 2014-08-27 西门子公司 Method for control application and relative system
US11429190B2 (en) 2013-06-09 2022-08-30 Apple Inc. Proxy gesture recognizer
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
CN105302303A (en) * 2015-10-15 2016-02-03 广东欧珀移动通信有限公司 Game control method and apparatus and mobile terminal
CN108446073A (en) * 2018-03-12 2018-08-24 阿里巴巴集团控股有限公司 A kind of method, apparatus and terminal for simulating mouse action using gesture
CN110633044B (en) * 2019-08-27 2021-03-19 联想(北京)有限公司 Control method, control device, electronic equipment and storage medium
CN110633044A (en) * 2019-08-27 2019-12-31 联想(北京)有限公司 Control method, control device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN101339453B (en) 2012-05-23

Similar Documents

Publication Publication Date Title
CN101339453A (en) Simulated mouse input method based on interactive input apparatus
CN103324440B (en) A kind of method utilizing multi-point touch to select word content
TWI478041B (en) Method of identifying palm area of a touch panel and a updating method thereof
CN103793057B (en) Information processing method, device and equipment
CN105549813B (en) A kind of method for controlling mobile terminal and mobile terminal
CN104035677B (en) The display methods and device of prompt message
CN103246382B (en) Control method and electronic equipment
US20120154313A1 (en) Multi-touch finger registration and its applications
CN103324271B (en) A kind of input method and electronic equipment based on gesture
CN103034427A (en) Touch screen page turning method and device and touch screen equipment
EP3044660A1 (en) Multi-touch virtual mouse
CN103218044B (en) A kind of touching device of physically based deformation feedback and processing method of touch thereof
US20140049513A1 (en) Terminal and method for inputting to terminal using two opposite ends of stylus
CN104536607A (en) Input method and device of touch ring based on watch
CN107273009A (en) A kind of method and system of the quick screenshotss of mobile terminal
CN104077066A (en) Portable device and operation method
CN103164160A (en) Left hand and right hand interaction device and method
WO2013071198A2 (en) Finger-mapped character entry systems
CN107870705A (en) A kind of change method and device of the picture mark position of application menu
EP2400379A1 (en) Graphical control of a computer by a user
CN104461365A (en) Touch method and device of terminal
CN107132927A (en) Input recognition methods and device and the device for identified input character of character
CN202267933U (en) Mouse-imitating touch pad
CN103176723A (en) Processing method and processing device for touch response
CN104423657B (en) The method and electronic equipment of information processing

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 233, Kezhu Road, High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee after: VTRON GROUP Co.,Ltd.

Address before: No. 6, Cai Road, High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510663

Patentee before: VTRON TECHNOLOGIES Ltd.

CP03 Change of name, title or address
TR01 Transfer of patent right

Effective date of registration: 20201207

Address after: Unit 2414-2416, main building, no.371, Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee after: GUANGDONG GAOHANG INTELLECTUAL PROPERTY OPERATION Co.,Ltd.

Address before: No. 233, Kezhu Road, High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee before: VTRON GROUP Co.,Ltd.

Effective date of registration: 20201207

Address after: 214516 south side of North Second Ring Road, Chengbei Park, Jingjiang Economic Development Zone, Taizhou City, Jiangsu Province

Patentee after: JINGJIANG CHANGYUAN HYDRAULIC MACHINERY Co.,Ltd.

Address before: Unit 2414-2416, main building, no.371, Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee before: GUANGDONG GAOHANG INTELLECTUAL PROPERTY OPERATION Co.,Ltd.

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120523

CF01 Termination of patent right due to non-payment of annual fee