CN113467690A - Mouse control method and computing device - Google Patents

Mouse control method and computing device

Info

Publication number
CN113467690A
Authority
CN
China
Prior art keywords
mouse
moving
directions
determining
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110801693.4A
Other languages
Chinese (zh)
Inventor
刘阳明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Tongxin Software Technology Co ltd
Original Assignee
Chengdu Tongxin Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Tongxin Software Technology Co ltd filed Critical Chengdu Tongxin Software Technology Co ltd
Priority to CN202110801693.4A priority Critical patent/CN113467690A/en
Publication of CN113467690A publication Critical patent/CN113467690A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a mouse control method executed in a computing device, the computing device including an application whose window is displayed on the system desktop of the computing device. The method comprises the following steps: when an operation of pressing a mouse button and moving the mouse within the window is detected, recording a movement path according to the movement operation of the mouse; sequentially determining one or more movement directions according to the movement path; and, in response to an operation of releasing the mouse button, determining a corresponding control instruction according to the one or more movement directions, so as to execute the control instruction. The invention also discloses a corresponding computing device. According to this mouse control method, the corresponding control instruction can be triggered merely by operating the mouse, making the operation simpler and more convenient.

Description

Mouse control method and computing device
Technical Field
The present invention relates to the technical field of computers, and in particular to a mouse control method and a computing device.
Background
In current operating systems, the file manager offers only a single mode of operation when browsing a file directory: operations such as returning to the previous level, advancing to the next level, and switching tabs are mainly performed by clicking buttons on the graphical interface. A user familiar with the file manager may use the corresponding keyboard shortcuts to perform these operations, but neither of these conventional approaches is simple and direct.
The mainstream file managers on the market follow the most traditional interface interaction model, providing buttons corresponding to the various operations on a toolbar at the top of the window. For example, when a user browses file directories with a file manager and needs to switch between directories, the user must move their line of sight from the file browsing area to the toolbar above the window, find the back button, move the mouse from the browsing area to the toolbar, click the back button, move their line of sight from the toolbar back to the browsing area to continue browsing, move the mouse back to the browsing area, and only then continue with the next operation. Under this scheme, switching between file directories is cumbersome; in particular, when directories must be switched repeatedly to locate a file, the user has to move the line of sight and the mouse frequently and over large distances, resulting in a poor user experience.
In addition, in the prior art the current movement trend of the mouse is mainly judged from the mouse coordinates, and only four directions are distinguished: up, down, left and right. With this method, the direction judgment becomes ambiguous when the movement direction approaches 45 degrees, and the result may not match the user's expectation. Moreover, gestures built from only four directions allow few combinations, so the freedom to define custom gestures is low.
Based on this, a mouse control method for a file manager is needed to solve the problems in the above technical solutions.
Disclosure of Invention
To this end, the present invention provides a mouse control method in an attempt to solve or at least alleviate the above-identified problems.
According to an aspect of the present invention, there is provided a mouse control method, executed in a computing device including an application therein, the computing device having a system desktop adapted to display a window of the application, the method comprising the steps of: when detecting the operation of pressing a mouse button and moving the mouse in the window, recording a moving path according to the movement operation of the mouse; determining one or more moving directions in sequence according to the moving path; and responding to the operation of releasing the mouse button, and determining a corresponding control instruction according to the one or more moving directions so as to execute the control instruction.
Optionally, in the mouse control method according to the present invention, the method further includes: presenting one or more indicator icons corresponding to the one or more movement directions in the window.
Optionally, in the mouse control method according to the present invention, the step of sequentially determining one or more moving directions according to the moving path includes: determining the current moving direction according to the moving path, and displaying an indication icon corresponding to the current moving direction on a window; when the moving direction is changed, a new changed moving direction is determined, and an indication icon corresponding to the new moving direction is presented after the current indication icon.
Optionally, in the mouse control method according to the present invention, the step of determining a corresponding control instruction according to the one or more moving directions includes: determining corresponding control instructions according to the one or more moving directions and the sequence thereof; and displaying a control icon corresponding to the control instruction on the window.
Optionally, in the mouse control method according to the present invention, the step of determining a corresponding control instruction according to the one or more moving directions includes: obtaining a configuration file, wherein each item in the configuration file comprises a control instruction and one or more moving directions associated with the control instruction; and determining the control instruction corresponding to the one or more moving directions based on the configuration file.
Optionally, in the mouse control method according to the present invention, the step of determining one or more moving directions according to the moving path includes: acquiring two adjacent coordinate points when the mouse moves according to the moving path, and calculating a mouse moving vector according to the two coordinate points; calculating an included angle between the mouse movement vector and a preset direction vector; and determining a direction corresponding to the included angle from a plurality of preset directions as a moving direction.
Optionally, in the mouse control method according to the present invention, the preset plurality of directions includes: the upper direction, the lower direction, the left direction, the right direction, the upper left direction, the upper right direction, the lower right direction and the lower left direction.
Optionally, in the mouse control method according to the present invention, the window includes a mouse moving direction display list, and one or more indication icons are adapted to be sequentially displayed in the mouse moving direction display list.
Optionally, in the mouse control method according to the present invention, the step of recording a corresponding movement path according to a movement operation of the mouse further includes: and presenting a moving track corresponding to the moving path in the window.
Optionally, in the mouse control method according to the present invention, when an operation of pressing a mouse button and moving the mouse in the window is detected, the recording the movement path according to the movement operation of the mouse includes: and when the operation of pressing a right mouse button and moving the mouse in the window is detected, recording a corresponding moving path in real time according to the moving operation of the mouse.
Optionally, in the mouse control method according to the present invention, when it is detected that a mouse button is pressed in the window, a corresponding mousePressEvent function is called, and whether the right mouse button is pressed is determined through the mousePressEvent function; when it is determined that the right mouse button is pressed and the mouse is detected to move while the right button remains pressed, a mouseMoveEvent function is called, the movement path is recorded according to the movement operation of the mouse through the mouseMoveEvent function, one or more movement directions are sequentially determined according to the movement path of the mouse, and the order of the one or more movement directions is recorded; and when an operation of releasing the right mouse button is detected, a mouseReleaseEvent function is called, and the control instruction corresponding to the one or more movement directions and their order is determined through the mouseReleaseEvent function.
Optionally, in the mouse control method according to the present invention, the application includes a file manager adapted to execute the mouse control method.
According to an aspect of the present invention, there is provided a computing device comprising: at least one processor; and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor, the program instructions comprising instructions for performing the mouse control method as described above.
According to an aspect of the present invention, there is provided a readable storage medium storing program instructions which, when read and executed by a computing device, cause the computing device to perform the method as described above.
According to the technical scheme of the present invention, a mouse control method is provided in which the corresponding control instruction can be triggered by pressing the right mouse button and moving the mouse. Specifically, one or more movement directions and their order are associated with corresponding control instructions, so that when the mouse is moved while the right button is held down, the corresponding control instruction can be determined from the movement direction or directions and their order during the movement, and the control instruction is triggered to execute the corresponding operation. In this way, the corresponding control instruction can be triggered merely by operating the mouse, making the operation simpler and faster, thereby enhancing the usability and convenience of the application and improving the user's operating efficiency and experience.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device 100, according to one embodiment of the invention;
FIG. 2 shows a flow diagram of a mouse control method 200 according to one embodiment of the invention;
FIG. 3 shows a schematic diagram of a window 300 of an application in accordance with an embodiment of the invention; and
FIG. 4 is a diagram illustrating a curve of a cosine function according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 is a schematic block diagram of an example computing device 100.
As shown in FIG. 1, in a basic configuration 102, a computing device 100 typically includes a system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processor, including but not limited to: a microprocessor (µP), a microcontroller (µC), a digital signal processor (DSP), or any combination thereof. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP core), or any combination thereof. An example memory controller 118 may be used with the processor 104, or in some implementations the memory controller 118 may be an internal part of the processor 104.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more applications 122, and program data 124. In some implementations, the application 122 can be arranged to execute instructions on an operating system with program data 124 by one or more processors 104.
Computing device 100 also includes a storage device 132, storage device 132 including removable storage 136 and non-removable storage 138.
Computing device 100 may also include a storage interface bus 134. The storage interface bus 134 enables communication from the storage devices 132 (e.g., removable storage 136 and non-removable storage 138) to the basic configuration 102 via the bus/interface controller 130. At least a portion of the operating system 120, applications 122, and data 124 may be stored on removable storage 136 and/or non-removable storage 138, and loaded into system memory 106 via storage interface bus 134 and executed by the one or more processors 104 when the computing device 100 is powered on or the applications 122 are to be executed.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, or program modules in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or direct-wired connection, and various wireless media such as acoustic, radio frequency (RF), microwave, infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 100 may be implemented as a personal computer, including both desktop and notebook computer configurations. Of course, computing device 100 may also be implemented as part of a small-form-factor portable (or mobile) electronic device, such as a cellular telephone, a digital camera, a personal digital assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application-specific device, or a hybrid device that includes any of the above functions. It may even be implemented as a server, such as a file server, a database server, an application server, or a WEB server. The embodiments of the present invention are not limited thereto.
In an embodiment in accordance with the invention, the computing device 100 is configured to execute a mouse control method 200 in accordance with the invention. Among other things, the computing device 100 contains a plurality of program instructions for executing the mouse control method 200 of the present invention, which are adapted to be read and executed by a processor such that the mouse control method 200 of the present invention can be executed in the computing device 100.
According to an embodiment of the present invention, the computing device 100 includes a plurality of program instructions in an application for executing the mouse control method 200 of the present invention, such that the mouse control method 200 of the present invention can be executed in the application.
According to one embodiment, the application executing the mouse control method 200 of the present invention includes a file manager, such that the mouse control method 200 of the present invention may be executed in the file manager.
FIG. 2 shows a flow diagram of a mouse control method 200 according to one embodiment of the invention. The mouse control method 200 may be performed in a computing device, such as the computing device 100 described above. Computing device 100 includes one or more applications, and a window for the one or more applications may be displayed on a system desktop of the computing device.
As shown in fig. 2, the method 200 begins at step S210.
In step S210, when it is detected that a mouse button is pressed in the window of the application and the operation of moving the mouse is started, a corresponding moving path is recorded in real time according to the moving operation of the mouse. Here, the application may be implemented as a file manager, but the present invention is not limited thereto.
In one embodiment, the mouse button may be the right mouse button, so as to distinguish the gesture from ordinary left-button operations. Accordingly, when an operation of pressing the right mouse button in the window and moving the mouse is detected, the corresponding movement path is recorded in real time according to the movement operation of the mouse. Specifically, an operation of pressing the right mouse button within the window may first be received; then, when the mouse moves while the right button remains pressed, the movement path of the mouse is recorded in response to that movement operation.
FIG. 3 shows a schematic diagram of a window 300 of an application in accordance with an embodiment of the invention. According to an embodiment, when the corresponding movement path is recorded according to the movement operation of the mouse, a movement track corresponding to that path, such as the track shown by the dotted arrow in FIG. 3, may be drawn and presented in the window 300, so that the user can view the current movement state of the mouse in real time and adjust the movement direction promptly according to actual needs. A minimal drawing helper is sketched below.
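The following is a minimal sketch, in Qt/C++, of how such a trajectory could be drawn; the function name drawTrajectory, the pen color and the dash style are illustrative assumptions rather than details taken from the patent.

    // Illustrative sketch: draw the recorded movement path as a dashed polyline,
    // similar to the dotted arrow of FIG. 3. Intended to be called from the
    // window's paintEvent() with the points recorded while the right button is held.
    #include <QPainter>
    #include <QPen>
    #include <QVector>
    #include <QPoint>

    void drawTrajectory(QPainter &painter, const QVector<QPoint> &path)
    {
        if (path.size() < 2)
            return;                                          // nothing to draw yet
        painter.setRenderHint(QPainter::Antialiasing);
        painter.setPen(QPen(Qt::gray, 2, Qt::DashLine));     // dashed line, as in FIG. 3
        for (int i = 1; i < path.size(); ++i)
            painter.drawLine(path.at(i - 1), path.at(i));    // connect consecutive recorded points
    }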
Subsequently, in step S220, one or more moving directions are sequentially determined according to the moving path of the mouse.
It should be noted that the movement direction of the mouse may change once or several times during the movement, so that a plurality of movement directions may be generated; thus, the complete recorded movement path of the mouse may contain a plurality of movement directions.
According to one embodiment, sequentially determining one or more moving directions according to the moving path of the mouse may be performed according to the following method: and in the moving process of the mouse, determining the current moving direction in real time according to the moving path of the mouse, and determining a new moving direction after the change when the moving direction of the mouse is changed. Thus, each moving direction during the movement of the mouse is determined in turn according to the moving sequence of the mouse. Thus, the one or more moving directions determined according to the moving path of the present invention are arranged based on the moving order of the mouse.
According to an embodiment, when step S220 is performed, as shown in fig. 3, one or more indication icons corresponding to one or more moving directions may also be presented in the window 300. Here, one indication icon corresponds to each moving direction.
According to one embodiment, during the movement of the mouse, the current movement direction is determined in real time according to the movement path of the mouse, and an indication icon corresponding to the current movement direction is presented in the window 300. When the mouse moving direction is changed, a new moving direction after the change is determined, and a new indication icon corresponding to the new moving direction is added after the currently existing indication icon of the window 300, so that a new indication icon corresponding to the new moving direction is presented.
In one implementation, as shown in FIG. 3, the window 300 of the application includes a mouse movement direction display list, and one or more indication icons may be displayed in this list in sequence (in the order in which the movement directions occur during mouse movement). For example, in FIG. 3 the indication icons corresponding to the movement directions "right", "upper right" and "up" are arranged in the mouse movement direction display list in that order. Here, "right", "upper right" and "up" are the movement directions determined from the movement track in FIG. 3.
It should be understood that, during mouse movement, the indication icon corresponding to each movement direction is presented in real time in a fixed area of the window (the mouse movement direction display list), so that the user can intuitively see the current movement direction of the mouse in real time without moving their line of sight back and forth, knows the current operating state of the mouse, and can conveniently control the further movement of the mouse in time according to the current and target operating states.
Finally, in step S230, when an operation of releasing the mouse button (right button) is detected, a corresponding control instruction is determined according to one or more moving directions in response to the operation of releasing the mouse button, so as to execute the control instruction. Here, it should be noted that the key released in step S230 is the same key as the key pressed in the aforementioned step S210. In one embodiment, when the pressed button in step S210 is a right button, and the operation of releasing the right button of the mouse by the user is detected in step S230, a corresponding control instruction is determined according to one or more moving directions in response to the operation of releasing the right button of the mouse.
Here, in one embodiment, the corresponding control command is determined according to one or more moving directions and their sequence (the occurrence sequence of each moving direction during the mouse movement). In addition, after the control instruction is determined, a control icon corresponding to the control instruction may be displayed in the window so that the user can view the control instruction to be executed according to the mouse operation. For example, when it is determined that the control instruction is a return instruction, a "return" icon is displayed in the window 300, as shown in fig. 3.
According to one embodiment, the invention establishes a corresponding configuration file in advance based on the mapping relation between the moving direction of the mouse and the control command. Each entry in the profile includes a control instruction and one or more movement directions associated with the control instruction in a sequential order.
In one embodiment, when determining the corresponding control instruction according to one or more moving directions, a configuration file is first obtained, and the control instruction corresponding to the one or more moving directions and the sequence thereof is determined by querying the configuration file.
In one embodiment, the movement direction and control instructions are defined by enumeration when the configuration file is established. Specifically, a corresponding direction value is defined for each direction, and a corresponding control instruction value is defined for each control instruction. Furthermore, for each control instruction, one or more moving directions corresponding to the control instruction and arranged in sequence are determined, and a control instruction value corresponding to the control instruction is associated with a moving direction value corresponding to the one or more moving directions, so that an association relationship is established between the control instruction and the one or more moving directions. Therefore, when the control instruction corresponding to one or more moving directions in the mouse moving process is inquired based on the configuration file, the control instruction value matched with the moving direction value corresponding to one or more moving directions in the mouse moving process in the configuration file can be inquired, and the control instruction corresponding to the control instruction value is further determined.
For example, the correspondence between the moving direction and the direction value is shown in the following table:
Direction       Direction value
Up              0
Down            1
Left            2
Right           3
Upper right     4
Upper left      5
Lower right     6
Lower left      7
The correspondence between the control command and the control command value is shown in the following table:
(The table of control instructions and their control instruction values is provided as an image in the original publication and is not reproduced here; as the example below indicates, the return instruction corresponds to the control instruction value 0.)
For example, if the one or more movement directions corresponding to the return instruction are defined as "right" followed by "upper right", the association between the return instruction and "right", "upper right" can be established by matching the control instruction value "0" with the movement direction values "3, 4" in the configuration file. Then, whenever the return operation needs to be performed in the application window, the user only has to move the mouse first to the right and then to the upper right during the movement to trigger the return instruction, so that the corresponding return operation is performed in the window of the current application.
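To make the mapping concrete, the following is a minimal sketch of how such a configuration could be represented in code. The type names (Direction, Command), the table name kGestureTable and the second table entry are illustrative assumptions and are not taken from the patent; only the direction values 0-7 and the "right, upper right" -> return example come from the text above.

    // Illustrative sketch of the gesture-to-instruction mapping described above.
    #include <QVector>
    #include <QPair>

    enum class Direction { Up = 0, Down, Left, Right, UpRight, UpLeft, DownRight, DownLeft };
    enum class Command   { Back = 0, Forward, CloseTab, NewTab };   // values beyond Back are assumed

    // Each entry associates an ordered sequence of movement directions with a command,
    // mirroring one item of the configuration file described in the text.
    static const QVector<QPair<QVector<Direction>, Command>> kGestureTable = {
        { { Direction::Right, Direction::UpRight }, Command::Back },    // "right" then "upper right" -> return
        { { Direction::Left,  Direction::UpLeft  }, Command::Forward }  // purely illustrative second entry
    };

    // Linear lookup of the command matching the directions recorded during a drag.
    bool lookupCommand(const QVector<Direction> &dirs, Command *out)
    {
        for (const auto &entry : kGestureTable) {
            if (entry.first == dirs) {
                *out = entry.second;
                return true;
            }
        }
        return false;
    }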
According to one embodiment, determining one or more moving directions from the movement path may specifically be performed according to the following method:
During the movement of the mouse, two consecutive coordinate points on the movement path are obtained in real time from the recorded path, and a mouse movement vector is calculated from these two points, so that the movement direction can be determined from the movement vector. Further, the angle between the mouse movement vector and a predetermined direction vector (for example, the X-axis direction vector) is calculated. The corresponding movement direction can then be determined according to this angle; specifically, the direction corresponding to the angle is selected from a plurality of preset directions and taken as the movement direction.
Here, the present invention does not particularly limit the preset plurality of directions. It should be noted that the preset directions are chosen based on the practical requirement of triggering control instructions through the mouse movement direction: they should be easy to distinguish from one another, so that the corresponding movement direction can be recognized reliably from the mouse movement operation and the corresponding control instruction can be triggered accurately and conveniently. In one implementation, the preset plurality of directions includes, for example: up, down, left, right, upper left, upper right, lower right and lower left. Based on these eight preset directions, more combinations of mouse movement directions are supported and the degree of freedom is higher, so movement-direction combinations can be configured for control instructions more flexibly and more control instructions can be driven through the right-button shortcut operation of the mouse.
For example, when the mouse moves from coordinate point (1, 1) to coordinate point (4, 5), the mouse movement vector can be calculated as a = (3, 4). Since the invention only needs to determine the movement direction of the mouse and not the movement distance, for convenience of subsequent calculation the vector a = (3, 4) can be normalized into the unit vector a' = (0.6, 0.8). Then, the movement direction of the mouse is determined by calculating the angle between the mouse movement vector and the preset direction vector. For example, the predetermined direction may be the X-axis direction, with the preset direction vector b = (1, 0). The angle between vector a' and vector b is denoted θ. The dot-product formula for two vectors gives cos θ = (a' · b) / (|a'| |b|); since both vectors are normalized (unit length), this simplifies to cos θ = a' · b, that is, cos θ = x_a·x_b + y_a·y_b. In this way the cosine of the angle between the mouse movement direction and the X-axis direction can be calculated.
FIG. 4 is a diagram illustrating the curve of the cosine function according to an embodiment of the present invention. As shown in FIG. 4, one period of the cosine curve spans 0 to 2π, and the full circle of 2π is divided into 8 parts corresponding to the 8 movement directions of the mouse, each part corresponding to one direction; therefore, the current movement direction of the mouse can be determined to be one of the eight directions by checking the interval in which cos θ falls. It should be noted that only the intervals corresponding to the horizontal left and horizontal right directions have unique cosine values; the cosine values of the three upper intervals (upper right, up, upper left) coincide with those of the three lower intervals (lower right, down, lower left). Therefore, if cos θ falls in one of these intervals, the sign of the y component of the mouse movement vector is further checked: if y is greater than 0, the direction is one of upper right, up and upper left; if y is less than 0, it is one of lower right, down and lower left.
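As a concrete illustration of this step, the following is a minimal sketch of mapping a single displacement to one of the eight preset directions, reusing the Direction enumeration from the earlier sketch; the 22.5-degree sector boundaries and the handling of widget coordinates (where y grows downward) are implementation assumptions, not details stated in the patent.

    // Illustrative sketch: classify the displacement from prev to curr into one of
    // eight directions via the cosine of its angle to the X axis, as described above.
    #include <QPoint>
    #include <cmath>

    enum class Direction { Up = 0, Down, Left, Right, UpRight, UpLeft, DownRight, DownLeft };

    bool directionFromPoints(const QPoint &prev, const QPoint &curr, Direction *out)
    {
        const double dx = curr.x() - prev.x();
        // Widget coordinates grow downward, so negate dy to reason in the usual mathematical sense.
        const double dy = -(curr.y() - prev.y());
        const double len = std::hypot(dx, dy);
        if (len < 1.0)                      // ignore jitter-sized displacements
            return false;

        const double cosTheta = dx / len;   // dot product of the unit movement vector with (1, 0)
        const double c1 = 0.9239;           // cos(22.5 deg): boundary of the pure left/right sectors
        const double c2 = 0.3827;           // cos(67.5 deg): boundary of the diagonal sectors

        if (cosTheta >= c1)        *out = Direction::Right;
        else if (cosTheta <= -c1)  *out = Direction::Left;
        else if (cosTheta >= c2)   *out = (dy > 0) ? Direction::UpRight : Direction::DownRight;
        else if (cosTheta <= -c2)  *out = (dy > 0) ? Direction::UpLeft  : Direction::DownLeft;
        else                       *out = (dy > 0) ? Direction::Up      : Direction::Down;
        return true;
    }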
According to one implementation, the logic of the invention for responding to mouse operations is based on the mouse event response mechanism provided by the Qt development framework: when a mouse button is pressed, the mouse is moved, or a button is released within a window, a QMouseEvent mouse event is generated, which contains various parameters describing the mouse operation. Qt provides handler functions corresponding to the various mouse operation events, and different mouse operations trigger the corresponding functions. The handler functions corresponding to mouse operation events include: the mousePressEvent function corresponding to pressing a mouse button, the mouseReleaseEvent function corresponding to releasing a mouse button, the mouseDoubleClickEvent function corresponding to double-clicking a mouse button, and the mouseMoveEvent function corresponding to moving the mouse.
For example, in a window, when a mouse button is pressed, the mousePressEvent function is called; when the mouse moves, the mouseMoveEvent function is called; and when the mouse button is released, the mouseReleaseEvent function is called, after which moving the mouse no longer triggers the mouseMoveEvent function. When these functions are called, they receive a QMouseEvent event object, from which the current type of the mouse event and the current position of the mouse relative to the window can be obtained.
In addition, according to the event delivery rules of Qt, in a multi-level interface events are delivered from the lower level to the higher level. For example, if a window contains several child components and the left mouse button is pressed on one of them, the mousePressEvent function of the child component under the mouse is called first and receives the QMouseEvent event object; if that child component chooses not to handle the event, the mousePressEvent function of the window is then called.
In one embodiment, according to the mouse event response mechanism provided by Qt, when it is detected that a mouse button (the right button) is pressed in the window, the corresponding mousePressEvent function is called; whether the right mouse button is pressed is determined through the mousePressEvent function, and a state variable of the window is modified accordingly.
When it is determined that the right mouse button is pressed and the mouse is detected to start moving while the right button remains pressed, the mouseMoveEvent function corresponding to mouse movement is called. The mouseMoveEvent function records the corresponding movement path in real time according to the movement operation of the mouse and judges the movement direction of the mouse in real time from that path, so that one or more movement directions are determined in sequence according to the movement path of the mouse, and the order of the one or more movement directions is recorded.
When an operation of releasing the mouse button (the right button) is detected, the mouseReleaseEvent function is called, and the control instruction corresponding to the one or more movement directions and their order is determined through the mouseReleaseEvent function; that is, the configuration file is queried to check whether a control instruction corresponding to the one or more movement directions and their order exists, and if so, the subsequent control operation is triggered.
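Putting the above together, the following is a minimal sketch of this event flow in a Qt widget; the class name GestureWidget, the member names, and the helper functions directionFromPoints, lookupCommand and executeCommand (drawn from the earlier sketches) are illustrative assumptions rather than code from the patent.

    // Illustrative sketch of the right-button gesture flow described above, using
    // the standard Qt handlers mousePressEvent / mouseMoveEvent / mouseReleaseEvent.
    #include <QWidget>
    #include <QMouseEvent>
    #include <QVector>
    #include <QPoint>

    class GestureWidget : public QWidget
    {
    protected:
        void mousePressEvent(QMouseEvent *event) override
        {
            if (event->button() == Qt::RightButton) {        // only the right button starts a gesture
                m_tracking = true;
                m_path.clear();
                m_directions.clear();
                m_path.append(event->pos());
            }
            QWidget::mousePressEvent(event);
        }

        void mouseMoveEvent(QMouseEvent *event) override      // called while a button is held down
        {
            if (!m_tracking)
                return;
            m_path.append(event->pos());                      // record the movement path in real time
            Direction dir;
            if (directionFromPoints(m_path.at(m_path.size() - 2), m_path.last(), &dir)) {
                // append only when the direction changes, preserving the order of directions
                if (m_directions.isEmpty() || m_directions.last() != dir)
                    m_directions.append(dir);
            }
            update();                                         // repaint to show the trajectory and icons
        }

        void mouseReleaseEvent(QMouseEvent *event) override
        {
            if (event->button() == Qt::RightButton && m_tracking) {
                m_tracking = false;
                Command cmd;
                if (lookupCommand(m_directions, &cmd))        // query the configured gesture table
                    executeCommand(cmd);                      // e.g. go back, go forward, switch tab
                update();
            }
            QWidget::mouseReleaseEvent(event);
        }

    private:
        void executeCommand(Command cmd);                     // application-specific; not shown here

        bool m_tracking = false;
        QVector<QPoint> m_path;
        QVector<Direction> m_directions;
    };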
According to the mouse control method 200 of the present invention, the corresponding control instruction can be triggered by pressing the right mouse button and moving the mouse. Specifically, one or more movement directions and their order are associated with corresponding control instructions, so that when the mouse is moved while the right button is held down, the corresponding control instruction can be determined from the movement direction or directions and their order during the movement, and the control instruction is triggered to execute the corresponding operation. In this way, the corresponding control instruction can be triggered merely by operating the mouse, making the operation simpler and faster, thereby enhancing the usability and convenience of the application and improving the user's operating efficiency and experience.
A7. The method as in A6, wherein the preset plurality of directions includes: the upper direction, the lower direction, the left direction, the right direction, the upper left direction, the upper right direction, the lower right direction and the lower left direction.
A8. The method as in any one of A2-A7, wherein the window includes a mouse movement direction display list in which one or more indication icons are adapted to be displayed in sequence.
A9. The method according to any one of A1-A8, wherein the step of recording the corresponding movement path according to the movement operation of the mouse further comprises: presenting a movement track corresponding to the movement path in the window.
A10. The method according to any one of A1-A9, wherein, when an operation of pressing a mouse button and moving the mouse within the window is detected, recording the movement path according to the movement operation of the mouse includes: when an operation of pressing the right mouse button and moving the mouse within the window is detected, recording the corresponding movement path in real time according to the movement operation of the mouse.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, USB flash drives, floppy disks, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. The memory is configured to store the program code; the processor is configured to execute the mouse control method of the present invention according to the instructions in said program code stored in the memory.
By way of example, and not limitation, readable media may comprise readable storage media and communication media. Readable storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with examples of this invention. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (10)

1. A mouse control method executed in a computing device including an application therein, the computing device having a system desktop adapted to display a window of the application thereon, the method comprising the steps of:
when detecting the operation of pressing a mouse button and moving the mouse in the window, recording a moving path according to the movement operation of the mouse;
determining one or more moving directions in sequence according to the moving path; and
and responding to the operation of releasing the mouse button, and determining a corresponding control instruction according to the one or more moving directions so as to execute the control instruction.
2. The method of claim 1, further comprising the steps of:
presenting one or more indicator icons corresponding to the one or more movement directions in the window.
3. The method of claim 1 or 2, wherein the step of sequentially determining one or more moving directions from the moving path comprises:
determining the current moving direction according to the moving path, and displaying an indication icon corresponding to the current moving direction on a window;
when the moving direction is changed, a new changed moving direction is determined, and an indication icon corresponding to the new moving direction is presented after the current indication icon.
4. The method of any of claims 1-3, wherein determining the respective control instruction based on the one or more movement directions comprises:
determining corresponding control instructions according to the one or more moving directions and the sequence thereof; and
and displaying a control icon corresponding to the control instruction on the window.
5. The method of any of claims 1-4, wherein determining the respective control instruction based on the one or more movement directions comprises:
obtaining a configuration file, wherein each item in the configuration file comprises a control instruction and one or more moving directions associated with the control instruction;
determining the control instruction corresponding to the one or more moving directions based on the configuration file.
6. The method of any one of claims 1-5, wherein determining one or more directions of movement from the path of movement comprises:
acquiring two adjacent coordinate points when the mouse moves according to the moving path, and calculating a mouse moving vector according to the two coordinate points;
calculating an included angle between the mouse movement vector and a preset direction vector;
and determining a direction corresponding to the included angle from a plurality of preset directions as a moving direction.
7. The method of any one of claims 1-6,
when a mouse button is detected to be pressed in a window, calling a corresponding mousePressEvent function, and judging whether a right mouse button is pressed through the mousePressEvent function;
when it is determined that the right mouse button is pressed and it is detected that the mouse is moved in a state where the right mouse button is pressed, calling a mouseMoveEvent function, recording a movement path according to the movement operation of the mouse through the mouseMoveEvent function, sequentially determining one or more movement directions according to the movement path of the mouse, and recording the sequence of the one or more movement directions;
and when the operation of releasing the right mouse button is detected, calling a mouseReleaseEvent function, and determining a control instruction corresponding to one or more moving directions and the sequence thereof through the mouseReleaseEvent function.
8. The method of any one of claims 1-7, wherein the application comprises a file manager adapted to execute the mouse control method.
9. A computing device, comprising:
at least one processor; and
a memory storing program instructions, wherein the program instructions are configured to be adapted to be executed by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-8.
10. A readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the method of any of claims 1-8.
CN202110801693.4A 2021-07-15 2021-07-15 Mouse control method and computing device Pending CN113467690A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110801693.4A CN113467690A (en) 2021-07-15 2021-07-15 Mouse control method and computing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110801693.4A CN113467690A (en) 2021-07-15 2021-07-15 Mouse control method and computing device

Publications (1)

Publication Number Publication Date
CN113467690A true CN113467690A (en) 2021-10-01

Family

ID=77880525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110801693.4A Pending CN113467690A (en) 2021-07-15 2021-07-15 Mouse control method and computing device

Country Status (1)

Country Link
CN (1) CN113467690A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408824A (en) * 2008-11-18 2009-04-15 广东威创视讯科技股份有限公司 Method for recognizing mouse gesticulation
CN102402361A (en) * 2010-09-08 2012-04-04 腾讯科技(深圳)有限公司 Method and device for controlling on computer based on movement track of mouse
US20120124472A1 (en) * 2010-11-15 2012-05-17 Opera Software Asa System and method for providing interactive feedback for mouse gestures
CN102662581A (en) * 2012-03-31 2012-09-12 奇智软件(北京)有限公司 Method and system for performing control by mouse input
CN107450808A (en) * 2017-09-22 2017-12-08 北京知道创宇信息技术有限公司 The mouse pointer localization method and computing device of a kind of browser

Similar Documents

Publication Publication Date Title
CN105824559B (en) False touch recognition and processing method and electronic equipment
US8378989B2 (en) Interpreting ambiguous inputs on a touch-screen
JP5784551B2 (en) Gesture recognition method and touch system for realizing the method
US20160062649A1 (en) System and method for preview and selection of words
CN107665434B (en) Payment method and mobile terminal
EP2770419B1 (en) Method and electronic device for displaying virtual keypad
CN107084736A (en) A kind of air navigation aid and mobile terminal
US20150146986A1 (en) Electronic apparatus, method and storage medium
US11150797B2 (en) Method and device for gesture control and interaction based on touch-sensitive surface to display
WO2019015581A1 (en) Text deletion method and mobile terminal
US9317199B2 (en) Setting a display position of a pointer
CN111966260B (en) Window display method and computing device
CN107368249A (en) A kind of touch control operation recognition methods, device and mobile terminal
CN107515681B (en) A kind of character input method, mobile terminal and computer readable storage medium
TWI617971B (en) System and method for turning pages of an object through gestures
CN107368205B (en) Handwriting input method and mobile terminal
CN113467690A (en) Mouse control method and computing device
CN103809794A (en) Information processing method and electronic device
JP5735126B2 (en) System and handwriting search method
EP3210101B1 (en) Hit-test to determine enablement of direct manipulations in response to user actions
CN113467695B (en) Task execution method and device, computing device and storage medium
CN111752428A (en) Icon arrangement method and device, electronic equipment and medium
CN111078028A (en) Input method, related device and readable storage medium
US20190073117A1 (en) Virtual keyboard key selections based on continuous slide gestures
WO2023026567A1 (en) Information processing device, information processing method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20211001