CN117687557A - Keyboard and mouse traversal method and terminal device - Google Patents

Keyboard and mouse traversal method and terminal device

Info

Publication number
CN117687557A
Authority
CN
China
Prior art keywords
mouse
screen
input
module
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211075099.2A
Other languages
Chinese (zh)
Inventor
房家鹏
李鹏飞
靳遥宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211075099.2A
Publication of CN117687557A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W76/00Connection management
    • H04W76/10Connection setup
    • H04W76/14Direct-mode setup

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application provides a keyboard and mouse traversal method and a terminal device. The method includes: a first device establishes a connection with a second device; the first device displays a cursor corresponding to an input device on a first screen; the first device detects a first operation acting on the input device, the first operation being for dragging the cursor displayed on the first screen toward the second device; in response to the first operation, the first device controls the cursor to move out of the first screen and enter a second screen of the second device for display; the first device receives a second operation acting on the input device, the second operation being for inputting a first control instruction on the second device; the first device intercepts an input event corresponding to the second operation and sends the input event to the second device to instruct the second device to execute the first control instruction. With this keyboard and mouse traversal method, different terminal devices can be operated through one mouse and keyboard, achieving seamless access between the different terminal devices.

Description

Keyboard and mouse traversal method and terminal device
Technical Field
The application relates to the technical field of terminals, and in particular to a keyboard and mouse traversal method and a terminal device.
Background
With the development of terminal technology, the variety and number of terminal devices are increasing. Terminal devices may include cell phones, tablets, computers, wearable devices, and so on. Currently, a single terminal device often cannot meet a user's needs, and the user may use several terminal devices, such as a mobile phone, a tablet, and a computer, for work and study. However, when the user must handle services on multiple terminal devices, the user has to switch back and forth between different devices and their input methods, which is cumbersome and degrades the user experience.
Disclosure of Invention
The application provides a keyboard and mouse traversal method, with which different terminal devices can be operated through one mouse and keyboard, achieving seamless access between the different terminal devices.
In a first aspect, a keyboard and mouse traversal method is provided, applied to a first device, where the first device is connected to an input device for inputting control instructions and includes a first screen. The method includes: the first device establishes a connection with a second device; the first device displays a cursor corresponding to the input device on the first screen; the first device detects a first operation acting on the input device, the first operation being for dragging the cursor displayed on the first screen toward the second device; in response to the first operation, the first device controls the cursor to move out of the first screen and enter a second screen of the second device for display; the first device receives a second operation acting on the input device, the second operation being for inputting a first control instruction on the second device; the first device intercepts an input event corresponding to the second operation and sends the input event to the second device to instruct the second device to execute the first control instruction.
The first device may also be referred to as a terminal device, which is not limited in the embodiments of the present application. The first device may be a Windows device, i.e., a device on which the Windows operating system is deployed. The second device may be an Android device, i.e., a device on which the Android operating system is deployed.
The input device may be a keyboard and a mouse, referred to collectively in the embodiments of the present application as the keyboard and mouse. The cursor may be understood as a mouse pointer, although the present application is not limited thereto.
The first device and the second device may be connected through Wi-Fi or Bluetooth, where the Wi-Fi connection may be over a local area network or a peer-to-peer (P2P) link. That the first device establishes a connection with the second device may be understood as meaning that the first device can communicate with the second device.
After the first device and the second device are connected, they can support keyboard and mouse traversal: the cursor corresponding to the input device can be moved from the first screen of the first device to the second screen of the second device, and can also be moved from the second screen of the second device back to the first screen of the first device.
The first operation is for dragging the cursor displayed on the first screen toward the second device, which can be understood as the user using the input device to move the cursor corresponding to the input device onto the second screen.
In response to the first operation, the first device controls the cursor to move out of the first screen and enter the second screen of the second device for display; at this point, the input device can input control instructions on the second device. When the user inputs a control instruction on the second device through the input device, the first device receives the input event corresponding to the user's operation; the first device can intercept this input event and send the input event corresponding to the second operation to the second device, so as to instruct the second device to execute the first control instruction. The input event may be a click, a double click, a drag, or a move; the control instruction refers to the processing triggered by the click, double click, drag, or move.
With this keyboard and mouse traversal method, when the cursor corresponding to the input device connected to the first device moves to the second device, the first device can intercept the input event corresponding to the operation acting on the input device and send it to the second device, and the second device executes the corresponding control instruction. Different devices can therefore share one input device, the user no longer needs to switch between different input methods on different devices, cumbersome operation is avoided, and the user experience is improved.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: the first device receives a third operation acting on the input device, the third operation being for dragging the cursor displayed on the second screen toward the first device; in response to the third operation, the first device controls the cursor to move out of the second screen and enter the first screen for display; the first device stops intercepting input events corresponding to operations acting on the input device, so as to execute the control instructions corresponding to those operations.
The third operation is for dragging the cursor displayed on the second screen toward the first device, i.e., the user uses the input device to move the cursor from the second screen back to the first screen, and in response to the third operation the first device controls the cursor to move out of the second screen and enter the first screen for display. At this point, the input device can input control instructions on the first device. When the user inputs a control instruction on the first device through the input device, the first device stops intercepting the input event corresponding to the user's operation, and when it receives such an input event it executes the corresponding control instruction.
With this keyboard and mouse traversal method, when the cursor corresponding to the input device connected to the first device moves back (traverses back) from the second device to the first device, the first device can stop intercepting input events corresponding to operations acting on the input device and execute the corresponding control instructions itself, so that the input device can be switched back and forth between different devices and control whichever device it is switched to.
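The back-and-forth behavior described above can be modeled as a two-state switch on the first device: while the cursor is on the remote (second) screen, input events are intercepted and forwarded; once it crosses back, interception stops and events are handled locally. The following is a minimal, hypothetical sketch of that state machine; the class and method names are illustrative and do not come from the patent.

```python
class KmTraversal:
    """Tracks whether the shared keyboard/mouse currently controls
    the local (first) device or the remote (second) device."""

    def __init__(self, send_to_remote):
        self.on_remote = False           # is the cursor on the second screen?
        self.send_to_remote = send_to_remote

    def cross_to_remote(self):
        # First operation: cursor dragged off the first screen.
        self.on_remote = True

    def cross_back(self):
        # Third operation: cursor dragged back onto the first screen.
        self.on_remote = False

    def handle_event(self, event):
        """Return True if the event was intercepted and forwarded
        to the second device, False if it is handled locally."""
        if self.on_remote:
            self.send_to_remote(event)   # second device executes it
            return True                  # suppress local processing
        return False                     # first device handles it
```

A short usage walk-through: before traversal, a click is processed locally; after traversal, the same click is intercepted and forwarded; after crossing back, local processing resumes.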
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: the first device receives a fourth operation acting on the input device, the fourth operation being for inputting a control instruction that forcibly grabs the cursor back from the second device; in response to the fourth operation, the first device controls the cursor to be displayed on the first screen; the first device stops intercepting input events corresponding to operations acting on the input device, so as to execute the control instructions corresponding to those operations.
When the cursor of the input device is displayed on the second device, the user can forcibly grab the input device back through a shortcut key, i.e., cause the cursor of the input device to be displayed on the first device. When the first device receives the operation in which the user triggers the shortcut key for forcibly grabbing back the input device, it may, in response, display the cursor corresponding to the input device at a designated position on the screen of the first device. The shortcut key for forcibly grabbing back the input device may be Esc or Ctrl+Alt+Delete, and the designated position may be the middle of the screen, although the present application is not limited thereto.
When a cursor of the input device is displayed on the first device, the input device may input control instructions on the first device. When a user inputs a control instruction on the first device through the input device, the first device can stop intercepting an input event corresponding to the operation of the user, and when the input event corresponding to the operation of the user is received, the control instruction corresponding to the operation of the user can be executed.
With this keyboard and mouse traversal method, the user can forcibly move a cursor that has moved to another device back to the first device with a single control instruction, without having to drag it back manually, which improves efficiency. In addition, the method is applicable to more scenarios, for example, scenarios in which the user cannot see the cursor, so it can meet different user needs and improve the user experience.
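The forced-grab behavior above can be sketched as a shortcut check that runs before normal dispatch: if the recall shortcut is pressed while the cursor is on the remote device, interception stops and the cursor is recentered on the first screen. The function and state names below are illustrative assumptions, not the patent's implementation; the shortcut string is just an example value (the patent mentions Esc or Ctrl+Alt+Delete).

```python
GRAB_SHORTCUT = "Ctrl+Alt+Delete"   # example; Esc would also work per the text

def handle_key(state, key, screen_w, screen_h):
    """If the grab shortcut is pressed while the cursor is on the
    second device, recall the cursor to the middle of the first
    screen and stop intercepting input events."""
    if key == GRAB_SHORTCUT and state["on_remote"]:
        state["on_remote"] = False                        # stop intercepting
        state["cursor"] = (screen_w // 2, screen_h // 2)  # designated position
        return True    # shortcut consumed
    return False       # not a forced grab; dispatch normally
```

Pressing the shortcut a second time has no effect, since the cursor is already back on the first screen.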
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes: the first device detecting a fifth operation acting on the input device, the fifth operation being for inputting a second control instruction on the first device; in response to the fifth operation, the first device executes the second control instruction.
When the cursor of the input device is displayed on the first device, the input device may input a second control instruction on the first device. When the user inputs the second control instruction on the first device through the input device, the first device may execute the second control instruction.
With this keyboard and mouse traversal method, when the cursor of the input device moves from the second screen back to the first screen, or is forcibly grabbed back from the second screen and displayed on the first screen, the second control instruction input through the input device can be executed, so that the input device can be switched back and forth between different devices.
With reference to the first aspect, in some implementations of the first aspect, in response to the second operation, the first device intercepts an input event corresponding to the second operation, including: in response to the second operation, the first device intercepts an input event corresponding to the second operation through the hook.
A hook is part of the Windows message-handling mechanism: an application can install a subroutine on it to monitor certain messages for a given window, and the monitored window may have been created by another process. When a message arrives, the hook processes it before the target window's processing function does. The hook mechanism thus allows an application to intercept and process window messages or particular events. A hook is effectively a piece of message-processing code that is attached to the system through a system call. Whenever a particular message is sent, the hook captures it before it reaches the destination window, i.e., the hook function gains control first. The hook function can process (change) the message, pass it on without processing, or forcibly end its transmission.
The first device may install a mouse hook, a keyboard hook, and a shortcut hook to intercept the input event corresponding to the second operation. The keyboard hook is used to capture the user's input operations through the keyboard, for example, text entered via the keyboard. The mouse hook is used to capture the user's control operations through the mouse, for example, clicking or dragging with the mouse. The shortcut hook is used to capture the user's shortcut-key operations, for example, Ctrl+C, Ctrl+V, Alt+Tab, or Ctrl+Alt+Delete.
According to the mouse traversing method, the input event is intercepted through the hook, so that the first device does not respond to the input event, and other devices can respond to the input event to achieve the effect that the input device controls other devices.
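The hook behavior described above (capture a message before the target window, then either consume it or pass it down the chain) can be sketched generically. On Windows the real mechanism would use `SetWindowsHookEx` with low-level hooks such as `WH_MOUSE_LL` and `WH_KEYBOARD_LL`; the platform-independent sketch below only models the chaining and interception logic, and its names are illustrative.

```python
def run_hooks(hooks, message, deliver):
    """Pass a message through a chain of hook functions.

    Each hook may return True to swallow the message (it never
    reaches the target window) or False to let it continue down
    the chain.  Returns True if the message was delivered to the
    target window, False if a hook intercepted it."""
    for hook in hooks:
        if hook(message):
            return False         # message intercepted by this hook
    deliver(message)             # reached the target window's handler
    return True
```

In the traversal scenario, the mouse/keyboard hooks installed by the first device would swallow events while the cursor is on the second device, so the first device itself never responds to them.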
With reference to the first aspect, in some implementations of the first aspect, the first device sending, to the second device, an input event corresponding to the second operation includes: the first device packs the input event corresponding to the second operation through a preset protocol to obtain a packed input event; the first device sends the packaged input event to the second device.
The preset protocol may be a custom protocol and may define an arrangement rule for the data. The first device may package the input event corresponding to the second operation based on the preset protocol to obtain a packaged input event, and send the packaged input event to the second device.
An input event, for example a click event, may include information such as the click position, and the first device may package the information contained in the click event based on the preset protocol.
With this keyboard and mouse traversal method, the input event is packaged before transmission and the packaged input event is transmitted, which facilitates data transmission.
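The patent does not disclose the wire format of the preset protocol, so the following is only a hypothetical sketch of what packaging a click event into a fixed byte layout could look like: an event-type byte followed by the cursor coordinates, packed big-endian. All names and the layout itself are assumptions for illustration.

```python
import struct

# Hypothetical layout for a "preset protocol" record:
# event type (1 byte), x coordinate (2 bytes), y coordinate (2 bytes).
EVENT_FORMAT = ">BHH"
EVENT_CLICK = 1

def pack_click(x, y):
    """Package a click event for transmission to the second device."""
    return struct.pack(EVENT_FORMAT, EVENT_CLICK, x, y)

def unpack_event(data):
    """Reverse operation on the receiving (second) device."""
    etype, x, y = struct.unpack(EVENT_FORMAT, data)
    return {"type": etype, "x": x, "y": y}
```

A fixed binary layout like this keeps each event small (5 bytes here) and unambiguous on both ends, which is the point of packaging before transmission.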
With reference to the first aspect, in certain implementations of the first aspect, the first device detecting the first operation acting on the input device includes: the first device obtains the position of the cursor on the first screen; the first device determines, based on the position, whether the cursor exceeds the edge of the first screen; and if the cursor exceeds the edge of the first screen, the first device detects the first operation.
If the first screen has a coordinate system, the position of the cursor on the first screen is its coordinates in that system, and the first device can determine from the position whether the cursor is at the edge of the first screen. It will be appreciated that when the position of the cursor reaches a limit value (e.g., a maximum or minimum coordinate), the cursor is at the edge of the first screen.
In a scenario where the cursor traverses left or right across the first screen, the relevant edges of the first screen are its left edge and right edge. In a scenario where the cursor traverses across the top or bottom of the first screen, the relevant edges are its upper edge and lower edge.
The first device may determine that a traversal has occurred when the position of the cursor exceeds the edge of the screen.
With this keyboard and mouse traversal method, the first operation is detected through the position of the cursor, so that the cursor corresponding to the input device moving out of the first screen can be detected relatively accurately.
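The edge check described above can be sketched as a small function. The coordinate convention (origin at the top-left corner, coordinates from 0 to width-1 and height-1) and the function name are assumptions; the patent only specifies that a limit value of the cursor position marks an edge.

```python
def detect_crossing(x, y, width, height, direction="horizontal"):
    """Return which screen edge the cursor has reached, or None.

    'horizontal' covers the left/right traversal scenario;
    'vertical' covers the top/bottom traversal scenario."""
    if direction == "horizontal":
        if x <= 0:
            return "left"
        if x >= width - 1:
            return "right"
    else:
        if y <= 0:
            return "top"
        if y >= height - 1:
            return "bottom"
    return None
```

The first device would poll or be notified of the cursor position and, when this function returns an edge on the side where the second device sits, treat it as the first operation and begin the traversal.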
With reference to the first aspect, in some implementations of the first aspect, the first device runs a Windows system, and the second device runs an Android system.
In a second aspect, a terminal device is provided; the terminal device may also be referred to as a first device, to which an input device for inputting control instructions is connected, and the first device includes a first screen. The terminal device includes a processing module and a transceiver module. The processing module is configured to: establish a connection with a second device; display a cursor corresponding to the input device on the first screen; detect a first operation acting on the input device, the first operation being for dragging the cursor displayed on the first screen toward the second device; and, in response to the first operation, control the cursor to move out of the first screen and enter a second screen of the second device for display. The transceiver module is configured to: receive a second operation acting on the input device, the second operation being for inputting a first control instruction on the second device. The processing module is further configured to intercept an input event corresponding to the second operation, and the transceiver module is further configured to send the input event corresponding to the second operation to the second device, so as to instruct the second device to execute the first control instruction.
With reference to the second aspect, in some implementations of the second aspect, the transceiver module is further configured to: receive a third operation acting on the input device, the third operation being for dragging the cursor displayed on the second screen toward the first device. The processing module is further configured to: in response to the third operation, control the cursor to move out of the second screen and enter the first screen of the first device for display; and stop intercepting input events corresponding to operations acting on the input device, so as to execute the control instructions corresponding to those operations.
With reference to the second aspect, in some implementations of the second aspect, the transceiver module is further configured to: receive a fourth operation acting on the input device, the fourth operation being for inputting a control instruction that forcibly grabs the cursor back from the second device. The processing module is further configured to: in response to the fourth operation, control the cursor to be displayed on the first screen; and stop intercepting input events corresponding to operations acting on the input device, so as to execute the control instructions corresponding to those operations.
With reference to the second aspect, in certain implementations of the second aspect, the processing module is further configured to: detecting a fifth operation acting on the input device, the fifth operation being for inputting a second control instruction on the first device; in response to the fifth operation, the second control instruction is executed.
With reference to the second aspect, in certain implementations of the second aspect, the processing module is further configured to: in response to the second operation, intercepting an input event corresponding to the second operation through the hook.
With reference to the second aspect, in certain implementations of the second aspect, the processing module is further configured to: packaging the input event corresponding to the second operation through a preset protocol to obtain a packaged input event; the transceiver module is further configured to: and sending the packed input event to the second device.
With reference to the second aspect, in certain implementations of the second aspect, the processing module is further configured to: acquiring the position of a cursor on a first screen; judging whether the cursor exceeds the edge of the first screen or not based on the position; if the cursor exceeds the edge of the first screen, a first operation is detected.
With reference to the second aspect, in some implementations of the second aspect, the first device runs a Windows system, and the second device runs an Android system.
In a third aspect, the present application provides a terminal device comprising a processor coupled to a memory, operable to execute instructions in the memory to implement a method according to any one of the possible implementations of the first aspect. Optionally, the terminal device further comprises a memory. Optionally, the terminal device further comprises a transceiver, and the processor is coupled to the transceiver.
In a fourth aspect, the present application provides a processor comprising: input circuit, output circuit and processing circuit. The processing circuitry is configured to receive signals via the input circuitry and to transmit signals via the output circuitry such that the processor performs the method of any one of the possible implementations of the first aspect described above.
In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be transistors, gate circuits, flip-flops, various logic circuits, and the like. The input signal received by the input circuit may be received and input by, for example and without limitation, a receiver; the signal output by the output circuit may be output to, for example and without limitation, a transmitter and transmitted by the transmitter; and the input circuit and the output circuit may be the same circuit, which serves as the input circuit and the output circuit at different times. The specific implementations of the processor and the various circuits are not limited in this application.
In a fifth aspect, the present application provides a processing device comprising a processor and a memory. The processor is configured to read instructions stored in the memory and to receive signals via the receiver and to transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, the processor is one or more and the memory is one or more.
Alternatively, the memory may be integrated with the processor or the memory may be separate from the processor.
In a specific implementation process, the memory may be a non-transitory memory, for example a read-only memory (ROM); it may be integrated on the same chip as the processor or disposed on a separate chip. The type of the memory and the manner in which the memory and the processor are disposed are not limited in this application.
It should be appreciated that in a related data-interaction process, for example, sending indication information may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to the transmitter, and input data received by the processor may come from the receiver. The transmitter and receiver may be collectively referred to as a transceiver.
The processing device in the fifth aspect may be a chip, and the processor may be implemented in hardware or software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, and the memory may be integrated in the processor or located outside the processor and exist independently.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program (which may also be referred to as code or instructions) which, when run on a computer, causes the computer to perform the method of any one of the possible implementations of the first aspect.
In a seventh aspect, the present application provides a computer program product comprising a computer program (which may also be referred to as code or instructions) which, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a communication system to which embodiments of the present application are applicable;
FIG. 2 is a schematic diagram of keyboard and mouse traversal according to an embodiment of the present application;
FIG. 3 is a schematic diagram of forcibly grabbing back the mouse according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of a keyboard and mouse traversal method according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of a method for detecting the mouse position according to an embodiment of the present application;
FIG. 6 is a schematic diagram of establishing a screen coordinate system according to an embodiment of the present application;
FIG. 7 is a software architecture block diagram of a PC according to an embodiment of the present application;
FIG. 8 is a software architecture block diagram of an Android device according to an embodiment of the present application;
FIG. 9 is a schematic flowchart of another keyboard and mouse traversal method according to an embodiment of the present application;
FIG. 10 is a schematic flowchart of a method for a PC to intercept keyboard and mouse events according to an embodiment of the present application;
FIG. 11 is a schematic flowchart of a method for forcibly grabbing back the mouse according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a terminal device according to an embodiment of the present application;
FIG. 13 is a schematic diagram of another terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the present application will be described below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", and so on are used in the embodiments of the present application to distinguish between identical or similar items that have substantially the same function and effect. Those skilled in the art will appreciate that the words "first", "second", and so on do not limit the number or order of execution, and that items described as "first" and "second" are not necessarily different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of the following items" and similar expressions mean any combination of the listed items, including any combination of single items or plural items. For example, at least one of a, b and c may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b and c, where a, b and c may each be singular or plural.
The embodiment of the application provides a key mouse traversing method, which can operate different terminal devices through a mouse and a keyboard to realize seamless access among the different terminal devices.
For a better understanding of the embodiments of the present application, a description will first be given of a communication system to which the embodiments of the present application are applicable.
Fig. 1 shows a schematic diagram of a communication system 100. As shown in fig. 1, the communication system 100 includes a computer 101, a mobile phone 102, and a tablet 103. The operating system of the computer 101 is a Windows system, and the operating systems of the mobile phone 102 and the tablet 103 are Android (Android) systems. The computer 101 and the mobile phone 102 can realize mouse crossing, and the computer 101 and the tablet 103 can realize mouse crossing, that is, the computer 101, the mobile phone 102 and the tablet 103 can share a mouse and a keyboard. In this case, the computer 101 may be referred to as a source device or a transmitting end, and the mobile phone 102 and the tablet 103 may be referred to as a sink device or a receiving end, but the embodiment of the present application is not limited thereto.
After the mouse of the computer 101 passes through to the mobile phone 102 and/or the tablet 103, the user can use the mouse to click, drag or double-click on the mobile phone 102 and/or the tablet 103, and can use the keyboard to input information on the mobile phone 102 and/or the tablet 103.
Illustratively, FIG. 2 shows a schematic representation of a mouse traversal. As shown in the interface a of fig. 2, the computer 101 can implement a mouse-over with the mobile phone 102, that is, the mouse pointer 1011 displayed on the computer 101 can be passed over from the screen of the computer 101 to the screen of the mobile phone 102. When the computer 101 detects a crossing operation of the mouse pointer 1011, the b interface in fig. 2 may be displayed. As shown in the interface b of fig. 2, the mouse pointer 1011 may be displayed in the form of a circle on the screen of the mobile phone 102. The user can click on the memo icon through the mouse pointer 1011, and the mobile phone 102 can display the c interface in fig. 2 in response to the user clicking on the memo icon. As shown in interface c in fig. 2, the handset 102 displays a memo interface. The user can input information "items left unprocessed" on the memo interface through the keyboard, and the mobile phone 102 displays the text "items left unprocessed" on the memo interface in response to the user's operation of inputting information on the memo interface. The user may also pass the mouse pointer 1011 back from the screen of the mobile phone 102 to the screen of the computer 101, and the computer 101 detects an operation of the user to pass the mouse pointer 1011 back from the screen of the mobile phone 102 to the computer 101, and in response to the operation, may display the interface a in fig. 2.
It should be noted that, in this example, the mouse pointer 1011 passes from the computer 101 to the mobile phone 102, and is displayed in a circle on the mobile phone 102, and when the mobile phone 102 is externally connected to the mouse and/or the keyboard, the mobile phone 102 also displays the mouse pointer in a circle. In other examples, the mouse pointer may be displayed in other forms after traversing from the computer to the mobile phone, which is not limited herein.
After the mouse of the computer 101 passes through to the mobile phone 102 and/or the tablet 103, the user can forcibly recapture the mouse through a shortcut key. When the computer 101 detects that the user triggers the shortcut key for forcibly recapturing the mouse, it can, in response to the user's operation, display the mouse at a designated position on the screen of the computer 101. The shortcut key for forcibly recapturing the mouse may be Esc or Ctrl+Alt+Delete, and the designated position of the screen may be the middle of the screen, but the embodiment of the present application is not limited thereto.
Illustratively, FIG. 3 shows a schematic diagram of forcibly recapturing a mouse. As shown in fig. 3, the computer 101 can implement mouse traversal with the mobile phone 102. The mouse pointer 1011 is displayed on the screen of the mobile phone 102; when the computer 101 detects an operation of the user triggering Esc, in response to this operation, the mouse pointer 1011 is displayed in the middle of the screen of the computer 101.
In order to achieve the above-mentioned functions, the embodiment of the present application provides a mouse traversal method. After the mouse of the computer 101 traverses to the mobile phone 102 or the tablet 103, an event triggered by the user through the keyboard or the mouse (which may be referred to simply as a key mouse event) can be intercepted, that is, the computer 101 does not respond to the key mouse event; the key mouse event can instead be sent to the mobile phone 102 or the tablet 103, which responds to it, so that the computer 101, the mobile phone 102 and the tablet 103 can share the keyboard and the mouse. In order to meet the requirements of different users (for example, when the mouse traverses between different devices, the user may be unable to see the mouse on the screen, or the user may not need to operate the mouse on the android device and may instead want to control the PC through the mouse), the embodiment of the present application can also forcibly recapture the mouse through a shortcut key.
The method provided by the embodiment of the application can be applied to a personal computer (PC), such as a desktop computer, a notebook computer (e.g., the computer 101 described above), a small notebook computer, a tablet computer, an ultrabook and other devices. The software system of the personal computer is a Windows system. The device that implements mouse traversal with the personal computer may be a device deploying an android system, for example a mobile phone (e.g., the mobile phone 102 described above) or a tablet (e.g., the tablet 103 described above). A device deploying the android system may be referred to simply as an android device; for convenience of description, it is referred to as an android device in the following embodiments.
Illustratively, FIG. 4 shows a schematic flow chart of a method 400 for traversing a mouse. The method 400 is performed by a PC, such as the computer 101 described above.
As shown in fig. 4, the method 400 may include the steps of:
S401, under the condition that the PC has established a communication connection with the android device, the PC monitors the keyboard-and-mouse traversal.
The PC and the android device are logged in with the same account, and the PC can establish a communication connection with the android device. After the PC is connected with the android device, mouse traversal can be supported.

Mouse traversal means that the mouse pointer passes out of the screen of the PC and onto the screen of the android device.
S402, setting a keyboard hook, a mouse hook and a shortcut key hook.
The keyboard hook is used to acquire an input operation performed by the user through the keyboard; for example, the keyboard hook is used to acquire the operation of the user inputting "items left unprocessed" through the keyboard. The mouse hook is used to acquire a control operation performed by the user through the mouse, for example a click or drag operation. The shortcut key hook is used to acquire a shortcut key operation of the user, for example Ctrl+C, Ctrl+V, Alt+Tab or Ctrl+Alt+Delete.
A hook is part of the Windows message-handling mechanism: an application can set a subroutine on it to monitor certain messages for a given window, and the monitored window may be created by another process. When such a message arrives, the hook processes it before the target window's processing function does. The hook mechanism thus allows an application to intercept window messages or particular events. A hook is effectively a piece of message-processing code that is hung into the system through a system call. Whenever a particular message is issued, the hook program captures it before it reaches the destination window, that is, the hook function gains control first. The hook function can process (change) the message, can pass the message on without processing it, or can forcibly end the transmission of the message.
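As an illustration of this interception idea (a plain-Python simulation only; the real mechanism installs hooks through the Windows API, e.g. SetWindowsHookEx, which is not shown here), a minimal hook chain might look like:

```python
# Conceptual sketch of a hook chain: each hook sees a message before the
# target window does and may intercept it. Names are illustrative.

class HookChain:
    def __init__(self):
        self.hooks = []          # hook functions, called before the target window
        self.delivered = []      # messages that actually reached the target window

    def add_hook(self, fn):
        self.hooks.append(fn)

    def post_message(self, msg):
        for hook in self.hooks:
            if hook(msg):        # a truthy return means "intercept": stop here
                return False
        self.delivered.append(msg)   # no hook intercepted it; deliver to the window
        return True

chain = HookChain()
# While the mouse has traversed, intercept keyboard and mouse messages.
chain.add_hook(lambda msg: msg["type"] in ("keyboard", "mouse"))
chain.post_message({"type": "mouse", "action": "click"})   # intercepted
chain.post_message({"type": "shortcut", "keys": "Esc"})    # delivered
```

The same object model also covers S408: removing the lambda from `chain.hooks` restores normal delivery, mirroring the cancellation of the hooks.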
S403, when a key mouse operation triggered by the user is detected, acquiring a key mouse event.

For example, when the user clicks or drags on the android device through the mouse, inputs through the keyboard, or performs a shortcut key operation, the PC detects the click, drag, input or shortcut key operation, and in response the PC may acquire a key mouse event. The key mouse event comprises a click event, a drag event, an input event or a shortcut key event.
S404, judging whether the key mouse event is a keyboard event or a mouse event.

The PC judges whether the key mouse event is an input event or an event resulting from a control operation of the mouse.

If the key mouse event is a keyboard event or a mouse event, the PC may intercept it, that is, not respond to it, and execute S405. If the key mouse event is neither a keyboard event nor a mouse event, the PC may judge whether it is an event for forcibly recapturing the mouse, that is, execute S406.
S405, if the key mouse event is a keyboard event or a mouse event, intercepting the key mouse event.

After the PC intercepts the key mouse event, it can send the event to the android device, and after the android device receives the key mouse event, it can respond to it.
S406, if the key mouse event is neither a keyboard event nor a mouse event, judging whether the key mouse event is a preset shortcut key event.

The preset shortcut key event is an event for forcibly recapturing the mouse.

Illustratively, the event for forcibly recapturing the mouse may be an event resulting from the user triggering Esc or Ctrl+Alt+Delete.

If the key mouse event is the preset shortcut key event, the PC may forcibly end the traversal, that is, execute S407. If the key mouse event is not the preset shortcut key event, the PC may intercept it, that is, execute S405.
S407, if the key mouse event is the preset shortcut key event, forcibly ending the traversal.

The PC forcibly ends the traversal, that is, the PC forcibly recaptures the mouse from the screen of the android device and displays the mouse on the screen of the PC.

Illustratively, in the example shown in fig. 3 described above, when the computer 101 detects that the user triggers the shortcut key for forcibly recapturing the mouse, the mouse may be displayed at the designated position on the screen of the computer 101 in response to the user's operation.
S408, canceling the keyboard hook, the mouse hook and the shortcut key hook.

If the key mouse event is the preset shortcut key event, the PC can cancel the keyboard hook, the mouse hook and the shortcut key hook; that is, when the PC subsequently acquires a key mouse event, the event will not be intercepted, and the PC can respond to it.
The embodiment of the application provides a mouse traversal method: when the mouse of the PC traverses to the android device, the acquired key mouse event can be intercepted, that is, the PC does not respond to the key mouse event but sends it to the android device, which responds to it. In this way the PC and the android device share the mouse and the keyboard, both devices can be operated through the mouse and the keyboard, and seamless access between the PC and the android device is achieved, reducing the need for the user to switch back and forth between the devices, avoiding complex operations, and improving user experience.
In S401 above, with the communication connection established between the PC and the android device, the PC monitors the mouse traversal. The embodiment of the present application provides a method for detecting the position of the mouse, and the PC can monitor whether the mouse traverses based on this method.
Illustratively, FIG. 5 shows a schematic diagram of a method 500 for detecting a mouse position. As shown in fig. 5, the method 500 may include the steps of:
s501, acquiring the current position of the mouse.
The current position of the mouse, i.e. the coordinates of the mouse in the screen. The PC can create a two-dimensional coordinate system in the screen, and the coordinates of the mouse in the two-dimensional coordinate system are the current position of the mouse. The current position of the mouse may be represented by (XRaw, YRaw), but embodiments of the application are not limited thereto.
Illustratively, FIG. 6 shows a schematic diagram of establishing a screen coordinate system. As shown in fig. 6, the screen OABC establishes a coordinate system with the O point as the origin, the direction of OA as the positive X axis, and the direction of OC as the positive Y axis. The screen resolution is 520×1080: the sides OA and CB are each 520 pixels long, and the sides OC and AB are each 1080 pixels long. In this coordinate system, the coordinates of the O point are (0, 0), the coordinates of the A point are (520, 0), the coordinates of the B point are (520, 1080), and the coordinates of the C point are (0, 1080). The mouse position is the coordinates of the mouse in this coordinate system; for example, the mouse position may be (520, 960).
S502, when the movement of the mouse is detected, the displacement of the mouse is obtained.
The mouse displacement is the distance the mouse has moved. The mouse displacement may be expressed as (Lx, Ly). Lx and Ly may be positive or negative, which is not limited in the embodiment of the present application.
S503, calculating the position of the moved mouse based on the position of the mouse and the displacement of the mouse.
The position after the mouse movement can be represented by (curX, curY). The position of the mouse after movement can be expressed by the following formula: curx=xraw+lx; cury=yraw+ly.
S504, judging whether the position of the moved mouse exceeds the screen.
In a scene where the mouse passes through the screen in the left-right direction, if curX is smaller than the minimum value of the screen in the X axis or larger than the maximum value of the screen in the X axis, the PC can determine that the mouse exceeds the screen.
In a scene where the mouse passes over the screen in the up-down direction, if curY is smaller than the minimum value of the screen in the Y axis or larger than the maximum value of the screen in the Y axis, the PC can determine that the mouse exceeds the screen.
If the position of the moved mouse is beyond the screen, the PC may determine that the mouse passes, i.e., execute S505. If the position of the moved mouse does not exceed the screen, the position of the moved mouse may be determined as the current position, and the movement of the mouse is continuously acquired, that is, S502 is executed.
S505, if the position of the moved mouse exceeds the screen, the mouse crossing can be determined.
The PC may also determine a crossing direction of the mouse, for example, the crossing direction may be a left crossing or a right crossing in a scene where the mouse crosses in the left-right direction of the screen, and the crossing direction may be an up crossing or a down crossing in a scene where the mouse crosses in the up-down direction of the screen.
For example, in the example shown in fig. 6, in the scene where the mouse traverses in the left-right direction of the screen, if curX is smaller than 0, the PC may determine that the traversal direction of the mouse is leftward; if curX is greater than 520, the PC may determine that the traversal direction is rightward. In the scene where the mouse traverses in the up-down direction of the screen, if curY is smaller than 0, the PC may determine that the traversal direction is upward; if curY is greater than 1080, the PC may determine that the traversal direction is downward.
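The calculations of S501 to S505 can be sketched as follows (the screen size follows the FIG. 6 example; the function name and return convention are illustrative, not the patent's implementation):

```python
# Sketch of method 500: update the pointer position from a displacement
# (curX = XRaw + Lx, curY = YRaw + Ly) and decide whether, and in which
# direction, the pointer left a 520x1080 screen.

SCREEN_W, SCREEN_H = 520, 1080

def move_and_check(x_raw, y_raw, lx, ly):
    cur_x, cur_y = x_raw + lx, y_raw + ly     # S503: position after the move
    if cur_x < 0:
        return cur_x, cur_y, "left"           # traversed past the left edge
    if cur_x > SCREEN_W:
        return cur_x, cur_y, "right"
    if cur_y < 0:
        return cur_x, cur_y, "up"
    if cur_y > SCREEN_H:
        return cur_x, cur_y, "down"
    return cur_x, cur_y, None                 # still on screen: keep tracking

# e.g. move_and_check(510, 500, 20, 0) -> (530, 500, "right")
```

When `None` is returned, the caller would take the new coordinates as the current position and continue with S502, as the flow in fig. 5 describes.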
According to the method for detecting the mouse crossing, whether the mouse exceeds the screen or not can be determined through the position and the displacement of the mouse, the crossing direction is determined, and accuracy of detecting the mouse crossing is improved.
In order to better understand the embodiments of the present application, the following describes the software structures of the PC and the android device in the embodiments of the present application.
Illustratively, fig. 7 shows a software architecture block diagram of a PC. As shown in fig. 7, the PC includes a service layer, a capability layer, a driver layer, and a hardware layer.
The business layer comprises a User Interface (UI) module, a setting management module, a state management module, a main service module, a direction identification module, a discovery connection module, a rule management module, a mouse business module and a dragging management module.
The UI module is used for providing a visual interface for the user. For example, the UI module may display the main interface shown in fig. 1 described above. The setting management module is used for managing the setting capability of the computer manager. The state management module is used for managing information on traversable devices and the traversal direction of the key mouse. For example, the state management module may determine the number, location and traversal direction of the traversable devices. The main service module can be used for providing an autonomously initiated service for the key mouse service. The direction recognition module is used for calling the direction recognition capability to recognize the direction of the android device relative to the PC, monitor direction changes and save the direction information. The discovery connection module is used for discovering connectable android devices and connecting to them. The rule management module is used for managing the rules for connecting the android device and the rules for changing the traversal direction. The key mouse service module is used for calling, triggering or closing the key mouse virtualization capability. The drag management module is used for monitoring drag operations of the user.
The capability layer comprises a direction recognition capability, a key mouse virtualization capability module, a key mouse event detection module, a device management middleware module, a communication module (magic link), a PC manager framework, an account system, a trust ring, a comprehensive sensing information processing platform (multimodal sensor data platform, MSDP), Bluetooth, an audio system and a storage system. The key mouse virtualization capability module is used for acquiring and transmitting key mouse events, and can also adapt the position and moving speed of the mouse for devices with different resolutions. The key mouse event detection module is used for acquiring key mouse events. The device management middleware module is used to provide inter-device data sharing capabilities, such as inter-device service switching. The communication module is used to support establishing a connection channel between devices and transmitting data. The PC manager framework is used to manage PC manager options. The account system is used to acquire the login account of the PC. The trust ring is used to indicate that different devices are logged in with the same account and are connected through Bluetooth or an ad hoc network. The MSDP is used to detect the change of a device from motion to stationary. The audio system is used to manage the playback of high-performance audio. The storage system is used to store data.
The key mouse virtualization capability module comprises a key mouse virtualization service module, a key mouse interception module, a key mouse grabbing module, an edge calculation module, a conversion module, a protocol management module and a data transmission module. The key mouse virtualization service module is the entry of the key mouse virtualization service; it can provide start and end services for the development interface of the key mouse service module, and schedule the relationships and state management of the key mouse event modules. The key mouse interception module is used to prevent the PC side from continuing to respond to key mouse events once the mouse traversal is completed. The key mouse grabbing module is used to acquire key mouse events. The edge calculation module is used to judge, based on the position of the mouse, whether the mouse is at the edge. The conversion module is used to match the coordinates of the PC and the android device; for example, it converts the traversal position of the mouse on the PC screen into a position on the screen of the android device, and converts the moving speed of the mouse on the PC screen into a moving speed on the screen of the android device. The protocol management module is used to package the intercepted key mouse event with a preset protocol and send the packaged key mouse event to the data transmission module. The data transmission module is used to transmit the packaged key mouse event to the android device.
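As an illustration of the conversion module's coordinate adaptation for devices with different resolutions, a sketch follows (the linear scaling along the shared edge and the function name are assumptions; the patent does not give the mapping formula):

```python
# Sketch: map the traversal point on the PC's edge to the android screen by
# scaling proportionally along that edge. Purely illustrative.

def map_crossing_point(y_pc, pc_height, android_height):
    """Map a y coordinate on the PC's side edge to the android device's edge."""
    return round(y_pc * android_height / pc_height)

# e.g. a crossing at y=540 on a 1080-pixel-tall PC screen lands at y=1170
# on a 2340-pixel-tall phone screen: map_crossing_point(540, 1080, 2340) -> 1170
```

The same proportional idea would apply to adapting the pointer's moving speed between the two screens.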
The driver layer includes bluetooth drivers, audio drivers, wifi drivers, and human interface device (human interface device, HID) drivers.
The hardware layer includes an accelerometer and a gyroscope. The PC can determine the motion change of the device from the acceleration data acquired by the accelerometer and the angular acceleration data acquired by the gyroscope.
Fig. 8 shows a software architecture block diagram of an android device. As shown in fig. 8, the android device includes an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer.
The application layer comprises: a UI module, a setting management module, a state management module, a main service module, a direction identification module, a discovery connection module, a key mouse service module, a direction recognition capability, a key mouse virtualization capability module, a key mouse event detection module, a device management middleware module, a communication module (magic link), an account system, a trust ring, a service framework and a drag management module.
The UI module is used for providing a visual interface for the user. For example, the UI module is used to display the main interface, the interfaces of application programs, and the like. The setting management module can be used to set the key mouse functions. The state management module is used for managing state changes in the traversal process, for example whether traversal has occurred or not. The main service module can be used for providing an autonomously initiated service for the key mouse service. The direction recognition module is used for providing the direction recognition capability. The discovery connection module is used for connection capability scheduling and data communication in the key mouse service. The rule management module is used for managing the rules for connecting the android device and the rules for changing the traversal direction. The key mouse service module is used for calling, triggering or closing the key mouse virtualization capability. The drag management module is used for calling the drag framework to monitor drag operations of the user. The service framework is used for managing the registration of the key mouse service capability switch. The key mouse virtualization capability module is used for providing the key mouse virtualization capability. The key mouse event detection module is used for acquiring key mouse events. The device management middleware is used to provide inter-device data sharing capabilities. The communication module is used to provide a connection channel between devices and to transmit data.
It should be noted that the communication module is the module responsible for communication between different devices: data obtained by any other module and destined for another device must first be transmitted to the communication module, which then transmits it to the other device. In the following embodiments, in order to simplify the steps, wherever a module other than the communication module appears to communicate directly with a module on another device, this should be understood as follows: the module transmits the data to its communication module, the communication module transmits the data to the communication module of the other device, and the communication module of the other device then transmits the data to the module that needs it.
The key mouse virtualization capability module comprises a key mouse virtualization service module, an edge calculation module, a conversion module, a protocol parsing management module, a data transmission module and a virtual device management module. The functions of the key mouse virtualization service module, the edge calculation module and the conversion module are similar to those in fig. 7 and are not described again here. The protocol parsing management module is used to parse the packaged key mouse event to obtain the key mouse event. The data transmission module is used to receive the packaged key mouse event and transmit it to the protocol parsing management module. The virtual device management module is the operation interface module of virtual input (Uinput); it is used to create and close virtual key mouse input devices, provide a key mouse event control interface, and perform key mouse event injection control.
It should be noted that when the key mouse virtualization capability module transmits data, the data must first be transmitted to the key mouse service module and then to the other modules. In order to simplify the steps, wherever the key mouse virtualization capability module appears to transmit data directly to another module, this should be understood as follows: the key mouse virtualization capability module first transmits the data to the key mouse service module, and the key mouse service module then transmits the data to the other module.
The application framework layer comprises a drag framework, a multi-screen framework, an input management module and a storage system. The dragging frame is used for providing the dragging capability of the super mouse during crossing. The input management module is used for providing reporting of the key mouse event, wherein the input management module can comprise a key mouse frame and an input frame, the key mouse frame is used for reporting the key mouse event, and the input frame is used for reporting the input event. The storage system is used for storing data. The multi-screen frame is used for providing multi-screen operation of the key mouse.
The Android runtime and system libraries include core libraries, a virtual machine, a surface manager, media libraries, and a three-dimensional graphics processing library. The Android runtime is responsible for scheduling and management of the android system while it runs. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system libraries may contain modules for a number of functions, such as the surface manager, the media libraries, the three-dimensional graphics processing library, etc.
The surface manager is used to manage the display subsystem and provides a fusion of the two-dimensional and three-dimensional layers for the plurality of applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware so that the hardware works. The kernel layer includes bluetooth driver, audio driver, wifi driver, virtual input driver (Uinput driver), and HID driver. Wherein the virtual input driver is used to inject input events.
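For illustration, the record that a Linux virtual input (Uinput) device consumes when an event is injected can be sketched as follows (a 64-bit `struct input_event` layout is assumed, so field sizes vary by architecture; the type/code constants shown are the standard Linux values, but the packing function itself is a sketch, not the patent's driver code):

```python
# Sketch of the Linux input_event record used for event injection:
#   struct input_event { struct timeval time; __u16 type; __u16 code; __s32 value; }
import struct

EV_SYN, EV_KEY = 0x00, 0x01       # standard Linux event types
KEY_A, SYN_REPORT = 30, 0         # standard Linux codes

def input_event(ev_type, code, value, sec=0, usec=0):
    # "<qqHHi": 8-byte sec, 8-byte usec (64-bit timeval), u16 type,
    # u16 code, s32 value -> 24 bytes per record on 64-bit.
    return struct.pack("<qqHHi", sec, usec, ev_type, code, value)

# A key press is an EV_KEY event followed by an EV_SYN report frame marker.
press_a = input_event(EV_KEY, KEY_A, 1) + input_event(EV_SYN, SYN_REPORT, 0)
```

In a real driver these bytes would be written to the uinput device node; here only the record layout is shown.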
For a better understanding of the embodiments of the present application, the methods provided by the embodiments of the present application are specifically described in connection with the modules related to fig. 7 and 8.
The embodiment of the application provides a specific implementation method for crossing a mouse.
Illustratively, FIG. 9 shows a schematic flow chart of a mouse traversing method 900. The method 900 may be applied to a communication system including a PC and an android device, such as the communication system 100 described above, but embodiments of the present application are not limited thereto. The software architecture diagram of the PC may be as shown in fig. 7, and the software architecture diagram of the android device may be as shown in fig. 8, which is not limited thereto.
The method 900 may include the steps of:
S901, the key mouse virtualization capability module of the PC detects the operation of the user moving the mouse and monitors whether the mouse traverses.
The method for monitoring the crossing of the mouse may refer to the above-mentioned method 500, and will not be described herein.
S902, a key mouse virtualization capability module of the PC sends indication information of the key mouse crossing to a key mouse service module of the android device, and correspondingly, the key mouse service module of the android device receives the indication information of the key mouse crossing.
The keyboard and mouse virtualization capability module of the PC informs the android device of the crossing of the keyboard and mouse through indication information of the crossing of the keyboard and mouse.
S903, based on the indication information of the keyboard and mouse crossing, the keyboard and mouse service module of the android device sends the indication information of the keyboard and mouse crossing to the keyboard and mouse virtualization capability module of the android device; correspondingly, the keyboard and mouse virtualization capability module of the android device receives the indication information of the keyboard and mouse crossing.
S904, the keyboard and mouse virtualization capability module of the android device updates the crossing state based on the indication information of the crossing of the keyboard and mouse.
The mouse virtualization capability module of the android device can update the crossing state into crossing based on the indication information of mouse crossing.
S905, the mouse virtualization capability module of the PC intercepts the monitored mouse event.
The mouse traverses from the PC to the android device, and the keyboard and mouse virtualization capability module of the PC intercepts the monitored keyboard and mouse event, so that the PC does not respond to the keyboard and mouse event.
S906, the keyboard and mouse virtualization capability module of the PC packages the keyboard and mouse event to obtain a packaged keyboard and mouse event.
The keyboard and mouse virtualization capability module of the PC may package the keyboard and mouse event according to a preset protocol to obtain the packaged keyboard and mouse event.
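The patent does not disclose the wire format of the preset protocol. As an illustrative, non-limiting sketch, the packaging and parsing steps (S906 on the PC side, S908 on the android side) could use a fixed binary frame like the one below; the field layout, event-type constants, and function names are all assumptions introduced here for illustration.

```python
import struct

# Hypothetical frame layout (the actual "preset protocol" is not disclosed):
# 1-byte event type, 2-byte key/button code, two signed 4-byte cursor deltas,
# all big-endian. struct.calcsize("!Bhii") == 11 bytes per frame.
FRAME_FORMAT = "!Bhii"

EVENT_MOUSE_MOVE = 1  # illustrative constants, not from the patent
EVENT_KEY_DOWN = 2

def pack_event(event_type: int, code: int, dx: int = 0, dy: int = 0) -> bytes:
    """Package a keyboard and mouse event into the assumed binary frame (S906)."""
    return struct.pack(FRAME_FORMAT, event_type, code, dx, dy)

def unpack_event(frame: bytes) -> tuple:
    """Parse a packaged frame back into its fields (the android-side step S908)."""
    return struct.unpack(FRAME_FORMAT, frame)
```

A frame produced by `pack_event` round-trips through `unpack_event` unchanged, which is the property the PC and android sides would rely on when exchanging events.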
S907, the keyboard and mouse virtualization capability module of the PC sends the packaged keyboard and mouse event to the keyboard and mouse virtualization capability module of the android device, and correspondingly, the keyboard and mouse virtualization capability module of the android device receives the packaged keyboard and mouse event.
Optionally, the keyboard and mouse virtualization capability module of the PC may encrypt the packaged keyboard and mouse event before sending it, so as to prevent the keyboard and mouse event from being revealed.
S908, the keyboard and mouse virtualization capability module of the android device parses the packaged keyboard and mouse event, obtains the keyboard and mouse event, and responds to the keyboard and mouse event.
The mouse traverses from the PC to the android device; the android device can obtain the keyboard and mouse event by parsing the packaged keyboard and mouse event and respond to it. In this way, the PC and the android device realize keyboard and mouse traversal.
S909, a mouse virtualization capability module of the android device monitors that the mouse passes through.
In the process that the key mouse acts on the android device, the key mouse virtualization capability module of the android device continuously monitors the crossing of the key mouse.
After the mouse traverses to the android device, it may also traverse back. The operation of traversing back, i.e., moving the mouse from the android device back to the PC, is described in the continued steps below.
S910, the keyboard and mouse virtualization capability module of the android device sends a message of ending the traversing to the keyboard and mouse service module of the PC, and correspondingly, the keyboard and mouse service module of the PC receives the message of ending the traversing.
The message of ending the traversal indicates that the keyboard and mouse traversal from the PC to the android device is to stop.
S911, based on the message of ending the traversal, the keyboard and mouse service module of the PC sends the message of ending the traversal to the keyboard and mouse virtualization capability module of the PC; correspondingly, the keyboard and mouse virtualization capability module of the PC receives the message of ending the traversal.
S912, stopping intercepting the monitored mouse event by the mouse virtualization capability module of the PC based on the message of ending the crossing.
The keyboard and mouse pass back to the PC from the android device, and the PC responds to the keyboard and mouse operation, so that the keyboard and mouse virtualization capability module of the PC can stop intercepting monitored keyboard and mouse events.
S913, the mouse virtualization capability module of the PC updates the crossing state based on the crossing ending message.
The mouse virtualization capability module of the PC may update the traverse state from traversed to non-traversed based on the message ending the traverse.
The execution order of S912 and S913 is not limited in the embodiment of the present application.
S914, the keyboard and mouse virtualization capability module of the PC sends a setting completion message to the keyboard and mouse virtualization capability module of the android device, and correspondingly, the keyboard and mouse virtualization capability module of the android device receives the setting completion message.
The setup completion message is used to indicate that interception of the monitored mouse event has stopped and the crossing state has been updated.
S915, the mouse virtualization capability module of the android device updates the traversing state based on the setting completion message.
When the mouse has traversed to the android device, the traversal state is traversed; when the mouse traverses back to the PC, the keyboard and mouse virtualization capability module of the android device can update the traversal state from traversed to not traversed based on the setting completion message.
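The crossing-state bookkeeping described in S904, S913, and S915 can be modeled as a two-state tracker kept by each side. The sketch below is illustrative only; the class and state names are assumptions, and the real modules may track additional information.

```python
from enum import Enum

class TraverseState(Enum):
    NOT_TRAVERSED = "not_traversed"
    TRAVERSED = "traversed"

class TraverseTracker:
    """Minimal model of the crossing-state updates in S904/S913/S915."""

    def __init__(self):
        # Before any crossing, the keyboard and mouse act on the local device.
        self.state = TraverseState.NOT_TRAVERSED

    def on_traverse_indication(self):
        # Indication information of the keyboard and mouse crossing received:
        # update the state to traversed (S904/S907).
        self.state = TraverseState.TRAVERSED

    def on_setup_complete(self):
        # Setting completion message received: the mouse has traversed back,
        # so the state returns to not traversed (S913/S915).
        self.state = TraverseState.NOT_TRAVERSED
```

Each device consults this state to decide whether incoming keyboard and mouse events should be handled locally or forwarded.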
According to the keyboard and mouse traversing method described above, after the keyboard and mouse traversal is triggered, the traversal position can be adjusted for different devices; after the traversal succeeds, keyboard and mouse events are intercepted on the PC and responded to on the android device side, thereby realizing keyboard and mouse traversal.
For S905 above, in which the keyboard and mouse virtualization capability module of the PC intercepts the monitored keyboard and mouse event, an embodiment of the present application provides a specific implementation.
Illustratively, FIG. 10 shows a schematic flow chart of a method 1000 for a PC to intercept keyboard and mouse events. The method may be performed by the keyboard and mouse virtualization capability module of the PC and the keyboard and mouse event detection module of the PC. The keyboard and mouse virtualization capability module of the PC includes a hidden window module, a keyboard and mouse management module, an edge calculation module, and a keyboard and mouse interception module.
As shown in fig. 10, the method 1000 may include the steps of:
S1001, the keyboard and mouse management module sends indication information for creating a hidden window to the hidden window module; correspondingly, the hidden window module receives the indication information for creating the hidden window.
The indication information for creating the hidden window is used to instruct the hidden window module to create a hidden window.
S1002, the hidden window module creates a hidden window based on the indication information for creating the hidden window.
A hidden window may be understood as a window that runs in the background of the PC. The hidden window is used to receive keyboard and mouse events. A keyboard or mouse event is one type of Windows message. In Windows programming, after each Windows application starts executing, the system creates a message queue for the program, which is used to store messages for the windows created by the application.
Illustratively, when the user presses the right mouse button, the PC generates a WM_RBUTTONDOWN message, which the system automatically places in the message queue of the application to which the hidden window belongs, where it waits to be processed by the application. Windows places generated messages in the message queue in order, and the application continuously reads messages from the message queue through a message loop and responds to them. After the hidden window module has successfully created the hidden window, it may receive WM_INPUT messages.
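The message-queue mechanism described above can be sketched as a toy model: messages are queued per application and dispatched in order by a message loop. The message identifiers below (WM_INPUT, WM_RBUTTONDOWN) are the real values from the Win32 headers, but the loop itself is a simplified illustration, not the actual Windows implementation.

```python
from collections import deque

# Real Windows message identifiers (values from the Win32 headers).
WM_INPUT = 0x00FF
WM_RBUTTONDOWN = 0x0204

def run_message_loop(queue: deque, handlers: dict) -> list:
    """Toy model of a per-application Windows message loop: messages are
    read from the queue in order and dispatched to the matching handler."""
    results = []
    while queue:
        msg = queue.popleft()
        handler = handlers.get(msg)
        if handler is not None:
            results.append(handler(msg))
    return results
```

A hidden window that registered for raw input would see its WM_INPUT handler invoked in queue order, which is how the hidden window module reads mouse positions in the steps that follow.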
S1003, in response to the user's operation of moving the mouse, the hidden window module acquires the mouse position S_A through the hidden window.
The hidden window module can read the mouse position S_A in the hidden window.
S1004, the hidden window module sends the mouse position S_A to the keyboard and mouse management module; correspondingly, the keyboard and mouse management module receives the mouse position S_A.
S1005, the keyboard and mouse management module sends indication information of edge detection to the edge calculation module based on the mouse position S_A; correspondingly, the edge calculation module receives the indication information of edge detection.
The indication information of the edge detection is used for indicating an edge calculation module to judge whether the mouse is at the edge of the screen.
S1006, the edge calculation module monitors that the mouse exceeds the edge of the screen based on the indication information of the edge detection.
The edge calculation module judges, based on the mouse position S_A, that the mouse exceeds the edge of the screen. For a specific implementation, reference may be made to the method 500 described above.
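The edge judgment in S1006 can be sketched as a simple comparison of the cursor position against the screen bounds. The thresholds and coordinate conventions below are assumptions for illustration; the real edge calculation module (and method 500) may use different criteria.

```python
def detect_edge(x: int, y: int, screen_w: int, screen_h: int):
    """Return which screen edge the cursor position has reached, if any.

    Sketch of the edge calculation in S1006. Coordinates are assumed to run
    from (0, 0) at the top-left to (screen_w - 1, screen_h - 1) at the
    bottom-right, as on a typical desktop.
    """
    if x <= 0:
        return "left"
    if x >= screen_w - 1:
        return "right"
    if y <= 0:
        return "top"
    if y >= screen_h - 1:
        return "bottom"
    return None  # cursor is strictly inside the screen
```

When the returned edge matches the direction in which the second device is located, the module would treat the movement as the start of a traversal.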
S1007, the edge calculation module sends indication information for setting hooks to the keyboard and mouse interception module; correspondingly, the keyboard and mouse interception module receives the indication information for setting hooks.
The indication information for setting hooks is used to instruct the keyboard and mouse interception module to set a keyboard hook, a mouse hook, and a shortcut key hook.
S1008, the keyboard and mouse interception module sets the keyboard hook, the mouse hook, and the shortcut key hook based on the indication information for setting hooks.
S1009, the keyboard and mouse interception module sends indication information for keyboard and mouse event reporting to the keyboard and mouse event detection module; correspondingly, the keyboard and mouse event detection module receives the indication information for keyboard and mouse event reporting.
The indication information for keyboard and mouse event reporting is used to instruct the keyboard and mouse event detection module to report keyboard and mouse events.
S1010, based on the indication information for keyboard and mouse event reporting, the keyboard and mouse event detection module reports the acquired keyboard and mouse events to the keyboard and mouse interception module; correspondingly, the keyboard and mouse interception module receives the keyboard and mouse events.
S1011, the mouse interception module sends a mouse event to the edge calculation module, and correspondingly, the edge calculation module receives the mouse event.
S1012, the edge calculation module determines that the mouse passes through based on the mouse event.
S1013, the edge calculation module sends a notification message of event interception to the keyboard and mouse management module; correspondingly, the keyboard and mouse management module receives the notification message of event interception.
The notification message of the interception event is used for indicating the keyboard and mouse management module to intercept the keyboard and mouse event.
S1014, the keyboard and mouse management module intercepts the keyboard and mouse event based on the notification message of the intercepted event.
After the keyboard and mouse management module intercepts the keyboard and mouse event, the PC can package the event according to a preset protocol and then transmit the event to the android device.
S1015, the keyboard and mouse management module sends indication information for stopping keyboard and mouse event reporting to the keyboard and mouse interception module; correspondingly, the keyboard and mouse interception module receives the indication information for stopping keyboard and mouse event reporting.
The indication information for stopping reporting the mouse event is used for indicating the mouse event detection module to stop reporting the mouse event.
S1016, the mouse interception module sends the indication information for stopping the mouse event report to the mouse event detection module, and correspondingly, the mouse event detection module receives the indication information for stopping the mouse event report.
S1017, in response to the user's operation of moving the mouse, the hidden window module acquires the mouse position S_B.
The mouse position S_B is different from the mouse position S_A.
S1018, the hidden window module sends the mouse position S_B to the keyboard and mouse management module; correspondingly, the keyboard and mouse management module receives the mouse position S_B.
S1019, the keyboard and mouse management module sends indication information of edge detection to the edge calculation module based on the mouse position S_B; correspondingly, the edge calculation module receives the indication information of edge detection.
S1020, the edge calculation module monitors that the mouse enters the edge of the screen based on the indication information of the edge detection.
The mouse enters the edge of the screen, i.e., the mouse traverses from the screen of the android device back to the screen of the PC.
Illustratively, in the example shown in fig. 2 above, the user traverses the mouse from the computer 101 to the mobile phone 102. After finishing editing in the memo application, the user may traverse back to the computer 101 to operate it; when the user moves the mouse from the screen of the mobile phone 102 back to the screen of the PC, the edge calculation module may determine, in response to the user's operation, that the mouse enters the edge of the screen.
S1021, the edge calculation module sends instruction information for canceling the hook to the keyboard-mouse interception module, and correspondingly, the keyboard-mouse interception module receives the instruction information for canceling the hook.
The instruction information for canceling the hook is used for instructing the keyboard and mouse interception module to cancel setting the hook.
S1022, the keyboard and mouse interception module cancels the keyboard hook, the mouse hook, and the shortcut key hook based on the instruction information for canceling the hooks.
S1023, the keyboard and mouse interception module sends indication information reported by the keyboard and mouse event to the keyboard and mouse event detection module, correspondingly, the keyboard and mouse event detection module receives the indication information reported by the keyboard and mouse event and reports the keyboard and mouse event based on the indication information reported by the keyboard and mouse event.
According to the method for intercepting keyboard and mouse events described above, when the mouse traverses to the android device, the keyboard hook, the mouse hook, and the shortcut key hook intercept keyboard and mouse events on the PC side so that they are responded to on the android device side; when the mouse traverses back to the PC, the hooks are canceled on the PC side so that the PC responds to keyboard and mouse events, thereby facilitating keyboard and mouse traversal.
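The set-hook / cancel-hook lifecycle of S1008 and S1022 can be sketched platform-neutrally as below. On Windows, the keyboard and mouse hooks would actually be installed with the Win32 SetWindowsHookEx API (WH_KEYBOARD_LL and WH_MOUSE_LL), with shortcut detection typically riding on the keyboard hook; the class and method names here are assumptions for illustration only.

```python
class HookManager:
    """Platform-neutral sketch of the set-hook / cancel-hook lifecycle
    (S1008 / S1022). The real implementation would install low-level
    Windows hooks via SetWindowsHookEx."""

    HOOKS = ("keyboard", "mouse", "shortcut")

    def __init__(self):
        self.active = set()  # hooks currently installed

    def set_hooks(self):
        # S1008: install the keyboard, mouse, and shortcut key hooks.
        self.active.update(self.HOOKS)

    def cancel_hooks(self):
        # S1022: cancel all three hooks when the mouse traverses back.
        self.active.clear()

    def intercepts(self, event_source: str) -> bool:
        # An event is intercepted only while its hook is set.
        return event_source in self.active
```

While the hooks are set, local events are swallowed on the PC and forwarded to the android device; once they are canceled, the PC responds to events normally again.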
For S407 above, in which the traversal is forcibly ended if the keyboard and mouse event is a preset shortcut key event, an embodiment of the present application provides a specific implementation.
Illustratively, FIG. 11 shows a schematic flow chart of a method 1100 for forcibly taking back the mouse, which may be performed by the keyboard and mouse virtualization capability module of the PC and the keyboard and mouse event detection module of the PC. The keyboard and mouse virtualization capability module of the PC includes a hidden window module, a keyboard and mouse management module, an edge calculation module, and a keyboard and mouse interception module.
As shown in fig. 11, the method 1100 may include the steps of:
S1101, the keyboard and mouse event detection module detects a preset shortcut key event and sends it to the keyboard and mouse interception module.
The keyboard and mouse event detection module, having received the indication information for keyboard and mouse event reporting, reports keyboard and mouse events. When a preset shortcut key event is detected, the keyboard and mouse event detection module sends the preset shortcut key event to the keyboard and mouse interception module.
The preset shortcut key is a shortcut key for forcibly taking back the mouse. The shortcut key for forcibly taking back the mouse may be, for example, Esc or Ctrl+Alt+Delete, which is not limited in the embodiments of the present application.
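Detecting the preset shortcut amounts to matching the currently pressed keys against a set of preset combinations. The sketch below assumes the two example shortcuts the patent names (Esc, Ctrl+Alt+Delete); the key-name strings and function name are illustrative, and other presets are possible.

```python
# Illustrative preset shortcut combinations for forcibly taking the mouse
# back; the patent gives Esc and Ctrl+Alt+Delete only as examples.
FORCED_RECALL_SHORTCUTS = [
    frozenset({"Esc"}),
    frozenset({"Ctrl", "Alt", "Delete"}),
]

def is_forced_recall(pressed_keys) -> bool:
    """Return True if the currently pressed keys exactly match a preset
    forced-recall shortcut (order of pressing does not matter)."""
    return frozenset(pressed_keys) in FORCED_RECALL_SHORTCUTS
```

When this check succeeds during a traversal, the event detection module would forward the shortcut event to the interception module (S1101) instead of reporting it as an ordinary key event.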
S1102, the keyboard and mouse interception module sends a notification message of the preset shortcut key event to the edge calculation module based on the preset shortcut key event, and correspondingly, the edge calculation module receives the notification message of the preset shortcut key event.
The notification message of the preset shortcut key event is used for notifying the edge computing module that the operation of forcedly taking back the mouse is received.
S1103, in the case that the mouse has traversed, the edge calculation module sends a notification message of taking back the mouse to the keyboard and mouse management module based on the notification message of the preset shortcut key event; correspondingly, the keyboard and mouse management module receives the notification message of taking back the mouse.
The notification message of taking back the mouse is used to notify the keyboard and mouse management module to forcibly take the mouse back.
S1104, the keyboard and mouse management module forcibly takes the mouse back based on the notification message of taking back the mouse.
After the keyboard and mouse management module forcibly takes the mouse back, the mouse may be displayed in the middle of the PC screen. It can be understood that the display position after the mouse is forcibly taken back is preset and may be the middle of the screen, but the embodiments of the present application are not limited thereto.
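Computing the preset display position described above is straightforward; the sketch below uses the screen center, matching the example given, though as noted the patent leaves other preset positions open.

```python
def recall_position(screen_width: int, screen_height: int) -> tuple:
    """Preset cursor position after a forced recall (S1104): the middle of
    the PC screen. Other preset positions are possible per the patent."""
    return screen_width // 2, screen_height // 2
```

After the recall, the management module would warp the cursor to this position before normal local handling resumes.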
S1105, the edge calculation module sends instruction information for canceling the hook to the keyboard and mouse interception module based on a notification message of a preset shortcut key event, and correspondingly, the keyboard and mouse interception module receives the instruction information for canceling the hook.
The instruction information for canceling the hook is used for instructing the keyboard and mouse interception module to cancel setting the hook.
The embodiment of the present application does not limit the execution order of S1104 and S1105 described above.
S1106, the keyboard and mouse interception module cancels the keyboard hook, the mouse hook, and the shortcut key hook based on the instruction information for canceling the hooks.
S1107, the keyboard and mouse interception module sends indication information for stopping keyboard and mouse event reporting to the keyboard and mouse event detection module; correspondingly, the keyboard and mouse event detection module receives the indication information for stopping keyboard and mouse event reporting and no longer sends keyboard and mouse events to the keyboard and mouse interception module.
According to the method for forcibly taking back the mouse described above, when an operation of forcibly taking back the mouse is detected, the user's need to take the mouse back can be met; taking the mouse back via a shortcut key is highly efficient.
The sequence numbers of the above-mentioned processes do not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
The method provided by the embodiment of the present application is described in detail above with reference to fig. 1 to 11, and the terminal device provided by the embodiment of the present application will be described in detail below with reference to fig. 12 and 13.
Fig. 12 shows a schematic diagram of a terminal device 1200 provided in an embodiment of the present application. The terminal apparatus 1200 includes: a processing module 1210 and a transceiver module 1220.
Wherein, the processing module 1210 is configured to: establishing a connection with a second device; displaying a cursor corresponding to the input device on a first screen; detecting a first operation acting on the input device, wherein the first operation is used for dragging a cursor displayed on a first screen to the direction of a second device; responding to the first operation, controlling the cursor to move out of the first screen and enter a second screen of the second device for display; the transceiver module 1220 is configured to: receiving a second operation acting on the input device, the second operation being for inputting a first control instruction on the second device; the processing module is also used for: intercepting an input event corresponding to the second operation, wherein the transceiver module is further configured to: and sending an input event corresponding to the second operation to the second device so as to instruct the second device to execute the first control instruction.
It should be understood that the terminal device 1200 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it will be understood by those skilled in the art that the terminal device 1200 may be specifically a PC in the foregoing method embodiment, or the functions of the PC in the foregoing method embodiment may be integrated in the terminal device 1200, and the terminal device 1200 may be used to execute each flow and/or step corresponding to the PC in the foregoing method embodiment, which is not repeated herein.
The terminal device 1200 has a function of implementing the corresponding steps executed by the PC in the above-described method embodiment; the above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In an embodiment of the present application, the terminal device 1200 in fig. 12 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 13 is a schematic block diagram of another terminal device 1300 provided in an embodiment of the present application. The terminal device 1300 includes a processor 1310, a transceiver 1320, and a memory 1330. The processor 1310, the transceiver 1320, and the memory 1330 communicate with each other through an internal connection path; the memory 1330 is configured to store instructions, and the processor 1310 is configured to execute the instructions stored in the memory 1330 to control the transceiver 1320 to transmit and/or receive signals.
It should be understood that the terminal device 1300 may be specifically a PC in the above-described method embodiment, or the functions of the PC in the above-described method embodiment may be integrated in the terminal device 1300, and the terminal device 1300 may be configured to perform the steps and/or flows corresponding to the PC in the above-described method embodiment. The memory 1330 may optionally include read-only memory and random access memory, and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory. For example, the memory may also store information of the device type. The processor 1310 may be configured to execute instructions stored in the memory, and when the processor executes the instructions, the processor may perform steps and/or flows corresponding to the terminal device in the foregoing method embodiments.
It is to be appreciated that in embodiments of the present application, the processor 1310 may be a central processing unit (CPU), and may also be another general purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor executes instructions in the memory to perform the steps of the method described above in conjunction with its hardware. To avoid repetition, a detailed description is not provided herein.
The application also provides a computer readable storage medium for storing a computer program for implementing the method corresponding to the PC in the method embodiment.
The application also provides a chip system for supporting the PC in the method embodiment to realize the functions shown in the application embodiment.
The present application also provides a computer program product comprising a computer program (which may also be referred to as code, or instructions) which, when run on a computer, is capable of performing the method corresponding to the PC shown in the method embodiments described above.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and the changes or substitutions are intended to be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A keyboard and mouse traversing method, characterized in that the method is applied to a first device, wherein the first device is connected with an input device for inputting control instructions and comprises a first screen;
the method comprises the steps of:
the first device establishes connection with the second device;
the first device displays a cursor corresponding to the input device on the first screen;
the first device detects a first operation acting on the input device, wherein the first operation is used for dragging the cursor displayed on the first screen to the direction in which the second device is located;
responsive to the first operation, the first device controls the cursor to move out of the first screen and enter a second screen of the second device for display;
the first device receiving a second operation on the input device, the second operation for inputting a first control instruction on the second device;
the first equipment intercepts an input event corresponding to the second operation;
and the first equipment sends an input event corresponding to the second operation to the second equipment so as to instruct the second equipment to execute the first control instruction.
2. The method according to claim 1, wherein the method further comprises:
the first device receives a third operation acting on the input device, wherein the third operation is used for dragging the cursor displayed on the second screen to the direction in which the first device is positioned;
responding to the third operation, and controlling the cursor to move out of the second screen and enter the first screen for display by the first device;
the first device stops intercepting input events corresponding to the operation acting on the input device to execute control instructions corresponding to the operation of the input device.
3. The method according to claim 1, wherein the method further comprises:
the first device receives a fourth operation acting on the input device, wherein the fourth operation is used for inputting a control instruction for forcibly grabbing the cursor back from the second device;
in response to the fourth operation, the first device controls the cursor to be displayed on the first screen; and
the first device stops intercepting input events corresponding to operations acting on the input device, so as to execute control instructions corresponding to the operations on the input device.
4. The method according to claim 2 or 3, wherein the method further comprises:
the first device detecting a fifth operation acting on the input device, the fifth operation being for inputting a second control instruction on the first device;
in response to the fifth operation, the first device executes the second control instruction.
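The traverse-out, traverse-back, and force-grab flows in claims 1 to 4 amount to a small interception state machine: events are forwarded while the cursor is on the second screen and executed locally otherwise. The sketch below is illustrative only; the class and method names are assumptions, not taken from the patent.

```python
class KmTraverse:
    """Illustrative interception state machine for keyboard/mouse traversal.

    While the cursor is on the local (first) screen, input events are handled
    locally; after it crosses to the remote (second) screen, events are
    intercepted and forwarded until the cursor returns or is force-grabbed.
    """

    def __init__(self):
        self.intercepting = False  # True while the cursor is on the second screen

    def on_cursor_left_first_screen(self):
        # Claim 1: the cursor is dragged off the first screen toward the second device.
        self.intercepting = True

    def on_cursor_returned(self):
        # Claim 2: the cursor is dragged back onto the first screen.
        self.intercepting = False

    def on_force_grab(self):
        # Claim 3: a dedicated operation forcibly recalls the cursor to the first screen.
        self.intercepting = False

    def route(self, event):
        # Claims 1 and 4: forward to the second device while intercepting,
        # otherwise execute the corresponding control instruction locally.
        return "forward_to_second_device" if self.intercepting else "execute_locally"
```

The two exit paths (claims 2 and 3) deliberately converge on the same state change, matching the identical "stops intercepting" step in both claims.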
5. The method according to any one of claims 1 to 4, wherein the first device intercepting the input event corresponding to the second operation comprises:
in response to the second operation, the first device intercepts the input event corresponding to the second operation through a hook.
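Claim 5 names a hook as the interception mechanism but does not specify it further. On a Windows first device this would typically be a low-level input hook whose callback either swallows an event or passes it through to the local system; the Python sketch below models only that callback contract, and all names and return codes here are assumptions for illustration.

```python
# Sketch of a hook callback contract, modeled loosely on a Windows
# low-level mouse hook, where a nonzero return value swallows the event.
SWALLOW, PASS_THROUGH = 1, 0

def hook_callback(event, cursor_on_second_screen, send_to_second_device):
    """Decide the fate of one input event inside the hook."""
    if cursor_on_second_screen:
        send_to_second_device(event)  # forward instead of delivering locally
        return SWALLOW                # intercept: local apps never see it
    return PASS_THROUGH               # deliver to the local system as usual
```

Swallowing inside the hook is what lets the first device keep receiving raw input while preventing its own applications from reacting to it.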
6. The method according to any one of claims 1 to 5, wherein the first device sending the input event corresponding to the second operation to the second device comprises:
the first device packages the input event corresponding to the second operation through a preset protocol to obtain a packaged input event; and
the first device sends the packaged input event to the second device.
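Claim 6 leaves the "preset protocol" unspecified. A minimal sketch of one possible packaging format follows; the header layout, magic value, and function names are assumptions, not the patent's actual protocol.

```python
import struct

# Hypothetical "preset protocol": a fixed little-endian header followed by
# the raw event payload.
#   magic (2 bytes) | event_type (1 byte) | payload length (4 bytes) | payload
MAGIC = 0x4B4D  # "KM"

def package_event(event_type: int, payload: bytes) -> bytes:
    """Package an input event on the first device for transmission."""
    return struct.pack("<HBI", MAGIC, event_type, len(payload)) + payload

def unpackage_event(data: bytes) -> tuple[int, bytes]:
    """Inverse operation, as the second device might perform it."""
    magic, event_type, length = struct.unpack_from("<HBI", data)
    if magic != MAGIC:
        raise ValueError("not a keyboard/mouse traversal packet")
    payload = data[struct.calcsize("<HBI"):]
    if len(payload) != length:
        raise ValueError("truncated packet")
    return event_type, payload
```

A fixed header with an explicit length field lets the second device reassemble events from a byte stream regardless of the underlying transport.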
7. The method according to any one of claims 1 to 6, wherein the first device detecting the first operation acting on the input device comprises:
the first device obtains the position of the cursor on the first screen;
the first device determines, based on the position, whether the cursor exceeds an edge of the first screen; and
if the cursor exceeds the edge of the first screen, the first device determines that the first operation is detected.
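The edge check in claim 7 can be sketched as follows, assuming screen coordinates with a top-left origin; the function name and the choice of returning an edge label are illustrative, not from the patent. Which edge triggers the traverse would depend on how the two screens are arranged.

```python
def detect_edge_crossing(x, y, screen_w, screen_h):
    """Return which edge of the first screen (if any) the cursor has reached.

    Coordinates are assumed to run from (0, 0) at the top-left to
    (screen_w - 1, screen_h - 1) at the bottom-right.
    """
    if x <= 0:
        return "left"
    if x >= screen_w - 1:
        return "right"
    if y <= 0:
        return "top"
    if y >= screen_h - 1:
        return "bottom"
    return None  # cursor is still inside the first screen
```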
8. The method according to any one of claims 1 to 7, wherein the first device runs a Windows system and the second device runs an Android system.
9. A terminal device, comprising: a processor coupled to a memory, the memory being configured to store a computer program which, when invoked by the processor, causes the terminal device to perform the method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, the computer program comprising instructions for implementing the method according to any one of claims 1 to 7.
11. A computer program product comprising computer program code which, when run on a computer, causes the computer to implement the method according to any one of claims 1 to 7.
CN202211075099.2A 2022-09-02 2022-09-02 Key mouse traversing method and terminal equipment Pending CN117687557A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211075099.2A CN117687557A (en) 2022-09-02 2022-09-02 Key mouse traversing method and terminal equipment


Publications (1)

Publication Number Publication Date
CN117687557A true CN117687557A (en) 2024-03-12

Family

ID=90132510


Country Status (1)

Country Link
CN (1) CN117687557A (en)

Similar Documents

Publication Publication Date Title
CN111666055B (en) Data transmission method and device
CN111597000B (en) Small window management method and terminal
KR20220158800A (en) Content Sharing Methods and Electronic Equipment
US20230305789A1 (en) Device Interaction Method and Electronic Device
US20200125243A1 (en) Terminal, split-screen display method for screen thereof, and storage device
CN112947825A (en) Display control method, display control device, electronic device, and medium
CN112114733A (en) Screen capturing and recording method, mobile terminal and computer storage medium
WO2020238357A1 (en) Icon displaying method and terminal device
CN111274564A (en) Communication terminal and application unlocking method in split screen mode
CN110865765A (en) Terminal and map control method
CN116088716B (en) Window management method and terminal equipment
CN114721761B (en) Terminal equipment, application icon management method and storage medium
CN114020379B (en) Terminal equipment, information feedback method and storage medium
CN111726605B (en) Resolving power determining method and device, terminal equipment and storage medium
CN117687557A (en) Key mouse traversing method and terminal equipment
CN113014614A (en) Equipment control method, control equipment and controlled equipment
CN114489429B (en) Terminal equipment, long screen capturing method and storage medium
CN114546219B (en) Picture list processing method and related device
CN113157092B (en) Visualization method, terminal device and storage medium
CN112527182A (en) Electronic device and pattern drawing method
CN114356559A (en) Multithreading control method and terminal equipment
CN117687556A (en) Key mouse traversing method and terminal equipment
WO2024046117A1 (en) Data transmission method and terminal device
CN115297467B (en) Data domain switching method, device, terminal equipment and medium
CN116088740B (en) Interface processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination