WO2022022490A1 - Cross-device object dragging method and device - Google Patents

Cross-device object dragging method and device

Info

Publication number
WO2022022490A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
mouse
drag
display screen
event
Prior art date
Application number
PCT/CN2021/108579
Other languages
English (en)
French (fr)
Inventor
卢跃东
王海军
周学而
周星辰
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to EP21850017.1A (published as EP4180932A4)
Priority to US18/007,120 (published as US20230229300A1)
Publication of WO2022022490A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/12 Frame memory handling
    • G09G2360/121 Frame memory handling using a cache memory
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08 Cursor circuits
    • G09G5/14 Display of multiple viewports

Definitions

  • the present application relates to the field of electronic devices, and in particular, to a method and device for dragging objects across devices.
  • With the development of electronic technology, a user may own multiple terminals at the same time, such as a mobile phone, a tablet computer, a personal computer (PC), and smart home devices (such as TV sets).
  • the use of each terminal is relatively independent.
  • In scenarios that require cross-device interaction, such as collaborative office, a user will connect multiple terminals and use them together. For example, if a user owns both a PC and a mobile phone, the user can connect the PC and the mobile phone in a wired or wireless manner so that the two work together, realizing collaborative office between the PC and the mobile phone.
  • For example, multi-screen collaboration uses screen mirroring to project the interface of the mobile phone (such as the desktop 101 of the mobile phone shown in FIG. 1) onto the display screen of the PC, which is convenient for the user to work collaboratively.
  • In addition, current multi-screen collaboration also supports two-way dragging of content between the PC and the mobile phone using input devices (or peripherals) such as a mouse or a touch screen. That is to say, during screen projection, the user is allowed to use an input device (such as a mouse or touch screen) to transfer text or files between the PC and the mobile phone by dragging and dropping.
  • However, the premise of this content dragging is that the interface of the mobile phone is projected onto the display screen of the PC. During projection, the screen of the mobile phone is usually off, so hardware capabilities of the mobile phone such as its touch screen and stylus cannot participate in the collaborative office.
  • In addition, the interface projected from the mobile phone onto the PC occupies a large part of the display space of the PC's display screen, which reduces the efficiency of using multiple terminals collaboratively.
  • Embodiments of the present application provide a method and device for dragging objects across devices, which improve the efficiency of collaborative use of multiple terminals.
  • a first aspect of the present application provides a method for dragging and dropping objects across devices.
  • the method can be applied to a second terminal, where the second terminal is connected to the first terminal.
  • The method may include: the second terminal displays, on its display screen, the object dragged from the first terminal, and displays a second cursor on the object; the second terminal receives a movement operation input by the user using the input device of the first terminal; and, according to the movement operation, the second terminal displays an animation of the object moving with the second cursor on its display screen.
  • Using this solution, the user can use an input device such as a mouse to drag an object on one terminal so that it follows the cursor across the multiple terminals participating in collaborative use. Since screen projection does not need to be started, the display space of a terminal's display screen is not occupied. The efficiency of multi-terminal collaborative use is improved, and the user experience is improved.
  • the input device of the above-mentioned first terminal may be a mouse, a touchpad, or the like.
  • the second cursor may be a cursor displayed on the display screen of the second terminal.
  • The method may further include: the second terminal receives shuttle status information from the first terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • The second terminal displaying the object dragged from the first terminal on the display screen of the second terminal may include: the second terminal receives drag data from the first terminal, and displays the object dragged from the first terminal on its display screen according to the drag data. The drag data and the movement operation are sent by the first terminal to the second terminal after the first terminal determines, while the object moves with the first cursor on the display screen of the first terminal, that the object is dragged out of the edge of that display screen, and they are used to initiate a drag event for the object.
  • the first cursor may be a cursor displayed on the display screen of the first terminal.
  • The method may further include: the second terminal generates a pressing operation. The second terminal displaying the animation of the object moving with the second cursor according to the movement operation may include: the second terminal displays the animation of the object moving with the second cursor on its display screen according to the movement operation, the pressing operation and the drag data. After the object is dragged out of the edge of the display screen of the first terminal, the cursor shuttles; after the cursor shuttles, the second terminal continues the drag according to the operations input by the user using the input device of the first terminal.
  • The second terminal generating a pressing operation may include: the second terminal simulates a pressing event according to the operation parameters of the pressing operation.
  • The second terminal receiving the movement operation may include: the second terminal receives operation parameters from the first terminal and simulates a movement event according to them, where the operation parameters are those included in the movement event received by the first terminal while the user performs the movement operation using the input device of the first terminal. The second terminal displaying the animation of the object moving with the second cursor according to the movement operation, the pressing operation and the drag data may include: in response to the pressing event and the movement event, the second terminal displays, according to the drag data, an animation of the object moving with the second cursor on its display screen.
  • The method may further include: after the second terminal successfully establishes the connection with the first terminal, the second terminal creates a virtual input device; or, the second terminal receives a notification message from the first terminal, where the notification message indicates that the keyboard and mouse sharing mode of the first terminal has been enabled, and the second terminal creates a virtual input device in response to the notification message. The virtual input device is used by the second terminal to simulate input events according to the operation parameters (a minimal sketch follows).
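  • As a concrete illustration of the virtual input device, the following Java sketch shows how an Android-based second terminal might simulate pressing and movement events from the forwarded operation parameters. The class name VirtualInputDevice and the choice of injecting events through View.dispatchTouchEvent are illustrative assumptions, not the implementation of this application:

        import android.os.SystemClock;
        import android.view.MotionEvent;
        import android.view.View;

        public class VirtualInputDevice {
            private final View targetView; // view that receives the simulated events
            private long downTime;         // timestamp of the simulated press

            public VirtualInputDevice(View targetView) {
                this.targetView = targetView;
            }

            /** Simulate a pressing event (the generated pressing operation). */
            public void injectPress(float x, float y) {
                downTime = SystemClock.uptimeMillis();
                MotionEvent down = MotionEvent.obtain(downTime, downTime,
                        MotionEvent.ACTION_DOWN, x, y, 0 /* metaState */);
                targetView.dispatchTouchEvent(down);
                down.recycle();
            }

            /** Simulate a movement event from operation parameters forwarded by the first terminal. */
            public void injectMove(float x, float y) {
                long now = SystemClock.uptimeMillis();
                MotionEvent move = MotionEvent.obtain(downTime, now,
                        MotionEvent.ACTION_MOVE, x, y, 0 /* metaState */);
                targetView.dispatchTouchEvent(move);
                move.recycle();
            }
        }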
  • The second terminal displaying the object dragged from the first terminal on its display screen may include: the second terminal displays the shadow of the dragged object on its display screen. Accordingly, the second terminal displaying the animation of the object moving with the second cursor according to the movement operation may include: the second terminal displays, according to the movement operation, an animation of the shadow of the object moving with the second cursor on its display screen.
  • The above-mentioned object may be text, a file or a folder; the drag data may include the drag event content and a bitmap of the shadow. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is the file path. An illustrative structure is sketched below.
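  • A minimal illustration of such drag data as a container class; the field names are hypothetical, and the actual serialization format is not specified by this application:

        import java.io.Serializable;

        public class DragData implements Serializable {
            public enum ObjectType { TEXT, FILE, FOLDER }

            public ObjectType type;     // kind of dragged object
            public String content;      // drag event content: the text itself, or a file path
            public byte[] shadowBitmap; // encoded bitmap of the shadow (e.g. PNG bytes)
        }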
  • The method may further include: the second terminal creates an invisible activity, where the invisible activity has a view control whose transparency is greater than a threshold, and the view control is used to initiate the drag event (see the sketch below).
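  • The following Java sketch shows how such a view control could initiate a drag event from the received drag data on an Android-based second terminal. The activity and method names are illustrative, and View.startDragAndDrop (available since Android API level 24) is assumed as the underlying mechanism:

        import android.app.Activity;
        import android.content.ClipData;
        import android.graphics.Bitmap;
        import android.graphics.Canvas;
        import android.graphics.Point;
        import android.view.View;

        public class InvisibleDragActivity extends Activity {

            /** Initiate a drag event on the (nearly) transparent view using the drag data. */
            void startCrossDeviceDrag(View transparentView, String dragContent, final Bitmap shadow) {
                ClipData clip = ClipData.newPlainText("cross-device-drag", dragContent);
                View.DragShadowBuilder shadowBuilder = new View.DragShadowBuilder(transparentView) {
                    @Override
                    public void onProvideShadowMetrics(Point size, Point touch) {
                        size.set(shadow.getWidth(), shadow.getHeight());
                        touch.set(size.x / 2, size.y / 2);
                    }

                    @Override
                    public void onDrawShadow(Canvas canvas) {
                        // Draw the shadow bitmap received from the first terminal.
                        canvas.drawBitmap(shadow, 0, 0, null);
                    }
                };
                // DRAG_FLAG_GLOBAL lets windows of other applications receive the drop.
                transparentView.startDragAndDrop(clip, shadowBuilder, null, View.DRAG_FLAG_GLOBAL);
            }
        }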
  • The object is an icon of an application; or the object is a window, and the window includes the interface of an application. When the object is the icon of the application, the drag data includes the icon of the application; when the object is the window, the drag data includes the interface of the application.
  • a second aspect of the present application provides a method for dragging and dropping objects across devices.
  • the method can be applied to a first terminal, where the first terminal is connected to a second terminal.
  • The method may include: the first terminal displays a first cursor on an object displayed on the first terminal; the first terminal receives a drag operation input by the user using the input device of the first terminal, where the drag operation is used to initiate a drag on the object; in response to the drag operation, the first terminal displays an animation of the object moving with the first cursor on the display screen of the first terminal; and, when the first terminal determines that the object is dragged out of the edge of the display screen of the first terminal, the first terminal sends drag data to the second terminal.
  • the input device of the above-mentioned first terminal may be a mouse, a touchpad, or the like.
  • the first cursor may be a cursor displayed on the display screen of the first terminal.
  • With this solution, the user can use an input device such as a mouse to drag an object on one terminal so that it follows the cursor across the multiple terminals participating in collaborative use. Since screen projection does not need to be started, the display space of a terminal's display screen is not occupied.
  • The drag data related to the dragged object is sent to the other terminal so that the other terminal can continue the drag. In this way, the user can use these terminals to process the transferred object, so that the hardware capabilities of all the terminals can participate in the collaborative office. The efficiency of multi-terminal collaborative use is improved, and the user experience is improved.
  • the above drag data may be used by the second terminal to initiate a drag event for the object, so that the second terminal displays an animation of the object moving with the second cursor on the display screen of the second terminal.
  • the second cursor may be a cursor displayed on the display screen of the second terminal.
  • The drag operation may include a pressing operation and a movement operation. When the first terminal determines that the object is dragged out of the edge of the display screen of the first terminal, the method may further include: the first terminal sends, to the second terminal, data of the movement operation input by the user using the input device of the first terminal. After the object is dragged out of the edge of the display screen of the first terminal, the cursor shuttles; after the cursor shuttles, the first terminal sends the data of the operations input by the user using its input device to the other terminal, so that the other terminal can continue the drag.
  • The first terminal sending the data of the movement operation to the second terminal may include: while the user performs the movement operation using the input device of the first terminal, the first terminal intercepts the movement event, and sends the operation parameters included in the movement event to the second terminal (a sketch of this forwarding follows).
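  • A hedged sketch of this forwarding step, assuming a JSON-over-socket transport between the two terminals; the message format and the class name are illustrative, not the actual protocol of this application:

        import java.io.IOException;
        import java.io.OutputStream;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        public class MoveEventForwarder {
            private final Socket channel; // established connection to the second terminal

            public MoveEventForwarder(Socket channel) {
                this.channel = channel;
            }

            /** Forward the operation parameters of one intercepted movement event. */
            public void forwardMove(int relativeX, int relativeY) throws IOException {
                String message = String.format(
                        "{\"type\":\"MOVE\",\"dx\":%d,\"dy\":%d}", relativeX, relativeY);
                OutputStream out = channel.getOutputStream();
                out.write(message.getBytes(StandardCharsets.UTF_8));
                out.flush();
            }
        }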
  • The method may further include: the first terminal sends shuttle status information to the second terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • The first terminal displaying the animation of the object moving with the first cursor on its display screen may include: the first terminal displays an animation of the shadow of the object moving with the first cursor on the display screen of the first terminal.
  • The method may further include: the first terminal hides the first cursor and the shadow of the object. After it is determined that the cursor shuttles, the cursor displayed on the first terminal and the shadow of the dragged object are hidden, giving the user the visual effect that the object is dragged from the first terminal to the other terminal.
  • The above-mentioned object may be text, a file or a folder; the drag data may include the drag event content and a bitmap of the shadow. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is the file path.
  • The method may further include: the first terminal displays an invisible window, where the transparency of the invisible window is greater than a threshold and the invisible window is used to receive the drag event. Before sending the drag data to the second terminal, the method may further include: the first terminal obtains the drag event content from the drag event received by the invisible window, and obtains the bitmap of the shadow (see the sketch below).
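  • Assuming an Android-based first terminal for illustration, the following sketch shows how an invisible window's drag listener might receive the local drag event and extract the drag event content before it is sent to the second terminal; the sendDragData helper is hypothetical:

        import android.content.ClipData;
        import android.view.DragEvent;
        import android.view.View;

        public class InvisibleDropTarget implements View.OnDragListener {
            @Override
            public boolean onDrag(View v, DragEvent event) {
                switch (event.getAction()) {
                    case DragEvent.ACTION_DRAG_STARTED:
                        return true; // accept the drag so that later events are delivered
                    case DragEvent.ACTION_DROP:
                        ClipData clip = event.getClipData();
                        String content =
                                clip.getItemAt(0).coerceToText(v.getContext()).toString();
                        sendDragData(content); // hypothetical: forward to the second terminal
                        return true;
                    default:
                        return true;
                }
            }

            private void sendDragData(String dragEventContent) {
                // Transport to the second terminal omitted; see the forwarding sketch above.
            }
        }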
  • The object is an icon of an application; or the object is a window, and the window includes the interface of an application. When the object is the icon of the application, the drag data includes the icon of the application; when the object is the window, the drag data includes the interface of the application.
  • a third aspect of the present application provides an apparatus for dragging objects across devices, which is applied to a first terminal, and the first terminal is connected to a second terminal.
  • The apparatus may include: a display unit, configured to display a first cursor on an object displayed on the first terminal; an input unit, configured to receive a drag operation input by the user using the input device of the first terminal, where the drag operation is used to initiate a drag on the object; the display unit, further configured to display, in response to the drag operation, an animation of the object moving with the first cursor on the display screen of the first terminal; and a sending unit, configured to send drag data to the second terminal when it is determined that the object is dragged out of the edge of the display screen of the first terminal.
  • the above drag data is used for the second terminal to initiate a drag event for the object, so that the second terminal displays an animation of the object moving with the second cursor on the display screen of the second terminal.
  • the dragging operation may include a pressing operation and a moving operation; the sending unit is further configured to send data of the moving operation input by the user using the input device of the first terminal to the second terminal.
  • The apparatus may further include an intercepting unit, configured to intercept a movement event while the user performs a movement operation using the input device of the first terminal; the sending unit is specifically configured to send the operation parameters included in the movement event to the second terminal.
  • the sending unit is further configured to send the shuttle status information to the second terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • the display unit is specifically configured to display an animation of the shadow of the object moving with the first cursor on the display screen of the first terminal.
  • The display unit is further configured to hide the first cursor and the shadow of the object.
  • The above-mentioned object is text, a file or a folder; the drag data includes the drag event content and a bitmap of the shadow. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is the file path.
  • The display unit is further configured to display an invisible window, where the transparency of the invisible window is greater than a threshold and the invisible window is used to receive drag events. The apparatus may further include an obtaining unit, configured to obtain the drag event content from the drag event received by the invisible window and to obtain the bitmap of the shadow.
  • The object is an icon of an application; or the object is a window, and the window includes the interface of an application. When the object is the icon of the application, the drag data includes the icon of the application; when the object is the window, the drag data includes the interface of the application.
  • a fourth aspect of the present application provides a cross-device object dragging apparatus, which is applied to a second terminal, and the second terminal is connected to the first terminal.
  • The apparatus may include: a display unit, configured to display the object dragged from the first terminal on the display screen of the second terminal and to display a second cursor on the object; a receiving unit, configured to receive the movement operation input by the user using the input device of the first terminal; the display unit, further configured to display, according to the movement operation, an animation of the object moving with the second cursor on the display screen of the second terminal.
  • The receiving unit is further configured to receive shuttle status information from the first terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • The receiving unit is further configured to receive drag data from the first terminal, and the display unit is specifically configured to display the object dragged from the first terminal on the display screen of the second terminal according to the drag data. The drag data and the movement operation are sent to the second terminal after the first terminal determines, while the object moves with the first cursor on the display screen of the first terminal, that the object is dragged out of the edge of that display screen, and are used to initiate a drag event for the object.
  • The apparatus may further include a generating unit, configured to generate a pressing operation; the display unit is specifically configured to display, according to the movement operation, the pressing operation and the drag data, an animation of the object moving with the second cursor on the display screen of the second terminal.
  • The generating unit is specifically configured to simulate a pressing event according to the operation parameters of the pressing operation; the receiving unit is specifically configured to receive operation parameters from the first terminal; the generating unit is further configured to simulate a movement event according to the operation parameters, where the operation parameters are those included in the movement event received by the first terminal while the user performs the movement operation using the input device of the first terminal; and the display unit is specifically configured to display, in response to the pressing event and the movement event and according to the drag data, an animation of the object moving with the second cursor on the display screen of the second terminal.
  • The apparatus may further include a creating unit, configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit is further configured to receive a notification message from the first terminal, where the notification message indicates that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit is configured to create the virtual input device in response to the notification message. The virtual input device is used by the second terminal to simulate input events according to the operation parameters.
  • the display unit is specifically configured to display the shadow of the object dragged from the first terminal on the display screen of the second terminal.
  • the display unit is specifically configured to display an animation of the shadow of the object moving with the second cursor on the display screen of the second terminal according to the moving operation.
  • The above-mentioned object may be text, a file or a folder; the drag data includes the drag event content and a bitmap of the shadow. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is the file path.
  • The creating unit is further configured to create an invisible activity, where the invisible activity has a view control whose transparency is greater than a threshold, and the view control is used to initiate the drag event.
  • The object is an icon of an application; or the object is a window, and the window includes the interface of an application. When the object is the icon of the application, the drag data includes the icon of the application; when the object is the window, the drag data includes the interface of the application.
  • A fifth aspect of the present application provides a cross-device object dragging apparatus, including: a processor; and a memory for storing instructions executable by the processor, where the processor is configured to implement the method according to the first aspect or any possible implementation of the first aspect, or the method according to the second aspect or any possible implementation of the second aspect.
  • A sixth aspect of the present application provides a computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by a processor, the method according to the first aspect or any possible implementation of the first aspect, or the method according to the second aspect or any possible implementation of the second aspect, is implemented.
  • A seventh aspect of the present application provides a terminal. The terminal includes a display screen, one or more processors and a memory; the display screen, the processors and the memory are coupled; the memory is used for storing computer program code, and the computer program code includes computer instructions which, when executed by the terminal, cause the terminal to execute the method according to the first aspect or any possible implementation of the first aspect, or the method according to the second aspect or any possible implementation of the second aspect.
  • An eighth aspect of the present application provides a computer program product, including computer-readable code or a non-volatile computer-readable storage medium carrying computer-readable code. When the computer-readable code runs in a terminal, a processor in the terminal executes the method according to the first aspect or any possible implementation of the first aspect, or the method according to the second aspect or any possible implementation of the second aspect.
  • a ninth aspect of the present application provides a cross-device object dragging system, the system may include a first terminal and a second terminal, and the first terminal is connected to the second terminal.
  • The first terminal displays a first cursor on the object displayed on the first terminal, and receives a drag operation input by the user using an input device of the first terminal, where the drag operation is used to initiate a drag on the object. In response to the drag operation, the first terminal displays an animation of the object moving with the first cursor on the display screen of the first terminal, and, when it is determined that the object is dragged out of the edge of the display screen of the first terminal, the first terminal sends the drag data and the data of the movement operation input by the user using the input device of the first terminal to the second terminal.
  • the second terminal displays the object dragged from the first terminal on the display screen of the second terminal according to the drag data, and displays the second cursor on the object; the second terminal receives the moving operation input by the user using the input device of the first terminal .
  • the second terminal displays an animation of the object moving with the second cursor on the display screen of the second terminal according to the moving operation.
  • Further, after determining that the object is dragged out of the edge of the display screen of the first terminal, the first terminal sends shuttle status information to the second terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • the first terminal hides the first cursor and the object after determining that the object is dragged out of the edge of the display screen of the first terminal.
  • It should be noted that the object being dragged out of the edge of the display screen of a terminal may mean that a part of the object is dragged out of (or overflows) the display screen of the terminal, that the entire area of the object is dragged out of (or overflows) the display screen of the terminal, or that the cursor slides out of the edge of the display screen of the terminal while the object moves with the cursor on that display screen; this is not specifically limited in this embodiment. A sketch of the corresponding edge check follows.
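  • As a simple illustration of the last variant (the cursor sliding out of the display edge), the following Java sketch checks the cursor coordinates against the display bounds; the coordinate convention (origin at the top-left corner, as in the coordinate system of FIG. 6A) and the class name are assumptions:

        public class ShuttleDetector {
            private final int displayWidth;
            private final int displayHeight;

            public ShuttleDetector(int displayWidth, int displayHeight) {
                this.displayWidth = displayWidth;
                this.displayHeight = displayHeight;
            }

            /** Returns true when the cursor has slid out of the edge of the display screen. */
            public boolean cursorLeftDisplay(int cursorX, int cursorY) {
                return cursorX <= 0 || cursorY <= 0
                        || cursorX >= displayWidth - 1 || cursorY >= displayHeight - 1;
            }
        }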
  • For the beneficial effects that can be achieved by the cross-device object dragging apparatus according to the third aspect and any possible implementation thereof, the cross-device object dragging apparatus according to the fourth aspect and any possible implementation thereof, the cross-device object dragging apparatus according to the fifth aspect, the computer-readable storage medium according to the sixth aspect, the terminal according to the seventh aspect, the computer program product according to the eighth aspect, and the cross-device object dragging system according to the ninth aspect, reference may be made to the beneficial effects of the first aspect or the second aspect and any possible implementations thereof, which will not be repeated here.
  • FIG. 1 is a schematic diagram of a collaborative office scenario provided by the prior art
  • FIG. 2 is a simplified schematic diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a tablet computer according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a method for dragging objects across devices provided by an embodiment of the present application
  • FIG. 6A is a schematic diagram of a coordinate system on a display screen provided by an embodiment of the present application.
  • FIG. 6B is a schematic diagram of a cross-device object dragging interface provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a data structure of a drag event on a Windows side provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a data structure of a drag event on an Android side provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of another cross-device object dragging interface provided by an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of another method for dragging and dropping objects across devices provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of yet another cross-device object dragging interface provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of yet another cross-device object dragging interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of the composition of a device for dragging objects across devices provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of the composition of another device for dragging objects across devices provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the composition of a chip system according to an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • Unless otherwise stated, "plural" means two or more.
  • Embodiments of the present application provide a method and device for dragging and dropping objects across devices, and the method can be applied to a scenario where multiple terminals are used collaboratively.
  • The method provided in this embodiment uses the keyboard and mouse sharing technology, so that the user can use input devices such as a touchpad or a mouse to transfer content (or objects) such as text or files among the multiple terminals participating in the collaborative use by dragging and dropping, and allows the user to use these terminals to process the transferred content. That is to say, the hardware capabilities of these terminals can all participate in the collaborative office.
  • In addition, since screen projection does not need to be started, the display space of a terminal's display screen is not occupied.
  • the use efficiency of multi-terminal collaborative use is improved, and the user experience is improved.
  • FIG. 2 is a simplified schematic diagram of a system architecture to which the above method can be applied, provided by an embodiment of the present application.
  • the system architecture may be a cross-device object dragging system in this embodiment.
  • the system architecture may at least include: a first terminal 201 and a second terminal 202 .
  • the first terminal 201 is connected to the input device 201-1 (as shown in FIG. 2 ), or includes the input device 201-1 (not shown in FIG. 2 ).
  • the input device 201-1 may be a mouse, a touch pad, a touch screen, and the like.
  • the input device 201-1 is a mouse as an example.
  • the first terminal 201 and the second terminal 202 may establish a connection in a wired or wireless manner. Based on the established connection, the first terminal 201 and the second terminal 202 may be used together in cooperation.
  • The wireless communication protocol adopted when the first terminal 201 and the second terminal 202 establish a connection wirelessly may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, etc., or may be various cellular network protocols, which is not specifically limited here.
  • Using the keyboard and mouse sharing technology, the user can use one set of input devices, such as the above-mentioned input device 201-1, to control both the first terminal 201 and the second terminal 202. That is to say, the user can not only use the input device 201-1 of the first terminal 201 to control the first terminal 201; the first terminal 201 can also share its input device 201-1 with the second terminal 202, so that the user can control the second terminal 202.
  • Taking the input device 201-1 being a mouse as an example, without starting screen projection, the user can use the keyboard and mouse sharing technology to transfer content such as text or files of the first terminal 201 to the second terminal 202 by dragging and dropping with the mouse.
  • Similarly, the user can also use the mouse to transfer content such as text or files of the second terminal 202 to the first terminal 201 by dragging and dropping.
  • In addition, if the system further includes a third terminal, the user can also use the mouse to drag content such as text or files of the first terminal 201 to the third terminal by dragging and dropping. When the user releases the mouse, the content dragging is complete.
  • The terminals in the embodiments of the present application may be mobile phones, tablet computers, handheld computers, PCs, cellular phones, personal digital assistants (PDAs), wearable devices (such as smart watches), smart home devices (such as TV sets), in-vehicle computers, game consoles, augmented reality (AR)/virtual reality (VR) devices, etc.
  • In FIG. 2, the first terminal 201 being a PC and the second terminal 202 being a tablet computer is taken as an example.
  • The following describes the structure of the terminal by taking the terminal being a tablet computer as an example.
  • FIG. 3 is a schematic structural diagram of a tablet computer according to an embodiment of the present application. The methods in the following embodiments can be implemented in a tablet computer having the above-mentioned hardware structure.
  • the tablet computer may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, Antenna 1, Antenna 2, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193 and display screen 194, etc.
  • The tablet computer may further include a mobile communication module 150, a subscriber identification module (SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the tablet computer.
  • the tablet computer may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can be the nerve center and command center of the tablet.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 is charging the battery 142 , it can also supply power to the tablet computer through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 can also receive input from the battery 142 to power the tablet computer.
  • the wireless communication function of the tablet computer can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the tablet can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the tablet computer.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the tablet computer, including wireless local area network (WLAN) (such as Wi-Fi networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the tablet computer is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the tablet computer can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the tablet computer realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on.
  • the tablet computer may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the tablet computer can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the tablet computer may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the tablet computer.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the tablet computer by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the tablet computer.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the tablet computer can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the tablet computer detects the intensity of the touch operation according to the pressure sensor 180A.
  • the tablet computer can also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • the gyroscope sensor 180B can be used to determine the motion attitude of the tablet computer.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the tablet can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the acceleration of the tablet computer in various directions (generally three axes).
  • Distance sensor 180F for measuring distance.
  • The tablet computer can use the proximity light sensor 180G to detect that the user is holding the tablet computer close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • The tablet computer can use the collected fingerprint characteristics to implement fingerprint unlocking, access the application lock, take photos with the fingerprint, answer incoming calls with the fingerprint, etc.
  • The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the tablet computer, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • Motor 191 can generate vibrating cues. The motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the tablet computer.
  • the tablet computer can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the tablet computer interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the tablet employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the tablet and cannot be separated from the tablet.
  • with reference to FIG. 2, the software structures of the first terminal 201 and the second terminal 202 are exemplarily described in this embodiment of the present application by taking the software system of the first terminal 201 being the Windows system and the software system of the second terminal 202 being the Android system as an example.
  • FIG. 4 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
  • the software architecture of the first terminal 201 may include: an application layer and a windows system (windows shell).
  • the application layer may include various applications installed on the first terminal 201 . Applications at the application layer can directly interact with the Windows system.
  • the application layer may further include a mouse and keyboard module, a transmission management module, a drag and drop management module and a window management module.
  • the software system of the second terminal 202 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, the software system of the second terminal 202 being a layered architecture is taken as an example.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the second terminal 202 may include an application layer and a framework layer (framework, FWK).
  • the application layer can include a series of application packages.
  • the application packages can include applications such as Settings, Calculator, Camera, SMS, and Music Player.
  • the application included in the application layer may be a system application of the second terminal 202 or a third-party application, which is not specifically limited in this embodiment of the present application.
  • the application layer may also include a transfer management module and a drag and drop management module.
  • the framework layer is mainly responsible for providing an application programming interface (API) and a programming framework for applications in the application layer.
  • the framework layer may include a window manager (or referred to as a window management service).
  • the second terminal 202 may also include other layers, such as a kernel layer (not shown in FIG. 4 ).
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain at least display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • based on the above software architecture and the keyboard and mouse sharing technology, without starting screen projection, the user can use the input device 201-1 of the first terminal 201 to transfer content such as files or text from the first terminal 201 to the second terminal 202 by dragging, and can likewise transfer content such as files or text from the second terminal 202 to the first terminal 201 by dragging. That is to say, on the premise of not starting screen projection, the user can use the input device 201-1 of the first terminal 201 to drag content such as files or text in applications bidirectionally between the first terminal 201 and the second terminal 202.
  • the keyboard and mouse sharing technology may refer to a technology of realizing control of other terminals by using an input device (such as a mouse, a touchpad) of one terminal.
  • the above drag management module may also be referred to as a drag service module.
  • both the first terminal 201 and the second terminal 202 include a transmission management module, and the communication between the first terminal 201 and the second terminal 202 can be implemented through the transmission management module.
  • the drag management module may also have the function of communicating with other terminals; that is to say, the first terminal 201 and the second terminal 202 may include no transmission management module, and the communication between them may be implemented through the drag management module, which is not specifically limited in this embodiment.
  • the communication between the first terminal 201 and the second terminal 202 is implemented by using a transmission management module as an example for description.
  • FIG. 5 is a schematic flowchart of a method for dragging and dropping objects across devices provided by an embodiment of the present application.
  • the method provided in this embodiment is described in detail by taking the user using the PC mouse to transfer the content in the PC (the content is the dragged object) to the tablet computer by dragging as an example.
  • the method may include the following S501-S510.
  • the tablet computer establishes a connection with the PC.
  • the connection between the tablet computer and the PC can be established in a wired manner.
  • a tablet computer and a PC can establish a wired connection through a data cable.
  • connection between the tablet computer and the PC can be established wirelessly.
  • the connection information may be a device identifier of the terminal, such as an internet protocol (IP) address, a port number, or an account logged in on the terminal.
  • the account logged in by the terminal may be an account provided by the operator for the user, such as a Huawei account.
  • the account logged in by the terminal may also be an application account, such as a WeChat account, a Youku account, and the like.
  • the transmission capability of the terminal may be near-field communication capability or long-distance communication capability. That is to say, the wireless communication protocol used to establish a connection between terminals, such as a tablet computer and a PC, may be a near field communication protocol such as a Wi-Fi protocol, a Bluetooth protocol, or an NFC protocol, or a cellular network protocol.
  • take a tablet computer and a PC establishing a connection wirelessly as an example.
  • the user can touch the NFC tag of the PC with the tablet computer, and the tablet computer reads the connection information stored in the NFC tag, for example, the connection information includes the IP address of the PC. After that, the tablet computer can establish a connection with the PC using the NFC protocol according to the IP address of the PC.
  • both the tablet computer and the PC have the Bluetooth function and the Wi-Fi function turned on.
  • the PC may broadcast a Bluetooth signal to discover surrounding terminals, for example, the PC may display a list of discovered devices, and the list of discovered devices may include the identifiers of the tablet computers discovered by the PC.
  • the PC can also exchange connection information, such as IP addresses, with the discovered devices.
  • the PC can establish a connection with the tablet computer by using the Wi-Fi protocol according to the IP address of the tablet computer.
  • both the tablet computer and the PC are connected to the cellular network, and the tablet computer and the PC log in to the same Huawei account. The tablet and PC can establish a connection based on the cellular network according to the Huawei account.
  • a user can use one set of input devices, such as the PC's mouse, to control both the PC and the tablet computer.
  • the PC may display a pop-up window for asking the user whether to enable the keyboard and mouse sharing mode. If an operation of selecting to enable the keyboard and mouse sharing mode is received from the user, the PC can enable the keyboard and mouse sharing mode.
  • after the PC has enabled the keyboard and mouse sharing mode, it can notify all terminals connected to itself that the keyboard and mouse sharing mode has been enabled. If a connection is established between the PC and the tablet computer, the PC notifies the tablet computer that the keyboard and mouse sharing mode is turned on (for example, the PC can send a notification message to the tablet computer indicating that the PC's keyboard and mouse sharing mode is turned on).
  • after receiving the notification, the tablet computer can create a virtual input device, which has the same function as a conventional input device such as a mouse or a touchpad and can be used by the tablet computer to simulate corresponding input events. For example, taking the input device being a mouse as an example, the virtual input device created by the tablet computer has the same function as a conventional mouse.
  • uinput is a kernel layer module that can simulate input devices.
  • a process can create a virtual input device with a specific function. Once the virtual input device is created, it can simulate corresponding events.
  • other terminals that have established a connection with the PC will also create virtual input devices according to the notification received.
  • if the operating system of the terminal receiving the notification is the Android system, the uinput capability of Linux can be used to create the virtual input device, or the human interface device (HID) protocol can be used to realize the creation of the virtual input device. If the operating system of the terminal receiving the notification is another operating system, such as the iOS system or the Windows system, the HID protocol can be used to realize the creation of the virtual input device.
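  • as an illustration of the uinput capability mentioned above, the following is a minimal sketch of how a virtual mouse might be created through /dev/uinput on a Linux-based system. The device name and the exact capability set are illustrative assumptions, and the uinput_setup interface assumes a reasonably recent kernel:

```cpp
// Minimal sketch: creating a virtual mouse via Linux uinput.
#include <cstring>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/uinput.h>

int create_virtual_mouse() {
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) return -1;

    // Declare the events the virtual device can emit: relative X/Y motion,
    // a scroll wheel, and the three mouse buttons.
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_KEYBIT, BTN_MIDDLE);
    ioctl(fd, UI_SET_KEYBIT, BTN_RIGHT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);
    ioctl(fd, UI_SET_RELBIT, REL_WHEEL);

    struct uinput_setup usetup;
    std::memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_VIRTUAL;
    std::strcpy(usetup.name, "virtual-shared-mouse");  // illustrative name

    ioctl(fd, UI_DEV_SETUP, &usetup);  // requires kernel >= 4.5
    ioctl(fd, UI_DEV_CREATE);          // the virtual device now exists
    return fd;  // keep open; input_event structs written here are injected
}
```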
  • the above-mentioned embodiment is described by taking as an example the terminal connected to the PC creating the virtual input device after receiving the notification indicating that the PC's keyboard and mouse sharing mode has been turned on.
  • after receiving the notification, the terminal may also display a pop-up window asking the user whether he wants to use the input device of the PC to control this device. If an operation of the user choosing to use the input device of the PC to control this device is received, the virtual input device is created; otherwise, the virtual input device is not created.
  • in some other embodiments, after a connection is established between another terminal, such as the tablet computer, and the PC, the PC automatically enables the keyboard and mouse sharing mode by default, and the user does not need to enable it manually.
  • the virtual input device can also be created automatically, without the need for the PC to send a notification.
  • a pop-up window may be displayed to ask the user whether he wants to use the input device of the PC to control the device. If it is received that the user chooses to use the input device of the PC to control the device, the virtual input device is automatically created, otherwise the virtual input device is not created.
  • since the mouse is the input device of the PC, after another terminal, such as the tablet computer, establishes a connection with the PC, under normal circumstances the PC temporarily continues to respond to the operation of the mouse; in other words, the user can temporarily use the mouse to control the PC.
  • after turning on the keyboard and mouse sharing mode, when the PC determines that the mouse shuttle condition is satisfied, it can also trigger another terminal that has established a connection with the PC and has created a virtual input device, such as the tablet computer, to respond to the operation of the mouse. That is to say, after the mouse shuttle condition is satisfied, the user can use the mouse to control another terminal that has established a connection with the PC and has created a virtual input device, such as the tablet computer.
  • the mouse shuttle condition may be that the mouse pointer corresponding to the mouse and displayed on the PC display screen slides over the edge of the PC display screen. That is to say, the user can move the mouse so that the mouse pointer displayed on the PC display screen slides over the edge of the PC display screen, so as to trigger the other terminal that has established a connection with the PC and has created a virtual input device to respond to the operation of the mouse.
  • the PC may enable input (input) monitoring, and mount a hook (HOOK).
  • Input monitoring can be used to monitor the relative displacement and coordinate position of the mouse pointer.
  • the mounted HOOK can be used to intercept the corresponding input event (or shield the corresponding input event) after the mouse shuttle starts.
  • taking the input device being a mouse as an example, the input event can be a mouse event; interception means the mouse event is handled by the keyboard and mouse module of the PC rather than being passed on to the Windows system of the PC.
  • the mounted HOOK can also be used to capture the operation parameters in intercepted input events, such as the parameters in mouse events, after the mouse shuttle starts.
  • the PC can use input monitoring to monitor the relative displacement and coordinate position of the mouse pointer, and determine whether the mouse shuttle condition is satisfied according to the monitored data. After determining that the mouse shuttle condition is satisfied, the mounted HOOK intercepts the mouse event, captures the operation parameters in the mouse event, and sends the captured operation parameters to the other terminal connected to the PC that has created a virtual input device, so that that terminal can use the created virtual input device to simulate the corresponding input events, such as mouse events, and respond accordingly. That is, the other terminal connected to the PC that has created the virtual input device responds to the operation of the mouse.
  • the interception of input events and the capture of operation parameters therein can also be implemented in other ways (such as registering RAWINPUT in the PC).
  • the interception of input events and the capture of the operation parameters therein can also be implemented in a combined way. For example, taking the mouse as the input device, after the keyboard and mouse sharing mode is turned on, the PC can mount the HOOK and register RAWINPUT. After the mouse shuttle starts, the mounted HOOK can be used to intercept mouse events (or shield them), and the registered RAWINPUT can be used to capture the operation parameters in the intercepted mouse events.
  • This embodiment does not limit the specific implementation of the interception of mouse events and the capture of operation parameters therein.
  • in the following embodiments, the implementation that intercepts input events and captures the operation parameters therein by mounting a HOOK is used as an example for introduction.
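  • as a concrete illustration of the HOOK-based interception described above, the following is a minimal sketch using a low-level mouse hook on the Windows side. The forwarding function is a hypothetical placeholder (not part of the Windows API), and returning a non-zero value from the hook procedure is what shields the event from the local system:

```cpp
// Minimal sketch: shielding mouse events with a mounted HOOK on Windows and
// forwarding the captured parameters to the peer terminal.
#include <windows.h>

// Hypothetical sender: packs the parameters and transmits them over the
// established connection (an assumption, not a real API).
void ForwardToPeer(WPARAM msg, LONG x, LONG y, DWORD mouseData);

static bool g_shuttleActive = false;  // set once the mouse shuttle starts

LRESULT CALLBACK MouseHookProc(int code, WPARAM wParam, LPARAM lParam) {
    if (code == HC_ACTION && g_shuttleActive) {
        const MSLLHOOKSTRUCT* m =
            reinterpret_cast<const MSLLHOOKSTRUCT*>(lParam);
        // wParam identifies the operation (WM_MOUSEMOVE, WM_LBUTTONDOWN,
        // WM_LBUTTONUP, WM_MOUSEWHEEL, ...); m->pt carries the coordinates.
        ForwardToPeer(wParam, m->pt.x, m->pt.y, m->mouseData);
        return 1;  // non-zero: swallow the event so the local system ignores it
    }
    return CallNextHookEx(nullptr, code, wParam, lParam);
}

HHOOK MountHook() {
    return SetWindowsHookExW(WH_MOUSE_LL, MouseHookProc,
                             GetModuleHandleW(nullptr), 0);
}
```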
  • the process of responding to the operation of the mouse may include the following S502-S506.
  • the PC receives the mouse operation of selecting the content.
  • the above-mentioned content may be text (or referred to as characters), a file, or a folder.
  • files can include files in one or more of the following formats: word documents, Excel workbooks, PowerPoint presentations, bitmaps, image files, plain text files, sound files, movie files, flash animation files, web page files, compressed files, etc.
  • there may be one or more items of selected content. For example, select two word documents; for another example, select a word document and an image file; for another example, select two folders.
  • the PC receives the mouse down event and the mouse movement event, initiates a drag event according to the mouse down event and the mouse movement event, and displays an animation of the shadow of the content moving with the mouse pointer on the display screen of the PC.
  • the mouse pointer described in this embodiment may also be called a cursor.
  • the cursor can be an image, it can be dynamic or static, and the cursor can be styled differently in different situations.
  • the mouse pointer is used as an example for description.
  • the mouse pointer of the PC may be the first cursor in this application.
  • the PC monitors the coordinate position of the mouse pointer on the PC display screen.
  • the PC intercepts the mouse movement event according to the coordinate position of the mouse pointer on the PC display screen and determines that the mouse pointer slides out of the edge of the PC display screen, and sends the mouse operation parameters included in the mouse movement event to the tablet computer.
  • the PC obtains the bitmap of the drag event content and the shadow, and sends the drag event content and the bitmap of the shadow to the tablet computer.
  • the drag event content is used by the terminal device that continues the drag, for example the tablet computer, to initiate the drag event.
  • the drag event content may include the text (text).
  • the drag event content may include a file path (such as a uniform resource identifier (uri)).
  • the drag data in this application may include the drag event content and the bitmap of the shadow, which can be used by the device that continues the drag, such as the tablet computer, to display an animation of the object moving with the mouse pointer on its display screen.
  • when the user wants to transfer content from the PC to another terminal connected to the PC that has created a virtual input device, such as the tablet computer, and to continue the drag on the tablet computer,
  • the user can use the input device of the PC to select the content to be dragged.
  • when the mouse pointer of the PC is displayed on the content, the user can input a drag operation, so that the PC initiates a drag for the corresponding object, that is, the content (e.g., the content selected in S502), according to the drag operation.
  • the drag operation may be an operation for instructing to initiate a drag event for the selected content.
  • the dragging operation may include one operation, or may include multiple operations.
  • for example, a drag operation includes two operations, namely a press operation and a move operation; taking the input device being a mouse as an example, the press operation may be a mouse press operation, and the move operation may be a mouse move operation.
  • the user can press and move the mouse (that is, use the PC's mouse to input a mouse press operation and a mouse move operation) to trigger the PC's Windows system to initiate a drag event for the content, so that the content (such as the shadow of the content) moves with the mouse pointer on the PC display screen.
  • the PC can determine whether the dragged content (eg, the shadow of the content) is dragged off the edge of the PC display screen.
  • a mouse-shuttle condition can be triggered when content, such as a shadow of the content, is dragged off the edge of the PC display.
  • the content being dragged out of the edge of the PC display screen may mean that a partial area of the content (such as the shadow of the content) is dragged out of (or overflows) the PC display screen (that is, the proportion of the area overflowing the display screen is greater than a predetermined ratio), or it may be determined by whether the mouse pointer slides out of the edge of the PC display screen, which is not specifically limited in this embodiment. The following takes judging whether the dragged content (such as the shadow of the content) is dragged out of the edge of the PC display screen specifically by judging whether the mouse pointer slides out of the edge of the PC display screen as an example.
  • the user can continuously move the mouse in the same direction so that the mouse pointer correspondingly displayed on the PC display screen slides (or slides out) the edge of the PC display screen, that is, the mouse shuttle condition is triggered.
  • the PC may determine the coordinate position of the mouse pointer on the PC display screen according to the initial position and relative displacement of the mouse pointer, so as to determine whether the mouse pointer slides out of the edge of the PC display screen.
  • the initial position of the mouse pointer may be the coordinate position of the mouse pointer on the PC display screen when the mouse starts to move, or the coordinate position of the mouse pointer on the PC display screen before the mouse starts to move.
  • the initial position of the mouse pointer may specifically refer to the coordinate position in a coordinate system in which the upper left corner of the PC display screen is the coordinate origin, the X axis points from the upper left corner to the right edge of the PC display screen, and the Y axis points from the upper left corner to the lower edge of the PC display screen.
  • the specific process for the PC to determine whether the mouse pointer slides out of the edge of the PC display screen may be as follows: referring to FIG. 6A, the PC may establish a coordinate system in which the initial coordinate position is the coordinate origin (position o shown in FIG. 6A), the X axis points from the coordinate origin o to the right edge of the PC display screen, and the Y axis points from the coordinate origin o to the upper edge of the PC display screen.
  • the PC can determine the coordinate values of each edge of the PC display screen in this coordinate system.
  • the coordinate values of each edge of the PC display screen in the coordinate system can be determined according to the resolution of the PC display screen and the initial position of the mouse pointer.
  • for example, the coordinate value of the right edge of the PC display screen on the X axis is x1, the coordinate value of the left edge on the X axis is -x2, the coordinate value of the upper edge on the Y axis is y1, and the coordinate value of the lower edge on the Y axis is -y2.
  • the PC can calculate the coordinate position (x, y) of the mouse pointer on the PC display screen after the mouse moves according to the relative displacement reported by the mouse. According to the coordinate position (x, y), the PC can determine whether the mouse pointer slides out of the edge of the PC display screen.
  • the coordinate value x of the mouse pointer on the X axis is greater than x1, it can be determined that the mouse pointer slides out of the right edge of the PC display screen. If the coordinate value x of the mouse pointer on the X axis is less than -x2, it can be determined that the mouse pointer slides out of the left edge of the PC display screen. If the coordinate value y of the mouse pointer on the Y axis is greater than y1, it can be determined that the mouse pointer slides out of the upper edge of the PC display screen. If the coordinate value y of the mouse pointer on the Y axis is less than -y2, it can be determined that the mouse pointer slides out of the lower edge of the PC display screen.
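  • the edge check described above can be sketched as follows; this is an illustrative implementation that accumulates the relative displacements reported by the mouse in the o-based coordinate system and compares the result with the edge coordinate values x1, -x2, y1 and -y2:

```cpp
// Illustrative shuttle-condition check: accumulate the relative
// displacements reported by the mouse from the origin o and test the
// pointer position against the four screen edges.
enum class Edge { None, Left, Right, Top, Bottom };

struct ShuttleDetector {
    double x = 0, y = 0;    // pointer position in the o-based coordinate frame
    double x1, x2, y1, y2;  // distances from o to the right/left/top/bottom edges

    Edge onMouseMove(double dx, double dy) {
        x += dx;
        y += dy;  // the Y axis points toward the upper edge in this frame
        if (x > x1)  return Edge::Right;   // slid out of the right edge
        if (x < -x2) return Edge::Left;    // slid out of the left edge
        if (y > y1)  return Edge::Top;     // slid out of the upper edge
        if (y < -y2) return Edge::Bottom;  // slid out of the lower edge
        return Edge::None;                 // still inside the display screen
    }
};
```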
  • after the mouse shuttle is triggered, the user can use the input device of the PC to control other terminals connected to the PC that have created a virtual input device. That is, after the mouse shuttle condition is triggered, the PC can send the data of the operations the user inputs using the PC's input device to the other terminal that has created the virtual input device. For example, if the user continues to move the mouse in the same direction, the PC can intercept the received movement events, such as mouse movement events, and transmit the operation parameters contained in them, such as mouse operation parameters, to the other terminal connected to the PC that has created the virtual input device, so that that terminal can realize the continuation of the drag event.
  • for example, the PC can transmit the corresponding operation parameters to the tablet computer, so that the tablet computer can implement the continuation of the drag event.
  • when the PC determines that the mouse shuttle condition is triggered, it may display a list option on the display screen of the PC; the list option includes the identifiers of the devices connected to the PC that have created virtual input devices (for example, including the identifier of the above-mentioned tablet computer).
  • the PC can determine the device that implements the continuation of the drag event according to the user's selection. If the user selects the identifier of the above-mentioned tablet computer, the PC can send the corresponding operation parameters to the tablet computer, so that the tablet computer can realize the continuation of the drag event.
  • the device connected to the PC may send a message indicating that the virtual input device is successfully created to the PC after completing the creation of the virtual input device.
  • the PC can obtain which of the devices connected to the PC has successfully created the virtual input device, and display the above list options based on this.
  • the shuttle relationship may be pre-configured. If there are multiple devices connected to the PC and some or all of them have created virtual input devices, the device that implements the continuation of the drag can be determined according to the pre-configured shuttle relationship.
  • for example, if the above-mentioned tablet computer is among the multiple devices connected to the PC, the tablet computer has created a virtual input device, and the pre-configured shuttle relationship is that the mouse shuttles to the tablet computer when the mouse pointer slides out from the left side (or left edge) of the PC display screen, then the device that implements the continuation of the drag is determined to be the tablet computer. Then, when the user presses and moves the mouse such that the mouse pointer slides over the left edge of the PC display screen, the PC can determine not only that the mouse shuttle has started, but also that the device implementing the continuation of the drag is the tablet computer. Of course, if only one device is connected to the PC and that device has created a virtual input device, whether that device implements the continuation of the drag can also be determined according to the pre-configured shuttle relationship.
  • for example, the pre-configured shuttle relationship is that the mouse shuttles to the tablet computer when the mouse pointer slides out from the left edge of the PC display screen. If the user presses and moves the mouse so that the mouse pointer slides over the right edge of the PC display screen, it can be determined that the mouse does not shuttle to the tablet computer.
  • the device that realizes the continuation of the drag can also be determined by identifying the position of the device. For example, positioning technologies such as Bluetooth, ultra-wideband (UWB), and ultrasound can be used to identify the locations of devices around the PC; if it is recognized that the device on the left side of the PC is the tablet computer, it can be determined that the device that realizes the continuation of the drag is the tablet computer.
  • the shuttle relationship may be configured in advance by the user through a configuration file, or a configuration interface for configuring the shuttle relationship may be provided for the user, and the user may configure the shuttle relationship in advance through the configuration interface.
  • the PC receives the user's operation of opening the configuration interface, and displays the configuration interface.
  • the configuration interface includes the identifier of the PC (such as the icon of the PC) and the identifier of the tablet computer (such as the icon of the tablet computer). The user can drag these two identifiers to configure the shuttle relationship.
  • if the user places the identifier of the tablet computer to the left of the identifier of the PC, the PC can determine that when the mouse pointer slides over the left edge of the PC display screen, the device that realizes the continuation of the drag is the tablet computer. If the user places the identifier of the tablet computer to the right of the identifier of the PC, the PC can determine that when the mouse pointer slides over the right edge of the PC display screen, the device that realizes the continuation of the drag is the tablet computer.
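  • a pre-configured shuttle relationship of this kind could be represented as a simple mapping from screen edge to target device, as in the following illustrative sketch (the device identifier is an assumption):

```cpp
// Illustrative pre-configured shuttle relationship: each screen edge maps to
// the connected device that continues the drag when the mouse pointer slides
// out over that edge.
#include <map>
#include <optional>
#include <string>

enum class Edge { Left, Right, Top, Bottom };

const std::map<Edge, std::string> kShuttleConfig = {
    { Edge::Left, "tablet-001" },  // the tablet sits to the left of the PC
};

std::optional<std::string> deviceForEdge(Edge e) {
    auto it = kShuttleConfig.find(e);
    if (it == kShuttleConfig.end()) return std::nullopt;  // no shuttle here
    return it->second;  // identifier of the device that continues the drag
}
```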
  • the shuttle relationship of each device can be configured in a pre-configured manner.
  • in the following embodiments, the determined device for realizing the continuation of the drag being the tablet computer is used as an example for description.
  • the above S501 can be executed before the mouse shuttle is triggered, or can be executed after the mouse shuttle is triggered.
  • the embodiments are not specifically limited herein.
  • the keyboard and mouse module of the PC may receive a corresponding operation, such as a mouse operation for the user to select the content.
  • the user can move the mouse so that the PC's mouse pointer is displayed on the content the user wants to select.
  • the keyboard and mouse module of the PC can receive a press event correspondingly (such as mouse down events) and movement events (such as mouse move events).
  • in response, the Windows system of the PC can initiate a drag event for the content, draw an animation (or drag animation) in which the shadow of the content moves with the mouse pointer, and display it on the display screen of the PC.
  • the user wants to drag and drop the picture 601 of the PC to the tablet computer, and continue to drag and drop in the tablet computer. The user selects the picture 601 using the mouse 602, then presses and moves the mouse 602.
  • the PC displays an animation of the shadow 606 of the picture 601 moving with the mouse pointer 604 on the display screen 603 of the PC; as shown in FIG. 6B, the drag track of the shadow 606 of the picture 601 moving with the mouse pointer 604 is shown as track 605.
  • the operation of selecting the content, such as the mouse operation of selecting the content in S502, is optional. For example, for a file or folder, the mouse operation of selecting the content may not be performed; when the mouse pointer is displayed on the file or folder, inputting the press operation and the move operation can initiate a drag event for the file or folder.
  • the PC will enable input monitoring and mount HOOK.
  • the mouse pointer will move on the PC display screen, and the keyboard and mouse module of the PC can use input monitoring to monitor the real-time coordinate position of the mouse pointer on the PC display screen.
  • when the mouse and keyboard module of the PC determines, according to the monitored real-time coordinate position of the mouse pointer on the PC display screen, that the mouse pointer slides out of the edge of the PC display screen, it can be determined that the above-mentioned mouse shuttle condition is satisfied. At this time, the mouse and keyboard module of the PC can determine that the mouse shuttle starts.
  • after determining that the mouse shuttle starts, the keyboard and mouse module of the PC can send shuttle state information to the tablet computer, through the transmission management module of the PC over the connection established with the tablet computer, to indicate that the mouse shuttle starts.
  • the tablet computer can simulate a mouse pointer and display the mouse pointer on the display screen of the tablet computer (the mouse pointer displayed on the tablet computer may be the second cursor in this application).
  • the mouse and keyboard module of the PC can also hide the mouse pointer displayed on the PC display.
  • objects that move with the mouse pointer, such as the shadow of the object, are also hidden. For example, in conjunction with FIG. 6B, after the mouse pointer 604 slides over the edge of the PC display screen 603, the PC hides the shadow 606 of the picture 601 and the mouse pointer 604 displayed on the PC's display screen 603. In addition, the tablet computer displays a mouse pointer on the tablet computer's display screen. This gives the user the visual effect of the mouse pointer moving from the PC to the tablet computer.
  • after the keyboard and mouse module of the PC determines that the mouse shuttle starts, if the user operates the mouse, the keyboard and mouse module of the PC can use the HOOK to intercept the received corresponding input events, such as mouse events, and capture the operation parameters in the intercepted mouse events, such as mouse operation parameters.
  • the mouse operation parameters may include: a mouse button flag bit (used to indicate which operation the user performed: press, lift, move, or scroll), coordinate information (when the user moves the mouse, used to indicate the X coordinate and Y coordinate of the mouse movement), wheel information (used to indicate the X-axis distance and Y-axis distance scrolled when the user operates the mouse wheel), and key information (used to indicate which of the left, middle, or right buttons the user operated).
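  • one possible in-memory representation of these mouse operation parameters is sketched below; the field names and types are assumptions for illustration, since the embodiment does not fix a wire format:

```cpp
// One possible in-memory form of the captured mouse operation parameters;
// field names and types are illustrative assumptions.
#include <cstdint>

enum class MouseAction : uint8_t { Press, Lift, Move, Scroll };
enum class MouseKey    : uint8_t { None, Left, Middle, Right };

struct MouseParams {
    MouseAction action;      // mouse button flag bit: press/lift/move/scroll
    int32_t x, y;            // coordinate information when the mouse moves
    int32_t wheelX, wheelY;  // wheel information: scrolled distance per axis
    MouseKey key;            // key information: which button was operated
};
```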
  • the keyboard and mouse module of the PC can also transmit the captured operation parameters, such as mouse operation parameters, to the tablet computer over the established connection through the transmission management module of the PC, so that the tablet computer can respond accordingly. For example, continuing with the example shown in FIG. 6B, after the user continues to move the mouse, the mouse and keyboard module of the PC can receive movement events, such as mouse movement events.
  • the keyboard and mouse module of the PC can use the HOOK to intercept (or, in other words, shield) the mouse movement event, so that the mouse movement event will not be sent to the Windows system of the PC and the PC will not respond to it.
  • the mouse and keyboard module of the PC can also use HOOK to capture the operation parameters of the intercepted mouse movement events, such as mouse operation parameters, and send the captured mouse operation parameters to the tablet computer through the established connection through the transmission management module of the PC.
  • the corresponding mouse operation parameters may be: a mouse button flag bit used to indicate that the user moved the mouse, coordinate information used to indicate the X coordinate and Y coordinate of the mouse movement, wheel information (null value), and key information (null value).
  • the PC (e.g., the drag management module of the PC) can identify the current drag state of the PC (i.e., whether a drag is being performed). If the PC is currently dragging, it can initiate the continuation of the drag event, or initiate a cross-screen drag. On the Windows side, since a drag event needs to be initiated from a window, it also needs to be received by a window; therefore, after determining that the mouse shuttle has started, the PC can display an invisible window.
  • the mouse and keyboard module of the PC may send a callback instruction for the start of the mouse shuttle to the drag management module of the PC.
  • the drag management module of the PC may send a request for instructing to create an invisible window to the window management module of the PC according to the callback instruction.
  • the window management module of the PC can create and display invisible windows according to the request. For example, the window management module of the PC can display the invisible window at the edge of the PC display screen.
  • the transparency of the invisible window is greater than the threshold, for example, the transparency of the invisible window is very high, or completely transparent.
  • the invisible window can receive a drag event from the Windows system. If no drag is in progress when the mouse shuttle occurs, that is, when the user has not selected anything and merely moves the mouse, the invisible window will not receive a drag event.
  • the window management module of the PC can obtain the content of the drag event from the drag event received by the invisible window. For example, the window management module of the PC can capture the drag event content from the drag event through the DragEnter event.
  • after the window management module of the PC obtains the drag event content, the drag event content can be sent to the tablet computer through the transmission management module of the PC over the connection established with the tablet computer.
  • the PC may further serialize the drag event content before sending it to the tablet computer, that is, the drag event content sent by the PC to the tablet computer may be data obtained after serialization.
  • FIG. 7 is a schematic diagram of a data structure of a drag event on the Windows side provided by an embodiment of the present application.
  • the invisible window will receive the data object corresponding to the drag event, such as IDataObject.
  • the PC's window management module can attach it to the COleDataObject.
  • the drag event content in the IDataObject corresponding to the drag event is obtained through the DragEnter event, such as calling the GetData function.
  • the content of the drag event required to implement the continuation of the drag event may include text or a file path, or the like.
  • the window management module of the PC can obtain the text in the IDataObject through the GetData(CF_UNICODETEXT) function.
  • the window management module of the PC can obtain the file path in the IDataObject through the GetData(CF_HDROP) function. Once the text or file path is obtained, it can be serialized and sent to the tablet.
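  • the extraction just described can be sketched as follows on the Windows side, using the GetData calls named above; error handling is trimmed and the helper names are illustrative:

```cpp
// Sketch: pulling the drag event content out of the IDataObject received by
// the invisible window, assuming a drop of Unicode text or of file paths.
#include <windows.h>
#include <shellapi.h>
#include <string>
#include <vector>

std::wstring GetDraggedText(IDataObject* data) {
    FORMATETC fmt{ CF_UNICODETEXT, nullptr, DVASPECT_CONTENT, -1, TYMED_HGLOBAL };
    STGMEDIUM stg{};
    std::wstring text;
    if (SUCCEEDED(data->GetData(&fmt, &stg))) {
        text = static_cast<wchar_t*>(GlobalLock(stg.hGlobal));
        GlobalUnlock(stg.hGlobal);
        ReleaseStgMedium(&stg);
    }
    return text;  // serialized and sent to the tablet afterwards
}

std::vector<std::wstring> GetDraggedPaths(IDataObject* data) {
    FORMATETC fmt{ CF_HDROP, nullptr, DVASPECT_CONTENT, -1, TYMED_HGLOBAL };
    STGMEDIUM stg{};
    std::vector<std::wstring> paths;
    if (SUCCEEDED(data->GetData(&fmt, &stg))) {
        HDROP drop = static_cast<HDROP>(stg.hGlobal);
        UINT n = DragQueryFileW(drop, 0xFFFFFFFF, nullptr, 0);  // file count
        for (UINT i = 0; i < n; ++i) {
            wchar_t buf[MAX_PATH];
            DragQueryFileW(drop, i, buf, MAX_PATH);
            paths.emplace_back(buf);
        }
        ReleaseStgMedium(&stg);
    }
    return paths;
}
```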
  • in order to display the dragged object on the tablet computer, such as the animation of the shadow of the object moving with the mouse pointer, and since the shadow and the bitmap can be converted into each other, the PC also needs to obtain the bitmap of the shadow displayed on the PC.
  • the PC can obtain the bitmap of the shadow by intercepting the image of the dragged content displayed on the PC display screen.
  • the PC can generate a bitmap of the shadow according to the dragged text.
  • the PC can locate the dragged content according to the obtained file path of the dragged content to determine the type of the dragged content (such as an image file), and then use the default material corresponding to that type as the bitmap of the shadow, or obtain a thumbnail of the content as the bitmap of the shadow according to the obtained file path of the dragged content.
  • the bitmap of the shadow can be sent to the tablet computer through the transmission management module of the PC over the established connection with the tablet computer.
  • the PC can also serialize the shadow bitmap before sending it to the tablet, that is, the shadow bitmap sent by the PC to the tablet can be data obtained after serialization.
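  • one possible serialization of the drag event content and the shadow bitmap is sketched below; the layout (a kind tag, a length-prefixed content string, then the bitmap bytes) is an assumption, as the embodiment only states that the data is serialized before transmission:

```cpp
// Illustrative serialization of the drag payload: a kind tag, a
// length-prefixed content string (text or file path), then the raw bytes of
// the shadow bitmap. Endianness handling is omitted for brevity.
#include <cstdint>
#include <string>
#include <vector>

enum class PayloadKind : uint8_t { Text = 0, FilePath = 1 };

std::vector<uint8_t> serializeDragPayload(PayloadKind kind,
                                          const std::string& content,
                                          const std::vector<uint8_t>& shadowBmp) {
    std::vector<uint8_t> out;
    out.push_back(static_cast<uint8_t>(kind));
    const uint32_t len = static_cast<uint32_t>(content.size());
    const uint8_t* lenBytes = reinterpret_cast<const uint8_t*>(&len);
    out.insert(out.end(), lenBytes, lenBytes + sizeof(len));
    out.insert(out.end(), content.begin(), content.end());
    out.insert(out.end(), shadowBmp.begin(), shadowBmp.end());
    return out;  // the receiver deserializes in the reverse order
}
```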
  • the tablet computer receives the drag event content and the bitmap of the shadow, and initiates a drag event according to the drag event content and the bitmap of the shadow.
  • the tablet computer receives the mouse operation parameters, and simulates a mouse movement event according to the mouse operation parameters.
  • the tablet computer generates a mouse press event.
  • after the tablet computer receives the drag event content and the bitmap of the shadow, it can parse them and initiate a drag event.
  • specifically, the tablet computer can start a transparent activity, or in other words an invisible activity.
  • the invisible activity has a view control with a transparency greater than the threshold.
  • then the Android open source project (AOSP) interface is called, and the tablet computer can initiate the corresponding drag event according to the received drag event content and the bitmap of the shadow, so as to realize the continuation of the drag event on the tablet computer.
  • for example, the transmission management module of the tablet computer can transmit the drag event content and the bitmap of the shadow to the drag management module of the tablet computer. The drag management module of the tablet computer parses the received drag event content and obtains the text or file path from the PC. According to the obtained text or file path, the drag management module of the tablet computer can construct the content data (clipData) of the drag event.
  • the drag management module of the tablet computer can also generate a corresponding shadow according to the received bitmap of the shadow.
  • the drag management module of the tablet can use the view control of the transparent activity opened by the tablet, call the startDragAndDrop method provided by the AOSP interface, and use the clipData and the shadow as input parameters to initiate the drag event on the tablet.
  • if the drag event content and the bitmap of the shadow were serialized on the PC before being sent to the tablet computer, then after the tablet computer receives the corresponding data, it can perform deserialization to obtain the drag event content and the bitmap of the shadow.
  • FIG. 8 is a schematic diagram of a data structure of a drag event (DragEvent) on the Android side provided by an embodiment of the present application.
  • the drag management module of the tablet computer can construct the content data (clipData) according to the text (text) or file path (uri) received from the PC (the text or file path being included in the content item), generate the corresponding shadow according to the received bitmap of the shadow, and then call the startDragAndDrop method of the AOSP to initiate the drag event, combining the clipData and the shadow with other parameters obtained according to the user's operation information on the mouse (such as the received mouse movement event), for example the action (which can include start, enter, hover, release, leave, end, etc.), the current x coordinate, the current y coordinate, the local state (localState), and the content description (clipDescription), etc.
  • the content description includes a label.
  • the label is used to indicate whether the drag event is initiated by the tablet's drag management module or an application in the tablet.
  • the label is a string (String).
  • the label is "windowscast”
  • it is used to indicate that the drag event is initiated by the tablet's drag management module
  • the label is not “windowscast”
  • it is used to indicate that the drag event is not initiated by the tablet's drag management module Instead, it is initiated by the tablet's app.
  • in this embodiment, the label in the content description of the drag event is "windowscast".
  • the specific descriptions and construction rules of other parameters are similar to the corresponding implementation of generating the original drag event on the Android side in the prior art, and will not be repeated here.
  • the execution of the drag event may be triggered by a drag operation, and the drag operation may include a press operation (eg, a mouse press operation) and a move operation (eg, a mouse move operation).
  • as described above, the keyboard and mouse module of the PC intercepts the received movement events, such as mouse movement events, captures the operation parameters contained in them, such as mouse operation parameters, and sends them through the transmission management module of the PC to the transmission management module of the tablet computer.
  • the transmission management module of the tablet computer can receive the mouse operation parameters. Because the operating systems of the PC and the tablet computer are different, the key codes of the mouse operation parameters in their mouse events are different. Therefore, after receiving the mouse operation parameters, the tablet computer can convert the received key codes of the mouse operation parameters into key codes that the tablet computer can recognize according to a preset mapping relationship.
  • in this way, the tablet computer (such as the keyboard and mouse module of the tablet computer) can use the created virtual input device to simulate, according to the mouse operation parameters after key code conversion, input events that the tablet computer can recognize, such as movement events (e.g., mouse movement events).
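  • the preset mapping relationship mentioned above could look like the following illustrative table, translating Windows-side mouse message codes into the Linux input codes understood by the virtual input device; the exact table is an assumption:

```cpp
// Illustrative preset mapping: Windows-side mouse message codes carried in
// the forwarded parameters are translated into the Linux input codes that
// the created virtual input device understands.
#include <linux/input-event-codes.h>
#include <map>

// Windows message values are written numerically (WM_LBUTTONDOWN = 0x0201,
// WM_RBUTTONDOWN = 0x0204, WM_MBUTTONDOWN = 0x0207), since <windows.h> is
// not available on the tablet side.
const std::map<unsigned int, unsigned int> kKeyCodeMap = {
    { 0x0201, BTN_LEFT   },  // WM_LBUTTONDOWN -> left button
    { 0x0204, BTN_RIGHT  },  // WM_RBUTTONDOWN -> right button
    { 0x0207, BTN_MIDDLE },  // WM_MBUTTONDOWN -> middle button
};
```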
  • the keyboard and mouse module of the tablet can also send mouse movement events to the framework layer of the tablet.
  • the shuttle initiator, that is, the PC, is in the dragging state, that is, in the mouse-pressed state, while the shuttle target, that is, the tablet computer, has only received a mouse movement event and is therefore not in the mouse-pressed state. Therefore, the tablet computer needs to generate a press event, such as a mouse press event.
  • the tablet computer can receive a drag start callback (onDragStart).
  • the tablet computer may determine whether the drag event is initiated by the drag management module of the tablet computer according to the label in the drag event initiated by the tablet computer.
  • if the label indicates that the drag event is initiated by the drag management module of the tablet computer, the tablet computer can use the created virtual input device to generate (or simulate) a press event, such as a mouse press event; for example, the drag management module of the tablet computer controls the keyboard and mouse module of the tablet computer to send the mouse press event to the framework layer of the tablet computer by using the created virtual input device.
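  • the generation of the press event through the created virtual input device might look like the following sketch, which writes input_event records to the uinput file descriptor obtained when the virtual device was created (see the earlier sketch); this is illustrative, not the embodiment's mandated implementation:

```cpp
// Illustrative simulation of the missing press event: write input_event
// records to the uinput file descriptor of the created virtual input device.
#include <cstring>
#include <unistd.h>
#include <linux/uinput.h>

void emit(int fd, unsigned short type, unsigned short code, int value) {
    struct input_event ev;
    std::memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

void simulateLeftPress(int fd) {
    emit(fd, EV_KEY, BTN_LEFT, 1);    // button down
    emit(fd, EV_SYN, SYN_REPORT, 0);  // flush the event frame
}
```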
  • in this way, the drag event initiated in S507 can be attached to the mouse pointer displayed on the display screen of the tablet computer.
  • the tablet computer executes the drag event according to the mouse movement event and the mouse press event, and displays an animation of the shadow of the content moving with the mouse pointer on the display screen of the tablet computer.
  • the tablet computer may execute the drag event initiated in S507 according to a movement event, such as a mouse movement event and a press event, such as a mouse press event.
  • the tablet computer may also display the content on the display screen of the tablet computer, such as an animation of the shadow of the content moving with the mouse pointer (the mouse pointer may be the second cursor in this application).
  • for example, the tablet computer correspondingly displays an animation of the shadow 903 of the picture moving with the mouse pointer 904 on the display screen 902 of the tablet computer, as shown in FIG. 9.
  • the drag track of the shadow 903 moving with the mouse pointer 904 is shown as track 905 .
  • on the tablet computer side, the drag event is attached to the mouse pointer displayed on the tablet computer's display screen, so the user can use the mouse pointer to precisely select the mouse release point.
  • a user who wants to use or process the content on the tablet computer can release the mouse after moving the mouse pointer onto the view control of the tablet computer in which the content is to be used or processed.
  • the keyboard and mouse module of the PC can receive a corresponding lift event, such as a mouse lift event.
  • the keyboard and mouse module of the PC uses the HOOK to intercept (or, in other words, shield) the mouse lift event, so that the mouse lift event will not be sent to the Windows system of the PC and the PC will not respond to it.
  • the mouse and keyboard module of the PC can also use HOOK to capture the operation parameters contained in the mouse up event, such as the mouse operation parameters, and send the captured mouse operation parameters to the tablet computer through the established connection through the transmission management module of the PC.
  • the mouse operation parameters of the mouse lift event may include: a mouse button flag bit used to indicate that the user lifted the mouse, coordinate information (null value), wheel information (null value), and key information indicating that the user operated the left mouse button.
  • correspondingly, the transmission management module of the tablet computer can receive the mouse operation parameters. After receiving them, the tablet computer can convert the received key codes of the mouse operation parameters into key codes that the tablet computer can recognize according to the preset mapping relationship. After that, using the created virtual input device, the tablet computer can simulate the corresponding input event that it can recognize according to the operation parameters after key code conversion, such as the mouse lift event.
  • the tablet computer can determine the mouse release point according to the current coordinate position of the mouse pointer. For example, after the tablet learns that the mouse and keyboard shuttle has started, the tablet can register a listener for the position of the mouse pointer coordinates. In this way, the tablet computer can monitor the coordinate position of the mouse pointer on the display screen of the tablet computer in real time through the listener. Based on this, after the tablet computer receives the mouse up event, using the listener, the tablet computer can obtain the current coordinate position of the mouse pointer on the display screen of the tablet computer. According to the obtained coordinate position, the tablet can determine the mouse release point.
  • in another example, after the mouse shuttle starts, the tablet computer monitors input events. For example, while the user continues the drag on the tablet computer, the tablet computer can monitor movement events, such as mouse movement events, and obtain from them the operation parameters of the mouse movement event, such as mouse operation parameters (for example, extracting the parameters from the MotionEvent). The mouse operation parameters include coordinate information indicating the mouse position. After that, when the user lifts (releases) the mouse, the tablet computer monitors the lift event, such as the mouse lift event, and can determine the coordinate position of the mouse pointer according to the previously obtained coordinate information, thereby determining the mouse release point.
  • the tablet computer can also respond accordingly to lift events, such as mouse lift events.
  • for example, the drag management module of the tablet computer can send the content data in the above drag event to the view control at the mouse release point; the text is included in the content data.
  • after the view control receives the content data, it can perform corresponding processing according to the text in the content data, such as displaying the text in the view control.
  • take the content in S502 being a file as an example. During the process of dragging the file on the tablet computer, the file is not actually transferred to the tablet computer.
  • the PC can transmit the file to the tablet computer.
  • after the tablet computer receives the file, it can store the file in a predetermined cache directory.
  • the drag management module of the tablet computer can obtain the uri of the file (such as uri 1); uri 1 is the path of the file under the cache directory, and is different from the uri in the drag event content sent by the PC to the tablet computer in S507 (the uri sent by the PC is the storage path of the file on the PC).
  • the drag management module of the tablet computer can construct new content data according to uri 1, and send the content data to the view control at the mouse release point as a response to the mouse lift event. After the view control receives the content data, it can perform corresponding processing; if the view control is a view control in a memo, the file can be displayed, and if the view control is the input box in a chat window, the file can be sent out.
  • when the tablet computer has established a connection with another device, such as a mobile phone, if the user wants to continue dragging the content to the mobile phone, he can continue to move the mouse so that the mouse pointer on the tablet computer slides over the edge of the tablet computer's display screen, triggering the mouse to shuttle from the tablet computer to the mobile phone, so that the drag event continues on the mobile phone.
  • the PC can also directly interact with the mobile phone, so that the mobile phone can realize the continuation of the drag event.
  • the specific description of realizing the continuation of the drag event on the mobile phone is similar to the specific description of the tablet computer realizing the continuation of the drag event, and will not be repeated here.
  • by using the method provided in this embodiment, without starting screen projection, the user can use an input device, such as the PC's mouse, to drag text or files in an application on the PC to the edge of the PC's screen and continue moving in the same direction to trigger the mouse shuttle. After the mouse shuttle starts, the mouse pointer appears on the tablet computer. The PC sends the drag event content to the tablet computer, so that the drag event can continue attached to the mouse pointer of the tablet computer, realizing the continuation of the drag event on the tablet computer and giving the user the visual effect of the content being dragged from the PC to the tablet computer.
  • the user may not only need to transfer content from the PC to the tablet computer by dragging and continue the drag on the tablet computer, but may also need to transfer content from the tablet computer to the PC by dragging.
  • for example, the user transfers the picture 1 on the PC to the tablet computer by dragging, and after continuing the drag on the tablet computer, releases the mouse. After that, the user edits and saves the picture 1 by using the stylus of the tablet computer.
  • then, the user wants to transfer the edited picture 1 to the PC by dragging, continue the drag on the PC, and release the mouse so as to save the edited picture 1 on the PC side.
  • FIG. 10 is a schematic flowchart of another method for dragging and dropping objects across devices provided by an embodiment of the present application.
  • taking the user wanting to transfer content from the tablet computer to the PC by dragging, and continuing the drag on the PC, as an example, the method provided by this embodiment is introduced.
  • the method may include the following S1001-S1010.
  • the tablet computer receives, from the PC, the mouse operation of selecting the content.
  • the tablet computer receives the mouse operation parameters of the mouse press event and the mouse movement event from the PC, simulates the mouse press event and the mouse movement event according to the mouse operation parameters, initiates a drag event according to the mouse press event and the mouse movement event, and displays an animation of the shadow of the content moving with the mouse pointer on the tablet computer's display screen.
  • after the mouse shuttle starts, when the user operates the input device of the PC, such as the mouse, the PC uses the mounted HOOK to intercept the corresponding input events, such as mouse events, captures the operation parameters contained in the intercepted mouse events, such as mouse operation parameters, and sends them to the tablet computer.
  • in this embodiment, the user wanting to transfer content from the tablet computer to the PC by dragging, and continuing the drag on the PC, is taken as an example.
  • the user uses an input device of a PC, such as a mouse, to select the content to be dragged.
  • the keyboard and mouse module of the PC can receive corresponding input operations, such as mouse operations.
  • the keyboard and mouse module of the PC will use HOOK to intercept the corresponding input events received, such as mouse events, so that the input events will not be sent to the Windows system of the PC, so that the PC will not respond to the received input events.
  • The PC's keyboard and mouse module also uses the HOOK to capture the operation parameters (such as the mouse operation parameters) in the intercepted input events, and transmits them to the tablet computer over the established connection via the PC's transmission management module.
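The interception and capture described above can be pictured with a small sketch. The patent does not disclose its actual implementation; the following uses the standard Win32 low-level mouse hook (SetWindowsHookEx with WH_MOUSE_LL), and sendToTablet is a hypothetical stand-in for the PC's transmission management module.

```cpp
// Minimal sketch: intercepting mouse events on the Windows side with a
// low-level mouse hook. Returning a nonzero value from the hook procedure
// swallows the event, so the rest of the system never sees it.
#include <windows.h>

static HHOOK g_mouseHook = nullptr;
static bool  g_shuttleActive = false;  // set to true once the mouse shuttle starts

// Hypothetical transport helper standing in for the transmission management module.
void sendToTablet(WPARAM msg, const MSLLHOOKSTRUCT& info);

LRESULT CALLBACK MouseHookProc(int code, WPARAM wParam, LPARAM lParam) {
    if (code == HC_ACTION && g_shuttleActive) {
        const MSLLHOOKSTRUCT* info = reinterpret_cast<MSLLHOOKSTRUCT*>(lParam);
        // Capture the operation parameters (message type, coordinates, button or
        // wheel data) and hand them to the transmission management module.
        sendToTablet(wParam, *info);
        return 1;  // swallow the event: the PC's Windows system will not respond to it
    }
    return CallNextHookEx(g_mouseHook, code, wParam, lParam);
}

void mountHook() {
    g_mouseHook = SetWindowsHookExW(WH_MOUSE_LL, MouseHookProc,
                                    GetModuleHandleW(nullptr), 0);
}

void unmountHook() {  // corresponds to "uninstalling the HOOK" when the shuttle ends
    if (g_mouseHook) { UnhookWindowsHookEx(g_mouseHook); g_mouseHook = nullptr; }
}
```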
  • Correspondingly, the transmission management module of the tablet computer receives the operation parameters, such as the mouse operation parameters. After the tablet computer performs key code conversion on the received operation parameters, it can use the created virtual input device to simulate the corresponding input events, such as mouse events.
  • the drag operation may be an operation for instructing to initiate a drag event for the selected content.
  • the dragging operation may include one operation, or may include multiple operations.
  • a drag operation includes two operations, namely a press operation and a move operation.
  • the pressing operation may be a mouse pressing operation
  • the moving operation may be a mouse moving operation.
  • the mouse and keyboard module of the PC can correspondingly receive a press event (such as a mouse press event) and a movement event (such as a mouse move event).
  • The PC's keyboard and mouse module can use the HOOK to intercept the received mouse press events and mouse movement events, and to capture the operation parameters (such as the mouse operation parameters) of the intercepted events, which are then transmitted to the tablet computer over the established connection via the PC's transmission management module.
  • the transmission management module of the tablet computer can receive corresponding operation parameters, such as mouse operation parameters.
  • After the tablet computer performs key code conversion on the received operation parameters, it can use the created virtual input device to simulate the corresponding input events, such as a press event (e.g., a mouse press event) and a movement event (e.g., a mouse move event).
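The embodiments do not specify a wire format for these operation parameters. As a rough illustration, a record like the following would be enough for the tablet computer to re-synthesize an equivalent press or movement event after key code conversion; all names and field choices here are assumptions.

```cpp
// Sketch of an operation-parameter record as it might travel from the PC's
// keyboard and mouse module to the tablet computer's transmission management
// module. Only the event type and the mouse parameters need to survive the
// trip so the receiver can replay an equivalent input event through the
// created virtual input device.
#include <cstdint>

enum class MouseEventType : uint8_t {
    Press   = 1,  // e.g. left button down
    Release = 2,
    Move    = 3,
    Wheel   = 4,
};

struct MouseOpParams {
    MouseEventType type;
    int32_t dx;          // relative displacement reported by the mouse
    int32_t dy;
    int32_t wheelDelta;  // meaningful only for Wheel events
    uint8_t button;      // meaningful only for Press/Release events
};
```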
  • After the tablet computer simulates the press event (e.g., a mouse press event) and the movement event (e.g., a mouse move event), in response to these events the tablet computer can use the corresponding application to initiate a drag event and draw the content. For example, if the selected content is a word document, the application is an Office application; if the selected content is a picture in the file manager, the application is the file manager; if the selected content is a piece of text in a memo, the application is the memo. An animation of the content's shadow moving with the mouse pointer is then shown on the tablet computer's display screen. For example, the content is the picture 1 that was dragged from the PC and then edited and saved by the user with the tablet computer's stylus.
  • Suppose the user wants to drag the edited picture to the PC and continue the drag on the PC.
  • The user can use the mouse 1101 to select the edited picture 1102 and then press and move the mouse 1101. The tablet computer correspondingly displays, on the display screen 1103 of the tablet computer, an animation of the shadow 1106 of the edited picture 1102 moving with the mouse pointer 1104. The drag track along which the shadow 1106 follows the mouse pointer 1104 is shown as track 1105.
  • The above description takes, as an example, the case where the events that trigger the drag include a press event and a movement event.
  • the user can trigger a drag by pressing and moving the mouse.
  • the drag-triggering event may only include a press event.
  • For example, the user can trigger the drag by long-pressing the mouse button, or the user can trigger the drag by pressing the mouse button; this embodiment does not specifically limit this.
  • the tablet computer obtains the bitmap of the drag event content and the shadow, and caches the obtained bitmap of the drag event content and the shadow.
  • The drag event content is used by the drag continuation end device to construct the drag event.
  • the drag event content may include text.
  • The drag event content may include a file path (e.g., a uri).
  • the framework layer can use program instrumentation to extract the content of the drag event from it.
  • the location and content to be instrumented can be determined in advance, and then program instrumentation can be performed according to the determined location and content, so that the drag event content can be extracted.
  • the framework layer of the tablet computer can call back the extracted drag event content to the drag management module of the tablet computer application layer.
  • After the drag management module of the tablet computer obtains the drag event content, it can cache the content.
  • the content of the drag event required to implement the continuation of the drag event may include text or a file path or the like.
  • the framework layer can extract the text or file path by means of program instrumentation.
  • It should be noted that, on the tablet computer, a drag event may be initiated in two ways: one is initiated by the tablet computer's drag management module (that is, the drag event shuttles from the PC to the tablet computer, as described in the embodiment shown in FIG. 5); the other is initiated by an application of the tablet computer.
  • For this purpose, the content description of the tablet computer's drag event includes a label, which is used to indicate whether the drag event was initiated by the tablet computer's drag management module or by an application of the tablet computer.
  • In this way, the tablet computer can distinguish, according to the label in the content description of the drag event, whether the drag event was initiated by an application of the tablet computer or by the tablet computer's drag management module. For example, when the label is not "windowscast", it can be determined that the drag event was not initiated by the tablet computer's drag management module but by an application of the tablet computer.
  • In this case, the framework layer needs to obtain the drag event content and send it to the tablet computer's drag management module, so that the drag management module can cache the drag event content.
  • In addition, the framework layer may also acquire the bitmap of the shadow by adding a new interface or by using an original interface (e.g., an interface that calls back clipData).
  • the obtained bitmap of the shadow can also be called back to the drag management module of the tablet application layer. After the bitmap of the shadow is obtained by the drag management module of the tablet, it can also be cached.
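Putting the pieces above together, the cached drag data amounts to the drag event content (text or a file path), the shadow bitmap, and the initiator label. A minimal sketch follows, with all names assumed for illustration.

```cpp
// Sketch of the drag data cached on the tablet computer. Per the embodiment,
// the drag event content is either the dragged text or a file path (uri),
// accompanied by a bitmap of the shadow; the encoding and field names here
// are assumptions.
#include <cstdint>
#include <string>
#include <vector>

struct DragData {
    bool isFile = false;                // false: 'payload' holds text; true: a file path (uri)
    std::string payload;                // drag event content used to rebuild the drag event
    std::vector<uint8_t> shadowBitmap;  // encoded bitmap of the drag shadow
    bool initiatedByApp = false;        // label distinguishing app-initiated drags from
                                        // drags initiated by the drag management module
};
```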
  • the tablet computer monitors the coordinate position of the mouse pointer on the display screen of the tablet computer.
  • According to the coordinate position of the mouse pointer on the tablet computer's display screen, when determining that the mouse pointer slides out of the edge of the tablet computer's display screen, the tablet computer sends shuttle state information indicating the end of the mouse shuttle to the PC.
  • the PC receives the shuttle state information for indicating the end of the mouse shuttle.
  • the tablet computer can determine whether the dragged content (eg, the shadow of the content) is dragged off the edge of the display screen of the tablet computer.
  • The specific description of the content being dragged out of the edge of the tablet computer's display screen is similar to that of the content being dragged out of the edge of the PC's display screen, and details are not repeated here.
  • In the following description, judging whether the dragged content (e.g., the content's shadow) is dragged out of the edge of the tablet computer's display screen is specifically taken to be judging whether the mouse pointer slides out of the edge of the tablet computer's display screen.
  • the mouse pointer moves on the display screen of the tablet computer, and the tablet computer can monitor the real-time coordinate position of the mouse pointer on the display screen of the tablet computer.
  • Exemplarily, the tablet computer can register a listener for the coordinate position of the mouse pointer after the keyboard and mouse shuttle begins. In this way, the tablet computer can monitor the coordinate position of the mouse pointer on its display screen in real time through the listener, and can determine, according to the real-time coordinate position detected by the listener, whether the mouse pointer slides over the edge of the tablet computer's display screen.
  • the tablet computer may determine the coordinate position of the mouse pointer on the tablet computer display screen according to the initial position and relative displacement of the mouse pointer, so as to determine whether the mouse pointer slides out of the edge of the tablet computer display screen.
  • the initial position of the mouse pointer may be the coordinate position of the mouse pointer on the display screen of the tablet computer when the mouse starts to move, or the coordinate position of the mouse pointer on the display screen of the tablet computer before the mouse starts to move.
  • The initial position of the mouse pointer may specifically refer to a coordinate position in a coordinate system whose origin is the upper left corner of the tablet computer's display screen, whose X axis points from the upper left corner to the right edge of the display screen, and whose Y axis points from the upper left corner to the lower edge of the display screen.
  • the specific implementation of determining that the mouse pointer slides out of the edge of the display screen of the tablet computer by the tablet computer is similar to the specific implementation of determining that the mouse pointer slides out of the edge of the display screen of the PC by the PC, and will not be described in detail here.
  • After the tablet computer determines that the mouse pointer has slid over the edge of the tablet computer's display screen, this indicates that the user wants to use the mouse to control another device. As described in S506, if the tablet computer has established a connection with only one device, the PC, it means that the user wants to use the mouse to control the PC. If the tablet computer is connected to multiple devices, the tablet computer can display a list option that includes the identifiers of all devices connected to the tablet computer, allowing the user to select the device they want to control with the mouse; if the user selects the identifier of the PC, it indicates that the user wants to use the mouse to control the PC.
  • Alternatively, the shuttle relationship can be pre-configured in the tablet computer to determine to which device the mouse shuttles, that is, to determine which device responds to the operations of the mouse.
  • The configuration and application of the shuttle relationship are similar to the descriptions in the above embodiments, and are not described in detail here.
  • the PC determines that the mouse shuttle ends according to the received shuttle state information.
  • the PC obtains the bitmap of the drag event content and the shadow from the tablet computer, and initiates a drag event according to the drag event content and the bitmap of the shadow.
  • That is, after the PC receives the shuttle state information indicating the end of the mouse shuttle, it can determine that the mouse shuttle has ended.
  • the PC may display the mouse pointer on the PC display.
  • During the mouse shuttle, the PC hides the mouse pointer on the PC display screen. Therefore, the PC can redisplay the mouse pointer on the PC display screen after determining that the mouse shuttle has ended.
  • As described above, the trigger condition for the start of the mouse shuttle is that the mouse pointer slides over the edge of the PC display screen, so the mouse pointer was located at the edge of the PC display screen before being hidden. Therefore, after the mouse shuttle ends and the PC cancels the hiding of the mouse pointer, the mouse pointer is displayed at the edge of the PC display screen.
  • After the mouse shuttle ends, the mouse pointer on the tablet computer side is also no longer displayed. This gives the user the visual effect that the mouse pointer moves from the tablet computer to the PC.
  • After determining that the mouse shuttle has ended, the PC also needs to uninstall the HOOK (or close the HOOK) to cancel the interception of input events, such as mouse events.
  • After the user moves the mouse to transfer the content of the tablet computer to the PC by dragging and triggers the end of the mouse shuttle, the user continues to move the mouse in the same direction, and the PC's keyboard and mouse module can receive the movement event, such as a mouse movement event. Since the HOOK has been uninstalled at this point, the PC's keyboard and mouse module sends the received movement event, such as the mouse movement event, to the PC's Windows system, so that the Windows system of the PC can respond to it.
  • In addition, since the tablet computer is in the dragging state, the mouse shuttling back to the PC is intended to realize the drag continuation, that is, the drag needs to continue on the PC.
  • On the Windows platform, a drag can be initiated normally only after the mouse is pressed on a window.
  • However, the PC can only receive a movement event, such as a mouse move event; that is, it does not receive a press event, such as a mouse press event. Therefore, the PC (e.g., the PC's drag management module) can generate a press event, such as a mouse press event, and deliver it to the invisible window.
  • That is, the PC can generate a mouse press event and deliver it to the invisible window, so that the drag event initiated in S1009 on the PC side can be attached to the mouse pointer displayed on the PC display screen, realizing continuation of the drag event on the PC side.
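The embodiment does not state how the press event is generated. One plausible way on Windows is global injection with SendInput, sketched below; posting WM_LBUTTONDOWN directly to the invisible window would be an alternative. Treat this as an assumption, not the patent's actual mechanism.

```cpp
// Sketch: synthesizing the mouse press that Windows needs before a drag can
// be initiated. MOUSEEVENTF_LEFTDOWN injects a left-button press at the
// current pointer position, which at this moment is the redisplayed pointer
// at the edge of the PC display screen.
#include <windows.h>

void synthesizeLeftButtonDown() {
    INPUT input = {};
    input.type = INPUT_MOUSE;
    input.mi.dwFlags = MOUSEEVENTF_LEFTDOWN;
    SendInput(1, &input, sizeof(INPUT));
}
```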
  • In addition, the PC can also request from the tablet computer the dragging state of the tablet computer (that is, whether a drag is in progress).
  • If the dragging state returned by the tablet computer indicates that a drag is in progress, the PC can request the drag event content and the bitmap of the shadow from the tablet computer.
  • For example, the PC may send a request message to the tablet computer, which may be used to request the drag data, i.e., the drag event content and the bitmap of the shadow.
  • the mouse and keyboard module of the PC may send an indication of the end of the mouse shuttle to the drag management module of the PC.
  • the drag management module of the PC can request the drag state from the tablet computer through the transmission management module of the PC.
  • the drag management module of the PC can request the tablet computer for the bitmap of the drag event content and the shadow through the transmission management module of the PC.
  • the transmission management module of the tablet computer can receive the request and forward the request to the drag management module of the tablet computer.
  • the drag management module of the tablet computer can feed back the drag event content and the bitmap of the shadow cached in S1003 to the transmission management module of the PC through the transmission management module of the tablet computer.
  • the transmission management module of the PC can transmit the drag event content to the drag management module of the PC.
  • the drag management module of the PC parses the received drag event content, and obtains the text or file path from the tablet computer. According to the obtained text or file path, the drag management module of the PC can construct the data object of the drag event, such as IDataObject.
  • the transmission management module of the PC can restore the shadow on the PC side according to the bitmap. For example, the shadow can be restored by using the IDragSourceHelper interface provided by the PC.
  • the PC can initiate a drag event on the PC side.
  • the PC will display an invisible window. Therefore, the PC can initiate the drag event with the invisible window displayed. The invisible window can be closed after the drag event is fired.
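A rough sketch of this continuation step on the Windows side follows. The patent names the IDataObject and IDragSourceHelper interfaces; buildDataObject and buildDropSource are hypothetical helpers standing in for the drag management module's construction of the drag event, and OLE is assumed to be initialized already (e.g., via OleInitialize).

```cpp
// Sketch: re-initiating the drag on the PC from the drag data received from
// the tablet. The shadow is restored from the bitmap with IDragSourceHelper,
// then DoDragDrop starts the OLE drag loop so the drag event attaches to the
// mouse pointer displayed on the PC display screen.
#include <windows.h>
#include <ole2.h>
#include <shlobj.h>
#include <shobjidl.h>

IDataObject* buildDataObject();  // hypothetical: wraps the text or file path
IDropSource* buildDropSource();  // hypothetical: standard IDropSource implementation

void resumeDragOnPc(HBITMAP shadowBitmap, SIZE shadowSize) {
    IDataObject* data = buildDataObject();
    IDropSource* src  = buildDropSource();

    // Restore the drag shadow from the bitmap received from the tablet.
    IDragSourceHelper* helper = nullptr;
    if (SUCCEEDED(CoCreateInstance(CLSID_DragDropHelper, nullptr,
                                   CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&helper)))) {
        SHDRAGIMAGE img = {};
        img.sizeDragImage = shadowSize;
        img.hbmpDragImage = shadowBitmap;
        helper->InitializeFromBitmap(&img, data);
    }

    // With the synthesized mouse press in effect, run the drag loop.
    DWORD effect = 0;
    DoDragDrop(data, src, DROPEFFECT_COPY | DROPEFFECT_MOVE, &effect);

    if (helper) helper->Release();
    src->Release();
    data->Release();
}
```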
  • the PC executes the drag event according to the mouse movement event and the mouse press event, and displays an animation of the shadow of the content moving with the mouse pointer on the display screen of the PC.
  • For example, the PC can execute the drag event in response to the movement event (such as a mouse move event) and the press event (such as a mouse press event), and display, on the PC's display screen, an animation of the content's shadow moving with the mouse pointer.
  • As shown in FIG. 12, the PC displays, on the display screen 1202 of the PC, an animation of the shadow 1203 of the edited picture moving with the mouse pointer 1204.
  • the drag track where the shadow 1203 moves with the mouse pointer 1204 is shown as track 1205 .
  • the above example is described by taking the user first dragging and dropping content from the PC to the tablet computer, and then dragging and dropping the content from the tablet computer to the PC as an example.
  • the user may directly drag and drop certain content of the tablet computer to the PC instead of dragging and dropping the content from the PC to the tablet computer.
  • the specific implementation in this embodiment is similar to the specific implementation of the embodiment shown in FIG. 5 and FIG. 10 , the difference is that S502 is not performed.
  • S503 is replaced by receiving a movement event (such as a mouse movement event), and displaying an animation of the movement of the mouse pointer on the display screen of the PC according to the movement event.
  • S506 is not performed, although the operation of displaying the invisible window on the PC still needs to be performed; in this case, however, the invisible window does not receive a drag event.
  • S507 and S509 are not executed.
  • S510 is replaced with: the tablet computer displays an animation of the mouse pointer moving on the tablet computer's display screen according to a movement event, such as a mouse movement event. That is to say, after the mouse shuttle starts, the tablet computer can receive the operation that the user inputs using the PC's input device, such as the operation of moving, with the mouse, the mouse pointer displayed on the tablet computer. In response to this operation, the tablet computer can display an animation of the mouse pointer moving on the tablet computer's display screen.
  • That the tablet computer receives the operation input by the user using the PC's input device, such as the mouse-pointer movement operation input with the mouse, specifically means that the PC intercepts the corresponding movement event, such as the mouse movement event, and sends the operation parameters included in the movement event to the tablet computer.
  • the tablet computer can simulate movement events, such as mouse movement events, according to the operating parameters, so that an animation of the mouse pointer movement can be displayed on the display screen of the tablet computer. Other operations are similar, and are not described in detail in this embodiment of the present application.
  • The above description takes as an example the case where the tablet computer caches the drag data when the drag starts and, after the mouse shuttle ends, the PC requests the drag data from the tablet computer.
  • In some other embodiments, the tablet computer may not cache the drag data, but instead obtain the drag data and actively send it to the PC after determining that the mouse shuttle has ended, without requiring a request from the PC.
  • the above embodiment is described by taking an example that the input device is a mouse.
  • the input device may also be a touch pad.
  • For example, the user can use a button (left or right) of the touchpad to input the press operation, and slide a finger on the touchpad to input the movement operation.
  • the specific implementation that the user uses the touchpad to drag and drop objects is similar to the specific implementation that uses the mouse to drag and drop, and will not be repeated here.
  • The method provided by this embodiment uses keyboard and mouse sharing technology so that, by dragging, the user can use an input device such as a mouse to make content such as text or files follow the mouse pointer and shuttle among the multiple terminals participating in collaborative use, and it allows the user to use these terminals to process the transferred content, so that the hardware capabilities of the multiple terminals can all participate in the collaborative office work.
  • In addition, since there is no need to start screen projection, the display space of a terminal's display screen is not occupied.
  • This improves the efficiency of collaborative use of multiple terminals and improves the user experience.
  • The above embodiments are described by taking the dragged object being text or a file as an example.
  • the object to be dragged may also be an icon of an application;
  • the dragged object can also be a window, and the window can include the interface of the application.
  • the drag data may include an icon of the application;
  • the drag data may include an interface of the application (eg, a screenshot of the interface of the application).
  • The specific implementation is similar to that of dragging content such as text or files in the above embodiments, and details are not repeated here.
  • The difference is that, when the dragged object is an application icon or an application interface, after it is determined that the object is dragged out of the edge of the PC display screen, there is no need to create an invisible window, and the tablet computer does not need to open a transparent activity.
  • After receiving the drag data, the tablet computer does not need to construct a drag event, but directly performs the drag continuation according to the application icon or the application interface included in the drag data.
  • In addition, the tablet computer does not need to generate a mouse press event; according to the mouse movement event, the application's icon or window can be moved on the tablet computer with the movement of the mouse pointer.
  • The PC can send the interface of the dragged application to the tablet computer, so that the tablet computer can display the application interface on the tablet computer's display screen. For example, the user can use the PC's input device to drag an application icon or an application interface displayed on the PC's display screen to the tablet computer. The user may continue the drag on the tablet computer using the PC's input device, or the user may release the drag; after the user releases the drag, the tablet computer can display the application's interface on the tablet computer.
  • FIG. 13 is a schematic diagram of the composition of a device for dragging objects across devices provided by an embodiment of the present application.
  • the apparatus can be applied to a first terminal (such as the above PC), the first terminal is connected to the second terminal, and the apparatus can include: a display unit 1301 , an input unit 1302 and a sending unit 1303 .
  • the display unit 1301 is configured to display a first cursor on an object displayed by the first terminal.
  • the input unit 1302 is configured to receive a drag operation input by the user using the input device of the first terminal, where the drag operation is used to initiate dragging of the object.
  • the display unit 1301 is further configured to display an animation of the object moving with the first cursor on the display screen of the first terminal in response to the dragging operation.
  • the sending unit 1303 is configured to send drag data to the second terminal when it is determined that the object is dragged out of the edge of the display screen of the first terminal.
  • the above drag data can be used for the second terminal to initiate a drag event for the object, so that the second terminal displays an animation of the object moving with the second cursor on the display screen of the second terminal.
  • the above dragging operation may include a pressing operation and a moving operation; the sending unit 1303 is further configured to send the data of the moving operation input by the user using the input device of the first terminal to the second terminal.
  • the apparatus may further include: an intercepting unit 1304 .
  • the intercepting unit 1304 is configured to intercept the movement event when the user performs the movement operation using the input device of the first terminal; the sending unit 1303 is specifically configured to send the operation parameters included in the movement event to the second terminal.
  • the sending unit 1303 is further configured to send the shuttle status information to the second terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • the display unit 1301 is specifically configured to display an animation of the shadow of the object moving with the first cursor on the display screen of the first terminal.
  • the display unit 1301 is also used to hide the shadow of the first cursor and the object.
  • the above-mentioned objects can be text, files or folders;
  • The drag data includes the drag event content and a bitmap of the shadow, where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
  • The display unit 1301 is further configured to display an invisible window, where the transparency of the invisible window is greater than a threshold and the invisible window is used to receive drag events. The apparatus may further include an obtaining unit 1305, configured to obtain the drag event content from the drag event received by the invisible window, and to obtain the bitmap of the shadow.
  • The above object may be an application icon; or the object may be a window that includes an application interface. When the object is the application icon, the drag data includes the application icon; when the object is the window, the drag data includes the application interface.
  • FIG. 14 is a schematic diagram of the composition of another device for dragging objects across devices provided by an embodiment of the present application.
  • the apparatus can be applied to a second terminal (such as the above-mentioned tablet computer), the second terminal is connected to the first terminal, and the apparatus can include: a receiving unit 1401 and a display unit 1402 .
  • the display unit 1402 is configured to display the object dragged from the first terminal on the display screen of the second terminal; and display the second cursor on the object.
  • the receiving unit 1401 is configured to receive a moving operation input by a user using an input device of a first terminal.
  • the display unit 1402 is further configured to display an animation of the object moving with the second cursor on the display screen of the second terminal according to the moving operation.
  • the receiving unit 1401 is further configured to receive the shuttle status information from the first terminal, where the shuttle status information is used to indicate the start of the shuttle.
  • the receiving unit 1401 is further configured to receive drag data from the first terminal.
  • The drag data and the movement operation are sent to the second terminal, in the case where the object moves with the first cursor on the display screen of the first terminal, after the first terminal determines that the object has been dragged out of the edge of the first terminal's display screen, and are used to initiate a drag event for the object.
  • the above apparatus may further include: a generating unit 1403, configured to generate a pressing operation.
  • the display unit 1402 is specifically configured to display an animation of the object moving with the second cursor on the display screen of the second terminal according to the moving operation, pressing operation and dragging data.
  • The generating unit 1403 is specifically configured to simulate a press event according to the operation parameters of the press operation; the receiving unit 1401 is specifically configured to receive operation parameters from the first terminal; the generating unit 1403 is further configured to simulate a movement event according to the operation parameters. The operation parameters are the operation parameters contained in the movement event received by the first terminal after the user performs the movement operation using the input device of the first terminal. The display unit 1402 is specifically configured to display, on the display screen of the second terminal according to the drag data in response to the press event and the movement event, an animation of the object moving with the second cursor.
  • the apparatus may further include: a creating unit 1404 .
  • The creating unit 1404 is configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit 1401 is further configured to receive a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit 1404 is configured to create the virtual input device in response to the notification message; the virtual input device is used by the second terminal to simulate input events according to operation parameters.
  • the display unit 1402 is specifically configured to display the shadow of the object dragged from the first terminal on the display screen of the second terminal.
  • the display unit 1402 is specifically configured to display an animation of the shadow of the object moving with the second cursor on the display screen of the second terminal according to the moving operation.
  • the above-mentioned objects can be text, files or folders;
  • The drag data includes the drag event content and a bitmap of the shadow, where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
  • the creating unit 1404 is further configured to create an invisible activity, where the invisible activity has a view control whose transparency is greater than a threshold, and the view control is used to initiate a drag event.
  • the object is an icon of the application; or, the object is a window, and the window includes the interface of the application; when the object is the icon of the application, the dragging data includes: the icon of the application; when the object is the window, the dragging data includes: the application interface.
  • An embodiment of the present application further provides an apparatus for dragging objects across devices, and the apparatus may be applied to the first terminal or the second terminal in the foregoing embodiments.
  • The apparatus may include: a processor, and a memory for storing instructions executable by the processor; where the processor is configured, when executing the instructions, to implement the functions or steps performed by the PC or the tablet computer in the above method embodiments.
  • This embodiment of the present application further provides a terminal (the terminal may be the first terminal or the second terminal in the foregoing embodiment), and the terminal may include: a display screen, a memory, and one or more processors.
  • the display screen, memory and processor are coupled.
  • the memory is used to store computer program code comprising computer instructions.
  • When the computer instructions are executed by the processor, the terminal can execute the functions or steps executed by the PC or the tablet computer in the above method embodiments.
  • the terminal includes but is not limited to the above-mentioned display screen, memory and one or more processors.
  • the structure of the terminal may refer to the structure of the tablet computer shown in FIG. 2 .
  • An embodiment of the present application further provides a chip system. As shown in FIG. 15, the chip system includes at least one processor 1501 and at least one interface circuit 1502.
  • the processor 1501 may be the processor in the above-mentioned terminal.
  • the processor 1501 and the interface circuit 1502 may be interconnected by wires.
  • the processor 1501 may receive and execute computer instructions from the memory of the above-mentioned terminal through the interface circuit 1502 .
  • When the computer instructions are executed by the processor 1501, the terminal can be made to execute the steps executed by the PC or the tablet computer in the above embodiments.
  • the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
  • Embodiments of the present application further provide a computer-readable storage medium for storing computer instructions to be run by the above-mentioned terminal (such as the PC or the tablet computer).
  • Embodiments of the present application further provide a computer program product, including computer instructions to be run by the above-mentioned terminal (such as the PC or the tablet computer).
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • The mutual coupling, direct coupling, or communication connection shown or discussed may be realized through some interfaces; the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • The units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place, or may be distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the parts contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a cross-device object dragging method and device, relating to the field of electronic devices, which improve the efficiency of collaborative use of multiple terminals and improve the user experience. The specific solution is: a second terminal displays, on the display screen of the second terminal, an object dragged from a first terminal; the second terminal displays a second cursor on the object; the second terminal receives data of a movement operation input by a user using an input device of the first terminal; and the second terminal displays, on the display screen of the second terminal according to the movement operation, an animation of the object moving with the second cursor.

Description

Cross-device object dragging method and device
This application claims priority to the Chinese patent application No. 202010747173.5, entitled "Cross-device object dragging method and device", filed with the State Intellectual Property Office on July 29, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of electronic devices, and in particular to a cross-device object dragging method and device.
Background
With the development of electronic technology and the mobile Internet, one user may simultaneously own more terminals such as a mobile phone, a tablet computer, a personal computer (PC), and smart home devices (such as a television). In general, each terminal is used fairly independently. In scenarios where multiple terminals need to be used collaboratively, such as collaborative office work, a user will connect multiple terminals and use them together. For example, a user who owns a PC and a mobile phone can connect the PC and the phone in a wired or wireless manner and use them together, realizing collaborative office work between the PC and the phone.
As shown in FIG. 1, in a scenario of collaborative office work between a PC and a mobile phone, multi-screen collaboration uses screen mirroring to project the phone's interface (e.g., the phone's home screen 101 shown in FIG. 1) onto the PC's display screen, which is convenient for the user's collaborative office work. In the projection state, multi-screen collaboration currently also supports bidirectional content dragging between the PC and the phone using input devices (or peripherals) such as a mouse or touch screen. That is, during projection, the user is allowed to use an input device (such as a mouse or touch screen) to transfer text or files between the PC and the phone by dragging.
However, in the scenario of using multi-screen collaboration for collaborative office work between a PC and a phone, the precondition for content dragging is that the phone's interface is projected onto the PC's display screen, and the phone's screen is usually off; hardware capabilities of the phone, such as its touch screen and stylus, cannot participate in the collaborative office work. In addition, the phone's interface projected onto the PC greatly occupies the display space of the PC's display screen. This reduces the efficiency of collaborative use of multiple terminals.
Summary
Embodiments of the present application provide a cross-device object dragging method and device, which improve the efficiency of collaborative use of multiple terminals.
To achieve the above objective, the embodiments of the present application adopt the following technical solutions:
A first aspect of the present application provides a cross-device object dragging method, which may be applied to a second terminal connected to a first terminal. The method may include: the second terminal displays, on the display screen of the second terminal, an object dragged from the first terminal; the second terminal displays a second cursor on the object; the second terminal receives a movement operation input by a user using an input device of the first terminal; and the second terminal displays, on the display screen of the second terminal according to the movement operation, an animation of the object moving with the second cursor.
With the above technical solution, without starting screen projection, the user can use an input device such as a mouse to drag an object on one terminal so that it follows the cursor and shuttles among the multiple terminals participating in collaborative use. Since there is no need to start screen projection, the display space of a terminal's display screen is not occupied. In addition, the efficiency of collaborative use of multiple terminals is improved, and the user experience is improved.
As an example, the input device of the first terminal may be a mouse, a touchpad, or the like. The second cursor may be a cursor displayed on the display screen of the second terminal.
In a possible implementation, before the second terminal displays the object dragged from the first terminal on the display screen of the second terminal, the method may further include: the second terminal receives shuttle state information from the first terminal, where the shuttle state information is used to indicate the start of the shuttle.
In another possible implementation, the second terminal displaying the object dragged from the first terminal on the display screen of the second terminal may include: the second terminal receives drag data from the first terminal; the second terminal displays, on the display screen of the second terminal according to the drag data, the object dragged from the first terminal; where the drag data and the movement operation are sent to the second terminal, in the case where the object moves with the first cursor on the display screen of the first terminal, after the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, and are used to initiate a drag event for the object. By sending drag data related to the dragged object to other terminals, the user can use these terminals to process the transferred object, so that the hardware capabilities of the multiple terminals can all participate in the collaborative office work. The first cursor may be a cursor displayed on the display screen of the first terminal.
In another possible implementation, the method may further include: the second terminal generates a press operation; the second terminal displaying, on the display screen of the second terminal according to the movement operation, the animation of the object moving with the second cursor may include: the second terminal displays, on the display screen of the second terminal according to the movement operation, the press operation, and the drag data, the animation of the object moving with the second cursor. After the object is dragged out of the edge of the first terminal's display screen, the cursor shuttles. After the cursor shuttles, the second terminal realizes continuation of the drag according to the operations input by the user using the input device of the first terminal.
In another possible implementation, the second terminal generating the press operation may include: the second terminal simulates a press event according to operation parameters of the press operation; the second terminal receiving the movement operation input by the user using the input device of the first terminal may include: the second terminal receives operation parameters from the first terminal and simulates a movement event according to the operation parameters; the operation parameters are the operation parameters contained in the movement event received by the first terminal after the user performs the movement operation using the input device of the first terminal; the second terminal displaying, on the display screen of the second terminal according to the movement operation, the press operation, and the drag data, the animation of the object moving with the second cursor includes: in response to the press event and the movement event, the second terminal displays, on the display screen of the second terminal according to the drag data, the animation of the object moving with the second cursor.
In another possible implementation, the method may further include: the second terminal creates a virtual input device after the connection with the first terminal is successfully established; or, the second terminal receives a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been enabled, and in response to the notification message, the second terminal creates the virtual input device; the virtual input device is used by the second terminal to simulate input events according to operation parameters. With keyboard and mouse sharing technology, dragging of objects among multiple terminals using the input device of one terminal is realized.
In another possible implementation, the second terminal displaying the object dragged from the first terminal on the display screen of the second terminal includes: the second terminal displays, on the display screen of the second terminal, a shadow of the object dragged from the first terminal; the second terminal displaying, on the display screen of the second terminal according to the movement operation, the animation of the object moving with the second cursor may include: the second terminal displays, on the display screen of the second terminal according to the movement operation, an animation of the shadow of the object moving with the second cursor.
In another possible implementation, the object may be text, a file, or a folder; the drag data may include drag event content and a bitmap of the shadow; where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
In another possible implementation, after the second terminal receives the drag data from the first terminal, the method may further include: the second terminal creates an invisible activity, where the invisible activity has a view control whose transparency is greater than a threshold, and the view control is used to initiate a drag event.
In another possible implementation, the object is an icon of an application; or, the object is a window, and the window includes an interface of an application; when the object is the icon of the application, the drag data includes: the icon of the application; when the object is the window, the drag data includes: the interface of the application.
A second aspect of the present application provides a cross-device object dragging method, which may be applied to a first terminal connected to a second terminal. The method may include: the first terminal displays a first cursor on an object displayed by the first terminal; the first terminal receives a drag operation input by a user using an input device of the first terminal, where the drag operation is used to initiate dragging of the object; in response to the drag operation, the first terminal displays, on the display screen of the first terminal, an animation of the object moving with the first cursor; and when determining that the object has been dragged out of the edge of the display screen of the first terminal, the first terminal sends drag data to the second terminal.
As an example, the input device of the first terminal may be a mouse, a touchpad, or the like. The first cursor may be a cursor displayed on the display screen of the first terminal.
With the above technical solution, without starting screen projection, the user can use an input device such as a mouse to drag an object on one terminal so that it follows the cursor and shuttles among the multiple terminals participating in collaborative use. Since there is no need to start screen projection, the display space of a terminal's display screen is not occupied. In addition, drag data related to the dragged object is sent to another terminal so that the other terminal can continue the drag, and the user is allowed to use these terminals to process the transferred object, so that the hardware capabilities of the multiple terminals can all participate in the collaborative office work. The efficiency of collaborative use of multiple terminals is improved, and the user experience is improved.
In a possible implementation, the drag data may be used by the second terminal to initiate a drag event for the object, so that the second terminal displays, on the display screen of the second terminal, an animation of the object moving with a second cursor. The second cursor may be a cursor displayed on the display screen of the second terminal. By sending the drag data to the other terminal, the other terminal can continue to display the animation of the object moving with the cursor according to the operations input by the user, realizing continuation of the drag.
In another possible implementation, the drag operation may include a press operation and a movement operation; when the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, the method may further include: the first terminal sends data of the movement operation input by the user using the input device of the first terminal to the second terminal. After the object is dragged out of the edge of the first terminal's display screen, the cursor shuttles; after the cursor shuttles, the first terminal sends the data of the operations input by the user using the first terminal's input device to the other terminal, so that the other terminal realizes continuation of the drag.
In another possible implementation, the first terminal sending the data of the movement operation input by the user using the input device of the first terminal to the second terminal may include: during the movement operation performed by the user using the input device of the first terminal, the first terminal intercepts the movement event; the first terminal sends the operation parameters included in the movement event to the second terminal.
In another possible implementation, after the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, the method may further include: the first terminal sends shuttle state information to the second terminal, where the shuttle state information is used to indicate the start of the shuttle. After the object is dragged out of the edge of the first terminal's display screen, it is determined that the cursor shuttles; by sending the shuttle state information indicating the start of the shuttle to the other terminal, the other terminal can prepare for continuing the drag, such as displaying the cursor.
In another possible implementation, the first terminal displaying, on the display screen of the first terminal, the animation of the object moving with the first cursor may include: the first terminal displays, on the display screen of the first terminal, an animation of a shadow of the object moving with the first cursor.
In another possible implementation, after the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, the method may further include: the first terminal hides the first cursor and the shadow of the object. By hiding the cursor displayed by the first terminal and the shadow of the dragged object after determining that the cursor shuttles, the user is given the visual effect that the object is dragged from the first terminal to another terminal.
In another possible implementation, the object may be text, a file, or a folder; the drag data may include drag event content and a bitmap of the shadow; where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
In another possible implementation, after the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, the method may further include: the first terminal displays an invisible window, where the transparency of the invisible window is greater than a threshold and the invisible window is used to receive a drag event; before sending the drag data to the second terminal, the method may further include: the first terminal obtains the drag event content from the drag event received by the invisible window; the first terminal obtains the bitmap of the shadow.
In another possible implementation, the object is an icon of an application; or, the object is a window, and the window includes an interface of an application; when the object is the icon of the application, the drag data includes: the icon of the application; when the object is the window, the drag data includes: the interface of the application.
A third aspect of the present application provides a cross-device object dragging apparatus, applied to a first terminal connected to a second terminal. The apparatus may include: a display unit, configured to display a first cursor on an object displayed by the first terminal; an input unit, configured to receive a drag operation input by a user using an input device of the first terminal, where the drag operation is used to initiate dragging of the object; the display unit, further configured to display, on the display screen of the first terminal in response to the drag operation, an animation of the object moving with the first cursor; and a sending unit, configured to send drag data to the second terminal when it is determined that the object has been dragged out of the edge of the display screen of the first terminal.
In a possible implementation, the drag data is used by the second terminal to initiate a drag event for the object, so that the second terminal displays, on the display screen of the second terminal, an animation of the object moving with a second cursor.
In another possible implementation, the drag operation may include a press operation and a movement operation; the sending unit is further configured to send data of the movement operation input by the user using the input device of the first terminal to the second terminal.
In another possible implementation, the apparatus may further include: an intercepting unit, configured to intercept the movement event during the movement operation performed by the user using the input device of the first terminal; the sending unit is specifically configured to send the operation parameters included in the movement event to the second terminal.
In another possible implementation, the sending unit is further configured to send shuttle state information to the second terminal, where the shuttle state information is used to indicate the start of the shuttle.
In another possible implementation, the display unit is specifically configured to display, on the display screen of the first terminal, an animation of a shadow of the object moving with the first cursor.
In another possible implementation, the display unit is further configured to hide the first cursor and the shadow of the object.
In another possible implementation, the object is text, a file, or a folder; the drag data includes drag event content and a bitmap of the shadow; where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
In another possible implementation, the display unit is further configured to display an invisible window, where the transparency of the invisible window is greater than a threshold and the invisible window is used to receive a drag event; the apparatus may further include: an obtaining unit, configured to obtain the drag event content from the drag event received by the invisible window, and to obtain the bitmap of the shadow.
In another possible implementation, the object is an icon of an application; or, the object is a window, and the window includes an interface of an application; when the object is the icon of the application, the drag data includes: the icon of the application; when the object is the window, the drag data includes: the interface of the application.
A fourth aspect of the present application provides a cross-device object dragging apparatus, applied to a second terminal connected to a first terminal. The apparatus may include: a display unit, configured to display, on the display screen of the second terminal, an object dragged from the first terminal, and to display a second cursor on the object; a receiving unit, configured to receive a movement operation input by a user using an input device of the first terminal; the display unit, further configured to display, on the display screen of the second terminal according to the movement operation, an animation of the object moving with the second cursor.
In a possible implementation, the receiving unit is further configured to receive shuttle state information from the first terminal, where the shuttle state information is used to indicate the start of the shuttle.
In another possible implementation, the receiving unit is further configured to receive drag data from the first terminal, and the display unit is specifically configured to display, on the display screen of the second terminal according to the drag data, the object dragged from the first terminal. The drag data and the movement operation are sent to the second terminal, in the case where the object moves with the first cursor on the display screen of the first terminal, after the first terminal determines that the object has been dragged out of the edge of the display screen of the first terminal, and are used to initiate a drag event for the object.
In another possible implementation, the apparatus may further include: a generating unit, configured to generate a press operation; the display unit is specifically configured to display, on the display screen of the second terminal according to the movement operation, the press operation, and the drag data, the animation of the object moving with the second cursor.
In a possible implementation, the generating unit is specifically configured to simulate a press event according to operation parameters of the press operation; the receiving unit is specifically configured to receive operation parameters from the first terminal; the generating unit is further configured to simulate a movement event according to the operation parameters; the operation parameters are the operation parameters contained in the movement event received by the first terminal after the user performs the movement operation using the input device of the first terminal; the display unit is specifically configured to display, on the display screen of the second terminal according to the drag data in response to the press event and the movement event, the animation of the object moving with the second cursor.
In another possible implementation, the apparatus may further include: a creating unit, configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit is further configured to receive a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit is configured to create the virtual input device in response to the notification message; the virtual input device is used by the second terminal to simulate input events according to operation parameters.
In another possible implementation, the display unit is specifically configured to display, on the display screen of the second terminal, a shadow of the object dragged from the first terminal, and to display, on the display screen of the second terminal according to the movement operation, an animation of the shadow of the object moving with the second cursor.
In another possible implementation, the object may be text, a file, or a folder; the drag data includes drag event content and a bitmap of the shadow; where, when the object is text, the drag event content includes the text, and when the object is a file or folder, the drag event content is the file path.
In another possible implementation, the creating unit is further configured to create an invisible activity, where the invisible activity has a view control whose transparency is greater than a threshold, and the view control is used to initiate a drag event.
In another possible implementation, the object is an icon of an application; or, the object is a window, and the window includes an interface of an application; when the object is the icon of the application, the drag data includes: the icon of the application; when the object is the window, the drag data includes: the interface of the application.
A fifth aspect of the present application provides a cross-device object dragging apparatus, including: a processor; and a memory for storing instructions executable by the processor; where the processor is configured, when executing the instructions, to implement the method according to the first aspect or any possible implementation of the first aspect, or to implement the method according to the second aspect or any possible implementation of the second aspect.
A sixth aspect of the present application provides a computer-readable storage medium on which computer program instructions are stored, where the computer program instructions, when executed by a processor, implement the method according to the first aspect or any possible implementation of the first aspect, or implement the method according to the second aspect or any possible implementation of the second aspect.
A seventh aspect of the present application provides a terminal, including a display screen, one or more processors, and a memory, where the display screen, the processor, and the memory are coupled; the memory is used to store computer program code, and the computer program code includes computer instructions which, when executed by the terminal, cause the terminal to perform the method according to the first aspect or any possible implementation of the first aspect, or cause the terminal to perform the method according to the second aspect or any possible implementation of the second aspect.
An eighth aspect of the present application provides a computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code, where, when the computer-readable code runs in a terminal, a processor in the terminal performs the method according to the first aspect or any possible implementation of the first aspect, or performs the method according to the second aspect or any possible implementation of the second aspect.
A ninth aspect of the present application provides a cross-device object dragging system, which may include a first terminal and a second terminal connected to each other.
The first terminal displays a first cursor on an object displayed by the first terminal and receives a drag operation input by a user using an input device of the first terminal, where the drag operation is used to initiate dragging of the object; in response to the drag operation, the first terminal displays, on the display screen of the first terminal, an animation of the object moving with the first cursor, and, when determining that the object has been dragged out of the edge of the display screen of the first terminal, sends drag data and the movement operation input by the user using the input device of the first terminal to the second terminal. The second terminal displays, on the display screen of the second terminal according to the drag data, the object dragged from the first terminal, and displays a second cursor on the object; the second terminal receives the movement operation input by the user using the input device of the first terminal, and displays, on the display screen of the second terminal according to the movement operation, an animation of the object moving with the second cursor.
In a possible implementation, after determining that the object has been dragged out of the edge of the display screen of the first terminal, the first terminal sends shuttle state information to the second terminal, where the shuttle state information is used to indicate the start of the shuttle.
In another possible implementation, after determining that the object has been dragged out of the edge of the display screen of the first terminal, the first terminal hides the first cursor and the object.
It should be noted that, in this embodiment, an object being dragged out of the edge of a terminal's (e.g., the first terminal's) display screen may mean that part of the object's area is dragged out of (or overflows) the terminal's display screen, or that the entire area of the object is dragged out of (or overflows) the terminal's display screen, or that the cursor slides out of the edge of the terminal's display screen while the object moves with the cursor on the terminal's display screen; this embodiment does not specifically limit this.
It can be understood that, for the beneficial effects achievable by the cross-device object dragging apparatus according to the third aspect and any possible implementation thereof, the cross-device object dragging apparatus according to the fourth aspect and any possible implementation thereof, the cross-device object dragging apparatus according to the fifth aspect, the computer-readable storage medium according to the sixth aspect, the terminal according to the seventh aspect, the computer program product according to the eighth aspect, and the cross-device object dragging system according to the ninth aspect, reference may be made to the beneficial effects of the first aspect or the second aspect and any possible implementation thereof, and details are not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a collaborative office scenario provided by the prior art;
FIG. 2 is a simplified schematic diagram of a system architecture provided by an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a tablet computer provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application;
FIG. 5 is a schematic flowchart of a cross-device object dragging method provided by an embodiment of the present application;
FIG. 6A is a schematic diagram of a coordinate system on a display screen provided by an embodiment of the present application;
FIG. 6B is a schematic diagram of a cross-device object dragging interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of the data structure of a drag event on the windows side provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of the data structure of a drag event on the Android side provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of another cross-device object dragging interface provided by an embodiment of the present application;
FIG. 10 is a schematic flowchart of another cross-device object dragging method provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of yet another cross-device object dragging interface provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of yet another cross-device object dragging interface provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of the composition of a cross-device object dragging apparatus provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of the composition of another cross-device object dragging apparatus provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of the composition of a chip system provided by an embodiment of the present application.
Detailed Description
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise stated, "multiple" means two or more.
In the prior art, when multi-screen collaboration is used to realize collaborative office work among multiple terminals, such as a PC and a mobile phone, the precondition for content dragging is that the phone's interface is projected onto the PC's display screen, and both the dragging and the processing of the content are completed using the PC's hardware. The phone's screen is usually off, and its hardware capabilities (such as the display screen and stylus) cannot participate in the collaborative office work. This makes it difficult for the user to comprehensively leverage the characteristics of the different terminals participating in the collaboration, play to their strengths, and maximize work efficiency. In addition, the phone's interface projected onto the PC occupies the display space of the PC's display screen. This affects the efficiency of collaborative use of multiple terminals.
Embodiments of the present application provide a cross-device object dragging method and device, which can be applied to scenarios where multiple terminals are used collaboratively. With the method provided by this embodiment, without starting screen projection and with the help of keyboard and mouse sharing technology, the user can use an input device such as a touchpad or mouse to transfer content (or objects) such as text or files among the multiple terminals participating in collaborative use by dragging, and the user is allowed to use these terminals to process the transferred content. That is, the hardware capabilities of the multiple terminals can all participate in the collaborative office work. In addition, since there is no need to start screen projection, the display space of a terminal's display screen is not occupied. The efficiency of collaborative use of multiple terminals is improved, and the user experience is improved.
The implementation of the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Please refer to FIG. 2, which is a simplified schematic diagram of a system architecture, provided by an embodiment of the present application, to which the above method can be applied. The system architecture may be the cross-device object dragging system in this embodiment. As shown in FIG. 2, the system architecture may include at least: a first terminal 201 and a second terminal 202.
The first terminal 201 is connected to an input device 201-1 (as shown in FIG. 2), or includes the input device 201-1 (not shown in FIG. 2). As an example, the input device 201-1 may be a mouse, a touchpad, a touch screen, or the like. FIG. 2 shows an example in which the input device 201-1 is a mouse.
In this embodiment, the first terminal 201 and the second terminal 202 may establish a connection in a wired or wireless manner. Based on the established connection, the first terminal 201 and the second terminal 202 can be used together. When the first terminal 201 and the second terminal 202 establish a connection wirelessly, the wireless communication protocol adopted may be the wireless fidelity (Wi-Fi) protocol, the Bluetooth protocol, the ZigBee protocol, the Near Field Communication (NFC) protocol, etc., or may be various cellular network protocols, which is not specifically limited here.
After the first terminal 201 is connected to the second terminal 202, using keyboard and mouse sharing technology, the user can use one set of input devices, such as the above input device 201-1, to control both the first terminal 201 and the second terminal 202. That is, the user can not only use the input device 201-1 of the first terminal 201 to control the first terminal 201; the first terminal 201 can also share its input device 201-1 with the second terminal 202, so that the user can control the second terminal 202.
For example, in this embodiment of the application, take the above input device 201-1 being a mouse as an example. After the first terminal 201 is connected to the second terminal 202, without starting screen projection and using keyboard and mouse sharing technology, the user can use the mouse to drag content such as text or files of the first terminal 201 to the second terminal 202. The user can also use the mouse to drag content such as text or files of the second terminal 202 to the first terminal 201.
It should be noted that, without starting screen projection, this embodiment of the application can realize not only content dragging between two terminals, but also content dragging among three or more terminals. Exemplarily, when the second terminal 202 has also established a connection with another device, for example called a third terminal, in this embodiment of the application, without starting screen projection and using keyboard and mouse sharing technology, after dragging content such as text or files from one terminal to another terminal, the user can continue to drag the content to the third terminal by dragging. For example, continuing with the example in which the above input device 201-1 is a mouse, without starting screen projection, after the user uses the mouse to drag content such as text or files of the first terminal 201 to the second terminal 202, the user can continue to drag the content to the third terminal. After the user releases the mouse, the content dragging is completed.
It should be noted that the terminals in the embodiments of the present application, such as the above first terminal 201, the above second terminal 202, and the above third terminal, may be a mobile phone, a tablet computer, a handheld computer, a PC, a cellular phone, a personal digital assistant (PDA), a wearable device (such as a smart watch), a smart home device (such as a television), a vehicle-mounted computer, a game console, an augmented reality (AR)/virtual reality (VR) device, etc.; this embodiment does not specially limit the specific form of the terminal. In FIG. 2, the first terminal 201 is shown as a PC and the second terminal 202 as a tablet computer, as an example.
In this embodiment, take the terminal being a tablet computer as an example. Please refer to FIG. 3, which is a schematic structural diagram of a tablet computer provided by an embodiment of the present application. The methods in the following embodiments can be implemented in a tablet computer having the above hardware structure.
As shown in FIG. 3, the tablet computer may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, etc. Optionally, the tablet computer may further include a mobile communication module 150, a subscriber identification module (SIM) card interface 195, etc.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the tablet computer. In other embodiments, the tablet computer may include more or fewer components than shown, or combine certain components, or split certain components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the nerve center and command center of the tablet computer. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
The charging management module 140 is used to receive charging input from a charger. While charging the battery 142, the charging management module 140 can also supply power to the tablet computer through the power management module 141. The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 can also receive input from the battery 142 to supply power to the tablet computer.
The wireless communication function of the tablet computer can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, etc.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the tablet computer can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization; for example, the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna can be used in combination with a tuning switch.
When the tablet computer includes the mobile communication module 150, the mobile communication module 150 can provide wireless communication solutions, including 2G/3G/4G/5G, applied on the tablet computer. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied on the tablet computer, including wireless local area networks (WLAN) (such as Wi-Fi networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR) technology, etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation through the antenna 2.
In some embodiments, the antenna 1 of the tablet computer is coupled with the mobile communication module 150 and the antenna 2 is coupled with the wireless communication module 160, so that the tablet computer can communicate with networks and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The tablet computer realizes the display function through the GPU, the display screen 194, the application processor, etc. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor. The processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, etc. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flex light-emitting diode (FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (QLED), etc. In some embodiments, the tablet computer may include 1 or N display screens 194, where N is a positive integer greater than 1.
The tablet computer can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, etc. In some embodiments, the tablet computer may include 1 or N cameras 193, where N is a positive integer greater than 1.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the tablet computer. The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example saving files such as music and videos in the external memory card.
The internal memory 121 can be used to store computer-executable program code, where the executable program code includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes the various functional applications and data processing of the tablet computer. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required by at least one function (such as a sound playback function and an image playback function), etc. The data storage area can store data created during use of the tablet computer (such as audio data and a phone book), etc. In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), etc.
The tablet computer can realize audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, etc.
The pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display screen 194. There are many kinds of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. When a touch operation acts on the display screen 194, the tablet computer detects the intensity of the touch operation according to the pressure sensor 180A. The tablet computer can also calculate the touched position according to the detection signal of the pressure sensor 180A.
The gyroscope sensor 180B can be used to determine the motion posture of the tablet computer. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the tablet computer can use the magnetic sensor 180D to detect the opening and closing of a flip cover. The acceleration sensor 180E can detect the magnitude of the tablet computer's acceleration in various directions (generally three axes). The distance sensor 180F is used to measure distance. The tablet computer can use the proximity light sensor 180G to detect that the user is holding the tablet computer close to the ear for a call, so as to automatically turn off the screen to save power; the proximity light sensor 180G can also be used in holster mode and pocket mode for automatic unlocking and screen locking. The ambient light sensor 180L is used to sense the brightness of ambient light. The fingerprint sensor 180H is used to collect fingerprints; the tablet computer can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, fingerprint photography, answering incoming calls with a fingerprint, etc. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be provided on the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch-controlled screen". The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation can be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be provided on the surface of the tablet computer, at a position different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. The buttons 190 include a power button, volume buttons, etc.; the buttons 190 may be mechanical buttons or touch-type buttons. The motor 191 can generate vibration prompts; it can be used for incoming-call vibration prompts as well as touch vibration feedback. The indicator 192 may be an indicator light, which can be used to indicate the charging state and battery level changes, and can also be used to indicate messages, missed calls, notifications, etc.
When the tablet computer includes the SIM card interface 195, the SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 195 to achieve contact with and separation from the tablet computer. The tablet computer can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The tablet computer interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the tablet computer uses an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the tablet computer and cannot be separated from it.
With reference to FIG. 2, this embodiment of the application takes the software system of the first terminal 201 being a windows system and the software system of the second terminal 202 being an Android system as an example, to exemplarily describe the software structures of the first terminal 201 and the second terminal 202. Please refer to FIG. 4, which is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
As shown in FIG. 4, the software architecture of the first terminal 201 may include an application layer and the windows system (windows shell). In some embodiments, the application layer may include the applications installed on the first terminal 201, and the applications at the application layer can interact directly with the windows system. Exemplarily, the application layer may further include a keyboard and mouse module, a transmission management module, a drag management module, and a window management module.
The software system of the second terminal 202 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Take the software system of the second terminal 202 being a layered architecture as an example. The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces. In some embodiments, as shown in FIG. 4, the second terminal 202 may include an application layer and a framework layer (FWK). The application layer may include a series of application packages; for example, the application packages may include applications such as settings, calculator, camera, short messages, and music player. The applications included in the application layer may be system applications of the second terminal 202 or third-party applications, which is not specifically limited here. The application layer may further include a transmission management module and a drag management module. The framework layer is mainly responsible for providing an application programming interface (API) and a programming framework for the applications at the application layer. In this embodiment, the framework layer may include a window manager (or window management service). Of course, the second terminal 202 may also include other layers, such as a kernel layer (not shown in FIG. 4). The kernel layer is the layer between hardware and software, and may include at least a display driver, a camera driver, an audio driver, a sensor driver, etc.
In this embodiment of the application, after the first terminal 201 and the second terminal 202 are connected, without starting screen projection, based on the above software architecture and with the help of keyboard and mouse sharing technology, the user can use the input device 201-1 of the first terminal 201 to transfer content such as files or text of the first terminal 201 to the second terminal 202 by dragging, and can also transfer content such as files or text of the second terminal 202 to the first terminal 201 by dragging. That is, without starting screen projection, the user can use the input device 201-1 of the first terminal 201 to realize bidirectional dragging of content such as files or text in applications between the first terminal 201 and the second terminal 202. Keyboard and mouse sharing technology may refer to a technology of using the input device (such as a mouse or touchpad) of one terminal to control other terminals.
It should be noted that, in this embodiment, the above drag management module may also be called a drag service module. In addition, as can be seen in the above embodiment, both the first terminal 201 and the second terminal 202 include a transmission management module, and the communication between the first terminal 201 and the second terminal 202 can be realized through the transmission management modules. In some other embodiments, the drag management module may also have the function of communicating with other terminals; that is, neither the first terminal 201 nor the second terminal 202 may include a transmission management module, and the communication between them can be realized through the drag management modules, which is not specifically limited in this embodiment. For ease of description, the following embodiments are described by taking the communication between the first terminal 201 and the second terminal 202 being realized through the transmission management modules as an example.
In the following, with reference to FIG. 2 and FIG. 4, taking the first terminal 201 being a PC, the second terminal 202 being a tablet computer, and the input device 201-1 being a mouse as an example, the cross-device object dragging method provided by the embodiments of the present application is described in detail with reference to the accompanying drawings.
FIG. 5 is a schematic flowchart of a cross-device object dragging method provided by an embodiment of the present application. In this embodiment, the method is described in detail by taking as an example a user who uses the PC's mouse to transfer content on the PC (the content being the dragged object) to the tablet computer by dragging. As shown in FIG. 5, the method may include the following S501-S510.
S501: The tablet computer establishes a connection with the PC.
In some embodiments, the tablet computer and the PC may establish a connection in a wired manner. For example, the tablet computer and the PC may establish a wired connection through a data cable.
In some other embodiments, the tablet computer and the PC may establish a connection wirelessly. There are two requirements for terminals establishing a connection wirelessly: one is that the terminals need to know each other's connection information, and the other is that each terminal needs to have transmission capability. The connection information may be a device identifier of the terminal, such as an internet protocol (IP) address, a port number, or an account logged in on the terminal. The account logged in on the terminal may be an account provided by an operator for the user, such as a Huawei account, or an application account, such as a WeChat account or a Youku account. The transmission capability of the terminal may be near-field communication capability or long-distance communication capability. That is, the wireless communication protocol used to establish the connection between terminals, such as between the tablet computer and the PC, may be a near-field communication protocol such as the Wi-Fi protocol, the Bluetooth protocol, or the NFC protocol, or may be a cellular network protocol. Take the tablet computer and the PC establishing a connection wirelessly as an example. For example, the user can touch the PC's NFC tag with the tablet computer, and the tablet computer reads the connection information stored in the NFC tag, for example, the connection information includes the PC's IP address; then the tablet computer can establish a connection with the PC according to the PC's IP address using the NFC protocol. For another example, both the tablet computer and the PC have the Bluetooth function and the Wi-Fi function enabled. The PC can broadcast a Bluetooth signal to discover surrounding terminals; for example, the PC can display a list of discovered devices, which may include the identifier of the tablet computer discovered by the PC. In addition, during device discovery, the PC can also exchange connection information, such as IP addresses, with the discovered devices. Then, after the PC receives an operation of the user selecting the identifier of the tablet computer in the displayed device list, the PC can establish a connection with the tablet computer using the Wi-Fi protocol according to the tablet computer's IP address. For yet another example, both the tablet computer and the PC are connected to a cellular network, and the tablet computer and the PC are logged in to the same Huawei account; the tablet computer and the PC can then establish a connection based on the cellular network according to the Huawei account.
After the tablet computer and the PC successfully establish a connection, the two can be used collaboratively. To improve the efficiency of collaborative use, the user can use one set of input devices, such as the PC's mouse, to control both the PC and the tablet computer.
As an exemplary implementation, when the PC's keyboard and mouse sharing mode is enabled, one set of input devices can be used to control both the PC and the tablet computer.
For example, in some embodiments, after another terminal successfully establishes a connection with the PC, the PC may display a pop-up window asking the user whether to enable the keyboard and mouse sharing mode. If an operation of the user choosing to enable the keyboard and mouse sharing mode is received, the PC enables the keyboard and mouse sharing mode.
After enabling the keyboard and mouse sharing mode, the PC can notify all terminals connected to it that the keyboard and mouse sharing mode has been enabled. For example, if the PC has established a connection with the tablet computer, the PC notifies the tablet computer that the keyboard and mouse sharing mode has been enabled (e.g., the PC can send a notification message to the tablet computer indicating that the PC's keyboard and mouse sharing mode has been enabled). After receiving the notification, the tablet computer can create a virtual input device, which serves the same purpose as a conventional input device such as a mouse or touchpad and can be used by the tablet computer to simulate the corresponding input events. For example, taking the input device being a mouse as an example, the virtual input device created by the tablet computer serves the same purpose as a conventional mouse, can be regarded as a mouse shared by the PC with the tablet computer, and can be used to simulate mouse events on the tablet computer side so that the PC's mouse can control the tablet computer. Exemplarily, take the tablet computer's operating system being the Android system as an example. The tablet computer can use linux's uinput capability to create the virtual input device, where uinput is a kernel-layer module that can simulate input devices. By writing to the /dev/uinput (or /dev/input/uinput) device, a process can create a virtual input device with specific functions. Once the virtual input device is created, it can simulate the corresponding events. Similarly, other terminals connected to the PC will also create virtual input devices according to the received notification. It should be noted that if the operating system of the terminal receiving the notification is the Android system, linux's uinput capability can be used to create the virtual input device, or the human interface device (HID) protocol can be used to create it. If the operating system of the terminal receiving the notification is another operating system such as IOS or windows, the HID protocol can be used to create the virtual input device. In addition, the above embodiment is described by taking the example that the terminal connected to the PC creates the virtual input device after receiving the notification that the PC's keyboard and mouse sharing mode has been enabled. In some other embodiments, after receiving the notification, the terminal connected to the PC may also display a pop-up window asking the user whether they want to use the PC's input device to control this device. If an operation of the user choosing to use the PC's input device to control this device is received, the virtual input device is created; otherwise, the virtual input device is not created.
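As an illustration of the uinput mechanism just described, the following minimal sketch creates a virtual mouse and replays one movement. It assumes a kernel new enough to support uinput_setup (older kernels use the uinput_user_dev write protocol), and error handling is mostly omitted; the device name is arbitrary.

```cpp
// Minimal sketch: creating a virtual mouse via /dev/uinput and replaying a
// relative movement, as a received operation parameter would be replayed.
#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/uinput.h>

int createVirtualMouse() {
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) return -1;

    // Declare what the virtual device can emit: mouse buttons and relative
    // X/Y motion, i.e. the events a conventional mouse produces.
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_KEYBIT, BTN_RIGHT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);

    struct uinput_setup setup = {};
    setup.id.bustype = BUS_VIRTUAL;
    std::snprintf(setup.name, sizeof(setup.name), "pc-shared-mouse");
    ioctl(fd, UI_DEV_SETUP, &setup);
    ioctl(fd, UI_DEV_CREATE);
    return fd;
}

// Replay one received movement (dx, dy) through the virtual device.
void emitMove(int fd, int dx, int dy) {
    struct input_event ev = {};
    ev.type = EV_REL; ev.code = REL_X;      ev.value = dx; write(fd, &ev, sizeof(ev));
    ev.code = REL_Y;                        ev.value = dy; write(fd, &ev, sizeof(ev));
    ev.type = EV_SYN; ev.code = SYN_REPORT; ev.value = 0;  write(fd, &ev, sizeof(ev));
}
```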
As another example, in some other embodiments the PC may enable keyboard-and-mouse sharing mode automatically by default once another terminal such as the tablet connects, without the user enabling it manually. After the other terminal, such as the tablet above, connects to the PC, it may likewise create the virtual input device automatically, without the PC sending a notification. Alternatively, after connecting to the PC, the terminal may first display a pop-up asking whether the user wants to control this device with the PC's input device, and create the virtual input device automatically only if the user agrees; otherwise it does not create one.
In addition, with reference to FIG. 2: since the mouse is the PC's input device, after another terminal such as the tablet connects to the PC, the PC generally continues, for the time being, to respond to mouse operations; in other words, the user's mouse temporarily controls the PC. In this embodiment, after enabling keyboard-and-mouse sharing mode, the PC may further, upon determining that a mouse shuttle condition is met, trigger a handover so that a connected terminal that has created a virtual input device, such as the tablet, responds to mouse operations instead. That is, once the mouse shuttle condition is met, the user can use the mouse to control a connected terminal that has created a virtual input device, such as the tablet.
Illustratively, the mouse shuttle condition may be that the mouse pointer displayed on the PC's screen, corresponding to the mouse, slides over the edge of the PC display. In other words, the user can move the mouse so that the pointer slides over the edge of the PC display, triggering the handover of mouse responses to a connected terminal that has created a virtual input device.
As an exemplary implementation, after enabling keyboard-and-mouse sharing mode the PC may start input (input) listening and mount a hook (HOOK). Input listening is used to monitor the pointer's relative displacement and coordinate position. Once the mouse shuttle starts, the mounted HOOK can intercept the corresponding input events (in other words, mask them); with a mouse as the input device, the input events are mouse events, so that after the PC's keyboard-and-mouse module receives a mouse event it is not passed on to the PC's Windows system. Once the shuttle starts, the mounted HOOK can also capture the parameters in the intercepted input events, such as the parameters in mouse events. The PC uses input listening to monitor the pointer's relative displacement and coordinate position and determines from the monitored data whether the mouse shuttle condition is met. Once it determines the condition is met, the mounted HOOK intercepts mouse events, captures the operation parameters in them, and sends the captured parameters to a connected terminal that has created a virtual input device, so that terminal can use its virtual input device to simulate the corresponding input event, such as a mouse event, and respond accordingly. This hands responses to mouse operations over to the connected terminal that created the virtual input device.
Of course, interception of input events and capture of their operation parameters may also be implemented in other ways (such as registering RAWINPUT on the PC), or interception and capture may be implemented by different mechanisms. For example, with a mouse as the input device, after enabling keyboard-and-mouse sharing mode the PC may mount a HOOK and also register RAWINPUT; once the mouse shuttle starts, the mounted HOOK intercepts (masks) mouse events, while the registered RAWINPUT captures the operation parameters in the intercepted mouse events. This embodiment places no limit on the specific implementation of intercepting mouse events and capturing their parameters. For ease of description, the following embodiments use the mounted HOOK for both interception and capture.
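A minimal sketch of the HOOK-based variant is shown below. The helpers shuttle_active() and forward_to_peer() are hypothetical stand-ins for the shuttle-state check and the transmission management module; only the hook mechanics themselves are standard Windows API:

    // Sketch: low-level mouse hook that swallows and captures events after the shuttle starts.
    #include <windows.h>

    bool shuttle_active();                                        // assumed helper
    void forward_to_peer(WPARAM msg, LONG x, LONG y, DWORD data); // assumed helper

    static HHOOK g_hook = nullptr;

    LRESULT CALLBACK LowLevelMouseProc(int nCode, WPARAM wParam, LPARAM lParam) {
        if (nCode == HC_ACTION && shuttle_active()) {
            auto* info = reinterpret_cast<MSLLHOOKSTRUCT*>(lParam);
            // Capture the operation parameters (message type, coordinates, wheel data)...
            forward_to_peer(wParam, info->pt.x, info->pt.y, info->mouseData);
            return 1;  // ...and return non-zero so Windows never sees the event
        }
        return CallNextHookEx(g_hook, nCode, wParam, lParam);
    }

    void mount_hook() {
        g_hook = SetWindowsHookExW(WH_MOUSE_LL, LowLevelMouseProc,
                                   GetModuleHandleW(nullptr), 0);
    }

Returning a non-zero value from the hook procedure is what implements the "masking" described above: the event is consumed before reaching the rest of the system.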
For example, taking the case where the user wants to transfer content on the PC to the tablet by dragging and continue the drag on the tablet, the process of handing mouse responses over to a connected terminal that has created a virtual input device, such as the tablet, may include the following S502-S506.
S502: The PC receives a mouse operation selecting content.
The content may be text (also called characters, text), a file, or a folder. The file may include files in one or more of the following formats: a Word document, an Excel workbook, a PowerPoint presentation, a bitmap, an image file, a plain-text file, an audio file, a video file, a Flash animation file, a web page file, a compressed file, and so on.
In addition, in this embodiment one or more items may be selected. For example, two Word documents are selected; or one Word document and one image file; or two folders.
S503: The PC receives a mouse-down event and mouse-move events, initiates a drag event based on them, and displays on the PC's screen an animation of the content's shadow moving with the mouse pointer.
It should be noted that the mouse pointer in this embodiment may also be called a cursor. A cursor may be an image, dynamic or static, and its style may differ across situations. This embodiment uses the term mouse pointer.
The PC's mouse pointer may be the first cursor in this application.
S504: The PC monitors the pointer's coordinate position on the PC display.
S505: Based on the pointer's coordinate position on the PC display, when the PC determines that the pointer has slid out of the edge of the PC display, it intercepts the mouse-move event and sends the mouse operation parameters contained in the mouse-move event to the tablet.
S506: The PC obtains the drag event content and the shadow's bitmap and sends the drag event content and the shadow's bitmap to the tablet.
The drag event content is used by the drag-continuation device, such as the tablet, to initiate a drag event. For example, when the content in S502 is text, the drag event content may include that text (text); when the content in S502 is a file or folder, the drag event content may include the file path (for example a uniform resource identifier (uniform resource identifier, uri)). The drag data in this application may include the drag event content and the shadow's bitmap, and can be used by the continuation device, such as the tablet, to display on its screen the animation of the object moving with the mouse pointer.
In this embodiment, with keyboard-and-mouse sharing mode enabled, when the user wants to transfer content on the PC by dragging to a connected terminal that has created a virtual input device, such as the tablet, and continue the drag on the tablet, the user may use the PC's input device to select the content to drag. Then, with the PC's pointer displayed over that content, the user may input the drag operation, so the PC can drag the corresponding object — that content (such as the content selected in S502) — according to the drag operation.
The drag operation may be an operation indicating that a drag event for the selected content should be initiated. It may include one operation or several. For example, the drag operation includes two operations: a press operation and a move operation. With a mouse as the input device, the press operation may be a mouse-down operation and the move operation a mouse-move operation. The user may press and move the mouse (that is, input a mouse-down operation and a mouse-move operation with the PC's mouse) to trigger the PC's Windows system to initiate a drag event for the content, so the content (such as its shadow) moves on the PC's screen with the PC's pointer. The shadow may also be called a drag shadow.
While the content (such as its shadow) follows the pointer, the PC may judge whether the dragged content (such as its shadow) has been dragged out of the edge of the PC display; when it has, the mouse shuttle condition may be triggered. In this embodiment, "dragged out of the edge of the PC display" may mean that part of the region of the content (such as its shadow) has been dragged out of (overflows) the PC display (that is, the proportion of the window overflowing the display exceeds a preset threshold), or that all of its region has been dragged out of (overflows) the PC display, or that the pointer slides out of the edge of the PC display as the content (such as its shadow) moves with it on the PC display; this embodiment places no specific limit here. The following description uses, as the test of whether the dragged content (such as its shadow) has been dragged out of the edge of the PC display, whether the pointer has slid out of the edge of the PC display.
As one example, the user can keep moving the mouse in the same direction so that the pointer correspondingly displayed on the PC display slides over (in other words, slides out of) the edge of the PC display, which triggers the mouse shuttle condition.
Illustratively, the PC can determine the pointer's coordinate position on the PC display from the pointer's initial position and relative displacement, and thereby determine whether the pointer has slid out of the edge of the PC display.
The pointer's initial position may be the pointer's coordinate position on the PC display when the mouse starts moving, or in other words just before the mouse starts moving. Specifically, the initial position may be the coordinate position in a coordinate system whose origin is the top-left corner of the PC display, with the X axis pointing from the top-left corner toward the right edge of the PC display and the Y axis pointing from the top-left corner toward the bottom edge. For example, the PC's specific process of determining whether the pointer has slid out of the edge of the PC display may be as follows. With reference to FIG. 6A, the PC may establish a coordinate system with the initial coordinate position as origin (position o in FIG. 6A), the X axis pointing from origin o toward the right edge of the PC display and the Y axis pointing from origin o toward the top edge. The PC may determine the coordinate values of each edge of the PC display in this coordinate system; these can be determined from the PC display's resolution and the pointer's initial position. As shown in FIG. 6A, in this coordinate system the right edge of the PC display has coordinate value x1 on the X axis, the left edge has coordinate value -x2 on the X axis, the top edge has coordinate value y1 on the Y axis, and the bottom edge has coordinate value -y2 on the Y axis. After the mouse moves, it reports the pointer's corresponding relative displacement to the PC, from which the PC can compute the pointer's coordinate position (x, y) on the PC display after the movement. From that position (x, y) the PC can determine whether the pointer has slid out of the edge of the PC display: if the pointer's X-axis coordinate x is greater than x1, the pointer has slid out of the right edge; if x is less than -x2, the left edge; if the Y-axis coordinate y is greater than y1, the top edge; if y is less than -y2, the bottom edge.
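Collapsing that arithmetic into code, a sketch of the accumulate-and-compare step might look like the following (the struct and names are illustrative, not from this application):

    // Sketch: track the pointer in the FIG. 6A coordinate system and test the edges.
    // x1/x2/y1/y2 are the edge coordinates derived from the screen resolution and
    // the pointer's initial position, as described above.
    struct EdgeBox { long x1, x2, y1, y2; };  // right, left, top, bottom extents

    enum class Edge { None, Left, Right, Top, Bottom };

    Edge update_and_check(long dx, long dy, long& x, long& y, const EdgeBox& b) {
        x += dx;   // accumulate the relative displacement the mouse reports
        y -= dy;   // screen Y grows downward; this coordinate system grows upward
        if (x >  b.x1) return Edge::Right;
        if (x < -b.x2) return Edge::Left;
        if (y >  b.y1) return Edge::Top;
        if (y < -b.y2) return Edge::Bottom;
        return Edge::None;
    }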
Once the PC determines that the pointer has slid out of the edge of the PC display — that is, once the mouse shuttle condition is triggered — the user can use the PC's input device to control connected terminals that have created virtual input devices. In other words, after the shuttle condition triggers, the PC can send the data of the operations the user inputs with the PC's input device to the terminals that created virtual input devices. For instance, if the user keeps moving the mouse in the same direction, the PC can intercept the received move events, such as mouse-move events, and transmit the operation parameters they contain, such as mouse operation parameters, to a connected terminal that has created a virtual input device, so that terminal can continue the drag event.
It should be noted that in this embodiment, if only one device is connected to the PC — the tablet above — and the tablet has created a virtual input device, the PC can transmit the corresponding operation parameters to the tablet so the tablet can continue the drag event.
If multiple devices are connected to the PC and some or all of them have created virtual input devices, then in some embodiments, upon determining that the shuttle condition is triggered, the PC may display a list option on its screen containing the identifiers of the connected devices that created virtual input devices (including the tablet's identifier). The PC can determine the drag-continuation device from the user's choice: if the user selects the tablet's identifier, the PC sends the corresponding operation parameters to the tablet so the tablet can continue the drag event. A connected device, once it finishes creating its virtual input device, can send the PC an indication message that creation succeeded; from the received indications the PC knows which connected devices successfully created virtual input devices and populates the list option accordingly. In some other embodiments, a shuttle relationship may be configured in advance. If multiple devices are connected to the PC and some or all of them have created virtual input devices, the continuation device can be determined from the pre-configured shuttle relationship. For example, the devices connected to the PC include the tablet, the tablet created a virtual input device, and the pre-configured relationship is that the pointer slides out of the left side (the left edge) of the PC display; then the continuation device is determined to be the tablet. So when the user presses and moves the mouse so the pointer slides over the left edge of the PC display, the PC determines not only that the shuttle has started but also that the continuation device is the tablet. Of course, if one device is connected to the PC and it created a virtual input device, the pre-configured shuttle relationship can likewise determine whether that device continues the drag: for example, if the configured relationship is that the pointer slides out of the left edge of the PC display then the mouse shuttles to the tablet, but the user presses and moves the mouse so the pointer slides over the right edge, the PC determines the mouse does not shuttle to the tablet. In still other embodiments, device position can be recognized to determine the continuation device. For example, when the user presses and moves the mouse so the pointer slides over the left edge of the PC display, positioning technologies such as Bluetooth, ultra-wideband (Ultra-wideband, UWB), or ultrasound can identify the positions of devices around the PC; if the PC recognizes the tablet to its left, the continuation device is determined to be the tablet. A toy sketch of the pre-configured case follows this paragraph.
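As an illustration only, a pre-configured shuttle relationship could be held as a simple map from exit edge to continuation device, reusing the Edge enum from the earlier sketch; the device identifier is invented:

    // Sketch: which connected device continues the drag for a given exit edge.
    #include <map>
    #include <string>

    enum class Edge { None, Left, Right, Top, Bottom };  // as in the earlier sketch

    std::map<Edge, std::string> g_shuttle_config = {
        {Edge::Left, "tablet-01"},  // pointer leaves the left edge -> the tablet
    };

    // Returns the continuation device, or an empty string if no shuttle is configured.
    std::string shuttle_target(Edge e) {
        auto it = g_shuttle_config.find(e);
        return it == g_shuttle_config.end() ? std::string() : it->second;
    }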
The shuttle relationship may be configured in advance by the user through a configuration file, or the user may be offered a configuration interface through which to configure it in advance. For example, suppose the user configures the tablet's shuttle relationship through the interface. The PC receives the user's operation opening the configuration interface and displays it; the interface contains the PC's identifier (such as the PC's icon) and the tablet's identifier (such as the tablet's icon), and the user configures the shuttle relationship by dragging these two identifiers. As one example, if the user places the tablet's identifier to the left of the PC's identifier, the PC determines that when the pointer slides over the left edge of the PC display, the continuation device is the tablet; if the user places the tablet's identifier to the right, then the continuation device is the tablet when the pointer slides over the right edge. When there are multiple devices, each device's shuttle relationship can be configured in advance this way. The following embodiments assume the determined continuation device is the tablet. It should be noted that, for the two implementations that determine the continuation device from a pre-configured shuttle relationship or from recognized device positions, S501 may be performed before or after the mouse shuttle is triggered; this embodiment places no specific limit here.
Illustratively, with reference to FIG. 4: after the user selects the content to drag with the mouse, the PC's keyboard-and-mouse module receives the corresponding operation, namely the mouse operation selecting the content. The user can move the mouse so the PC's pointer rests on the selected content. Then, if the user presses and moves the mouse (that is, inputs a press operation (such as a mouse-down operation) and a move operation (such as a mouse-move operation) with the PC's mouse), the PC's keyboard-and-mouse module receives the corresponding press event (mouse-down event) and move events (mouse-move events). At this point, since the mouse shuttle condition has not yet triggered, the mounted HOOK does not intercept the input events, and the mouse-down and mouse-move events pass to the PC's Windows system. From the received mouse-down and mouse-move events, the PC's Windows system can initiate a drag event for the content and draw the animation of the content — such as its shadow — moving with the pointer (also called the drag animation) on the PC's screen. For example, as shown in FIG. 6B, the user wants to drag picture 601 on the PC to the tablet and continue the drag there. The user selects picture 601 with mouse 602, then presses and moves mouse 602. As mouse 602 moves, the PC displays on display 603 the animation of picture 601's shadow 606 moving with pointer 604; the drag trajectory of shadow 606 following pointer 604 is shown as trajectory 605 in FIG. 6B. It should be noted that in this embodiment the selection operation — the mouse operation selecting content in S502 — is optional. For example, when the content the user wants to drag is a single file or folder, the user can skip the selection operation and, with the pointer over the file or folder, perform the press and move operations to initiate the drag event for that file or folder.
As described in S501, after keyboard-and-mouse sharing mode is enabled, the PC starts input listening and mounts the HOOK. After the drag event is initiated, the pointer moves across the PC display, and the PC's keyboard-and-mouse module uses input listening to monitor the pointer's real-time coordinate position on the PC display. When the module determines from the monitored real-time coordinates that the pointer has slid out of the edge of the PC display, it can determine that the mouse shuttle condition above is met, and thus that the mouse shuttle has started.
After the PC's keyboard-and-mouse module determines that the mouse shuttle has started, it may send shuttle state information indicating that the mouse shuttle has started to the tablet through the PC's transmission management module over the connection established with the tablet. Upon receiving the information, the tablet can simulate a pointer and display it on the tablet's screen (this pointer displayed on the tablet may be the second cursor in this application). The PC's keyboard-and-mouse module may also hide the pointer displayed on the PC display; of course, the object moving with the pointer, such as the object's shadow, is hidden too. For example, with reference to FIG. 6B, as mouse 602 moves and after pointer 604 slides over the edge of display 603, the PC hides shadow 606 of picture 601 and pointer 604 displayed on display 603. Meanwhile the tablet displays a pointer on its own screen, giving the user the visual effect of the pointer shuttling from the PC to the tablet.
After the PC's keyboard-and-mouse module determines that the mouse shuttle has started, if the user operates the mouse, the module uses the HOOK to intercept the corresponding received input events, such as mouse events, and to capture the operation parameters in the intercepted mouse events, such as mouse operation parameters. The mouse operation parameters may include: a mouse button flag (indicating whether the user pressed, lifted, moved, or scrolled the wheel), coordinate information (when the user moved the mouse, indicating the X and Y movement), wheel information (when the user operated the wheel, indicating the X-axis and Y-axis scroll distances), and key position information (indicating which of the left, middle, or right mouse buttons was operated). The PC's keyboard-and-mouse module can also transmit the captured operation parameters, such as mouse operation parameters, to the tablet through the PC's transmission management module over the established connection, for the tablet to respond accordingly. For example, continuing the example of FIG. 6B: after the pointer slides over the edge of the PC display, the user keeps moving the mouse in the same direction. The PC's keyboard-and-mouse module receives the move events, such as mouse-move events, and at this point uses the HOOK to intercept them (in other words, mask them) so the mouse-move events are not sent to the PC's Windows system and the PC does not respond to them. The module also uses the HOOK to capture the operation parameters of the intercepted mouse-move events, such as the mouse operation parameters, and sends the captured parameters to the tablet through the PC's transmission management module over the established connection. As one example, when the mouse event is a mouse-move event, the corresponding mouse operation parameters may be: a mouse button flag indicating the user moved the mouse, coordinate information giving the X and Y movement, wheel information (empty), and key position information (empty).
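One plausible layout for those captured parameters is sketched below. This is purely an assumption about the wire format — the application does not fix one — with invented field names:

    // Sketch: an assumed layout for the captured mouse operation parameters.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    struct MouseParams {
        uint32_t button_flag;       // press / lift / move / wheel-scroll
        int32_t  x, y;              // movement coordinates (empty -> 0)
        int32_t  wheel_x, wheel_y;  // wheel scroll distances (empty -> 0)
        uint32_t key;               // left / middle / right button
    };

    std::vector<uint8_t> serialize(const MouseParams& p) {
        std::vector<uint8_t> buf(sizeof(p));
        std::memcpy(buf.data(), &p, sizeof(p));  // fixed-width, trivially copyable fields
        return buf;  // handed to the transmission module for the established connection
    }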
Also, after the mouse shuttle starts, the PC (such as the PC's drag management module) can identify the PC's current drag state (that is, whether a drag is in progress). If the PC is currently dragging, it can initiate continuation of the drag event, in other words a cross-screen drag. On the Windows side, a drag event must be initiated from a window and also received by a window. Therefore, after determining that the mouse shuttle has started, the PC may display an invisible window, also called a hidden window. Illustratively, after the PC's keyboard-and-mouse module determines that the shuttle has started, it may send a shuttle-start callback indication to the PC's drag management module. Based on the callback indication, the drag management module may send the PC's window management module a request to create the invisible window, and the window management module creates and displays it accordingly — for example at the edge of the PC display. The invisible window's transparency is greater than a threshold; for example, its transparency is very high, or it is fully transparent.
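A sketch of one way to build such an invisible window follows: a layered window with near-zero alpha, registered as an OLE drop target so an in-flight drag can enter it. MyDropTarget stands for a hypothetical IDropTarget implementation, and OleInitialize is assumed to have been called:

    // Sketch: a nearly transparent strip at the screen edge that can receive drags.
    #include <windows.h>
    #include <ole2.h>

    HWND create_invisible_drop_window(HINSTANCE inst, IDropTarget* target) {
        WNDCLASSW wc = {};
        wc.lpfnWndProc   = DefWindowProcW;
        wc.hInstance     = inst;
        wc.lpszClassName = L"DragShuttleWindow";  // illustrative class name
        RegisterClassW(&wc);

        HWND hwnd = CreateWindowExW(
            WS_EX_LAYERED | WS_EX_TOOLWINDOW | WS_EX_TOPMOST,
            wc.lpszClassName, L"", WS_POPUP,
            0, 0, 8, GetSystemMetrics(SM_CYSCREEN),  // thin strip at the display edge
            nullptr, nullptr, inst, nullptr);

        SetLayeredWindowAttributes(hwnd, 0, 1, LWA_ALPHA);  // alpha 1/255: effectively invisible
        ShowWindow(hwnd, SW_SHOWNOACTIVATE);
        RegisterDragDrop(hwnd, target);  // drag events entering the strip are now delivered
        return hwnd;
    }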
After the invisible window is displayed: if a drag was in progress on the PC when the mouse shuttle occurred — that is, the PC's Windows system had initiated a drag event for content — the invisible window can receive that drag event from the Windows system. If no drag was in progress when the shuttle occurred — the user merely moved the mouse without selecting any content — the invisible window receives no drag event. After the invisible window receives the drag event, the PC's window management module can obtain the drag event content from the drag event the window received; for example, the module can capture the drag event content from the drag event through the DragEnter event. Having obtained the drag event content, the window management module can send it to the tablet through the PC's transmission management module over the connection established with the tablet. Before sending, the PC may also serialize the drag event content; in other words, the drag event content the PC sends to the tablet may be the data obtained after serialization.
Illustratively, FIG. 7 is a schematic diagram of the data structure of a Windows-side drag event provided by an embodiment of this application. When the drag event enters the invisible window, the window receives the data object corresponding to the drag event, such as an IDataObject. The PC's window management module can attach (attach) it to a COleDataObject and then, through the DragEnter event, call the GetData function to obtain the drag event content in the IDataObject corresponding to the drag event. In this embodiment, the drag event content needed to continue the drag event may include text or a file path. As one example, the PC's window management module can obtain the text in the IDataObject through the GetData(CF_UNICODETEXT) function, and the file path in the IDataObject through the GetData(CF_HDROP) function. After obtaining the text or file path, it can serialize it and send it to the tablet.
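A minimal sketch of that extraction, assuming the MFC environment the COleDataObject class implies (error handling elided), might read:

    // Sketch: pull text or file paths out of the IDataObject the invisible window received.
    #include <afxole.h>    // COleDataObject (MFC)
    #include <shellapi.h>  // DragQueryFileW
    #include <string>
    #include <vector>

    std::wstring get_drag_text(IDataObject* data) {
        COleDataObject obj;
        obj.Attach(data, FALSE);                        // attach without taking ownership
        HGLOBAL h = obj.GetGlobalData(CF_UNICODETEXT);  // text payload, if any
        if (!h) return L"";
        std::wstring text(static_cast<wchar_t*>(GlobalLock(h)));
        GlobalUnlock(h);
        return text;
    }

    std::vector<std::wstring> get_drag_paths(IDataObject* data) {
        COleDataObject obj;
        obj.Attach(data, FALSE);
        std::vector<std::wstring> paths;
        if (HGLOBAL h = obj.GetGlobalData(CF_HDROP)) {  // file-list payload, if any
            HDROP drop = static_cast<HDROP>(GlobalLock(h));
            UINT n = DragQueryFileW(drop, 0xFFFFFFFF, nullptr, 0);  // file count
            for (UINT i = 0; i < n; ++i) {
                wchar_t buf[MAX_PATH];
                DragQueryFileW(drop, i, buf, MAX_PATH);
                paths.emplace_back(buf);
            }
            GlobalUnlock(h);
        }
        return paths;  // serialize and send to the continuation device
    }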
In addition, in this embodiment, so that the tablet side can display the dragged object — such as the animation of the object's shadow moving with the pointer — and since a shadow and a bitmap (bitmap) can be converted into each other, the PC must also obtain the bitmap of the shadow displayed on the PC side. For example, the PC can obtain the shadow's bitmap by capturing the image of the dragged content as displayed on the PC display. As another example, with dragged text, the PC can generate the shadow's bitmap from the dragged text. As yet another example, with a dragged file, the PC can locate the dragged content from the obtained file path to determine its type (such as an image file), and then use the type's default material as the shadow's bitmap, or fetch the content's thumbnail via the file path as the shadow's bitmap. The shadow's bitmap can be sent to the tablet through the PC's transmission management module over the connection established with the tablet. Before sending, the PC may also serialize it; in other words, the shadow bitmap the PC sends to the tablet may be the data obtained after serialization.
S507: The tablet receives the drag event content and the shadow's bitmap and initiates a drag event based on them.
S508: The tablet receives the mouse operation parameters and simulates a mouse-move event from them.
S509: The tablet generates a mouse-down event.
After the tablet receives the drag event content and the shadow's bitmap, it can parse them and initiate a drag event. In Android, a drag event must be initiated from a view control and also received by a view control. Therefore the tablet may start a transparent activity (activity), also called an invisible activity, which has a view control whose transparency is greater than a threshold. Using this view control and calling an Android Open Source Project (Android open source project, AOSP) interface, the tablet can initiate the corresponding drag event from the received drag event content and shadow bitmap, so the drag event continues on the tablet.
Illustratively, with reference to FIG. 4 and the example of FIG. 7: after the tablet's transmission management module receives the drag event content and the shadow's bitmap, it can pass them to the tablet's drag management module. The tablet's drag management module parses the received drag event content to obtain the text or file path from the PC. From the obtained text or file path, the tablet's drag management module can construct the drag event's content data (clipData). The module can also generate the corresponding shadow from the received shadow bitmap. Then, using the view control of the transparent activity the tablet opened, the tablet's drag management module can call the startDragAndDrop method provided by the AOSP interface, with clipData and the shadow as input parameters, to initiate the tablet-side drag event. Of course, if the PC serialized the drag event content and shadow bitmap before sending them to the tablet, the tablet deserializes the received data to recover them.
As one example, FIG. 8 is a schematic diagram of the data structure of an Android-side drag event (DragEvent) provided by an embodiment of this application. With reference to FIG. 8, the tablet's drag management module can construct the content data (clipData) from the text (text) or file path (uri) received from the PC (the text or file path is contained in the item (item) the content data includes), generate the corresponding shadow from the received bitmap, and then call AOSP's startDragAndDrop method with clipData and the shadow — together with other parameters obtained from the information about the user's mouse operations (such as the received mouse-move event), for example the action (action, which may include start, enter, hover, release, leave, end, and so on), the current x coordinate, the current y coordinate, the local state (localstate), and the content description (clipdescription) — as input parameters, to dispatch (dispatch) the tablet-side drag event (DragEvent).
It should be noted that, with reference to FIG. 8, the content description includes a label (label). The label indicates whether the drag event was initiated by the tablet's drag management module or by an application on the tablet. For example, the label is a string (String): when the label is "windowscast", it indicates the drag event was initiated by the tablet's drag management module; when the label is not "windowscast", it indicates the drag event was initiated not by the tablet's drag management module but by a tablet application. In the example above, since the drag event is initiated by the drag management module, the label in the drag event's content description is "windowscast". The specific descriptions and construction rules of the other parameters are similar to the corresponding prior-art implementation of generating an original drag event on the Android side and are not repeated here.
Understandably, execution of the drag event is triggered by the drag operation, which may include a press operation (such as a mouse-down operation) and a move operation (such as a mouse-move operation). When the user wants to transfer content on the PC to the tablet by dragging and continue the drag on the tablet, after triggering the mouse shuttle by moving the mouse, the user keeps moving the mouse in the same direction. That is, the user inputs a move operation with the PC's input device, such as a mouse-move operation with the PC's mouse. As described in S505, after the user inputs a mouse-move operation with the PC's mouse, the PC's keyboard-and-mouse module intercepts the corresponding received move event, such as the mouse-move event, captures the operation parameters it contains, such as the mouse operation parameters, and sends them through the PC's transmission management module to the tablet, such as to the tablet's transmission management module, which receives them. Because the PC and the tablet run different operating systems, the key values of the mouse operation parameters in their mouse events differ. Therefore, after the tablet receives the mouse operation parameters, it can convert the received parameters' key position codes into key position codes the tablet can recognize according to a preset mapping. Then the tablet (such as the tablet's keyboard-and-mouse module) uses the created virtual input device to simulate, from the code-converted mouse operation parameters, an input event the tablet can recognize, such as a mouse event — here, a move event the tablet can recognize, such as a mouse-move event. The tablet's keyboard-and-mouse module can also send the mouse-move event to the tablet's framework layer.
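Since the virtual input device in the Android example above is uinput-based, the conversion-and-simulation step might look like the following sketch. The PC-side flag values and the enum are assumptions for illustration; BTN_LEFT and friends are the real Linux input codes:

    // Sketch: map the PC's button flag to a Linux input code and emit it via uinput.
    #include <linux/uinput.h>
    #include <unistd.h>
    #include <cstdint>
    #include <cstring>
    #include <unordered_map>

    enum PcButton : uint32_t { PC_BTN_LEFT = 1, PC_BTN_MIDDLE = 2, PC_BTN_RIGHT = 3 };  // assumed

    static const std::unordered_map<uint32_t, uint16_t> kKeyMap = {
        {PC_BTN_LEFT, BTN_LEFT}, {PC_BTN_MIDDLE, BTN_MIDDLE}, {PC_BTN_RIGHT, BTN_RIGHT},
    };

    void emit(int fd, uint16_t type, uint16_t code, int32_t value) {
        struct input_event ev;
        std::memset(&ev, 0, sizeof(ev));
        ev.type = type; ev.code = code; ev.value = value;
        write(fd, &ev, sizeof(ev));  // fd is the virtual device created earlier
    }

    // Simulate a press (down=true) or lift (down=false) of the converted button.
    void simulate_button(int fd, uint32_t pc_flag, bool down) {
        auto it = kKeyMap.find(pc_flag);
        if (it == kKeyMap.end()) return;
        emit(fd, EV_KEY, it->second, down ? 1 : 0);
        emit(fd, EV_SYN, SYN_REPORT, 0);  // sync report closes each simulated event
    }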
In addition, when the mouse shuttle starts, the shuttle initiator — the PC — is in a dragging state, that is, a mouse-down state, whereas the shuttle target — the tablet — has at this point received only mouse-move events and is not in a mouse-down state. Therefore the tablet can generate a press operation, such as a mouse-down operation. For example, after the tablet calls the startDragAndDrop method provided by the AOSP interface, it receives the drag-start callback (onDragStart). The tablet can then determine from the label in the drag event it initiated whether the event was initiated by the tablet's drag management module. When it determines the drag event was initiated by the drag management module (for example, the label is "windowscast"), the tablet can use the created virtual input device to generate (in other words, simulate) a press event, such as a mouse-down event, from the press operation's operation parameters; for example, the tablet's drag management module directs the tablet's keyboard-and-mouse module to use the virtual input device to send a mouse-down event to the tablet's framework layer. Only then can the drag event initiated on the tablet side in S507 be attached to the pointer displayed on the tablet's screen.
S510: The tablet executes the drag event based on the mouse-move and mouse-down events, and displays on the tablet's screen the animation of the content's shadow moving with the pointer.
The tablet (such as the tablet's framework layer) can execute the drag event initiated in S507 based on the move event, such as the mouse-move event, and the press event, such as the mouse-down event. While executing the drag event, the tablet can also display on its screen the content — such as its shadow — moving with the pointer (this pointer may be the second cursor in this application). For example, with reference to FIG. 6B and as shown in FIG. 9, as mouse 901 moves, the tablet displays on display 902 the animation of the picture's shadow 903 moving with pointer 904; the drag trajectory of shadow 903 following pointer 904 is shown as trajectory 905 in FIG. 9.
After the user drags content from the PC to the tablet and continues the drag on the tablet, the drag event on the tablet side has been attached to the pointer displayed on the tablet's screen, so the user can use the pointer to select the mouse release point precisely. For example, if the user wants to use or process the content on the tablet, the user moves the pointer onto the tablet view control where the content is to be used or processed, then releases the mouse. When the mouse is released, the PC's keyboard-and-mouse module receives the corresponding lift event, such as the mouse-up event. Since this mouse-up event is received by the PC after the shuttle started, the PC's keyboard-and-mouse module uses the HOOK to intercept it (in other words, mask it) so the mouse-up event is not sent to the PC's Windows system and the PC does not respond to it. The module also uses the HOOK to capture the operation parameters the mouse-up event contains, such as the mouse operation parameters, and sends the captured parameters to the tablet through the PC's transmission management module over the established connection. The mouse-up event's mouse operation parameters may include: a mouse button flag indicating the user lifted the mouse, coordinate information (empty), wheel information (empty), and key position information indicating the user operated the left button. Accordingly, the tablet's transmission management module receives these parameters. The tablet can convert the received parameters' key position codes into codes the tablet recognizes according to the preset mapping, and then use the created virtual input device to simulate, from the code-converted operation parameters, a mouse event the tablet recognizes — that is, the corresponding input event, such as the mouse-up event.
After simulating the mouse-up event, the tablet can determine the mouse release point from the pointer's current coordinate position. For example, once the tablet learns the keyboard-and-mouse shuttle has started, it may register a listener for the pointer's coordinate position; through it, the tablet monitors the pointer's real-time position on the tablet's display. On this basis, after the tablet receives the mouse-up event, the listener gives the pointer's current coordinate position on the tablet's display, and from it the tablet determines the release point. As another example, after the tablet calls AOSP's startDragAndDrop method to initiate the drag event, it listens for input events. For instance, while the user continues the drag on the tablet, the tablet observes the move events, such as mouse-move events, and can obtain their operation parameters, such as the mouse operation parameters, for example by extracting them from the MotionEvent; these parameters include coordinate information indicating the mouse position. Later, when the user releases and the tablet observes the lift event, such as the mouse-up event, it can determine the pointer's coordinate position from the previously obtained coordinate information and hence, from the obtained position, determine the release point.
After the user releases the mouse, since the PC side simply intercepted the received lift event, such as the mouse-up event, the PC-side drag event has not been released. The tablet can therefore, after receiving the mouse-up event, send signaling to the PC over the established connection telling the PC to release the drag event.
The tablet also responds to the lift event, such as the mouse-up event, accordingly. For example, with the content of S502 being text: after the tablet simulates the mouse-up event, the tablet's drag management module can send the content data of the above drag event — which includes the text — to the view control at the release point. After receiving the content data, the control can process the text in it accordingly, for example displaying the text in the control. As another example, with the content of S502 being a file: while the file was dragged on the tablet side, the file itself was in fact never transferred to the tablet. Therefore, after the user releases the mouse and the PC receives the tablet's signaling telling it to release the drag event, the PC can transfer the file to the tablet. After receiving the file, the tablet can store it under a predetermined cache directory; the tablet's drag management module can then obtain the file's uri (call it uri 1), which is the path of the file under that cache directory and differs from the uri in the drag event content the PC sent to the tablet in S507 (the uri the PC sent is the file's storage path on the PC). From uri 1 the tablet's drag management module can construct new content data and, as the response to the mouse-up event, send it to the view control at the release point. After receiving the content data, the control can process it accordingly: if it is a view control in Memo, it can display the file; if it is the input box of a chat window, it can send the file.
In some other embodiments, when the tablet is also connected to another device such as a mobile phone, if the user wants to continue dragging the content to the phone, the user can keep moving the mouse so the tablet's pointer slides over the edge of the tablet's display, triggering a mouse shuttle from the tablet to the phone so the drag event can continue on the phone. It should be noted that after the mouse shuttles to the phone, the PC can interact with the phone directly so the phone continues the drag event. The specifics of continuing the drag event on the phone are similar to those of continuing it on the tablet and are not repeated one by one here.
Understandably, through the above process and without starting screen projection, the user can use an input device such as the PC's mouse to drag content such as text or files in a PC application to the edge of the PC screen and keep moving in the same direction to trigger the mouse shuttle. After the shuttle starts, the pointer appears on the tablet. By sending the drag event content to the tablet, the PC lets the drag event stay attached to the tablet's pointer, continuing the drag event on the tablet and giving the user the visual effect of the content being dragged from the PC onto the tablet.
In the embodiments of this application, the user may need not only to transfer content on the PC to the tablet by dragging and continue the drag on the tablet, but also to transfer content on the tablet to the PC by dragging. For example, continuing the example of S501-S510 above: the user transferred picture 1 on the PC to the tablet by dragging, continued the drag on the tablet, and released the mouse; then the user edited picture 1 with the tablet's stylus and saved it. The user now wants to transfer the edited picture 1 to the PC by dragging and, after continuing the drag on the PC, release the mouse so the edited picture 1 is saved on the PC side.
Refer to FIG. 10, a flowchart of another cross-device object drag method provided by an embodiment of this application. Building on the embodiment of FIG. 5, the method of this embodiment is introduced using the example that, after the mouse shuttle has occurred, the user wants to transfer content on the tablet to the PC by dragging and continue the drag on the PC. As shown in FIG. 10, the method may include the following S1001-S1010.
S1001: The tablet receives, from the PC, a mouse operation selecting content.
S1002: The tablet receives from the PC the mouse operation parameters of a mouse-down event and mouse-move events, simulates the mouse-down and mouse-move events from the parameters, initiates a drag event based on them, and displays on the tablet's screen the animation of the content's shadow moving with the pointer.
As described in the embodiment of FIG. 5, with keyboard-and-mouse sharing mode enabled, if the mouse shuttle occurs, the PC uses the mounted HOOK to intercept the corresponding input events, such as mouse events, after the user operates the PC's input device such as the mouse, and the PC captures the operation parameters contained in the intercepted mouse events, such as the mouse operation parameters, and sends them to the tablet. Illustratively, take the case where, after the mouse has shuttled to the tablet, the user wants to transfer content on the tablet to the PC by dragging and continue the drag on the PC. With reference to FIG. 4: the user selects the content to drag with the PC's input device, such as the mouse. The PC's keyboard-and-mouse module receives the corresponding input operation, such as the mouse operation, and uses the HOOK to intercept the corresponding received input event, such as the mouse event, so the input event is not sent to the PC's Windows system and the PC does not respond to it. The module also uses the HOOK to capture the operation parameters in the intercepted input event, such as the mouse operation parameters, and transmits them to the tablet through the PC's transmission management module over the established connection. The tablet's transmission management module receives the corresponding operation parameters, such as the mouse operation parameters. After converting the received parameters' key position codes, the tablet uses the created virtual input device to simulate the corresponding input event, such as the mouse event.
Then, with the tablet's pointer displayed over the object to drag — such as the user-selected content — the user can use the PC's input device, such as the mouse, to input the drag operation, so the tablet can drag the corresponding object, that content, according to the drag operation. The drag operation may be an operation indicating that a drag event for the selected content should be initiated. It may include one operation or several: for example, two operations, a press operation and a move operation. With a mouse as the input device, the press operation may be a mouse-down operation and the move operation a mouse-move operation. For instance, the user presses and moves the PC's mouse, that is, inputs a mouse-down operation and a mouse-move operation with the PC's mouse. The PC's keyboard-and-mouse module then receives the press event (such as the mouse-down event) and move events (such as mouse-move events). Similarly, the module uses the HOOK to intercept the received mouse-down and mouse-move events, captures their operation parameters, such as the mouse operation parameters, with the HOOK, and transmits them to the tablet through the PC's transmission management module over the established connection. The tablet's transmission management module receives the corresponding parameters; after key-position-code conversion, the tablet uses the created virtual input device to simulate the corresponding input events — the press event (such as the mouse-down event) and move events (such as mouse-move events).
After the tablet simulates the press event (such as the mouse-down event) and move events (such as mouse-move events), in response to these events the corresponding application on the tablet initiates the drag event (if the selected content is a Word document, the application is an office application; if a picture in the file manager, it is the file manager; if a passage of text in Memo, it is Memo) and draws the animation of the content — such as its shadow — moving with the pointer on the tablet's screen. For example, the user edits picture 1, dragged over from the PC, with the tablet's stylus and saves it; the user then wants to drag the edited picture to the PC and continue the drag there. As shown in FIG. 11, the user can select the edited picture 1102 with mouse 1101, then press and move mouse 1101. As mouse 1101 moves, the tablet displays on display 1103 the animation of edited picture 1102's shadow 1106 moving with pointer 1104; the drag trajectory of shadow 1106 following pointer 1104 is shown as trajectory 1105 in FIG. 11.
It should be noted that the above description uses the example of the drag-triggering events including a press event and a move event — for example, the user triggers the drag by pressing and moving the mouse. In some other embodiments, the triggering event may include only a press event: for example, the user may trigger the drag with a long press of the mouse, or simply by pressing the mouse. This embodiment places no specific limit here.
S1003: The tablet obtains the drag event content and the shadow's bitmap, and caches the obtained drag event content and shadow bitmap.
The drag event content is used by the drag-continuation device to construct a drag event. For example, when the content in S1001 is text, the drag event content may include the text (text); when the content in S1001 is a file or folder, the drag event content may include the file path (such as a uri).
Illustratively, with reference to FIG. 4: when a tablet application initiates a drag event, the corresponding drag event content can be called back by the framework layer to the application layer. For example, when the application initiates the drag event by calling the startDragAndDrop method provided by the AOSP interface, the framework layer can extract the drag event content from it using program instrumentation: the instrumentation points and content can be determined in advance, and the instrumentation performed accordingly, so the drag event content can be extracted. The tablet's framework layer can then call the extracted drag event content back to the drag management module in the tablet's application layer, which caches it after obtaining it. As one example, the drag event content needed to continue the drag event may include text or a file path; when the application initiates the drag event, the framework layer extracts the text or file path by program instrumentation.
In addition, in this embodiment a drag event may be initiated in two ways: by the tablet's drag management module (that is, the drag shuttled from the PC to the tablet, as described in the embodiment of FIG. 5) or by a tablet application. Furthermore, as can be seen from the description of the data structure of FIG. 8 in the embodiment of FIG. 5, in this embodiment the content description of the tablet's drag event includes a label (label) indicating whether the drag event was initiated by the tablet's drag management module or by a tablet application. Understandably, only when the drag event is initiated by a tablet application — in other words, not by the tablet's drag management module — does the drag event content need to be obtained and cached, so that after the drag shuttles to the PC, the PC can use it to continue the drag event. That is, after the drag event is initiated, the tablet can distinguish from the label (label) in its content description whether the event was initiated by a tablet application or by the drag management module. For example, when the label is not "windowscast", it can be determined that the drag event was initiated not by the tablet's drag management module but by a tablet application. Only when the drag event was not initiated by the drag management module does the framework layer need to obtain the drag event content and send it to the tablet's drag management module for caching.
In addition, in this embodiment the framework layer can also obtain the shadow's bitmap (bitmap) through a newly added interface or an existing one (such as the interface that calls back clipData). The obtained shadow bitmap can likewise be called back to the drag management module in the tablet's application layer, which caches it too after obtaining it.
S1004: The tablet monitors the pointer's coordinate position on the tablet's display.
S1005: Based on the pointer's coordinate position on the tablet's display, when the tablet determines that the pointer has slid out of the edge of the tablet's display, it sends the PC shuttle state information indicating that the mouse shuttle has ended.
S1006: The PC receives the shuttle state information indicating that the mouse shuttle has ended.
While the content (such as its shadow) follows the pointer, the tablet can judge whether the dragged content (such as its shadow) has been dragged out of the edge of the tablet's display. When it has, this indicates the user wants to control another device with the mouse. In this embodiment, the specific description of content being dragged out of the edge of the tablet's display is similar to that of content being dragged out of the PC's and is not detailed again here. The following uses, as the test of whether the dragged content (such as its shadow) has been dragged out of the edge of the tablet's display, whether the pointer has slid out of the edge of the tablet's display.
As one example, after the drag event is initiated, the pointer moves across the tablet's display, and the tablet can monitor the pointer's real-time coordinate position on the tablet's display. For instance, after the keyboard-and-mouse shuttle starts, the tablet may register a listener for the pointer's coordinate position; through it, the tablet monitors the pointer's real-time position on the tablet's display and can determine from the monitored real-time coordinates whether the pointer has slid over the edge of the tablet's display. Illustratively, the tablet can determine the pointer's coordinate position on the tablet's display from the pointer's initial position and relative displacement, and thereby determine whether the pointer has slid out of the edge. The pointer's initial position may be its coordinate position on the tablet's display when — or in other words just before — the mouse starts moving, specifically in a coordinate system whose origin is the top-left corner of the tablet's display, with the X axis pointing from the top-left corner toward the right edge of the tablet's display and the Y axis pointing toward the bottom edge. The tablet's specific implementation for determining that the pointer has slid out of the edge of the tablet's display is similar to the PC's implementation for its own display and is not detailed again here.
When the tablet determines the pointer has slid over the edge of the tablet's display, this indicates the user wants to control another device with the mouse. As described in S506: if the tablet is connected to only one device, the PC, this indicates the user wants to control the PC with the mouse; if the tablet is connected to multiple devices, the tablet may display a list option containing the identifiers of all connected devices, for the user to choose which device to control with the mouse. If the user selects the PC's identifier, this indicates the user wants to control the PC. Alternatively, a shuttle relationship may be pre-configured in the tablet to determine which device the mouse shuttles to — that is, which device takes over responding to mouse operations; the specific description of configuring and applying the shuttle relationship is similar to the corresponding description in the embodiments above and is not detailed again here. When the tablet determines the user wants to control the PC with the mouse, the tablet can determine that the mouse shuttle has ended. At this point, the tablet can send the PC, over the established connection, shuttle state information indicating that the mouse shuttle has ended.
S1007: The PC determines from the received shuttle state information that the mouse shuttle has ended.
S1008: The PC cancels the interception of mouse events and generates a mouse-down event.
S1009: The PC obtains the drag event content and the shadow's bitmap from the tablet and initiates a drag event based on them.
After the PC receives the shuttle state information indicating that the mouse shuttle has ended, it can determine that the mouse shuttle has ended.
Having determined the shuttle has ended, the PC can display the pointer on the PC display. As described in the embodiment of FIG. 5, the PC hid the pointer on the PC display when the shuttle started, so after determining the shuttle has ended it can redisplay the pointer on the PC display. Also, since the trigger condition for the shuttle start was the pointer sliding over the edge of the PC display, the pointer was displayed at the edge of the PC display before being hidden. In this embodiment, then, when the mouse shuttle ends and the PC unhides the pointer, the pointer is displayed at the edge of the PC display. Of course, when the shuttle ends, the tablet-side pointer is also no longer displayed. This gives the user the visual effect of the pointer shuttling from the tablet to the PC.
After determining the shuttle has ended, the PC also needs to unmount the HOOK (in other words, close it) to cancel the interception of input device events, such as mouse events. Illustratively, after the user, by moving the mouse, transfers the tablet's content to the PC by dragging and thereby triggers the end of the mouse shuttle, the user keeps moving the mouse in the same direction; the PC's keyboard-and-mouse module can receive the move events, such as mouse-move events. Since the HOOK has already been unmounted at this point, the PC's keyboard-and-mouse module passes the received move events, such as mouse-move events, to the PC's Windows system so that the Windows system responds to them.
In addition, before the mouse shuttle ends, the tablet is in a dragging state, and shuttling the mouse back to the PC is meant to continue the drag — that is, the drag must continue on the PC. On the Windows platform, a drag can be initiated normally only after the mouse is pressed on a window; but while the user keeps moving the mouse, the PC receives only move events, such as mouse-move events, and does not receive a press event, such as a mouse-down event. Therefore the PC (such as the PC's drag management module) can generate a press event, such as a mouse-down event, and deliver it to the invisible window. For example, after the PC determines the mouse shuttle has ended and while the tablet is in a dragging state, the PC can generate a mouse-down event and deliver it to the invisible window, so that the drag event initiated in S1009 on the PC side can be attached to the pointer displayed on the PC display — that is, the drag event continues on the PC side.
In this embodiment, after determining the shuttle has ended, the PC can also request the tablet's drag state (whether a drag is in progress) from the tablet; when the drag state the tablet returns shows a drag is in progress, the PC can request the drag event content and the shadow's bitmap from the tablet. The PC may send the tablet a request message requesting the drag data — that is, the drag event content and the shadow's bitmap.
Illustratively, with reference to FIG. 4: when the PC determines the mouse shuttle has ended, the PC's keyboard-and-mouse module can send a shuttle-end indication to the PC's drag management module. Based on the received indication, the drag management module can request the drag state from the tablet through the PC's transmission management module. When the drag state the tablet returns shows a drag is in progress, the PC's drag management module can request the drag event content and the shadow's bitmap from the tablet through the PC's transmission management module. Accordingly, the tablet's transmission management module receives the request and forwards it to the tablet's drag management module, which, after receiving it, returns the drag event content and shadow bitmap cached in S1003 to the PC's transmission management module through the tablet's transmission management module.
Following the description in S1003: after the PC's transmission management module receives the drag event content, it can pass it to the PC's drag management module, which parses the received drag event content to obtain the text or file path from the tablet. From the obtained text or file path, the PC's drag management module can construct the drag event's data object, such as an IDataObject. Also, after the PC's transmission management module receives the shadow's bitmap, the shadow can be restored on the PC side from the bitmap — for example, using the IDragSourceHelper interface the PC side provides. The PC can then initiate the PC-side drag event. For instance, as described in the embodiment of FIG. 5, the PC displayed an invisible window when the mouse shuttle started; the PC can therefore use that displayed invisible window to initiate the drag event, and close the invisible window after the drag event has been initiated.
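A minimal sketch of that shadow restoration and drag re-initiation is shown below. pDataObject and pDropSource stand for objects built from the content received from the tablet (hypothetical here), COM is assumed to be initialized, and error handling is elided:

    // Sketch: restore the drag shadow with IDragSourceHelper and resume the drag.
    #include <windows.h>
    #include <ole2.h>
    #include <shobjidl.h>  // IDragSourceHelper, SHDRAGIMAGE
    #include <shlguid.h>   // CLSID_DragDropHelper

    void resume_drag(IDataObject* pDataObject, IDropSource* pDropSource,
                     HBITMAP shadowBmp, SIZE size) {
        IDragSourceHelper* helper = nullptr;
        CoCreateInstance(CLSID_DragDropHelper, nullptr, CLSCTX_INPROC_SERVER,
                         IID_PPV_ARGS(&helper));

        SHDRAGIMAGE img = {};
        img.sizeDragImage = size;       // dimensions of the shadow bitmap
        img.hbmpDragImage = shadowBmp;  // bitmap reconstructed from the tablet's data
        helper->InitializeFromBitmap(&img, pDataObject);

        DWORD effect = 0;
        DoDragDrop(pDataObject, pDropSource,
                   DROPEFFECT_COPY | DROPEFFECT_MOVE, &effect);
        helper->Release();
    }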
S1010: The PC executes the drag event based on the mouse-move and mouse-down events, and displays on the PC's display the animation of the content's shadow moving with the pointer.
After the drag event is initiated, in response to the move events, such as mouse-move events, and the press event, such as the mouse-down event, the PC can execute the drag event, and display on the PC's display the content — such as its shadow — moving with the pointer. For example, with reference to FIG. 11 and as shown in FIG. 12, as mouse 1201 moves, the PC displays on display 1202 the animation of the edited picture's shadow 1203 moving with pointer 1204; the drag trajectory of shadow 1203 following pointer 1204 is shown as trajectory 1205 in FIG. 12.
It should be noted that the example above has the user first drag content from the PC to the tablet and then drag content from the tablet to the PC. In some other embodiments, the user may skip the PC-to-tablet drag and directly drag some content on the tablet to the PC. The specific implementation in such embodiments is similar to that of the embodiments of FIG. 5 and FIG. 10, with these differences. S502 is not performed. S503 is replaced by receiving the move events (such as mouse-move events) and displaying, based on them, the animation of the pointer moving on the PC's display. S506 is not performed, though the PC's operation of displaying the invisible window is still needed — the invisible window simply receives no drag event. S507 and S509 are not performed. S510 is replaced by the tablet displaying, based on the move events, such as mouse-move events, the animation of the pointer moving on the tablet's display. That is, after the mouse shuttle starts, the tablet can receive the operation, input by the user with the PC's input device such as the mouse, of moving the pointer displayed on the tablet side; in response, the tablet can display the pointer-movement animation on its display. If the user wants to drag some content, the user keeps moving the PC's mouse until the pointer moves onto that content. Specifically, the tablet receiving the user's pointer-move operation input with the PC's input device, such as the mouse, may work as follows: the PC intercepts the corresponding move event, such as the mouse-move event, and sends the operation parameters the move event includes to the tablet; the tablet can simulate the move event, such as the mouse-move event, from the parameters and thus display the pointer-movement animation on its display. The other operations are similar, and the embodiments of this application do not detail them again here.
The embodiments above are described with the tablet caching the drag data when the drag starts and the PC requesting the drag data from the tablet after the mouse shuttle ends. In some other embodiments, the tablet may instead skip caching the drag data when the drag starts and, after determining the mouse shuttle has ended, obtain the drag data and send it to the PC on its own initiative, without the PC requesting it. Also, the embodiments above use a mouse as the input device; in this embodiment the input device may also be a touchpad. With a touchpad, the user inputs the press operation with the touchpad's buttons (left or right) and the move operation by sliding a finger on the touchpad. The specific implementation of object dragging with a touchpad is similar to that with a mouse and is not repeated one by one here.
With the method provided by this embodiment, without starting screen projection and with the help of keyboard-and-mouse sharing, the user can use an input device such as a mouse to make content such as text or files follow the pointer, by dragging, and shuttle among the multiple terminals participating in collaborative use. The user is also allowed to process the transferred content on these terminals, so the hardware capabilities of all these terminals can take part in the collaborative work. In addition, since screen projection does not need to be started, no terminal's display space is taken up. This improves the efficiency of multi-terminal collaborative use and the user experience.
It should be noted that the embodiments above use content such as text or files as the dragged object. In some other embodiments, the dragged object may also be an application icon; or the dragged object may be a window, the window containing an application interface. When the object is an application icon, the drag data above may include the application icon; when the object is a window, the drag data may include the application interface (such as a screenshot of the application interface). The specific implementation is similar to that of dragging content such as files in the embodiments above and is not repeated one by one, with these differences. When the dragged object is an application icon or an application interface, after determining the object has been dragged out of the edge of the PC display, there is no need to create the invisible window, and the tablet does not need to start the transparent activity. Also, after the tablet receives the drag data it does not need to construct a drag event; it directly continues the drag from the application icon or application interface included in the drag data. Nor does the tablet need to generate a mouse-down event: the mouse-move events alone make the application icon or window follow the movement of the tablet's pointer. Furthermore, after the user releases the mouse and the tablet sends the received mouse-up event to the PC, the PC can send the dragged application's interface to the tablet so the tablet displays the application interface on its screen. For example, the user can use the PC's input device to drag an application icon or application interface displayed on the PC's screen to the tablet; the user can keep dragging on the tablet with the PC's input device, or release the drag, after which the tablet displays the application's interface on the tablet.
FIG. 13 is a schematic composition diagram of a cross-device object drag apparatus provided by an embodiment of this application. As shown in FIG. 13, the apparatus may be applied to a first terminal (such as the above PC) connected to a second terminal, and may include a display unit 1301, an input unit 1302, and a sending unit 1303.
The display unit 1301 is configured to display a first cursor on an object displayed by the first terminal.
The input unit 1302 is configured to receive a drag operation input by a user with the first terminal's input device, the drag operation being used to initiate a drag for the object.
The display unit 1301 is further configured to, in response to the drag operation, display on the first terminal's display an animation of the object moving with the first cursor.
The sending unit 1303 is configured to send drag data to the second terminal upon determining that the object has been dragged out of the edge of the first terminal's display.
Further, the drag data may be used by the second terminal to initiate a drag event for the object, so that the second terminal displays on its display an animation of the object moving with a second cursor.
Further, the drag operation may include a press operation and a move operation; the sending unit 1303 is further configured to send the second terminal the data of the move operation input by the user with the first terminal's input device.
Further, the apparatus may also include an interception unit 1304.
The interception unit 1304 is configured to intercept the move event while the user performs the move operation with the first terminal's input device; the sending unit 1303 is specifically configured to send the second terminal the operation parameters the move event includes.
Further, the sending unit 1303 is also configured to send shuttle state information to the second terminal, the shuttle state information indicating that the shuttle has started.
Further, the display unit 1301 is specifically configured to display on the first terminal's display an animation of the object's shadow moving with the first cursor.
Further, the display unit 1301 is also configured to hide the first cursor and the object's shadow.
Further, the object may be text, a file, or a folder; the drag data includes the drag event content and the shadow's bitmap. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is a file path.
Further, the display unit 1301 is also configured to display an invisible window whose transparency is greater than a threshold, the invisible window being used to receive the drag event. The apparatus may also include an obtaining unit 1305, configured to obtain the drag event content from the drag event received by the invisible window, and to obtain the shadow's bitmap.
Further, the object may be an application icon; or the object may be a window, the window containing an application interface. When the object is the application icon, the drag data includes the application icon; when the object is the window, the drag data includes the application interface.
FIG. 14 is a schematic composition diagram of another cross-device object drag apparatus provided by an embodiment of this application. As shown in FIG. 14, the apparatus may be applied to a second terminal (such as the above tablet) connected to a first terminal, and may include a receiving unit 1401 and a display unit 1402.
The display unit 1402 is configured to display, on the second terminal's display, an object dragged over from the first terminal, and to display a second cursor on the object.
The receiving unit 1401 is configured to receive a move operation input by a user with the first terminal's input device.
The display unit 1402 is further configured to display, based on the move operation, an animation of the object moving with the second cursor on the second terminal's display.
Further, the receiving unit 1401 is also configured to receive shuttle state information from the first terminal, the shuttle state information indicating that the shuttle has started.
Further, the receiving unit 1401 is also configured to receive drag data from the first terminal. The display unit 1402 is specifically configured to display, based on the drag data, the object dragged over from the first terminal on the second terminal's display.
The drag data and the move operation are sent to the second terminal by the first terminal after the first terminal determines, while the object moves with the first cursor on the first terminal's display, that the object has been dragged out of the edge of that display, and are used to initiate a drag event for the object.
Further, the apparatus may also include a generating unit 1403, configured to generate a press operation.
The display unit 1402 is specifically configured to display, based on the move operation, the press operation, and the drag data, an animation of the object moving with the second cursor on the second terminal's display.
Further, the generating unit 1403 is specifically configured to simulate a press event from the press operation's operation parameters. The receiving unit 1401 is specifically configured to receive operation parameters from the first terminal; the generating unit 1403 is further configured to simulate a move event from those operation parameters, which are the parameters contained in the move event the first terminal received after the user performed the move operation with the first terminal's input device. The display unit 1402 is specifically configured to display, in response to the press event and the move event and based on the drag data, an animation of the object moving with the second cursor on the second terminal's display.
Further, the apparatus may also include a creating unit 1404.
The creating unit 1404 is configured to create a virtual input device after the connection with the first terminal is successfully established. Alternatively, the receiving unit 1401 is further configured to receive a notification message from the first terminal indicating that the first terminal's keyboard-and-mouse sharing mode is enabled, and the creating unit 1404 is configured to create the virtual input device in response to the notification message. The virtual input device is used by the second terminal to simulate input events from operation parameters.
Further, the display unit 1402 is specifically configured to display, on the second terminal's display, a shadow of the object dragged over from the first terminal.
The display unit 1402 is specifically configured to display, based on the move operation, an animation of the object's shadow moving with the second cursor on the second terminal's display.
Further, the object may be text, a file, or a folder; the drag data includes the drag event content and the shadow's bitmap. When the object is text, the drag event content includes the text; when the object is a file or folder, the drag event content is a file path.
Further, the creating unit 1404 is also configured to create an invisible activity having a view control whose transparency is greater than a threshold, the view control being used to initiate the drag event.
Further, the object is an application icon; or the object is a window, the window containing an application interface. When the object is the application icon, the drag data includes the application icon; when the object is the window, the drag data includes the application interface.
An embodiment of this application further provides a cross-device object drag apparatus, which may be applied to the first terminal or the second terminal of the above embodiments. The apparatus may include a processor and a memory configured to store instructions executable by the processor, the processor being configured to implement, when executing the instructions, the functions or steps performed by the PC or the tablet in the above method embodiments.
An embodiment of this application further provides a terminal (which may be the first terminal or the second terminal of the above embodiments). The terminal may include a display, a memory, and one or more processors, with the display, memory, and processors coupled together. The memory stores computer program code comprising computer instructions; when the processor executes the computer instructions, the terminal can perform the functions or steps performed by the PC or the tablet in the above method embodiments. The terminal of course includes, but is not limited to, the display, memory, and one or more processors above; for example, the terminal's structure may refer to the tablet structure shown in FIG. 2.
An embodiment of this application further provides a chip system, which may be applied to the terminals (such as the first terminal or the second terminal) of the preceding embodiments. As shown in FIG. 15, the chip system includes at least one processor 1501 and at least one interface circuit 1502. The processor 1501 may be the processor of the above terminal. The processor 1501 and the interface circuit 1502 may be interconnected by wires. The processor 1501 may receive computer instructions from the above terminal's memory through the interface circuit 1502 and execute them; when the computer instructions are executed by the processor 1501, the terminal can be caused to perform the steps performed by the PC or the tablet in the above embodiments. The chip system may of course also contain other discrete components; the embodiments of this application place no specific limit on this.
An embodiment of this application further provides a computer-readable storage medium for storing the computer instructions run by the above terminal (such as the PC or the tablet).
An embodiment of this application further provides a computer program product comprising the computer instructions run by the above terminal (such as the PC or the tablet).
From the description of the implementations above, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the functional modules above is used as an example; in practical applications, the functions above may be assigned to different functional modules as needed — that is, the internal structure of the apparatus is divided into different functional modules to complete all or some of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative. For instance, the division into modules or units is merely a logical functional division, and there may be other division ways in actual implementation: multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not performed. Separately, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or of other forms.
The units described as separate parts may or may not be physically separate; parts shown as units may be one physical unit or multiple physical units, that is, located in one place, or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of this embodiment.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application — in essence, or the part contributing to the prior art, or all or part of the technical solutions — may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to perform all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (read only memory, ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disc.
The above content is merely specific implementations of this application, but the protection scope of this application is not limited thereto; any variation or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (25)

  1. A cross-device object drag method, applied to a second terminal connected to a first terminal, the method comprising:
    the second terminal displaying, on the second terminal's display, an object dragged over from the first terminal;
    the second terminal displaying a second cursor on the object;
    the second terminal receiving a move operation input by a user with the first terminal's input device; and
    the second terminal displaying, based on the move operation, an animation of the object moving with the second cursor on the second terminal's display.
  2. The method according to claim 1, wherein before the second terminal displays, on the second terminal's display, the object dragged over from the first terminal, the method further comprises:
    the second terminal receiving shuttle state information from the first terminal, the shuttle state information indicating that a shuttle has started.
  3. The method according to claim 1 or 2, wherein the second terminal displaying, on the second terminal's display, the object dragged over from the first terminal comprises:
    the second terminal receiving drag data from the first terminal; and
    the second terminal displaying, based on the drag data, the object dragged over from the first terminal on the second terminal's display;
    wherein the drag data and the move operation are sent to the second terminal by the first terminal after the first terminal determines, while the object moves with a first cursor on the first terminal's display, that the object has been dragged out of the edge of the first terminal's display, and are used to initiate a drag event for the object.
  4. The method according to claim 3, further comprising:
    the second terminal generating a press operation;
    wherein the second terminal displaying, based on the move operation, the animation of the object moving with the second cursor on the second terminal's display comprises:
    the second terminal displaying, based on the move operation, the press operation, and the drag data, the animation of the object moving with the second cursor on the second terminal's display.
  5. The method according to claim 4, wherein
    the second terminal generating the press operation comprises: the second terminal simulating a press event from operation parameters of the press operation;
    the second terminal receiving the move operation input by the user with the first terminal's input device comprises: the second terminal receiving operation parameters from the first terminal and simulating a move event from them, the operation parameters being those contained in the move event received by the first terminal after the user performed the move operation with the first terminal's input device; and
    the second terminal displaying, based on the move operation, the press operation, and the drag data, the animation of the object moving with the second cursor on the second terminal's display comprises:
    in response to the press event and the move event, the second terminal displaying, based on the drag data, the animation of the object moving with the second cursor on the second terminal's display.
  6. The method according to any one of claims 1-5, further comprising:
    the second terminal creating a virtual input device after the connection with the first terminal is successfully established; or
    the second terminal receiving a notification message from the first terminal, the notification message indicating that the first terminal's keyboard-and-mouse sharing mode is enabled, and, in response to the notification message, the second terminal creating the virtual input device;
    wherein the virtual input device is used by the second terminal to simulate input events from operation parameters.
  7. The method according to any one of claims 1-6, wherein the second terminal displaying, on the second terminal's display, the object dragged over from the first terminal comprises:
    the second terminal displaying, on the second terminal's display, a shadow of the object dragged over from the first terminal; and
    the second terminal displaying, based on the move operation, the animation of the object moving with the second cursor on the second terminal's display comprises:
    the second terminal displaying, based on the move operation, an animation of the object's shadow moving with the second cursor on the second terminal's display.
  8. The method according to claim 7, wherein the object is text, a file, or a folder; the drag data comprises drag event content and a bitmap of the shadow;
    wherein when the object is the text, the drag event content comprises the text, and when the object is the file or the folder, the drag event content is a file path.
  9. The method according to claim 8, wherein after the second terminal receives the drag data from the first terminal, the method further comprises:
    the second terminal creating an invisible activity, the invisible activity having a view control whose transparency is greater than a threshold, the view control being used to initiate the drag event.
  10. The method according to any one of claims 3-6, wherein the object is an application icon; or the object is a window, the window comprising an application interface;
    when the object is the application icon, the drag data comprises the application icon; when the object is the window, the drag data comprises the application interface.
  11. A cross-device object drag method, applied to a first terminal connected to a second terminal, the method comprising:
    the first terminal displaying a first cursor on an object displayed by the first terminal;
    the first terminal receiving a drag operation input by a user with the first terminal's input device, the drag operation being used to initiate a drag for the object;
    in response to the drag operation, the first terminal displaying, on the first terminal's display, an animation of the object moving with the first cursor; and
    the first terminal, upon determining that the object has been dragged out of the edge of the first terminal's display, sending drag data to the second terminal.
  12. The method according to claim 11, wherein
    the drag data is used by the second terminal to initiate a drag event for the object, so that the second terminal displays, on the second terminal's display, an animation of the object moving with a second cursor.
  13. The method according to claim 11 or 12, wherein the drag operation comprises a press operation and a move operation;
    and when the first terminal determines that the object has been dragged out of the edge of the first terminal's display, the method further comprises:
    the first terminal sending the second terminal the data of the move operation input by the user with the first terminal's input device.
  14. The method according to claim 13, wherein the first terminal sending the second terminal the data of the move operation input by the user with the first terminal's input device comprises:
    the first terminal intercepting a move event while the user performs the move operation with the first terminal's input device; and
    the first terminal sending the second terminal the operation parameters the move event comprises.
  15. The method according to any one of claims 11-14, wherein after the first terminal determines that the object has been dragged out of the edge of the first terminal's display, the method further comprises:
    the first terminal sending shuttle state information to the second terminal, the shuttle state information indicating that a shuttle has started.
  16. The method according to any one of claims 11-15, wherein the first terminal displaying, on the first terminal's display, the animation of the object moving with the first cursor comprises:
    the first terminal displaying, on the first terminal's display, an animation of the object's shadow moving with the first cursor.
  17. The method according to claim 16, wherein after the first terminal determines that the object has been dragged out of the edge of the first terminal's display, the method further comprises:
    the first terminal hiding the first cursor and the object's shadow.
  18. The method according to claim 16 or 17, wherein the object is text, a file, or a folder; the drag data comprises drag event content and a bitmap of the shadow;
    wherein when the object is the text, the drag event content comprises the text, and when the object is the file or the folder, the drag event content is a file path.
  19. The method according to claim 18, wherein after the first terminal determines that the object has been dragged out of the edge of the first terminal's display, the method further comprises:
    the first terminal displaying an invisible window, the invisible window's transparency being greater than a threshold, the invisible window being used to receive the drag event;
    and before the sending of the drag data to the second terminal, the method further comprises:
    the first terminal obtaining the drag event content from the drag event received by the invisible window; and
    the first terminal obtaining the bitmap of the shadow.
  20. The method according to any one of claims 11-15, wherein the object is an application icon; or the object is a window, the window comprising an application interface;
    when the object is the application icon, the drag data comprises the application icon; when the object is the window, the drag data comprises the application interface.
  21. A cross-device object drag apparatus, comprising: a processor; and a memory configured to store instructions executable by the processor;
    wherein the processor is configured to, when executing the instructions, implement the method according to any one of claims 1-10, or implement the method according to any one of claims 11-20.
  22. A computer-readable storage medium having computer program instructions stored thereon, wherein when the computer program instructions are executed by a processor, the method according to any one of claims 1-10 is implemented, or the method according to any one of claims 11-20 is implemented.
  23. A cross-device object drag system, comprising: a first terminal and a second terminal, the first terminal being connected to the second terminal;
    the first terminal being configured to: display a first cursor on an object displayed by the first terminal; receive a drag operation input by a user with the first terminal's input device, the drag operation being used to initiate a drag for the object; in response to the drag operation, display on the first terminal's display an animation of the object moving with the first cursor; and, upon determining that the object has been dragged out of the edge of the first terminal's display, send drag data to the second terminal; and
    the second terminal being configured to: display, based on the drag data, the object dragged over from the first terminal on the second terminal's display; display a second cursor on the object; receive a move operation input by the user with the first terminal's input device; and display, based on the move operation, an animation of the object moving with the second cursor on the second terminal's display.
  24. The system according to claim 23, wherein
    the first terminal is further configured to send shuttle state information to the second terminal after determining that the object has been dragged out of the edge of the first terminal's display, the shuttle state information indicating that a shuttle has started.
  25. The system according to claim 23 or 24, wherein
    the first terminal is further configured to hide the first cursor and the object after determining that the object has been dragged out of the edge of the first terminal's display.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010747173.5 2020-07-29
CN202010747173.5A CN114089901B 2020-07-29 2020-07-29 Cross-device object drag method and device

Publications (1)

Publication Number Publication Date
WO2022022490A1 true WO2022022490A1 (zh) 2022-02-03

Family

ID=80037584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108579 WO2022022490A1 (zh) 2020-07-29 2021-07-27 一种跨设备的对象拖拽方法及设备

Country Status (4)

Country Link
US (1) US20230229300A1 (zh)
EP (1) EP4180932A4 (zh)
CN (3) CN117891369A (zh)
WO (1) WO2022022490A1 (zh)


Also Published As

Publication number Publication date
CN114089901A (zh) 2022-02-25
CN114089901B (zh) 2023-11-24
CN117891369A (zh) 2024-04-16
EP4180932A1 (en) 2023-05-17
US20230229300A1 (en) 2023-07-20
EP4180932A4 (en) 2023-11-29
CN117971104A (zh) 2024-05-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21850017; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021850017; Country of ref document: EP; Effective date: 20230207)
NENP Non-entry into the national phase (Ref country code: DE)