WO2024001813A1 - Input control method, electronic device and system - Google Patents

Input control method, electronic device and system

Info

Publication number
WO2024001813A1
WO2024001813A1 (PCT/CN2023/100503)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
target
display
user
mouse pointer
Prior art date
Application number
PCT/CN2023/100503
Other languages
English (en)
French (fr)
Inventor
卞苏成
刘丰恺
杨婉艺
丁宁
唐彦
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2024001813A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/023 — Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/038 — Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/04817 — Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0486 — Drag-and-drop
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 — Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • Embodiments of the present application relate to the field of terminal technology, and in particular, to an input control method, electronic device and system.
  • Multi-screen collaboration can be achieved between electronic devices. Controlling multiple electronic devices through one electronic device not only expands the display space, but also allows content on one electronic device to be quickly transferred to and shown on other electronic devices. Generally, during multi-screen collaboration, multiple electronic devices can share one set of keyboard and mouse.
  • a multi-screen collaborative connection is established between a computer 101, a laptop 102 and a tablet 103.
  • the keyboard and mouse of the computer 101 can control the computer 101, the laptop 102 and the tablet 103.
  • The computer 101 can obtain the positional relationship of the three electronic devices. In this way, after the computer 101 detects the user's operation of moving the mouse pointer rightward to the edge of its display screen, it can determine that the user has instructed the mouse pointer to move to the laptop 102, and can send the keyboard and mouse information to the laptop 102, so that a mouse pointer controllable by the computer 101 is displayed on the screen of the laptop 102.
  • After the computer 101 detects the user's operation of moving the mouse pointer rightward to the edge of the display screen of the laptop 102, it can determine that the user has instructed the mouse pointer to move to the tablet 103, and can send the keyboard and mouse information to the tablet 103, so that a mouse pointer controllable by the computer 101 is displayed on the display screen of the tablet 103. Through the above process, the mouse pointer can be moved from the display screen of the computer 101 to the display screen of the tablet 103, so that the user can control the tablet 103 through the mouse of the computer 101.
  • this application provides an input control method, electronic device and system.
  • In response to the user's operation of selecting a target device view on the display screen of the first electronic device, the first electronic device can directly switch the mouse pointer to the target second electronic device corresponding to the target device view, thereby simplifying user operations and improving the user experience.
  • A first aspect provides an input control method.
  • the method includes: the first electronic device receives a first operation of moving a mouse pointer across devices as instructed by a user, and displays a device view corresponding to at least one second electronic device that establishes a communication connection with the first electronic device.
  • the first electronic device detects the user's second operation of selecting the target device view in the device view, and determines the target second electronic device corresponding to the target device view.
  • At least one second electronic device includes the target second electronic device.
  • the first electronic device sends a display instruction to the target second electronic device, where the display instruction is used to instruct the target second electronic device to display a mouse pointer.
  • By displaying the device view, the first electronic device helps the user quickly move the mouse pointer to the target second electronic device. The user does not need to move the mouse pointer across devices along a long path, which avoids mouse pointer movement failures caused by user errors and improves interaction efficiency.
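The first-aspect flow (show device views on the first operation, then send a display instruction on the second operation) can be sketched as follows. This is a minimal illustration only; the class and message names (`DeviceView`, `DisplayInstruction`, `SourceDevice`) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class DeviceView:
    """Thumbnail shown on the source device for one connected device."""
    device_id: str
    device_name: str


@dataclass
class DisplayInstruction:
    """Message telling the target device to display the mouse pointer."""
    source_id: str
    show_pointer: bool = True


class SourceDevice:
    """Source-side ('first electronic device') sketch of the flow."""

    def __init__(self, device_id, connected):
        self.device_id = device_id
        self.connected = connected   # list of DeviceView for connected devices
        self.sent = []               # (target_id, DisplayInstruction) pairs
        self.views_visible = False

    def on_first_operation(self):
        """User instructs a cross-device pointer move: show the device views."""
        self.views_visible = True
        return self.connected

    def on_second_operation(self, target_view):
        """User selects a target device view: send the display instruction."""
        instruction = DisplayInstruction(source_id=self.device_id)
        self.sent.append((target_view.device_id, instruction))
        self.views_visible = False   # the pointer now lives on the target
        return instruction
```

The second operation replaces the edge-to-edge traversal of the background example: one selection, one instruction, regardless of how many devices sit between source and target.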
  • Before the first electronic device receives the first operation in which the user instructs to move the mouse pointer across devices, the method further includes: the first electronic device detects a third operation in which the user selects a target object.
  • the target object includes one or more of files, windows, icons, text, text selected by the user, documents, pictures, and videos.
  • the first electronic device can realize cross-device transmission of multiple types of target objects selected by the user, thereby improving the user experience.
  • The display instruction is also used to instruct the target second electronic device to display the target object in the selected state. The method further includes: the first electronic device detects a fourth operation in which the user moves the mouse pointer, and sends first information about the dragged target object to the target second electronic device, where the first information is used by the target second electronic device to display the dragged target object. The first information may include information about the target object during movement (e.g., position information).
  • By displaying the device view, the source device (such as the first electronic device) helps the user quickly move the target object to the target device (such as the target second electronic device), and the target object can then continue to be moved on the target device.
  • the user does not need to move the target object across devices through a long path, thus avoiding failure to move the target object caused by user errors.
  • the interaction efficiency of moving target objects between multiple devices is improved.
  • the complexity of moving the target object will not increase due to the increase in the number of electronic devices, thus improving the user experience.
  • The method further includes: the first electronic device detects the user's drag release operation and sends a drag completion instruction to the target second electronic device, where the drag completion instruction is used to instruct the target second electronic device to display the target object according to the displayed content. In this way, the first electronic device can instruct the target second electronic device to stop moving the target object selected by the user, so that the target object transmitted across devices is displayed at the location the user requires, meeting user needs, simplifying user operations, and improving the user experience.
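The drag half of the protocol (the stream of "first information" updates followed by a drag completion instruction) can be sketched on the target side as follows. The message and class names are hypothetical, chosen only to mirror the terms used above.

```python
from dataclasses import dataclass


@dataclass
class DragUpdate:
    """'First information' about the dragged object (e.g. its position)."""
    object_id: str
    x: int
    y: int


@dataclass
class DragComplete:
    """Instruction to drop the object at its current position."""
    object_id: str


class TargetDevice:
    """Target-side ('target second electronic device') drag handling, as a sketch."""

    def __init__(self):
        self.dragging = {}   # object_id -> (x, y) while being dragged
        self.dropped = {}    # object_id -> final (x, y) after release

    def on_drag_update(self, msg):
        """Each update redraws the dragged object at the reported position."""
        self.dragging[msg.object_id] = (msg.x, msg.y)

    def on_drag_complete(self, msg):
        """Stop moving the object and display it where the drag ended."""
        self.dropped[msg.object_id] = self.dragging.pop(msg.object_id)


target = TargetDevice()
target.on_drag_update(DragUpdate("file1", 10, 20))
target.on_drag_update(DragUpdate("file1", 120, 80))
target.on_drag_complete(DragComplete("file1"))
```

The last position reported before release is where the object is finally displayed, matching the "display the target object according to the displayed content" behavior described above.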
  • The method further includes: the first electronic device displays multiple virtual desktop views corresponding to multiple virtual desktops of the target second electronic device.
  • the first electronic device detects the fifth operation of the user selecting a target virtual desktop view among the plurality of virtual desktop views, and determines the target virtual desktop of the target second electronic device corresponding to the target virtual desktop view, and the plurality of virtual desktops include the target virtual desktop.
  • the display instruction is used to instruct the target second electronic device to display the mouse pointer on the target virtual desktop.
  • By displaying the device view and the virtual desktop views, the source device (such as the first electronic device) can also help the user quickly move the mouse pointer (or the mouse pointer and the target object) to the target virtual desktop of the target device (such as the target second electronic device), and continue moving the mouse pointer (or the mouse pointer and the target object) on that target virtual desktop. The user does not need to move the mouse pointer (or the mouse pointer and the target object) across devices along a long path, thereby avoiding movement failures caused by user errors. The interaction efficiency of moving the mouse pointer (or the mouse pointer and the target object) between multiple devices is improved.
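One way to carry the virtual-desktop choice is to extend the display instruction with a target desktop identifier, resolved on the target side. Again, this is only an illustrative sketch; the names (`DisplayInstruction`, `resolve_desktop`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DisplayInstruction:
    """Display instruction extended with an optional target virtual desktop."""
    source_id: str
    target_desktop: Optional[int] = None   # None means: use the current desktop


def resolve_desktop(instruction, desktops, current):
    """Decide on which virtual desktop the target device shows the pointer.

    `desktops` is the set of desktop ids configured on the target device and
    `current` is the desktop currently in the foreground.
    """
    if instruction.target_desktop is None:
        return current
    if instruction.target_desktop not in desktops:
        raise ValueError("unknown virtual desktop")
    return instruction.target_desktop
```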
  • The first electronic device detecting the second operation in which the user selects the target device view among the device views, and determining the target second electronic device corresponding to the target device view, includes: the first electronic device detects the user's operation of selecting a target window preview displayed in the target device view, determines the target second electronic device corresponding to the target device view, and the display instruction is used to instruct the target second electronic device to display the mouse pointer in a first window corresponding to the target window preview.
  • The target second electronic device can run multiple windows, and by displaying previews of those windows the first electronic device can help the user move the mouse pointer (or the mouse pointer and the target object) directly to the desired window of the target second electronic device. This simplifies user operations, improves interaction efficiency, and enhances the user experience.
  • the method further includes: the first electronic device obtains device information of the second electronic device in a preset manner.
  • the preset method includes obtaining device information according to a preset period, or in response to the first operation, requesting the second electronic device to obtain device information.
  • the device information is used to generate the device view.
  • The device information includes one or more of the following: a desktop screenshot, a device identifier, a screenshot of the currently displayed interface, a device name, a user-edited device nickname, a system account, and a picture used to identify the electronic device.
  • After the first electronic device obtains the device information of the second electronic devices in various ways, it can display device views identifying the different second electronic devices in various forms, helping the user distinguish them. This further simplifies user operations and improves the accuracy of moving the mouse pointer (or the mouse pointer and the target object) across devices.
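The two acquisition modes described above (periodic refresh, or an on-demand request triggered by the first operation) can be sketched as a small cache on the source side. The field names follow the list in the patent; the `DeviceInfoCache` class and `fetch` callback are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class DeviceInfo:
    """Device information used to build a device view (fields per the patent)."""
    device_id: str
    device_name: str
    nickname: str = ""
    desktop_screenshot: bytes = b""
    current_interface_screenshot: bytes = b""


class DeviceInfoCache:
    """Source-side cache of device information for connected devices."""

    def __init__(self, fetch, period_s=5.0):
        self.fetch = fetch       # callable: device_id -> DeviceInfo
        self.period_s = period_s  # a real implementation would run refresh() on this timer
        self.cache = {}

    def refresh(self, device_ids):
        """Periodic path: pull fresh info for every known device."""
        for dev in device_ids:
            self.cache[dev] = self.fetch(dev)

    def get(self, device_id):
        """On-demand path: fetch only if the cache misses (e.g. on the first operation)."""
        if device_id not in self.cache:
            self.cache[device_id] = self.fetch(device_id)
        return self.cache[device_id]
```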
  • the method further includes: the first electronic device hides the mouse pointer.
  • the first electronic device hides the mouse pointer, which can also be understood as the mouse pointer on the first electronic device disappears, or the first electronic device does not display the mouse pointer.
  • After the first electronic device determines that it has finished moving the mouse pointer across devices, it can hide the mouse pointer on its own side, helping the user confirm that the mouse pointer can currently be operated on the target second electronic device and avoiding confusion.
  • The display instruction carries displacement information of the mouse pointer, and the displacement information is used to instruct the target second electronic device to display the mouse pointer at a corresponding display position. The display position includes any one of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, or a position corresponding to the second operation on the target device view.
  • The target second electronic device displays the cross-device mouse pointer at the corresponding display position, which makes it easy for the user to quickly find the mouse pointer on the display screen of the target second electronic device and improves the user experience.
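Since source and target screens generally differ in resolution, one plausible way to apply the displacement information is to express it in normalized coordinates and map it to target pixels on arrival. This is an assumption for illustration; the patent does not specify the coordinate encoding.

```python
def pointer_display_position(displacement, target_w, target_h):
    """Map a normalized source position (0..1 in each axis) to target pixels.

    `displacement` is assumed to be the normalized position carried in the
    display instruction (e.g. where the source hid the pointer). Values
    outside 0..1 are clamped so the pointer always lands on-screen.
    """
    nx, ny = displacement
    x = min(max(round(nx * (target_w - 1)), 0), target_w - 1)
    y = min(max(round(ny * (target_h - 1)), 0), target_h - 1)
    return x, y
```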
  • The first operation includes one or more of the following: an operation on a first button of the first electronic device, an operation of moving the mouse pointer to a preset area of the display screen, or an operation on a preset icon displayed on the display screen.
  • An input control method includes: the second electronic device receives a display instruction sent by the first electronic device; the display instruction is an instruction that the first electronic device, in response to the user's second operation of selecting a target device view among the displayed device views corresponding to at least one electronic device that has established a communication connection with the first electronic device, sends to the second electronic device corresponding to the target device view; the at least one electronic device includes the second electronic device.
  • the second electronic device displays the mouse pointer corresponding to the input device of the first electronic device according to the display instruction.
  • the method further includes: the second electronic device displays the target object in the selected state according to the display instruction; the target object is the object selected by the first electronic device in response to the user's third operation before sending the display instruction.
  • the second electronic device receives the first information sent by the first electronic device, and the first information is information determined by the first electronic device in response to the fourth operation of the user moving the mouse pointer.
  • the second electronic device displays the dragged target object according to the first information.
  • the method further includes: the second electronic device receives the drag completion instruction sent by the first electronic device. In response to the drag completion instruction, the second electronic device displays the target object according to the display content.
  • the target object includes one or more of files, windows, icons, text, user-selected text, documents, pictures, and videos.
  • the method further includes: the second electronic device determines multiple running windows. In response to the display instruction, the second electronic device displays multiple preview windows corresponding to the multiple windows.
  • the second electronic device receives the second information sent by the first electronic device; the second information is information determined by the first electronic device after detecting that the user moves the mouse pointer. According to the second information, the second electronic device displays the target object in the drag state in the second window corresponding to the first preview window in the plurality of preview windows, and the plurality of windows includes the second window.
  • the second information may include information about the target object during movement (eg, location information, etc.).
  • In this way, the user can quickly move the mouse pointer (or the mouse pointer and the target object) to the desired second window for display, which effectively simplifies user operations, improves interaction efficiency, and enhances the user experience.
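On the target side, the window-preview behavior amounts to enumerating the running windows, exposing previews, and resolving the user's preview choice back to the window that should receive the pointer. A minimal sketch, with hypothetical names:

```python
class WindowManager:
    """Target-side sketch: map window previews back to running windows."""

    def __init__(self, windows):
        self.windows = list(windows)   # window titles stand in for real windows

    def previews(self):
        """Preview list sent to the source device for display in the device view."""
        return [(i, title) for i, title in enumerate(self.windows)]

    def window_for_preview(self, preview_id):
        """Resolve the selected preview to the window that should receive the
        pointer (and any dragged target object)."""
        return self.windows[preview_id]
```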
  • The second electronic device is configured with multiple virtual desktops, and the display instruction is used to instruct the second electronic device to display the mouse pointer on a target virtual desktop among the multiple virtual desktops; the target virtual desktop is a virtual desktop determined in response to the user's fifth operation after the first electronic device displays multiple virtual desktop views corresponding to the multiple virtual desktops.
  • The method further includes: the second electronic device sends its device information to the first electronic device in a preset manner; the preset manner includes sending the device information according to a preset period, or sending the device information in response to a request from the first electronic device. The device information is used by the first electronic device to display the device view corresponding to the second electronic device, and includes one or more of the following: a desktop screenshot, a screenshot of the currently displayed interface, a device identifier, a device name, a user-edited device nickname, a system account, and a picture used to identify the electronic device.
  • The display instruction carries displacement information of the mouse pointer, and the second electronic device displaying the mouse pointer corresponding to the input device of the first electronic device according to the display instruction includes: the second electronic device displays the mouse pointer at the corresponding display position determined according to the displacement information. The corresponding display position includes any of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, or a position corresponding to the second operation on the target device view.
  • a third aspect provides an electronic device.
  • the electronic device includes: a processor, a display screen and a memory.
  • the memory and the display screen are coupled to the processor.
  • the memory is used to store computer program codes.
  • the computer program codes include computer instructions.
  • The electronic device is caused to perform: the first electronic device receives a first operation in which the user instructs to move the mouse pointer across devices, and displays a device view corresponding to at least one second electronic device that has established a communication connection with the first electronic device.
  • the first electronic device detects the user's second operation of selecting the target device view in the device view, and determines the target second electronic device corresponding to the target device view.
  • the at least one second electronic device includes the target second electronic device.
  • the first electronic device sends a display instruction to the target second electronic device, where the display instruction is used to instruct the target second electronic device to display a mouse pointer.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device detects a third operation in which the user selects the target object.
  • the display instruction is also used to instruct the target second electronic device to display the target object in the selected state.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device detects the fourth operation in which the user moves the mouse pointer, and sends first information about the dragged target object to the target second electronic device, where the first information is used by the target second electronic device to display the dragged target object.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device detects the user's drag release operation and sends a drag completion instruction to the target second electronic device, where the drag completion instruction is used to instruct the target second electronic device to display the target object according to the displayed content.
  • the target object includes one or more of files, windows, icons, text, text selected by the user, documents, pictures, and videos.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device displays multiple virtual desktop views corresponding to multiple virtual desktops of the target second electronic device.
  • the first electronic device detects the fifth operation of the user selecting a target virtual desktop view among the plurality of virtual desktop views, and determines the target virtual desktop of the target second electronic device corresponding to the target virtual desktop view, and the plurality of virtual desktops include the target virtual desktop.
  • the display instruction is used to instruct the target second electronic device to display the mouse pointer on the target virtual desktop.
  • The first electronic device detecting the second operation in which the user selects the target device view among the device views, and determining the target second electronic device corresponding to the target device view, includes: the first electronic device detects the user's operation of selecting a target window preview displayed in the target device view, determines the target second electronic device corresponding to the target device view, and the display instruction is used to instruct the target second electronic device to display the mouse pointer in a first window corresponding to the target window preview.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device acquires device information of the second electronic device in a preset manner; the preset manner includes acquiring the device information according to a preset period, or requesting the device information from the second electronic device in response to the first operation.
  • the device information is used to generate the device view.
  • The device information includes one or more of the following: a desktop screenshot, a device identifier, a screenshot of the currently displayed interface, a device name, a user-edited device nickname, a system account, and a picture used to identify the electronic device.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the first electronic device hides the mouse pointer.
  • The display instruction carries displacement information of the mouse pointer, and the displacement information is used to instruct the target second electronic device to display the mouse pointer at a corresponding display position. The display position includes any one of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, or a position corresponding to the second operation on the target device view.
  • The first operation includes one or more of the following: an operation on the first button of the first electronic device, an operation of moving the mouse pointer to the preset area of the display screen, or an operation on a preset icon displayed on the display screen.
  • a fourth aspect provides an electronic device.
  • the electronic device includes: a processor, a display screen and a memory.
  • the memory and the display screen are coupled to the processor.
  • the memory is used to store computer program codes.
  • the computer program codes include computer instructions.
  • The processor reads the computer instructions from the memory, causing the electronic device to execute: the second electronic device receives a display instruction sent by the first electronic device; the display instruction is an instruction that the first electronic device, in response to the user's second operation of selecting a target device view among the displayed device views corresponding to at least one electronic device that has established a communication connection with the first electronic device, sends to the second electronic device corresponding to the target device view; the at least one electronic device includes the second electronic device.
  • the second electronic device displays the mouse pointer corresponding to the input device of the first electronic device according to the display instruction.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the second electronic device displays the target object in the selected state according to the display instruction; the target object is an object selected by the first electronic device in response to the user's third operation before the display instruction was sent.
  • the second electronic device receives the first information sent by the first electronic device, and the first information is information determined by the first electronic device in response to the fourth operation of the user moving the mouse pointer.
  • the second electronic device displays the dragged target object according to the first information.
  • When the processor reads the computer-readable instructions from the memory, the electronic device is further caused to perform: the second electronic device receives the drag completion instruction sent by the first electronic device, and in response to the drag completion instruction, displays the target object according to the displayed content.
  • the target object includes one or more of files, windows, icons, text, user-selected text, documents, pictures, and videos.
  • The second electronic device is configured with multiple virtual desktops, and the display instruction is used to instruct the second electronic device to display the mouse pointer on a target virtual desktop among the multiple virtual desktops; the target virtual desktop is a virtual desktop determined in response to the user's fifth operation after the first electronic device displays multiple virtual desktop views corresponding to the multiple virtual desktops.
  • the processor, when it reads the computer-readable instructions from the memory, also causes the electronic device to perform the following operations: the second electronic device sends device information of the second electronic device to the first electronic device in a preset manner; the preset manner includes sending the device information according to a preset period, or sending the device information in response to a request from the first electronic device; the device information is used by the first electronic device to display the device view corresponding to the second electronic device.
  • the device information includes one or more of the following: a desktop screenshot, a screenshot of the currently displayed interface, a device identifier, a device name, a device nickname edited by the user, a system account, and a picture used to identify the device.
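The device-information exchange above can be sketched as follows. This is an illustrative assumption, not the patent's actual data format: the `DeviceInfo` fields mirror the items listed in the embodiment, and `should_send` captures the two "preset manners" (periodic sending, or sending in response to a request).

```python
# Hypothetical sketch of the device-information payload and the two preset
# sending manners described above; all names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceInfo:
    device_id: str                      # device identifier
    device_name: str                    # device name
    nickname: Optional[str] = None      # device nickname edited by the user
    system_account: Optional[str] = None
    desktop_screenshot: bytes = b""     # desktop screenshot
    interface_screenshot: bytes = b""   # currently displayed interface screenshot
    identifying_picture: bytes = b""    # picture used to identify the device

def should_send(now: float, last_sent: float, period: float,
                request_pending: bool) -> bool:
    """Send on a preset period, or in response to a pending request."""
    return request_pending or (now - last_sent) >= period
```

The first electronic device would use such a payload to render the device view; how the screenshots are compressed or transported is left open here.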
  • the display instruction carries the displacement information of the mouse pointer
  • the second electronic device displays the mouse pointer corresponding to the input device of the first electronic device according to the display instruction, including:
  • the second electronic device displays the mouse pointer at the determined corresponding display position according to the displacement information of the mouse pointer.
  • the corresponding display position includes any of the following: the position corresponding to where the first electronic device hid the mouse pointer, the position corresponding to the second operation on the display screen of the first electronic device, or the position corresponding to the second operation on the target device view.
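One way to realise the displacement-based display described above can be sketched as below. The starting position (for example, where the pointer was hidden on the source device, scaled to the target resolution) and the clamping policy are assumptions for illustration, not details fixed by the patent.

```python
# Illustrative sketch: turn the displacement carried in a display instruction
# into an on-screen pointer position on the target device.
def apply_displacement(start, delta, screen_size):
    """Add the displacement to a starting position, clamped to the display."""
    x = min(max(start[0] + delta[0], 0), screen_size[0] - 1)
    y = min(max(start[1] + delta[1], 0), screen_size[1] - 1)
    return (x, y)

def map_between_screens(pos, src_size, dst_size):
    """Map a source-screen position to the proportional position on the
    target screen (one way to realise a 'corresponding position')."""
    return (pos[0] * dst_size[0] // src_size[0],
            pos[1] * dst_size[1] // src_size[1])
```

For example, a pointer hidden at the centre of a 1920x1080 source display would map to the centre of a 2560x1440 target display before displacements are applied.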
  • embodiments of the present application provide an input control system, which includes a first electronic device and a second electronic device.
  • the first electronic device is configured to receive a first operation of the user's instruction to move the mouse pointer across devices, and display a device view corresponding to at least one second electronic device that establishes a communication connection with the first electronic device.
  • the first electronic device is also configured to detect the second operation of the user selecting the target device view in the device view, and determine the target second electronic device corresponding to the target device view, and the at least one second electronic device includes the target second electronic device.
  • the first electronic device is also configured to send a display instruction to the target second electronic device.
  • the target second electronic device in the second electronic device is configured to receive a display instruction sent by the first electronic device.
  • the target second electronic device is also used to display the mouse pointer according to the display instruction.
  • the first electronic device is further configured to detect the fourth operation of the user moving the mouse pointer, and send the first information for dragging the target object to the target second electronic device.
  • the target second electronic device is also configured to receive the first information sent by the first electronic device.
  • the target second electronic device is also used to display the dragged target object according to the first information.
  • the first electronic device is further configured to detect the user's drag and release operation, and send a drag completion instruction to the target second electronic device.
  • the target second electronic device is also configured to receive the drag completion instruction sent by the first electronic device.
  • the target second electronic device is also configured to display the target object according to the display content of the target second electronic device in response to the drag completion instruction.
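The drag flow in the bullets above — first information while the pointer moves, then a drag-completion instruction on release — can be sketched as a small receiver on the target device. The message shapes and handler names are assumptions for illustration only.

```python
# Minimal sketch of the target side of the drag protocol described above.
class DragTarget:
    def __init__(self):
        self.drag_pos = None   # where the dragged object is currently shown
        self.dropped = []      # objects displayed after drag completion

    def on_first_information(self, info):
        # display the dragged target object according to the first information
        self.drag_pos = (info["x"], info["y"])

    def on_drag_completion(self, target_object):
        # in response to the drag completion instruction, display the target
        # object according to the current display content
        self.dropped.append((target_object, self.drag_pos))
        self.drag_pos = None

t = DragTarget()
t.on_first_information({"x": 40, "y": 60})
t.on_drag_completion("picture.png")
```

On release, the object is committed at the last position reported in the first information; the source device would then restore its own pointer.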
  • the target object includes one or more of files, windows, icons, text, user-selected text, documents, pictures, and videos.
  • the target second electronic device is also used to determine multiple running windows.
  • the target second electronic device is also configured to display multiple preview windows corresponding to the multiple windows in response to the display instruction.
  • the first electronic device is also configured to detect the user's operation of moving the mouse pointer to the first preview window among the plurality of preview windows, and determine the second information.
  • the first electronic device is also used to send the second information to the target second electronic device.
  • the target second electronic device is further configured to display the target object in the drag state in a second window corresponding to the first preview window in the plurality of preview windows according to the second information, and the plurality of windows include the second window.
  • the first electronic device is further configured to display multiple virtual desktop views corresponding to multiple virtual desktops of the second electronic device.
  • the first electronic device is also configured to detect the fifth operation of the user selecting a target virtual desktop view among the plurality of virtual desktop views, and determine the target virtual desktop of the target second electronic device corresponding to the target virtual desktop view.
  • the plurality of virtual desktops include the target virtual desktop.
  • the display instruction is used to instruct the target second electronic device to display the mouse pointer on the target virtual desktop.
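A display instruction that targets a particular virtual desktop or window, as in the embodiments above, might be encoded like this. The field names are illustrative assumptions, not the patent's wire format.

```python
# Hedged sketch of a display instruction naming an optional target virtual
# desktop (chosen via the fifth operation) or target window (chosen via its
# preview); fields are assumptions for illustration.
def build_display_instruction(dx, dy, target_desktop=None, target_window=None):
    """Build an instruction telling the target device where to show the pointer."""
    msg = {"type": "display_pointer", "dx": dx, "dy": dy}
    if target_desktop is not None:
        msg["virtual_desktop"] = target_desktop
    if target_window is not None:
        msg["window"] = target_window
    return msg
```

When neither optional field is present, the target device would fall back to a default display position such as those listed earlier.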
  • the first electronic device is specifically configured to detect the user's operation of selecting a target window preview displayed in the target device view, and determine the target second electronic device corresponding to the target device view; the display instruction is used to instruct the target second electronic device to display the mouse pointer in the first window corresponding to the target window preview.
  • the first electronic device is further configured to obtain device information of the second electronic device in a preset manner.
  • the preset method includes obtaining device information according to a preset period, or in response to the first operation, requesting the second electronic device to obtain device information.
  • the device information is used to generate the device view.
  • the device information includes one or more of the following: a desktop screenshot, a device identifier, a screenshot of the currently displayed interface, a device name, a device nickname edited by the user, a system account, and a picture used to identify the electronic device.
  • the first electronic device is also used to hide the mouse pointer.
  • the display indication carries displacement information of the mouse pointer.
  • the target second electronic device is specifically configured to display the mouse pointer at a determined corresponding display position according to the displacement information of the mouse pointer.
  • the display position includes any of the following: the position corresponding to where the first electronic device hid the mouse pointer, the position corresponding to the second operation on the display screen of the first electronic device, or the position corresponding to the second operation on the target device view.
  • the first operation includes one or more of the following: an operation on a first button of the first electronic device, an operation of moving the mouse pointer to a preset area of the display screen, or an operation on a preset icon displayed on the display screen.
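Recognising the "move the pointer to a preset area" variant of the first operation could work as below. The 5-pixel edge margin is an assumed threshold, not a value taken from the patent.

```python
# Illustrative edge-trigger check for the first operation: the pointer
# entering a preset margin around the source display counts as the trigger.
EDGE_MARGIN = 5  # assumed threshold in pixels

def in_preset_edge_area(pos, screen_size, margin=EDGE_MARGIN):
    """Return True if the pointer position lies in the preset edge area."""
    x, y = pos
    w, h = screen_size
    return x < margin or y < margin or x >= w - margin or y >= h - margin
```

A real implementation would debounce this check so that ordinary pointer movement near the edge does not repeatedly open the device-view overlay.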
  • embodiments of the present application provide an electronic device, which has the function of implementing the input control method described in the above first aspect and any one of its possible implementations; or, the electronic device has the function of implementing the input control method described in the above second aspect and any one of its possible implementations.
  • This function can be implemented by hardware, or it can be implemented by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • a computer-readable storage medium stores a computer program (which may also be referred to as instructions or code).
  • when the computer program is executed by an electronic device, it causes the electronic device to perform the method of the first aspect or any one of the implementations of the first aspect; or, causes the electronic device to perform the method of the second aspect or any one of the implementations of the second aspect.
  • embodiments of the present application provide a computer program product.
  • when the computer program product is run on an electronic device, it causes the electronic device to execute the method of the first aspect or any one of the implementations of the first aspect; or, causes the electronic device to execute the method of the second aspect or any one of the implementations of the second aspect.
  • embodiments of the present application provide a circuit system.
  • the circuit system includes a processing circuit.
  • the processing circuit is configured to execute the method of the first aspect or any one of the implementations of the first aspect; or, the processing circuit is configured to execute the method of the second aspect or any one of the implementations of the second aspect.
  • embodiments of the present application provide a chip system, including at least one processor and at least one interface circuit.
  • the at least one interface circuit is used to perform transceiver functions and send instructions to at least one processor.
  • when the at least one processor executes the instructions, the at least one processor performs the method of the first aspect or any one implementation of the first aspect; or, the at least one processor performs the method of the second aspect or any one implementation of the second aspect.
  • Figure 1 is a schematic diagram of a scene of moving a mouse across devices according to an embodiment of the present application
  • Figure 2 is a schematic diagram of a communication system in which an input control method is applied according to an embodiment of the present application
  • Figure 3 is a schematic diagram of the hardware structure of a first electronic device or a second electronic device provided by an embodiment of the present application;
  • Figure 4 is a schematic flowchart 1 of the input control method provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram 1 of the interface provided by an embodiment of the present application.
  • Figure 6A is a schematic diagram 2 of the interface provided by an embodiment of the present application.
  • Figure 6B is a schematic diagram 3 of the interface provided by an embodiment of the present application.
  • Figure 6C is a schematic diagram 4 of the interface provided by an embodiment of the present application.
  • Figure 7A is a schematic diagram 5 of the interface provided by an embodiment of the present application.
  • Figure 7B is a schematic diagram 6 of the interface provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram 7 of the interface provided by an embodiment of the present application.
  • Figure 9 is a schematic flowchart 2 of the input control method provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram 8 of the interface provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram 9 of the interface provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram 10 of the interface provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram 11 of the interface provided by an embodiment of the present application.
  • Figure 14 is a schematic flowchart 3 of the input control method provided by an embodiment of the present application.
  • Figure 15 is a schematic diagram 12 of the interface provided by an embodiment of the present application.
  • Figure 16 is a schematic structural diagram of a first electronic device provided by an embodiment of the present application.
  • Figure 17 is a schematic structural diagram of a second electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a communication system in which the input control method provided by the embodiment of the present application is applied.
  • the communication system includes a first electronic device 100 and a second electronic device 200 .
  • the number of the second electronic devices 200 is one or more, such as the second electronic device 1, the second electronic device 2, etc. shown in FIG. 2 .
  • the first electronic device 100 may move a mouse pointer or a file to the second electronic device 200 in response to a user operation, and the second electronic device 200 may be controlled through the input device of the first electronic device 100 .
  • the second electronic device 200 can also move the mouse pointer or file to the first electronic device 100 in response to the user's operation, so that the first electronic device 100 can be controlled through the input device of the second electronic device 200 .
  • any second electronic device 200 among the plurality of second electronic devices 200 can also move the mouse pointer or a file to another second electronic device 200 in response to a user operation, so that the other second electronic device 200 can be controlled through the input device of that second electronic device 200; for example, the second electronic device 1 can move the mouse pointer or a file to the second electronic device 2 in response to a user operation.
  • the first electronic device 100 or the second electronic device 200 may be, for example, a mobile phone, a tablet computer, a notebook computer, a large-screen device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a wearable device, an artificial intelligence (AI) device, a vehicle, or another terminal device.
  • this application does not limit the specific type of the first electronic device 100 or the second electronic device 200, or the operating system installed on it.
  • the input device of the first electronic device 100 or the second electronic device 200 may be a mouse, a touch pad or a touch screen, etc.
  • a communication connection is established between the first electronic device 100 and the second electronic device 200 .
  • the communication connection may be a wired connection (such as a USB connection, etc.) or a wireless communication connection.
  • the wireless communication technology for establishing a wireless communication connection includes but is not limited to at least one of the following: Bluetooth (BT) (for example, classic Bluetooth or Bluetooth low energy (BLE)), wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), near field communication (NFC), ZigBee, frequency modulation (FM), infrared (IR), ultra-wideband (UWB) technology, etc.
  • the first electronic device 100 and the second electronic device 200 can also establish a communication connection through a third-party device in the local area network, such as a router, gateway, smart device controller, server, wireless access point ( access point, AP) equipment, etc.
  • both the first electronic device 100 and the second electronic device 200 support the proximity discovery function.
  • the first electronic device 100 and the second electronic device 200 can discover each other, and then establish a peer-to-peer (P2P) connection such as Wi-Fi. and/or wireless communication connections such as Bluetooth connections.
  • the user can operate the second electronic device 200 using the input device of the first electronic device 100 .
  • the first electronic device 100 and the second electronic device 200 establish a wireless communication connection through a local area network.
  • the first electronic device 100 and the second electronic device 200 are both connected to the same router.
  • the first electronic device 100 and the second electronic device 200 establish a wireless communication connection through a cellular network, the Internet, etc.
  • the second electronic device 200 accesses the Internet through a router, and the first electronic device 100 accesses the Internet through a cellular network; then, the first electronic device 100 and the second electronic device 200 establish a wireless communication connection.
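The proximity discovery described above — each device announcing itself so that peers on the same network can find it and establish a connection (for example, a Wi-Fi P2P or Bluetooth connection) — can be sketched schematically. The announcement format and the in-memory list standing in for the network are assumptions for illustration.

```python
# Schematic sketch of proximity discovery over a shared LAN; the transport
# (UDP broadcast, mDNS, BLE advertising, ...) is abstracted away.
def make_announcement(device_id, service="input-control"):
    """An announcement a device would broadcast to nearby peers."""
    return {"service": service, "device_id": device_id}

def discover(announcements, service="input-control"):
    """Return the ids of peers offering the wanted service."""
    return [a["device_id"] for a in announcements if a["service"] == service]

peers = discover([make_announcement("tablet-1"),
                  {"service": "printing", "device_id": "printer-9"},
                  make_announcement("pc-2")])
```

After discovery, the first electronic device would request device information from each peer to build the device views described earlier.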
  • the first electronic device 100 or the second electronic device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, a button 190, a motor 191, an input module 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the structure illustrated in this embodiment does not constitute a limitation on the first electronic device 100 or the second electronic device 200.
  • the first electronic device 100 or the second electronic device 200 may include more or fewer components than shown in the figures, combine some components, separate some components, or use a different component layout.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the first electronic device 100 when the first electronic device 100 is a PC, the first electronic device 100 may not include the mobile communication module 150 and the SIM card interface 195 .
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • processor 110 may include one or more interfaces.
  • interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to a touch sensor through an I2C interface, so that the processor 110 and the touch sensor communicate through an I2C bus interface to implement the touch function of the first electronic device 100 or the second electronic device 200 .
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the first electronic device 100 or the second electronic device 200 .
  • the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the first electronic device 100 or the second electronic device 200 .
  • the USB interface 130 is an interface that complies with USB standard specifications, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the first electronic device 100 or the second electronic device 200, to transmit data between the first electronic device 100 or the second electronic device 200 and peripheral devices, or to connect headphones to play audio through them. The interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is only a schematic explanation and does not constitute a structural limitation on the first electronic device 100 or the second electronic device 200 .
  • the first electronic device 100 or the second electronic device 200 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the first electronic device 100 or the second electronic device 200 . While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the first electronic device 100 or the second electronic device 200 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first electronic device 100 or the second electronic device 200 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G applied on the first electronic device 100 or the second electronic device 200 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through the audio device, or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 may provide wireless communication solutions applied on the first electronic device 100 or the second electronic device 200, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the first electronic device 100 or the second electronic device 200 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the first electronic device 100 or the second electronic device 200 can pass Wireless communications technology communicates with networks and other devices.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the first electronic device 100 or the second electronic device 200 implements the display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the first electronic device 100 or the second electronic device 200 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the input module 192 may include a mouse, a keyboard, a touch pad or a touch screen that can be used to implement keyboard and mouse functions, and the like.
  • the first electronic device 100 detects a user operation through the input module 192, determines that the user has instructed to send the mouse pointer or a selected file to another electronic device, and can display, through the display screen 194, the device views of one or more electronic devices that have established a connection with the first electronic device 100. The device view is used to indicate the corresponding electronic device. In response to the user's operation of selecting a device view through the input module 192, it is determined to send the mouse pointer or the selected file to the corresponding second electronic device 200.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the first electronic device 100 or the second electronic device 200 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the first electronic device 100 or the second electronic device 200 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 . The first electronic device 100 or the second electronic device 200 can play music, record, etc. through the audio module 170 .
  • the audio module 170 may include a speaker, a receiver, a microphone, a headphone interface, and an application processor to implement audio functions.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the touch sensor is also known as a "touch device".
  • the touch sensor can be disposed on the display screen 194; the touch sensor and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor is used to detect touch operations on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor may also be disposed on the surface of the first electronic device 100 or the second electronic device 200 at a location different from that of the display screen 194 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • touch operations for different applications can correspond to different vibration feedback effects.
  • for touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects.
  • Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also be customized.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the first electronic device 100 or the second electronic device 200 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the first electronic device 100 or the second electronic device 200 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • FIG. 4 is a schematic flowchart of an input control method provided by an embodiment of the present application. As shown in Figure 4, the method includes the following steps.
  • the first electronic device and the second electronic device establish a communication connection.
  • the first electronic device can establish a communication connection with the second electronic device in a preset manner.
  • the preset mode includes, for example, one or more modes among Bluetooth, Wi-Fi, NFC, Zigbee, infrared, UWB, etc.
  • the first electronic device can, in response to a user operation, search for a second electronic device located in the current Wi-Fi local area network and establish a communication connection with it.
  • the number of second electronic devices is one or more.
  • the device information of the second electronic device can be obtained in a preset synchronization manner.
  • the device information is used to generate a device view corresponding to the electronic device, and can be used to distinguish different second electronic devices.
  • the device information includes, for example, any one or more of the electronic device's desktop screenshot, device identifier (identity, ID), device name, user-edited device nickname, system account, a picture used to identify the electronic device, etc.
  • the first electronic device obtains desktop screenshots of the second electronic device according to a preset period.
  • the first electronic device can display the device view of the electronic device according to the obtained desktop screenshot, so that the user can distinguish and determine the electronic device corresponding to each device view according to the display content of the device view.
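The periodic desktop-screenshot synchronization described above can be sketched as follows. This is an illustrative Python sketch, not part of the embodiments; `DeviceInfo`, `refresh_screenshot`, and the `fetch` callback are all assumed names.

```python
import time
from dataclasses import dataclass

@dataclass
class DeviceInfo:
    """Device information synced from a connected second electronic device.
    Field names are illustrative; the description only lists the kinds of
    information (desktop screenshot, device ID, device name, nickname, etc.)."""
    device_id: str
    device_name: str = ""
    nickname: str = ""
    desktop_screenshot: bytes = b""
    fetched_at: float = 0.0

def refresh_screenshot(info: DeviceInfo, fetch, period: float, now=None) -> bool:
    """Re-fetch the desktop screenshot only when the preset period has elapsed."""
    now = time.time() if now is None else now
    if now - info.fetched_at < period:
        return False          # still within the current period: keep the cached shot
    info.desktop_screenshot = fetch(info.device_id)
    info.fetched_at = now
    return True
```

With a 5-second period, a call 2 seconds after the last fetch keeps the cached screenshot; a call after the period refreshes it.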
  • after detecting the user's operation instructing to move the mouse pointer to another electronic device, the first electronic device displays a device view of each electronic device that has established a communication connection.
  • the operation by which the user instructs moving the mouse pointer to another electronic device includes, for example, an operation on a special key (such as the middle mouse button, a keyboard key combination, etc.), an operation of moving the mouse pointer to a preset area of the display screen (such as the upper right corner of the display screen, an edge area of the display screen, etc.), an operation on a preset icon displayed on the display screen, etc.
  • the first electronic device may display a device view of at least one electronic device that has established a communication connection with the first electronic device.
  • the first electronic device 100 detects the user's operation of moving the mouse pointer to the upper right corner of the display screen, and determines that this operation instructs moving the mouse pointer to another electronic device. The first electronic device 100 may then display a device view of each electronic device with which a connection has been established.
  • the first electronic device may perform the above step S402 to obtain the device information of the second electronic device after detecting the user's instruction to move the mouse pointer to another electronic device. Afterwards, the first electronic device displays the device view of the second electronic device based on the acquired device information of the second electronic device. That is, the embodiment of the present application does not limit the execution order of step S402 and step S403.
  • after detecting the user's operation instructing to move the mouse pointer to another electronic device, the first electronic device obtains a desktop screenshot of each electronic device that has established a communication connection, and then displays the device views based on the obtained desktop screenshots. This ensures that the device view displayed by the first electronic device corresponds to the latest desktop screenshot of the second electronic device, and thus ensures the accuracy with which the user identifies the second electronic device.
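The trigger condition above (moving the mouse pointer to a preset area such as the upper right corner or an edge area of the display screen) can be sketched as a simple hit test. Illustrative only; the zone sizes `corner` and `edge` are assumptions, not values from the description.

```python
def in_trigger_zone(x: int, y: int, screen_w: int, screen_h: int,
                    corner: int = 16, edge: int = 2) -> bool:
    """Return True when the pointer enters a preset trigger area.

    Two illustrative zones from the description: the upper right corner
    (a small corner square) and the screen edges (a thin border band)."""
    if x >= screen_w - corner and y <= corner:   # upper right corner zone
        return True
    if x <= edge or x >= screen_w - 1 - edge or y <= edge or y >= screen_h - 1 - edge:
        return True                              # edge band of the display
    return False
```

When this predicate fires, the first electronic device would proceed to display the device views of the connected electronic devices.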
  • the layout method of the device view includes, for example, plane layout, linear layout, conceptual layout, and other methods.
  • in a planar layout, the first electronic device 100 displays the device view of each electronic device according to the relative positional relationship between the first electronic device and each electronic device that has established a communication connection with it.
  • the displayed device view may include a device view of the first electronic device.
  • the device view indicated by reference numeral 61 is a device view of the first electronic device.
  • the first electronic device can display the device view indicated by reference numeral 61 according to the positional relationship between itself and each electronic device that has established a communication connection with it, and display the device view of each such electronic device around it.
  • in response to a user operation, the first electronic device 100 displays the device view of each electronic device in a linear layout, ordered by the time sequence in which each electronic device established a communication connection with the first electronic device, or by the frequency with which the input device of the first electronic device (such as a mouse, touch pad, etc.) controls each electronic device, or by the time at which the input device of the first electronic device last controlled each electronic device, or by another priority order.
  • the first electronic device displays the device view of each electronic device in order of decreasing priority from left to right, based on the time sequence of establishing a connection or the frequency of use or the latest control time.
  • the electronic device corresponding to the device view indicated by reference numeral 62 is the electronic device that first established a communication connection with the first electronic device, or the electronic device most frequently controlled by the input device of the first electronic device, or the electronic device that the input device of the first electronic device last controlled.
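The linear-layout orderings above (connection time, control frequency, last control time) can be sketched as a small sort. Illustrative Python; the record keys are assumptions.

```python
def linear_layout_order(devices, key: str):
    """Order device views left to right in decreasing priority.

    `devices` is a list of dicts with illustrative keys:
      connected_at  - when the communication connection was established
      use_count     - how often the first device's input device controlled it
      last_control  - when it was last controlled
    """
    if key == "connected_at":          # earliest connection first
        return sorted(devices, key=lambda d: d["connected_at"])
    if key == "use_count":             # most frequently controlled first
        return sorted(devices, key=lambda d: -d["use_count"])
    if key == "last_control":          # most recently controlled first
        return sorted(devices, key=lambda d: -d["last_control"])
    raise ValueError(f"unknown ordering key: {key}")
```

The first element of the returned list would correspond to the leftmost (highest-priority) device view, like the one indicated by reference numeral 62.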
  • the first electronic device 100 displays a device view corresponding to each electronic device according to a conceptual layout.
  • the central device view indicated by reference numeral 63 is the device view of the first electronic device, and the device views of the other electronic devices are displayed around it according to product characteristics of those electronic devices. For example, the closer a device view is to the central device view indicated by reference numeral 63, the larger the display screen of the corresponding electronic device, etc.
  • the first electronic device can generate a corresponding device view based on the device information obtained in step S402.
  • the device view is used to help the user distinguish different electronic devices.
  • the display content of the device view includes any one or more of desktop screenshots, currently displayed interface screenshots, device IDs, device names, user-edited device nicknames, system accounts, pictures used to identify electronic devices, etc.
  • the device information obtained by the first electronic device 100 includes a desktop screenshot of the electronic device. Then, in response to the user's instruction to move the mouse pointer to other electronic devices, the first electronic device 100 displays a device view including a screenshot of the desktop of the electronic device.
  • the desktop screenshot of an electronic device includes at least one piece of information from among the desktop background set by the user, desktop icons, windows open on the desktop, etc., and the first electronic device can obtain the latest desktop screenshot of the electronic device according to a preset period. Therefore, the user can distinguish different electronic devices based on the desktop screenshots, thereby determining the electronic device to which the mouse pointer needs to be moved.
  • the device information obtained by the first electronic device 100 includes a device nickname edited by the user of the electronic device. Then, in response to the user's instruction to move the mouse pointer to the other electronic device, the first electronic device 100 displays a device view including the device nickname edited by the user. Then, the user can distinguish different electronic devices corresponding to the device view according to the device nickname set by the user, thereby determining the electronic device to which the mouse pointer needs to be moved.
  • the first electronic device detects the user's operation of selecting the target device view, and determines the second electronic device corresponding to the target device view.
  • the user's operation of selecting the target device view includes, for example, the user's operation of clicking the target device view with the mouse, the user's operation of moving the mouse pointer to the target device view and staying there for more than a preset time, etc.
  • the user can quickly select the desired second electronic device among at least one electronic device that establishes a communication connection with the first electronic device by selecting the target device view from the device view displayed on the first electronic device.
  • the first electronic device can highlight the device view to facilitate the user to confirm whether the selected electronic device is correct.
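The two selection operations described above (clicking a target device view, or moving the mouse pointer onto it and staying longer than a preset time) can be sketched as one predicate. Illustrative; the event dictionary shape and `dwell_threshold` are assumptions.

```python
def target_view_selected(event: dict, dwell_threshold: float = 1.0) -> bool:
    """Decide whether the user has selected a target device view.

    Two illustrative operations from the description: clicking the view
    with the mouse, or hovering over it for longer than a preset time."""
    if event.get("type") == "click" and event.get("on_view"):
        return True
    if (event.get("type") == "hover" and event.get("on_view")
            and event.get("dwell", 0.0) >= dwell_threshold):
        return True
    return False
```

Once this returns True, the first electronic device could highlight the view and prompt the user to confirm the selected electronic device.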
  • the first electronic device may send a prompt instruction to the second electronic device after determining the second electronic device corresponding to the target device view.
  • the second electronic device can also prompt the user to confirm whether to move the mouse pointer to the second electronic device for display, through status-change prompts such as displaying a prompt bar around the edge of the display screen, a reminder through the audio module, displaying prompt information, etc.
  • the first electronic device 100 displays device views including desktop screenshots, such as the device view indicated by reference numeral 82.
  • the first electronic device 100 can determine that the user chooses to move the mouse pointer 81 to the second electronic device 200 corresponding to the device view indicated by reference numeral 82.
  • the first electronic device 100 prompts the user to confirm the selection through a status change of the device view indicated by reference numeral 82, and the second electronic device 200, in response to the prompt indication sent by the first electronic device 100, displays a prompt bar around the edge of its display screen to prompt the user to confirm the selection.
  • the first electronic device hides the mouse pointer.
  • the first electronic device may hide the mouse pointer on its own side. Hiding the mouse pointer on the first electronic device can also be understood as the mouse pointer on the first electronic device disappearing, or the mouse pointer no longer being displayed on the first electronic device.
  • the first electronic device sends mouse information to the second electronic device.
  • the mouse information includes, for example, mouse displacement information and other information.
  • in response to the user's instruction to move the mouse pointer to the second electronic device, the first electronic device determines the current mouse information and sends the mouse information to the second electronic device.
  • correspondingly, the second electronic device receives the mouse information sent by the first electronic device.
  • the embodiment of the present application does not limit the execution order of step S405 and step S406.
  • the first electronic device may hide the mouse pointer and then send the mouse information to the second electronic device (that is, the first electronic device first performs step S405 and then performs step S406).
  • alternatively, the first electronic device may first send the mouse information to the second electronic device and then hide the mouse pointer (that is, the first electronic device first performs step S406 and then performs step S405).
  • alternatively, the first electronic device may hide the mouse pointer while sending the mouse information to the second electronic device (that is, the first electronic device performs step S405 and step S406 simultaneously).
  • the second electronic device displays the mouse pointer corresponding to the input device of the first electronic device according to the mouse information.
  • after receiving the mouse information, the second electronic device can display the mouse pointer corresponding to the input device of the first electronic device according to the mouse information, so that the mouse pointer of the first electronic device is moved to the second electronic device for display, and the user can control the second electronic device through the keyboard and mouse of the first electronic device.
  • the mouse information includes mouse displacement information.
  • based on the mouse displacement information, the second electronic device can display the mouse pointer corresponding to the input device of the first electronic device at the position corresponding to where the mouse pointer was before the first electronic device hid it (that is, the position where the mouse pointer disappeared on the first electronic device), or at the position corresponding to where the user clicked in the device view of the second electronic device (such as the corresponding position in the desktop screenshot).
  • the second electronic device displays the mouse pointer corresponding to the input device of the first electronic device at a preset fixed position. This facilitates the user to determine the display position of the mouse pointer on the display screen of the second electronic device and improves the user experience.
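The placement options above for the transferred mouse pointer (the position where it disappeared on the source device, the position clicked in the device view, or a preset fixed position) can be sketched as follows. Illustrative; the strategy names and the choice of the screen centre as the fixed position are assumptions.

```python
def entry_position(strategy: str, screen_w: int, screen_h: int,
                   exit_pos=None, view_click=None) -> tuple:
    """Pick where the second device first shows the transferred mouse pointer.

    Three illustrative strategies from the description:
      'mirror' - the position where the pointer disappeared on the source device
      'mapped' - the position the user clicked inside the device view, scaled
                 from view coordinates; view_click is ((vx, vy), (view_w, view_h))
      'fixed'  - a preset fixed position (here: the screen centre)
    """
    if strategy == "mirror" and exit_pos is not None:
        x, y = exit_pos
        return (min(x, screen_w - 1), min(y, screen_h - 1))  # clamp to target screen
    if strategy == "mapped" and view_click is not None:
        (vx, vy), (vw, vh) = view_click
        return (int(vx / vw * screen_w), int(vy / vh * screen_h))
    return (screen_w // 2, screen_h // 2)   # preset fixed position
```

The 'mapped' branch scales a click at the middle of a device-view thumbnail to the middle of the target display.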
  • when the first electronic device 100 detects the user's operation of clicking the device view indicated by reference numeral 82 through the mouse pointer 81, as shown in (c) of FIG. 8, the first electronic device 100 hides the mouse pointer and sends mouse information to the second electronic device 200.
  • the second electronic device 200 displays the mouse pointer 81 according to any of the above methods according to the received mouse information.
  • the mouse pointer 81 controlled by the input device of the first electronic device 100 is moved to the second electronic device 200 for display.
  • the source device (such as the first electronic device) helps the user quickly move the mouse pointer to the target device (such as the second electronic device) by displaying the device view.
  • the user does not need to move the mouse pointer across devices through a long path, which avoids mouse pointer movement failure caused by user errors and improves interaction efficiency.
  • the first electronic device can also move files or windows in the first electronic device to the second electronic device in response to user operations.
  • the file or window moving process is introduced in detail below.
  • FIG. 9 is a schematic flowchart of yet another input control method provided by an embodiment of the present application. As shown in Figure 9, the method includes the following steps.
  • the first electronic device and the second electronic device establish a communication connection.
  • the first electronic device obtains device information of the second electronic device.
  • step S901 and step S902 reference may be made to the relevant content described in the above-mentioned step S401 and step S402, which will not be described again here.
  • after detecting the user's operation of selecting a file or window, and then detecting the user's operation instructing to move the mouse pointer to another electronic device, the first electronic device displays a device view of each electronic device that has established a communication connection.
  • when using the first electronic device, the user can select, through the input device of the first electronic device, a target object displayed on the first electronic device; the target object includes, for example, the file or window selected by the user.
  • the files include, for example, various types of files in the first electronic device, such as icons, text, text selected by the user, documents, pictures, videos, etc.
  • the window includes, for example, a window displayed by the first electronic device.
  • the user's operation instructing to move the mouse pointer to another electronic device includes, for example, an operation of shaking the mouse after selecting a file or window, an operation on a special key (such as the middle mouse button, a keyboard key combination, etc.), an operation of moving the mouse pointer to a preset area of the display screen (such as the upper right corner of the display screen, an edge area of the display screen, etc.), an operation on a preset icon displayed on the display screen, etc.
  • after detecting the user's operation of long-pressing the picture 102 through the mouse pointer 101 (such as an operation of long-pressing the left mouse button), the first electronic device 100 determines that the user has selected the picture 102. Afterwards, the first electronic device 100 detects the user's operation of shaking the mouse while long-pressing the picture 102 with the mouse pointer 101, and determines that the user instructs moving the mouse pointer 101 and the picture 102 to another electronic device. As shown in (b) of FIG. 10, the first electronic device 100 displays a device view of each electronic device with which a communication connection has been established.
  • the layout of the device view includes, for example, plane layout, linear layout, conceptual layout, and other methods.
  • for other content of step S903, reference may be made to the relevant content of step S403 above, which will not be described again here. It should be noted that the embodiment of the present application also does not limit the execution order of step S902 and step S903. For example, after detecting the user's operation instructing to move the mouse pointer to another electronic device, the first electronic device can obtain the device information of the electronic devices that have established a communication connection with it, in order to display the device views.
  • after detecting the user's operation of moving the selected file or window to the display position of the target device view, the first electronic device determines the second electronic device corresponding to the target device view, that is, determines that the user instructs displaying the selected file or window on the second electronic device corresponding to the target device view.
  • the first electronic device 100 detects the user's operation of moving the picture 102 through the mouse pointer 101 to the display position of the target device view indicated by reference numeral 103, and can highlight the target device view indicated by reference numeral 103 to prompt the user to confirm whether the selected electronic device is correct.
  • the first electronic device 100 may also send a prompt instruction to the second electronic device 200 corresponding to the target device view indicated by reference numeral 103, for instructing the second electronic device 200 to prompt the user to select the electronic device in a preset manner.
  • the preset method includes, for example, displaying a prompt bar around the edge of the display screen, audio module reminders, displaying prompt information, etc.
  • when the first electronic device 100 determines that the mouse pointer 101, still holding the selection, has stayed at the display position of the target device view indicated by reference numeral 103 for more than a preset time, it may determine that the user chooses to move the picture 102 to the second electronic device 200 corresponding to the target device view indicated by reference numeral 103 for display.
  • during the process of moving the selected file or window to the target device view, the first electronic device can change the display of the view corresponding to the file or window in a preset manner; for example, it reduces the view corresponding to the file or window to fit the size of the target device view.
  • the first electronic device 100 in response to a user operation, reduces the display picture 102 to adapt to the size of the target device view indicated by reference numeral 103 .
  • the first electronic device hides the mouse pointer.
  • the first electronic device can hide the mouse pointer on its own side (for example, the mouse pointer disappears, or the mouse pointer is not displayed). This produces a display effect of the mouse pointer crossing across devices, helping the user to determine that the mouse pointer has moved out of the first electronic device.
  • the first electronic device sends mouse information, file or window information to the second electronic device.
  • the first electronic device may send the information of the file or window selected by the user and the mouse information to the second electronic device. In this way, the mouse pointer, the file or window selected by the user can be transferred to the second electronic device for display.
  • the information corresponding to the file or window is used to display the view corresponding to the file or window.
  • the second electronic device displays the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device.
  • the first electronic device sends dragging file or window information to the second electronic device.
  • the second electronic device displays the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device.
  • the second electronic device displays the view of the file or window, kept in the drag state, based on the acquired mouse information and file or window information, so that the user can continue to drag the file or window to the desired position using the mouse of the first electronic device.
  • after detecting the operation of dragging the file or moving the window with the mouse, the first electronic device sends the information of the dragged file or window during the movement (such as position information, etc.) to the second electronic device, so that the second electronic device can display the moving file or window on its display screen in response to the user operation.
  • the first electronic device 100 sends mouse information, file or window information to the second electronic device 200 , and hides the mouse pointer on its own side.
  • the second electronic device 200 can display the picture 104 in the drag state by the mouse pointer 101 according to the mouse information, file or window information.
  • the first electronic device 100 sends the information of the dragged file or window to the second electronic device 200 during the dragging process.
  • the second electronic device 200 moves the display position of the picture 104 according to the information of the dragged file or window.
  • steps S908 to S909 are loop steps.
  • the first electronic device can respond to the user's operation of moving the mouse and send the operation of dragging the file or window information to the second electronic device multiple times, so that the second electronic device can Display files or windows in the process of being moved in response to user operations.
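The loop of steps S908 to S909 (repeatedly forwarding drag-movement information, ended by the drag completion instruction of the later steps) can be sketched as follows. Illustrative; the message shapes are assumptions, and `send` stands in for the established communication connection.

```python
def forward_drag(move_events, send):
    """Forward each mouse movement during a cross-device drag, then send a
    final drag completion instruction.  `move_events` is a sequence of
    (dx, dy) displacements; `send` delivers one message per call."""
    for dx, dy in move_events:
        send({"type": "drag_move", "dx": dx, "dy": dy})   # repeated per movement
    send({"type": "drag_done"})                           # drag completion instruction

# usage: record the forwarded messages instead of actually sending them
sent = []
forward_drag([(5, 0), (3, -2)], sent.append)
```

The second electronic device would move the displayed file or window by each received displacement, then drop it at the current position when the completion message arrives.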
  • steps S908 to S909 are optional steps.
  • the user may not move the file or window further, and may instead directly perform a drag release operation to end the cross-device movement of the file or window (that is, after step S907, the following step S910 is directly executed).
  • the second electronic device directly cancels the dragging state of the file or window, and displays the file or window at the corresponding position (see the following steps for the specific process).
  • the first electronic device detects the user's drag and release operation.
  • the first electronic device sends a drag completion instruction to the second electronic device.
  • when the first electronic device detects the drag release operation, it may determine that the user has moved the file or window to the desired position. The first electronic device may then send a drag completion instruction to the second electronic device to instruct the second electronic device to stop moving the displayed file or window and to display the moved file or window at the current display position.
  • the drag completion instruction may carry the content information of the dragged file or window.
  • before the drag is completed, the second electronic device can display the view of the file or window based on thumbnail information of the file or window. After it is determined that the drag of the file or window has been completed, the user may subsequently need to edit the file or window, in which case the second electronic device needs to obtain the detailed content of the file or window.
  • the second electronic device displays a file or window according to the content displayed on the display screen.
  • the second electronic device may display the dragged file or window according to the display content of the display screen.
  • the display screen of the second electronic device displays the desktop, and the second electronic device can display the dragged file or window at a corresponding position on the desktop.
  • the user drags the picture in the first electronic device 100 to the second electronic device 200 for display.
  • the second electronic device 200 displays the dragged picture 104 at a corresponding position on the desktop.
  • the picture 102 can still be displayed at the original display position on the first electronic device 100 . It can also be understood that the picture 102 on the first electronic device 100 is copied to the second electronic device 200 through a drag operation.
  • after the first electronic device 100 detects the user's operation of selecting the window 112 through the mouse pointer 111 and an operation instructing to move the mouse pointer to another electronic device is triggered (such as an operation of shaking the selected window 112), the first electronic device 100 may display a device view of each electronic device that has established a communication connection, as shown in (b) of FIG. 11.
  • the first electronic device 100 detects the user's operation of moving the window 112 to the display position of the target device view indicated by reference numeral 113, and may determine that the user instructs sending the window 112 to the second electronic device corresponding to the target device view indicated by reference numeral 113 for display.
  • the first electronic device 100 sends the information of the mouse pointer 111 and the window 112 to the second electronic device 200 and hides the mouse pointer 111 .
  • the second electronic device 200 displays the mouse pointer 111 and the window 114 according to the received mouse information and window information. Afterwards, in response to the user's operation of moving the mouse of the first electronic device 100, the first electronic device 100 sends the drag window information to the second electronic device 200, and the second electronic device 200 moves the display position of the window 114 accordingly. As shown in (d) of FIG. 11 , the first electronic device 100 detects the user's drag release operation and may send a drag completion instruction carrying window content information to the second electronic device 200 . After receiving the drag completion instruction, the second electronic device 200 determines that the desktop is currently displayed, and can display the window 114 at a corresponding position on the desktop.
  • if the display screen of the second electronic device displays a window, the dragged file can be inserted at the corresponding cursor position of the window.
  • if the window does not support inserting the dragged file, the user can be prompted that the file drag failed; or the window can be minimized and the dragged file displayed on the desktop.
  • if the second electronic device displays a window, it can also overlay and display the window dragged from the first electronic device.
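The hand-off and drop handling described above can be sketched as follows. This is a minimal illustration, not the application's actual protocol: the message shapes, field names, and display-state model are all assumptions made for the example.

```python
# Hypothetical sketch of the cross-device window-drag hand-off described above.
# Message names and fields are illustrative assumptions, not a real wire format.

def start_cross_device_drag(pointer, window):
    """Source device: build the hand-off message and hide the local pointer."""
    message = {
        "type": "drag_start",
        "mouse": {"x": pointer["x"], "y": pointer["y"]},
        "window": {"id": window["id"], "thumbnail": window["thumbnail"]},
    }
    pointer["visible"] = False  # hide the mouse pointer on the source device
    return message

def handle_drag_complete(instruction, target_display):
    """Target device: place the dragged window according to what is shown."""
    if target_display["showing"] == "desktop":
        # Desktop shown: display the window at the drop position.
        return {"action": "show_window", "at": instruction["position"]}
    if target_display.get("accepts_insert"):
        # A window that accepts insertion: insert at the cursor position.
        return {"action": "insert_at_cursor"}
    # Otherwise overlay the dragged window on top of the current one.
    return {"action": "overlay_window"}

pointer = {"x": 120, "y": 80, "visible": True}
window = {"id": "win-112", "thumbnail": b"..."}
msg = start_cross_device_drag(pointer, window)
result = handle_drag_complete(
    {"type": "drag_complete", "position": (300, 200)},
    {"showing": "desktop"},
)
```

The drag-release branch mirrors the text: when the target shows the desktop, the window is displayed at the drop position; otherwise the receiver decides between insertion and overlay.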
  • after the first electronic device 100 detects the user's operation of pressing the mouse pointer 121 to select the picture 122, an operation of moving the mouse pointer to another electronic device (such as an operation of shaking the selected picture 122) triggers display of a view of the electronic devices that have established communication connections, as shown in (b) of Figure 12.
  • the first electronic device 100 detects the user's operation of moving the picture 122 to the display position of the target device view indicated by reference numeral 123, and may determine that the user instructs to send the picture 122 to the second electronic device corresponding to the target device view indicated by reference numeral 123.
  • a preview window of the window currently displayed on the second electronic device 200 may be displayed in the target device view indicated by reference numeral 123 .
  • the first electronic device detects that the user moves the picture 122 into the preview window, and can determine that the user instructs to send the picture 122 to the corresponding window of the second electronic device. Then, as shown in (c) of FIG. 12 , the first electronic device 100 sends the information of the mouse pointer 121 and the picture 122 to the second electronic device 200 and hides the mouse pointer 121 . Optionally, the first electronic device 100 still keeps displaying the picture 122 at its original position.
  • the second electronic device 200 displays the mouse pointer 121 and the picture 124 according to the received mouse information and picture information.
  • the mouse pointer 121 and the picture 124 are displayed in the window currently displayed by the second electronic device 200 .
  • the first electronic device 100 sends information of dragging the picture to the second electronic device 200, and the second electronic device 200 moves the display position of the picture 124 accordingly.
  • the first electronic device 100 detects the user's drag release operation and may send a drag completion instruction carrying the image content information to the second electronic device 200 .
  • after receiving the drag completion instruction, the second electronic device 200 determines, based on the currently displayed window, that the window allows inserting the picture 124 at the cursor position, and then inserts and displays the picture 124 at the cursor position according to the picture information.
  • the second electronic device may run multiple windows (including windows corresponding to the desktop or application programs). Then, after detecting the user's operation of moving the file to the second electronic device, the second electronic device can display preview bars of multiple windows. Afterwards, in response to the user operation, it is determined which window the user instructs the file to be moved to for display. Alternatively, after detecting the user's operation of moving the mouse pointer to the second electronic device, the second electronic device may display preview bars of multiple windows. Afterwards, in response to the user operation, it is determined which window the user instructs to move the mouse pointer to is displayed.
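The multi-window case above can be illustrated with a small sketch: the target device lays its running windows out as preview tiles and resolves which tile the file was dropped into. The tile geometry and helper names here are assumptions for illustration only.

```python
# Illustrative sketch of the multi-window preview bar: the target device lists
# preview tiles for its running windows and resolves which one received a drop.

def build_preview_bar(running_windows):
    """Lay the running windows out as equally sized preview tiles in a row."""
    tile_w, tile_h = 160, 90
    return [
        {"window_id": w, "rect": (i * tile_w, 0, tile_w, tile_h)}
        for i, w in enumerate(running_windows)
    ]

def resolve_drop_target(previews, drop_x, drop_y):
    """Return the window whose preview tile contains the drop point, if any."""
    for p in previews:
        x, y, w, h = p["rect"]
        if x <= drop_x < x + w and y <= drop_y < y + h:
            return p["window_id"]
    return None  # dropped outside every preview tile

previews = build_preview_bar(["desktop", "editor", "browser"])
target = resolve_drop_target(previews, drop_x=200, drop_y=40)  # inside tile 1
```

Once a tile is resolved, the device can bring the corresponding window forward and continue the drag inside it, as the walkthrough of Figure 13 below describes.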
  • the first electronic device 100 displays the device view interface as shown in (b) of FIG. 13 in response to the user's operation of moving the picture 132 to other electronic devices through the mouse pointer 131.
  • the first electronic device 100 detects the user's operation of moving the picture 132 to the target device view indicated by reference numeral 133, sends the mouse information and picture information to the second electronic device 200 corresponding to the target device view indicated by reference numeral 133, and hides the mouse pointer 131.
  • the second electronic device 200 determines that multiple windows are currently running in the background, as shown in the task bar 134 . Then, as shown in (c) of FIG. 13 , the second electronic device 200 displays preview windows corresponding to the multiple running windows (for example, including preview windows corresponding to the desktop or application programs). Afterwards, after receiving the dragged picture information sent by the first electronic device 100, the second electronic device 200 determines that the user instructs to move the picture 135 to the preview window 136, and may determine that the user instructs to insert the picture 135 into the window corresponding to the preview window 136.
  • the second electronic device 200 can display the window 137 corresponding to the preview window 136, and display the picture 135 in the drag state in the window 137, so that the user can move the picture 135 to an appropriate location in the window 137 for display.
  • the first electronic device 100 sends a drag completion instruction to the second electronic device 200 , and the second electronic device 200 inserts and displays the picture 135 at a corresponding position in the window 137 according to the drag completion instruction.
  • the source device (such as the first electronic device) helps the user quickly move the file or window to the target device (such as the second electronic device) by displaying the device view, and enables continued movement of the file or window on the target device.
  • users do not need to move files or windows across devices through a long path, avoiding file or window movement failures caused by user errors.
  • it improves the interactive efficiency of moving files or windows between multiple devices.
  • the same second electronic device may correspond to multiple virtual desktops (for example, the second electronic device is connected to multiple display screens, or one display screen of the second electronic device can switch between displaying multiple virtual desktops). In that case, the first electronic device needs to determine on which desktop of the second electronic device the user instructs the mouse pointer, window or file to be displayed.
  • the input control process in the multi-desktop scenario is introduced in detail below.
  • FIG. 14 is a schematic flowchart of yet another input control method provided by an embodiment of the present application. As shown in Figure 14, the method includes the following steps.
  • the first electronic device and the second electronic device establish a communication connection.
  • the first electronic device obtains device information of the second electronic device.
  • after detecting the user's operation of selecting a file or window, and then detecting the user's instruction to move the mouse pointer to another electronic device, the first electronic device displays a device view of the electronic devices that have established a communication connection.
  • the first electronic device detects the user's operation of moving the selected file or window to the target device view display position, and determines the second electronic device corresponding to the target device view.
  • for step S1401 to step S1404, reference may be made to the relevant content described in the above step S901 to step S904, which will not be repeated here.
  • the embodiment of the present application also does not limit the execution order of step S1402 and step S1403.
  • the first electronic device can obtain the device information of the electronic device that has established a communication connection with it to display the device view.
  • the first electronic device determines that the second electronic device is configured with multiple virtual desktops and displays multiple virtual desktop views.
  • the first electronic device can obtain the virtual desktop configuration of the second electronic device, for example, that the second electronic device is configured with multiple virtual desktops. Then, the first electronic device can create a device view corresponding to the second electronic device and multiple virtual desktop views corresponding to that device view according to the virtual desktop configuration of the second electronic device, wherein each virtual desktop view corresponds to one virtual desktop of the second electronic device.
  • the device information obtained by the first electronic device from the second electronic device may include information about the virtual desktop of the second electronic device.
  • it includes any one or more of virtual desktop screenshots, screenshots of the current display interface of the virtual desktop, virtual desktop ID, virtual desktop name, virtual desktop nickname edited by the user, pictures used to identify the virtual desktop, etc.
  • the virtual desktop view displayed by the first electronic device according to the virtual desktop information can help the user distinguish different virtual desktops of the second electronic device.
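One possible shape for the device information exchanged in step S1402, including the per-virtual-desktop fields the text enumerates, is sketched below. The field and class names are assumptions for illustration; the application does not prescribe a concrete data structure.

```python
# Hypothetical data structure for the device information of S1402, with the
# virtual-desktop fields listed in the text (ID, name, nickname, screenshots,
# identifying picture). Names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class VirtualDesktopInfo:
    desktop_id: str
    name: str = ""
    nickname: str = ""       # virtual desktop nickname edited by the user
    screenshot: bytes = b""  # screenshot of the current display interface
    icon: bytes = b""        # picture used to identify the virtual desktop

@dataclass
class DeviceInfo:
    device_id: str
    device_name: str
    desktops: list = field(default_factory=list)

    @property
    def has_multiple_desktops(self):
        """True when the device view should expand into virtual desktop views."""
        return len(self.desktops) > 1

info = DeviceInfo(
    device_id="dev-200",
    device_name="second electronic device",
    desktops=[VirtualDesktopInfo("vd-1", name="Desktop 1"),
              VirtualDesktopInfo("vd-2", name="Desktop 2")],
)
```

With such a structure, the source device can decide whether to expand a device view into per-desktop views simply by checking how many desktops the target reported.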
  • the first electronic device 100 displays the device view interface as shown in (b) of Figure 15 in response to the user's operation of moving the picture 152 to other electronic devices through the mouse pointer 151.
  • the first electronic device 100 detects the user's operation of moving the picture 152 to the target device view indicated by reference numeral 153, determines the second electronic device 200 corresponding to the target device view indicated by reference numeral 153, and determines that the second electronic device 200 is configured with multiple virtual desktops.
  • the first electronic device 100 displays multiple virtual desktop views corresponding to the target device view indicated by reference numeral 153. According to the virtual desktop views, the user can determine on which virtual desktop of the second electronic device 200 the picture 152 needs to be displayed.
  • the picture as shown in (c) of Figure 15 can be displayed.
  • if the first electronic device 100 detects that the user controls the picture 152 through the mouse pointer 151 to stay, for more than a preset time, at a position other than the display positions of the multiple virtual desktop views, it determines that the user's selection of the target device view is incorrect. Then, the first electronic device 100 can return to displaying the device view interface as shown in (b) of FIG. 15 , and receive the user's operation of reselecting the target device view.
  • if the electronic device corresponding to a device view displayed by the first electronic device is configured with only one virtual desktop (for example, the electronic device is configured with only one display screen, or only one virtual desktop is created on the configured display screen), then after detecting the user's operation of moving the file or window to that device view, the first electronic device can directly send the file or window information and the mouse information to the electronic device corresponding to the device view, without expanding and displaying the virtual desktop views corresponding to the device view.
  • after detecting the user's operation of moving a file or window to the target virtual desktop view, the first electronic device determines the target virtual desktop corresponding to the target virtual desktop view.
  • the first electronic device may determine that the user instructs the file or window to be displayed on the virtual desktop of the second electronic device corresponding to the target virtual desktop view.
  • the first electronic device 100 may highlight the target virtual desktop view to prompt the user to confirm whether the selected virtual desktop is correct.
  • the first electronic device 100 may also send a prompt instruction to the second electronic device 200 corresponding to the target virtual desktop view indicated by reference numeral 154, for instructing the second electronic device 200 to prompt the user's selection of the virtual desktop in a preset manner.
  • the preset manner includes, for example, displaying a prompt bar around the edge of the display screen, switching to display the target virtual desktop, an audio module reminder, display of prompt information, and the like.
  • the second electronic device 200 can switch between multiple virtual desktops displayed on the same display screen. After receiving the prompt instruction carrying the target virtual desktop identifier, as shown in (c) of Figure 15, the second electronic device 200 can switch to display the target virtual desktop and display a prompt bar around the edge of the display screen, to prompt the user to confirm whether to move the picture to the target virtual desktop for display.
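The prompt handling just described can be sketched minimally, assuming a target device whose single display switches between virtual desktops. The class, state model, and method names are illustrative assumptions, not part of this application.

```python
# Hedged sketch: the target device reacts to a prompt instruction carrying a
# target virtual desktop identifier by switching desktops and raising an edge
# prompt bar. State fields and names are assumptions for illustration.

class DesktopSwitcher:
    def __init__(self, desktop_ids):
        self.desktop_ids = desktop_ids
        self.current = desktop_ids[0]     # currently displayed virtual desktop
        self.prompt_bar_visible = False

    def handle_prompt(self, target_desktop_id):
        """Switch to the target desktop and show the edge prompt bar."""
        if target_desktop_id not in self.desktop_ids:
            return False  # unknown desktop: ignore the prompt instruction
        self.current = target_desktop_id
        self.prompt_bar_visible = True  # prompt bar around the screen edge
        return True

switcher = DesktopSwitcher(["vd-1", "vd-2", "vd-3"])
ok = switcher.handle_prompt("vd-2")
```

A real implementation would additionally let the user dismiss the prompt bar, which would cancel the pending cross-device move.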
  • the first electronic device 100 determines that the mouse pointer 151, with the picture selected, remains at the display position of the target virtual desktop view indicated by reference numeral 154 for more than a preset time, and may determine that the user instructs to move the picture 152 to the target virtual desktop of the second electronic device 200 corresponding to the target virtual desktop view indicated by reference numeral 154.
  • the first electronic device can change the display of the view corresponding to the file or window in a preset manner. For example, in the process of moving the selected file or window to the target device view or the target virtual desktop view, the first electronic device reduces the view corresponding to the file or window to adapt to the size of the target device view or the target virtual desktop view.
  • in response to a user operation, the first electronic device 100 reduces the displayed picture 152 to adapt to the size of the target device view indicated by reference numeral 153.
  • in response to a user operation, the first electronic device 100 reduces the displayed picture 152 to adapt to the size of the virtual desktop view indicated by reference numeral 154.
  • the first electronic device hides the mouse pointer.
  • the first electronic device may hide its own mouse pointer (for example, the mouse pointer disappears or the mouse pointer is not displayed). This produces a display effect of the mouse pointer crossing across devices, helping the user to determine that the mouse pointer has moved out of the first electronic device.
  • the first electronic device sends mouse information, file or window information, and target virtual desktop information to the second electronic device.
  • the first electronic device may send the information of the file or window selected by the user, the mouse information, and the target virtual desktop information to the second electronic device.
  • the second electronic device can determine the target virtual desktop selected by the user according to the target virtual desktop information. In this way, the mouse pointer and the file or window selected by the user are transferred to the target virtual desktop of the second electronic device for display.
  • the information corresponding to the file or window is used to display the view corresponding to the file or window.
  • the second electronic device displays the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device on the target virtual desktop.
  • the first electronic device sends dragging file or window information to the second electronic device.
  • the second electronic device displays the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device on the target virtual desktop.
  • the second electronic device displays the view of the file or window in the drag state on the target virtual desktop based on the acquired mouse information, file or window information, and target virtual desktop information, so that the user can continue to drag the file or window to the desired location through the mouse of the first electronic device.
  • after detecting the operation of dragging the file or moving the window with the mouse, the first electronic device sends the information of the dragged file or window during the movement (such as location information) to the second electronic device, so that the second electronic device can display the moved file or window on its target virtual desktop in response to the user operation.
  • the first electronic device 100 sends mouse information, file or window information, and target virtual desktop information to the second electronic device 200 , and hides its own mouse pointer.
  • the second electronic device 200 can display, on the target virtual desktop, the mouse pointer 151 and the picture 155 in the drag state according to the mouse information, the file or window information, and the target virtual desktop information, as shown in (d) of FIG. 15 .
  • in response to the user's operation of dragging the picture by moving the mouse of the first electronic device 100, the first electronic device 100 sends the information of the dragged file or window to the second electronic device 200 during the dragging process, so that the second electronic device 200 moves the display position of the picture 155 according to that information.
  • steps S1410 to S1411 are loop steps.
  • in response to the user's operation of moving the mouse, the first electronic device can send the dragged file or window information to the second electronic device multiple times, so that the second electronic device can display the file or window being moved in response to the user operation.
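The S1410/S1411 loop can be sketched as a stream of forwarded mouse deltas that the target device applies to the dragged view's position. This simulates the exchange in-process; a real system would forward each event over the established communication connection.

```python
# Minimal in-process sketch of the drag loop: the source device forwards each
# mouse-move delta, and the target device repositions the dragged view.

def run_drag_loop(moves, start_pos):
    """Apply a stream of (dx, dy) mouse deltas to the dragged view's position."""
    x, y = start_pos
    for dx, dy in moves:        # one iteration per forwarded drag event
        x, y = x + dx, y + dy   # target device moves the display position
    return (x, y)

# Three forwarded mouse movements while the user drags on the source device.
final = run_drag_loop([(10, 0), (5, 5), (0, 20)], start_pos=(100, 100))
```

If the user releases immediately, the stream of deltas is empty and the view stays where the hand-off placed it, matching the optional-steps note below.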
  • steps S1410 to S1411 are optional steps.
  • the user may no longer move the file or window, but directly perform a drag release operation to end the cross-device movement of the file or window (that is, after step S1409, the following step S1412 is directly executed).
  • the second electronic device directly cancels the dragging state of the file or window, and displays the file or window at the corresponding position (see the following steps for the specific process).
  • the first electronic device may also determine the number of virtual desktops configured on the second electronic device. If the second electronic device is configured with multiple virtual desktops, then as in step S1405 above, the first electronic device may also display the multiple virtual desktop views corresponding to the second electronic device to receive the user's operation of selecting a target virtual desktop view. Afterwards, the first electronic device can send the mouse pointer to the target virtual desktop of the second electronic device for display according to the user's operation.
  • the first electronic device detects the user's drag and release operation.
  • the first electronic device sends a drag completion instruction to the second electronic device.
  • the first electronic device may determine that the user has moved the file or window to the desired position on the target virtual desktop. Then, the first electronic device can send a drag completion instruction to the second electronic device to instruct the second electronic device to stop moving and displaying the file or window, and to display the moved file or window at the current display position on the target virtual desktop.
  • the drag completion instruction may carry the content information of the dragged file or window.
  • the second electronic device can display the view of the file or window based on the thumbnail information of the file or window. After the dragging of the file or window is completed, the user may subsequently need to edit the file or window, so the second electronic device needs to obtain the detailed content of the file or window.
  • the second electronic device displays files or windows on the target virtual desktop according to the target virtual desktop display content.
  • the second electronic device may display the dragged file or window according to the display content of the target virtual desktop.
  • if the target virtual desktop of the second electronic device displays the desktop, the second electronic device can display the dragged file or window at a corresponding position on the desktop.
  • if the target virtual desktop of the second electronic device displays a window, the dragged file can be inserted at the corresponding cursor position of the window.
  • if the window does not support inserting the dragged file, the user can be prompted that the file drag failed; or the window can be minimized and the dragged file displayed on the desktop.
  • if the target virtual desktop of the second electronic device displays a window, it can also overlay and display the window dragged from the first electronic device.
  • if the target virtual desktop can run multiple windows, as in the related content described in Figure 13 above, the second electronic device may display preview bars of the multiple windows on the target virtual desktop, and then, in response to the user operation, insert the file moved from the first electronic device into the window selected by the user on the target virtual desktop.
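The drop-handling policy of step S1414 can be summarized in a small decision function. This is a hedged sketch: the `fallback` parameter and the result labels are assumptions introduced for illustration, since the text allows either prompting a failure or minimizing the window.

```python
# Illustrative decision function for S1414: place on the desktop, insert into
# the window when supported, otherwise fall back to a failure prompt or to
# minimizing the window and showing the file on the desktop.

def complete_drop(showing, window_accepts_insert, fallback="prompt"):
    """Decide how the target virtual desktop displays the dragged file."""
    if showing == "desktop":
        return "place_on_desktop"
    if window_accepts_insert:
        return "insert_at_cursor"
    if fallback == "prompt":
        return "prompt_drag_failed"  # tell the user the file drag failed
    return "minimize_window_show_on_desktop"

a = complete_drop("desktop", window_accepts_insert=False)
b = complete_drop("window", window_accepts_insert=True)
c = complete_drop("window", window_accepts_insert=False, fallback="minimize")
```

Which fallback applies would be a policy choice of the target device; the application describes both behaviors as alternatives.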
  • the source device (such as the first electronic device) can also help the user quickly move files or windows to a target virtual desktop of the target device (such as the second electronic device) by displaying the device view and the virtual desktop views, and enables continued movement of the files or windows on the target virtual desktop of the target device.
  • users do not need to move files or windows across devices through a long path, avoiding file or window movement failures caused by user errors.
  • it improves the interactive efficiency of moving files or windows between multiple devices.
  • FIG. 16 is a schematic structural diagram of a first electronic device provided by an embodiment of the present application.
  • the first electronic device 1600 may include: a transceiver unit 1601, a processing unit 1602, and a display unit 1603.
  • the first electronic device 1600 may be used to implement the functions of the first electronic device involved in the above method embodiments.
  • the transceiver unit 1601 is used to support the first electronic device 1600 to perform S401, S402, S403, S404 and S406 in Figure 4; and/or to support the first electronic device 1600 to perform S901, S902, S903, S904, S906, S908, S910 and S911; and/or, used to support the first electronic device 1600 to perform S1401, S1402, S1403, S1404, S1406, S1408, S1410, S1412 and S1413 in Figure 14.
  • the processing unit 1602 is used to support the first electronic device 1600 to perform S404 and S405 in Figure 4; and/or, to support the first electronic device 1600 to perform S904 and S905 in Figure 9; and/or, Used to support the first electronic device 1600 to perform S1404, S1405, S1406 and S1407 in Figure 14.
  • the display unit 1603 is used to support the first electronic device 1600 to perform S403 and S405 in Figure 4; and/or, to support the first electronic device 1600 to perform S903 and S905 in Figure 9; and/or, Used to support the first electronic device 1600 to perform S1403, S1405 and S1407 in Figure 14.
  • the transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver-related circuit component, and may be a transceiver or a transceiver module.
  • the operation and/or function of each unit in the first electronic device 1600 is to implement the corresponding process of the input control method described in the above method embodiments. For all relevant content of each step involved in the above method embodiments, reference may be made to the function description of the corresponding functional unit; for the sake of brevity, details are not repeated here.
  • the first electronic device 1600 shown in FIG. 16 may also include a storage unit (not shown in FIG. 16), in which programs or instructions are stored.
  • when the transceiver unit 1601, the processing unit 1602, and the display unit 1603 execute the programs or instructions, the first electronic device 1600 shown in FIG. 16 can execute the input control method described in the above method embodiments.
  • the technical solution provided by this application can also be a functional unit or chip in the first electronic device, or a device used in conjunction with the first electronic device.
  • FIG. 17 is a schematic structural diagram of a second electronic device provided by an embodiment of the present application.
  • the second electronic device 1700 may include: a transceiver unit 1701, a processing unit 1702, and a display unit 1703.
  • the second electronic device 1700 may be used to implement the functions of the second electronic device involved in the above method embodiments.
  • the transceiver unit 1701 is used to support the second electronic device 1700 to perform S401, S402 and S406 in Figure 4; and/or to support the second electronic device 1700 to perform S901, S902, S906, S908 and S911; and/or, used to support the second electronic device 1700 to perform S1401, S1402, S1408, S1410 and S1413 in FIG. 14 .
  • the processing unit 1702 is used to support the second electronic device 1700 to perform S407 in Figure 4; and/or to support the second electronic device 1700 to perform S907, S909 and S912 in Figure 9; and/or, Used to support the second electronic device 1700 to perform S1409, S1411 and S1414 in Figure 14.
  • the display unit 1703 is used to support the second electronic device 1700 to perform S407 in Figure 4; and/or, to support the second electronic device 1700 to perform S907, S909 and S912 in Figure 9; and/or, Used to support the second electronic device 1700 to perform S1409, S1411 and S1414 in Figure 14.
  • the transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver-related circuit component, and may be a transceiver or a transceiver module.
  • the operation and/or function of each unit in the second electronic device 1700 is to implement the corresponding process of the input control method described in the above method embodiments. For all relevant content of each step involved in the above method embodiments, reference may be made to the function description of the corresponding functional unit; for the sake of brevity, details are not repeated here.
  • the second electronic device 1700 shown in FIG. 17 may also include a storage unit (not shown in FIG. 17), in which programs or instructions are stored.
  • when the transceiver unit 1701, the processing unit 1702, and the display unit 1703 execute the programs or instructions, the second electronic device 1700 shown in FIG. 17 can perform the input control method described in the above method embodiments.
  • the technical solution provided by this application can also be a functional unit or chip in the second electronic device, or a device used in conjunction with the second electronic device.
  • An embodiment of the present application also provides a chip system, including: a processor coupled to a memory, where the memory is used to store programs or instructions. When the programs or instructions are executed by the processor, the chip system implements the method in any of the above method embodiments.
  • there may be one or more processors in the chip system.
  • the processor can be implemented in hardware or software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software code stored in memory.
  • the memory may be integrated with the processor or may be provided separately from the processor, which is not limited by the embodiments of the present application.
  • the memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or separately provided on different chips.
  • the embodiments of this application do not specifically limit the type of the memory or the arrangement of the memory and the processor.
  • the chip system can be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processing circuit (digital signal processor, DSP), a microcontroller unit (MCU), or a programmable logic device (PLD).
  • each step in the above method embodiment can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the method steps disclosed in conjunction with the embodiments of this application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • a computer program is stored in the computer-readable storage medium. When the computer program is run on a computer, it causes the computer to perform the above related steps to implement the input control method in the above embodiments.
  • An embodiment of the present application also provides a computer program product.
  • when the computer program product is run on a computer, it causes the computer to perform the above related steps to implement the input control method in the above embodiments.
  • the embodiment of the present application also provides a device.
  • the device may specifically be a component or module, and the device may include one or more connected processors and a memory, where the memory is used to store computer programs. When the computer programs are executed by the one or more processors, the device is caused to execute the input control method in each of the above method embodiments.
  • the devices, computer-readable storage media, computer program products or chips provided by the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, the beneficial effects it can achieve can be referred to the beneficial effects in the corresponding methods provided above, and will not be described again here.
  • the steps of the methods or algorithms described in connection with the disclosure of the embodiments of this application can be implemented in hardware or by a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, and software modules can be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from the storage medium and write information to the storage medium.
  • The storage medium may also be an integral part of the processor.
  • The processor and the storage medium may be located in an application-specific integrated circuit (ASIC).
  • The disclosed methods may also be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • The division of modules or units is only a logical function division; there may be other division methods in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • The coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, modules, or units.
  • The indirect coupling or communication connection may be electrical, mechanical, or in other forms.
  • Each functional unit in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
  • The above integrated units may be implemented in the form of hardware or in the form of software functional units.
  • The computer-readable storage medium includes, but is not limited to, various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.


Abstract

本申请提供输入控制方法、电子设备及系统,涉及终端技术领域。本申请响应于用户在第一电子设备显示屏上选择目标设备视图的操作,第一电子设备可直接将鼠标指针切换至目标设备视图对应的目标第二电子设备上显示,从而简化用户操作难度,提升用户使用体验。该方法包括:第一电子设备在检测到用户指示跨设备移动鼠标指针的操作后,可显示与第一电子设备建立通信连接的第二电子设备对应的设备视图。之后,响应于用户在设备视图中选择目标设备视图的操作,第一电子设备可向目标设备视图对应的目标第二电子设备发送显示指示,该显示指示用于指示目标第二电子设备显示第一电子设备的输入设备对应的鼠标指针。

Description

输入控制方法、电子设备及系统
本申请要求于2022年06月30日提交国家知识产权局、申请号为202210760417.2、发明名称为“输入控制方法、电子设备及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及终端技术领域,尤其涉及一种输入控制方法、电子设备及系统。
背景技术
随着终端技术的发展,电子设备之间可以实现多屏协同,通过一个电子设备控制多个电子设备,不仅扩大了显示空间,而且可以将一个电子设备上的内容快速传输到其他电子设备上进行显示。一般的,在多屏协同过程中,多个电子设备可共用一套键鼠。
示例性的,如图1所示,电脑101、笔记本电脑102以及平板103之间建立多屏协同连接,电脑101的键鼠可实现控制电脑101、笔记本电脑102以及平板103。其中,在多屏协同连接建立过程中,电脑101可获取到三台电子设备的位置关系。这样电脑101在检测到用户向右移动鼠标指针至显示屏边缘的操作后,可确定用户指示移动鼠标指针至笔记本电脑102,可将键鼠信息发送至笔记本电脑102,实现在笔记本电脑102的显示屏上显示电脑101可控制的鼠标指针。之后,电脑101检测到用户向右移动鼠标指针至笔记本电脑102的显示屏边缘的操作后,可确定用户指示移动鼠标指针至平板103,可将键鼠信息发送至平板103,实现在平板103的显示屏上显示电脑101可控制的鼠标指针。通过上述过程,可实现将鼠标指针由电脑101的显示屏移动至平板103的显示屏上显示,从而用户可通过电脑101的鼠标控制平板103。
可以看出,多屏协同场景中,在多台电子设备显示屏之间,跨设备移动鼠标指针的过程复杂,可能会经由其他中间设备的显示屏,且随着电子设备数量的增多交互路径的数量会随之增多,用户操作难度较高。
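上述边缘穿越方式的操作步骤随设备数量线性增长,而直接选择设备视图只需一步。下面用一段示意代码粗略对比两种方式的步骤数(设备呈一字排列、函数名均为假设,仅为说明问题,并非本申请的实现):

```python
# 示意:对比“边缘穿越”与“设备视图直选”两种跨设备移动鼠标指针方式的步骤数。
# 假设 N 台设备一字排开,鼠标指针当前位于第 src 台,目标为第 dst 台。

def edge_traversal_hops(src: int, dst: int) -> int:
    """边缘穿越方式:指针需逐屏经过所有中间设备,步骤数随距离线性增长。"""
    return abs(dst - src)

def device_view_hops(src: int, dst: int) -> int:
    """设备视图方式:显示设备视图并点击目标视图,步骤数与设备数量无关。"""
    return 0 if src == dst else 1

if __name__ == "__main__":
    # 图1场景:电脑(0) → 笔记本电脑(1) → 平板(2)
    print(edge_traversal_hops(0, 2))  # 需 2 次边缘穿越
    print(device_view_hops(0, 2))     # 只需 1 次视图选择
```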
发明内容
为了解决上述的技术问题,本申请提供了一种输入控制方法、电子设备及系统。本申请提供的技术方案,响应于用户在第一电子设备显示屏上选择目标设备视图的操作,第一电子设备可直接将鼠标指针切换至目标设备视图对应的目标第二电子设备上显示,从而简化用户操作难度,提升用户使用体验。
为了实现上述的技术目的,本申请实施例提供了如下技术方案:
第一方面,提供一种输入控制方法。该方法包括:第一电子设备接收用户指示跨设备移动鼠标指针的第一操作,显示与第一电子设备建立通信连接的至少一个第二电子设备对应的设备视图。第一电子设备检测到用户选择设备视图中目标设备视图的第二操作,确定目标设备视图对应的目标第二电子设备,至少一个第二电子设备包括该目标第二电子设备。第一电子设备向目标第二电子设备发送显示指示,显示指示用于指示目标第二电子设备显示鼠标指针。
如此,在跨设备移动鼠标指针的过程中,第一电子设备通过显示设备视图的方式,帮助用户实现快速移动鼠标指针至目标第二电子设备。这样,用户不需要通过较长路径跨设备移动鼠标指针,避免用户操作失误导致的鼠标指针移动失败,并且可提升交互效率。
此外,直接通过设备视图跨设备移动鼠标指针,相比于现有技术,鼠标指针移动的复杂度不会由于电子设备数量的增多而增加,提升用户使用体验。
根据第一方面,在第一电子设备接收用户指示跨设备移动鼠标指针的第一操作之前,方法还包括:第一电子设备检测用户选中目标对象的第三操作。
示例性的,目标对象包括文件、窗口、图标、文本、用户选中的文字、文档、图片、视频中的一项或几项。
如此,第一电子设备可实现跨设备传输用户选中的多种类型的目标对象,提升用户使用体验。
根据第一方面,或者以上第一方面的任意一种实现方式,显示指示还用于指示目标第二电子设备显示选中状态下的目标对象;方法还包括:第一电子设备检测到用户移动鼠标指针的第四操作,向目标第二电子设备发送拖拽目标对象的第一信息,第一信息用于目标第二电子设备显示拖拽后的目标对象。
示例性的,第一信息可以包括移动过程中的目标对象的信息(如包括位置信息等)。
如此,在跨设备移动目标对象的过程中,源端设备(如第一电子设备)通过显示设备视图的方式,帮助用户实现快速移动目标对象至目标设备(如目标第二电子设备),并实现在目标设备上继续移动目标对象。这样,用户不需要通过较长路径跨设备移动目标对象,避免用户操作失误导致的目标对象移动失败。并且,提升在多设备之间移动目标对象的交互效率。
此外,直接通过设备视图跨设备移动目标对象,相比于现有技术,目标对象移动的复杂度不会由于电子设备数量的增多而增加,提升用户使用体验。
根据第一方面,或者以上第一方面的任意一种实现方式,方法还包括:第一电子设备检测到用户的拖拽释放操作,向目标第二电子设备发送拖拽完成指示,拖拽完成指示用于指示目标第二电子设备根据显示内容,显示目标对象。
如此,第一电子设备响应于用户操作,能够指示目标第二电子设备停止移动用户选中的目标对象,使得跨设备传输的目标对象能够显示在用户所需位置,满足用户需求,简化用户操作,提升用户使用体验。
根据第一方面,或者以上第一方面的任意一种实现方式,在第一电子设备检测到用户选择设备视图中目标设备视图的第二操作之后,方法还包括:第一电子设备显示第二电子设备的多个虚拟桌面对应的多个虚拟桌面视图。第一电子设备检测到用户选择多个虚拟桌面视图中目标虚拟桌面视图的第五操作,确定目标虚拟桌面视图对应的目标第二电子设备的目标虚拟桌面,多个虚拟桌面包括该目标虚拟桌面。显示指示用于指示目标第二电子设备在目标虚拟桌面上显示鼠标指针。
如此,在多桌面场景的跨设备移动鼠标指针(或鼠标指针和目标对象)的过程中,源端设备(如第一电子设备)也可通过显示设备视图和虚拟桌面视图的方式,帮助用户实现快速移动鼠标指针(或鼠标指针和目标对象)至目标设备(如目标第二电子设备)的目标虚拟桌面,并实现在目标设备的目标虚拟桌面上继续移动鼠标指针(或鼠标指针和目标对象)。这样,用户不需要通过较长路径跨设备移动鼠标指针(或鼠标指针和目标对象),避免用户操作失误导致的鼠标指针(或鼠标指针和目标对象)移动失败。并且,提升在多设备之间移动鼠标指针(或鼠标指针和目标对象)的交互效率。
根据第一方面,或者以上第一方面的任意一种实现方式,第一电子设备检测到用户选择设备视图中目标设备视图的第二操作,确定目标设备视图对应的目标第二电子设备,包括:第一电子设备检测到用户选择设备视图中目标设备视图中显示的目标窗口预览图的操作,确定目标设备视图对应的目标第二电子设备,显示指示用于指示目标第二电子设备在目标窗口预览图对应的第一窗口中显示鼠标指针。
在一些实施例中,目标第二电子设备可运行多个窗口,第一电子设备通过显示多个窗口预览图的方式,可帮助用户直接移动鼠标指针(或鼠标指针和目标对象)至所需的目标第二电子设备的窗口中显示。从而简化用户操作,提高交互效率,提升用户使用体验。
根据第一方面,或者以上第一方面的任意一种实现方式,方法还包括:第一电子设备按照预设方式获取第二电子设备的设备信息。其中,预设方式包括按照预设周期获取设备信息,或者,响应于第一操作,向第二电子设备请求获取设备信息。其中,设备信息用于生成设备视图,设备信息包括如下一项或几项:桌面截图、设备标识、当前显示的界面截图、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片。
如此,第一电子设备通过多种方式获取到第二电子设备的设备信息后,能够通过多种方式显示用于标识不同第二电子设备的设备视图,帮助用户区分不同的第二电子设备。从而进一步简化用户操作,提升跨设备移动鼠标指针(或鼠标指针和目标对象)的准确率。
根据第一方面,或者以上第一方面的任意一种实现方式,方法还包括:第一电子设备隐藏鼠标指针。
在一些实施例中,第一电子设备隐藏鼠标指针,也可以理解为,第一电子设备上的鼠标指针消失、或者第一电子设备上不显示鼠标指针。
如此,第一电子设备在确定完成跨设备移动鼠标指针后,可隐藏本侧的鼠标指针,从而帮助用户确认当前可在目标第二电子设备上操作鼠标指针,避免对用户的使用造成困扰。
根据第一方面,或者以上第一方面的任意一种实现方式,显示指示中携带有鼠标指针的位移信息,位移信息用于指示目标第二电子设备在相应的显示位置显示鼠标指针,显示位置包括如下任一项:对应于第一电子设备隐藏鼠标指针的位置、对应于在第一电子设备的显示屏上第二操作的位置、对应于在目标设备视图上第二操作的位置。
如此,目标第二电子设备在相应的显示位置显示跨设备移动的鼠标指针,便于用户快速在目标第二电子设备的显示屏上查找到鼠标指针,提升用户使用体验。
根据第一方面,或者以上第一方面的任意一种实现方式,第一操作包括如下一项或几项:对第一电子设备的第一按键的操作、将鼠标指针移动到显示屏预设区域的操作、对显示屏上显示的预设图标的操作。
如此,用户可通过多种方式,灵活的触发跨设备移动鼠标指针的流程,提升用户使用体验。并且,多种方式的第一操作,可适用于多种使用场景,提升跨设备移动鼠标指针的适用范围。
第二方面,提供一种输入控制方法。该方法包括:第二电子设备接收第一电子设备发送的显示指示,显示指示为第一电子设备响应于用户在第一电子设备显示的与第一电子设备建立通信连接的至少一个电子设备对应的设备视图中选择目标设备视图的第二操作,向目标设备视图对应的第二电子设备发送的指示,至少一个电子设备包括该第二电子设备。第二电子设备根据显示指示,显示第一电子设备的输入设备对应的鼠标指针。
根据第二方面,方法还包括:第二电子设备根据显示指示,显示选中状态下的目标对象;目标对象为第一电子设备在发送显示指示之前响应于用户的第三操作选中的对象。第二电子设备接收第一电子设备发送的第一信息,第一信息为第一电子设备响应于用户移动鼠标指针的第四操作确定的信息。第二电子设备根据第一信息,显示拖拽后的目标对象。
根据第二方面,或者以上第二方面的任意一种实现方式,方法还包括:第二电子设备接收第一电子设备发送的拖拽完成指示。响应于拖拽完成指示,第二电子设备根据显示内容,显示目标对象。
根据第二方面,或者以上第二方面的任意一种实现方式,目标对象包括文件、窗口、图标、文本、用户选中的文字、文档、图片、视频中的一项或几项。
根据第二方面,或者以上第二方面的任意一种实现方式,方法还包括:第二电子设备确定正在运行的多个窗口。响应于显示指示,第二电子设备显示多个窗口对应的多个预览窗口。第二电子设备接收第一电子设备发送的第二信息;第二信息为第一电子设备在检测到用户移动鼠标指针后确定的信息。第二电子设备根据第二信息,在多个预览窗口中的第一预览窗口对应的第二窗口中显示拖拽状态的目标对象,多个窗口包括该第二窗口。
示例性的,第二信息可以包括移动过程中的目标对象的信息(如包括位置信息等)。
如此,在第二电子设备后台运行多个窗口的情况下,用户也可快速移动鼠标指针(或鼠标指针和目标对象)至所需的第二窗口显示。有效简化用户操作,提高交互效率,提升用户使用体验。
根据第二方面,或者以上第二方面的任意一种实现方式,第二电子设备配置有多个虚拟桌面,显示指示用于指示第二电子设备在多个虚拟桌面中的目标虚拟桌面上显示鼠标指针,目标虚拟桌面为第一电子设备显示多个虚拟桌面对应的多个虚拟桌面视图后,响应于用户的第五操作确定的虚拟桌面。
根据第二方面,或者以上第二方面的任意一种实现方式,方法还包括:第二电子设备按照预设方式向第一电子设备发送第二电子设备的设备信息;其中,预设方式包括按照预设周期发送设备信息,或者,响应于第一电子设备的请求,向第一电子设备发送设备信息;设备信息用于第一电子设备显示第二电子设备对应的设备视图,设备信息包括如下一项或几项:桌面截图、当前显示的界面截图、设备标识、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片。
根据第二方面,或者以上第二方面的任意一种实现方式,显示指示中携带有鼠标指针的位移信息,第二电子设备根据显示指示,显示第一电子设备的输入设备对应的鼠标指针,包括:第二电子设备根据鼠标指针的位移信息,在确定的对应显示位置显示鼠标指针,该对应显示位置包括如下任一项:对应于第一电子设备隐藏鼠标指针的位置、对应于在第一电子设备的显示屏上第二操作的位置、对应于在目标设备视图上第二操作的位置。
第二方面以及第二方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第三方面,提供一种电子设备。该电子设备包括:处理器、显示屏和存储器,存储器和显示屏与处理器耦合,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当处理器从所述存储器中读取计算机指令,使得电子设备执行:第一电子设备接收用户指示跨设备移动鼠标指针的第一操作,显示与第一电子设备建立通信连接的至少一个第二电子设备对应的设备视图。第一电子设备检测到用户选择设备视图中目标设备视图的第二操作,确定目标设备视图对应的目标第二电子设备,至少一个第二电子设备包括目标第二电子设备。第一电子设备向目标第二电子设备发送显示指示,显示指示用于指示目标第二电子设备显示鼠标指针。
根据第三方面,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备检测用户选中目标对象的第三操作。
根据第三方面,或者以上第三方面的任意一种实现方式,显示指示还用于指示目标第二电子设备显示选中状态下的目标对象。当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备检测到用户移动鼠标指针的第四操作,向目标第二电子设备发送拖拽目标对象的第一信息,第一信息用于目标第二电子设备显示拖拽后的目标对象。
根据第三方面,或者以上第三方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备检测到用户的拖拽释放操作,向目标第二电子设备发送拖拽完成指示,拖拽完成指示用于指示目标第二电子设备根据显示内容,显示目标对象。
根据第三方面,或者以上第三方面的任意一种实现方式,目标对象包括文件、窗口、图标、文本、用户选中的文字、文档、图片、视频中的一项或几项。
根据第三方面,或者以上第三方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备显示第二电子设备的多个虚拟桌面对应的多个虚拟桌面视图。第一电子设备检测到用户选择多个虚拟桌面视图中目标虚拟桌面视图的第五操作,确定目标虚拟桌面视图对应的目标第二电子设备的目标虚拟桌面,多个虚拟桌面包括该目标虚拟桌面。显示指示用于指示目标第二电子设备在目标虚拟桌面上显示鼠标指针。
根据第三方面,或者以上第三方面的任意一种实现方式,第一电子设备检测到用户选择设备视图中目标设备视图的第二操作,确定目标设备视图对应的目标第二电子设备,包括:第一电子设备检测到用户选择设备视图中目标设备视图中显示的目标窗口预览图的操作,确定目标设备视图对应的目标第二电子设备,显示指示用于指示目标第二电子设备在目标窗口预览图对应的第一窗口中显示鼠标指针。
根据第三方面,或者以上第三方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备按照预设方式获取第二电子设备的设备信息。其中,预设方式包括按照预设周期获取设备信息,或者,响应于第一操作,向第二电子设备请求获取设备信息。其中,设备信息用于生成设备视图,设备信息包括如下一项或几项:桌面截图、设备标识、当前显示的界面截图、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片。
根据第三方面,或者以上第三方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第一电子设备隐藏鼠标指针。
根据第三方面,或者以上第三方面的任意一种实现方式,显示指示中携带有鼠标指针的位移信息,位移信息用于指示目标第二电子设备在相应的显示位置显示鼠标指针,显示位置包括如下任一项:对应于第一电子设备隐藏鼠标指针的位置、对应于在第一电子设备的显示屏上第二操作的位置、对应于在目标设备视图上第二操作的位置。
根据第三方面,或者以上第三方面的任意一种实现方式,第一操作包括如下一项或几项:对第一电子设备的第一按键的操作、将鼠标指针移动到显示屏预设区域的操作、对显示屏上显示的预设图标的操作。
第三方面以及第三方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第四方面,提供一种电子设备。该电子设备包括:处理器、显示屏和存储器,存储器和显示屏与处理器耦合,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当处理器从所述存储器中读取计算机指令,使得电子设备执行:第二电子设备接收第一电子设备发送的显示指示,显示指示为第一电子设备响应于用户在第一电子设备显示的与第一电子设备建立通信连接的至少一个电子设备对应的设备视图中选择目标设备视图的第二操作,向目标设备视图对应的第二电子设备发送的指示,至少一个电子设备包括该第二电子设备。第二电子设备根据显示指示,显示第一电子设备的输入设备对应的鼠标指针。
根据第四方面,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第二电子设备根据显示指示,显示选中状态下的目标对象;目标对象为第一电子设备在发送显示指示之前响应于用户的第三操作选中的对象。第二电子设备接收第一电子设备发送的第一信息,第一信息为第一电子设备响应于用户移动鼠标指针的第四操作确定的信息。第二电子设备根据第一信息,显示拖拽后的目标对象。
根据第四方面,或者以上第四方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第二电子设备接收第一电子设备发送的拖拽完成指示。响应于拖拽完成指示,第二电子设备根据显示内容,显示目标对象。
根据第四方面,或者以上第四方面的任意一种实现方式,目标对象包括文件、窗口、图标、文本、用户选中的文字、文档、图片、视频中的一项或几项。
根据第四方面,或者以上第四方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第二电子设备确定正在运行的多个窗口。响应于显示指示,第二电子设备显示多个窗口对应的多个预览窗口。第二电子设备接收第一电子设备发送的第二信息;第二信息为第一电子设备在检测到用户移动鼠标指针后确定的信息。第二电子设备根据第二信息,在多个预览窗口中的第一预览窗口对应的第二窗口中显示拖拽状态的目标对象,多个窗口包括该第二窗口。
根据第四方面,或者以上第四方面的任意一种实现方式,第二电子设备配置有多个虚拟桌面,显示指示用于指示第二电子设备在多个虚拟桌面中的目标虚拟桌面上显示鼠标指针,目标虚拟桌面为第一电子设备显示多个虚拟桌面对应的多个虚拟桌面视图后,响应于用户的第五操作确定的虚拟桌面。
根据第四方面,或者以上第四方面的任意一种实现方式,当处理器从存储器中读取计算机可读指令,还使得电子设备执行如下操作:第二电子设备按照预设方式向第一电子设备发送第二电子设备的设备信息;其中,预设方式包括按照预设周期发送设备信息,或者,响应于第一电子设备的请求,向第一电子设备发送设备信息;设备信息用于第一电子设备显示第二电子设备对应的设备视图,设备信息包括如下一项或几项:桌面截图、当前显示的界面截图、设备标识、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片。
根据第四方面,或者以上第四方面的任意一种实现方式,显示指示中携带有鼠标指针的位移信息,第二电子设备根据显示指示,显示第一电子设备的输入设备对应的鼠标指针,包括:第二电子设备根据鼠标指针的位移信息,在确定的对应显示位置显示鼠标指针,该对应显示位置包括如下任一项:对应于第一电子设备隐藏鼠标指针的位置、对应于在第一电子设备的显示屏上第二操作的位置、对应于在目标设备视图上第二操作的位置。
第四方面以及第四方面中任意一种实现方式所对应的技术效果,可参见上述第二方面及第二方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第五方面,本申请实施例提供一种输入控制系统,该系统包括第一电子设备和第二电子设备。其中,第一电子设备,用于接收用户指示跨设备移动鼠标指针的第一操作,显示与第一电子设备建立通信连接的至少一个第二电子设备对应的设备视图。第一电子设备,还用于检测到用户选择设备视图中目标设备视图的第二操作,确定目标设备视图对应的目标第二电子设备,至少一个第二电子设备包括该目标第二电子设备。第一电子设备,还用于向目标第二电子设备发送显示指示。第二电子设备中的目标第二电子设备,用于接收第一电子设备发送的显示指示。目标第二电子设备,还用于根据显示指示,显示鼠标指针。
根据第五方面,第一电子设备,还用于检测用户选中目标对象的第三操作。第二电子设备,还用于显示选中状态下的目标对象。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,还用于检测到用户移动鼠标指针的第四操作,向目标第二电子设备发送拖拽目标对象的第一信息。目标第二电子设备,还用于接收第一电子设备发送的第一信息。目标第二电子设备,还用于根据第一信息显示拖拽后的目标对象。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,还用于检测到用户的拖拽释放操作,向目标第二电子设备发送拖拽完成指示。目标第二电子设备,还用于接收第一电子设备发送的拖拽完成指示。目标第二电子设备,还用于响应于拖拽完成指示,根据目标第二电子设备的显示内容,显示目标对象。
根据第五方面,或者以上第五方面的任意一种实现方式,目标对象包括文件、窗口、图标、文本、用户选中的文字、文档、图片、视频中的一项或几项。
根据第五方面,或者以上第五方面的任意一种实现方式,目标第二电子设备,还用于确定正在运行的多个窗口。目标第二电子设备,还用于响应于显示指示,显示多个窗口对应的多个预览窗口。第一电子设备,还用于检测到用户移动鼠标指针至多个预览窗口中的第一预览窗口的操作,确定第二信息。第一电子设备,还用于向目标第二电子设备发送第二信息。目标第二电子设备,还用于根据第二信息,在多个预览窗口中的第一预览窗口对应的第二窗口中显示拖拽状态的目标对象,多个窗口包括该第二窗口。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,还用于显示第二电子设备的多个虚拟桌面对应的多个虚拟桌面视图。第一电子设备,还用于检测到用户选择多个虚拟桌面视图中目标虚拟桌面视图的第五操作,确定目标虚拟桌面视图对应的目标第二电子设备的目标虚拟桌面,多个虚拟桌面包括该目标虚拟桌面。显示指示用于指示目标第二电子设备在目标虚拟桌面上显示鼠标指针。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,具体用于检测到用户选择设备视图中目标设备视图中显示的目标窗口预览图的操作,确定目标设备视图对应的目标第二电子设备,显示指示用于指示目标第二电子设备在目标窗口预览图对应的第一窗口中显示鼠标指针。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,还用于按照预设方式获取第二电子设备的设备信息。其中,预设方式包括按照预设周期获取设备信息,或者,响应于第一操作,向第二电子设备请求获取设备信息。其中,设备信息用于生成设备视图,设备信息包括如下一项或几项:桌面截图、设备标识、当前显示的界面截图、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片。
根据第五方面,或者以上第五方面的任意一种实现方式,第一电子设备,还用于隐藏鼠标指针。
根据第五方面,或者以上第五方面的任意一种实现方式,显示指示中携带有鼠标指针的位移信息。目标第二电子设备,具体用于根据鼠标指针的位移信息,在确定的对应显示位置显示鼠标指针,显示位置包括如下任一项:对应于第一电子设备隐藏鼠标指针的位置、对应于在第一电子设备的显示屏上第二操作的位置、对应于在目标设备视图上第二操作的位置。
根据第五方面,或者以上第五方面的任意一种实现方式,第一操作包括如下一项或几项:对第一电子设备的第一按键的操作、将鼠标指针移动到显示屏预设区域的操作、对显示屏上显示的预设图标的操作。
第五方面以及第五方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第六方面,本申请实施例提供一种电子设备,该电子设备具有实现如上述第一方面及其中任一种可能的实现方式中所述的输入控制方法的功能;或者,该电子设备具有实现如上述第二方面及其中任一种可能的实现方式中所述的输入控制方法的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。该硬件或软件包括一个或多个与上述功能相对应的模块。
第六方面以及第六方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第七方面,提供一种计算机可读存储介质。计算机可读存储介质存储有计算机程序(也可称为指令或代码),当该计算机程序被电子设备执行时,使得电子设备执行第一方面或第一方面中任意一种实施方式的方法;或者,使得电子设备执行第二方面或第二方面中任意一种实施方式的方法。
第七方面以及第七方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第八方面,本申请实施例提供一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行第一方面或第一方面中任意一种实施方式的方法;或者,使得电子设备执行第二方面或第二方面中任意一种实施方式的方法。
第八方面以及第八方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第九方面,本申请实施例提供一种电路系统,电路系统包括处理电路,处理电路被配置为执行第一方面或第一方面中任意一种实施方式的方法;或者,处理电路被配置为执行第二方面或第二方面中任意一种实施方式的方法。
第九方面以及第九方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第十方面,本申请实施例提供一种芯片系统,包括至少一个处理器和至少一个接口电路,至少一个接口电路用于执行收发功能,并将指令发送给至少一个处理器,当至少一个处理器执行指令时,至少一个处理器执行第一方面或第一方面中任意一种实施方式的方法;或者,至少一个处理器执行第二方面或第二方面中任意一种实施方式的方法。
第十方面以及第十方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
附图说明
图1为本申请实施例提供的跨设备移动鼠标的场景示意图;
图2为本申请实施例提供的一种输入控制方法应用的通信系统的示意图;
图3为本申请实施例提供的第一电子设备或第二电子设备的硬件结构示意图;
图4为本申请实施例提供的输入控制方法流程示意图一;
图5为本申请实施例提供的界面示意图一;
图6A为本申请实施例提供的界面示意图二;
图6B为本申请实施例提供的界面示意图三;
图6C为本申请实施例提供的界面示意图四;
图7A为本申请实施例提供的界面示意图五;
图7B为本申请实施例提供的界面示意图六;
图8为本申请实施例提供的界面示意图七;
图9为本申请实施例提供的输入控制方法流程示意图二;
图10为本申请实施例提供的界面示意图八;
图11为本申请实施例提供的界面示意图九;
图12为本申请实施例提供的界面示意图十;
图13为本申请实施例提供的界面示意图十一;
图14为本申请实施例提供的输入控制方法流程示意图三;
图15为本申请实施例提供的界面示意图十二;
图16为本申请实施例提供的第一电子设备的结构示意图;
图17为本申请实施例提供的第二电子设备的结构示意图。
具体实施方式
下面结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个或两个以上(包含两个)。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。
在本申请实施例中,“示例性地”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性地”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性地”或者“例如”等词旨在以具体方式呈现相关概念。
图2为本申请实施例提供的输入控制方法应用的通信系统的示意图。如图2中所示,该通信系统包括第一电子设备100和第二电子设备200。其中,第二电子设备200的数量为一个或多个,如图2所示的第二电子设备1、第二电子设备2等。
在一些实施例中,第一电子设备100可响应于用户操作将鼠标指针或文件移动到第二电子设备200,可实现通过第一电子设备100的输入设备控制第二电子设备200。此外,第二电子设备200也可响应于用户操作将鼠标指针或文件移动到第一电子设备100,可实现通过第二电子设备200的输入设备控制第一电子设备100。进一步的,多个第二电子设备200中的任一第二电子设备200也可响应于用户操作将鼠标指针或文件移动到其他第二电子设备200,可实现通过该第二电子设备200的输入设备控制其他第二电子设备200,如第二电子设备1可响应于用户操作将鼠标指针或文件移动到第二电子设备2。
可选的,第一电子设备100或第二电子设备200例如可以为手机、平板电脑、笔记本电脑、大屏设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、可穿戴设备、人工智能(artificial intelligence,AI)设备、车机等终端设备。本申请对第一电子设备100或第二电子设备200的具体类型、所安装的操作系统均不作限制。
可选的,第一电子设备100或第二电子设备200的输入设备可以为鼠标,触摸板或触摸屏等。
在一些实施例中,第一电子设备100和第二电子设备200之间建立有通信连接,该通信连接可以为有线连接(如通过USB连接等),也可以为无线通信连接。其中,建立无线通信连接的无线通信技术包括但不限于以下的至少一种:蓝牙(bluetooth,BT)(例如,传统蓝牙或者低功耗(bluetooth low energy,BLE)蓝牙),无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),近距离无线通信(near field communication,NFC),紫蜂(Zigbee),调频(frequency modulation,FM),红外(infrared,IR),超宽带(ultra wide band,UWB)技术等。
可选的,第一电子设备100和第二电子设备200也可以通过局域网中的第三方设备建立通信连接,第三方设备例如是路由器、网关、智能设备控制器、服务器、无线访问接入点(access point,AP)设备等。
比如,第一电子设备100和第二电子设备200都支持靠近发现功能。示例性地,第一电子设备100靠近第二电子设备200后,第一电子设备100和第二电子设备200能够互相发现对方,之后建立诸如Wi-Fi端到端(peer to peer,P2P)连接和/或蓝牙连接等无线通信连接。之后,用户能够使用第一电子设备100的输入设备操作第二电子设备200。
又比如,第一电子设备100和第二电子设备200通过局域网,建立无线通信连接。比如,第一电子设备100和第二电子设备200都连接至同一路由器。
再比如,第一电子设备100和第二电子设备200通过蜂窝网络、因特网等,建立无线通信连接。如第二电子设备200通过路由器接入因特网,第一电子设备100通过蜂窝网络接入因特网;进而,第一电子设备100和第二电子设备200建立无线通信连接。
示例性的,图3示出了第一电子设备100或第二电子设备200的一种结构示意图。
第一电子设备100或第二电子设备200可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,传感器模块180,按键190,马达191,输入模块192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
可以理解的是,本申请实施例示意的结构并不构成对第一电子设备100或第二电子设备200的具体限定。在本申请另一些实施例中,第一电子设备100或第二电子设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
比如,在第一电子设备100为PC的情况下,第一电子设备100可以不包括移动通信模块150和SIM卡接口195。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器,使处理器110与触摸传感器通过I2C总线接口通信,实现第一电子设备100或第二电子设备200的触摸功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现第一电子设备100或第二电子设备200的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现第一电子设备100或第二电子设备200的显示功能。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为第一电子设备100或第二电子设备200充电,也可以用于第一电子设备100或第二电子设备200与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对第一电子设备100或第二电子设备200的结构限定。在本申请另一些实施例中,第一电子设备100或第二电子设备200也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过第一电子设备100或第二电子设备200的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
第一电子设备100或第二电子设备200的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。第一电子设备100或第二电子设备200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在第一电子设备100或第二电子设备200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在第一电子设备100或第二电子设备200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,第一电子设备100或第二电子设备200的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得第一电子设备100或第二电子设备200可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
第一电子设备100或第二电子设备200通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light-emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini-led,Micro-led,Micro-oled,量子点发光二极管(quantum dot light emitting diodes,QLED)等生产制造。在一些实施例中,第一电子设备100或第二电子设备200可以包括1个或N个显示屏194,N为大于1的正整数。
输入模块192可以包括鼠标、键盘、可用于实现键盘鼠标功能的触摸板或触摸屏等。在一些实施例中,第一电子设备100通过输入模块192检测到用户操作,确定用户指示将鼠标指针或选中的文件发送至其他电子设备,可通过显示屏194显示一个或多个与第一电子设备100建立连接的电子设备的设备视图。其中,设备视图用于指示相应的电子设备。响应于用户通过输入模块192选择设备视图的操作,确定将鼠标指针或选中的文件发送到对应的第二电子设备200。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,第一电子设备100或第二电子设备200可以包括1个或N个摄像头193,N为大于1的正整数。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展第一电子设备100或第二电子设备200的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储第一电子设备100或第二电子设备200使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行第一电子设备100或第二电子设备200的各种功能应用以及数据处理。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。第一电子设备100或第二电子设备200可以通过音频模块170,以及扬声器,受话器,麦克风,耳机接口和应用处理器等实现音频功能,例如音乐播放,录音等。
传感器模块180可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
触摸传感器,也称“触控器件”。触摸传感器可以设置于显示屏194,由触摸传感器与显示屏194组成触摸屏,也称“触控屏”。触摸传感器用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器也可以设置于第一电子设备100或第二电子设备200的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。第一电子设备100或第二电子设备200可以接收按键输入,产生与第一电子设备100或第二电子设备200的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和第一电子设备100或第二电子设备200的接触和分离。第一电子设备100或第二电子设备200可以支持1个或N个SIM卡接口,N为大于1的正整数。
以下以第一电子设备100为第一电子设备,第二电子设备200为第二电子设备为例,对本申请实施例提供的输入控制方法进行说明。
示例性的,图4为本申请实施例提供的一种输入控制方法的流程示意图。如图4所示,该方法包括如下步骤。
S401、第一电子设备和第二电子设备建立通信连接。
在一些实施例中,第一电子设备可通过预设方式与第二电子设备建立通信连接。可选的,预设方式例如包括蓝牙、Wi-Fi、NFC、紫蜂(Zigbee)、红外、UWB等中的一种或多种方式。
比如,第一电子设备和第二电子设备接入同一Wi-Fi局域网络后,第一电子设备可响应于用户操作,搜索到位于当前Wi-Fi局域网络中的第二电子设备,并建立通信连接。
可选的,第二电子设备的数量为一个或多个。
S402、第一电子设备获取第二电子设备的设备信息。
在一些实施例中,第一电子设备与第二电子设备建立通信连接后,可按照预设同步方式获取到第二电子设备的设备信息。其中,该设备信息用于生成电子设备对应的设备视图,可用于区分不同的第二电子设备。
可选的,设备信息例如包括电子设备的桌面截图、设备标识(identity,ID)、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片等中的任一项或几项。
比如,第一电子设备按照预设周期获取第二电子设备的桌面截图。这样后续在步骤S403中,第一电子设备可根据获取到的桌面截图显示电子设备的设备视图,从而用户可根据设备视图的显示内容,区分并确定各个设备视图对应的电子设备。
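步骤S402中“按照预设周期获取设备信息”的缓存逻辑,可用如下示意代码粗略表达(设备信息字段、接口名均为假设,真实实现还需处理通信连接与异常):

```python
class DeviceInfoCache:
    """示意:第一电子设备按预设周期缓存各第二电子设备的设备信息。"""

    def __init__(self, fetch, period_s: float = 5.0):
        self.fetch = fetch          # fetch(device_id) -> dict,假设的远端获取接口
        self.period_s = period_s    # 预设周期,假设值
        self.cache = {}             # device_id -> (上次获取时间戳, 设备信息)

    def get(self, device_id: str, now: float) -> dict:
        entry = self.cache.get(device_id)
        if entry is None or now - entry[0] >= self.period_s:
            # 超过预设周期,重新向第二电子设备请求设备信息(如桌面截图、设备昵称等)
            self.cache[device_id] = (now, self.fetch(device_id))
        return self.cache[device_id][1]

if __name__ == "__main__":
    calls = []
    def fake_fetch(dev):
        calls.append(dev)
        return {"id": dev, "nickname": "我的平板", "screenshot": b"..."}
    c = DeviceInfoCache(fake_fetch, period_s=5.0)
    c.get("pad-1", now=0.0)
    c.get("pad-1", now=2.0)   # 周期内,命中缓存
    c.get("pad-1", now=6.0)   # 超过周期,重新获取
    print(len(calls))         # 2
```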
S403、第一电子设备检测到用户指示移动鼠标指针至其他电子设备的操作后,显示已建立通信连接的电子设备的设备视图。
在一些实施例中,用户指示移动鼠标指针至其他电子设备的操作例如包括对特殊按键(如鼠标中键、键盘组合键等)的操作、将鼠标指针移动到显示屏预设区域(如显示屏右上角,显示屏边缘区域等)的操作、对显示屏上显示的预设图标的操作等。第一电子设备检测到用户指示移动鼠标指针至其他电子设备的操作后,可显示已经与第一电子设备建立通信连接的至少一个电子设备的设备视图。
示例性的,如图5所示,第一电子设备100检测到用户移动鼠标指针至显示屏右上角的操作,确定用户指示移动鼠标指针至其他电子设备。那么,第一电子设备100可显示已经建立连接的电子设备的设备视图。
需要说明的是,后续实施例中所描述的“上”,“下”,“左”和“右”均参考图5所示的方位,后续不再赘述。
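步骤S403中“将鼠标指针移动到显示屏预设区域(如右上角)”这一第一操作的判定,可用如下示意代码表达(区域大小为假设值):

```python
# 示意:检测鼠标指针是否进入显示屏右上角的预设区域,作为“指示跨设备移动”的第一操作。
# 屏幕坐标系假设以左上角为原点,x 向右增大,y 向下增大。
CORNER_SIZE = 16  # 预设区域边长(像素),假设值

def is_cross_device_trigger(x: int, y: int, screen_w: int, screen_h: int) -> bool:
    """指针位于右上角 CORNER_SIZE×CORNER_SIZE 区域内时,视为用户指示跨设备移动鼠标指针。"""
    return x >= screen_w - CORNER_SIZE and y <= CORNER_SIZE

if __name__ == "__main__":
    print(is_cross_device_trigger(1910, 5, 1920, 1080))   # True:位于右上角预设区域
    print(is_cross_device_trigger(960, 540, 1920, 1080))  # False:位于屏幕中央
```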
在一些实施例中,第一电子设备可在检测到用户指示移动鼠标指针至其他电子设备的操作后,再执行上述步骤S402,获取第二电子设备的设备信息。之后,第一电子设备再根据获取到的第二电子设备的设备信息,显示第二电子设备的设备视图。即本申请实施例不限制步骤S402和步骤S403的执行顺序。
比如,第一电子设备检测到用户指示移动鼠标指针至其他电子设备的操作后,获取已经建立通信连接的电子设备的桌面截图。之后,第一电子设备根据获取到的桌面截图,显示设备视图。这样,可保证第一电子设备显示的设备视图对应于第二电子设备最新的桌面截图,保证用户确定第二电子设备的准确率。
可选的,设备视图的布局方式例如包括平面布局、线性布局、概念布局等多种方式。
示例性的,如图6A所示,响应于用户操作,第一电子设备100按照平面布局的布局方式,根据与第一电子设备建立通信连接的各个电子设备之间的相对位置关系,显示各个电子设备对应的设备视图。其中,显示的设备视图中可以包括第一电子设备的设备视图。如附图标记61指示的设备视图为第一电子设备的设备视图,第一电子设备可根据与其建立通信连接的各个电子设备与第一电子设备的位置关系,在附图标记61指示的设备视图周围显示各个电子设备的设备视图。
又示例性的,如图6B所示,响应于用户操作,第一电子设备100按照线性布局的布局方式,根据各个电子设备与第一电子设备建立通信连接的时间先后顺序,或通过第一电子设备的输入设备(如鼠标、触摸板等)控制各个电子设备的频率,或通过第一电子设备的输入设备最近一次控制其他各个电子设备的时间顺序,或其他优先级顺序,显示各个电子设备对应的设备视图。比如,如图6B所示,第一电子设备按照从左往右优先级依次降低的顺序,根据建立连接的时间先后顺序或使用频率或最近控制时间,显示各个电子设备的设备视图。如附图标记62指示的设备视图对应的电子设备为最先与第一电子设备建立通信连接的电子设备、或第一电子设备的输入设备控制频率最高的电子设备、或第一电子设备的输入设备最近一次控制的电子设备。
再示例性的,如图6C所示,响应于用户操作,第一电子设备100按照概念布局的布局方式,显示各个电子设备对应的设备视图。比如,附图标记63指示的中心设备视图为第一电子设备的设备视图,在其周围按照电子设备的产品特性显示各个电子设备的设备视图,如越靠近附图标记63指示的中心设备视图的设备视图对应的电子设备的显示屏越大等。
在一些实施例中,第一电子设备可根据在上述步骤S402获取到的设备信息,生成对应的设备视图,设备视图用于帮助用户区分不同的电子设备。比如,设备视图的显示内容例如包括桌面截图、当前显示的界面截图、设备ID、设备名称、用户编辑的设备昵称、系统账号、用于标识电子设备的图片等中的任一项或几项。
示例性的,如图7A所示,第一电子设备100获取到的设备信息包括电子设备的桌面截图。那么,响应于用户指示移动鼠标指针至其他电子设备的操作,第一电子设备100显示包括电子设备桌面截图的设备视图。可选的,电子设备桌面截图包括用户设置的电子设备桌面背景、桌面图标、桌面开启窗口等中的至少一项信息,并且,第一电子设备可按照预设周期获取到最新周期内电子设备的桌面截图。因此,用户可根据电子设备桌面截图,区分不同的电子设备,从而确定需要将鼠标指针移动到的电子设备。
又示例性的,如图7B所示,第一电子设备100获取到的设备信息包括电子设备的用户编辑的设备昵称。那么,响应于用户指示移动鼠标指针至其他电子设备的操作,第一电子设备100显示包括用户编辑的设备昵称的设备视图。那么,用户可根据自己设置的设备昵称,区分设备视图对应的不同的电子设备,从而确定需要将鼠标指针移动到的电子设备。
S404、第一电子设备检测到用户选择目标设备视图的操作,确定目标设备视图对应的第二电子设备。
在一些实施例中,用户选择目标设备视图的操作例如包括用户通过鼠标点击目标设备视图的操作、用户移动鼠标指针至目标设备视图并停留超过预设时间的操作等。这样,用户通过在第一电子设备显示的设备视图中选择目标设备视图的操作,可实现快速在与第一电子设备建立通信连接的至少一个电子设备中选择所需的第二电子设备。
可选的,第一电子设备检测到用户选择目标设备视图的操作后,可突出显示该设备视图,便于用户确认选择的电子设备是否正确。可选的,第一电子设备响应于用户选择目标设备视图的操作,确定目标设备视图对应的第二电子设备后,可向第二电子设备发送提示指示。相应的,第二电子设备接收到该提示指示后,也可通过在显示屏边缘一圈显示提示条、音频模块提醒、显示提示信息等多种状态变化提示方式,提示用户确认是否将鼠标指针移动到该第二电子设备上显示。
示例性的,如图8中(a)所示,第一电子设备100检测到用户移动鼠标指针81至显示屏右上角区域的操作后,显示包括桌面截图的设备视图,如附图标记82指示的设备视图。如图8中(b)所示,第一电子设备100检测到用户移动鼠标指针81至附图标记82指示的设备视图显示位置并点击鼠标的操作后,可确定用户选择移动鼠标指针81至附图标记82指示的设备视图对应的第二电子设备200。第一电子设备100通过附图标记82指示的设备视图的状态变化提示用户确定选择,以及第二电子设备200响应于第一电子设备100发送的提示指示,在显示屏边缘一圈显示提示条提示用户确定选择。
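步骤S404中“移动鼠标指针至目标设备视图并停留超过预设时间”的选择方式,可用如下示意代码表达(停留时长阈值、事件结构均为假设):

```python
# 示意:根据指针悬停事件序列,判定用户通过“停留超过预设时间”选中的目标设备视图。
DWELL_MS = 500  # 预设停留时长(毫秒),假设值

def pick_target_view(hover_events, dwell_ms: int = DWELL_MS):
    """hover_events: [(时间戳ms, 视图id或None)] 列表。
    返回首个被连续悬停超过 dwell_ms 的视图id;没有则返回 None。"""
    start_ts, cur_view = None, None
    for ts, view in hover_events:
        if view != cur_view:
            start_ts, cur_view = ts, view   # 指针移入新视图,重新计时
        elif view is not None and ts - start_ts >= dwell_ms:
            return view                     # 连续悬停超过阈值,判定选中
    return None

if __name__ == "__main__":
    events = [(0, "pad"), (200, "pad"), (400, "laptop"), (700, "laptop"), (1000, "laptop")]
    print(pick_target_view(events))  # laptop:已连续悬停 600ms
```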
S405、第一电子设备隐藏鼠标指针。
在一些实施例中,第一电子设备确定用户指示移动鼠标指针至第二电子设备后,可隐藏本侧鼠标指针。第一电子设备隐藏鼠标指针,也可以理解为,第一电子设备上的鼠标指针消失、或者第一电子设备上不显示鼠标指针。
S406、第一电子设备向第二电子设备发送鼠标信息。
在一些实施例中,鼠标信息例如包括鼠标位移信息等信息。第一电子设备响应于用户指示移动鼠标指针至第二电子设备,确定当前鼠标信息,并将鼠标信息发送至第二电子设备。相应的第二电子设备接收第一电子设备发送的鼠标信息。
需要说明的是,本申请实施例不限制步骤S405和步骤S406的执行顺序。比如,第一电子设备可在隐藏鼠标指针后,再向第二电子设备发送鼠标信息(即第一电子设备先执行步骤S405,再执行步骤S406)。又比如,第一电子设备可在向第二电子设备发送鼠标信息后,再隐藏鼠标指针(即第一电子设备先执行步骤S406后,再执行步骤S405)。再比如,第一电子设备在隐藏鼠标指针的过程中,向第二电子设备发送鼠标信息(即第一电子设备同时执行步骤S405和步骤S406)。
S407、第二电子设备根据鼠标信息,显示第一电子设备的输入设备对应的鼠标指针。
在一些实施例中,第二电子设备接收到鼠标信息后,可根据鼠标信息显示第一电子设备的输入设备对应的鼠标指针,从而实现将第一电子设备的鼠标指针移动至第二电子设备上显示,用户可通过第一电子设备的键鼠控制第二电子设备。
可选的,鼠标信息包括鼠标位移信息。第二电子设备根据鼠标位移信息,可在第一电子设备隐藏鼠标指针之前第一电子设备显示鼠标指针的相应位置(即鼠标指针在第一电子设备消失的位置的对应位置),对应显示第一电子设备的输入设备对应的鼠标指针,或者,第二电子设备根据鼠标位移信息,可在用户点击第二电子设备的设备视图(如对应于桌面截图)中的相应位置,对应显示第一电子设备的输入设备对应的鼠标指针。或者,第二电子设备在预设的固定位置显示第一电子设备的输入设备对应的鼠标指针。从而便于用户在第二电子设备显示屏上确定鼠标指针显示位置,提升用户使用体验。
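步骤S407中,第二电子设备根据鼠标位移信息确定显示位置。以“对应于用户在目标设备视图(桌面截图缩略图)上点击的位置”这一方式为例,坐标可按缩略图与目标屏幕的比例换算,示意如下(函数名、参数均为假设):

```python
# 示意:将用户在目标设备视图(缩略图)内的点击坐标,按比例映射为目标第二电子设备
# 屏幕上鼠标指针的显示位置。

def map_click_to_target(click_x, click_y, view_rect, target_w, target_h):
    """view_rect = (left, top, width, height):目标设备视图在第一电子设备屏幕上的区域。
    返回映射到目标设备分辨率 (target_w, target_h) 下的坐标。"""
    left, top, w, h = view_rect
    fx = (click_x - left) / w   # 点击位置在视图内的横向比例
    fy = (click_y - top) / h    # 点击位置在视图内的纵向比例
    return round(fx * target_w), round(fy * target_h)

if __name__ == "__main__":
    # 点击 320×180 缩略图的中心,映射到 2560×1600 的平板屏幕
    print(map_click_to_target(1160, 290, (1000, 200, 320, 180), 2560, 1600))  # (1280, 800)
```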
示例性的,如图8中(b)所示,第一电子设备100检测到用户通过鼠标指针81点击附图标记82指示的设备视图的操作后,如图8中(c)所示,第一电子设备100隐藏鼠标指针,并向第二电子设备200发送鼠标信息。第二电子设备200根据接收到的鼠标信息,按照上述任一种方式显示鼠标指针81。从而实现将第一电子设备100的输入设备控制的鼠标指针81移动至第二电子设备200上显示。
如此,在跨设备移动鼠标指针的过程中,源端设备(如第一电子设备)通过显示设备视图的方式,帮助用户实现快速移动鼠标指针至目标设备(如第二电子设备)。这样,用户不需要通过较长路径跨设备移动鼠标指针,避免用户操作失误导致的鼠标指针移动失败,并且可提升交互效率。
此外,直接通过设备视图跨设备移动鼠标指针,相比于现有技术,鼠标指针移动的复杂度不会由于电子设备数量的增多而增加,提升用户使用体验。
以上实施例详细介绍了鼠标指针跨设备移动过程,在输入控制过程中,第一电子设备还可响应于用户操作,将第一电子设备中的文件或窗口移动到第二电子设备。如下对文件或窗口的移动过程进行详细介绍。
示例性的,图9为本申请实施例提供的又一种输入控制方法的流程示意图。如图9所示,该方法包括如下步骤。
S901、第一电子设备和第二电子设备建立通信连接。
S902、第一电子设备获取第二电子设备的设备信息。
可选的,步骤S901和步骤S902的内容,可参考上述步骤S401和步骤S402所述的相关内容,在此不再赘述。
S903、第一电子设备检测到用户选中文件或窗口的操作后,检测到用户指示移动鼠标指针至其他电子设备的操作,显示已建立通信连接的电子设备的设备视图。
在一些实施例中,用户在使用第一电子设备的过程中,可通过第一电子设备的输入设备选中第一电子设备上显示的目标对象,该目标对象如包括上述用户选中的文件或窗口。其中,文件例如包括第一电子设备中各种类型的文件,例如图标、文本、用户选中的文字、文档、图片、视频等。窗口例如包括第一电子设备显示的窗口。需要说明的是,下文中以跨设备移动用户选中的文件或窗口的过程为例,对跨设备移动用户选中的目标对象的过程进行详细介绍,对此下文不再说明。
在一些实施例中,用户指示移动鼠标指针至其他电子设备的操作例如包括用户在选中文件或窗口后,晃动鼠标的操作、对特殊按键(如鼠标中键、键盘组合键等)的操作、将鼠标指针移动到显示屏预设区域(如显示屏右上角,显示屏边缘区域等)的操作、对显示屏上显示的预设图标的操作等。
示例性的,如图10中(a)所示,第一电子设备100检测到用户通过鼠标指针101长按图片102的操作(如用户长按鼠标左键的操作)后,确定用户选中图片102。之后,第一电子设备100检测到用户在通过鼠标指针101长按图片102的过程中,晃动鼠标的操作,确定用户指示移动鼠标指针101和图片102至其他电子设备。如图10中(b)所示,第一电子设备100显示与其已经建立通信连接的各个电子设备的设备视图。
在一些实施例中,设备视图的布局方式例如包括平面布局、线性布局、概念布局等多种方式。
可选的,步骤S903的其他内容,可参考上述步骤S403的相关内容,在此不再赘述。需要说明的是,本申请实施例同样不限制步骤S902和步骤S903的执行顺序。比如,第一电子设备可在检测到用户指示移动鼠标指针至其他电子设备的操作后,再获取与其建立通信连接的电子设备的设备信息,以显示设备视图。
S904、第一电子设备检测到用户将选中的文件或窗口移动到目标设备视图显示位置的操作后,确定目标设备视图对应的第二电子设备。
在一些实施例中,第一电子设备检测到用户将选中的文件或窗口移动到目标设备视图显示位置的操作后,可确定用于指示将选中的文件或窗口移动到目标设备视图对应的第二电子设备显示。
示例性的,如图10中(b)所示,第一电子设备100检测到用户通过鼠标指针101移动图片102至附图标记103指示的目标设备视图显示位置的操作,可突出显示附图标记103指示的目标设备视图,以提示用户确认选择的电子设备是否正确。可选的,第一电子设备100还可向附图标记103指示的目标设备视图对应的第二电子设备200发送提示指示,用于指示第二电子设备200按照预设方式提示用户选择的电子设备是否正确,该预设方式例如包括在显示屏边缘一圈显示提示条、音频模块提醒、显示提示信息等。
可选的,第一电子设备100确定鼠标指针101在附图标记103指示的目标设备视图的显示位置上保持选中状态停止超过预设时间,可确定用户选择移动图片102至附图标记103指示的目标设备视图对应的第二电子设备200上显示。
在一些实施例中,第一电子设备在移动选中的文件或窗口至目标设备视图的过程中,可按照预设方式变化显示文件或窗口对应的视图。比如,第一电子设备在移动选中的文件或窗口至目标设备视图的过程中,缩小文件或窗口对应的视图,以适应目标设备视图的大小。
示例性的,如图10中(b)所示,响应于用户操作,第一电子设备100缩小显示图片102,以适应附图标记103指示的目标设备视图的大小。
S905、第一电子设备隐藏鼠标指针。
在一些实施例中,第一电子设备确定用户指示移动鼠标指针、文件或窗口至第二电子设备后,可隐藏本侧鼠标指针(如鼠标指针消失,或不显示鼠标指针)。从而产生鼠标指针跨设备穿越的显示效果,帮助用户确定鼠标指针已移动出第一电子设备。
S906、第一电子设备向第二电子设备发送鼠标信息、文件或窗口的信息。
在一些实施例中,第一电子设备在确定目标视图对应的第二电子设备后,可将被用户选中的文件或窗口的信息以及鼠标信息,发送到第二电子设备。这样,实现将鼠标指针、用户选中的文件或窗口穿越到第二电子设备上显示。
可选的,文件或窗口对应的信息用于显示文件或窗口对应的视图。
需要说明的是,如上述步骤S405和步骤S406所述的相关内容,本申请实施例同样不限制步骤S905和步骤S906的执行顺序。
S907、第二电子设备显示被第一电子设备的输入设备对应的鼠标指针拖拽的文件或窗口。
S908、第一电子设备向第二电子设备发送拖拽文件或窗口的信息。
S909、第二电子设备显示被第一电子设备的输入设备对应的鼠标指针拖拽的文件或窗口。
在一些实施例中,在步骤S907-步骤S909中,第二电子设备根据获取到的鼠标信息、文件或窗口的信息,显示保持拖拽状态的文件或窗口的视图,从而用户可继续通过第一电子设备的鼠标将文件或窗口拖拽到所需位置。其中,第一电子设备检测到鼠标拖拽文件或窗口移动的操作后,将移动过程中的拖拽文件或窗口的信息(如包括位置信息等)发送到第二电子设备,从而第二电子设备可响应于用户操作,在第二电子设备的显示屏上显示移动的文件或窗口。
示例性的,如图10中(c)所示,第一电子设备100将鼠标信息、文件或窗口的信息发送到第二电子设备200,并隐藏本侧鼠标指针。相应的,第二电子设备200根据鼠标信息、文件或窗口的信息,可显示处于被鼠标指针101拖拽状态的图片104。之后,响应于用户移动第一电子设备100的鼠标拖拽图片的操作,第一电子设备100在拖拽过程中向第二电子设备200发送拖拽文件或窗口的信息。如图10中(d)所示,第二电子设备200根据拖拽文件或窗口的信息,移动图片104的显示位置。
可选的,步骤S908-步骤S909为循环步骤,第一电子设备可响应于用户移动鼠标的操作,多次向第二电子设备发送拖拽文件或窗口的信息的操作,这样第二电子设备可响应于用户操作,显示移动过程中的文件或窗口。
可选的,步骤S908-步骤S909为可选步骤,用户可能在将文件或窗口移动至第二电子设备后,不再移动该文件或窗口,而是直接执行拖拽释放操作,结束此次跨设备移动文件或窗口的操作(即在步骤S907后,直接执行下述步骤S910)。那么,第二电子设备直接解除文件或窗口的拖拽状态,在相应的位置显示文件或窗口(具体过程详见下述步骤)。
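步骤S908-步骤S909的循环可理解为:第一电子设备把拖拽过程中的位置信息逐条发送给目标第二电子设备,目标侧据此更新目标对象的显示位置。如下为一段示意代码(消息结构为假设):

```python
# 示意:第一电子设备(源端)发送拖拽位置消息,目标第二电子设备据此更新目标对象位置。
import json

def make_drag_message(obj_id: str, x: int, y: int) -> str:
    """源端:把拖拽过程中目标对象的位置信息编码为一条消息。"""
    return json.dumps({"type": "drag_move", "object": obj_id, "pos": [x, y]})

class TargetRenderer:
    """目标第二电子设备侧:根据收到的拖拽信息更新目标对象的显示位置。"""
    def __init__(self):
        self.positions = {}  # 对象id -> 当前显示位置

    def on_message(self, raw: str):
        msg = json.loads(raw)
        if msg["type"] == "drag_move":
            self.positions[msg["object"]] = tuple(msg["pos"])

if __name__ == "__main__":
    renderer = TargetRenderer()
    for x, y in [(100, 100), (160, 120), (220, 150)]:  # 鼠标拖拽轨迹
        renderer.on_message(make_drag_message("图片102", x, y))
    print(renderer.positions["图片102"])  # (220, 150)
```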
S910、第一电子设备检测到用户的拖拽释放操作。
S911、第一电子设备向第二电子设备发送拖拽完成指示。
在一些实施例中,在步骤S910和步骤S911中,第一电子设备检测到用户的拖拽释放操作(如用户松开鼠标的操作等操作)后,可确定用户已经将文件或窗口移动到所需位置。那么,第一电子设备可向第二电子设备发送拖拽完成指示,用于指示第二电子设备停止移动显示文件或窗口,在当前显示位置,显示移动后的文件或窗口。
可选的,拖拽完成指示中可携带拖拽文件或窗口的内容信息。比如,在拖拽文件或窗口的过程中,第二电子设备可根据文件或窗口的缩略图信息显示文件或窗口的视图。之后,在确定已完成文件或窗口的拖拽后,后续用户可能需要编辑文件或窗口,那么第二电子设备需要获取到文件或窗口的详细内容。
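“拖拽过程中仅传输缩略图信息、拖拽完成指示才携带完整内容”的做法,可用如下示意代码表达(字段名均为假设):

```python
# 示意:拖拽过程中只发送缩略图,拖拽释放时的“拖拽完成指示”才携带文件完整内容,
# 以减少拖拽过程中的传输量。

def make_drag_payload(file: dict, dropped: bool) -> dict:
    if not dropped:
        # 拖拽过程中:仅缩略图信息,供目标侧显示拖拽状态的视图
        return {"type": "drag_move", "name": file["name"], "thumb": file["thumb"]}
    # 拖拽释放:拖拽完成指示,携带完整内容,供目标侧显示/编辑
    return {"type": "drag_done", "name": file["name"], "content": file["content"]}

if __name__ == "__main__":
    f = {"name": "图片102.png", "thumb": b"tiny", "content": b"full-bytes"}
    print(make_drag_payload(f, dropped=False)["type"])       # drag_move
    print("content" in make_drag_payload(f, dropped=True))   # True
```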
S912、第二电子设备根据显示屏显示内容,显示文件或窗口。
在一些实施例中,第二电子设备在获取到拖拽完成指示后,可根据显示屏的显示内容,显示拖拽后的文件或窗口。
比如,第二电子设备显示屏显示桌面,第二电子设备可在桌面相应位置显示拖拽后的文件或窗口。
示例性的,如图10所示场景,在第二电子设备200显示桌面的过程中,用户将第一电子设备100中的图片拖拽到第二电子设备200上显示。如图10中(d)所示,第二电子设备200接收到第一电子设备100发送的拖拽完成指示后,在桌面上的相应位置显示拖拽后的图片104。可选的,第一电子设备100将图片102移动到第二电子设备200显示后,在第一电子设备100上的原显示位置仍可保持显示图片102。也可以理解为,第一电子设备100上的图片102通过拖拽操作复制到第二电子设备200。
又示例性的,如图11中(a)所示,第一电子设备100检测到用户通过鼠标指针111选中窗口112的操作后,又触发指示移动鼠标指针至其他电子设备的操作(如晃动选中窗口112的操作),可显示如图11中(b)所示的已建立通信连接的电子设备的设备视图。第一电子设备100检测到用户移动窗口112至附图标记113指示的目标设备视图显示位置的操作,可确定用户指示将窗口112发送至附图标记113指示的目标设备视图对应的第二电子设备中显示。那么,如图11中(c)所示,第一电子设备100向第二电子设备200发送鼠标指针111和窗口112的信息,并隐藏鼠标指针111。
相应的,第二电子设备200根据接收到的鼠标信息和窗口信息,显示鼠标指针111和窗口114。之后,响应于用户移动第一电子设备100鼠标的操作,第一电子设备100向第二电子设备200发送拖拽窗口的信息,第二电子设备200相应的移动窗口114的显示位置。如图11中(d)所示,第一电子设备100检测到用户的拖拽释放操作,可向第二电子设备200发送携带有窗口内容信息的拖拽完成指示。第二电子设备200接收到拖拽完成指示后,确定当前显示桌面,可在桌面相应位置显示窗口114。
又比如,第二电子设备显示屏显示窗口,可在该窗口的相应光标位置插入拖拽的文件。或者,该窗口不支持插入拖入的文件,可提示用户文件拖拽失败;或将该窗口最小化,将拖拽的文件在桌面上显示。或者,第二电子设备显示窗口,可在该窗口上覆盖显示由第一电子设备拖拽来的窗口。
Illustratively, as shown in (a) of FIG. 12, after the first electronic device 100 detects that the user selects the picture 122 with the mouse pointer 121 and then triggers an operation indicating that the mouse pointer should move to another electronic device (for example, shaking the selected picture 122), it may display the device views of the electronic devices with which a communication connection has been established, as shown in (b) of FIG. 12. When the first electronic device 100 detects that the user moves the picture 122 to the display position of the target device view indicated by reference numeral 123, it can determine that the user intends to send the picture 122 to the second electronic device corresponding to that target device view. Optionally, as shown in (b) of FIG. 12, the target device view indicated by reference numeral 123 may display a preview of the window currently shown on the second electronic device 200. When the first electronic device detects that the user moves the picture 122 into that preview window, it can determine that the user intends to send the picture 122 to the corresponding window on the second electronic device. Then, as shown in (c) of FIG. 12, the first electronic device 100 sends information about the mouse pointer 121 and the picture 122 to the second electronic device 200 and hides the mouse pointer 121. Optionally, the first electronic device 100 keeps displaying the picture 122 at its original position.
Correspondingly, the second electronic device 200 displays the mouse pointer 121 and the picture 124 according to the received mouse information and picture information. As shown in (c) of FIG. 12, the mouse pointer 121 and the picture 124 are displayed in the window currently shown on the second electronic device 200. Subsequently, in response to the user moving the mouse of the first electronic device 100, the first electronic device 100 sends picture-drag information to the second electronic device 200, which moves the display position of the picture 124 accordingly. As shown in (d) of FIG. 12, when the first electronic device 100 detects the user's drag-release operation, it may send a drag-completion indication carrying the picture content to the second electronic device 200. After receiving the indication, the second electronic device 200 determines from the currently displayed window that the window allows inserting the picture 124 at the cursor position, and inserts and displays the picture 124 there according to the picture information.
In some scenarios, the second electronic device may run multiple windows (including the desktop or windows corresponding to applications). In that case, after detecting the user's operation of moving a file to the second electronic device, it may display a preview bar of the multiple windows and then, in response to a user operation, determine into which window the user intends to move the file. Alternatively, after detecting the user's operation of moving the mouse pointer to the second electronic device, it may display the preview bar of the multiple windows and then, in response to a user operation, determine into which window the user intends to move the mouse pointer.
Illustratively, as shown in (a) of FIG. 13, in response to the user's operation of moving the picture 132 to another electronic device with the mouse pointer 131, the first electronic device 100 displays the device-view interface shown in (b) of FIG. 13. When the first electronic device 100 detects that the user moves the picture 132 to the target device view indicated by reference numeral 133, it sends the mouse information and picture information to the second electronic device 200 corresponding to that target device view and hides the mouse pointer 131.
Correspondingly, after receiving the mouse information and picture information, the second electronic device 200 determines, as shown by the taskbar 134, that multiple windows are currently running in the background. Then, as shown in (c) of FIG. 13, the second electronic device 200 displays preview windows corresponding to the running windows (including, for example, preview windows for the desktop or applications). Afterwards, when the second electronic device 200 receives the picture-drag information sent by the first electronic device 100 and determines that the user is moving the picture 135 to the preview window 136, it can determine that the user intends to insert the picture 135 into the window corresponding to the preview window 136. Then, as shown in (d) of FIG. 13, the second electronic device 200 may display the window 137 corresponding to the preview window 136 and show the picture 135, still in its drag state, in the window 137, so that the user can move the picture 135 to a suitable position in the window 137. Subsequently, in response to the user's drag-release operation, the first electronic device 100 sends a drag-completion indication to the second electronic device 200, and the second electronic device 200 inserts and displays the picture 135 at the corresponding position in the window 137 according to the indication.
In this way, while a file or window is moved across devices, the source device (for example, the first electronic device) displays device views to help the user quickly move the file or window to the target device (for example, the second electronic device) and to continue moving it on the target device. The user therefore does not need a long cross-device path to move files or windows, which avoids failed moves caused by operating mistakes and improves the efficiency of moving files or windows between multiple devices.
Moreover, because files or windows are moved across devices directly through the device views, the complexity of the move does not increase with the number of electronic devices, unlike the prior art, which improves the user experience.
The above embodiments describe the process of moving a file or window across devices in detail. In some scenarios, one second electronic device may correspond to multiple virtual desktops (for example, the second electronic device is connected to multiple display screens, or one of its display screens can switch among multiple virtual desktops). In that case, the first electronic device needs to determine on which desktop of which second electronic device the user intends the mouse pointer, window, or file to be displayed. The input control process in the multi-desktop scenario is described in detail below.
Illustratively, FIG. 14 is a schematic flowchart of yet another input control method provided by an embodiment of this application. As shown in FIG. 14, the method includes the following steps.
S1401. The first electronic device and the second electronic device establish a communication connection.
S1402. The first electronic device obtains the device information of the second electronic device.
S1403. After detecting the user's operation of selecting a file or window, the first electronic device detects the user's operation indicating that the mouse pointer should move to another electronic device, and displays the device views of the electronic devices with which a communication connection has been established.
S1404. The first electronic device detects the user's operation of moving the selected file or window to the display position of a target device view, and determines the second electronic device corresponding to the target device view.
Optionally, for steps S1401-S1404, refer to the related content of steps S901-S904 above; details are not repeated here.
It should be noted that this embodiment of this application likewise does not limit the execution order of steps S1402 and S1403. For example, the first electronic device may obtain the device information of the connected electronic devices only after detecting the user's operation indicating that the mouse pointer should move to another electronic device, in order to display the device views.
S1405. The first electronic device determines that the second electronic device is configured with multiple virtual desktops, and displays multiple virtual-desktop views.
In some embodiments, after establishing a communication connection with the second electronic device, the first electronic device may obtain the virtual-desktop configuration of the second electronic device, for example that it is configured with multiple virtual desktops. The first electronic device may then, according to that configuration, create a device view corresponding to the second electronic device and multiple virtual-desktop views corresponding to that device view, where each virtual-desktop view corresponds to one virtual desktop of the second electronic device.
Optionally, in step S1402 above, the device information of the second electronic device obtained by the first electronic device may include information about its virtual desktops, such as any one or more of a virtual-desktop screenshot, a screenshot of the interface currently displayed on the virtual desktop, a virtual-desktop ID, a virtual-desktop name, a user-edited virtual-desktop nickname, and a picture identifying the virtual desktop. The virtual-desktop views that the first electronic device displays based on this information can thus help the user distinguish the different virtual desktops of the second electronic device.
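As a rough sketch of how the device information described above could carry the per-desktop identifiers and be turned into the views of step S1405 (the structure and names are hypothetical; the patent only lists the kinds of information):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VirtualDesktopInfo:
    # any one or more of the identifiers listed above may be present
    desktop_id: str
    name: str = ""
    screenshot: bytes = b""

@dataclass
class DeviceInfo:
    device_id: str
    desktops: List[VirtualDesktopInfo] = field(default_factory=list)

def build_desktop_views(info: DeviceInfo) -> List[Tuple[str, str]]:
    # one view per virtual desktop, labelled so the user can tell
    # the target device's desktops apart (step S1405)
    return [(d.desktop_id, d.name or d.desktop_id) for d in info.desktops]
```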
Illustratively, as shown in (a) of FIG. 15, in response to the user's operation of moving the picture 152 to another electronic device with the mouse pointer 151, the first electronic device 100 displays the device-view interface shown in (b) of FIG. 15. When the first electronic device 100 detects that the user moves the picture 152 to the target device view indicated by reference numeral 153, it determines the second electronic device 200 corresponding to that target device view and determines that the second electronic device 200 is configured with multiple virtual desktops. Then, as shown in (c) of FIG. 15, the first electronic device 100 displays the multiple virtual-desktop views corresponding to the target device view indicated by reference numeral 153, and the user can use them to decide on which virtual desktop of the second electronic device 200 the picture 152 should be displayed.
Optionally, after detecting that the user has kept the picture 152, via the mouse pointer 151, over the target device view indicated by reference numeral 153 for longer than a preset time, the first electronic device 100 may display the multiple virtual-desktop views corresponding to that target device view, as shown in (c) of FIG. 15.
Optionally, in the scenario shown in (c) of FIG. 15, when the first electronic device 100 detects that the user keeps the picture 152, via the mouse pointer 151, at a position outside the display positions of the multiple virtual-desktop views for longer than a preset time, it can determine that the user selected the wrong target device view. The first electronic device 100 may then return to the device-view interface shown in (b) of FIG. 15 and receive the user's operation of reselecting a target device view.
Optionally, an electronic device corresponding to a device view displayed by the first electronic device may be configured with only one virtual desktop (for example, the device has only one display screen, or only one virtual desktop has been created on its display screen). In that case, after detecting the user's operation of moving a file or window to that device view, the first electronic device may send the file or window information and the mouse information directly to the corresponding electronic device, without expanding the device view into virtual-desktop views.
S1406. After detecting the user's operation of moving the file or window to a target virtual-desktop view, the first electronic device determines the target virtual desktop corresponding to the target virtual-desktop view.
In some embodiments, after detecting the user's operation of moving the file or window to the target virtual-desktop view, the first electronic device can determine that the user intends the file or window to be displayed on the virtual desktop of the second electronic device corresponding to that view.
Illustratively, as shown in (c) of FIG. 15, after detecting that the user moves the picture 152 to the target virtual-desktop view indicated by reference numeral 154, the first electronic device 100 may highlight that view to prompt the user to confirm whether the selected virtual desktop is correct. Optionally, the first electronic device 100 may also send a prompt indication to the second electronic device 200 corresponding to the target virtual-desktop view indicated by reference numeral 154, instructing it to prompt the user in a preset way to confirm whether the selected virtual desktop is correct; the preset way includes, for example, displaying a prompt strip around the edge of the display screen, switching to the target virtual desktop, a reminder via the audio module, or displaying prompt information.
Optionally, in the scenario shown in FIG. 15, the second electronic device 200 can switch among multiple virtual desktops on the same display screen. After receiving a prompt indication carrying the identifier of the target virtual desktop, as shown in (c) of FIG. 15, it may switch to the target virtual desktop and display a prompt strip around the edge of the screen, asking the user to confirm whether the picture should be moved to that target virtual desktop for display.
Optionally, as shown in (c) of FIG. 15, when the first electronic device 100 determines that the mouse pointer 151 has remained, in the selected state, at the display position of the target virtual-desktop view indicated by reference numeral 154 for longer than a preset time, it can determine that the user intends to move the picture 152 to the target virtual desktop of the second electronic device 200 corresponding to that view.
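The two dwell-time behaviours above (dwelling on a virtual-desktop view confirms it; dwelling outside the views falls back to reselecting the device view) can be sketched as a single decision function. The 800 ms threshold is an illustrative value of my own; the patent only says "preset time":

```python
def select_target(hover_region: str, hover_ms: int, threshold_ms: int = 800) -> str:
    """Hypothetical dwell-based selection for step S1406.

    `hover_region` is "desktop_view" while the pointer sits on a
    virtual-desktop view, and anything else while it sits outside them.
    """
    if hover_ms <= threshold_ms:
        # still within the preset time: no decision yet
        return "waiting"
    if hover_region == "desktop_view":
        # pointer dwelled on a target virtual-desktop view: confirm it
        return "confirmed"
    # pointer dwelled outside the views: the device view was mischosen,
    # so return to the device-view interface for reselection
    return "back_to_device_views"
```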
In some embodiments, while moving the selected file or window to the target device view or target virtual-desktop view, the first electronic device may change the displayed view of the file or window in a preset way. For example, it may shrink the view of the file or window during the move to fit the size of the target device view or target virtual-desktop view.
Illustratively, as shown in (b) of FIG. 15, in response to a user operation, the first electronic device 100 shrinks the displayed picture 152 to fit the size of the target device view indicated by reference numeral 153.
As another example, as shown in (c) of FIG. 15, in response to a user operation, the first electronic device 100 shrinks the displayed picture 152 to fit the size of the virtual-desktop view indicated by reference numeral 154.
S1407. The first electronic device hides the mouse pointer.
In some embodiments, after determining that the user intends to move the mouse pointer, file, or window to the target virtual desktop of the second electronic device, the first electronic device may hide its own mouse pointer (for example, the pointer disappears or is not displayed). This creates the visual effect of the pointer crossing over to another device and helps the user see that the pointer has left the first electronic device.
S1408. The first electronic device sends the mouse information, the file or window information, and the target virtual-desktop information to the second electronic device.
In some embodiments, after determining the target virtual desktop of the second electronic device corresponding to the target virtual-desktop view, the first electronic device may send the information of the file or window selected by the user, the mouse information, and the target virtual-desktop information to the second electronic device. The second electronic device can determine the user-selected target virtual desktop from the target virtual-desktop information. In this way, the mouse pointer and the selected file or window cross over to the target virtual desktop of the second electronic device for display.
Optionally, the information corresponding to the file or window is used to display the view of the file or window.
It should be noted that, as described for steps S405 and S406 above, this embodiment of this application likewise does not limit the execution order of steps S1407 and S1408.
S1409. The second electronic device displays, on the target virtual desktop, the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device.
S1410. The first electronic device sends the drag information of the file or window to the second electronic device.
S1411. The second electronic device displays, on the target virtual desktop, the file or window dragged by the mouse pointer corresponding to the input device of the first electronic device.
In some embodiments, in steps S1409-S1411, the second electronic device displays, on the target virtual desktop, the view of the file or window in its drag state according to the received mouse information, file or window information, and target virtual-desktop information, so that the user can continue to drag the file or window to the desired position with the mouse of the first electronic device. After detecting the mouse operation of dragging the file or window, the first electronic device sends the in-progress drag information (including, for example, position information) to the second electronic device, which can then, in response to the user's operations, display the moving file or window on its target virtual desktop.
Illustratively, as shown in (c) of FIG. 15, the first electronic device 100 sends the mouse information, the file or window information, and the target virtual-desktop information to the second electronic device 200 and hides its own mouse pointer. Correspondingly, based on the mouse information, the file or window information, and the target virtual-desktop information, as shown in (d) of FIG. 15, the second electronic device 200 can display, on the target virtual desktop, the picture 155 in the state of being dragged by the mouse pointer 151. Subsequently, in response to the user dragging the picture with the mouse of the first electronic device 100, the first electronic device 100 sends the drag information to the second electronic device 200 during the drag, and the second electronic device 200 moves the display position of the picture 155 accordingly.
Optionally, steps S1410-S1411 are looped: the first electronic device may, in response to the user's mouse-movement operations, send the drag information of the file or window to the second electronic device multiple times, so that the second electronic device can display the file or window as it moves in response to the user's operations.
Optionally, steps S1410-S1411 are optional. After moving the file or window to the second electronic device, the user may stop moving it and directly release the drag, ending the cross-device move (that is, after step S1409, step S1412 below is executed directly). In that case, the second electronic device simply releases the drag state of the file or window and displays it at the corresponding position (see the steps below for details).
In some embodiments, in the scenario of transferring the mouse pointer across devices shown in FIG. 4 above, after determining the second electronic device corresponding to the target device view in step S404, the first electronic device may also determine the number of virtual desktops configured on the second electronic device. If the second electronic device is configured with multiple virtual desktops, then, as in step S1405 above, the first electronic device may also display the multiple virtual-desktop views corresponding to the second electronic device to receive the user's operation of selecting a target virtual-desktop view. The first electronic device may then, according to the user's operation, send the mouse pointer to the target virtual desktop of the second electronic device for display.
It should be noted that, for the specific implementation of transferring the mouse pointer across devices to the target virtual desktop for display, refer to steps S1405-S1409 above; details are not repeated here.
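The forwarding loop of steps S1410-S1411, followed by the completion indication of step S1413, can be sketched as below. `send` stands in for whatever transport connects the two devices; the message shapes are hypothetical:

```python
from typing import Callable, Dict, Iterable, Tuple

def forward_drag(events: Iterable[Tuple[int, int]],
                 send: Callable[[Dict], None]) -> int:
    """Forward each mouse delta detected on the first device so the second
    device can move the dragged view on its target virtual desktop
    (steps S1410-S1411 repeat while the user keeps dragging)."""
    count = 0
    for dx, dy in events:
        send({"type": "drag_move", "dx": dx, "dy": dy})
        count += 1
    # step S1413: after the user releases the drag, send the completion
    # indication so the target stops moving the view
    send({"type": "drag_complete"})
    return count
```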
S1412. The first electronic device detects the user's drag-release operation.
S1413. The first electronic device sends a drag-completion indication to the second electronic device.
In some embodiments, in steps S1412 and S1413, after detecting the user's drag-release operation (for example, releasing the mouse button), the first electronic device can determine that the user has moved the file or window to the desired position on the target virtual desktop. It may then send a drag-completion indication to the second electronic device, instructing it to stop moving the file or window and to display the moved file or window at its current display position on the target virtual desktop.
Optionally, the drag-completion indication may carry the content of the dragged file or window. For example, while the file or window is being dragged, the second electronic device may render its view from thumbnail information; once the drag is determined to be complete, the user may subsequently want to edit the file or window, so the second electronic device needs to obtain its full content.
S1414. The second electronic device displays the file or window on the target virtual desktop according to the content shown on the target virtual desktop.
In some embodiments, after receiving the drag-completion indication, the second electronic device may display the dragged file or window according to the content shown on the target virtual desktop.
For example, if the target virtual desktop of the second electronic device shows the desktop, the dragged file or window may be displayed at the corresponding position on the desktop.
As another example, if the target virtual desktop shows a window, the dragged file may be inserted at the corresponding cursor position in that window. Alternatively, if the window does not support inserting the dropped file, the user may be notified that the drag failed, or the window may be minimized and the dragged file displayed on the desktop. Alternatively, if the target virtual desktop shows a window, the window dragged over from the first electronic device may be displayed on top of it.
As yet another example, the target virtual desktop may run multiple windows. As described above with reference to FIG. 13, the second electronic device may display a preview bar of the multiple windows on the target virtual desktop and then, in response to a user operation, insert the file moved over from the first electronic device into the window selected by the user on the target virtual desktop.
In this way, in the multi-desktop scenario of moving a file or window across devices, the source device (for example, the first electronic device) can likewise display device views and virtual-desktop views to help the user quickly move the file or window to the target virtual desktop of the target device (for example, the second electronic device) and continue moving it on that target virtual desktop. The user therefore does not need a long cross-device path to move files or windows, which avoids failed moves caused by operating mistakes and improves the efficiency of moving files or windows between multiple devices.
The input control methods provided by the embodiments of this application have been described in detail above with reference to FIG. 4 to FIG. 15. The electronic devices provided by the embodiments of this application are described in detail below with reference to FIG. 16 and FIG. 17.
In one possible design, FIG. 16 is a schematic structural diagram of the first electronic device provided by an embodiment of this application. As shown in FIG. 16, the first electronic device 1600 may include a transceiver unit 1601, a processing unit 1602, and a display unit 1603. The first electronic device 1600 may be used to implement the functions of the first electronic device involved in the above method embodiments.
Optionally, the transceiver unit 1601 is configured to support the first electronic device 1600 in performing S401, S402, S403, S404, and S406 in FIG. 4; and/or S901, S902, S903, S904, S906, S908, S910, and S911 in FIG. 9; and/or S1401, S1402, S1403, S1404, S1406, S1408, S1410, S1412, and S1413 in FIG. 14.
Optionally, the processing unit 1602 is configured to support the first electronic device 1600 in performing S404 and S405 in FIG. 4; and/or S904 and S905 in FIG. 9; and/or S1404, S1405, S1406, and S1407 in FIG. 14.
Optionally, the display unit 1603 is configured to support the first electronic device 1600 in performing S403 and S405 in FIG. 4; and/or S903 and S905 in FIG. 9; and/or S1403, S1405, and S1407 in FIG. 14.
The transceiver unit may include a receiving unit and a sending unit, may be implemented by a transceiver or transceiver-related circuit components, and may be a transceiver or a transceiver module. The operations and/or functions of the units in the first electronic device 1600 are intended to implement the corresponding flows of the input control methods described in the above method embodiments; all related content of the steps involved in the above method embodiments can be cited in the functional descriptions of the corresponding functional units, and for brevity is not repeated here.
Optionally, the first electronic device 1600 shown in FIG. 16 may further include a storage unit (not shown in FIG. 16) that stores programs or instructions. When the transceiver unit 1601, the processing unit 1602, and the display unit 1603 execute the programs or instructions, the first electronic device 1600 shown in FIG. 16 can perform the input control methods described in the above method embodiments.
For the technical effects of the first electronic device 1600 shown in FIG. 16, refer to the technical effects of the input control methods described in the above method embodiments; details are not repeated here.
In addition to the form of the first electronic device 1600, the technical solution provided by this application may also be a functional unit or chip in the first electronic device, or an apparatus used in combination with the first electronic device.
In one possible design, FIG. 17 is a schematic structural diagram of the second electronic device provided by an embodiment of this application. As shown in FIG. 17, the second electronic device 1700 may include a transceiver unit 1701, a processing unit 1702, and a display unit 1703. The second electronic device 1700 may be used to implement the functions of the second electronic device involved in the above method embodiments.
Optionally, the transceiver unit 1701 is configured to support the second electronic device 1700 in performing S401, S402, and S406 in FIG. 4; and/or S901, S902, S906, S908, and S911 in FIG. 9; and/or S1401, S1402, S1408, S1410, and S1413 in FIG. 14.
Optionally, the processing unit 1702 is configured to support the second electronic device 1700 in performing S407 in FIG. 4; and/or S907, S909, and S912 in FIG. 9; and/or S1409, S1411, and S1414 in FIG. 14.
Optionally, the display unit 1703 is configured to support the second electronic device 1700 in performing S407 in FIG. 4; and/or S907, S909, and S912 in FIG. 9; and/or S1409, S1411, and S1414 in FIG. 14.
The transceiver unit may include a receiving unit and a sending unit, may be implemented by a transceiver or transceiver-related circuit components, and may be a transceiver or a transceiver module. The operations and/or functions of the units in the second electronic device 1700 are intended to implement the corresponding flows of the input control methods described in the above method embodiments; all related content of the steps involved in the above method embodiments can be cited in the functional descriptions of the corresponding functional units, and for brevity is not repeated here.
Optionally, the second electronic device 1700 shown in FIG. 17 may further include a storage unit (not shown in FIG. 17) that stores programs or instructions. When the transceiver unit 1701, the processing unit 1702, and the display unit 1703 execute the programs or instructions, the second electronic device 1700 shown in FIG. 17 can perform the input control methods described in the above method embodiments.
For the technical effects of the second electronic device 1700 shown in FIG. 17, refer to the technical effects of the input control methods described in the above method embodiments; details are not repeated here.
In addition to the form of the second electronic device 1700, the technical solution provided by this application may also be a functional unit or chip in the second electronic device, or an apparatus used in combination with the second electronic device.
An embodiment of this application further provides a chip system, including a processor coupled to a memory. The memory is configured to store programs or instructions which, when executed by the processor, cause the chip system to implement the method in any of the above method embodiments.
Optionally, there may be one or more processors in the chip system. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in the memory.
Optionally, there may also be one or more memories in the chip system. The memory may be integrated with the processor or arranged separately from the processor, which is not limited in the embodiments of this application. Illustratively, the memory may be a non-transitory memory, such as a read-only memory (ROM), and may be integrated on the same chip as the processor or arranged on separate chips; the embodiments of this application do not specifically limit the type of memory or the arrangement of the memory and the processor.
Illustratively, the chip system may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the steps in the above method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in software form. The method steps disclosed in the embodiments of this application may be directly executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
An embodiment of this application further provides a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to perform the above related steps to implement the input control methods in the above embodiments.
An embodiment of this application further provides a computer program product which, when run on a computer, causes the computer to perform the above related steps to implement the input control methods in the above embodiments.
In addition, an embodiment of this application further provides an apparatus. The apparatus may specifically be a component or module, and may include one or more connected processors and a memory. The memory is configured to store a computer program which, when executed by the one or more processors, causes the apparatus to perform the input control methods in the above method embodiments.
The apparatus, computer-readable storage medium, computer program product, and chip provided by the embodiments of this application are all used to perform the corresponding methods provided above. For the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above; details are not repeated here.
The steps of the methods or algorithms described in connection with the disclosure of the embodiments of this application may be implemented in hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be a component of the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC).
From the description of the above implementations, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the systems, apparatuses, and units described above, refer to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed methods may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into modules or units is only a logical functional division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, modules, or units, and may be electrical, mechanical, or in other forms.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The computer-readable storage medium includes, but is not limited to, any of the following: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
The above are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (34)

  1. An input control method, wherein the method comprises:
    a first electronic device receiving a first operation by which a user instructs cross-device movement of a mouse pointer, and displaying device views corresponding to at least one second electronic device that has established a communication connection with the first electronic device;
    the first electronic device detecting a second operation by which the user selects a target device view among the device views, and determining a target second electronic device corresponding to the target device view, wherein the at least one second electronic device comprises the target second electronic device; and
    the first electronic device sending a display indication to the target second electronic device, wherein the display indication is used to instruct the target second electronic device to display the mouse pointer.
  2. The method according to claim 1, wherein before the first electronic device receives the first operation by which the user instructs cross-device movement of the mouse pointer, the method further comprises:
    the first electronic device detecting a third operation by which the user selects a target object.
  3. The method according to claim 2, wherein the display indication is further used to instruct the target second electronic device to display the target object in a selected state; and the method further comprises:
    the first electronic device detecting a fourth operation by which the user moves the mouse pointer, and sending first information for dragging the target object to the target second electronic device, wherein the first information is used by the target second electronic device to display the dragged target object.
  4. The method according to claim 2 or 3, wherein the method further comprises:
    the first electronic device detecting a drag-release operation by the user, and sending a drag-completion indication to the target second electronic device, wherein the drag-completion indication is used to instruct the target second electronic device to display the target object according to displayed content.
  5. The method according to any one of claims 2-4, wherein the target object comprises one or more of a file, a window, an icon, text, characters selected by the user, a document, a picture, and a video.
  6. The method according to any one of claims 1-5, wherein after the first electronic device detects the second operation by which the user selects the target device view among the device views, the method further comprises:
    the first electronic device displaying multiple virtual-desktop views corresponding to multiple virtual desktops of the second electronic device; and
    the first electronic device detecting a fifth operation by which the user selects a target virtual-desktop view among the multiple virtual-desktop views, and determining a target virtual desktop of the target second electronic device corresponding to the target virtual-desktop view, wherein the multiple virtual desktops comprise the target virtual desktop; and the display indication is used to instruct the target second electronic device to display the mouse pointer on the target virtual desktop.
  7. The method according to any one of claims 1-6, wherein the first electronic device detecting the second operation by which the user selects the target device view among the device views and determining the target second electronic device corresponding to the target device view comprises:
    the first electronic device detecting an operation by which the user selects a target window preview displayed in the target device view among the device views, and determining the target second electronic device corresponding to the target device view, wherein the display indication is used to instruct the target second electronic device to display the mouse pointer in a first window corresponding to the target window preview.
  8. The method according to any one of claims 1-7, wherein the method further comprises:
    the first electronic device obtaining device information of the second electronic device in a preset manner;
    wherein the preset manner comprises obtaining the device information at a preset period, or requesting the device information from the second electronic device in response to the first operation;
    wherein the device information is used to generate the device views, and the device information comprises one or more of the following: a desktop screenshot, a device identifier, a screenshot of the currently displayed interface, a device name, a user-edited device nickname, a system account, and a picture identifying the electronic device.
  9. The method according to any one of claims 1-8, wherein the method further comprises:
    the first electronic device hiding the mouse pointer.
  10. The method according to any one of claims 1-9, wherein the display indication carries displacement information of the mouse pointer, the displacement information is used to instruct the target second electronic device to display the mouse pointer at a corresponding display position, and the display position comprises any one of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, and a position corresponding to the second operation on the target device view.
  11. The method according to any one of claims 1-10, wherein the first operation comprises one or more of the following: an operation on a first button of the first electronic device, an operation of moving the mouse pointer to a preset area of the display screen, and an operation on a preset icon displayed on the display screen.
  12. An input control method, wherein the method comprises:
    a second electronic device receiving a display indication sent by a first electronic device, wherein the display indication is an indication sent by the first electronic device to the second electronic device corresponding to a target device view, in response to a second operation by which a user selects the target device view among device views, displayed on the first electronic device, corresponding to at least one electronic device that has established a communication connection with the first electronic device, and the at least one electronic device comprises the second electronic device; and
    the second electronic device displaying, according to the display indication, a mouse pointer corresponding to an input device of the first electronic device.
  13. The method according to claim 12, wherein the method further comprises:
    the second electronic device displaying, according to the display indication, a target object in a selected state, wherein the target object is an object selected by the first electronic device in response to a third operation by the user before sending the display indication;
    the second electronic device receiving first information sent by the first electronic device, wherein the first information is information determined by the first electronic device in response to a fourth operation by which the user moves the mouse pointer; and
    the second electronic device displaying the dragged target object according to the first information.
  14. The method according to claim 13, wherein the method further comprises:
    the second electronic device receiving a drag-completion indication sent by the first electronic device; and
    in response to the drag-completion indication, the second electronic device displaying the target object according to displayed content.
  15. The method according to claim 13 or 14, wherein the target object comprises one or more of a file, a window, an icon, text, characters selected by the user, a document, a picture, and a video.
  16. The method according to any one of claims 12-15, wherein the method further comprises:
    the second electronic device determining multiple running windows;
    in response to the display indication, the second electronic device displaying multiple preview windows corresponding to the multiple windows;
    the second electronic device receiving second information sent by the first electronic device, wherein the second information is information determined by the first electronic device after detecting that the user moves the mouse pointer; and
    the second electronic device displaying, according to the second information, the target object in a drag state in a second window corresponding to a first preview window among the multiple preview windows, wherein the multiple windows comprise the second window.
  17. The method according to any one of claims 12-16, wherein the second electronic device is configured with multiple virtual desktops, the display indication is used to instruct the second electronic device to display the mouse pointer on a target virtual desktop among the multiple virtual desktops, and the target virtual desktop is a virtual desktop determined, in response to a fifth operation by the user, after the first electronic device displays multiple virtual-desktop views corresponding to the multiple virtual desktops.
  18. The method according to any one of claims 12-17, wherein the method further comprises:
    the second electronic device sending device information of the second electronic device to the first electronic device in a preset manner;
    wherein the preset manner comprises sending the device information at a preset period, or sending the device information to the first electronic device in response to a request from the first electronic device;
    wherein the device information is used by the first electronic device to display the device view corresponding to the second electronic device, and the device information comprises one or more of the following: a desktop screenshot, a screenshot of the currently displayed interface, a device identifier, a device name, a user-edited device nickname, a system account, and a picture identifying the electronic device.
  19. The method according to any one of claims 12-18, wherein the display indication carries displacement information of the mouse pointer, and the second electronic device displaying, according to the display indication, the mouse pointer corresponding to the input device of the first electronic device comprises:
    the second electronic device displaying, according to the displacement information of the mouse pointer, the mouse pointer at a determined corresponding display position, wherein the corresponding display position comprises any one of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, and a position corresponding to the second operation on the target device view.
  20. An input control system, wherein the system comprises a first electronic device and a second electronic device;
    the first electronic device is configured to receive a first operation by which a user instructs cross-device movement of a mouse pointer, and display device views corresponding to at least one second electronic device that has established a communication connection with the first electronic device;
    the first electronic device is further configured to detect a second operation by which the user selects a target device view among the device views, and determine a target second electronic device corresponding to the target device view, wherein the at least one second electronic device comprises the target second electronic device;
    the first electronic device is further configured to send a display indication to the target second electronic device;
    the target second electronic device among the second electronic devices is configured to receive the display indication sent by the first electronic device; and
    the target second electronic device is further configured to display the mouse pointer according to the display indication.
  21. The system according to claim 20, wherein
    the first electronic device is further configured to detect a third operation by which the user selects a target object; and
    the second electronic device is further configured to display the target object in a selected state.
  22. The system according to claim 21, wherein
    the first electronic device is further configured to detect a fourth operation by which the user moves the mouse pointer, and send first information for dragging the target object to the target second electronic device;
    the target second electronic device is further configured to receive the first information sent by the first electronic device; and
    the target second electronic device is further configured to display the dragged target object according to the first information.
  23. The system according to claim 21 or 22, wherein
    the first electronic device is further configured to detect a drag-release operation by the user, and send a drag-completion indication to the target second electronic device;
    the target second electronic device is further configured to receive the drag-completion indication sent by the first electronic device; and
    the target second electronic device is further configured to display, in response to the drag-completion indication, the target object according to displayed content of the target second electronic device.
  24. The system according to any one of claims 21-23, wherein the target object comprises one or more of a file, a window, an icon, text, characters selected by the user, a document, a picture, and a video.
  25. The system according to any one of claims 20-24, wherein
    the target second electronic device is further configured to determine multiple running windows;
    the target second electronic device is further configured to display, in response to the display indication, multiple preview windows corresponding to the multiple windows;
    the first electronic device is further configured to detect an operation by which the user moves the mouse pointer to a first preview window among the multiple preview windows, and determine second information;
    the first electronic device is further configured to send the second information to the target second electronic device; and
    the target second electronic device is further configured to display, according to the second information, the target object in a drag state in a second window corresponding to the first preview window among the multiple preview windows, wherein the multiple windows comprise the second window.
  26. The system according to any one of claims 20-25, wherein
    the first electronic device is further configured to display multiple virtual-desktop views corresponding to multiple virtual desktops of the second electronic device; and
    the first electronic device is further configured to detect a fifth operation by which the user selects a target virtual-desktop view among the multiple virtual-desktop views, and determine a target virtual desktop of the target second electronic device corresponding to the target virtual-desktop view, wherein the multiple virtual desktops comprise the target virtual desktop; and the display indication is used to instruct the target second electronic device to display the mouse pointer on the target virtual desktop.
  27. The system according to any one of claims 20-26, wherein
    the first electronic device is specifically configured to detect an operation by which the user selects a target window preview displayed in a target device view among the device views, and determine the target second electronic device corresponding to the target device view, wherein the display indication is used to instruct the target second electronic device to display the mouse pointer in a first window corresponding to the target window preview.
  28. The system according to any one of claims 20-27, wherein
    the first electronic device is further configured to obtain device information of the second electronic device in a preset manner;
    wherein the preset manner comprises obtaining the device information at a preset period, or requesting the device information from the second electronic device in response to the first operation;
    wherein the device information is used to generate the device views, and the device information comprises one or more of the following: a desktop screenshot, a device identifier, a screenshot of the currently displayed interface, a device name, a user-edited device nickname, a system account, and a picture identifying the electronic device.
  29. The system according to any one of claims 20-28, wherein
    the first electronic device is further configured to hide the mouse pointer.
  30. The system according to any one of claims 20-29, wherein the display indication carries displacement information of the mouse pointer; and
    the target second electronic device is specifically configured to display, according to the displacement information of the mouse pointer, the mouse pointer at a determined corresponding display position, wherein the corresponding display position comprises any one of the following: a position corresponding to where the first electronic device hid the mouse pointer, a position corresponding to the second operation on the display screen of the first electronic device, and a position corresponding to the second operation on the target device view.
  31. The system according to any one of claims 20-30, wherein the first operation comprises one or more of the following: an operation on a first button of the first electronic device, an operation of moving the mouse pointer to a preset area of the display screen, and an operation on a preset icon displayed on the display screen.
  32. An electronic device, comprising: a processor, a display screen, and a memory, wherein the memory and the display screen are coupled to the processor, the memory is configured to store computer program code, and the computer program code comprises computer instructions which, when read from the memory by the processor, cause the electronic device to perform the method according to any one of claims 1-11, or cause the electronic device to perform the method according to any one of claims 12-19.
  33. A computer-readable storage medium, wherein the computer-readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1-11, or causes the electronic device to perform the method according to any one of claims 12-19.
  34. A computer program product which, when run on a computer, causes the computer to perform the method according to any one of claims 1-11, or causes the computer to perform the method according to any one of claims 12-19.
PCT/CN2023/100503 2022-06-30 2023-06-15 Input control method, electronic device, and system WO2024001813A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210760417.2A CN117369682A (zh) 2022-06-30 2022-06-30 Input control method, electronic device, and system
CN202210760417.2 2022-06-30

Publications (1)

Publication Number Publication Date
WO2024001813A1 true WO2024001813A1 (zh) 2024-01-04

Family

ID=89383210

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/100503 WO2024001813A1 (zh) 2022-06-30 2023-06-15 输入控制方法、电子设备及系统

Country Status (2)

Country Link
CN (1) CN117369682A (zh)
WO (1) WO2024001813A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955513A (zh) * 2016-04-25 2016-09-21 北京润科通用技术有限公司 Information processing method, electronic device, and wireless mouse
CN107908296A (zh) * 2017-11-28 2018-04-13 深圳市东微智能科技股份有限公司 KVM control method and apparatus, storage medium, and computer device
CN108845783A (zh) * 2018-08-01 2018-11-20 广州魅视电子科技有限公司 Multi-window combined-screen display method and system based on a KVM device
CN114286167A (zh) * 2021-12-03 2022-04-05 杭州逗酷软件科技有限公司 Cross-device interaction method and apparatus, electronic device, and storage medium
CN114885442A (zh) * 2022-03-25 2022-08-09 华为技术有限公司 Input device connection method, device, and system


Also Published As

Publication number Publication date
CN117369682A (zh) 2024-01-09

Similar Documents

Publication Publication Date Title
US11385857B2 (en) Method for displaying UI component and electronic device
EP4020954A1 (en) Method for transmitting information over short distances and electronic devices
CN111164983B (zh) Lending local processing capability between interconnected terminals
CN112130788A (zh) Content sharing method and apparatus
WO2024016559A1 (zh) Multi-device collaboration method, electronic device, and related products
WO2023207761A1 (zh) Peripheral control method, electronic device, and system
CN114885442A (zh) Input device connection method, device, and system
CN114647350A (zh) Application sharing method, electronic device, and storage medium
WO2021057699A1 (zh) Control method for an electronic device with a flexible screen, and electronic device
WO2022213941A1 (zh) Collaborative editing method and terminal device
CN115426521A (zh) Screenshot method, electronic device, medium, and program product
WO2022063159A1 (zh) File transmission method and related device
WO2022028537A1 (zh) Device identification method and related apparatus
WO2021197354A1 (zh) Device positioning method and related apparatus
JP2023534182A (ja) Method and device for opening a file
WO2024001813A1 (zh) Input control method, electronic device, and system
CN115242994B (zh) Video call system, method, and apparatus
EP4311223A1 (en) Image capture method, and related apparatus and system
US20240143262A1 (en) Splicing Display Method, Electronic Device, and System
WO2021218544A1 (zh) System and method for providing wireless internet access, and electronic device
CN114691059A (zh) Screen projection display method and electronic device
CN114079691A (zh) Device identification method and related apparatus
CN117061266B (zh) Control method and control apparatus for smart home devices
WO2024027238A1 (zh) Multi-device collaboration method, electronic device, and related products
WO2023071590A1 (zh) Input control method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23829993

Country of ref document: EP

Kind code of ref document: A1