WO2022048500A1 - Display method and device - Google Patents

Display method and device

Info

Publication number
WO2022048500A1
WO2022048500A1 (PCT/CN2021/114990)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
input
input device
mobile phone
interface
Prior art date
Application number
PCT/CN2021/114990
Other languages
English (en)
French (fr)
Inventor
卞苏成 (Bian Sucheng)
胡凯 (Hu Kai)
周学而 (Zhou Xue'er)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21863565.4A (published as EP4195638A4)
Priority to US18/043,627 (published as US11947998B2)
Publication of WO2022048500A1

Classifications

    • G06F 3/1454: Digital output to a display device; copying the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 9/4862: Task life-cycle (stopping, restarting, resuming execution) with resumption on a different machine, the task being a mobile agent specifically designed to migrate
    • G06F 21/84: Protecting input, output or interconnection devices; output devices, e.g. displays or monitors
    • G06F 3/0227: Cooperation and interconnection of the input arrangement with other functional units of a computer
    • G06F 3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Touch-screen interaction by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
    • H04M 1/72412: User interfaces for mobile telephones interfacing with external accessories using two-way short-range wireless interfaces
    • G06F 2209/545: GUI (indexing scheme relating to G06F 9/54)
    • G09G 2354/00: Aspects of interface with display user
    • H04M 2250/16: Telephonic subscriber devices including more than one display unit

Definitions

  • the present application relates to the field of electronic devices, and in particular, to a display method and device.
  • users may own multiple terminals at the same time, such as mobile phones, tablet computers, and personal computers (PCs).
  • a user may use multiple terminals at the same time and must frequently switch attention among them.
  • for example, the mobile phone receives a short message, and the user has to pick up the mobile phone to reply.
  • or the user needs a picture stored on the mobile phone while writing an email on the PC, and has to pick up the mobile phone and transfer the picture to the PC. This seriously affects the convenience of using multiple terminals collaboratively.
  • to address this, the user can connect multiple terminals and use them together.
  • for example, a user who owns a PC and a mobile phone can connect the two, wirelessly or by wire, to work collaboratively.
  • in multi-screen collaboration, the display interface of the mobile phone is projected onto the PC display screen using mirror projection.
  • the user can then operate the mobile phone from the PC side.
  • for example, the user can use the PC's mouse to perform operations such as clicking and moving the cursor in the interface projected on the PC, thereby operating the actual interface displayed on the mobile phone.
  • the user can also operate the mobile phone directly through its touch screen.
  • in either case, the display interface of the mobile phone is always projected onto the PC display screen.
  • Embodiments of the present application provide a display method and device.
  • an interface can be displayed on the corresponding device according to the input device used for the user's input operation. This protects user privacy, avoids distracting the user's attention, and improves the user experience.
  • a first aspect of the present application provides a display method. The method can be applied to a second terminal connected to a first terminal, and may include: the second terminal displays a first interface; the second terminal receives a user's first operation on the content of the first interface; if the input source of the first operation is the input device of the first terminal, in response to the first operation, the second terminal sends data to the first terminal, where the data is used by the first terminal to display a second interface on the display screen of the first terminal; if the input source of the first operation is the input device of the second terminal, in response to the first operation, the second terminal displays the second interface on the display screen of the second terminal.
  • when the input source is the input device of the first terminal, the second terminal can project the corresponding interface to the first terminal for display.
  • when the input source is the input device of the second terminal, the second terminal displays the corresponding interface itself and does not project it to the first terminal. In this way, the user can freely control, according to actual needs, on which device the interface of the second terminal is displayed. This protects user privacy, avoids distracting the user, and improves the user experience.
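  • the routing behaviour described above can be sketched as follows. This is an illustrative Python sketch only; the function name `route_operation` and the source identifiers are assumptions for illustration, not part of the claimed method.

```python
# Hypothetical sketch of the first-aspect routing logic. The identifiers
# below are illustrative stand-ins, not defined by the patent.

LOCAL_INPUT = "second_terminal_input"   # e.g. the phone's own touchscreen
REMOTE_INPUT = "first_terminal_input"   # e.g. the PC's shared mouse/keyboard

def route_operation(input_source: str) -> str:
    """Decide where the second interface is displayed.

    If the operation came from the first terminal's input device, the
    second terminal sends interface data to the first terminal for
    projection; otherwise it renders the interface locally.
    """
    if input_source == REMOTE_INPUT:
        return "project_to_first_terminal"
    if input_source == LOCAL_INPUT:
        return "display_on_second_terminal"
    raise ValueError(f"unknown input source: {input_source}")
```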
  • the method may further include: the second terminal receives shuttle state information from the first terminal. The shuttle state information can be used to indicate the start of the shuttle of the input device, or to instruct the second terminal to start accepting input from the input device of the first terminal.
  • the second terminal when the input source of the first operation is the input device of the first terminal, receives the user's first operation on the content of the first interface, which may include: the second terminal Receive a first operation parameter from the first terminal, where the first operation parameter is an operation parameter included in the first input event corresponding to the first operation when the user uses the input device of the first terminal to perform the first operation; the second The terminal simulates the first input event according to the first operation parameter; in response to the first operation, the second terminal sends data to the first terminal, which may include: the second terminal determines the input source of the first operation according to the simulated first input event is the input device of the first terminal; in response to the first input event, the second terminal sends data to the first terminal. Based on the input event, the input source of the corresponding operation can be determined, so as to determine whether to display the corresponding interface on the second terminal or to project the interface to display on other devices.
  • the second terminal determines, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal, which may include: the second terminal determining that the simulated first input event includes: The identifier of the input device is the identifier of the virtual input device, and the virtual input device is created by the second terminal to simulate the input event; or, the second terminal determines that the input device type indicated by the input mode included in the simulated first input event is the same as the first The input devices of the terminals are of the same type, and it is determined that the shuttle status information for indicating the start of the shuttle of the input device is received from the first terminal.
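  • the two attribution rules above can be sketched in Python. This is a minimal sketch under assumed field names (`device_id`, `input_mode`) and an assumed virtual-device identifier; none of these names appear in the patent text.

```python
# Illustrative-only sketch of attributing an input event to its source.

VIRTUAL_DEVICE_ID = "virtual-input-0"   # identifier of the virtual input
                                        # device created by the second terminal

def input_source_of(event: dict, shuttle_started: bool,
                    first_terminal_device_type: str) -> str:
    """Return which terminal's input device produced `event`.

    Rule 1: an event carrying the virtual input device's identifier was
    simulated from operation parameters sent by the first terminal.
    Rule 2: if the event's input-mode device type matches the first
    terminal's input device type AND shuttle-start state information has
    been received, the event also originates from the first terminal.
    Otherwise the event came from the second terminal's own input device.
    """
    if event.get("device_id") == VIRTUAL_DEVICE_ID:
        return "first_terminal"
    if shuttle_started and event.get("input_mode") == first_terminal_device_type:
        return "first_terminal"
    return "second_terminal"
```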
  • the second terminal in response to the first operation, displays the second interface on the display screen of the second terminal, which may include: when the user uses the input device of the second terminal to perform the first operation Next, the second terminal determines that the input source of the first operation is the input device of the second terminal according to the second input event corresponding to the first operation; in response to the second input event, the second terminal displays on the display screen of the second terminal Second interface.
  • the second terminal determines, according to the second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal, which may include: the second terminal determining the second input event The included input device identifier is the identifier of the input device of the second terminal; or, the second terminal determines that the type of the input device indicated by the input mode included in the second input event is the same as the type of the input device of the second terminal.
  • the method may further include: the second terminal creates a virtual input device after successfully establishing the connection with the first terminal; or, the second terminal receives a notification message from the first terminal indicating that the keyboard and mouse sharing mode of the first terminal has been turned on, and creates the virtual input device in response to the notification message. The virtual input device is used by the second terminal to simulate input from the input device of the first terminal.
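  • the two creation triggers can be sketched as follows. `VirtualInputDevice`, `SecondTerminal`, and the message shape are hypothetical stand-ins; a real implementation would use a platform facility (for example, Linux's uinput) to create the virtual device.

```python
# Sketch of the two virtual-input-device creation triggers described above.

class VirtualInputDevice:
    """Placeholder for a platform virtual input device used to replay
    (simulate) input events received from the first terminal."""
    def __init__(self):
        self.created = True

class SecondTerminal:
    def __init__(self):
        self.virtual_device = None

    def on_connection_established(self):
        # Trigger 1: create the virtual device as soon as the connection
        # with the first terminal is successfully established.
        if self.virtual_device is None:
            self.virtual_device = VirtualInputDevice()

    def on_message(self, message: dict):
        # Trigger 2: create it on notification that the first terminal's
        # keyboard-and-mouse sharing mode has been turned on.
        if message.get("type") == "keyboard_mouse_sharing_on":
            if self.virtual_device is None:
                self.virtual_device = VirtualInputDevice()
```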
  • the method may further include: the second terminal displays a third interface, where the third interface includes an input box; the second terminal receives the user's second operation on the input box; if the input source of the second operation is the input device of the second terminal, in response to the second operation, the second terminal displays a virtual keyboard on the display screen of the second terminal.
  • in other words, when the user operates the input box with the second terminal's own input device, a virtual keyboard is displayed on the second terminal, so that the user can input with the virtual keyboard without diverting attention, which improves the efficiency of collaborative use of multiple terminals.
  • the method may further include: if the input source of the second operation is the input device of the first terminal, in response to the second operation, the second terminal sends data of the third interface to the first terminal; the virtual keyboard is not displayed on the third interface, and the data of the third interface is used by the first terminal to display the third interface on the display screen of the first terminal.
  • in this case, the virtual keyboard is not displayed, and the third interface is projected onto the first terminal for display. The user can then input using the keyboard of the first terminal without diverting attention, which improves the efficiency of collaborative use of multiple terminals.
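  • the virtual-keyboard decision for the input box can be sketched as a small dispatch function. The name `on_input_box_operation` and the returned flags are illustrative assumptions, not terms from the claims.

```python
# Hedged sketch of the input-box behaviour: an operation from the second
# terminal's own input device shows a local virtual keyboard; an operation
# from the first terminal's shared input device projects the third
# interface (without a virtual keyboard), so the first terminal's physical
# keyboard can be used instead.

def on_input_box_operation(input_source: str) -> dict:
    if input_source == "second_terminal":
        return {"show_virtual_keyboard": True, "project": False}
    # Input came from the first terminal's input device: project the third
    # interface to the first terminal and suppress the virtual keyboard.
    return {"show_virtual_keyboard": False, "project": True}
```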
  • the second terminal in response to the second operation, displays a virtual keyboard on the display screen of the second terminal, which may include: when the user uses the input device of the second terminal to perform the second operation , the second terminal determines that the input source of the second operation is the input device of the second terminal according to the third input event corresponding to the second operation; in response to the third input event, the second terminal displays a virtual keyboard.
  • the second terminal determines, according to the third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal, which may include: the second terminal determining the third input event The included input device identifier is the identifier of the input device of the second terminal; or, the second terminal determines that the type of the input device indicated by the input mode included in the third input event is the same as the type of the input device of the second terminal.
  • the second terminal receiving the user's second operation on the input box may include: the second terminal receives a second operation parameter from the first terminal, where the second operation parameter is an operation parameter included in the fourth input event corresponding to the second operation when the user performs the second operation using the input device of the first terminal; the second terminal simulates the fourth input event according to the second operation parameter. In this case, sending the data of the third interface to the first terminal in response to the second operation (with the virtual keyboard not displayed on the third interface) may include: the second terminal determines, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal; in response to the fourth input event, the second terminal sends the data of the third interface to the first terminal, and the virtual keyboard is not displayed on the third interface.
  • the second terminal determining, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal may include: the second terminal determines that the input device identifier included in the simulated fourth input event is the identifier of the virtual input device; or, the second terminal determines that the input device type indicated by the input mode included in the simulated fourth input event is the same as the type of the input device of the first terminal, and determines that shuttle state information indicating the start of the shuttle of the input device has been received from the first terminal.
  • a second aspect of the present application provides a display apparatus, which can be applied to a second terminal connected to a first terminal. The apparatus may include: a display unit, configured to display a first interface; an input unit, configured to receive the user's first operation on the content of the first interface; and a sending unit, configured to send data to the first terminal in response to the first operation when the input source of the first operation is the input device of the first terminal, where the data is used by the first terminal to display a second interface on the display screen of the first terminal. The display unit is further configured to display the second interface on the display screen of the second terminal in response to the first operation when the input source of the first operation is the input device of the second terminal.
  • the apparatus may further include: a receiving unit, configured to receive the shuttle state information from the first terminal, where the shuttle state information may be used to indicate the start of the shuttle of the input device.
  • the receiving unit is further configured to receive a first operation parameter from the first terminal, where the first operation parameter is an operation parameter included in the first input event corresponding to the first operation when the user performs the first operation using the input device of the first terminal.
  • the apparatus may further include a simulation unit and a determination unit: the simulation unit is configured to simulate the first input event according to the first operation parameter; the determination unit is configured to determine, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal.
  • the sending unit sends data to the first terminal in response to the first operation, which specifically includes: the sending unit sends data to the first terminal in response to the first input event.
  • the determination unit is specifically configured to: determine that the input device identifier included in the simulated first input event is the identifier of the virtual input device, where the virtual input device is created by the second terminal to simulate input events; or determine that the input device type indicated by the input mode included in the simulated first input event is the same as the type of the input device of the first terminal, and determine that shuttle state information indicating the start of the shuttle of the input device has been received from the first terminal.
  • the determination unit is further configured to determine, according to the second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal.
  • the display unit displaying the second interface on the display screen of the second terminal in response to the first operation may include: the display unit displaying the second interface on the display screen of the second terminal in response to the second input event.
  • the determination unit is specifically configured to: determine that the input device identifier included in the second input event is the identifier of an input device of the second terminal; or determine that the input device type indicated by the input mode included in the second input event is the same as the type of the input device of the second terminal.
  • the apparatus may further include: a creating unit, configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit is further configured to receive a notification message from the first terminal, where the notification message indicates that the keyboard and mouse sharing mode of the first terminal has been turned on, and the creating unit is configured to create the virtual input device in response to the notification message. The virtual input device is used by the second terminal to simulate input from the input device of the first terminal.
  • the display unit is further configured to display a third interface, where the third interface includes an input box; the input unit is further configured to receive the user's second operation on the input box; the display unit is further configured to display the virtual keyboard on the display screen of the second terminal in response to the second operation when the input source of the second operation is the input device of the second terminal.
  • the sending unit is further configured to send the data of the third interface to the first terminal in response to the second operation, where the virtual keyboard is not displayed on the third interface, and the data of the third interface is used by the first terminal to display the third interface on the display screen of the first terminal.
  • the determination unit is further configured to determine, according to the third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal.
  • the display unit displaying the virtual keyboard on the display screen of the second terminal in response to the second operation may include: the display unit displaying the virtual keyboard on the display screen of the second terminal in response to the third input event.
  • the determination unit is specifically configured to: determine that the input device identifier included in the third input event is the identifier of an input device of the second terminal; or determine that the input device type indicated by the input mode included in the third input event is the same as the type of the input device of the second terminal.
  • the receiving unit is further configured to receive a second operation parameter from the first terminal, where the second operation parameter is an operation parameter included in the fourth input event corresponding to the second operation when the user performs the second operation using the input device of the first terminal; the simulation unit is further configured to simulate the fourth input event according to the second operation parameter; the determination unit is further configured to determine, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal.
  • the sending unit sends the data of the third interface to the first terminal in response to the second operation, which specifically includes: the sending unit sends the data of the third interface to the first terminal in response to the fourth input event.
  • the determination unit is specifically configured to: determine that the input device identifier included in the simulated fourth input event is the identifier of the virtual input device; or determine that the input device type indicated by the input mode included in the simulated fourth input event is the same as the type of the input device of the first terminal, and determine that shuttle state information indicating the start of the shuttle of the input device has been received from the first terminal.
  • a third aspect of the present application provides a display method, which can be applied to a second terminal connected to a first terminal, where the keyboard and mouse sharing mode of the first terminal is enabled and the second terminal has created a virtual input device. The method may include: the second terminal displays an interface; the second terminal receives the user's operation on an input box of the interface; if the input source of the operation is the input device of the second terminal, in response to the operation, the second terminal displays a virtual keyboard on the display screen of the second terminal.
  • the method may further include: if the input source of the operation is the input device of the first terminal, in response to the operation, the second terminal does not display the virtual keyboard on the display screen of the second terminal.
  • the first terminal and the second terminal are used cooperatively.
  • the second terminal may not display the virtual keyboard, and the user may use the keyboard of the first terminal to implement input. Users do not need to frequently switch their attention between the two devices, which improves the efficiency of collaborative use of multiple terminals.
  • the method may further include: if the input source of the operation is the input device of the first terminal, in response to the operation, the second terminal sends the data of the above interface to the first terminal, where the data is used by the first terminal to display the interface on the display screen of the first terminal, and the virtual keyboard is not displayed on the interface.
  • the second terminal displaying a virtual keyboard on the display screen of the second terminal in response to the operation may include: in a case where the user uses the input device of the second terminal to perform the operation on the input box, the second terminal determines, according to the input event corresponding to the operation, that the input source of the operation is the input device of the second terminal, and in response to the input event corresponding to the operation, the second terminal displays the virtual keyboard on the display screen of the second terminal.
  • the second terminal determining, according to the input event corresponding to the operation, that the input source of the operation is the input device of the second terminal includes: the second terminal determines that the input device identifier included in the input event is the identifier of the input device of the second terminal; or, the second terminal determines that the type of the input device indicated by the input mode included in the input event is the same as the type of the input device of the second terminal.
  • the method may further include: the second terminal receives shuttle status information from the first terminal, where the shuttle status information may be used to indicate the start of the shuttle of the input device, or may be used to instruct the second terminal to start accepting input from the input device of the first terminal.
  • the second terminal receiving the user's operation on the input box may include: the second terminal receives an operation parameter from the first terminal, where the operation parameter is the operation parameter included in the input event corresponding to the operation when the user uses the input device of the first terminal to perform the operation; the second terminal simulates the corresponding input event according to the operation parameter. The method may further include: the second terminal determines, according to the simulated input event, that the input source of the operation is the input device of the first terminal.
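The parameter relay described above — the first terminal extracting the operation parameters from an intercepted input event, and the second terminal rebuilding an equivalent event for its virtual input device — can be sketched as below. All field names and the wire format (JSON) are assumptions made for illustration; the patent does not specify a serialization.

```python
import json

def pack_operation_parameters(input_event):
    # First terminal side: keep only the parameters needed to replay the
    # event on the second terminal (type, coordinates, optional key code).
    params = {"type": input_event["type"],          # e.g. "click"
              "x": input_event["x"],
              "y": input_event["y"],
              "key_code": input_event.get("key_code")}
    return json.dumps(params).encode("utf-8")

def simulate_input_event(payload, virtual_device_id):
    # Second terminal side: reconstruct the event and tag it with the
    # virtual input device's identifier, so that the input source can later
    # be determined from the event itself.
    params = json.loads(payload.decode("utf-8"))
    params["device_id"] = virtual_device_id
    return params
```

A round trip preserves the operation parameters while stamping the event with the virtual device's identifier.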
  • the second terminal not displaying the virtual keyboard on the display screen of the second terminal may include: in response to the input event, the second terminal not displaying the virtual keyboard on the display screen of the second terminal.
  • the second terminal sending the data of the interface to the first terminal may include: in response to the input event, sending the data of the interface to the first terminal by the second terminal.
  • the second terminal determining, according to the simulated input event, that the input source of the operation is the input device of the first terminal may include: the second terminal determines that the input device identifier included in the simulated input event is the identifier of the virtual input device, where the virtual input device is created by the second terminal to simulate input events; or, the second terminal determines that the type of the input device indicated by the input mode included in the simulated input event is the same as the type of the input device of the first terminal, and determines that shuttle state information indicating the start of the shuttle of the input device has been received from the first terminal.
  • the method may further include: after the second terminal successfully establishes the connection with the first terminal, the second terminal creates a virtual input device; or, the second terminal receives a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been turned on, and in response to the notification message, the second terminal creates a virtual input device; wherein the virtual input device is used by the second terminal to simulate the input of the input device of the first terminal.
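The two trigger points for creating the virtual input device can be sketched as follows. The class, method, and message-field names are hypothetical; on a real device the creation step would stand in for a platform call such as registering a uinput device.

```python
class SecondTerminal:
    def __init__(self):
        self.virtual_input_device = None

    def create_virtual_input_device(self):
        # Placeholder for a platform-specific virtual device registration.
        self.virtual_input_device = {"id": "virtual-input-0"}

    def on_connection_established(self, peer):
        # Option 1: create the device as soon as the connection with the
        # first terminal is successfully established.
        self.create_virtual_input_device()

    def on_notification(self, message):
        # Option 2: create it only once the first terminal notifies that its
        # keyboard and mouse sharing mode has been turned on.
        if message.get("keyboard_mouse_sharing") == "on":
            self.create_virtual_input_device()
```

Either path leaves the second terminal holding a virtual input device with which it can simulate the first terminal's input.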
  • a fourth aspect of the present application provides a display device, which can be applied to a second terminal, the second terminal is connected to the first terminal, the keyboard and mouse sharing mode of the first terminal is enabled, and the second terminal creates a virtual input device,
  • the apparatus may include: a display unit, configured to display an interface; an input unit, configured to receive a user's operation on an input box of the interface; the display unit is further configured to, when the input source of the operation is the input device of the second terminal, display a virtual keyboard on the display screen of the second terminal in response to the operation.
  • the display unit is further configured to not display the virtual keyboard on the display screen of the second terminal in response to the operation when the input source of the operation is the input device of the first terminal.
  • the apparatus may further include: a sending unit, configured to, when the input source of the operation is the input device of the first terminal, send the data of the above interface to the first terminal in response to the operation, where the data is used by the first terminal to display the interface on the display screen of the first terminal, and the virtual keyboard is not displayed on the interface.
  • the apparatus may further include: a determining unit, configured to, when the user uses the input device of the second terminal to perform the operation on the input box, determine according to the input event corresponding to the operation that the input source of the operation is the input device of the second terminal.
  • the display unit displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit displaying the virtual keyboard on the display screen of the second terminal in response to the input event corresponding to the operation.
  • the determining unit is specifically configured to: determine that the identifier of the input device included in the input event is the identifier of the input device of the second terminal; or, determine that the type of the input device indicated by the input mode included in the input event is the same as the type of the input device of the second terminal.
  • the apparatus may further include: a receiving unit, configured to receive shuttle status information from the first terminal, where the shuttle status information may be used to indicate the start of the shuttle of the input device, or may be used to instruct the second terminal to start accepting input from the input device of the first terminal.
  • the receiving unit is further configured to receive the operation parameter from the first terminal, where the operation parameter is the operation parameter included in the input event corresponding to the operation when the user uses the input device of the first terminal to perform the operation. The apparatus may further include: a simulation unit, configured to simulate the corresponding input event according to the operation parameter; and a determination unit, further configured to determine, according to the simulated input event, that the input source of the operation is the input device of the first terminal.
  • the display unit not displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit not displaying the virtual keyboard on the display screen of the second terminal in response to the input event.
  • the sending unit sending the data of the interface to the first terminal in response to the operation may include: the sending unit sending the data of the interface to the first terminal in response to the input event.
  • the determining unit is specifically configured to: determine that the identifier of the input device included in the simulated input event is the identifier of the virtual input device, where the virtual input device is created by the second terminal to simulate input events; or, determine that the type of the input device indicated by the input mode included in the simulated input event is the same as the type of the input device of the first terminal, and determine that shuttle state information from the first terminal indicating the start of the shuttle of the input device has been received.
  • the apparatus may further include: a creating unit, configured to create a virtual input device after the connection with the first terminal is successfully established; or, a receiving unit, further configured to receive a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been turned on, and the creating unit is configured to create a virtual input device in response to the notification message; wherein the virtual input device is used by the second terminal to simulate the input of the input device of the first terminal.
  • a fifth aspect of the present application provides a display device, and the display device may include: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions so that the display device implements the method described in the first aspect or any possible implementation manner of the first aspect, or the method described in the third aspect or any possible implementation manner of the third aspect.
  • a sixth aspect of the present application provides a computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by an electronic device, the electronic device is enabled to implement the method described in the first aspect or any possible implementation manner of the first aspect, or the method described in the third aspect or any possible implementation manner of the third aspect.
  • a seventh aspect of the present application provides an electronic device, where the electronic device includes a display screen, one or more processors, and a memory; the display screen, the processor, and the memory are coupled; the memory is used for storing computer program code, and the computer program code includes computer instructions that, when executed by the electronic device, cause the electronic device to perform the method described in the first aspect or any possible implementation manner of the first aspect, or the method described in the third aspect or any possible implementation manner of the third aspect.
  • an eighth aspect of the present application provides a computer program product, comprising computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code. When the computer-readable code runs in an electronic device, a processor in the electronic device executes the method described in the first aspect or any possible implementation manner of the first aspect, or the method described in the third aspect or any possible implementation manner of the third aspect.
  • a ninth aspect of the present application provides a display system, including a first terminal and a second terminal, the first terminal being connected to the second terminal; the second terminal is configured to display a first interface and receive the user's first operation on the content of the first interface; the second terminal is further configured to, when the input source of the first operation is the input device of the second terminal, display a second interface on the display screen of the second terminal in response to the first operation; and, when the input source of the first operation is the input device of the first terminal, send data to the first terminal in response to the first operation; the first terminal is configured to receive the data and display the second interface on the display screen of the first terminal according to the data.
  • the first terminal is further configured to, when the user uses the input device of the first terminal to input the first operation, intercept the first input event corresponding to the first operation and send the first operation parameter included in the first input event to the second terminal.
  • the second terminal is configured to receive the first operation of the user, specifically: the second terminal is configured to receive the first operation parameter from the first terminal and simulate the first input event according to the first operation parameter; the second terminal sends data to the first terminal in response to the first operation, specifically: the second terminal is configured to determine, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal, and send the data to the first terminal in response to the first input event.
  • the first terminal is further configured to determine that the cursor displayed on the first terminal slides out of the edge of the display screen of the first terminal, so as to enable interception of input events.
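The edge check that enables interception can be sketched as follows: once the cursor position leaves the first terminal's display bounds, input events start being intercepted (and forwarded) instead of handled locally. The class name, screen dimensions, and coordinate convention are illustrative assumptions.

```python
def cursor_slid_out(x, y, screen_width, screen_height):
    """True when the cursor position falls outside the display bounds."""
    return x < 0 or y < 0 or x >= screen_width or y >= screen_height

class FirstTerminal:
    def __init__(self, screen_width, screen_height):
        self.w, self.h = screen_width, screen_height
        self.intercepting = False

    def on_cursor_move(self, x, y):
        # Enable interception of input events once the cursor slides out of
        # the edge of the first terminal's display screen.
        if cursor_slid_out(x, y, self.w, self.h):
            self.intercepting = True
        return self.intercepting
```

Once `intercepting` is set, subsequent input events would be packed as operation parameters and sent to the second terminal rather than dispatched to local applications.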
  • the second terminal displays the second interface on the display screen of the second terminal in response to the first operation, specifically: the second terminal is configured to, when the user uses the input device of the second terminal to perform the first operation, determine according to the second input event corresponding to the first operation that the input source of the first operation is the input device of the second terminal, and display the second interface on the display screen of the second terminal in response to the second input event.
  • the first terminal is further configured to send the shuttle status information to the second terminal, where the shuttle status information is used to indicate the start of the shuttle of the input device.
  • the first terminal is specifically configured to, when the user uses the input device of the first terminal to input the first operation, intercept the first input event corresponding to the first operation and send the first operation parameter included in the first input event to the second terminal, where the first operation parameter is used by the second terminal to simulate the first input event and then send the data of the second interface to the first terminal.
  • for the beneficial effects that can be achieved by the display device described in the second aspect and any possible implementation manner thereof, the display device described in the fourth aspect and any possible implementation manner thereof, the display device described in the fifth aspect, the computer-readable storage medium described in the sixth aspect, the electronic device described in the seventh aspect, the computer program product described in the eighth aspect, and the display system described in the ninth aspect, reference may be made to the beneficial effects of the first aspect or the third aspect and any possible implementation manners thereof, which will not be repeated here.
  • FIG. 1 is a simplified schematic diagram of a system architecture provided by an embodiment of the present application
  • FIG. 2A is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 2B is a schematic composition diagram of a software architecture provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a display method provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a coordinate system on a display screen provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 7 is another schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 8 is another schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 9 is another schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic flowchart of another display method provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of the composition of a display device according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the composition of a chip system according to an embodiment of the present application.
  • first and second are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • plural means two or more.
  • when multi-screen collaboration is used to realize collaborative use of multiple terminals, for example, when a PC and a mobile phone work together, the display interface of the mobile phone is always projected onto the PC display screen. Even if the user operates the mobile phone directly using its touch screen, the display interface of the mobile phone is still projected onto the PC display screen. However, when the user operates the mobile phone directly using its touch screen, the user's focus or attention is on the mobile phone; continuing to display the interface content of the mobile phone on the display screen of the PC is then meaningless, and user privacy may be leaked.
  • Embodiments of the present application provide a display method and device, and the method can be applied to a scenario where multiple terminals are used collaboratively.
  • using the keyboard and mouse sharing technology, the input device (such as a mouse, touchpad, or keyboard) of one terminal (for example, called the first terminal) can be used to control other terminals (for example, called the second terminal); the input device of the second terminal is also available.
  • when the user uses the input device of the first terminal to control the second terminal, the second terminal can project the corresponding interface to the first terminal for display; when the user uses the input device of the second terminal to control the second terminal, the corresponding interface is displayed on the second terminal and is not projected to the first terminal for display.
  • in this way, the user can freely control, according to actual needs, on which device the interface of the second terminal is displayed. This not only protects user privacy, but also avoids distracting the user, improving the user experience.
  • FIG. 1 is a simplified schematic diagram of a system architecture to which the above method can be applied, provided by an embodiment of the present application.
  • the system architecture may at least include: a first terminal 101 and a second terminal 102 .
  • the first terminal 101 is connected to the input device 101-1 (as shown in FIG. 1 ), or includes the input device 101-1 (not shown in FIG. 1 ).
  • the input device 101-1 may be a mouse, a touch pad, a keyboard, and the like. In FIG. 1, the input device 101-1 is shown as an example of a mouse.
  • the second terminal 102 includes an input device 102-1 (shown in FIG. 1 ), or is connected to the input device 102-1 (not shown in FIG. 1 ).
  • FIG. 1 takes the input device 102-1 as a touch screen as an example.
  • the touch screen is also a display device, or a display screen, of the second terminal 102 .
  • the first terminal 101 and the second terminal 102 may establish a connection in a wired or wireless manner. Based on the established connection, the first terminal 101 and the second terminal 102 may be used together in cooperation.
  • the wireless communication protocol adopted when the first terminal 101 and the second terminal 102 establish a connection wirelessly may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, etc., or may be various cellular network protocols, which are not specifically limited here.
  • by using the keyboard and mouse sharing technology, the user can use one set of input devices, such as the above-mentioned input device 101-1, to control both the first terminal 101 and the second terminal 102. That is to say, the user can not only use the input device 101-1 of the first terminal 101 to control the first terminal 101, but the first terminal 101 can also share its input device 101-1 with the second terminal 102, so that the user can control the second terminal 102. In addition, the user can also control the second terminal 102 by using the input device 102-1 of the second terminal 102.
  • when the user uses the input device 101-1 of the first terminal 101 to control the second terminal 102, the second terminal 102 can project the corresponding interface onto the display screen 101-2 of the first terminal 101 for display. When the user uses the input device 102-1 of the second terminal 102 to control the second terminal 102, the corresponding interface is displayed on the touch screen (or called the display screen) of the second terminal 102 and is not projected onto the display screen 101-2 of the first terminal 101.
  • the second terminal 102 may display the icon of the corresponding application on the touch screen of the second terminal 102 .
  • the user can use the above-mentioned input device 101-1 to perform an operation, such as a click operation, on the icon of the application displayed on the touch screen of the second terminal 102.
  • the second terminal 102 may project the interface of the application to the display screen 101-2 of the first terminal 101 for display.
  • the user can also operate the icon of the application displayed on the touch screen of the second terminal 102 through the above-mentioned input device 102-1, for example, the user uses a finger to click on the icon of the application.
  • the second terminal 102 displays the interface of the application on the touch screen of the second terminal 102 , and the interface of the application is not projected on the display screen 101 - 2 of the first terminal 101 .
  • when the user uses the input device 101-1 of the first terminal 101 to operate the input box displayed on the touch screen of the second terminal 102, for example, with a click operation, the virtual keyboard is not displayed on the touch screen of the second terminal 102, and the user can use the keyboard (such as a physical keyboard) of the first terminal 101 to input text in the input box.
  • when the user uses the input device 102-1 of the second terminal 102 to operate the input box, the second terminal 102 can display a virtual keyboard on the touch screen of the second terminal 102, and the user can use the virtual keyboard to input text in the input box.
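The virtual keyboard decision described in the two paragraphs above can be sketched as a simple dispatch on the input source: an operation from the second terminal's own touch screen shows the virtual keyboard locally, while an operation arriving through the first terminal's shared input device suppresses it, since the user can type on the first terminal's physical keyboard. The function names and return strings are illustrative placeholders, not an actual API.

```python
def should_show_virtual_keyboard(input_source):
    # Show the on-screen keyboard only for local (second terminal) input.
    return input_source == "second_terminal"

def on_input_box_clicked(input_source):
    if should_show_virtual_keyboard(input_source):
        return "show virtual keyboard on second terminal"
    # Input came via the first terminal's shared input device: keep the
    # virtual keyboard hidden and project the interface (without it) to the
    # first terminal, where the physical keyboard is used for text entry.
    return "send interface data to first terminal"
```

This mirrors the behavior above: the same click on the same input box produces different display results depending solely on which device's input triggered it.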
  • the terminals in the embodiments of the present application may be mobile phones, tablet computers, handheld computers, PCs, cellular phones, personal digital assistants (PDA), wearable devices (such as smart watches), in-vehicle computers, game consoles, augmented reality (AR)/virtual reality (VR) devices, etc.
  • the first terminal 101 is a PC and the second terminal 102 is a mobile phone as an example in FIG. 1 .
  • the technical solutions provided in this embodiment can be applied to other electronic devices, such as smart home devices (eg, TV sets), in addition to the above-mentioned terminals (or mobile terminals).
  • the terminal is a mobile phone as an example.
  • FIG. 2A is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone.
  • the cell phone may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can be the nerve center and command center of the phone.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142 , it can also supply power to the mobile phone through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 can also receive the input of the battery 142 to supply power to the mobile phone.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in a cell phone can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions applied on the mobile phone, including 2G/3G/4G/5G, etc.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the mobile phone, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the mobile phone can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the gyroscope sensor 180B can be used to determine the motion attitude of the mobile phone.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile phone can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone in various directions (generally three axes).
  • Distance sensor 180F for measuring distance.
  • the mobile phone can use the proximity light sensor 180G to detect the user holding the mobile phone close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints. The mobile phone can use the collected fingerprint characteristics to realize fingerprint unlocking, access the application lock, take photos with the fingerprint, answer incoming calls with the fingerprint, etc.
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • Motor 191 can generate vibrating cues. The motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state and changes in battery level, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the mobile phone.
  • the mobile phone can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the mobile phone interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the handset employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone and cannot be separated from the mobile phone.
  • the methods in the following embodiments can be implemented in a mobile phone having the above-mentioned hardware structure.
  • in this embodiment of the present application, the software system of the first terminal 101 being a Windows system and the software system of the second terminal 102 being an Android system is taken as an example to illustrate the software architectures of the first terminal 101 and the second terminal 102.
  • FIG. 2B is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
  • the software architecture of the first terminal 101 may include: an application layer and a windows system (windows shell).
  • the application layer may include various applications installed on the first terminal 101 . Applications at the application layer can directly interact with the Windows system.
  • the application layer may further include a mouse and keyboard module and a screen projection service module.
  • the software system of the second terminal 102 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Take the software system of the second terminal 102 as an example of a layered architecture.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the second terminal 102 may include an application layer and a framework layer (framework, FWK).
  • the application layer can include a series of application packages.
  • the application packages can include applications such as Settings, Calculator, Camera, SMS, Music Player, and Gallery.
  • the application included in the application layer may be a system application of the second terminal 102 or a third-party application, which is not specifically limited in this embodiment of the present application.
  • the application layer may also include a screen projection service module.
  • the application layer can also include launchers.
  • the framework layer is mainly responsible for providing an application programming interface (API) and a programming framework for applications in the application layer.
  • the second terminal 102 may also include other layers, such as a kernel layer (not shown in FIG. 2B ) and the like.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain at least display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • the multiple terminals include a first terminal 101 and a second terminal 102, the input device 101-1 of the first terminal 101 is a mouse, and the input device 102-1 of the second terminal 102 is a touch screen as an example.
  • the user can use the mouse of the first terminal 101 to control the first terminal 101 and the second terminal 102 based on the above software architecture and the mouse and keyboard sharing technology.
  • the user can also control the second terminal 102 by using the touch screen of the second terminal 102 .
  • when the user uses the mouse of the first terminal 101 to control the second terminal 102, the second terminal 102 can project the corresponding interface to the first terminal 101 for display.
  • when the user uses the touch screen of the second terminal 102 to control the second terminal 102, the corresponding interface is displayed on the second terminal 102 and is not projected to the first terminal 101.
  • the keyboard and mouse sharing technology may refer to a technology that uses an input device (such as a mouse, a touchpad, and a keyboard) of one terminal to control other terminals.
  • taking the first terminal 101 being a PC, the second terminal 102 being a mobile phone, the input device 101-1 being a mouse, and the input device 102-1 being a touch screen as an example, the method provided in this embodiment is described in detail below with reference to the accompanying drawings.
  • FIG. 3 is a schematic flowchart of a display method provided by an embodiment of the present application. As shown in FIG. 3, the method may include the following S301-S309.
  • the mobile phone establishes a connection with the PC.
  • connection between the mobile phone and the PC can be established in a wired manner.
  • a wired connection can be established between a mobile phone and a PC through a data cable.
  • connection between the mobile phone and the PC can be established wirelessly.
  • the connection information may be a device identifier of the terminal, such as an internet protocol (internet protocol, IP) address, a port number, or an account logged in by the terminal, and the like.
  • the account logged in by the terminal may be an account provided by the operator for the user, such as a Huawei account.
  • the account logged in by the terminal can also be an application account, such as WeChat Account, Youku account, etc.
  • the transmission capability of the terminal may be near-field communication capability or long-distance communication capability. That is to say, the wireless communication protocol used to establish the connection between the terminals may be a near field communication protocol such as a Wi-Fi protocol, a Bluetooth protocol, or an NFC protocol, or a cellular network protocol.
  • the user can touch the NFC tag of the PC with the mobile phone, and the mobile phone reads the connection information stored in the NFC tag, for example, the connection information includes the IP address of the PC.
  • the mobile phone can establish a connection with the PC using the NFC protocol according to the IP address of the PC.
  • both the mobile phone and the PC have the Bluetooth function and the Wi-Fi function turned on.
  • the PC may broadcast a Bluetooth signal to discover surrounding terminals.
  • the PC may display a list of discovered devices, and the list of discovered devices may include the identifiers of the mobile phones discovered by the PC.
  • the PC can also exchange connection information, such as IP addresses, with the discovered devices.
  • connection information such as IP addresses
  • the PC can establish a connection with the mobile phone by using the Wi-Fi protocol according to the IP address of the mobile phone.
  • both the mobile phone and the PC are connected to the cellular network, and the mobile phone and the PC are logged into the same Huawei account.
  • the mobile phone and the PC can establish a connection based on the cellular network according to the Huawei account.
  • after the connection is established, the mobile phone and the PC can be used together.
  • the user can use a set of input devices, such as a PC mouse, to control both the PC and the mobile phone. That is to say, the user can not only use the input device of the PC to control the PC, but the PC can also share the input device of the PC with the mobile phone for the user to control the mobile phone.
  • the PC may display a pop-up window for asking the user whether to enable the keyboard and mouse sharing mode. If an operation of selecting to enable the keyboard and mouse sharing mode is received from the user, the PC can enable the keyboard and mouse sharing mode.
  • the PC After the PC has enabled the keyboard and mouse sharing mode, it can notify all terminals connected to itself that the keyboard and mouse sharing mode has been enabled. If a connection is established between the PC and the mobile phone, the PC will notify the mobile phone that the mouse and keyboard sharing mode is enabled.
  • after the mobile phone receives the notification (such as a notification message), it can create a virtual input device, which has the same function as conventional input devices such as a mouse, a touchpad, and a keyboard, and can be used by the mobile phone to simulate corresponding input events.
  • the virtual input device created by the mobile phone has the same function as a conventional mouse. It can be regarded as a mouse shared by the PC with the mobile phone, and can be used to simulate mouse events on the mobile phone, so as to realize control of the mobile phone by the mouse of the PC.
  • the operating system of the mobile phone is the Android system.
  • the mobile phone can use the uinput capability of Linux to create the virtual input device.
  • uinput is a kernel layer module that can simulate input devices.
  • a process can create a virtual input device with a specific function. Once the virtual input device is created, it can simulate corresponding events.
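To make concrete what such a uinput-backed virtual device emits, the Linux input subsystem delivers events as `struct input_event` records: a timestamp, an event type, a code, and a value. The sketch below packs a relative mouse move followed by the terminating SYN_REPORT in that binary layout; the 64-bit timestamp fields are an assumption about the platform, and the actual device creation via /dev/uinput ioctls is omitted:

```python
import struct

# Linux input-event constants (from <linux/input-event-codes.h>)
EV_SYN, EV_REL = 0x00, 0x02
REL_X, REL_Y = 0x00, 0x01
SYN_REPORT = 0x00

# struct input_event on a 64-bit system: struct timeval (two 64-bit fields),
# __u16 type, __u16 code, __s32 value -> 24 bytes per event, little-endian.
INPUT_EVENT = struct.Struct("<qqHHi")

def mouse_move_events(dx, dy):
    """Build the byte stream a virtual mouse would write for a relative move."""
    return b"".join(
        INPUT_EVENT.pack(0, 0, ev_type, code, value)   # zeroed timestamps
        for ev_type, code, value in (
            (EV_REL, REL_X, dx),
            (EV_REL, REL_Y, dy),
            (EV_SYN, SYN_REPORT, 0),  # marks the end of one event packet
        )
    )
```

When the mobile phone simulates a mouse event from the parameters the PC sends, records of exactly this shape are what the created virtual input device injects into the input stack.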
  • other terminals that have established connections with the PC will also create virtual input devices according to the notification messages received.
  • if the operating system of the terminal receiving the notification message is the Android system, the uinput capability of Linux or the human interface device (HID) protocol can be used to create the virtual input device.
  • if the operating system of the terminal receiving the notification message is another operating system, such as iOS or Windows, the HID protocol can be used to create the virtual input device.
  • the above-mentioned embodiment is described by taking as an example that the terminal connected to the PC creates the virtual input device after receiving the notification message for notifying that the keyboard and mouse sharing mode of the PC has been turned on.
  • in some other embodiments, after receiving the notification message, the terminal connected to the PC may also display a pop-up window to ask the user whether to use the input device of the PC to control this device. If the user chooses to use the input device of the PC to control this device, the virtual input device is created; otherwise, the virtual input device is not created.
  • in some other embodiments, after the connection between the mobile phone and the PC is established, the PC automatically enables the keyboard and mouse sharing mode, and the user does not need to enable it manually.
  • the virtual input device can also be created automatically without the PC sending a notification message.
  • alternatively, a pop-up window may be displayed to ask the user whether to use the input device of the PC to control this device. If the user chooses to use the input device of the PC to control this device, the virtual input device is created automatically; otherwise, the virtual input device is not created.
  • the PC enables the keyboard and mouse sharing mode after the connection between the PC and the mobile phone is established.
  • the PC may also start the keyboard and mouse sharing mode first.
  • the keyboard and mouse sharing mode can be manually turned on by the user or automatically turned on by the PC.
  • the PC will establish a connection with the mobile phone.
  • the realization of establishing the connection between the PC and the mobile phone is the same as that described in the above-mentioned embodiment, and details are not repeated here.
  • the mobile phone can create a virtual input device automatically or according to a notification message of the PC or according to a user's choice.
  • when the keyboard and mouse sharing mode of the PC is turned on, the PC and the mobile phone are connected, and the virtual input device of the mobile phone is created, it can be considered that the PC has shared its input device with the mobile phone, and the user can use the input device of the PC to control the mobile phone.
  • taking the mouse being the input device of the PC as an example, the PC temporarily responds to the operation of the mouse; that is, the user can temporarily use the mouse to control the PC.
  • when it is determined that the mouse shuttle condition is met, the PC can also trigger another terminal that has established a connection with the PC and created a virtual input device, such as the mobile phone, to respond to the operation of the mouse. That is to say, after the mouse shuttle condition is satisfied, the user can use the mouse to control another terminal that has established a connection with the PC and created a virtual input device, such as the mobile phone.
  • the mouse shuttle condition may be that the cursor displayed on the PC display screen slides over the edge of the PC display screen. That is to say, the user can move the mouse so that the cursor displayed on the PC display screen slides over the edge of the PC display screen, to trigger the other terminal that has established a connection with the PC and created a virtual input device to respond to the operation of the mouse.
  • the PC may enable input (input) monitoring, and mount a hook (HOOK).
  • the input monitor can be used to monitor the relative displacement and coordinate position of the cursor.
  • Input listeners can also be used to listen for keyboard events.
  • the mounted HOOK can be used to intercept the corresponding input event (or shield the corresponding input event) after the mouse shuttle starts. For example, if the input device is a mouse, the input event can be a mouse event; after the mouse shuttle starts, the mounted HOOK can intercept mouse events so that they are not transmitted to the Windows system of the PC after being received by the keyboard and mouse module of the PC.
  • the mounted HOOK can also be used to capture intercepted input events, such as parameters in mouse events, after the mouse shuttle starts.
  • the PC can use input monitoring to monitor the relative displacement and coordinate position of the cursor, and determine whether the mouse shuttle condition is satisfied according to the monitored data.
  • after the mouse shuttle starts, the mounted HOOK intercepts the input event (such as a mouse event), captures the parameters in it, and sends the captured parameters to the other terminal that is connected to the PC and has created a virtual input device, so that the terminal can use the created virtual input device to simulate the corresponding input event (such as a mouse event) and respond to it. That is, the other terminal that is connected to the PC and has created the virtual input device responds to the operation of the mouse.
  • in some other embodiments, the interception of input events and the capture of the parameters in them can also be implemented in other ways. For example, taking the mouse as the input device, after the keyboard and mouse sharing mode is turned on, the PC can mount the HOOK and register RAWINPUT. After the mouse shuttle starts, the mounted HOOK can be used to intercept (or shield) mouse events, and the registered RAWINPUT can be used to capture the parameters in the intercepted mouse events.
  • This embodiment does not limit the specific implementation of the interception of mouse events and the capture of parameters therein.
  • the implementation of the interception of input events and the capture of parameters therein by mounting a HOOK is used as an example for introduction.
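The intercept-and-forward flow described above (the HOOK intercepts the event, its parameters are captured and sent to the remote terminal) can be modeled in a platform-neutral way. The sketch below reduces the Windows-specific pieces (the mounted HOOK and RAWINPUT named in the text) to a single callback; all function names and the event dictionary shape are illustrative assumptions, not from the patent:

```python
def make_hook(shuttle_active, send_to_remote):
    """Return a hook callback modeling the mounted HOOK.

    shuttle_active() reports whether the mouse shuttle has started;
    send_to_remote(params) stands in for transmitting the captured
    parameters to the terminal that created the virtual input device.
    """
    def hook(event):
        if shuttle_active():
            # Intercept: capture the parameters and forward them instead of
            # letting the event reach the local windowing system.
            send_to_remote({"dx": event["dx"], "dy": event["dy"],
                            "buttons": event["buttons"]})
            return True   # swallow the event locally
        return False      # shuttle not started: pass through to the local system
    return hook
```

In this model, returning `True` plays the role of the HOOK shielding the mouse event from the PC's Windows system, while `False` lets the PC respond to the mouse as usual.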
  • the user can use the input device of the PC to control the mobile phone only after the cursor displayed on the PC display screen slides over the edge of the PC display screen.
  • in some other embodiments, the user may be allowed to use the input device of the PC to control the mobile phone as soon as the connection between the PC and the mobile phone is established, the keyboard and mouse sharing mode of the PC is turned on, and the mobile phone has created the virtual input device. That is, after these conditions are met, if the user operates the input device of the PC, such as the mouse, the PC can intercept the corresponding input event, capture the parameters of the operation, and send them to the mobile phone.
  • the user can use the input device of the PC to control the mobile phone, and can also use the input device of the mobile phone to control the mobile phone.
  • corresponding interfaces may be displayed on different terminals according to different input devices for controlling the mobile phone.
  • the method may include the following S302-S307.
  • the PC receives a mouse movement event, and displays an animation of the cursor movement on the display screen of the PC according to the mouse movement event.
  • the cursor described in this embodiment may also be referred to as a mouse pointer.
  • the cursor can be an image, which can be dynamic or static, and the style of the cursor can be different in different situations.
  • the PC monitors the coordinate position of the cursor on the PC display screen.
  • when the PC determines, according to the coordinate position of the cursor on the PC display screen, that the cursor slides out of the edge of the PC display screen, the PC intercepts the mouse movement event and sends the parameter 1 contained in the mouse movement event to the mobile phone.
  • when the user wants to use the mouse to control another terminal that is connected to the PC and has created a virtual input device, for example, when the user wants to operate the current display interface of the mobile phone, the user can continuously move the mouse in the same direction so that the cursor displayed on the PC display screen slides over (or slides out of) the edge of the PC display screen, that is, the mouse shuttle condition is triggered.
  • the PC may determine the coordinate position of the cursor on the PC display screen according to the initial position and relative displacement of the cursor, so as to determine whether the cursor slides out of the edge of the PC display screen.
  • the initial position of the cursor may be the coordinate position of the cursor on the PC display screen when the mouse starts to move, or the coordinate position of the cursor on the PC display screen before the mouse starts to move.
  • the initial position of the cursor may specifically refer to the coordinate position of the cursor in a coordinate system in which the upper left corner of the PC display screen is the coordinate origin, the X axis points from the upper left corner to the right edge of the PC display screen, and the Y axis points from the upper left corner to the lower edge of the PC display screen.
  • the specific process for the PC to determine whether the cursor slides out of the edge of the PC display screen may be as follows: referring to FIG. 4, the PC may establish a coordinate system in which the initial coordinate position of the cursor is the coordinate origin (position o shown in FIG. 4), the X axis points from the coordinate origin o to the right edge of the PC display screen, and the Y axis points from the coordinate origin o to the upper edge of the PC display screen.
  • the PC can determine the coordinate values of each edge of the PC display screen in this coordinate system.
  • the coordinate values of each edge of the PC display screen in this coordinate system can be determined according to the resolution of the PC display screen and the initial position of the cursor.
  • the coordinate value of the right edge of the PC display screen on the X axis is x1
  • the coordinate value of the left edge on the X axis is -x2
  • the coordinate value of the upper edge on the Y axis is y1
  • the coordinate value of the lower edge on the Y axis is -y2.
  • if the coordinate value x of the cursor on the X axis is greater than x1, it can be determined that the cursor slides out of the right edge of the PC display screen. If the coordinate value x of the cursor on the X axis is less than -x2, it can be determined that the cursor slides out of the left edge of the PC display screen. If the coordinate value y of the cursor on the Y axis is greater than y1, it can be determined that the cursor slides out of the upper edge of the PC display screen. If the coordinate value y of the cursor on the Y axis is less than -y2, it can be determined that the cursor slides out of the lower edge of the PC display screen.
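The edge check above can be sketched as a small function. The edge coordinates x1, -x2, y1, -y2 would in practice be derived from the display resolution and the cursor's initial position, as the text notes; the function name is an illustrative assumption:

```python
def edge_slid_out(x, y, x1, x2, y1, y2):
    """Return which edge the cursor slid out of, or None if it is still on screen.

    (x, y) is the cursor position in the coordinate system of FIG. 4, whose
    origin o is the cursor's initial position; the right/left edges sit at
    x1 and -x2 on the X axis, the upper/lower edges at y1 and -y2 on the Y axis.
    """
    if x > x1:
        return "right"
    if x < -x2:
        return "left"
    if y > y1:
        return "upper"
    if y < -y2:
        return "lower"
    return None   # cursor has not crossed any edge; no mouse shuttle
</parameter>```

For example, with edges at x1 = x2 = 4 and y1 = y2 = 3, a cursor at (5, 0) has slid out of the right edge, so the mouse shuttle condition is triggered there.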
  • the user can use the input device of the PC to operate other terminals.
  • after the mouse shuttles to another terminal, the PC can send the data of the operation input by the user using the input device of the PC to the device that created the virtual input device.
  • that is to say, after the mouse shuttles to the terminal, when the user uses the input device of the PC to input an operation, the PC can intercept the input events corresponding to the operation, such as mouse move events, mouse press events, and mouse up events.
  • the parameters contained in the intercepted input events are transmitted to the other terminal that is connected to the PC and has created a virtual input device, so that the terminal can respond to the operation of the input device of the PC, such as the mouse.
  • the PC can determine the device to which the mouse is shuttled in the following manner.
  • for example, if only one device that has created a virtual input device, such as the mobile phone, is connected to the PC, the PC can determine that the mouse shuttles to the mobile phone, that is, the PC can transmit the corresponding parameters to the mobile phone, so that the mobile phone responds to the operation of the input device of the PC, such as the mouse.
  • if there are multiple devices connected to the PC, and some or all of the multiple devices have created virtual input devices, the PC can display a list option on the display screen of the PC when it determines that the mouse shuttle condition is triggered.
  • the list options include the identifiers of the devices connected to the PC that have created the virtual input device (for example, including the identifiers of the above mobile phones).
  • the PC can determine, according to the user's selection, that the mouse is shuttled to the device, that is, the device that responds to the operation of the input device of the PC. If the user selects the above mobile phone identifier, the PC determines that the mouse is shuttled to the mobile phone, that is, the PC can send corresponding parameters to the mobile phone, so that the mobile phone can respond to the operation of the input device of the PC. After the mobile phone receives the corresponding parameters, it can simulate the corresponding input events, such as mouse events, and make corresponding responses, that is, the mobile phone will respond to the operation of the input device of the PC.
  • the device connected to the PC may send to the PC a message indicating that the virtual input device is successfully created.
  • the PC can obtain which devices among the devices connected to the PC have successfully created the virtual input device, and display the above list options based on this.
  • in some other embodiments, the shuttle relationship may be pre-configured. If there are multiple devices connected to the PC, and some or all of the multiple devices have created virtual input devices, the PC can determine, according to the pre-configured shuttle relationship, to which device the mouse shuttles, that is, determine which device responds to the operation of the input device of the PC.
  • multiple devices connected to the PC include the above-mentioned mobile phone, and the mobile phone creates a virtual input device.
  • for example, the pre-configured shuttle relationship is that when the cursor slides out from the left side (or left edge) of the PC display screen, the mouse shuttles to the mobile phone.
  • in this way, the PC can not only determine that the mouse shuttle starts, but also determine that the mouse shuttles to the mobile phone; that is, the PC can send the corresponding parameters to the mobile phone, so that the mobile phone responds to the operation of the input device of the PC.
  • if only one device connected to the PC has created a virtual input device, whether the mouse shuttles to that device can also be determined according to the pre-configured shuttle relationship. For example, the pre-configured shuttle relationship is that when the cursor slides out from the left edge of the PC display screen, the mouse shuttles to the mobile phone.
  • the device to which the mouse is shuttled can be determined by identifying device positions. For example, taking the input device being a mouse as an example, the user presses and moves the mouse so that the cursor slides over the left edge of the PC display screen, and positioning technologies such as Bluetooth, Ultra-wideband (UWB), and ultrasound can be used to identify the positions of the devices around the PC. If the PC recognizes that the device on its left side is a mobile phone, it can determine that the mouse shuttles to the mobile phone.
  • the shuttle relationship may be configured in advance by the user through a configuration file, or a configuration interface for configuring the shuttle relationship may be provided for the user, and the user may configure the shuttle relationship in advance through the configuration interface.
  • the PC receives the user's operation of opening the configuration interface, and displays the configuration interface.
  • the configuration interface includes the logo of the PC (such as the icon of the PC) and the logo of the mobile phone (such as the icon of the mobile phone). The user can configure the shuttle relationship by dragging these two logos.
  • if the user places the logo of the mobile phone on the left of the logo of the PC, the PC may determine that when the cursor slides over the left edge of the display screen of the PC, the mouse shuttles to the mobile phone. If the user places the logo of the mobile phone on the right of the logo of the PC, the PC can determine that when the cursor slides over the right edge of the display screen of the PC, the mouse shuttles to the mobile phone.
  • the shuttle relationship of each device can be configured in a pre-configured manner. In the following embodiments, the case where it is determined that the mouse shuttles to the mobile phone is taken as an example for description.
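The pre-configured shuttle relationship amounts to a lookup from the edge the cursor slides over to the target device. A minimal Python sketch under that reading (the edge names and device labels are illustrative assumptions, not values from this embodiment):

```python
# Hypothetical sketch: map the edge the cursor slides over to the
# device that should respond to the PC's input device afterwards.
SHUTTLE_RELATIONSHIP = {
    "left": "mobile_phone",   # cursor slides out of the left edge
    "right": None,            # no device configured on the right edge
}

def shuttle_target(edge: str):
    """Return the device the mouse shuttles to, or None if no
    shuttle relationship is configured for this edge."""
    return SHUTTLE_RELATIONSHIP.get(edge)
```

The user-facing configuration interface (dragging the device logos) would simply rewrite this table.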
  • the above S301 can be executed before the mouse shuttle is triggered, or can be executed after the mouse shuttle is triggered.
  • this is not specifically limited in this embodiment.
  • the parameters included in the above input event may include: operation parameters.
  • the input event is a mouse event as an example.
  • the operation parameters (or referred to as mouse operation parameters) included in the mouse event may include: a mouse button flag, coordinate information, wheel information, and key position information.
  • the mouse button flag is used to indicate which operation the user performed: pressing, lifting, moving, or scrolling the mouse wheel.
  • the coordinate information is used to indicate the X coordinate and Y coordinate of the cursor movement when the user moves the mouse.
  • the wheel information is used to indicate the X-axis distance and the Y-axis distance that the wheel scrolls when the user operates the scroll wheel of the mouse.
  • the key position information is used to indicate which of the left button, middle button, or right button of the mouse the user operated.
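The four operation parameters listed above can be pictured as one record per mouse event. The following Python sketch is only illustrative; the field names and value conventions are assumptions, not the actual parameter encoding:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MouseOperationParams:
    """Illustrative container for the operation parameters the text
    lists for a mouse event; names and encodings are assumptions."""
    button_flag: str                   # "press", "lift", "move", or "scroll"
    coords: Optional[Tuple[int, int]]  # cursor X/Y when moving, else None (empty)
    wheel: Optional[Tuple[int, int]]   # wheel X/Y distance, else None (empty)
    key_position: Optional[str]        # "left", "middle", "right", else None

# A movement event carries coordinates; wheel and key position are empty.
move_event = MouseOperationParams("move", (120, 45), None, None)
```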
  • the input device of the PC is a mouse as an example.
  • S302-S304 are illustrated by example.
  • when the user wants to use the mouse of the PC to operate the mobile phone, the user can move the mouse of the PC.
  • the mouse and keyboard module of the PC can receive a corresponding input event, such as a movement event, and the movement event can be called a mouse movement event.
  • since the mouse shuttle condition has not been triggered at this time, the HOOK will not intercept the mouse movement event, and the mouse movement event will be transmitted to the Windows system of the PC.
  • the Windows system of the PC can draw the animation of the cursor movement and display it on the display screen of the PC. For example, as shown in FIG. 5 , along with the movement of the mouse 501 , the PC displays an animation of the movement of the cursor 503 on the display screen 502 of the PC correspondingly.
  • the PC will enable input monitoring and mount HOOK.
  • the keyboard and mouse module of the PC can monitor the real-time coordinate position of the cursor on the PC display screen by using input monitoring.
  • when the mouse and keyboard module of the PC determines, according to the monitored real-time coordinate position of the cursor on the PC display screen, that the cursor slides over the edge of the PC display screen, it can be determined that the above mouse shuttle condition is satisfied, indicating that the user wants to use the mouse of the PC to control another terminal.
  • the mouse and keyboard module of the PC determines the start of the mouse shuttle.
  • after the keyboard and mouse module of the PC determines that the mouse shuttle starts, if the user operates the mouse of the PC, the keyboard and mouse module of the PC will use the HOOK to intercept the received input events, such as mouse events, and use the HOOK to capture the parameters in the intercepted input events. Then, the PC can transmit the parameters to the mobile phone through the established connection for the mobile phone to respond accordingly. For example, continuing with the example shown in FIG. 5, after the cursor slides over the edge of the PC display screen, the user continues to move the mouse in the same direction. The mouse and keyboard module of the PC can receive a movement event, such as a mouse movement event.
  • the mouse and keyboard module of the PC can use the HOOK to intercept (or shield) it, so that the mouse movement event will not be sent to the Windows system of the PC, and the PC therefore does not respond to the received mouse movement event.
  • the keyboard and mouse module of the PC can also use HOOK to capture the parameters in the mouse movement event, such as parameter 1.
  • the PC can send the parameter 1 to the mobile phone through the established connection.
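The interception-and-forward behaviour described above can be sketched as a small dispatch function: once the shuttle has started, the HOOK keeps the event away from the local system and sends its captured parameters to the peer. This is a simplified model, not the actual HOOK implementation:

```python
def handle_input_event(event: dict, shuttle_active: bool, send_to_phone):
    """Hypothetical dispatch mirroring the HOOK behaviour described above:
    once the mouse shuttle has started, events are intercepted (not passed
    to the local system) and their captured parameters are forwarded to
    the phone over the established connection."""
    if shuttle_active:
        send_to_phone(event["params"])   # forward captured parameters
        return "intercepted"
    return "deliver_to_windows"          # normal local handling
```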
  • parameter 1 included in the mouse movement event may include: operation parameter 1.
  • the operation parameter 1 may include: the mouse button flag for indicating that the user has moved the mouse, the coordinate information for indicating the X coordinate and the Y coordinate of the cursor movement, the wheel information (the value is empty) and the key position information (value is empty).
  • the PC can also send, through the established connection, shuttle state information to the mobile phone to indicate that the mouse starts to shuttle.
  • the shuttle status information is used to instruct the mobile phone to start accepting the input from the input device of the PC.
  • the mobile phone can simulate a cursor and display the cursor on the display screen of the mobile phone. For example, after the mobile phone receives the information, a cursor can be created and the cursor can be displayed by the launcher of the mobile phone.
  • the PC may also hide the cursor displayed on the PC display screen after determining that the mouse shuttle has started. This gives the user the visual effect that the cursor moves from the PC to the mobile phone.
  • the position where the mobile phone displays the cursor may be predefined, for example, it may be any position on the display screen of the mobile phone.
  • the position where the mobile phone displays the cursor may also correspond to the position where the cursor slides out on the PC.
  • for example, if the cursor slides out from the right edge of the PC display screen, the cursor is displayed on the left edge of the mobile phone display screen.
  • the cursor slides out from the center position of the right edge of the display screen of the PC, and the cursor is displayed at the center position of the left edge of the display screen of the mobile phone.
  • the PC may send the coordinate position on the PC display screen when the cursor slides out of the PC display screen to the mobile phone.
  • the mobile phone can determine, according to the coordinate position and the resolution of the PC (such as A*B), from which edge of the PC display screen the cursor slid out. For example, if the coordinates when the cursor slides out of the PC display screen are (x1, y1) and x1 equals A, the mobile phone can determine that the cursor slid out from the right edge of the PC display screen.
  • the mobile phone can also determine, according to y1 and the resolution of the PC (such as the height B), the ratio of the slide-out position to the height of the PC display screen. Based on this ratio and the resolution of the mobile phone, the exact position at which the cursor is displayed on the left edge of the mobile phone display screen can be determined.
  • the resolution of the PC can be sent from the PC to the mobile phone during the process of establishing a connection with the mobile phone or after the connection is successfully established.
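Mapping the slide-out position on the PC's edge to the display position on the mobile phone's edge is a simple proportional scaling. A sketch of that computation; the example resolutions are assumptions:

```python
def cursor_entry_y(exit_y: int, pc_height: int, phone_height: int) -> int:
    """Scale the cursor's slide-out position on the PC's edge to the
    phone's edge, preserving the ratio described above."""
    return round(exit_y / pc_height * phone_height)

# e.g. a 1920x1080 PC display and a phone display 2340 pixels tall:
# sliding out halfway down the PC edge lands halfway down the phone edge.
cursor_entry_y(540, 1080, 2340)
```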
  • the mobile phone receives parameter 1, simulates a mouse movement event according to parameter 1, and displays an animation of cursor movement on the display screen of the mobile phone according to the mouse movement event.
  • the purpose of triggering the cursor to shuttle from the PC to the mobile phone is to control the mobile phone. Therefore, after the user moves the mouse to shuttle the cursor from the display screen of the PC to the display screen of the mobile phone, the user can continue to move the mouse, that is, the user will use the mouse of the PC to input the operation of moving the cursor on the mobile phone.
  • the mobile phone will receive corresponding parameters, such as parameter 1 included in the mouse movement event.
  • a movement event, such as a mouse movement event, can be simulated according to the operation parameters included in parameter 1.
  • the mobile phone can draw an animation of the cursor movement and display it on the display screen of the mobile phone until the cursor moves to the position on the display screen of the mobile phone that the user wants to operate.
  • the mobile phone currently displays a first interface, and the first interface includes one or more contents.
  • the content of the first interface may refer to elements displayed on the first interface.
  • this content can be a user-actionable element, such as a control. Taking as an example that the content the user wants to operate is a control, the user can move the cursor on the mobile phone to the position of the control by continuing to move the mouse of the PC.
  • the display interface on the mobile phone may have been projected on the PC display screen, or may not be projected on the PC, but only displayed on the mobile phone. It can be understood that, after the mouse is shuttled to the mobile phone, the operation performed by the user using the mouse of the PC may only be used to move the cursor on the mobile phone, and will not cause the display interface of the mobile phone to change. For example, continuing to combine the above example, after the mouse is shuttled to the mobile phone, the user continues to move the mouse, and the mobile phone can simulate the mouse movement event. In response to the simulated mouse movement event, the mobile phone can display the movement animation of the cursor, and the display interface on the mobile phone will not change.
  • the mobile phone can continue to maintain the current display strategy. For example, before the user uses the PC mouse to control the mobile phone, and the mobile phone projects the interface to the PC, the mobile phone can continue to project the interface to the PC. For another example, before the user uses the mouse of the PC to control the mobile phone, the mobile phone does not project the interface to the PC, the mobile phone may continue to display the interface only on the mobile phone.
  • the following description is given by taking as an example that the mobile phone does not project the interface to the PC before the user uses the PC mouse to control the mobile phone.
  • the user wants to open an application in the mobile phone for example, the first interface is the desktop, and the content the user wants to operate is the icon of the application on the desktop.
  • the mobile phone can display the icons of these applications on the desktop (or called a home screen) of the mobile phone.
  • the mouse may continue to be moved until the cursor moves to the position of the icon of the application that the user wants to open displayed on the display screen of the mobile phone.
  • the application described in this embodiment may be an embedded application (i.e., a system application of the mobile phone, such as a calculator, camera, settings, or gallery) or a downloadable application (such as a browser, weather, or email).
  • embedded applications are applications that are provided as part of the mobile phone implementation.
  • a downloadable application is an application that can provide its own internet protocol multimedia subsystem (IMS) connection.
  • the downloadable application may be an application pre-installed in the cell phone or may be a third-party application downloaded by the user and installed in the cell phone.
  • the input device of the PC is a mouse as an example.
  • S305 is illustrated by an example. Since the operating systems of the PC and the mobile phone are different, the key values of the operation parameters in the input events, such as mouse events, are different.
  • after the mobile phone receives the corresponding parameters, such as the above parameter 1, the mobile phone can convert the key code of the operation parameter 1 in the received parameter 1 into a key code that the mobile phone can recognize, according to a preset mapping relationship.
  • the mobile phone can then use the created virtual input device to simulate, according to the operation parameter 1 after the key code conversion, an input event that the mobile phone can recognize, such as the corresponding mouse event; that is, it can simulate the movement event, such as the mouse movement event.
  • the mobile phone can draw the animation of the cursor movement according to the simulated mouse movement event, and send it to the launcher of the mobile phone to display the animation of the cursor movement on the display screen of the mobile phone.
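The key-code conversion via a preset mapping relationship can be sketched as a dictionary lookup. The numeric key codes below are placeholders invented for illustration; the embodiment does not specify concrete values for either operating system:

```python
# Hypothetical PC-to-phone key-code mapping; the numeric values are
# placeholders, not real key codes from either platform.
KEYCODE_MAP = {0x201: 1001, 0x202: 1002, 0x200: 1003}

def convert_keycode(pc_keycode: int) -> int:
    """Convert a key code from the PC's input event into a key code
    the phone can recognize, per the preset mapping relationship."""
    try:
        return KEYCODE_MAP[pc_keycode]
    except KeyError:
        raise ValueError(f"no mapping for key code {pc_keycode:#x}")
```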
  • the application that the user wants to open is the calculator in the mobile phone.
  • with the movement of the mouse 501, the mobile phone correspondingly displays an animation of the movement of the cursor 602 on the display screen 601 of the mobile phone. That is, with the movement of the mouse 501, the cursor 602 can move along the track 603 to the position of the icon 604 of the calculator.
  • the display interface of the mobile phone will not change.
  • the mobile phone can determine to continue to maintain the current display strategy, that is, the mobile phone can continue to display the current interface and the animation of the cursor movement, without projecting the interface content of the mobile phone to the PC for display.
  • the user can input the corresponding operation through the input device of the PC. This operation may be referred to as the first operation.
  • the first operation can be a click operation, so that the mobile phone can open the application according to the first operation.
  • the PC (e.g., the keyboard and mouse module of the PC) can receive the corresponding input event, such as the first input event.
  • the first operation may include one operation, or may include multiple operations.
  • if the first operation includes one operation, the first input event includes one input event; if the first operation includes multiple operations, the first input event includes a corresponding number of input events.
  • the click operation may include two operations, namely a pressing operation and a lifting operation.
  • for example, when the first operation is a click operation, the corresponding first input events include a press event and a lift event.
  • the press operation may be a mouse press operation
  • the lift operation may be a mouse lift operation
  • the press event may be a mouse press event
  • the lift event may be a mouse lift event.
  • the process that the user uses the mouse of the PC to open the application on the mobile phone may include the following S306-S307.
  • the PC receives the mouse down event and the mouse up event, intercepts them, captures parameter 2 contained in the mouse down event and parameter 3 contained in the mouse up event, and sends parameter 2 and parameter 3 to the mobile phone.
  • the first operation parameter in the embodiment of the present application may include the above-mentioned parameter 2 and parameter 3.
  • the mobile phone receives parameter 2 and parameter 3, simulates the mouse down event according to parameter 2 and the mouse up event according to parameter 3, and displays the interface of the application on the PC display screen according to the mouse down event and the mouse up event.
  • the interface of the application may be the second interface in this embodiment of the application.
  • the user wants to open a certain application (such as a calculator), and the operation of opening the application is a click operation on the icon of the application displayed on the mobile phone desktop, that is, the first operation is a click operation.
  • the user can press the mouse (eg, the left button of the mouse), and then lift the finger.
  • the mouse and keyboard module of the PC can receive a press event (such as a mouse press event) and a lift event (such as a mouse up event).
  • the mouse and keyboard module of the PC can use HOOK to intercept (or shield) the mouse down event and the mouse up event, so that the mouse down event and the mouse up event will not be sent to the PC Windows system, so that the PC does not respond to received mouse down events and mouse up events.
  • the keyboard and mouse module of the PC can also use HOOK to capture the parameters in the mouse down event, such as parameter 2, and capture the parameters in the mouse up event, such as parameter 3.
  • the PC can also send the captured parameter 2 of the mouse down event and parameter 3 of the mouse up event to the mobile phone through the established connection.
  • parameter 2 may include operation parameter 2 .
  • Operation parameter 3 may be included in parameter 3 .
  • the operation parameter 2 may include: a mouse button flag used to indicate that the user pressed the mouse, coordinate information (the value is empty), wheel information (the value is empty), and key position information used to indicate that the user pressed the left button of the mouse.
  • the operation parameter 3 may include: a mouse button flag used to indicate that the user lifted the mouse, coordinate information (the value is empty), wheel information (the value is empty), and key position information used to indicate that the user lifted the left button of the mouse.
  • the mobile phone can receive parameter 2 and parameter 3. After that, according to the preset mapping relationship, the mobile phone can convert the key code of the operation parameter 2 in the received parameter 2, and the key code of the operation parameter 3 in the received parameter 3, into key codes that the mobile phone can recognize. The mobile phone then uses the created virtual input device to simulate a press event, such as the mouse down event, according to the operation parameter 2 after the key code conversion, and uses the created virtual input device to simulate a lift event, such as the mouse up event, according to the operation parameter 3 after the key code conversion.
  • the mobile phone can determine that the user has clicked the icon of the calculator on the desktop according to the mouse down event, the mouse up event and the currently displayed position of the cursor. For example, after the keyboard and mouse shuttle starts (eg, the mobile phone receives the shuttle status information from the PC for indicating that the mouse starts to shuttle), the mobile phone can register a listener for the coordinate position of the cursor. Through the listener, the mobile phone can monitor the coordinate position of the cursor on the display screen of the mobile phone in real time. That is, the mobile phone can use the listener to determine the current coordinate position of the cursor.
  • after the mobile phone determines that the user has clicked the icon of the calculator, the mobile phone can determine that the display interface of the mobile phone will change. After that, the mobile phone can first determine whether the user's intention in inputting the operation is to display a corresponding interface, such as the interface of the application, on the mobile phone, or to display the interface of the application on the PC. If the mobile phone determines that the user's operation intention is to display the interface of the application on the PC, the interface of the application can be displayed on the PC display screen. If the mobile phone determines that the user's operation intention is to display the interface of the application on the mobile phone, the interface of the application is displayed on the mobile phone, and the interface of the application is not projected onto the display screen of the PC.
  • the mobile phone may determine the user's operation intention according to the input device that inputs the click operation. If the input device (or input source) for inputting the click operation is the mouse of the PC, it can be determined that the user's operation intention is to display the corresponding interface on the PC. If the input device for inputting the click operation is the touch screen of the mobile phone, it can be determined that the user's operation intention is to display the corresponding interface on the mobile phone. For example, the mobile phone can determine whether the input source of the click operation is the mouse of the PC or the touch screen of the mobile phone according to the mouse down event and the mouse up event in S307.
  • the mobile phone can determine the input source of the click operation in the following ways.
  • mode 1: the mobile phone can determine the input source of the corresponding operation according to the input device identification (ID) included in the input event.
  • the input event may also include an input device ID, where the input device ID is used to identify the input source for inputting the corresponding operation.
  • the input event simulated by the mobile phone using the virtual input device created is no exception, and also includes the input device ID. Therefore, the mobile phone can determine the input source for inputting the corresponding operation according to the input device ID included in the input event.
  • the mouse down event and the mouse up event in S307 may include the input device ID. Since the mouse down event and the mouse up event are simulated by the mobile phone using the virtual input device created, the input device ID is the ID of the virtual input device. The ID of the virtual input device may be generated and saved in the mobile phone when the mobile phone creates the virtual input device. Based on this, after the mobile phone simulates the mouse down event and the mouse up event, the mobile phone can obtain the input device ID in the mouse down event and the mouse up event. The mobile phone can determine that the input device ID is the ID of the virtual input device.
  • since the virtual input device is used by the mobile phone to simulate the corresponding input event after the user inputs an operation with the input device of the PC, when the mobile phone determines that the input device ID included in the input event is the ID of the virtual input device, the mobile phone can determine that the input source of the corresponding operation, that is, of the above click operation, is the mouse of the PC.
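Mode 1 can be sketched as comparing the event's input device ID against the ID saved when the virtual input device was created. The ID values used here are hypothetical:

```python
# Assumed ID generated and saved when the phone created the virtual
# input device; the actual value and format are not specified.
VIRTUAL_INPUT_DEVICE_ID = "virtual-0"

def input_source_by_id(event: dict) -> str:
    """Mode 1 sketch: if the event carries the virtual input device's ID,
    the operation came from the PC's mouse; otherwise treat it as input
    from the phone itself (e.g. its touch screen)."""
    if event.get("input_device_id") == VIRTUAL_INPUT_DEVICE_ID:
        return "pc_mouse"
    return "phone_touchscreen"
```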
  • mode 2: the mobile phone can determine the input source of the corresponding operation according to the input mode included in the input event.
  • the input event may further include an input mode, which is used to indicate the type of the device for inputting the corresponding operation, such as a mouse, a touch screen, a touch pad, and the like.
  • the input event simulated by the virtual input device created by the mobile phone is no exception, and also includes the input method. Therefore, the mobile phone can determine the input source for inputting the corresponding operation according to the input mode included in the input event.
  • the mouse down event and the mouse up event in S307 may include input methods.
  • the mouse down event and the mouse up event are mouse events simulated by the mobile phone, so the input method therein is used to indicate that the device for inputting the corresponding operation is the mouse.
  • the mobile phone can obtain the input methods in the mouse down event and the mouse up event.
  • the mobile phone can determine the input corresponding operation, that is, the input source of the above click operation is the mouse.
  • the mobile phone determines that the input source for inputting the corresponding operation is a mouse
  • the mouse may be directly connected to the mobile phone, or may be a mouse shared by other devices, such as a PC, to the mobile phone. Therefore, further, the mobile phone can also determine whether it is currently in the mouse shuttle state, that is, to determine whether the mouse is a mouse shared by the PC to the mobile phone.
  • if the mobile phone determines that it is currently in the mouse shuttle state, it indicates that the user input the click operation with the mouse of the PC, and the mobile phone can determine that the input source of the click operation is the mouse of the PC.
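Mode 2's two-step check, first the event's input mode, then whether the phone is in the mouse shuttle state, can be sketched as follows (the mode strings and return labels are assumptions):

```python
def input_source_by_mode(event: dict, in_shuttle_state: bool) -> str:
    """Mode 2 sketch: the event's input mode gives the device type; if it
    is a mouse and the phone is currently in the mouse-shuttle state, the
    source is the PC's shared mouse rather than a locally connected one."""
    if event.get("input_mode") != "mouse":
        return "phone_touchscreen"
    return "pc_mouse" if in_shuttle_state else "local_mouse"
```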
  • the mobile phone can determine that the input source of the click operation is the mouse of the PC, indicating that the user's operation intention is to display the corresponding interface on the PC. Then, in response to the above mouse down event and mouse up event, the mobile phone can display the corresponding interface, that is, the interface of the application whose icon the user clicked, on the display screen of the PC. For example, with reference to the example of FIG. 6, as shown in FIG. 7, after the cursor on the mobile phone moves to the icon 701 of the calculator, the user clicks the icon 701 of the calculator with the mouse 702 of the PC, such as by pressing the left button of the PC's mouse 702 and then lifting the finger.
  • the mobile phone can display the interface of the calculator on the PC display screen.
  • the PC displays the interface 703 of the calculator.
  • the interface displayed by the mobile phone may not be changed.
  • the desktop 704 is still displayed.
  • the mobile phone can also display the interface 703 of the calculator, which is not shown in the figure.
  • the specific implementation of displaying the interface of the calculator on the display screen of the PC by the mobile phone may be: after determining that the user uses a mouse input click operation, the mobile phone can start the screen projection service. After that, the mobile phone, such as the screen projection service module of the mobile phone, can obtain the data corresponding to the interface of the calculator and send it to the PC. After the PC receives the data, it can display the interface of the calculator on the PC display screen according to the data.
  • the screen projection service module of the mobile phone can obtain the corresponding data of the calculator's interface through the display manager of the mobile phone (for example, the display manager is a module of the framework layer of the mobile phone), such as the screen recording data, and send it to the PC.
  • thereby realizing the display of the interface of the calculator on the PC display screen.
  • the Distributed Multi-media Protocol can be used to realize the display of the interface of the mobile phone on the display screen of the PC.
  • the screen projection service module of the mobile phone can use the display manager (DisplayManager) of the mobile phone to create a virtual display (VirtualDisplay).
  • the screencasting service module of the mobile phone sends a request to create a VirtualDisplay to the display manager of the mobile phone.
  • the display manager of the mobile phone completes the creation of the VirtualDisplay, it can return the created VirtualDisplay to the screencasting service module of the mobile phone.
  • the screen projection service module of the mobile phone can move the interface to be drawn in response to the operation, such as the interface of the calculator, to the VirtualDisplay for drawing.
  • the screen projection service module of the mobile phone can obtain corresponding screen recording data.
  • the screen-casting service module of the mobile phone obtains the screen-recording data
  • the screen-recording data can be encoded and sent to the PC.
  • the screen projection service module of the PC can receive the corresponding data, and after decoding the data, the screen recording data can be obtained.
  • the screen projection service module of the PC cooperates with the frame layer of the PC to draw a corresponding interface, such as an interface of a calculator, and display it on the display screen of the PC according to the screen recording data.
  • the frame layer of the PC can provide a surfaceview to realize the display of the interface in the mobile phone on the PC side.
  • wireless projection can also be used to realize the display of the interface of the mobile phone on the PC display screen. That is, the mobile phone can obtain all the layers of the interface that needs to be drawn in response to the operation, integrate all the obtained layers into a video stream (or called screen recording data), encode it, and send it to the PC through the real time streaming protocol (RTSP). After the PC receives the video stream, it can decode and play it, so as to realize the display of the interface of the mobile phone on the PC display screen.
  • the mobile phone can perform instruction extraction on the interface that needs to be drawn in response to the operation, such as the interface of the calculator, to obtain an instruction stream, and also obtain the layer information of the interface, and then send the instruction stream and the layer information to the PC, for the PC to restore the interface that needs to be drawn in response to the operation, so as to realize the display of the interface of the mobile phone on the PC.
  • the method may include the following S308-S309.
  • the mobile phone receives the user's click operation on the icon of the application displayed on the mobile phone on the touch screen of the mobile phone.
  • the mobile phone displays an application interface on the mobile phone according to the click operation.
  • the user can use a finger to perform a touch operation at a corresponding position on the touch screen.
  • taking as an example that the application the user wants to open is the calculator, the content the user wants to operate is the icon of the application on the desktop, and the operation of opening the application is a click operation on the icon of the application displayed on the mobile phone desktop (that is, the first operation is a click operation), the user can use a finger to click the icon of the calculator displayed on the desktop of the mobile phone.
  • the mobile phone may receive the corresponding input event (the input event may be the second input event in this embodiment of the application). According to the input event and the user's operation position, the mobile phone can determine that the user has clicked the icon of the calculator on the desktop.
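Combining the input event with the operation position to identify what was clicked amounts to a hit test over icon bounds. A minimal sketch, assuming illustrative icon rectangles and names (none of which come from the patent):

```python
# Hit-testing a touch point against desktop icon bounding boxes. The table of
# icons and their (left, top, right, bottom) rectangles is hypothetical.

ICON_BOUNDS = {
    "calculator": (40, 200, 120, 280),
    "gallery":    (160, 200, 240, 280),
}

def hit_test(x, y):
    """Return the icon whose bounding box contains the touch point, if any."""
    for name, (l, t, r, b) in ICON_BOUNDS.items():
        if l <= x <= r and t <= y <= b:
            return name
    return None                              # tap landed on empty desktop
```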
  • when the mobile phone determines that the user has clicked the icon of the calculator, the mobile phone can determine that the display interface of the mobile phone will change. After that, the mobile phone can first determine whether the intention of the user's input operation is to display the corresponding interface, such as the interface of the application, on the mobile phone, or to display the interface of the application on the PC. The mobile phone can determine the user's operation intention according to the input device (or input source) that input the click operation.
  • determining the input source for inputting the click operation may be implemented by way 1 or way 2 in the above S307.
  • the mobile phone can receive a corresponding input event, and the input event includes the input device ID, which is used to identify that the input source of the click operation is the touch screen of the mobile phone. Therefore, according to the input device ID in the input event, the mobile phone can determine that the input source of the click operation is the touch screen of the mobile phone.
  • the mobile phone can receive a corresponding input event, and the input event includes an input method, which is used to indicate that the input source for inputting the click operation is the touch screen of the mobile phone. Therefore, according to the input method in the input event, the mobile phone can determine that the input source for inputting the click operation is the touch screen of the mobile phone.
  • when the mobile phone determines that the input source for inputting the above click operation is the touch screen of the mobile phone, indicating that the user wants to display the interface of the calculator on the mobile phone, then in response to the input event, the mobile phone can display the interface of the calculator on the mobile phone without displaying the interface of the calculator on the display screen of the PC.
  • for example, when a connection is established between the mobile phone and the PC and the mobile phone has created a virtual input device, the user touches the icon 801 of the calculator displayed on the desktop of the mobile phone with a finger, such as performing a click operation.
  • the mobile phone may display the interface 802 of the calculator on the mobile phone without projecting the interface 802 of the calculator onto the display screen 803 of the PC.
  • the user can also operate the mobile phone to display other content (such as the content in the first interface in the embodiment of the present application) by using the input device of the PC or the input device of the mobile phone.
  • the mobile phone can determine that the input source of the operation is the touch screen of the mobile phone according to the input device ID or input method included in the input event corresponding to the operation.
  • similarly, the mobile phone may determine, according to whether the input source of the operation is the mouse of the PC or the touch screen of the mobile phone, whether the user's operation intention is to display the corresponding interface (such as the second interface in the embodiment of the application) on the PC or on the mobile phone. If the input source for inputting the operation is the mouse of the PC, the mobile phone can display the corresponding interface on the display screen of the PC. If the input source for inputting the operation is the touch screen of the mobile phone, the mobile phone can display the corresponding interface on the mobile phone without projecting it to the display screen of the PC.
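The branching above — same click, different display target depending on the input source recorded in the input event — can be sketched as below. The device IDs and return strings are illustrative assumptions, not values from the patent:

```python
# Dispatching the display target by input source: events simulated by the
# phone's virtual input device came from the PC mouse, so the resulting
# interface is projected to the PC; events from the phone's own touch screen
# stay on the phone. Both device IDs are hypothetical.

VIRTUAL_INPUT_DEVICE_ID = 64     # assumed ID of the created virtual device
TOUCHSCREEN_DEVICE_ID = 2        # assumed ID of the phone's touch screen

def display_target(input_event):
    """Return where the interface resulting from this event should be shown."""
    if input_event["device_id"] == VIRTUAL_INPUT_DEVICE_ID:
        return "project to PC display"       # operation came from the PC mouse
    return "display on phone"                # operation came from the touch screen
```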
  • the mobile phone currently displays the home page 901 of the gallery.
  • the home page 901 of the gallery includes thumbnails of a plurality of pictures, including a thumbnail 903 of picture 1 .
  • if the user wants to display picture 1 on the PC, the user can move the mouse 902 of the PC, so that the mouse of the PC shuttles to the mobile phone. After that, the user can continue to move the mouse of the PC, so that the cursor on the mobile phone moves to the position of the thumbnail 903 of picture 1.
  • the user can use the mouse 902 of the PC to perform a click operation.
  • the mobile phone can obtain the input event corresponding to the click operation.
  • the mobile phone can determine that the input source for inputting the click operation is the mouse of the PC. In this way, in response to the click operation, the mobile phone can display the details interface including picture 1 on the display screen of the PC; as shown in (b) of FIG. 9, the details interface of picture 1 displayed by the PC is shown as 904.
  • the mobile phone currently displays the home page 1001 of the gallery.
  • the home page 1001 of the gallery includes thumbnails of a plurality of pictures, including a thumbnail 1002 of picture 1 .
  • the mobile phone can obtain the corresponding input event, and according to the input device ID or input method included in the input event, the mobile phone can determine that the input source of the click operation is the touch screen of the mobile phone. Therefore, as shown in (b) of FIG. 10, in response to the click operation, the mobile phone can display the details interface 1003 of picture 1 on the mobile phone without projecting the details interface 1003 to the display screen 1004 of the PC.
  • the specific implementation is similar to the implementation in which the user uses the input device of the PC or the input device of the mobile phone to operate the icon of the application displayed on the mobile phone, which will not be described in detail here.
  • the above example is described by taking as an example the case where the interface of the mobile phone is not projected onto the PC for display before the user performs a touch operation on the touch screen of the mobile phone.
  • if the interface of the mobile phone is being projected and displayed on the PC, after the mobile phone receives the user's touch operation on the touch screen of the mobile phone, the mobile phone can not only perform the operation of displaying the corresponding interface on the mobile phone, but also perform the operation of stopping the projection display of the interface of the mobile phone on the PC display screen.
  • the user can move the mouse to make the cursor displayed on the mobile phone slide out of the edge of the mobile phone display screen.
  • the mouse and keyboard shuttle ends after the cursor on the phone slides off the edge of the phone's display. After the keyboard and mouse shuttle ends, the user can control the PC by using the mouse of the PC.
  • when the mobile phone determines that the cursor on the mobile phone slides out of the edge of the display screen of the mobile phone, it indicates that the user wants to use the mouse to control other devices.
  • if the mobile phone has only established a connection with the PC, it indicates that the user wants to use the mouse to control the PC.
  • if the mobile phone has established connections with multiple devices, the mobile phone may display a list option including the identifiers of all devices connected to the mobile phone, for the user to select the device that the user wants to control with the mouse. If the user selects the identifier of the PC, it indicates that the user wants to use the mouse to control the PC.
  • the shuttle relationship can also be preconfigured in the mobile phone to determine which device the mouse is shuttled to, that is, to determine which device to respond to the operation of the mouse.
  • the configuration and application of the shuttle relationship are similar to the descriptions in the above-mentioned embodiments, and will not be described in detail here.
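The selection of the shuttle target described above (a single connected device is used directly; otherwise a preconfigured shuttle relationship decides; failing that, the user picks from a device list) might look like the following sketch, with hypothetical names throughout:

```python
# Deciding which connected device the mouse shuttles to. "ask user" stands in
# for displaying the list option of connected-device identifiers.

def pick_shuttle_target(connected, shuttle_config=None):
    """Return the shuttle target, or 'ask user' if it cannot be decided."""
    if len(connected) == 1:
        return connected[0]                  # only one candidate: use it
    if shuttle_config and shuttle_config in connected:
        return shuttle_config                # preconfigured shuttle relationship
    return "ask user"                        # show the device list option
```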
  • after the mobile phone determines that the user wants to use the mouse to control the PC, the mobile phone can send shuttle status information for indicating the end of the keyboard and mouse shuttle to the PC. After the PC receives the shuttle status information, it can determine that the mouse shuttle ends. After that, the PC can unload the HOOK (or close the HOOK), that is, cancel the interception of input events, such as mouse events.
  • in this way, the keyboard and mouse module of the PC will not intercept received input events, but will send them to the Windows system of the PC, so that the Windows system of the PC can respond to the input events.
  • the user can use the PC's mouse to control the PC.
  • the mouse and keyboard module of the PC can also redisplay the cursor on the PC display screen.
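The PC-side mount/unload behavior of the HOOK can be pictured as a simple toggle. This sketch uses invented names and stands in for the real low-level hook API: while the shuttle is active, the hook swallows input events and forwards them to the phone; once the shuttle-end status arrives, events flow to the OS again.

```python
# Toggleable interception: mounted == HOOK installed (events forwarded to the
# phone); unmounted == HOOK unloaded (events passed to the Windows system).

class MouseHook:
    def __init__(self):
        self.mounted = False
        self.forwarded, self.passed_to_os = [], []

    def on_event(self, event):
        if self.mounted:
            self.forwarded.append(event)     # intercepted: params go to phone
        else:
            self.passed_to_os.append(event)  # shuttle over: OS handles it

hook = MouseHook()
hook.mounted = True                          # keyboard and mouse shuttle starts
hook.on_event("mouse_move")
hook.mounted = False                         # shuttle-end info received: unload HOOK
hook.on_event("mouse_move")
```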
  • the specific implementation of determining by the mobile phone that the cursor on the mobile phone slides out of the edge of the display screen of the mobile phone may be: after the cursor is displayed on the mobile phone, the mobile phone can monitor the real-time coordinate position of the cursor on the display screen of the mobile phone (for example, the real-time coordinate position of the cursor can be obtained by using a registered listener). The mobile phone can determine the coordinate position of the cursor on the display screen of the mobile phone according to the initial position and relative displacement of the cursor, so as to determine whether the cursor slides out of the edge of the display screen of the mobile phone.
  • the initial position of the cursor may be the coordinate position of the cursor on the display screen of the mobile phone when the mouse starts to move, or the coordinate position of the cursor on the display screen of the mobile phone before the mouse starts to move.
  • the initial position of the cursor may specifically refer to a coordinate position in a coordinate system with the upper left corner of the mobile phone display as the coordinate origin, the X axis pointing from the upper left corner to the right edge of the mobile phone display, and the Y axis pointing from the upper left corner to the lower edge of the mobile phone display.
  • the specific implementation of the mobile phone determining that the cursor slides out of the edge of the display screen of the mobile phone is similar to the specific implementation of determining that the cursor slides out of the edge of the display screen of the PC by the PC, and will not be described in detail here.
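The coordinate bookkeeping described above (start from an initial position in the top-left-origin coordinate system, accumulate the relative displacements reported by mouse movement events, and flag when the cursor crosses a display edge) can be sketched as follows; the screen dimensions are an illustrative assumption:

```python
# Edge detection for the cursor: position = initial position + accumulated
# relative displacement, checked against the display bounds after every move.

SCREEN_W, SCREEN_H = 1080, 2340              # assumed phone display in pixels

def cursor_slid_out(initial, displacements):
    """Return True once the cursor leaves the display while replaying moves."""
    x, y = initial
    for dx, dy in displacements:
        x += dx
        y += dy
        if x < 0 or x > SCREEN_W or y < 0 or y > SCREEN_H:
            return True                      # cursor crossed a display edge
    return False
```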
  • if the user performs the operation by using the mouse of the PC, the mobile phone can project the corresponding interface to the PC for display; if the user performs the operation by using the touch screen of the mobile phone, the corresponding interface is displayed on the mobile phone and is not projected to the PC for display. In this way, the user can freely control, according to actual needs, on which device the interface of the second terminal is displayed, which not only protects user privacy but also avoids distracting the user, improving the user experience.
  • in the above example, the mobile phone chooses on which device to display the corresponding interface according to the input device ID or input method included in the corresponding input event.
  • the PC can not only share the mouse of the PC with other terminals, such as the mobile phone, but also share the keyboard of the PC with the mobile phone.
  • after the keyboard and mouse sharing mode is enabled, no matter whether the user uses the input device of the mobile phone or the mouse of the PC to operate the input box displayed on the mobile phone, the mobile phone will not display the virtual keyboard on the mobile phone, and by default the keyboard of the PC, that is, the physical keyboard, is used to realize input.
  • however, when the user uses the input device of the mobile phone, such as the touch screen, to operate the input box, the focus is on the mobile phone. If the physical keyboard of the PC is still used for input at this time, the user needs to frequently switch between the two devices, which reduces the efficiency of multi-terminal collaborative use.
  • FIG. 11 is a schematic flowchart of another display method provided by an embodiment of the present application. As shown in FIG. 11 , the method may include the following S1101-S1109.
  • the mobile phone establishes a connection with the PC.
  • the PC receives a mouse movement event, and displays an animation of the cursor movement on the display screen of the PC according to the mouse movement event.
  • the PC monitors the coordinate position of the cursor on the PC display screen.
  • the PC determines, according to the coordinate position of the cursor on the PC display screen, that the cursor slides out of the edge of the PC display screen, intercepts the mouse movement event, and sends the parameter 1 included in the mouse movement event to the mobile phone.
  • the mobile phone receives parameter 1, simulates a mouse movement event according to parameter 1, and displays an animation of cursor movement on the display screen of the mobile phone according to the mouse movement event.
  • the PC receives a mouse down event and a mouse up event, intercepts the mouse down event and the mouse up event, and sends the parameter 2 contained in the mouse down event and the parameter 3 contained in the mouse up event to the mobile phone.
  • the interface may be the third interface in this embodiment of the present application.
  • the mobile phone receives parameter 2 and parameter 3, simulates a mouse down event and a mouse up event according to the parameter 2 and parameter 3, and determines not to display the virtual keyboard on the mobile phone according to the mouse down event and the mouse up event.
  • the mobile phone receives the user's click operation on the input box in the interface displayed by the mobile phone on the touch screen of the mobile phone.
  • the interface may be the third interface in this embodiment of the present application.
  • the mobile phone displays a virtual keyboard on the mobile phone according to the click operation.
  • after receiving the user's operation on the input box displayed on the mobile phone (such as the second operation in this embodiment of the present application), the mobile phone can, according to the input event corresponding to the operation (for example, the input device ID or input method included in the third input event or the fourth input event in the embodiment of the present application), determine whether the input source for inputting the operation is the input device of the mobile phone or the input device of the PC, so as to determine whether to display the virtual keyboard on the mobile phone. If the input source for inputting the operation is an input device of the PC, such as the mouse, the mobile phone may not display a virtual keyboard on the mobile phone, and the user may use the keyboard of the PC to realize the input.
  • if the input source for inputting the operation is the input device of the mobile phone, such as the touch screen, the mobile phone can display a virtual keyboard on the mobile phone, and the user can use the virtual keyboard to realize input.
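The decision of whether to raise the phone's virtual keyboard reduces to a check on the event's input source, following the rule above: touch-screen clicks get the virtual keyboard, while clicks simulated from the PC mouse rely on the PC's physical keyboard. The mode strings here are illustrative stand-ins for the event's source field:

```python
# Virtual-keyboard policy: show it only when the click on the input box came
# from the phone's own touch screen. "input_mode" values are hypothetical.

def show_virtual_keyboard(input_mode):
    """Return True if the phone should display its virtual keyboard."""
    if input_mode == "touchscreen":
        return True                          # focus stays on the phone
    if input_mode == "virtual_mouse":
        return False                         # user will type on the PC keyboard
    raise ValueError(f"unknown input mode: {input_mode}")
```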
  • the specific description of the above S1101-S1109 is similar to the description of the corresponding steps in the above-mentioned embodiment S301-S309, and details are not repeated here.
  • a chat interface 1201 is currently displayed on the mobile phone.
  • the chat interface 1201 includes an input box 1203 .
  • if the user wants to use the keyboard of the PC to input text in the input box 1203, the user can move the mouse 1202 of the PC, so that the mouse of the PC shuttles to the mobile phone. After that, the user can continue to move the mouse of the PC, so that the cursor on the mobile phone moves to the position of the input box 1203.
  • the user can use the mouse 1202 of the PC to perform a click operation.
  • the mobile phone can obtain the input event corresponding to the click operation.
  • the mobile phone can determine that the input source for inputting the click operation is the mouse of the PC. In this way, the mobile phone can determine that the virtual keyboard is not displayed on the mobile phone, and the user can use the keyboard 1204 of the PC to input characters in the input box 1203 .
  • the specific implementation in which the user uses the keyboard 1204 of the PC to input in the input box 1203 may be as follows: after the user operates the keyboard 1204 of the PC, the PC (such as the keyboard and mouse module of the PC) can receive a corresponding input event, such as a keyboard event, which includes the operation parameters of the specific operation performed by the user on the keyboard 1204.
  • the keyboard and mouse module of the PC can intercept the keyboard event (for example, by using a mounted HOOK), so that the keyboard event will not be sent to the Windows system of the PC, so that the PC will not respond to the keyboard event.
  • the keyboard and mouse module of the PC can also capture operation parameters in keyboard events. After that, the PC can send the operating parameters to the mobile phone. After receiving the operation parameter, the mobile phone can simulate (for example, simulate by using the created virtual input device) the corresponding keyboard event according to the operation parameter. In response to the keyboard event, the mobile phone can display the corresponding text in the input box 1203 to realize input.
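The round trip described above — the PC hook captures the keyboard event's operation parameters, "sends" them over the connection, and the phone's virtual input device replays them as a keyboard event that fills the input box — can be sketched end-to-end. The key-code table and transport are simplified stand-ins, not real scancodes or the real protocol:

```python
# End-to-end sketch of PC-keyboard input landing in the phone's input box.

KEYCODE_TO_CHAR = {30: "a", 48: "b", 46: "c"}   # illustrative keycode map

def pc_capture(keycode):
    """PC side: intercept the keyboard event, keep only its parameters."""
    return {"keycode": keycode, "action": "down_up"}

def phone_inject(params, input_box):
    """Phone side: simulate the keyboard event and update the input box."""
    char = KEYCODE_TO_CHAR.get(params["keycode"], "?")
    return input_box + char

box = ""
for key in (30, 48, 46):                     # user types on the PC keyboard
    box = phone_inject(pc_capture(key), box)
```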
  • the mobile phone can display the chat interface 1201 on the display screen of the PC, as shown in (b) of FIG. 12 , and the chat interface displayed by the PC is shown as 1205 .
  • when the PC displays the chat interface 1205, the user can still use the keyboard 1204 of the PC to input text, and the input result can be displayed on the PC synchronously (for example, the PC updates the screen-casting interface, so that the input result is displayed on the PC synchronously).
  • a chat interface 1301 is currently displayed on the mobile phone.
  • the chat interface 1301 includes an input box 1302 .
  • if the user wants to use the virtual keyboard of the mobile phone to input text in the input box 1302, the user can perform a click operation at the position of the input box 1302 on the touch screen of the mobile phone.
  • the mobile phone can obtain the corresponding input event, and according to the input device ID or input method included in the input event, the mobile phone can determine that the input source of the click operation is the touch screen of the mobile phone. Therefore, as shown in (b) of FIG. 13 , in response to the click operation, the mobile phone can display a virtual keyboard 1303 on the mobile phone, and the user can use the virtual keyboard 1303 to input characters in the input box 1302 .
  • the mobile phone may not display the virtual keyboard of the mobile phone, and the user may use the keyboard of the PC to input.
  • a virtual keyboard is displayed on the mobile phone, and the user can use the virtual keyboard to realize input. In this way, there is no need for the user to frequently switch between the two devices, which improves the efficiency of multi-terminal collaborative use.
  • FIG. 14 is a schematic diagram of the composition of a display device according to an embodiment of the present application.
  • the apparatus can be applied to a second terminal (such as the above-mentioned mobile phone), the second terminal is connected to the first terminal, and the apparatus can include: a display unit 1401 , an input unit 1402 and a sending unit 1403 .
  • the display unit 1401 is used to display the first interface.
  • the input unit 1402 is configured to receive a user's first operation on the content of the first interface.
  • the sending unit 1403 is configured to send data to the first terminal in response to the first operation when the input source of the first operation is the input device of the first terminal, and the data is used for the first terminal to display the second interface on the display screen of the first terminal.
  • the display unit 1401 is further configured to display the second interface on the display screen of the second terminal in response to the first operation when the input source of the first operation is the input device of the second terminal.
  • the apparatus may further include: a receiving unit 1404, configured to receive shuttle status information from the first terminal, where the shuttle status information can be used to indicate that the shuttle of the input device starts.
  • the receiving unit 1404 is further configured to receive a first operation parameter from the first terminal, where the first operation parameter is the operation parameter contained in the first input event corresponding to the first operation when the user uses the input device of the first terminal to perform the first operation.
  • the apparatus may further include: a simulation unit 1405 and a determination unit 1406 .
  • the simulation unit 1405 is configured to simulate the first input event according to the first operation parameter.
  • the determining unit 1406 is configured to determine, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal.
  • the sending unit 1403 sends data to the first terminal in response to the first operation, which specifically includes: the sending unit 1403 sends data to the first terminal in response to the first input event.
  • the determining unit 1406 is specifically configured to: determine that the identifier of the input device included in the simulated first input event is the identifier of the virtual input device, the virtual input device being created by the second terminal to simulate input events; or, determine that the type of the input device indicated by the input mode included in the simulated first input event is the same as the type of the input device of the first terminal, and determine that shuttle state information for indicating the start of the shuttle of the input device is received from the first terminal.
  • the determining unit 1406 is further configured to determine, according to the second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal.
  • the display unit 1401 displaying the second interface on the display screen of the second terminal in response to the first operation may include: the display unit 1401 displaying the second interface on the display screen of the second terminal in response to the second input event.
  • the determining unit 1406 is specifically configured to: determine that the input device identification included in the second input event is the identification of the input device of the second terminal; or, determine that the input device type indicated by the input mode included in the second input event is the same as the type of the input device of the second terminal.
  • the apparatus may further include: a creating unit 1407, configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit 1404 is further configured to receive a notification message from the first terminal, where the notification message is used to indicate that the keyboard and mouse sharing mode of the first terminal has been turned on, and the creating unit 1407 is configured to create a virtual input device in response to the notification message; the virtual input device is used for the second terminal to simulate the input of the input device of the first terminal.
  • the display unit 1401 is further configured to display a third interface, where the third interface includes an input box.
  • the input unit 1402 is further configured to receive the user's second operation on the input box.
  • the display unit 1401 is further configured to display a virtual keyboard on the display screen of the second terminal in response to the second operation when the input source of the second operation is the input device of the second terminal.
  • the sending unit 1403 is further configured to send the data of the third interface to the first terminal in response to the second operation when the input source of the second operation is the input device of the first terminal; in this case, the virtual keyboard is not displayed, and the data of the third interface is used for the first terminal to display the third interface on the display screen of the first terminal.
  • the determining unit 1406 is further configured to determine, according to the third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal.
  • the display unit 1401 displaying the virtual keyboard on the display screen of the second terminal in response to the second operation may include: the display unit 1401 displaying the virtual keyboard on the display screen of the second terminal in response to the third input event.
  • the determining unit 1406 is specifically configured to: determine that the input device identification included in the third input event is the identification of the input device of the second terminal; or, determine that the input device type indicated by the input mode included in the third input event is the same as the type of the input device of the second terminal.
  • the receiving unit 1404 is further configured to receive a second operation parameter from the first terminal, where the second operation parameter is the operation parameter contained in the fourth input event corresponding to the second operation when the user uses the input device of the first terminal to perform the second operation.
  • the simulation unit 1405 is further configured to simulate a fourth input event according to the second operation parameter; the determination unit 1406 is further configured to determine that the input source of the second operation is the input device of the first terminal according to the simulated fourth input event.
  • the sending unit 1403 sends the data of the third interface to the first terminal in response to the second operation, which specifically includes: the sending unit 1403 sends the data of the third interface to the first terminal in response to the fourth input event.
  • the determining unit 1406 is specifically configured to: determine that the input device identification included in the simulated fourth input event is the identification of the virtual input device; or, determine that the input device type indicated by the input mode included in the simulated fourth input event is the same as the type of the input device of the first terminal, and determine that shuttle status information for indicating the start of the shuttle of the input device is received from the first terminal.
  • the display device shown in FIG. 14 can also be used for the second terminal to realize the following functions.
  • the second terminal is connected to the first terminal, the keyboard and mouse sharing mode of the first terminal is enabled, and the second terminal creates a virtual input device for simulating the input of the input device of the first terminal.
  • the display unit 1401 is used to display an interface.
  • the input unit 1402 is configured to receive the user's operation on the input box of the interface.
  • the display unit 1401 is further configured to display a virtual keyboard on the display screen of the second terminal in response to the operation when the input source of the operation is an input device of the second terminal.
  • the display unit 1401 is further configured to not display the virtual keyboard on the display screen of the second terminal in response to the operation when the input source of the operation is the input device of the first terminal.
  • the sending unit 1403 is configured to send the data of the above interface to the first terminal in response to the operation when the input source of the operation is the input device of the first terminal, and the data is used by the first terminal to display the interface on the display screen of the first terminal, with the virtual keyboard not displayed on the interface.
  • the determining unit 1406 is configured to determine that the input source of the operation is the input device of the second terminal according to the input event corresponding to the operation.
  • the display unit 1401 displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit 1401 displaying the virtual keyboard on the display screen of the second terminal in response to the input event corresponding to the operation.
  • the determining unit 1406 is specifically configured to: determine that the input device identifier included in the input event is the identifier of the input device of the second terminal; or, determine that the input device type indicated by the input method included in the input event is the same as the type of the input device of the second terminal.
  • the receiving unit 1404 is configured to receive the shuttle status information from the first terminal, where the shuttle status information can be used to indicate the start of the shuttle of the input device, or in other words, the shuttle status information can be used to instruct the second terminal to start accepting input from the input device of the first terminal.
  • the receiving unit 1404 is further configured to receive the operation parameter from the first terminal, where the operation parameter is the operation parameter contained in the input event corresponding to the operation performed when the user uses the input device of the first terminal.
  • the simulation unit 1405 is configured to simulate the corresponding input event according to the operation parameter.
  • the determining unit 1406 is further configured to determine, according to the simulated input event, that the input source of the above operation is the input device of the first terminal.
  • the display unit 1401 not displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit 1401 not displaying the virtual keyboard on the display screen of the second terminal in response to the input event.
  • the sending unit 1403 sending the data of the interface to the first terminal in response to the operation may include: the sending unit 1403 sending the data of the interface to the first terminal in response to the input event.
  • the determining unit 1406 is specifically configured to: determine that the input device identifier included in the simulated input event is the identifier of the virtual input device, the virtual input device being created by the second terminal for simulating input events; or, determine that the input device type indicated by the input mode included in the simulated input event is the same as the type of the input device of the first terminal, and determine that the shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
  • the creating unit 1407 is configured to create a virtual input device after the connection with the first terminal is successfully established; or, the receiving unit 1404 is further configured to receive a notification message from the first terminal, the notification message indicating that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit 1407 is configured to create the virtual input device in response to the notification message; wherein the virtual input device is used by the second terminal to simulate the input of the input device of the first terminal.
  • An embodiment of the present application further provides a display apparatus, which can be applied to an electronic device, such as the first terminal or the second terminal in the foregoing embodiments.
  • the apparatus may include: a processor; a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to cause the display apparatus to implement various functions or steps performed by the mobile phone or PC in the above method embodiments.
  • An embodiment of the present application further provides an electronic device (the electronic device may be a terminal, such as the first terminal or the second terminal in the above-mentioned embodiment), and the electronic device may include: a display screen, a memory, and one or more processor.
  • the display screen, memory and processor are coupled.
  • the memory is used to store computer program code comprising computer instructions.
  • the processor executes the computer instructions, the electronic device can execute various functions or steps executed by the mobile phone or PC in the foregoing method embodiments.
  • the electronic device includes but is not limited to the above-mentioned display screen, memory and one or more processors.
  • the structure of the electronic device may refer to the structure of the mobile phone shown in FIG. 2A .
  • An embodiment of the present application further provides a chip system, as shown in FIG. 15. The chip system includes at least one processor 1501 and at least one interface circuit 1502 .
  • the processor 1501 may be the processor in the above electronic device.
  • the processor 1501 and the interface circuit 1502 may be interconnected by wires.
  • the processor 1501 may receive and execute computer instructions from the memory of the above electronic device through the interface circuit 1502 .
  • When the computer instructions are executed by the processor 1501, the electronic device can be caused to execute the various steps executed by the mobile phone or the PC in the above-mentioned embodiments.
  • the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
  • Embodiments of the present application further provide a computer-readable storage medium for storing computer instructions; when the computer instructions run on an electronic device, such as the above-mentioned terminal (e.g., a mobile phone or a PC), the electronic device performs the corresponding functions or steps in the foregoing method embodiments.
  • Embodiments of the present application further provide a computer program product including computer instructions; when the computer instructions run on an electronic device, such as the above-mentioned terminal (e.g., a mobile phone or a PC), the electronic device performs the corresponding functions or steps in the foregoing method embodiments.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.


Abstract

The present application provides a display method and device, relating to the field of electronic devices. When multiple terminals are used cooperatively, an interface can be displayed on the corresponding device according to which input device the user's input operation uses. This not only protects user privacy but also avoids diverting the user's attention, improving the user experience. The specific solution is as follows: a second terminal displays a first interface and receives a first operation by a user on content of the first interface; when the input source of the first operation is an input device of a first terminal, in response to the first operation, the second terminal sends data to the first terminal, the data being used by the first terminal to display a second interface on a display screen of the first terminal; when the input source of the first operation is an input device of the second terminal, in response to the first operation, the second terminal displays the second interface on a display screen of the second terminal.

Description

Display Method and Device
This application claims priority to Chinese Patent Application No. 202010911452.0, entitled "Display Method and Device", filed with the China National Intellectual Property Administration on September 2, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of electronic devices, and in particular to a display method and device.
Background
At present, a user may own more terminals at the same time, such as a mobile phone, a tablet computer, and a personal computer (PC). In daily work, the user may use multiple terminals simultaneously, and therefore needs to frequently switch attention among them. For example, while the user is using the PC, the mobile phone receives a short message, and the user has to pick up the phone to reply. As another example, when writing an e-mail on the PC, the user needs a picture stored on the mobile phone and has to pick up the phone to transfer the picture from the phone to the PC. This seriously affects the convenience of cooperative use of multiple terminals.
To improve the convenience of cooperative use of multiple terminals, the user may connect multiple terminals and use them together. For example, a user who owns a PC and a mobile phone may connect the PC and the phone in a wireless or wired manner for cooperative work. Specifically, in a scenario of cooperative work between a PC and a mobile phone, multi-screen collaboration uses screen mirroring to project the display interface of the mobile phone onto the display screen of the PC. In this way, the user can operate the mobile phone from the PC. For example, using the PC's mouse, the user can perform mouse operations such as clicking and moving in the interface projected on the PC to operate the actual interface displayed on the phone. The user can also operate the phone directly through its touchscreen.
In the above multi-screen collaboration solution, the display interface of the mobile phone is always projected onto the PC's display screen.
Summary
Embodiments of the present application provide a display method and device. When multiple terminals are used cooperatively, an interface can be displayed on the corresponding device according to which input device the user's input operation uses. This not only protects user privacy but also avoids diverting the user's attention, improving the user experience.
To achieve the above objective, embodiments of the present application adopt the following technical solutions:
A first aspect of the present application provides a display method, which may be applied to a second terminal connected to a first terminal. The method may include: the second terminal displays a first interface; the second terminal receives a first operation by a user on content of the first interface; when the input source of the first operation is an input device of the first terminal, in response to the first operation, the second terminal sends data to the first terminal, the data being used by the first terminal to display a second interface on a display screen of the first terminal; when the input source of the first operation is an input device of the second terminal, in response to the first operation, the second terminal displays the second interface on a display screen of the second terminal.
With the above technical solution, in a scenario where the first terminal and the second terminal are used cooperatively, when the user controls the second terminal with the input device of the first terminal, the second terminal may project the corresponding interface onto the first terminal for display. When the user controls the second terminal with the input device of the second terminal, the second terminal displays the corresponding interface on the second terminal itself and does not project it onto the first terminal. In this way, the user can freely control on which device the interface of the second terminal is displayed according to actual needs. This not only protects user privacy but also avoids diverting the user's attention, improving the user experience.
In a possible implementation, when the input source of the first operation is the input device of the first terminal, before the second terminal receives the first operation, the method may further include: the second terminal receives shuttle status information from the first terminal, where the shuttle status information may be used to indicate that shuttling of the input device starts, in other words, to instruct the second terminal to start accepting input from the input device of the first terminal.
In another possible implementation, when the input source of the first operation is the input device of the first terminal, the second terminal receiving the first operation by the user on the content of the first interface may include: the second terminal receives a first operation parameter from the first terminal, the first operation parameter being an operation parameter contained in a first input event corresponding to the first operation when the user performs the first operation using the input device of the first terminal; the second terminal simulates the first input event according to the first operation parameter. In response to the first operation, the second terminal sending the data to the first terminal may include: the second terminal determines, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal; in response to the first input event, the second terminal sends the data to the first terminal. Based on the input event, the input source of the corresponding operation can be determined, so as to decide whether to display the corresponding interface on the second terminal or to project it onto another device for display.
In another possible implementation, the second terminal determining, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal may include: the second terminal determines that the input device identifier included in the simulated first input event is the identifier of a virtual input device, the virtual input device being created by the second terminal for simulating input events; or, the second terminal determines that the input device type indicated by the input mode included in the simulated first input event is the same as the type of the input device of the first terminal, and determines that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
In another possible implementation, in response to the first operation, the second terminal displaying the second interface on the display screen of the second terminal may include: when the user performs the first operation using the input device of the second terminal, the second terminal determines, according to a second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal; in response to the second input event, the second terminal displays the second interface on the display screen of the second terminal.
In another possible implementation, the second terminal determining, according to the second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal may include: the second terminal determines that the input device identifier included in the second input event is the identifier of the input device of the second terminal; or, the second terminal determines that the input device type indicated by the input mode included in the second input event is the same as the type of the input device of the second terminal.
In another possible implementation, the method may further include: the second terminal creates a virtual input device after a connection with the first terminal is successfully established; or, the second terminal receives a notification message from the first terminal, the notification message indicating that the keyboard and mouse sharing mode of the first terminal has been enabled, and in response to the notification message the second terminal creates the virtual input device; where the virtual input device is used by the second terminal to simulate input of the input device of the first terminal. By creating the virtual input device, keyboard and mouse sharing across multiple devices is achieved, so that one terminal's input device can control multiple terminals by means of keyboard and mouse sharing technology.
In another possible implementation, the method may further include: the second terminal displays a third interface, the third interface including an input box; the second terminal receives a second operation by the user on the input box; when the input source of the second operation is the input device of the second terminal, in response to the second operation, the second terminal displays a virtual keyboard on the display screen of the second terminal. When the user operates the input box in the interface with the input device of the second terminal, the virtual keyboard can be displayed on the second terminal so that the user can type with it without diverting attention, improving the efficiency of cooperative use of multiple terminals.
In another possible implementation, the method may further include: when the input source of the second operation is the input device of the first terminal, in response to the second operation, the second terminal sends data of the third interface to the first terminal, the virtual keyboard not being displayed on the third interface, and the data of the third interface being used by the first terminal to display the third interface on the display screen of the first terminal. When the user operates the input box in the interface with the input device of the first terminal, the virtual keyboard may not be displayed, and the third interface is projected onto the first terminal for display; the user can type with the keyboard of the first terminal without diverting attention, improving the efficiency of cooperative use of multiple terminals.
In another possible implementation, in response to the second operation, the second terminal displaying the virtual keyboard on the display screen of the second terminal may include: when the user performs the second operation using the input device of the second terminal, the second terminal determines, according to a third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal; in response to the third input event, the second terminal displays the virtual keyboard on the display screen of the second terminal.
In another possible implementation, the second terminal determining, according to the third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal may include: the second terminal determines that the input device identifier included in the third input event is the identifier of the input device of the second terminal; or, the second terminal determines that the input device type indicated by the input mode included in the third input event is the same as the type of the input device of the second terminal.
In another possible implementation, when the input source of the second operation is the input device of the first terminal, the second terminal receiving the second operation by the user on the input box may include: the second terminal receives a second operation parameter from the first terminal, the second operation parameter being an operation parameter contained in a fourth input event corresponding to the second operation when the user performs the second operation using the input device of the first terminal; the second terminal simulates the fourth input event according to the second operation parameter. In response to the second operation, the second terminal sending the data of the third interface to the first terminal, without displaying the virtual keyboard on the third interface, may include: the second terminal determines, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal; in response to the fourth input event, the second terminal sends the data of the third interface to the first terminal, the virtual keyboard not being displayed on the third interface.
In another possible implementation, the second terminal determining, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal may include: the second terminal determines that the input device identifier included in the simulated fourth input event is the identifier of the virtual input device; or, the second terminal determines that the input device type indicated by the input mode included in the simulated fourth input event is the same as the type of the input device of the first terminal, and determines that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
A second aspect of the present application provides a display apparatus, which may be applied to a second terminal connected to a first terminal. The apparatus may include: a display unit configured to display a first interface; an input unit configured to receive a first operation by a user on content of the first interface; a sending unit configured to, when the input source of the first operation is an input device of the first terminal, send data to the first terminal in response to the first operation, the data being used by the first terminal to display a second interface on a display screen of the first terminal; the display unit being further configured to, when the input source of the first operation is an input device of the second terminal, display the second interface on a display screen of the second terminal in response to the first operation.
In a possible implementation, when the input source of the first operation is the input device of the first terminal, the apparatus may further include: a receiving unit configured to receive shuttle status information from the first terminal, the shuttle status information being usable to indicate that shuttling of the input device starts.
In another possible implementation, the receiving unit is further configured to receive a first operation parameter from the first terminal, the first operation parameter being an operation parameter contained in a first input event corresponding to the first operation when the user performs the first operation using the input device of the first terminal. The apparatus may further include a simulation unit and a determining unit: the simulation unit is configured to simulate the first input event according to the first operation parameter; the determining unit is configured to determine, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal. The sending unit sending the data to the first terminal in response to the first operation specifically includes: the sending unit sends the data to the first terminal in response to the first input event.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the simulated first input event is the identifier of a virtual input device, the virtual input device being created by the second terminal for simulating input events; or, determine that the input device type indicated by the input mode included in the simulated first input event is the same as the type of the input device of the first terminal, and determine that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
In another possible implementation, when the user performs the first operation using the input device of the second terminal, the determining unit is further configured to determine, according to a second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal. The display unit displaying the second interface on the display screen of the second terminal in response to the first operation may include: the display unit displays the second interface on the display screen of the second terminal in response to the second input event.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the second input event is the identifier of the input device of the second terminal; or, determine that the input device type indicated by the input mode included in the second input event is the same as the type of the input device of the second terminal.
In another possible implementation, the apparatus may further include: a creating unit configured to create a virtual input device after a connection with the first terminal is successfully established; or, the receiving unit is further configured to receive a notification message from the first terminal, the notification message indicating that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit is configured to create the virtual input device in response to the notification message; where the virtual input device is used by the second terminal to simulate input of the input device of the first terminal.
In another possible implementation, the display unit is further configured to display a third interface including an input box; the input unit is further configured to receive a second operation by the user on the input box; the display unit is further configured to, when the input source of the second operation is the input device of the second terminal, display a virtual keyboard on the display screen of the second terminal in response to the second operation.
In another possible implementation, when the input source of the second operation is the input device of the first terminal, the sending unit is further configured to send data of the third interface to the first terminal in response to the second operation, the virtual keyboard not being displayed on the third interface, and the data of the third interface being used by the first terminal to display the third interface on the display screen of the first terminal.
In another possible implementation, when the user performs the second operation using the input device of the second terminal, the determining unit is further configured to determine, according to a third input event corresponding to the second operation, that the input source of the second operation is the input device of the second terminal. The display unit displaying the virtual keyboard on the display screen of the second terminal in response to the second operation may include: the display unit displays the virtual keyboard on the display screen of the second terminal in response to the third input event.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the third input event is the identifier of the input device of the second terminal; or, determine that the input device type indicated by the input mode included in the third input event is the same as the type of the input device of the second terminal.
In another possible implementation, the receiving unit is further configured to receive a second operation parameter from the first terminal, the second operation parameter being an operation parameter contained in a fourth input event corresponding to the second operation when the user performs the second operation using the input device of the first terminal; the simulation unit is further configured to simulate the fourth input event according to the second operation parameter; the determining unit is further configured to determine, according to the simulated fourth input event, that the input source of the second operation is the input device of the first terminal. The sending unit sending the data of the third interface to the first terminal in response to the second operation specifically includes: the sending unit sends the data of the third interface to the first terminal in response to the fourth input event.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the simulated fourth input event is the identifier of the virtual input device; or, determine that the input device type indicated by the input mode included in the simulated fourth input event is the same as the type of the input device of the first terminal, and determine that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
A third aspect of the present application provides a display method, which may be applied to a second terminal connected to a first terminal, where the keyboard and mouse sharing mode of the first terminal is enabled and the second terminal has created a virtual input device for simulating input of the input device of the first terminal. The method may include: the second terminal displays an interface; the second terminal receives an operation by a user on an input box of the interface; when the input source of the operation is the input device of the second terminal, in response to the operation, the second terminal displays a virtual keyboard on the display screen of the second terminal.
With the above technical solution, in a scenario where the first terminal and the second terminal are used cooperatively, when the user operates the input box with the input device of the second terminal, the virtual keyboard is displayed on the second terminal and the user can type with it. In this way, the user does not need to frequently switch attention between the two devices, improving the efficiency of cooperative use of multiple terminals.
In a possible implementation, the method may further include: when the input source of the operation is the input device of the first terminal, in response to the operation, the second terminal does not display the virtual keyboard on the display screen of the second terminal. In a scenario where the first terminal and the second terminal are used cooperatively, when the user operates the input box on the second terminal with the input device of the first terminal, the second terminal may not display the virtual keyboard, and the user can type with the keyboard of the first terminal. The user does not need to frequently switch attention between the two devices, improving the efficiency of cooperative use of multiple terminals.
In another possible implementation, the method may further include: when the input source of the operation is the input device of the first terminal, in response to the operation, the second terminal sends data of the above interface to the first terminal, the data being used by the first terminal to display the interface on the display screen of the first terminal, the virtual keyboard not being displayed on the interface.
In another possible implementation, in response to the operation, the second terminal displaying the virtual keyboard on the display screen of the second terminal may include: when the user performs the operation on the input box using the input device of the second terminal, the second terminal determines, according to the input event corresponding to the operation, that the input source of the operation is the input device of the second terminal, and in response to the input event corresponding to the operation, the second terminal displays the virtual keyboard on the display screen of the second terminal.
In another possible implementation, the second terminal determining, according to the input event corresponding to the operation, that the input source of the operation is the input device of the second terminal includes: the second terminal determines that the input device identifier included in the input event is the identifier of the input device of the second terminal; or, the second terminal determines that the input device type indicated by the input mode included in the input event is the same as the type of the input device of the second terminal.
In another possible implementation, when the input source of the above operation is the input device of the first terminal, before the second terminal receives the operation on the input box, the method may further include: the second terminal receives shuttle status information from the first terminal, where the shuttle status information may be used to indicate that shuttling of the input device starts, in other words, to instruct the second terminal to start accepting input from the input device of the first terminal.
In another possible implementation, when the input source of the above operation is the input device of the first terminal, the second terminal receiving the operation by the user on the input box may include: the second terminal receives an operation parameter from the first terminal, the operation parameter being an operation parameter contained in the input event corresponding to the operation when the user performs the operation using the input device of the first terminal; the second terminal simulates the corresponding input event according to the operation parameter. The method may further include: the second terminal determines, according to the simulated input event, that the input source of the operation is the input device of the first terminal. In response to the operation, the second terminal not displaying the virtual keyboard on the display screen of the second terminal may include: in response to the input event, the second terminal does not display the virtual keyboard on the display screen of the second terminal. In response to the operation, the second terminal sending the data of the interface to the first terminal may include: in response to the input event, the second terminal sends the data of the interface to the first terminal.
In another possible implementation, the second terminal determining, according to the simulated input event, that the input source of the operation is the input device of the first terminal may include: the second terminal determines that the input device identifier included in the simulated input event is the identifier of the virtual input device, the virtual input device being created by the second terminal for simulating input events; or, the second terminal determines that the input device type indicated by the input mode included in the simulated input event is the same as the type of the input device of the first terminal, and determines that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
In another possible implementation, the method may further include: the second terminal creates a virtual input device after a connection with the first terminal is successfully established; or, the second terminal receives a notification message from the first terminal, the notification message indicating that the keyboard and mouse sharing mode of the first terminal has been enabled, and in response to the notification message the second terminal creates the virtual input device; where the virtual input device is used by the second terminal to simulate input of the input device of the first terminal. By creating the virtual input device, keyboard and mouse sharing across multiple devices is achieved, so that one terminal's input device can control multiple terminals by means of keyboard and mouse sharing technology.
A fourth aspect of the present application provides a display apparatus, which may be applied to a second terminal connected to a first terminal, where the keyboard and mouse sharing mode of the first terminal is enabled and the second terminal has created a virtual input device for simulating input of the input device of the first terminal. The apparatus may include: a display unit configured to display an interface; an input unit configured to receive an operation by a user on an input box of the interface; the display unit being further configured to, when the input source of the operation is the input device of the second terminal, display a virtual keyboard on the display screen of the second terminal in response to the operation.
In a possible implementation, the display unit is further configured to, when the input source of the operation is the input device of the first terminal, not display the virtual keyboard on the display screen of the second terminal in response to the operation.
In another possible implementation, the apparatus may further include: a sending unit configured to, when the input source of the operation is the input device of the first terminal, send data of the above interface to the first terminal in response to the operation, the data being used by the first terminal to display the interface on the display screen of the first terminal, the virtual keyboard not being displayed on the interface.
In another possible implementation, the apparatus may further include: a determining unit configured to, when the user performs the operation on the input box using the input device of the second terminal, determine, according to the input event corresponding to the operation, that the input source of the operation is the input device of the second terminal. The display unit displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit displays the virtual keyboard on the display screen of the second terminal in response to the input event corresponding to the operation.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the input event is the identifier of the input device of the second terminal; or, determine that the input device type indicated by the input mode included in the input event is the same as the type of the input device of the second terminal.
In another possible implementation, when the input source of the above operation is the input device of the first terminal, the apparatus may further include: a receiving unit configured to receive shuttle status information from the first terminal, where the shuttle status information may be used to indicate that shuttling of the input device starts, in other words, to instruct the second terminal to start accepting input from the input device of the first terminal.
In another possible implementation, when the input source of the above operation is the input device of the first terminal, the receiving unit is further configured to receive an operation parameter from the first terminal, the operation parameter being an operation parameter contained in the input event corresponding to the operation when the user performs the operation using the input device of the first terminal. The apparatus may further include: a simulation unit configured to simulate the corresponding input event according to the operation parameter; the determining unit being further configured to determine, according to the simulated input event, that the input source of the operation is the input device of the first terminal. The display unit not displaying the virtual keyboard on the display screen of the second terminal in response to the operation may include: the display unit does not display the virtual keyboard on the display screen of the second terminal in response to the input event. The sending unit sending the data of the interface to the first terminal in response to the operation may include: the sending unit sends the data of the interface to the first terminal in response to the input event.
In another possible implementation, the determining unit is specifically configured to: determine that the input device identifier included in the simulated input event is the identifier of the virtual input device, the virtual input device being created by the second terminal for simulating input events; or, determine that the input device type indicated by the input mode included in the simulated input event is the same as the type of the input device of the first terminal, and determine that shuttle status information indicating the start of shuttling of the input device has been received from the first terminal.
In another possible implementation, the apparatus may further include: a creating unit configured to create a virtual input device after a connection with the first terminal is successfully established; or, the receiving unit is further configured to receive a notification message from the first terminal, the notification message indicating that the keyboard and mouse sharing mode of the first terminal has been enabled, and the creating unit is configured to create the virtual input device in response to the notification message; where the virtual input device is used by the second terminal to simulate input of the input device of the first terminal. By creating the virtual input device, keyboard and mouse sharing across multiple devices is achieved, so that one terminal's input device can control multiple terminals by means of keyboard and mouse sharing technology.
A fifth aspect of the present application provides a display apparatus, which may include: a processor; and a memory for storing instructions executable by the processor; where the processor is configured to execute the instructions so that the display apparatus implements the method according to the first aspect or any possible implementation thereof, or the third aspect or any possible implementation thereof.
A sixth aspect of the present application provides a computer-readable storage medium on which computer program instructions are stored; when the computer program instructions are executed by an electronic device, the electronic device implements the method according to the first aspect or any possible implementation thereof, or the third aspect or any possible implementation thereof.
A seventh aspect of the present application provides an electronic device including a display screen, one or more processors, and a memory; the display screen, the processors, and the memory are coupled; the memory is used to store computer program code, the computer program code including computer instructions, and when the computer instructions are executed by the electronic device, the electronic device performs the method according to the first aspect or any possible implementation thereof, or performs the method according to the third aspect or any possible implementation thereof.
An eighth aspect of the present application provides a computer program product including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code; when the computer-readable code runs in an electronic device, a processor in the electronic device performs the method according to the first aspect or any possible implementation thereof, or the method according to the third aspect or any possible implementation thereof.
A ninth aspect of the present application provides a display system including a first terminal and a second terminal connected to each other. The second terminal is configured to display a first interface and receive a first operation by a user on content of the first interface; the second terminal is further configured to, when the input source of the first operation is an input device of the second terminal, display a second interface on a display screen of the second terminal in response to the first operation, and when the input source of the first operation is an input device of the first terminal, send data to the first terminal in response to the first operation; the first terminal is configured to receive the data and display the second interface on a display screen of the first terminal according to the data.
In a possible implementation, the first terminal is further configured to, when the user inputs the first operation using the input device of the first terminal, intercept a first input event corresponding to the first operation and send a first operation parameter included in the first input event to the second terminal.
In another possible implementation, the second terminal being configured to receive the first operation by the user is specifically: the second terminal is configured to receive the first operation parameter from the first terminal and simulate the first input event according to the first operation parameter; the second terminal sending the data to the first terminal in response to the first operation is specifically: the second terminal is configured to determine, according to the simulated first input event, that the input source of the first operation is the input device of the first terminal, and send the data to the first terminal in response to the first input event.
In another possible implementation, the first terminal is further configured to determine that a cursor displayed by the first terminal slides out of an edge of the display screen of the first terminal, and to enable interception of input events.
In another possible implementation, the second terminal displaying the second interface on the display screen of the second terminal in response to the first operation is specifically: the second terminal is configured to, when the user performs the first operation using the input device of the second terminal, determine, according to a second input event corresponding to the first operation, that the input source of the first operation is the input device of the second terminal, and display the second interface on the display screen of the second terminal in response to the second input event.
In another possible implementation, after determining that the cursor displayed by the first terminal slides out of the edge of the display screen of the first terminal, the first terminal is further configured to send shuttle status information to the second terminal, the shuttle status information being used to indicate that shuttling of the input device starts.
In another possible implementation, the first terminal is specifically configured to, when the user inputs the first operation using the input device of the first terminal, intercept the first input event corresponding to the first operation and send the first operation parameter included in the first input event to the second terminal, the first operation parameter being used by the second terminal to simulate the first input event and thereby send the data of the second interface to the first terminal.
It can be understood that, for the beneficial effects achievable by the display apparatus according to the second aspect and any possible implementation thereof, the display apparatus according to the fourth aspect and any possible implementation thereof, the display apparatus according to the fifth aspect, the computer-readable storage medium according to the sixth aspect, the electronic device according to the seventh aspect, the computer program product according to the eighth aspect, and the display system according to the ninth aspect, reference may be made to the beneficial effects of the first aspect or the third aspect and any possible implementation thereof, which are not repeated here.
Brief Description of the Drawings
FIG. 1 is a simplified schematic diagram of a system architecture according to an embodiment of the present application;
FIG. 2A is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
FIG. 2B is a schematic diagram of the composition of a software architecture according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of a display method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a coordinate system on a display screen according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a display interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another display interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 10 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 11 is a schematic flowchart of another display method according to an embodiment of the present application;
FIG. 12 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 13 is a schematic diagram of yet another display interface according to an embodiment of the present application;
FIG. 14 is a schematic diagram of the composition of a display apparatus according to an embodiment of the present application;
FIG. 15 is a schematic diagram of the composition of a chip system according to an embodiment of the present application.
Detailed Description
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise stated, "multiple" means two or more.
In the prior art, when multi-screen collaboration is used for cooperative work between multiple terminals, such as a PC and a mobile phone, the display interface of the mobile phone is always projected onto the display screen of the PC. Even if the user operates the phone directly through its touchscreen, the phone's display interface is still projected onto the PC's display screen. However, when the user operates the phone directly through its touchscreen, the user's focus, or attention, is on the phone; it is meaningless for the PC's display screen to continue displaying the phone's interface content, and doing so may even leak the user's privacy.
Embodiments of the present application provide a display method and device, which can be applied to a scenario in which multiple terminals are used cooperatively. In this scenario, without starting screen projection, by means of keyboard and mouse sharing technology, the input device (such as a mouse, touchpad, or keyboard) of one terminal (referred to as, for example, a first terminal) can be used to control another terminal (referred to as, for example, a second terminal). The input device of the second terminal is also usable. With the method provided in this embodiment, when the user controls the second terminal with the input device of the first terminal, the second terminal may project the corresponding interface onto the first terminal for display. When the user controls the second terminal with the input device of the second terminal, the corresponding interface is displayed on the second terminal and is not projected onto the first terminal. In this way, the user can freely control on which device the interface of the second terminal is displayed according to actual needs. This not only protects user privacy but also avoids diverting the user's attention, improving the user experience.
The implementation of the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Please refer to FIG. 1, which is a simplified schematic diagram of a system architecture to which the above method can be applied according to an embodiment of the present application. As shown in FIG. 1, the system architecture may include at least a first terminal 101 and a second terminal 102.
The first terminal 101 is connected to an input device 101-1 (as shown in FIG. 1), or includes an input device 101-1 (not shown in FIG. 1). As an example, the input device 101-1 may be a mouse, a touchpad, a keyboard, etc. FIG. 1 shows an example in which the input device 101-1 is a mouse.
The second terminal 102 includes an input device 102-1 (as shown in FIG. 1), or is connected to an input device 102-1 (not shown in FIG. 1). As an example, FIG. 1 shows the input device 102-1 as a touchscreen. When the input device 102-1 is a touchscreen, the touchscreen is also the display device, or display screen, of the second terminal 102.
In this embodiment, the first terminal 101 and the second terminal 102 may establish a connection in a wired or wireless manner. Based on the established connection, the first terminal 101 and the second terminal 102 may be used together in cooperation. When the first terminal 101 and the second terminal 102 establish a connection wirelessly, the wireless communication protocol used may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, etc., or may be various cellular network protocols, which is not specifically limited here.
After the first terminal 101 is connected to the second terminal 102, by means of keyboard and mouse sharing technology, the user can use one set of input devices, such as the above input device 101-1, to control both the first terminal 101 and the second terminal 102. That is, the user can not only use the input device 101-1 of the first terminal 101 to control the first terminal 101; the first terminal 101 can also share its input device 101-1 with the second terminal 102 for the user to control the second terminal 102. In addition, the user can also control the second terminal 102 using the input device 102-1 of the second terminal 102.
Exemplarily, in this embodiment, when the user controls the second terminal 102 using the input device 101-1 of the first terminal 101, the second terminal 102 may project the corresponding interface onto the display screen 101-2 of the first terminal 101 for display. When the user controls the second terminal 102 using the input device 102-1 of the second terminal 102, the corresponding interface is displayed on the touchscreen (or display screen) of the second terminal 102 and is not projected onto the display screen 101-2 of the first terminal 101.
For example, one or more applications are installed on the second terminal 102. The second terminal 102 may display the icons of the corresponding applications on its touchscreen. In this embodiment, after the first terminal 101 and the second terminal 102 establish a connection, by means of keyboard and mouse sharing technology, the user can use the input device 101-1 to perform an operation, such as a click operation, on an application icon displayed on the touchscreen of the second terminal 102. In response to the click operation, the second terminal 102 may project the interface of the application onto the display screen 101-2 of the first terminal 101 for display. The user can also operate an application icon displayed on the touchscreen of the second terminal 102 through the input device 102-1, for example, by tapping the icon with a finger. In response to the tap operation, the second terminal 102 displays the interface of the application on its touchscreen, and the interface is not projected onto the display screen 101-2 of the first terminal 101.
As another example, in this embodiment, after the user operates, for example clicks, an input box displayed on the touchscreen of the second terminal 102 using the input device 101-1 of the first terminal 101, no virtual keyboard is displayed on the touchscreen of the second terminal 102, and the user can type text into the input box using the keyboard (such as a physical keyboard) of the first terminal 101. After the user operates, for example taps, an input box displayed on the touchscreen of the second terminal 102 through the input device 102-1 of the second terminal 102, the second terminal 102 may display a virtual keyboard on its touchscreen, and the user can type text into the input box using the virtual keyboard.
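The display decision described above (project the interface when the operation comes from the first terminal's input device, display locally when it comes from the second terminal's own input device) can be summarized as a small dispatch routine. This is an illustrative sketch only; the function and identifier names are assumptions, not the application's API:

```python
def dispatch_display(input_source_id, virtual_input_id, project, show_local):
    """Route the resulting interface according to the operation's input source.

    input_source_id: the device identifier carried by the input event.
    virtual_input_id: identifier of the virtual input device created to
    simulate the first terminal's input; a match means the operation came
    from the first terminal (e.g., the PC's mouse), so the interface is
    projected there instead of being shown locally.
    """
    if input_source_id == virtual_input_id:
        project()      # send interface data to the first terminal
        return "projected"
    show_local()       # render on the second terminal's own display
    return "local"
```

A touch event carrying the local touchscreen's identifier therefore takes the `show_local` branch, while a simulated event carrying the virtual device's identifier takes the `project` branch.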
It should be noted that the terminal in the embodiments of the present application, such as the first terminal 101 or the second terminal 102, may be a mobile phone, a tablet computer, a handheld computer, a PC, a cellular phone, a personal digital assistant (PDA), a wearable device (such as a smart watch), an in-vehicle computer, a game console, an augmented reality (AR) / virtual reality (VR) device, etc.; this embodiment does not specifically limit the form of the terminal. FIG. 1 shows an example in which the first terminal 101 is a PC and the second terminal 102 is a mobile phone. In addition, the technical solution provided in this embodiment can be applied not only to the above terminals (or mobile terminals) but also to other electronic devices, such as smart home devices (e.g., televisions).
In this embodiment, the terminal is a mobile phone as an example. Please refer to FIG. 2A, which is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
As shown in FIG. 2A, the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone. In other embodiments, the mobile phone may include more or fewer components than shown, or combine some components, or split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the mobile phone. The controller may generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
The charging management module 140 is used to receive charging input from a charger. While charging the battery 142, the charging management module 140 may also supply power to the mobile phone through the power management module 141. The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 may also receive input from the battery 142 to supply power to the mobile phone.
The wireless communication function of the mobile phone can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the mobile phone can be used to cover a single or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antennas may be used in combination with tuning switches.
The mobile communication module 150 can provide wireless communication solutions applied to the mobile phone, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify signals modulated by the modem processor and convert them into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a low-frequency baseband signal to be transmitted into a medium-high-frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.) or displays images or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the mobile phone, including wireless local area networks (WLAN) (such as Wi-Fi networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation through the antenna 2.
In some embodiments, the antenna 1 of the mobile phone is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite based augmentation systems (SBAS).
The mobile phone implements the display function through the GPU, the display screen 194, the application processor, etc. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, etc. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flex light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, quantum dot light emitting diodes (QLED), etc. In some embodiments, the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile phone can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, etc. In some embodiments, the mobile phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
The internal memory 121 may be used to store computer-executable program code, the executable program code including instructions. The processor 110 executes various functional applications and data processing of the mobile phone by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and application programs required for at least one function (such as a sound playing function, an image playing function, etc.). The data storage area may store data created during use of the mobile phone (such as audio data, a phone book, etc.). In addition, the internal memory 121 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), etc.
The mobile phone can implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, etc.
The pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. When a touch operation acts on the display screen 194, the mobile phone detects the intensity of the touch operation according to the pressure sensor 180A. The mobile phone may also calculate the touched position according to the detection signal of the pressure sensor 180A.
The gyroscope sensor 180B may be used to determine the motion posture of the mobile phone. The barometric pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the mobile phone can use the magnetic sensor 180D to detect the opening and closing of a flip cover. The acceleration sensor 180E can detect the magnitude of the mobile phone's acceleration in all directions (generally three axes). The distance sensor 180F is used to measure distance. The mobile phone can use the proximity light sensor 180G to detect that the user is holding the phone close to the ear for a call, so as to automatically turn off the screen to save power; the proximity light sensor 180G can also be used in holster mode and pocket mode for automatic unlocking and screen locking. The ambient light sensor 180L is used to sense ambient light brightness. The fingerprint sensor 180H is used to collect fingerprints; the mobile phone can use the collected fingerprint characteristics to implement fingerprint unlocking, accessing application locks, fingerprint photographing, fingerprint answering of incoming calls, etc. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation can be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be provided on the surface of the mobile phone at a position different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. The buttons 190 include a power button, volume buttons, etc. The buttons 190 may be mechanical buttons or touch buttons. The motor 191 can generate vibration prompts and can be used for incoming-call vibration prompts as well as touch vibration feedback. The indicator 192 may be an indicator light, which can be used to indicate the charging status and power changes, and can also be used to indicate messages, missed calls, notifications, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into the SIM card interface 195 or pulled out of it to achieve contact with and separation from the mobile phone. The mobile phone can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The mobile phone interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone uses an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the mobile phone and cannot be separated from it. The methods in the following embodiments can be implemented in a mobile phone having the above hardware structure.
With reference to FIG. 1, the embodiments of the present application take the software system of the first terminal 101 being a Windows system and the software system of the second terminal 102 being an Android system as an example to exemplarily describe the software architectures of the first terminal 101 and the second terminal 102. Please refer to FIG. 2B, which is a schematic diagram of the composition of a software architecture according to an embodiment of the present application.
As shown in FIG. 2B, the software architecture of the first terminal 101 may include an application layer and the Windows system (Windows shell). In some embodiments, the application layer may include the applications installed on the first terminal 101, which can interact directly with the Windows system. Exemplarily, the application layer may further include a keyboard/mouse module and a screen projection service module.
The software system of the second terminal 102 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Taking the software system of the second terminal 102 being a layered architecture as an example, the layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. In some embodiments, as shown in FIG. 2B, the second terminal 102 may include an application layer and a framework layer (FWK). The application layer may include a series of application packages; for example, the application packages may include applications such as Settings, Calculator, Camera, Messages, Music Player, and Gallery. The applications in the application layer may be system applications of the second terminal 102 or third-party applications, which is not specifically limited in the embodiments of the present application. The application layer may further include a screen projection service module and a launcher. The framework layer is mainly responsible for providing application programming interfaces (API) and programming frameworks for the applications in the application layer. Of course, the second terminal 102 may also include other layers, such as a kernel layer (not shown in FIG. 2B). The kernel layer is the layer between hardware and software and may include at least a display driver, a camera driver, an audio driver, a sensor driver, etc.
In a scenario where multiple terminals are used cooperatively, the user can use the same input device to control the multiple terminals to improve efficiency. With reference to FIG. 1, take the case where the multiple terminals include the first terminal 101 and the second terminal 102, the input device 101-1 of the first terminal 101 is a mouse, and the input device 102-1 of the second terminal 102 is a touchscreen as an example. After the first terminal 101 and the second terminal 102 establish a connection, based on the above software architecture and by means of keyboard and mouse sharing technology, the user can use the mouse of the first terminal 101 to control both the first terminal 101 and the second terminal 102, and can also control the second terminal 102 using its touchscreen. In this embodiment, when the user controls the second terminal 102 with the mouse of the first terminal 101, the second terminal 102 may project the corresponding interface onto the first terminal 101 for display. When the user controls the second terminal 102 with its touchscreen, the corresponding interface is displayed on the second terminal 102 and is not projected onto the first terminal 101.
Keyboard and mouse sharing technology may refer to technology for controlling other terminals using the input device (such as a mouse, touchpad, or keyboard) of one terminal.
Hereinafter, with reference to FIG. 1 and FIG. 2B, taking the first terminal 101 being a PC, the second terminal 102 being a mobile phone, the input device 101-1 being a mouse, and the input device 102-1 being a touchscreen as an example, the method provided in this embodiment is described in detail with reference to the accompanying drawings.
FIG. 3 is a schematic flowchart of a display method according to an embodiment of the present application. As shown in FIG. 3, the method may include the following S301-S309.
S301: The mobile phone establishes a connection with the PC.
In some embodiments, the mobile phone and the PC may establish a connection in a wired manner, for example, through a data cable.
In other embodiments, the mobile phone and the PC may establish a connection wirelessly.
There are two requirements for terminals to establish a connection wirelessly: one is that the terminals know each other's connection information, and the other is that each terminal has transmission capability. The connection information may be a device identifier of the terminal, such as an internet protocol (IP) address, a port number, or an account logged in on the terminal. The account logged in on the terminal may be an account provided by an operator for the user, such as a Huawei account, or an application account, such as a WeChat account or a Youku account. The transmission capability of the terminal may be near-field communication capability or long-distance communication capability. That is, the wireless communication protocol used to establish the connection between terminals may be a near-field communication protocol such as a Wi-Fi protocol, a Bluetooth protocol, or an NFC protocol, or may be a cellular network protocol.
Take the mobile phone and the PC establishing a connection wirelessly as an example. For example, the user may touch an NFC tag of the PC with the mobile phone, and the mobile phone reads the connection information stored in the NFC tag; for example, the connection information includes the IP address of the PC. Then the mobile phone can establish a connection with the PC using the NFC protocol according to the PC's IP address. As another example, both the mobile phone and the PC have Bluetooth and Wi-Fi enabled. The PC may broadcast a Bluetooth signal to discover surrounding terminals; for example, the PC may display a list of discovered devices, which may include the identifier of the mobile phone discovered by the PC. During device discovery, the PC may also exchange connection information, such as IP addresses, with the discovered devices. Then, after the PC receives the user's operation of selecting the identifier of the mobile phone in the displayed device list, the PC can establish a connection with the mobile phone using the Wi-Fi protocol according to the phone's IP address. As yet another example, both the mobile phone and the PC are connected to a cellular network, and the mobile phone and the PC are logged in to the same Huawei account; the mobile phone and the PC can then establish a connection based on the cellular network according to the Huawei account.
After the mobile phone and the PC successfully establish a connection, the two can be used cooperatively. To improve the efficiency of cooperative use, the user can use one set of input devices, such as the PC's mouse, to control both the PC and the mobile phone. That is, the user can not only use the PC's input device to control the PC; the PC can also share its input device with the mobile phone for the user to control the phone.
As an exemplary implementation, when the mobile phone and the PC have successfully established a connection and the keyboard and mouse sharing mode of the PC is enabled, one set of input devices can be used to control both the PC and the mobile phone.
For example, in some embodiments, after another terminal successfully establishes a connection with the PC, the PC may display a pop-up window asking the user whether to enable the keyboard and mouse sharing mode. If an operation of the user choosing to enable the keyboard and mouse sharing mode is received, the PC enables the keyboard and mouse sharing mode.
After enabling the keyboard and mouse sharing mode, the PC may notify all terminals that have established connections with it that the keyboard and mouse sharing mode has been enabled. For example, since the PC has established a connection with the mobile phone, the PC notifies the mobile phone that the keyboard and mouse sharing mode has been enabled. After receiving the notification (e.g., called a notification message), the mobile phone may create a virtual input device, which has the same function as a conventional input device such as a mouse, touchpad, or keyboard and can be used by the phone to simulate corresponding input events. For example, taking the input device being a mouse as an example, the virtual input device created by the phone functions the same as a conventional mouse; it can be regarded as a mouse shared by the PC with the phone and can be used to simulate mouse events on the phone side, so that the PC's mouse can control the phone.
Exemplarily, take the operating system of the mobile phone being the Android system as an example. The phone can use the uinput capability of Linux to create the virtual input device. uinput is a kernel-layer module that can simulate input devices. By writing to the /dev/uinput (or /dev/input/uinput) device, a process can create a virtual input device with specific functions. Once the virtual input device is created, it can simulate corresponding events. Similarly, other terminals that have established connections with the PC will also create virtual input devices according to the received notification message.
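As an illustration of the uinput mechanism mentioned above, the sketch below packs Linux `input_event` records in Python. The field layout (a `struct timeval` of two native longs, then 16-bit type and code and a 32-bit value) follows `<linux/input.h>`; writing such records to an opened `/dev/uinput` file descriptor, after the device-setup ioctls (omitted here), is how injected events are emitted. The event-code constants are standard values from `<linux/input-event-codes.h>`; everything else is an illustrative assumption, not the application's implementation:

```python
import struct

# struct input_event on Linux:
#   struct timeval time (two native signed longs), __u16 type, __u16 code, __s32 value
INPUT_EVENT_FORMAT = "llHHi"

# Standard event codes from <linux/input-event-codes.h>
EV_SYN, EV_KEY, EV_REL = 0x00, 0x01, 0x02
REL_X, REL_Y = 0x00, 0x01
BTN_LEFT = 0x110
SYN_REPORT = 0

def pack_event(ev_type: int, code: int, value: int,
               sec: int = 0, usec: int = 0) -> bytes:
    """Serialize one input_event record, ready to write to /dev/uinput."""
    return struct.pack(INPUT_EVENT_FORMAT, sec, usec, ev_type, code, value)

def mouse_move(dx: int, dy: int) -> bytes:
    """A relative mouse move: REL_X, REL_Y, then a SYN_REPORT terminator."""
    return (pack_event(EV_REL, REL_X, dx)
            + pack_event(EV_REL, REL_Y, dy)
            + pack_event(EV_SYN, SYN_REPORT, 0))
```

On a real device these bytes would be written to the uinput file descriptor only after the device has been created with the uinput setup ioctls, and the struct size depends on the platform's native long width.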
It should be noted that if the operating system of the terminal receiving the notification message is the Android system, the terminal can use the uinput capability of Linux to create the virtual input device, or it can use the human interface device (HID) protocol. If the operating system of the terminal receiving the notification message is another operating system such as iOS or Windows, the HID protocol can be used to create the virtual input device. In addition, the above embodiment is described taking the example in which a terminal connected to the PC creates the virtual input device upon receiving the notification message indicating that the PC's keyboard and mouse sharing mode has been enabled. In other embodiments, after receiving the notification message, the terminal connected to the PC may instead display a pop-up window asking the user whether they want to use the PC's input device to control this device. If an operation of the user choosing to use the PC's input device to control this device is received, the virtual input device is created; otherwise, it is not created.
As another example, in other embodiments, after the mobile phone establishes a connection with the PC, the PC automatically enables the keyboard and mouse sharing mode without the user turning it on manually. After another terminal, such as the above mobile phone, establishes a connection with the PC, it may also automatically create the virtual input device without the PC sending a notification message. Alternatively, after another terminal establishes a connection with the PC, a pop-up window may first be displayed to ask the user whether they want to use the PC's input device to control this device. If an operation of the user choosing to use the PC's input device to control this device is received, the virtual input device is created automatically; otherwise, it is not created.
The above embodiments are described taking the example in which the PC enables the keyboard and mouse sharing mode after establishing a connection with the mobile phone. As yet another example, in other embodiments, the PC may enable the keyboard and mouse sharing mode first; the mode may be turned on manually by the user or enabled automatically by the PC. After the PC's keyboard and mouse sharing mode is enabled, the PC then establishes a connection with the mobile phone. The implementation of establishing the connection between the PC and the phone is the same as described in the above embodiments and is not repeated here. After the connection between the phone and the PC is established, the phone may create the virtual input device automatically, or according to a notification message from the PC, or according to the user's choice.
After the PC's keyboard and mouse sharing mode is enabled, the PC has established a connection with the mobile phone, and the phone has created the virtual input device, it can be considered that the PC has shared its input device with the phone, and the user can control the phone with the PC's input device.
With reference to FIG. 1, since the mouse is the PC's input device, after another terminal, such as the mobile phone, establishes a connection with the PC, in general it is temporarily the PC that responds to mouse operations; in other words, the user can temporarily control the PC with the mouse. In this embodiment, after enabling the keyboard and mouse sharing mode, the PC may also, upon determining that a mouse shuttle condition is met, trigger the transfer of the response to mouse operations to another terminal that has established a connection with the PC and created a virtual input device, such as the mobile phone. That is, after the mouse shuttle condition is met, the user can use the mouse to control another terminal, such as the mobile phone, that has established a connection with the PC and created a virtual input device.
Exemplarily, the mouse shuttle condition may be that the cursor displayed on the PC's display screen slides over the edge of the PC's display screen. That is, the user can move the mouse so that the cursor displayed on the PC's display screen slides over the edge of the screen, to trigger the transfer of the response to mouse operations to another terminal that has established a connection with the PC and created a virtual input device.
As an exemplary implementation, after enabling the keyboard and mouse sharing mode, the PC may enable input listening and mount a hook (HOOK). Input listening can be used to listen to the relative displacement and coordinate position of the cursor, and also to listen for keyboard events. After the mouse shuttle starts, the mounted HOOK can be used to intercept corresponding input events (in other words, to shield corresponding input events). For example, taking the input device being a mouse, the input events may be mouse events; after the mouse shuttle starts, the mounted HOOK can intercept mouse events so that, after being received by the PC's keyboard/mouse module, they are not transmitted to the PC's Windows system. After the mouse shuttle starts, the mounted HOOK can also be used to capture the parameters in the intercepted input events, such as mouse events. For example, the PC can use input listening to monitor the relative displacement and coordinate position of the cursor and determine, according to the monitored data, whether the mouse shuttle condition is met. After determining that the mouse shuttle condition is met, the mounted HOOK intercepts mouse events, captures the parameters in the mouse events, and sends the captured parameters to another terminal connected to the PC that has created a virtual input device, so that the terminal can use the created virtual input device to simulate the corresponding input events, such as mouse events, and respond to them. In this way, the response to mouse operations is transferred to the terminal connected to the PC that has created the virtual input device.
Of course, the interception of input events and the capture of their parameters may also be implemented in other ways (for example, by registering RAWINPUT in the PC), or implemented through different means respectively. For example, taking the input device being a mouse, after enabling the keyboard and mouse sharing mode, the PC may mount a HOOK and register RAWINPUT, where, after the mouse shuttle starts, the mounted HOOK is used to intercept mouse events (in other words, to shield mouse events) and the registered RAWINPUT is used to capture the parameters in the intercepted mouse events. This embodiment does not limit the specific implementation of intercepting mouse events and capturing their parameters. For ease of description, the following embodiments are described taking the example of implementing the interception of input events and the capture of their parameters by mounting a HOOK.
It should be noted that the above embodiments are described taking the example in which the user can control the mobile phone with the PC's input device only after the cursor displayed on the PC's display screen slides over the edge of the screen. In other embodiments, the user may also control the phone with the PC's input device once the PC and the phone have established a connection, the PC's keyboard and mouse sharing mode is enabled, and the phone has created the virtual input device; that is, under those conditions, if the user operates the PC's input device, such as the mouse, the PC can intercept the corresponding input event, capture its operation parameters, and send them to the phone, so that the PC's input device can be used to control the phone. The following embodiments are described taking the example in which the user can control the phone with the PC's input device after the cursor displayed on the PC's display screen slides over the edge of the screen.
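The intercept-or-pass-through decision a hook callback makes, as described above, can be sketched as follows. This is a hypothetical, platform-independent sketch: the type and function names are assumptions, the "forwarding" is represented by appending to a list rather than by real network transport, and a real Windows implementation would live inside a low-level mouse hook procedure:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class MouseEvent:
    kind: str          # "move", "down", "up", or "wheel"
    dx: int = 0
    dy: int = 0

@dataclass
class HookState:
    shuttle_active: bool = False            # set once the shuttle condition triggers
    forwarded: List[MouseEvent] = field(default_factory=list)

def hook_callback(state: HookState, event: MouseEvent,
                  pass_through: Callable[[MouseEvent], None]) -> bool:
    """Return True if the event was intercepted (not delivered locally)."""
    if state.shuttle_active:
        # Capture the operation parameters and queue them for the peer device.
        state.forwarded.append(event)
        return True
    pass_through(event)   # deliver to the local windowing system as usual
    return False
```

Before the shuttle starts, events flow through to the local system; once `shuttle_active` is set, the same callback swallows them and queues their parameters for the connected terminal.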
It can be understood that, when the PC shares its input device with the mobile phone, the user can control the phone with the PC's input device and can also control the phone with the phone's own input device. In this embodiment, the corresponding interface can be displayed on different terminals according to which input device controls the phone.
For example, take the user controlling the mobile phone with the PC's input device as an example. The method may include the following S302-S307.
S302: The PC receives a mouse movement event and, according to the mouse movement event, displays an animation of cursor movement on the PC's display screen.
It should be noted that the cursor described in this embodiment may also be called a mouse pointer. The cursor may be an image, which may be dynamic or static, and the style of the cursor may differ in different situations.
S303: The PC monitors the coordinate position of the cursor on the PC's display screen.
S304: According to the coordinate position of the cursor on the PC's display screen, when determining that the cursor slides out of the edge of the screen, the PC intercepts the mouse movement event and sends parameter 1 contained in the mouse movement event to the mobile phone.
In this embodiment, after the keyboard and mouse sharing mode is enabled, when the user wants to use the mouse to control another terminal connected to the PC that has created a virtual input device, for example, to operate on the interface currently displayed by the mobile phone, the user can keep moving the mouse in the same direction so that the cursor displayed on the PC's display screen slides over (in other words, slides out of) the edge of the screen, i.e., triggers the mouse shuttle condition.
Exemplarily, the PC may determine the coordinate position of the cursor on the PC's display screen according to the initial position and relative displacement of the cursor, so as to determine whether the cursor slides out of the edge of the screen.
The initial position of the cursor may be the coordinate position of the cursor on the PC's display screen when the mouse starts to move, or in other words, before the mouse starts to move. The initial position of the cursor may specifically be the coordinate position in a coordinate system with the upper-left corner of the PC's display screen as the coordinate origin, the X axis pointing from the upper-left corner to the right edge of the screen, and the Y axis pointing from the upper-left corner to the lower edge of the screen.
For example, the specific process for the PC to determine whether the cursor slides out of the edge of its display screen may be as follows. With reference to FIG. 4, the PC may establish a coordinate system with the initial coordinate position as the coordinate origin (position o shown in FIG. 4), the X axis pointing from the origin o to the right edge of the PC's display screen, and the Y axis pointing from the origin o to the upper edge of the screen. The PC can determine the coordinate values of the edges of its display screen in this coordinate system; these can be determined according to the resolution of the PC's display screen and the initial position of the cursor. For example, as shown in FIG. 4, in this coordinate system, the coordinate value of the right edge of the screen on the X axis is x1, that of the left edge on the X axis is -x2, that of the upper edge on the Y axis is y1, and that of the lower edge on the Y axis is -y2. After the mouse moves, the mouse reports the relative displacement of the cursor to the PC. According to the relative displacement reported by the mouse, the PC can calculate the coordinate position (x, y) of the cursor on the display screen after the movement, and according to the coordinate position (x, y), the PC can determine whether the cursor slides out of the edge of the screen. For example, if the X-axis coordinate value x of the cursor is greater than x1, it can be determined that the cursor slides out of the right edge of the screen; if x is less than -x2, the cursor slides out of the left edge; if the Y-axis coordinate value y is greater than y1, the cursor slides out of the upper edge; and if y is less than -y2, the cursor slides out of the lower edge.
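The edge test just described can be sketched as a small routine. This is an illustrative reconstruction, not code from the application; the parameter names (screen width/height, initial cursor position) are assumptions, and the coordinate convention follows FIG. 4 (origin at the cursor's initial position, X toward the right edge, Y toward the upper edge):

```python
def cursor_exit_edge(init_x, init_y, width, height, dx, dy):
    """Decide which screen edge, if any, the cursor has crossed.

    (init_x, init_y) is the cursor's initial position measured from the
    screen's upper-left corner; (width, height) is the screen resolution;
    (dx, dy) is the accumulated relative displacement along the FIG. 4
    axes.  In that coordinate system the edges lie at x1 = width - init_x,
    -x2 = -init_x, y1 = init_y, and -y2 = -(height - init_y).
    """
    x1, x2 = width - init_x, init_x
    y1, y2 = init_y, height - init_y
    x, y = dx, dy
    if x > x1:
        return "right"
    if x < -x2:
        return "left"
    if y > y1:
        return "top"
    if y < -y2:
        return "bottom"
    return None   # still inside the screen: no shuttle triggered
```

For a 1920x1080 screen with the cursor starting at (100, 100), moving 1900 units along X exceeds x1 = 1820 and reports the right edge, while a small displacement reports no crossing.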
在确定光标滑出PC显示屏的边缘,即鼠标穿梭条件触发后,用户便可使用PC的输入设备对其他终端进行操作。示例性的,在光标滑出PC显示屏的边缘后,如果用户继续操作PC的输入设备,如鼠标,则PC可将用户使用PC的输入设备输入的操作的数据发送给创建了虚拟输入设备的其他终端,或者说,鼠标穿梭到该终端。如,在光标滑出PC显示屏的边缘后,用户使用PC的输入设备输入操作,PC可将该操作对应的输入事件,如鼠标移动事件,鼠标按下事件,鼠标抬起事件等拦截,并将拦截到的输入事件包含的参数传输给与PC连接的创建了虚拟输入设备的其他终端,以便转由该终端对PC的输入设备,如鼠标的操作进行响应。
示例性的,PC可通过以下方式确定鼠标穿梭到的设备。
在一些实施例中,如果与PC连接的设备仅有一个,如上述手机,且手机创建了虚拟输入设备,则PC可确定鼠标穿梭到该手机,即PC可将对应参数传输给手机,以便转由手机对PC的输入设备,如鼠标的操作进行响应。如果与PC连接的设备有多个,这多个设备中存在部分设备或全部设备建立了虚拟输入设备,则PC可在确定鼠标穿梭条件被触发时,在PC的显示屏上显示列表选项,该列表选项中包括与PC连接的设备中创建了虚拟输入设备的设备的标识(如包括上述手机的标识)。PC可根据用户的选择,确定鼠标穿梭到的设备,即确定对PC的输入设备的操作进行响应的设备。如用户选择了上述手机的标识,则PC确定鼠标穿梭到该手机,即PC可将对应参数发送给手机,以便转由手机对PC的输入设备的操作进行响应。手机接收到对应参数后,可模拟出对应输入事件,如鼠标事件,并做出对应响应,即转由手机对PC的输入设备的操作进行响应。其中,在本实施例中,与PC连接的设备在完成虚拟输入设备的创建后,可向PC发送虚拟输入设备创建成功的指示消息。PC根据接收到的指示消息可获得与PC连接的设备中哪些设备成功创建了虚拟输入设备,并以此为依据显示上述列表选项。
在其他一些实施例中,可以预先配置穿梭关系。如果与PC连接的设备有多个,且这多个设备中存在部分设备或全部设备建立了虚拟输入设备,则可以根据预先配置的穿梭关系确定鼠标穿梭到哪个设备上,即确定转由哪个设备对PC的输入设备的操作进行响应。例如,与PC连接的多个设备中包括上述手机,且手机创建了虚拟输入设备,预先配置的穿梭关系是光标从PC显示屏的左侧(或者说左边缘)滑出,则鼠标穿梭到手机。那么,当用户移动鼠标,使得光标滑过PC显示屏的左边缘时,PC不仅可以确定鼠标穿梭开始,还可确定鼠标穿梭到手机,即PC可将对应参数发送给手机,以便转由手机对PC的输入设备的操作进行响应。当然,如果与PC连接的设备有一个,且该设备创建了虚拟输入设备时,也可以根据预先配置的穿梭关系确定鼠标是否穿梭到该设备。如,预先配置的穿梭关系是光标从PC显示屏的左边缘滑出,则鼠标穿梭到手机。但用户移动鼠标后,使得光标滑过PC显示屏的右边缘,则可确定鼠标不穿梭到手机。在另外一些实施例中,可以通过识别设备位置确定鼠标穿梭到哪个设备上。例如,以输入设备为鼠标为例,用户按下并移动鼠标,使得光标滑过PC显示屏的左边缘,则可以利用蓝牙、超宽带(Ultra-wideband,UWB)、超声波等定位技术识别位于PC周围的设备位置,如PC识别出PC的左边是手机,则可确定鼠标穿梭到手机。
其中,穿梭关系可以是用户通过配置文件提前配置的,也可以为用户提供配置穿梭关系的配置界面,用户可通过该配置界面提前配置穿梭关系。例如,以用户通过界面配置手机的穿梭关系为例。PC接收用户打开配置界面操作,显示配置界面,该配置界面中包括PC的标识(如PC的图标)和手机的标识(如手机的图标),用户可通过拖动这两个标识来配置穿梭关系。作为一种示例,若用户将手机的标识放置到PC的标识的左侧,则PC可确定当光标滑过PC显示屏的左边缘时,鼠标穿梭到手机。若用户将手机的标识放置到PC的标识的右侧,则PC可确定当光标滑过PC显示屏的右边缘时,鼠标穿梭到手机。当存在多个设备时,可通过预先配置的方式配置每个设备的穿梭关系。以下实施例均以确定出的鼠标穿梭到手机为例进行说明。需要说明的是,对于根据预先配置穿梭关系及识别设备位置来确定鼠标穿梭到哪个设备这两种实现来说,上述S301可以在鼠标穿梭触发之前执行,也可以在鼠标穿梭触发之后执行,本实施例在此不做具体限制。
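预先配置的穿梭关系本质上是"滑出边缘 → 目标设备"的映射,可用如下假设性示意表达(边缘名与设备标识均为举例,并非专利规定的配置格式):

```python
# 假设的穿梭关系配置:光标从左边缘滑出则穿梭到手机,其余边缘未配置
SHUTTLE_CONFIG = {"left": "phone"}

def shuttle_target(edge, config=SHUTTLE_CONFIG):
    """根据光标滑出的边缘查询穿梭目标设备;未配置时返回 None,表示不穿梭。"""
    return config.get(edge)
```

如正文所述,若穿梭关系只配置了左边缘,而用户移动鼠标使光标滑过右边缘,则查询结果为空,可确定鼠标不穿梭到手机。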
在本实施例中,上述输入事件包括的参数可以包括:操作参数。其中,以输入事件是鼠标事件为例。鼠标事件包含的操作参数(或者称为鼠标操作参数)可以包括:鼠标按键标志位、坐标信息、滚轮信息、键位信息。鼠标按键标志位用于指示用户对鼠标进行了按下、抬起、移动或滚轮滚动中的何种操作。坐标信息,在用户移动了鼠标时,用于指示光标移动的X坐标和Y坐标。滚轮信息,在用户操作了鼠标的滚轮时,用于指示滚轮滚动的X轴距离和Y轴距离。键位信息用于指示用户对鼠标的左键、中键或右键中的哪个键进行了操作。
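本段所述的鼠标操作参数可用如下数据结构示意。字段名与取值均为假设,仅对应文中的四类信息(按键标志位、坐标信息、滚轮信息、键位信息):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MouseOpParams:
    button_flag: str                    # 鼠标按键标志位:如 "down"/"up"/"move"/"wheel"
    coords: Optional[Tuple[int, int]]   # 坐标信息:移动时光标的 X、Y 坐标,否则为空
    wheel: Optional[Tuple[int, int]]    # 滚轮信息:滚轮滚动的 X、Y 轴距离,否则为空
    key: Optional[str]                  # 键位信息:"left"/"middle"/"right",否则为空

# 文中"参数1"(鼠标移动事件)的一个取值示例:滚轮信息与键位信息均为空
param1 = MouseOpParams(button_flag="move", coords=(15, -3), wheel=None, key=None)
```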
示例性的,以PC的输入设备为鼠标为例。结合图2B,对S302-S304进行举例说明。在用户想使用PC的鼠标对手机进行操作时,用户可移动PC的鼠标。在用户移动PC的鼠标的过程中,PC的键鼠模块可接收到对应的输入事件,如移动事件,该移动事件可以称为鼠标移动事件。由于此时鼠标穿梭条件还未触发,因此HOOK不会拦截该鼠标移动事件,该鼠标移动事件会传输给PC的windows系统。根据接收到的鼠标移动事件,PC的windows系统可绘制光标移动的动画并在PC的显示屏上显示。例如,如图5所示,随着鼠标501的移动,PC在PC的显示屏502上对应显示光标503移动的动画,如图5中所示光标503的移动轨迹如轨迹504所示。
如S301中的描述,在键鼠共享模式开启后,PC会开启输入监听,并挂载HOOK。在光标在PC显示屏上移动的过程中,PC的键鼠模块可利用输入监听,监测光标在PC显示屏上的实时坐标位置。当PC的键鼠模块根据监测到的光标在PC显示屏上的实时坐标位置,确定光标滑过PC显示屏的边缘时,可确定满足上述鼠标穿梭条件,表明用户想要使用PC的鼠标对其他终端进行控制。PC的键鼠模块可确定鼠标穿梭开始。
在PC的键鼠模块确定鼠标穿梭开始后,如果用户对PC的鼠标进行了操作,PC的键鼠模块会利用HOOK,拦截接收到输入事件,如鼠标事件,并利用HOOK捕获拦截到的输入事件中的参数。然后,PC可通过建立的连接传输给手机,用于手机做出对应响应。例如,继续结合图5所示示例,在光标滑过PC显示屏的边缘后,用户继续向同一个方向移动鼠标。PC的键鼠模块可接收移动事件,如鼠标移动事件。由于鼠标穿梭已开始,因此PC的键鼠模块可利用HOOK,将其拦截(或者说屏蔽),以便该鼠标移动事件不会被发送给PC的windows系统,从而使得PC不会对接收到的鼠标移动事件做响应,PC的键鼠模块还可利用HOOK,捕获鼠标移动事件中的参数,如称为参数1。之后,PC可通过建立的连接将该参数1发送给手机。作为一种示例,鼠标移动事件包含的参数1可以包括:操作参数1。其中,操作参数1可以包括:用于指示用户对鼠标进行了移动的鼠标按键标志位,用于指示光标移动的X坐标和Y坐标的坐标信息,滚轮信息(取值为空)和键位信息(取值为空)。
另外,在本实施例中,在确定鼠标穿梭开始后,PC还可通过建立的连接向手机发送用于指示鼠标开始穿梭的穿梭状态信息。或者说该穿梭状态信息用于指示手机开始接受PC的输入设备的输入。手机接收到该信息后,可模拟出一个光标,并在手机的显示屏上显示该光标。例如,手机接收到该信息后,可创建一个光标,并交由手机的launcher显示该光标。在确定鼠标穿梭开始后,PC也可将PC显示屏上显示的光标隐藏,给用户以光标从PC穿梭到手机的视觉效果。
其中,手机显示光标的位置可以是预先定义的,如可以是手机显示屏上的任意一个位置。手机显示光标的位置也可以和光标在PC上的滑出位置对应。如,光标从PC显示屏的右边缘滑出,在手机显示屏的左边缘上显示光标。又如,光标从PC显示屏的右边缘的居中位置滑出,在手机显示屏的左边缘的居中位置显示光标。作为一种示例,在光标滑出PC显示屏后,PC可将光标滑出PC显示屏时在PC显示屏上的坐标位置发送给手机。手机根据该坐标位置和PC的分辨率(如A*B),可确定光标是从PC显示屏的哪个边缘滑出的,如光标滑出PC显示屏时的坐标为(x1,y1),x1等于A,则手机可确定光标从PC显示屏的右边缘滑出。手机还可根据y1和PC的分辨率(如高度B)确定光标滑出位置占PC显示屏高度的比例。根据该比例和手机的分辨率,可确定具体在手机显示屏左边缘的哪个位置显示光标。PC的分辨率可以是与手机建立连接的过程中或连接建立成功后PC发送给手机的。
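以光标从PC右边缘滑出、在手机左边缘按比例显示为例,上述比例换算可示意如下。分辨率与坐标均为假设的像素值,函数名为说明而设:

```python
def entry_position(exit_y, pc_height, phone_height):
    """光标从PC显示屏右边缘的 (A, exit_y) 处滑出时,
    按 exit_y 占PC屏高的比例,换算出手机左边缘上光标的显示位置。"""
    ratio = exit_y / pc_height               # 滑出位置占PC显示屏高度的比例
    return (0, round(ratio * phone_height))  # 手机左边缘 x=0,y 按比例取

# 例:PC 屏高 1080、手机屏高 2340,光标从PC右边缘一半高度处滑出
print(entry_position(540, 1080, 2340))  # → (0, 1170)
```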
S305、手机接收参数1,根据参数1模拟鼠标移动事件,根据鼠标移动事件在手机的显示屏上显示光标移动的动画。
如上描述,用户触发光标从PC穿梭到手机的目的是想要对手机进行控制。因此,在用户通过移动鼠标,将光标从PC的显示屏穿梭到手机的显示屏后,用户可以继续移动鼠标,即用户会使用PC的鼠标输入移动手机上的光标的操作。在用户继续移动PC的鼠标的过程中,手机会接收到对应参数,如鼠标移动事件包括的参数1。在手机接收到参数1后,可根据参数1中包括的操作参数模拟出移动事件,如鼠标移动事件。作为对该鼠标移动事件的响应,手机可绘制光标移动的动画并在手机的显示屏上显示,直到光标移动到手机显示屏上用户想要操作的位置。如,手机当前显示第一界面,该第一界面包括一个或多个内容。第一界面的内容可以是指显示在第一界面的元素。该内容可以是控件等用户可操作的元素。以用户想操作的内容是控件为例,用户可通过继续移动PC的鼠标,使得手机上的光标移动到该控件的位置。
另外,在本实施例中,在用户使用PC的鼠标对手机进行控制之前,手机上的显示界面可以已投射到了PC显示屏上,也可以未投射到PC上,仅在手机上显示。可以理解的是,在鼠标穿梭到手机上后,用户使用PC的鼠标执行的操作可能仅用于移动手机上的光标,不会引起手机显示界面的变化。如,继续结合上述示例,在鼠标穿梭到手机上后,用户继续移动鼠标,手机可模拟出鼠标移动事件。响应模拟出的该鼠标移动事件,手机可显示光标的移动动画,手机上的显示界面不会发生变化。在手机响应于输入事件,显示界面不发生变化时,手机可继续保持当前的显示策略。如,在用户使用PC的鼠标对手机进行控制之前,手机投射界面到了PC上,则手机可继续投射界面到PC上。又如,在用户使用PC的鼠标对手机进行控制之前,手机未投射界面到PC上,则手机可继续仅在手机上显示界面。
以下以在用户使用PC的鼠标对手机进行控制之前,手机未投射界面到PC上为例进行说明。
以用户想要打开手机中的应用,如第一界面为桌面,用户想要操作的内容是桌面上应用的图标为例。一般的,对于安装在手机中的应用,手机可在手机的桌面(或者称为主屏幕(home screen))上显示这些应用的图标。在用户通过移动鼠标,将光标从PC的显示屏穿梭到手机的显示屏后,可继续移动鼠标,直到光标移动到手机显示屏上显示的用户想要打开的应用的图标的位置。
需要说明的是,本实施例中所述的应用可以是嵌入式应用(即手机的系统应用,如计算器,相机,设置,图库等),也可以是可下载应用(如,浏览器,天气,电子邮件等)。嵌入式应用是作为手机实现的一部分提供的应用程序。可下载应用是一个可以提供自己的因特网协议多媒体子系统(internet protocol multimedia subsystem,IMS)连接的应用程序。该可下载应用可以是预先安装在手机中的应用或可以是由用户下载并安装在手机中的第三方应用。
示例性的,以PC的输入设备为鼠标为例。结合图2B,对S305进行举例说明。由于PC与手机的操作系统不同,其输入事件,如鼠标事件中操作参数的键值存在差异。在手机接收到对应参数,如上述参数1后,手机可根据预设映射关系,将接收到的参数1中的操作参数1的键位码转换成手机能够识别的键位码。之后,手机利用创建的虚拟输入设备根据转换键位码后的操作参数1可模拟出手机能够识别的输入事件,如对应鼠标事件,即可以模拟出移动事件,如鼠标移动事件。手机根据模拟出的鼠标移动事件,可绘制光标移动的动画,并交由手机的launcher在手机的显示屏上显示光标移动的动画。例如,如图6所示,以用户想要打开的应用是手机中的计算器为例。随着鼠标501的移动,手机在手机的显示屏601上对应显示光标602移动的动画,如光标602的移动轨迹如轨迹603所示。即随着鼠标501的移动,光标602可沿着轨迹603移动到计算器的图标604的位置。另外,在光标602移动的过程中,手机的显示界面不会发生变化。则如上描述,手机可确定继续保持当前的显示策略,即在手机上继续显示当前界面和光标移动的动画即可,不将手机的界面内容投射到PC上显示。
在光标移动到想要操作的位置,即光标移动到第一界面中想要操作的内容的位置,如用户想要打开的应用的图标的位置后,用户可通过PC的输入设备输入对应操作,该操作可以称为第一操作。如该第一操作可以为点击操作,以便手机可根据该第一操作,打开该应用。可以理解的是,在用户使用PC的输入设备执行该第一操作后,PC(如PC的键鼠模块)可接收到对应的输入事件,如称为第一输入事件。其中,第一操作可以包括一个操作,也可以包括多个操作。第一操作包括一个操作,第一输入事件包括一个输入事件,第一操作包括多个操作,第一输入事件包括对应数量的输入事件。例如,以PC的输入设备为鼠标,第一操作为点击操作为例,用户使用鼠标输入点击操作时,该点击操作可以包括两个操作,分别为按下操作和抬起操作,对应的第一输入事件包括按下事件和抬起事件。作为一种示例,按下操作可以是鼠标按下操作,抬起操作可以是鼠标抬起操作,按下事件可以是鼠标按下事件,抬起事件可以是鼠标抬起事件。基于此,用户使用PC的鼠标打开手机上应用的过程可以包括以下S306-S307。
S306、在手机的光标显示在应用的图标上时,PC接收鼠标按下事件和鼠标抬起事件,拦截该鼠标按下事件和鼠标抬起事件,并将鼠标按下事件包含的参数2和鼠标抬起事件包括的参数3发送给手机。
其中,本申请实施例中的第一操作参数可以包括上述参数2和参数3。
S307、手机接收参数2和参数3,根据参数2和参数3模拟鼠标按下事件和鼠标抬起事件,根据鼠标按下事件和鼠标抬起事件,将应用的界面显示到PC显示屏上。
其中,该应用的界面可以为本申请实施例中的第二界面。
结合图6及S305的描述,以用户想要打开某应用(如计算器),且以打开该应用的操作是对手机桌面上显示的该应用的图标的点击操作,即第一操作为点击操作为例。在用户通过移动PC的鼠标,将手机上的光标移动到计算器的图标的位置时,用户可对鼠标(如鼠标左键)进行按下操作,后又抬起手指。PC的键鼠模块可接收按下事件(如鼠标按下事件)和抬起事件(如鼠标抬起事件)。由于鼠标穿梭已开始,因此PC的键鼠模块可利用HOOK,将鼠标按下事件和鼠标抬起事件拦截(或者说屏蔽),以便该鼠标按下事件和鼠标抬起事件不会被发送给PC的windows系统,从而使得PC不会对接收到的鼠标按下事件和鼠标抬起事件做响应。PC的键鼠模块还可利用HOOK,捕获鼠标按下事件中的参数,如称为参数2,捕获鼠标抬起事件中的参数,如称为参数3。PC还可将捕获到的鼠标按下事件的参数2和鼠标抬起事件的参数3通过建立的连接发送给手机。
作为一种示例,参数2中可以包括操作参数2。参数3中可以包括操作参数3。其中,操作参数2可以包括:用于指示用户对鼠标进行了按下的鼠标按键标志位,坐标信息(取值为空),滚轮信息(取值为空)和用于指示用户对鼠标左键进行了操作的键位信息。操作参数3可以包括:用于指示用户对鼠标进行了抬起的鼠标按键标志位,坐标信息(取值为空),滚轮信息(取值为空)和用于指示用户对鼠标左键进行了操作的键位信息。
手机可接收到参数2和参数3。之后,手机根据预设映射关系,可将接收到的参数2中的操作参数2的键位码转换成手机能够识别的键位码。将接收到的参数3中的操作参数3的键位码转换成手机能够识别的键位码后,手机利用创建的虚拟输入设备根据转换键位码后的操作参数2模拟出按下事件,如鼠标按下事件,利用创建的虚拟输入设备根据转换键位码后的操作参数3模拟出抬起事件,如鼠标抬起事件。手机根据鼠标按下事件,鼠标抬起事件及光标当前显示的位置,可以确定用户对桌面上的计算器的图标进行了点击操作。例如,在键鼠穿梭开始(如手机接收到来自PC的用于指示鼠标开始穿梭的穿梭状态信息)后,手机可注册光标坐标位置的侦听器。通过该侦听器,手机可实时监测光标在手机显示屏上的坐标位置。即手机可利用侦听器确定光标当前的坐标位置。
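键位码转换与事件模拟的流程可示意如下。映射表内容、键值与虚拟输入设备ID均为假设(真实键位码因操作系统而异),事件用字典示意:

```python
# 假设的"预设映射关系":把PC侧键位码转换成手机能识别的键位码
KEYCODE_MAP = {"VK_LBUTTON": "BTN_LEFT", "VK_RBUTTON": "BTN_RIGHT"}

VIRTUAL_DEVICE_ID = 7  # 假设:手机创建虚拟输入设备时生成并保存的ID

def simulate_event(op_param):
    """按转换键位码后的操作参数构造手机侧输入事件(字典仅作示意)。
    事件中携带虚拟输入设备的ID,供后续判断输入源时使用。"""
    key = KEYCODE_MAP.get(op_param["key"], op_param["key"])
    return {"action": op_param["flag"], "key": key, "device_id": VIRTUAL_DEVICE_ID}

# 依次模拟"参数2"(按下)与"参数3"(抬起)对应的鼠标按下/抬起事件
down = simulate_event({"flag": "down", "key": "VK_LBUTTON"})
up = simulate_event({"flag": "up", "key": "VK_LBUTTON"})
```

两个模拟事件再结合侦听器获取的光标当前坐标位置,即可判定用户对该位置的控件进行了点击操作。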
在手机确定出用户对计算器的图标进行了点击操作后,手机可确定手机的显示界面会发生变化。之后,手机可先确定用户输入该操作的意图是想在手机上显示对应界面,如该应用的界面,还是想在PC上显示该应用的界面。如果手机确定出用户的操作意图是在PC上显示该应用的界面,则可将该应用的界面显示到PC显示屏上。如果手机确定出用户的操作意图是在手机上显示该应用的界面,则在手机上显示该应用的界面,该应用的界面不会投射到PC的显示屏上。
示例性的,手机可根据输入该点击操作的输入设备来确定用户的操作意图。如果输入该点击操作的输入设备(或者说输入源)是PC的鼠标,则可确定用户的操作意图是在PC上显示对应界面。如果输入该点击操作的输入设备是手机的触摸屏,则可确定用户的操作意图是在手机上显示对应界面。例如,手机可根据S307中的鼠标按下事件和鼠标抬起事件,确定输入该点击操作的输入源是PC的鼠标还是手机的触摸屏。
手机可采用以下方式确定输入该点击操作的输入源。
方式1、手机可根据输入事件中包含的输入设备标识(identifier,ID)来确定输入对应操作的输入源。
一般的,输入事件中除了包括用于触发终端执行对应操作的操作参数外,还可以包括输入设备ID,该输入设备ID用于标识输入对应操作的输入源。手机利用创建的虚拟输入设备模拟出的输入事件也不例外,也会包括输入设备ID。因此,手机可根据输入事件中包含的输入设备ID,来确定输入对应操作的输入源。
例如,S307中的鼠标按下事件和鼠标抬起事件中可包括输入设备ID。由于该鼠标按下事件和鼠标抬起事件是手机利用创建的虚拟输入设备模拟的,因此该输入设备ID为该虚拟输入设备的ID。该虚拟输入设备的ID可以是手机创建该虚拟输入设备时,生成并保存在手机中的。基于此,在手机模拟出鼠标按下事件和鼠标抬起事件后,手机可获取该鼠标按下事件和鼠标抬起事件中的输入设备ID。手机可确定该输入设备ID为虚拟输入设备的ID。由于该虚拟输入设备是在用户使用PC的输入设备输入操作后,用于手机模拟对应输入事件,因此,当手机确定输入事件包含的输入设备ID是虚拟输入设备的ID时,手机可确定输入对应操作,即上述点击操作的输入设备是PC的鼠标。
方式2、手机可根据输入事件中包含的输入方式来确定输入对应操作的输入源。
输入事件中还可包括输入方式,该输入方式用于指示输入对应操作的设备的类型,如鼠标,触摸屏,触摸板等。手机利用创建的虚拟输入设备模拟出的输入事件也不例外,也会包括输入方式。因此,手机可根据输入事件中包括的输入方式,来确定输入对应操作的输入源。
例如,S307中的鼠标按下事件和鼠标抬起事件中可包括输入方式。该鼠标按下事件和鼠标抬起事件是手机模拟的鼠标事件,因此其中的输入方式用于指示输入对应操作的设备为鼠标。基于此,在手机模拟出鼠标按下事件和鼠标抬起事件后,手机可获取该鼠标按下事件和鼠标抬起事件中的输入方式。根据该输入方式,手机可确定输入对应操作,即上述点击操作的输入源为鼠标。
在手机确定输入对应操作的输入源是鼠标时,可以理解的,该鼠标可能与手机直接连接,也可能是其他设备,如PC共享给手机的鼠标。因此,进一步的,手机还可以确定当前是否处于鼠标穿梭状态,即确定该鼠标是否是PC共享给手机的鼠标。在手机确定当前处于鼠标穿梭状态时,表明用户是使用PC的鼠标输入的点击操作,则手机可确定输入对应操作,即上述点击操作的输入源为PC的鼠标。
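综合方式1(输入设备ID)、方式2(输入方式)及鼠标穿梭状态的判断逻辑,可示意如下。设备ID与字段名均为假设,仅用于说明判断顺序:

```python
VIRTUAL_DEVICE_ID = 7  # 假设:手机创建虚拟输入设备时生成的ID(方式1的判断依据)

def input_source(event, shuttle_active):
    """判断触发 event 的输入源:返回 "pc" 或 "phone"。"""
    # 方式1:事件携带虚拟输入设备的ID,说明事件是模拟出来的,输入源为PC的输入设备
    if event.get("device_id") == VIRTUAL_DEVICE_ID:
        return "pc"
    # 方式2:输入方式指示设备类型为鼠标,且当前处于鼠标穿梭状态,判定为PC共享的鼠标
    if event.get("source_type") == "mouse" and shuttle_active:
        return "pc"
    # 其余情况(如输入方式为手机触摸屏)判定为手机自身的输入设备
    return "phone"
```

据此,手机可进一步决定将对应界面显示在PC上("pc")还是手机上("phone")。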
通过上述方式1或方式2,手机可确定输入上述点击操作的输入源是PC的鼠标,表明用户的操作意图是在PC上显示对应界面。则作为对上述鼠标按下事件和鼠标抬起事件的响应,手机可将对应界面,即用户点击的图标的应用的界面显示到PC的显示屏上。例如,结合图6的示例,如图7所示,当手机上的光标移动到计算器的图标701上后,用户使用PC的鼠标702对该计算器的图标701进行了点击操作,如按下PC的鼠标702的左键并抬起手指。此时,作为响应,手机可将计算器的界面显示到PC显示屏上。如图7所示,PC显示计算器的界面703。对于手机而言,在用户使用PC的鼠标对计算器的图标701进行点击操作后,手机显示的界面可不做改变,如图7所示,依然显示桌面704。手机也可以显示计算器的界面703,图中未示出。
结合图2B,作为一种示例,手机将计算器的界面显示到PC显示屏的具体实现可以是:在确定用户是使用鼠标输入的点击操作后,手机可启动投屏服务。之后,手机,如手机的投屏服务模块可获取该计算器的界面对应的数据,并发送给PC。PC接收到该数据后,可根据该数据在PC显示屏上显示该计算器的界面。如,手机的投屏服务模块可通过手机的显示管理器(如该显示管理器是手机框架层的模块)获取该计算器的界面对应数据,如录屏数据,并发送给PC,即可实现计算器的界面到PC显示屏上的显示。
在一些实施例中,可采用分布式多媒体协议(Distributed Multi-media Protocol,DMP)来实现手机中界面到PC显示屏上的显示。例如,在手机确定用户使用PC的输入设备输入操作后,手机的投屏服务模块可使用手机的显示管理器(DisplayManager)创建虚拟显示(VirtualDisplay)。如手机的投屏服务模块向手机的显示管理器发送创建VirtualDisplay的请求,手机的显示管理器完成VirtualDisplay的创建后,可将创建的VirtualDisplay返回给手机的投屏服务模块。之后,手机的投屏服务模块可将响应于该操作所需绘制的界面,如计算器的界面,移到该VirtualDisplay中进行绘制。这样,手机的投屏服务模块可获得对应的录屏数据。在手机的投屏服务模块获得录屏数据后,可将录屏数据进行编码后发送给PC。PC的投屏服务模块可接收到对应数据,对该数据进行解码后便可获得录屏数据。之后,PC的投屏服务模块与PC的框架层配合可根据录屏数据,绘制对应界面,如计算器的界面并显示在PC的显示屏上。如PC的框架层可提供一个surfaceview来实现手机中界面在PC端的显示。
在其他一些实施例中,也可以采用无线投影(Miracast)实现手机中界面在PC显示屏上的显示,即手机可获取响应于该操作所需绘制的界面的所有图层,然后将获得的所有图层整合成视频流(或者说称为录屏数据)并编码后通过实时流传输协议(real time streaming protocol,RTSP)发送给PC。PC在接收到视频流后可对其进行解码并播放,以实现手机中界面在PC显示屏上的显示。或者,手机可以将响应于该操作所需绘制的界面,如计算器的界面进行指令抽取后获得指令流,并获取该界面的层信息等,之后通过将指令流及层信息等发送给PC,用于PC恢复出响应于该操作所需绘制的界面,以实现手机中界面在PC上的显示。
如,以用户使用手机的输入设备控制手机为例。该方法可以包括以下S308-S309。
S308、手机接收用户在手机的触摸屏上对手机显示的应用的图标的点击操作。
S309、手机根据该点击操作,在手机上显示应用的界面。
在用户想要使用手机的输入设备,如触摸屏对手机进行控制时,用户可使用手指在触摸屏上相应位置进行触控操作。如以用户想要打开某应用(如计算器),如第一界面为桌面,用户想要操作的内容是桌面上应用的图标,且以打开该应用的操作是对手机桌面上显示的该应用的图标的点击操作,即以第一操作是点击操作为例。用户可使用手指对手机桌面上显示的计算器的图标进行点击操作。之后,手机可接收到对应输入事件(该输入事件可以为本申请实施例中的第二输入事件)。手机根据该输入事件和用户的操作位置,可确定用户对桌面上计算器的图标进行了点击操作。
如S306-S307中的描述,在手机确定出用户对计算器的图标进行了点击操作后,手机可确定手机的显示界面会发生变化。之后,手机可先确定用户输入该操作的意图是想在手机上显示对应界面,如该应用的界面,还是想在PC上显示该应用的界面。手机可根据输入该点击操作的输入设备(或者说输入源)来确定用户的操作意图。输入该点击操作的输入源的确定可以通过上述S307中的方式1或方式2来实现。
例如,当用户在手机触摸屏上计算器的图标的位置进行了点击操作后,手机可接收到对应输入事件,该输入事件中包括输入设备ID,该输入设备ID用于标识输入该点击操作的输入源是手机的触摸屏,因此根据该输入事件中的输入设备ID,手机可确定输入该点击操作的输入源是手机的触摸屏。又例如,当用户在手机触摸屏上计算器的图标的位置进行了点击操作后,手机可接收到对应输入事件,该输入事件中包括输入方式,该输入方式用于指示输入该点击操作的输入源是手机的触摸屏,因此根据该输入事件中的输入方式,手机可确定输入该点击操作的输入源是手机的触摸屏。
通过上述方式1或方式2,手机可确定输入上述点击操作的输入源是手机的触摸屏,表明用户是想要将计算器的界面显示在手机上,则作为对该输入事件的响应,手机可在手机上显示计算器的界面,不将该计算器的界面显示到PC显示屏上。例如,如图8中的(a)所示,在手机与PC建立了连接,手机创建了虚拟输入设备的情况下,用户使用手指对手机桌面上显示的计算器的图标801进行了触控操作,如点击操作。如图8中的(b)所示,作为响应,手机可在手机上显示计算器的界面802,而不将该计算器的界面802投射显示到PC的显示屏803上。
以上示例是以用户使用PC的输入设备或手机的输入设备对手机显示的应用的图标进行操作为例进行说明的。在其他一些实施例中,用户使用PC的输入设备或手机的输入设备也可以对手机显示的其他内容(如本申请实施例中的第一界面中的内容)进行操作。在用户对手机显示的内容进行操作后,如果会引起手机显示界面的变化,则手机可根据该操作对应输入事件包括的输入设备ID或输入方式,确定输入该操作的输入源是手机的触摸屏,还是PC的鼠标,从而确定用户的操作意图是在PC上显示对应界面(如可以为本申请实施例中的第二界面)还是在手机上显示对应界面(如可以为本申请实施例中的第二界面)。如果输入该操作的输入源是PC的鼠标,则手机可将对应界面显示到PC的显示屏上。如果输入该操作的输入源是手机的触摸屏,则手机可在手机上显示对应界面,而不投射到PC的显示屏上。
例如,如图9中的(a)所示,手机当前显示图库的首页901。该图库的首页901包括多个图片的缩略图,其中包括图片1的缩略图903。用户在想要在PC上显示该图片1时,可移动PC的鼠标902,以使得PC的鼠标穿梭到手机。之后,用户可继续移动PC的鼠标,使得手机上的光标移动到该图片1的缩略图903的位置。用户可使用PC的鼠标902进行点击操作。手机可获取到该点击操作对应的输入事件。根据该输入事件包括的输入设备ID或输入方式,手机可确定输入该点击操作的输入源是PC鼠标。这样,作为对该点击操作的响应,手机可将包括该图片1的详情界面显示到PC的显示屏上,如图9中的(b)所示,PC显示的图片1的详情界面如904所示。又例如,如图10中的(a)所示,手机当前显示图库的首页1001。该图库的首页1001包括多个图片的缩略图,其中包括图片1的缩略图1002。用户在想要在手机上显示该图片1时,可在手机触摸屏上该图片1的缩略图1002的位置处进行点击操作。之后,手机可获取到对应输入事件,根据该输入事件包括的输入设备ID或输入方式,手机可确定输入该点击操作的输入源是手机的触摸屏。因此,如图10中的(b)所示,作为对该点击操作的响应,手机可在手机上显示图片1的详情界面1003,而不将该详情界面1003投射显示到PC的显示屏1004上。具体实现与用户使用PC的输入设备或手机的输入设备对手机显示的应用的图标进行操作的实现类似,此处不再详细赘述。
另外,需要说明的是,以上示例是以用户在手机触摸屏上进行触控操作之前,手机的界面未在PC上投屏显示为例进行说明的。在其他一些实施例中,如果用户在手机触摸屏上进行触控操作之前,手机已将手机中的界面投射到了PC的显示屏上,则在手机接收到用户在手机的触控屏上的触控操作之后,手机不仅可执行在手机上显示对应界面的操作,还可执行停止手机中界面在PC显示屏上的投射显示的操作。
在一些实施例中,在上述S302-S307之后,如果用户想使用PC的输入设备,如鼠标实现对PC的控制,则用户可通过移动鼠标,使得手机上显示的光标滑出手机显示屏的边缘。在手机上的光标滑出手机显示屏的边缘后,键鼠穿梭结束。在键鼠穿梭结束后,用户使用PC的鼠标便可实现对PC的控制。
例如,在手机确定手机上的光标滑出手机显示屏的边缘时,表明用户想要使用鼠标对其他设备进行控制。如S304中的描述,如果手机仅与PC一个设备建立了连接,则表明用户想使用鼠标对PC进行控制。如果手机与多个设备建立了连接,则手机可显示列表选项,该列表选项包括与手机连接的所有设备的标识,供用户选择想用鼠标控制的设备。如用户选择了PC的标识,则表明用户想使用鼠标对PC进行控制。或者也可以在手机中预先配置穿梭关系,用于确定鼠标穿梭到哪个设备,即确定转由哪个设备对鼠标的操作进行响应,对于穿梭关系的配置和应用的具体描述与上述实施例中对应内容的描述类似,此处不再详细赘述。在手机确定用户想使用鼠标对PC进行控制后,手机可向PC发送用于指示键鼠穿梭结束的穿梭状态信息。PC接收到该穿梭状态信息后,可确定鼠标穿梭结束。之后,PC可卸载HOOK(或者说关闭HOOK),也即取消对输入事件,如鼠标事件的拦截。此后当用户对PC的输入设备进行操作时,PC的键鼠模块将不会拦截接收到的输入事件,而是会将接收到的输入事件发送给PC的windows系统,以便PC的windows系统对该输入事件进行响应,也即使得用户可使用PC的鼠标实现对PC的控制。PC的键鼠模块还可重新在PC显示屏上显示光标。
其中,作为一种示例,手机确定手机上的光标滑出手机显示屏边缘的具体实现可以是:在手机上显示光标后,手机可监测光标在手机显示屏上的实时坐标位置(如光标的实时坐标位置可利用注册的侦听器来获取)。手机可根据光标的初始位置和相对位移确定光标在手机显示屏上的坐标位置,从而确定光标是否滑出手机显示屏的边缘。其中,光标的初始位置可以是鼠标开始移动时,光标在手机显示屏上的坐标位置,或者说是鼠标开始移动之前光标在手机显示屏上的坐标位置。该光标的初始位置具体可以是指在以手机显示屏的左上角为坐标原点,X轴从左上角指向手机显示屏右边缘,Y轴从左上角指向手机显示屏下边缘的坐标系中的坐标位置。手机确定光标滑出手机显示屏边缘的具体实现与PC确定光标滑出PC显示屏边缘的具体实现类似,此处不再详细赘述。
采用上述方法,在手机和PC协同使用的场景中,当用户使用PC的输入设备对手机进行控制时,手机可将对应界面投射到PC上显示;当用户使用手机的输入设备对手机进行控制时,则在手机上显示对应界面,不将对应界面投射到PC上显示。这样,用户可根据自己的实际需要自由控制手机的界面在不同设备上的显示,不仅保护了用户隐私,也避免了用户注意力的频繁转移,提高了用户的使用体验。
以上实施例是以用户使用PC的输入设备或手机的输入设备对手机显示内容进行操作后,手机根据对应输入事件包括的输入设备ID或输入方式的不同,选择在不同的设备上进行对应界面显示为例进行说明的。可以理解的,在键鼠共享模式下,PC不仅可以把PC的鼠标共享给其他终端,如手机,还可将PC的键盘也共享给手机。而在相关技术中,在键鼠共享模式下,无论用户是使用手机的输入设备对手机显示的输入框进行了操作,还是使用PC的鼠标对手机显示的输入框进行了操作,手机均不会在手机上显示虚拟键盘,默认是使用PC的键盘,如称为物理键盘实现输入的。而当用户利用手机的输入设备,如触摸屏对输入框进行操作时,关注点在手机上,如果此时仍使用PC的物理键盘实现输入,则需要注意力频繁在两个设备间切换,降低了多终端协同使用的效率。
为解决该问题,请参考图11,本实施例还提供一种显示方法。图11为本申请实施例提供的另一种显示方法的流程示意图。如图11所示,该方法可以包括以下S1101-S1109。
S1101、手机与PC建立连接。
S1102、PC接收鼠标移动事件,根据鼠标移动事件在PC的显示屏上显示光标移动的动画。
S1103、PC监测光标在PC显示屏上的坐标位置。
S1104、PC根据光标在PC显示屏上的坐标位置,在确定光标滑出PC显示屏边缘时,拦截鼠标移动事件,并将鼠标移动事件包含的参数1发送给手机。
S1105、手机接收参数1,根据参数1模拟鼠标移动事件,根据鼠标移动事件在手机的显示屏上显示光标移动的动画。
S1106、在手机的光标显示在界面的输入框上时,PC接收鼠标按下事件和鼠标抬起事件,拦截该鼠标按下事件和鼠标抬起事件,并将鼠标按下事件包含的参数2和鼠标抬起事件包括的参数3发送给手机。
其中,该界面可以为本申请实施例中的第三界面。
S1107、手机接收参数2和参数3,根据参数2和参数3模拟鼠标按下事件和鼠标抬起事件,根据鼠标按下事件和鼠标抬起事件,确定不在手机上显示虚拟键盘。
S1108、手机接收用户在手机的触摸屏上对手机显示的界面中的输入框的点击操作。该界面可以为本申请实施例中的第三界面。
S1109、手机根据该点击操作,在手机上显示虚拟键盘。
结合上述步骤可以理解的是,在本实施例中,手机在接收到用户对手机上显示的输入框的操作(如本申请实施例中的第二操作)后,可以根据该操作对应输入事件(如本申请实施例中的第三输入事件或第四输入事件)包括的输入设备ID或输入方式,确定输入该操作的输入源是手机的输入设备还是PC的输入设备,从而确定是否在手机上显示虚拟键盘。如果输入该操作的输入源是PC的输入设备,如鼠标,则手机可不在手机上显示虚拟键盘,用户可使用PC的键盘实现输入。如果输入该操作的输入源是手机的输入设备,如触摸屏,则手机可在手机上显示虚拟键盘,用户可使用该虚拟键盘实现输入。需要说明的是,上述S1101-S1109的具体描述与上述实施例S301-S309中对应步骤的描述类似,此处不再详细赘述。
例如,如图12中的(a)所示,手机当前显示聊天界面1201。该聊天界面1201包括输入框1203。用户在想要使用PC的键盘在该输入框1203中输入文字时,可移动PC的鼠标1202,以使得PC的鼠标穿梭到手机。之后,用户可继续移动PC的鼠标,使得手机上的光标移动到输入框1203的位置。用户可使用PC的鼠标1202进行点击操作。手机可获取到该点击操作对应的输入事件。根据该输入事件包括的输入设备ID或输入方式,手机可确定输入该点击操作的输入源是PC的鼠标。这样,手机可确定不在手机上显示虚拟键盘,用户可使用PC的键盘1204实现在输入框1203中文字的输入。其中,用户使用PC的键盘1204实现在输入框1203中输入的具体实现可以是:在用户操作PC的键盘1204后,PC(如PC的键鼠模块)可接收到对应的输入事件,如键盘事件,该键盘事件中包括用户对键盘1204进行的具体操作的操作参数。PC的键鼠模块可将该键盘事件拦截(如利用挂载的HOOK拦截),以便该键盘事件不会被发送给PC的windows系统,从而使得PC不会对键盘事件做响应。PC的键鼠模块还可捕获键盘事件中的操作参数。之后,PC可将该操作参数发送给手机。手机接收到该操作参数后,可根据该操作参数模拟(如利用创建的虚拟输入设备模拟)出对应的键盘事件。作为对该键盘事件的响应,手机可在输入框1203中显示对应的文字,以实现输入。
可选的,手机可以将聊天界面1201显示到PC的显示屏上,如图12中的(b)所示,PC显示的聊天界面如1205所示。在PC显示聊天界面1205的情况下,用户还是使用PC的键盘1204实现文字输入的,且输入的结果可同步显示到PC上(如PC更新投屏界面,可使得输入结果同步显示到PC)。
又例如,如图13中的(a)所示,手机当前显示聊天界面1301。该聊天界面1301包括输入框1302。用户在想要使用手机的虚拟键盘在该输入框1302中输入文字时,可在手机触摸屏上该输入框1302的位置处进行点击操作。之后,手机可获取到对应输入事件,根据该输入事件包括的输入设备ID或输入方式,手机可确定输入该点击操作的输入源是手机的触摸屏。因此,如图13中的(b)所示,作为对该点击操作的响应,手机可在手机上显示虚拟键盘1303,用户可使用虚拟键盘1303实现在输入框1302中文字的输入。
采用上述方法,在手机和PC协同使用的场景中,当用户使用PC的输入设备对手机中的输入框进行操作时,手机可不显示手机的虚拟键盘,用户可使用PC的键盘实现输入;当用户使用手机的输入设备对输入框进行操作时,则在手机上显示虚拟键盘,用户可使用该虚拟键盘实现输入。这样,无需用户将注意力频繁在两个设备间切换,提高了多终端协同使用的效率。
图14为本申请实施例提供的一种显示装置的组成示意图。如图14所示,该装置可以应用于第二终端(如上述手机),该第二终端与第一终端连接,该装置可以包括:显示单元1401,输入单元1402和发送单元1403。
显示单元1401,用于显示第一界面。
输入单元1402,用于接收用户对第一界面的内容的第一操作。
发送单元1403,用于在第一操作的输入源是第一终端的输入设备的情况下,响应于第一操作,向第一终端发送数据,该数据用于第一终端在第一终端的显示屏上显示第二界面。
显示单元1401,还用于在第一操作的输入源是第二终端的输入设备的情况下,响应于第一操作,在第二终端的显示屏上显示第二界面。
进一步的,在第一操作的输入源是第一终端的输入设备的情况下,该装置还可以包括:接收单元1404,用于接收来自第一终端的穿梭状态信息,该穿梭状态信息可用于指示输入设备的穿梭开始。
进一步的,接收单元1404,还用于接收来自第一终端的第一操作参数,该第一操作参数是在用户使用第一终端的输入设备执行第一操作的情况下,第一操作对应的第一输入事件包含的操作参数。
该装置还可以包括:模拟单元1405和确定单元1406。
模拟单元1405,用于根据第一操作参数,模拟第一输入事件。
确定单元1406,用于根据模拟的第一输入事件,确定第一操作的输入源是第一终端的输入设备。发送单元1403响应于第一操作,向第一终端发送数据,具体包括:发送单元1403响应于第一输入事件,向第一终端发送数据。
进一步的,确定单元1406,具体用于确定模拟的第一输入事件包括的输入设备标识为虚拟输入设备的标识,虚拟输入设备是第二终端创建的用于模拟输入事件;或,确定模拟的第一输入事件包括的输入方式指示的输入设备类型与第一终端的输入设备的类型相同,且确定接收到来自第一终端用于指示输入设备的穿梭开始的穿梭状态信息。
进一步的,在用户使用第二终端的输入设备执行第一操作的情况下,确定单元1406,还用于根据第一操作对应的第二输入事件,确定第一操作的输入源是第二终端的输入设备。显示单元1401响应于第一操作,在第二终端的显示屏上显示第二界面,可以包括:显示单元1401响应于第二输入事件,在第二终端的显示屏上显示第二界面。
进一步的,确定单元1406,具体用于:确定第二输入事件包括的输入设备标识为第二终端的输入设备的标识;或,确定第二输入事件包括的输入方式指示的输入设备类型与第二终端的输入设备的类型相同。
进一步的,该装置还可以包括:创建单元1407,用于在与第一终端的连接建立成功后,创建虚拟输入设备;或者,接收单元1404,还用于接收来自第一终端的通知消息,通知消息用于指示第一终端的键鼠共享模式已开启,创建单元1407,用于响应于通知消息,创建虚拟输入设备;其中,虚拟输入设备用于第二终端模拟第一终端输入设备的输入。
进一步的,显示单元1401,还用于显示第三界面,该第三界面包括输入框。
输入单元1402,还用于接收用户对输入框的第二操作。
显示单元1401,还用于在第二操作的输入源是第二终端的输入设备的情况下,响应于第二操作,在第二终端的显示屏上显示虚拟键盘。
进一步的,在第二操作的输入源是第一终端的输入设备的情况下,发送单元1403,还用于响应于第二操作,向第一终端发送第三界面的数据,第三界面上不显示虚拟键盘,第三界面的数据用于第一终端在第一终端的显示屏上显示第三界面。
进一步的,在用户使用第二终端的输入设备执行第二操作的情况下,确定单元1406,还用于根据第二操作对应的第三输入事件,确定第二操作的输入源是第二终端的输入设备。显示单元1401响应于第二操作,在第二终端的显示屏上显示虚拟键盘,可以包括:显示单元1401响应于第三输入事件在第二终端的显示屏上显示虚拟键盘。
进一步的,确定单元1406,具体用于:确定第三输入事件包括的输入设备标识为第二终端的输入设备的标识;或,确定第三输入事件包括的输入方式指示的输入设备类型与第二终端的输入设备的类型相同。
进一步的,接收单元1404,还用于接收来自第一终端的第二操作参数,该第二操作参数是在用户使用第一终端的输入设备执行第二操作的情况下,第二操作对应的第四输入事件包含的操作参数。模拟单元1405,还用于根据第二操作参数,模拟第四输入事件;确定单元1406,还用于根据模拟的第四输入事件,确定第二操作的输入源是第一终端的输入设备。发送单元1403响应于第二操作,向第一终端发送第三界面的数据,具体包括:发送单元1403响应于第四输入事件,向第一终端发送第三界面的数据。
进一步的,确定单元1406,具体用于:确定模拟的第四输入事件包括的输入设备标识为虚拟输入设备的标识;或,确定模拟的第四输入事件包括的输入方式指示的输入设备类型与第一终端的输入设备的类型相同,且确定接收到来自第一终端用于指示输入设备的穿梭开始的穿梭状态信息。
图14所示的显示装置还可以用于第二终端实现以下功能。其中,第二终端与第一终端连接,第一终端的键鼠共享模式开启,第二终端创建了虚拟输入设备,用于模拟第一终端输入设备的输入。
显示单元1401,用于显示界面。
输入单元1402,用于接收用户对该界面的输入框的操作。
显示单元1401,还用于在该操作的输入源是第二终端的输入设备的情况下,响应于该操作,第二终端在第二终端的显示屏上显示虚拟键盘。
进一步的,显示单元1401,还用于在该操作的输入源是第一终端的输入设备的情况下,响应于该操作,不在第二终端的显示屏上显示虚拟键盘。
进一步的,发送单元1403,用于在该操作的输入源是第一终端的输入设备的情况下,响应于该操作,向第一终端发送上述界面的数据,该数据用于第一终端在第一终端的显示屏上显示该界面,该界面上不显示虚拟键盘。
进一步的,在用户使用第二终端的输入设备执行对输入框的操作的情况下,确定单元1406,用于根据该操作对应的输入事件,确定该操作的输入源是第二终端的输入设备。显示单元1401响应于该操作,在第二终端的显示屏上显示虚拟键盘,可以包括:显示单元1401响应于该操作对应的输入事件,在第二终端的显示屏上显示虚拟键盘。
进一步的,确定单元1406,具体用于:确定输入事件包括的输入设备标识为第二终端的输入设备的标识;或,确定输入事件包括的输入方式指示的输入设备类型与第二终端的输入设备的类型相同。
进一步的,在上述操作的输入源是第一终端的输入设备的情况下,接收单元1404,用于接收来自第一终端的穿梭状态信息,该穿梭状态信息可用于指示输入设备的穿梭开始,或者说该穿梭状态信息可用于指示第二终端开始接受第一终端的输入设备的输入。
进一步的,在上述操作的输入源是第一终端的输入设备的情况下,接收单元1404,还用于接收来自第一终端的操作参数,该操作参数是在用户使用第一终端的输入设备执行上述操作的情况下,该操作对应的输入事件包含的操作参数。
模拟单元1405,用于根据该操作参数,模拟对应输入事件。
确定单元1406,还用于根据模拟的输入事件,确定上述操作的输入源是第一终端的输入设备。显示单元1401响应于操作,不在第二终端的显示屏上显示虚拟键盘可以包括:显示单元1401响应于该输入事件,不在第二终端的显示屏上显示虚拟键盘。发送单元1403响应于该操作,向第一终端发送上述界面的数据,可以包括:发送单元1403响应于该输入事件,向第一终端发送界面的数据。
进一步的,确定单元1406,具体用于:确定模拟的输入事件包括的输入设备标识为虚拟输入设备的标识,虚拟输入设备是第二终端创建的用于模拟输入事件;或,确定模拟的输入事件包括的输入方式指示的输入设备类型与第一终端的输入设备的类型相同,且确定接收到来自第一终端用于指示输入设备的穿梭开始的穿梭状态信息。
进一步的,创建单元1407,用于在与第一终端的连接建立成功后,创建虚拟输入设备;或者,接收单元1404,还用于接收来自第一终端的通知消息,通知消息用于指示第一终端的键鼠共享模式已开启,创建单元1407,用于响应于通知消息,创建虚拟输入设备;其中,虚拟输入设备用于第二终端模拟第一终端输入设备的输入。
本申请实施例还提供一种显示装置,该装置可以应用于电子设备,如上述实施例中的第一终端或第二终端。该装置可以包括:处理器;用于存储处理器可执行指令的存储器;其中,处理器被配置为执行指令时使得该显示装置实现上述方法实施例中手机或PC执行的各个功能或者步骤。
本申请实施例还提供一种电子设备(该电子设备可以是终端,如可以为上述实施例中的第一终端或第二终端),该电子设备可以包括:显示屏、存储器和一个或多个处理器。该显示屏、存储器和处理器耦合。该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令。当处理器执行计算机指令时,电子设备可执行上述方法实施例中手机或PC执行的各个功能或者步骤。当然,该电子设备包括但不限于上述显示屏、存储器和一个或多个处理器。例如,该电子设备的结构可以参考图2A所示的手机的结构。
本申请实施例还提供一种芯片系统,该芯片系统可以应用于电子设备,如前述实施例中的终端(如第一终端或第二终端)。如图15所示,该芯片系统包括至少一个处理器1501和至少一个接口电路1502。该处理器1501可以是上述电子设备中的处理器。处理器1501和接口电路1502可通过线路互联。该处理器1501可以通过接口电路1502从上述电子设备的存储器接收并执行计算机指令。当计算机指令被处理器1501执行时,可使得电子设备执行上述实施例中手机或PC执行的各个步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机可读存储介质,用于存储电子设备,如上述终端(如手机或PC)运行的计算机指令。
本申请实施例还提供一种计算机程序产品,包括电子设备,如上述终端(如手机或PC)运行的计算机指令。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (16)

  1. 一种显示方法,其特征在于,应用于第二终端,所述第二终端与第一终端连接,所述方法包括:
    所述第二终端显示第一界面;
    所述第二终端接收用户对所述第一界面的内容的第一操作;
    在所述第一操作的输入源是所述第一终端的输入设备的情况下,响应于所述第一操作,所述第二终端向所述第一终端发送数据,所述数据用于所述第一终端在所述第一终端的显示屏上显示第二界面;
    在所述第一操作的输入源是所述第二终端的输入设备的情况下,响应于所述第一操作,所述第二终端在所述第二终端的显示屏上显示所述第二界面。
  2. 根据权利要求1所述的方法,其特征在于,在所述第一操作的输入源是所述第一终端的输入设备的情况下,在所述第二终端接收所述第一操作之前,所述方法还包括:
    所述第二终端接收来自所述第一终端的穿梭状态信息,所述穿梭状态信息用于指示输入设备的穿梭开始。
  3. 根据权利要求1或2所述的方法,其特征在于,在所述第一操作的输入源是所述第一终端的输入设备的情况下,
    所述第二终端接收用户对所述第一界面的内容的第一操作,包括:
    所述第二终端接收来自所述第一终端的第一操作参数,所述第一操作参数是在用户使用所述第一终端的输入设备执行所述第一操作的情况下,所述第一操作对应的第一输入事件包含的操作参数;
    所述第二终端根据所述第一操作参数,模拟所述第一输入事件;
    所述响应于所述第一操作,所述第二终端向所述第一终端发送数据,包括:
    所述第二终端根据模拟的所述第一输入事件,确定所述第一操作的输入源是所述第一终端的输入设备;
    响应于所述第一输入事件,所述第二终端向所述第一终端发送所述数据。
  4. 根据权利要求3所述的方法,其特征在于,所述第二终端根据模拟的所述第一输入事件,确定所述第一操作的输入源是所述第一终端的输入设备,包括:
    所述第二终端确定模拟的所述第一输入事件包括的输入设备标识为虚拟输入设备的标识,所述虚拟输入设备是所述第二终端创建的用于模拟输入事件;或,
    所述第二终端确定模拟的所述第一输入事件包括的输入方式指示的输入设备类型与所述第一终端的输入设备的类型相同,且确定接收到来自所述第一终端用于指示输入设备的穿梭开始的穿梭状态信息。
  5. 根据权利要求1-4中任一项所述的方法,其特征在于,所述响应于所述第一操作,所述第二终端在所述第二终端的显示屏上显示所述第二界面,包括:
    在用户使用所述第二终端的输入设备执行所述第一操作的情况下,所述第二终端根据所述第一操作对应的第二输入事件,确定所述第一操作的输入源是所述第二终端的输入设备;
    响应于所述第二输入事件,所述第二终端在所述第二终端的显示屏上显示所述第二界面。
  6. 根据权利要求5所述的方法,其特征在于,所述第二终端根据所述第一操作对应的第二输入事件,确定所述第一操作的输入源是所述第二终端的输入设备,包括:
    所述第二终端确定所述第二输入事件包括的输入设备标识为所述第二终端的输入设备的标识;或,
    所述第二终端确定所述第二输入事件包括的输入方式指示的输入设备类型与所述第二终端的输入设备的类型相同。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述方法还包括:
    所述第二终端在与所述第一终端的连接建立成功后,创建虚拟输入设备;或者,
    所述第二终端接收来自所述第一终端的通知消息,所述通知消息用于指示所述第一终端的键鼠共享模式已开启,响应于所述通知消息,所述第二终端创建所述虚拟输入设备;
    其中,所述虚拟输入设备用于所述第二终端模拟所述第一终端输入设备的输入。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,所述方法还包括:
    所述第二终端显示第三界面,所述第三界面包括输入框;
    所述第二终端接收用户对所述输入框的第二操作;
    在所述第二操作的输入源是所述第二终端的输入设备的情况下,响应于所述第二操作,所述第二终端在所述第二终端的显示屏上显示虚拟键盘。
  9. 根据权利要求8所述的方法,其特征在于,所述方法还包括:
    在所述第二操作的输入源是所述第一终端的输入设备的情况下,响应于所述第二操作,所述第二终端向所述第一终端发送所述第三界面的数据,所述第三界面上不显示虚拟键盘,所述第三界面的数据用于所述第一终端在所述第一终端的显示屏上显示所述第三界面。
  10. 一种显示装置,其特征在于,包括:处理器;用于存储所述处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令时使得所述显示装置实现如权利要求1-9中任一项所述的方法。
  11. 一种计算机可读存储介质,其上存储有计算机程序指令,其特征在于,所述计算机程序指令被电子设备执行时使得所述电子设备实现如权利要求1-9中任一项所述的方法。
  12. 一种显示系统,其特征在于,包括:第一终端和第二终端,所述第一终端与所述第二终端连接;
    所述第二终端,用于显示第一界面,接收用户对所述第一界面的内容的第一操作;
    所述第二终端,还用于在所述第一操作的输入源是所述第二终端的输入设备的情况下,响应于所述第一操作,在所述第二终端的显示屏上显示第二界面;在所述第一操作的输入源是所述第一终端的输入设备的情况下,响应于所述第一操作,向所述第一终端发送数据;
    所述第一终端,用于接收所述数据,根据所述数据在所述第一终端的显示屏上显示所述第二界面。
  13. 根据权利要求12所述的系统,其特征在于,所述第一终端,还用于在用户使用所述第一终端的输入设备输入所述第一操作的情况下,拦截所述第一操作对应的第一输入事件,向所述第二终端发送所述第一输入事件包括的第一操作参数。
  14. 根据权利要求13所述的系统,其特征在于,
    所述第二终端用于接收用户的所述第一操作,具体为:所述第二终端,用于接收来自所述第一终端的所述第一操作参数;根据所述第一操作参数,模拟所述第一输入事件;
    所述第二终端响应于所述第一操作,向所述第一终端发送数据,具体为:所述第二终端用于根据模拟的所述第一输入事件,确定所述第一操作的输入源是所述第一终端的输入设备,响应于所述第一输入事件,向所述第一终端发送所述数据。
  15. 根据权利要求13或14所述的系统,其特征在于,
    所述第一终端,还用于确定所述第一终端显示的光标滑出所述第一终端的显示屏的边缘,开启输入事件的拦截。
  16. 根据权利要求12-15中任一项所述的系统,其特征在于,所述第二终端响应于所述第一操作,在所述第二终端的显示屏上显示第二界面,具体为:所述第二终端,用于在用户使用所述第二终端的输入设备执行所述第一操作的情况下,根据所述第一操作对应的第二输入事件,确定所述第一操作的输入源是所述第二终端的输入设备,响应于所述第二输入事件,在所述第二终端的显示屏上显示所述第二界面。
PCT/CN2021/114990 2020-09-02 2021-08-27 一种显示方法及设备 WO2022048500A1 (zh)




