WO2022042656A1 - Interface display method and device - Google Patents

Interface display method and device

Info

Publication number
WO2022042656A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
cursor
interface
content
display
Prior art date
Application number
PCT/CN2021/114825
Other languages
English (en)
French (fr)
Inventor
周学而
魏凡翔
刘敏
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to US18/042,688 (US20230333703A1)
Priority to EP21860490.8A (EP4195008A4)
Publication of WO2022042656A1

Classifications

    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0485 Scrolling or panning
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 2203/04804 Transparency, e.g. transparent or translucent windows
    • G09G 2370/16 Use of wireless transmission of display information
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H04M 2250/16 Details of telephonic subscriber devices including more than one display unit

Definitions

  • the present application relates to the field of electronic devices, and in particular, to an interface display method and device.
  • a user may own multiple terminals, such as a mobile phone, a tablet computer, and a personal computer (PC), at the same time.
  • the user can connect the PC and the mobile phone in a wired or wireless manner so that the two devices work together, realizing collaborative office use of the PC and the mobile phone.
  • multi-screen collaboration uses mirror projection to project the display interface of the mobile phone onto the display screen of the PC.
  • the interface projected by the mobile phone and displayed on the PC may be referred to as a screen projection interface; for example, the screen projection interface 102 in FIG. 1 is a mirror of the desktop 101 of the mobile phone.
  • the user can perform mouse operations, such as clicking and moving the mouse, in the screen projection interface, so as to operate the actual interface displayed on the mobile phone.
  • Embodiments of the present application provide an interface display method and device.
  • when a user uses the mouse of a PC to move the cursor onto a control in the screen-casting interface, the control and/or the cursor can give corresponding visual feedback.
  • an embodiment of the present application provides an interface display method, which is applied to a first terminal, where the first terminal is connected to a second terminal, and the method may include:
  • the first terminal displays a screen projection interface on the display screen of the first terminal, where the content of the screen projection interface is a mirror image of the content of the first interface displayed on the display screen of the second terminal; the first terminal receives a first operation input by the user using the input device of the first terminal, where the first operation is used to move a first cursor on the display screen of the first terminal; when the first cursor moves onto first content of the screen projection interface, the cursor style of the first cursor is a first style, and/or the display mode of the first content is changed from a first mode to a second mode; when the first cursor moves onto second content of the screen projection interface, the cursor style of the first cursor is a second style, and/or the display mode of the second content is changed from a third mode to a fourth mode.
  • the content of the screen projection interface in this embodiment may refer to elements displayed on the screen projection interface.
  • the first content and the second content are different elements displayed on the screen-casting interface, and the cursor style when the first cursor moves onto the first content is different from the cursor style when it moves onto the second content; that is, the first style is different from the second style.
  • the above-mentioned first mode and third mode may be the same or different.
  • the second mode and the fourth mode may be the same or different.
  • in this way, after the user moves the cursor onto content in the screen-casting interface, the content and/or the cursor in the screen-casting interface gives corresponding visual feedback; for example, the content in the screen projection interface presents a highlighted background, and the cursor style changes accordingly. The user can thus visually determine whether the control in the screen-casting interface, which corresponds to the control displayed on the screen-casting source end, can accept the next operation, which improves the user's experience.
  • the above-mentioned screen projection interface is displayed on a partial area of the display screen of the first terminal; the method may further include: in response to the first operation, the first terminal displays, on the display screen of the first terminal, an animation of the first cursor moving; during the movement of the first cursor on the display screen of the first terminal, when the first terminal determines that the first cursor has entered the screen projection interface, it sends to the second terminal the initial coordinate position at which the first cursor entered the screen projection interface, and sends the data of the first operation to the second terminal; the initial coordinate position is the coordinate position of the first cursor relative to the first corner of the screen-casting interface when it enters the screen-casting interface, and is used by the second terminal to display a second cursor on the display screen of the second terminal; the data of the first operation is used to move the second cursor on the display screen of the second terminal, so that when the first cursor moves onto the first content, the second cursor moves onto the content corresponding to the first content and the cursor style of the second cursor is the first style, and when the first cursor moves onto the second content, the second cursor moves onto the content corresponding to the second content and the cursor style of the second cursor is the second style.
  • when the first cursor moves onto the first content of the screen-casting interface, the first terminal receives the cursor type of the first style from the second terminal and displays the first cursor according to that cursor type, so that the first cursor is displayed in the first style; when the first cursor moves onto the second content of the screen-casting interface, the first terminal receives the cursor type of the second style from the second terminal and displays the first cursor according to that cursor type, so that the first cursor is displayed in the second style.
  • after the first cursor enters the screen projection interface, the first terminal sends the corresponding operation data to the second terminal, so that the second terminal can move its own cursor according to the operation data and feed the resulting cursor style back to the first terminal; the cursor style of the first cursor on the first terminal then changes correspondingly, giving the user the visual effect that the cursor provides visual feedback after it moves onto the corresponding content of the screen projection interface.
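  • As a rough illustration of the destination-side logic just described, the sketch below shows how the first terminal might detect that its cursor has entered the screen-casting window and compute the initial coordinate position relative to the window's first corner (top-left here) before forwarding it to the second terminal. The class, field and function names (ProjectionWindow, send_to_source, the message format) are hypothetical and only illustrate the described flow; they are not taken from this application.

```python
# Hypothetical sketch of the screen-casting destination (first terminal) side.
from dataclasses import dataclass

@dataclass
class ProjectionWindow:
    x: int       # left edge of the screen-casting window on the destination display
    y: int       # top edge of the screen-casting window on the destination display
    width: int
    height: int

    def contains(self, cx: int, cy: int) -> bool:
        return self.x <= cx < self.x + self.width and self.y <= cy < self.y + self.height


def on_cursor_moved(cx: int, cy: int, window: ProjectionWindow,
                    state: dict, send_to_source) -> None:
    """Called for every cursor move on the first terminal's display."""
    inside = window.contains(cx, cy)
    if inside and not state.get("in_projection", False):
        # The first cursor has just entered the screen projection interface:
        # report the entry point relative to the window's first corner.
        send_to_source({"type": "cursor_entered",
                        "rel_x": cx - window.x,
                        "rel_y": cy - window.y})
    state["in_projection"] = inside
```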
  • when the second cursor moves onto the content corresponding to the first content, the display mode of the content corresponding to the first content in the first interface is changed from the first mode to the second mode; the method may further include: after the first cursor moves onto the first content of the screen projection interface, the first terminal updates the screen projection interface, where the display mode of the first content in the screen projection interface before the update is the first mode and the display mode of the first content in the updated screen projection interface is the second mode.
  • when the second cursor moves onto the content corresponding to the second content, the display mode of the content corresponding to the second content in the first interface is changed from the third mode to the fourth mode; the method may further include: after the first cursor moves onto the second content of the screen projection interface, the first terminal updates the screen projection interface, where the display mode of the second content in the screen projection interface before the update is the third mode and the display mode of the second content in the updated screen projection interface is the fourth mode.
  • after the first cursor enters the screen projection interface, the first terminal sends the corresponding operation data to the second terminal, so that the second terminal can move its own cursor onto the corresponding content according to the operation data and that content can give corresponding visual feedback; the first terminal then changes the display mode of the corresponding content on the first terminal by updating the screen-casting interface, so that after the user moves the cursor onto content in the screen-casting interface, the content appears to give visual feedback.
  • the transparency of the second cursor is greater than a threshold.
  • sending the data of the first operation to the second terminal may include: after the first cursor enters the screen-casting interface, while the user inputs the first operation using the input device of the first terminal, the first terminal acquires a first operation parameter from the received first input event, where the first input event is the movement event corresponding to the first operation; the first terminal sends the first operation parameter to the second terminal, and the first operation parameter is used by the second terminal to simulate the first input event and thereby move the second cursor.
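  • A minimal sketch of how such a movement parameter might be captured and forwarded is shown below. The event field names and the send_to_source transport are assumptions used only for illustration.

```python
# Hypothetical sketch: forwarding the operation parameters of a movement event
# from the first terminal to the second terminal.
import json

def forward_move_event(event: dict, send_to_source) -> None:
    """Extract the parameters of a mouse-move input event and send them to the
    screen-casting source so that it can simulate an equivalent movement event."""
    first_operation_parameter = {
        "type": "move",
        # relative displacement of the first cursor reported by the move event
        "dx": event["dx"],
        "dy": event["dy"],
        "timestamp": event["ts"],
    }
    send_to_source(json.dumps(first_operation_parameter))
```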
  • the method may further include: when the first cursor moves onto the first content of the screen-casting interface, the first terminal receives a second operation input by the user using the input device of the first terminal; the first terminal sends data of the second operation to the second terminal, and the data of the second operation is used by the second terminal to display a second interface; the first terminal updates the screen-casting interface, and the content of the updated screen-casting interface is a mirror image of the content of the second interface.
  • after the cursor moves onto content of the screen projection interface, if the user operates on that content, the first terminal sends the corresponding operation data to the second terminal, so that the second terminal can respond accordingly.
  • the first terminal updates the screen projection interface, so that the updated interface of the second terminal can be correspondingly projected onto the first terminal.
  • sending the data of the second operation by the first terminal to the second terminal may include: after the user inputs the second operation using the input device of the first terminal, the first terminal intercepts the second input event corresponding to the second operation; the first terminal acquires the second operation parameter in the second input event and sends it to the second terminal, where the second operation parameter is used by the second terminal to simulate the second input event and thereby display the second interface.
  • the first operation corresponds to a movement event; after the first cursor enters the screen-casting interface, the method may further include: the first terminal enables interception of input events so as to intercept input events other than movement events; the first terminal sends first indication information to the second terminal, where the first indication information is used to indicate that sharing starts.
  • the method may further include: the first terminal cancels the interception of input events; the first terminal sends second indication information to the second terminal, where the second indication information is used to indicate that sharing stops.
  • the first terminal can adjust the transparency of the first cursor, the adjusted transparency being greater than a threshold; the first terminal can also intercept the first input event and send the first operation parameter of the first input event to the second terminal, so that the second terminal simulates the first input event according to the first operation parameter and then moves the second cursor.
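  • The toggling of input-event interception and the sharing-state indications described above could be organized roughly as follows; the hook object stands in for whatever low-level interception mechanism the platform provides and, like the message format, is an assumption rather than an API named by this application.

```python
# Hypothetical sketch: toggling input-event interception and signalling the
# sharing state when the cursor enters or leaves the screen-casting window.

class SharingController:
    def __init__(self, send_to_source, hook):
        self.send_to_source = send_to_source
        self.hook = hook          # intercepts input events other than movement events
        self.sharing = False

    def on_cursor_entered_projection(self) -> None:
        if not self.sharing:
            self.hook.enable()    # start intercepting click/key events
            self.send_to_source({"type": "indication", "value": "sharing_start"})
            self.sharing = True

    def on_cursor_left_projection(self) -> None:
        if self.sharing:
            self.hook.disable()   # hand input events back to the local system
            self.send_to_source({"type": "indication", "value": "sharing_stop"})
            self.sharing = False
```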
  • the first terminal updates the screen-casting interface; in the updated interface, the cursor has moved onto the first content of the screen-casting interface, the cursor style of the cursor in the updated screen-casting interface is the first style, and/or the display mode of the first content before the update is the first mode while the display mode of the first content after the update is the second mode.
  • the first terminal updates the screen projection interface; in the updated interface, the cursor has moved onto the second content of the screen projection interface, the cursor style of the cursor in the updated screen projection interface is the second style, and/or the display mode of the second content before the update is the third mode while the display mode of the second content after the update is the fourth mode. It can be understood that, by updating the screen-casting interface, the first terminal gives the user the visual effect of corresponding visual feedback from the content and/or the cursor in the screen-casting interface when the cursor moves over content of the screen-casting interface.
  • an embodiment of the present application provides an interface display method, which is applied to a second terminal, and the second terminal is connected to the first terminal.
  • the method may include:
  • the second terminal displays the first interface and projects the first interface onto the first terminal for display, so that the first terminal displays the screen-casting interface; when the first cursor of the first terminal enters the screen-casting interface, the second terminal displays a second cursor on the first interface; the second terminal receives the first operation input by the user using the input device of the first terminal, where the first operation is used to move the second cursor on the display screen of the second terminal; when the second cursor moves onto the first content of the first interface, the second terminal displays the second cursor in the first style and/or changes the display mode of the first content from the first mode to the second mode, so that when the first cursor moves onto the content corresponding to the first content in the screen projection interface, the first cursor is displayed in the first style and/or the display mode of the content corresponding to the first content in the screen projection interface is changed from the first mode to the second mode; when the second cursor moves onto the second content of the first interface, the second terminal displays the second cursor in the second style and/or changes the display mode of the second content from the third mode to the fourth mode, so that when the first cursor moves onto the content corresponding to the second content in the screen-casting interface, the first cursor is displayed in the second style and/or the display mode of the content corresponding to the second content in the screen-casting interface is changed from the third mode to the fourth mode.
  • the method may further include: the second terminal sends the cursor type of the first style to the first terminal, for the first terminal to display the first cursor so that the first cursor is displayed in the first style; the method may further include: the second terminal sends the cursor type of the second style to the first terminal, for the first terminal to display the first cursor so that the first cursor is displayed in the second style.
  • the second terminal can feed back the cursor style of the second cursor to the first terminal, so that the cursor style of the first cursor on the first terminal can be changed accordingly.
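  • The cursor-style feedback described here could, under simple assumptions, look like the sketch below; the element kinds, cursor-type names and send_to_destination helper are illustrative, not names used by this application.

```python
# Hypothetical sketch of the source-side (second terminal) cursor-style feedback.

HOVER_STYLES = {
    "text_field": "text_select",   # e.g. an I-beam cursor over editable text
    "link": "hand",                # e.g. a hand cursor over a clickable element
}

def on_second_cursor_hover(element_kind: str, state: dict, send_to_destination) -> None:
    """When the (nearly transparent) second cursor hovers over an element,
    report the resulting cursor type so that the first terminal can restyle
    the first cursor accordingly."""
    style = HOVER_STYLES.get(element_kind, "normal_select")
    if style != state.get("last_style"):
        send_to_destination({"type": "cursor_type_changed", "cursor_type": style})
        state["last_style"] = style
```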
  • the transparency of the second cursor is greater than a threshold.
  • the second terminal displaying the second cursor on the first interface may include: the second terminal receives, from the first terminal, the initial coordinate position at which the first cursor entered the screen-casting interface; the second terminal determines a starting position according to the initial coordinate position, the size of the screen-casting interface and the resolution of the second terminal, where the starting position may be a coordinate position relative to the first corner of the display screen of the second terminal; the second terminal displays the second cursor at the starting position.
  • the second terminal receiving the first operation input by the user using the input device of the first terminal may include: the second terminal receives a first operation parameter from the first terminal, where the first operation parameter is an operation parameter in the first input event received by the first terminal while the user inputs the first operation using the input device of the first terminal after the first cursor has entered the screen projection interface, and the first operation parameter includes the relative displacement of the first cursor compared with the initial coordinate position; the second terminal determines the relative displacement of the second cursor compared with the starting position according to the relative displacement of the first cursor compared with the initial coordinate position; the second terminal simulates the first input event according to the determined relative displacement of the second cursor compared with the starting position and the other parameters in the first operation parameter. By converting the relative displacement in the received operation parameters, the second cursor can be moved onto the corresponding content after the first cursor is moved.
  • the method may further include: the second terminal displays an animation of the movement of the second cursor on the display screen of the second terminal according to the first input event.
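  • The conversion described in the items above, from coordinates reported relative to the screen-casting window to coordinates on the second terminal's own display, can be sketched roughly as follows. Scaling by the ratio between the projection-window size and the source resolution is an assumption consistent with the description, and show_cursor_at / move_cursor_by stand in for whatever event-injection mechanism the source platform provides.

```python
# Hypothetical sketch of the source-side (second terminal) coordinate handling.

def scale_to_source(x: float, y: float,
                    projection_size: tuple[int, int],
                    source_resolution: tuple[int, int]) -> tuple[int, int]:
    """Scale a value expressed in projection-window pixels into the second
    terminal's own display coordinate system."""
    proj_w, proj_h = projection_size
    src_w, src_h = source_resolution
    return round(x * src_w / proj_w), round(y * src_h / proj_h)


def handle_destination_message(msg: dict, projection_size, source_resolution,
                               show_cursor_at, move_cursor_by) -> None:
    if msg["type"] == "cursor_entered":
        # Starting position of the (nearly transparent) second cursor, derived
        # from the initial coordinate position relative to the window's first corner.
        start_x, start_y = scale_to_source(msg["rel_x"], msg["rel_y"],
                                           projection_size, source_resolution)
        show_cursor_at(start_x, start_y)
    elif msg["type"] == "move":
        # Convert the relative displacement, then simulate the movement event.
        dx, dy = scale_to_source(msg["dx"], msg["dy"],
                                 projection_size, source_resolution)
        move_cursor_by(dx, dy)
```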
  • the method may further include: when the second cursor moves on the first content of the first interface, the second terminal receives a second operation input by the user using the input device of the first terminal; In response to the second operation, the second terminal displays the second interface, and projects and displays the second interface to the first terminal, so that the content of the screen projection interface updated by the first terminal is a mirror image of the content of the second interface.
  • the second terminal receiving the second operation input by the user using the input device of the first terminal may include: the second terminal receives a second operation parameter from the first terminal, where the second operation parameter is an operation parameter contained in the second input event intercepted by the first terminal after the user inputs the second operation using the input device of the first terminal while the first cursor is on the content corresponding to the first content in the screen-casting interface; the second terminal simulates the second input event according to the second operation parameter, and the second input event is used to display the second interface.
  • the method may further include: the second terminal receives first indication information from the first terminal, where the first indication information is used to indicate that sharing starts.
  • the method may further include: the second terminal receives second indication information from the first terminal, where the second indication information is used to indicate that sharing stops and is sent by the first terminal after it determines that the first cursor has moved out of the screen projection interface.
  • an embodiment of the present application provides an interface display apparatus, which is applied to a first terminal, where the first terminal is connected to the second terminal, and the apparatus may include:
  • a display unit, configured to display a screen projection interface on the display screen of the first terminal, where the content of the screen projection interface is a mirror image of the content of the first interface displayed on the display screen of the second terminal;
  • an input unit, configured to receive a first operation input by the user using the input device of the first terminal, where the first operation is used to move the first cursor on the display screen of the first terminal; when the first cursor moves onto the first content of the screen-casting interface, the cursor style of the first cursor is the first style, and/or the display mode of the first content is changed from the first mode to the second mode; when the first cursor moves onto the second content of the screen-casting interface, the cursor style of the first cursor is the second style, and/or the display mode of the second content is changed from the third mode to the fourth mode.
  • the above-mentioned screen projection interface is displayed on a partial area of the display screen of the first terminal; the display unit is further configured to display, in response to the first operation, an animation of the first cursor moving on the display screen of the first terminal;
  • the apparatus may further include: a sending unit, configured to send, when it is determined during the movement of the first cursor on the display screen of the first terminal that the first cursor has entered the screen-casting interface, the initial coordinate position at which the first cursor entered the screen-casting interface to the second terminal, and to send the data of the first operation to the second terminal; the initial coordinate position is the coordinate position relative to the first corner of the screen-casting interface when the first cursor enters the screen-casting interface and is used by the second terminal to display the second cursor on the display screen of the second terminal; the data of the first operation is used to move the second cursor on the display screen of the second terminal, so that when the first cursor moves onto the first content, the second cursor moves onto the content corresponding to the first content.
  • the apparatus may further include: a receiving unit.
  • the receiving unit is configured to receive the cursor type of the first style from the second terminal when the first cursor moves onto the first content of the screen projection interface; the display unit is further configured to display the first cursor according to the cursor type of the first style, so that the first cursor is displayed in the first style; the receiving unit is further configured to receive the cursor type of the second style from the second terminal when the first cursor moves onto the second content of the screen projection interface; the display unit is further configured to display the first cursor according to the cursor type of the second style, so that the first cursor is displayed in the second style.
  • when the second cursor moves onto the content corresponding to the first content, the display mode of the content corresponding to the first content in the first interface is changed from the first mode to the second mode; the display unit is further configured to update the screen-casting interface after the first cursor moves onto the first content of the screen-casting interface, where the display mode of the first content in the screen-casting interface before the update is the first mode and the display mode of the first content in the updated screen-casting interface is the second mode.
  • when the second cursor moves onto the content corresponding to the second content, the display mode of the content corresponding to the second content in the first interface is changed from the third mode to the fourth mode; the display unit is further configured to update the screen-casting interface after the first cursor moves onto the second content of the screen-casting interface, where the display mode of the second content in the screen-casting interface before the update is the third mode and the display mode of the second content in the updated screen-casting interface is the fourth mode.
  • the transparency of the second cursor is greater than a threshold.
  • the apparatus may further include: an acquiring unit, configured to acquire, after the first cursor enters the screen-casting interface and while the user inputs the first operation using the input device of the first terminal, the first operation parameter in the received first input event; the sending unit is specifically configured to send the first operation parameter to the second terminal, where the first operation parameter is used by the second terminal to simulate the first input event and then to move the second cursor.
  • the input unit is further configured to receive a second operation input by the user using the input device of the first terminal when the first cursor moves onto the first content of the screen projection interface; the sending unit is further configured to send the data of the second operation to the second terminal, where the data of the second operation is used by the second terminal to display the second interface; the display unit is further configured to update the screen projection interface, where the content of the updated screen projection interface is a mirror image of the content of the second interface.
  • the acquiring unit is further configured to intercept the second input event corresponding to the second operation after the user inputs the second operation using the input device of the first terminal, and to acquire the second operation parameter in the second input event; the sending unit is specifically configured to send the second operation parameter to the second terminal, where the second operation parameter is used by the second terminal to simulate the second input event and thereby display the second interface.
  • the first operation corresponds to a movement event; the acquiring unit is further configured to enable interception of input events, so as to intercept input events other than movement events.
  • the sending unit is further configured to send first indication information to the second terminal, where the first indication information is used to indicate the start of sharing.
  • the acquiring unit is further configured to cancel the interception of the input event.
  • the sending unit is further configured to send second indication information to the second terminal, where the second indication information is used to instruct the sharing to stop.
  • an embodiment of the present application provides an interface display apparatus, which is applied to a second terminal, and the second terminal is connected to the first terminal.
  • the apparatus may include:
  • a display unit, configured to display the first interface; a projection unit, configured to project and display the first interface onto the first terminal, so that the first terminal displays the screen projection interface; the display unit is further configured to display a second cursor on the first interface when the first cursor of the first terminal enters the screen projection interface; a receiving unit, configured to receive a first operation input by the user using the input device of the first terminal, where the first operation is used to move the second cursor on the display screen of the second terminal; the display unit is further configured to display the second cursor in the first style and/or change the display mode of the first content from the first mode to the second mode when the second cursor moves onto the first content of the first interface, so that when the first cursor moves onto the content corresponding to the first content in the screen-casting interface, the first cursor is displayed in the first style and/or the display mode of the content corresponding to the first content in the screen-casting interface is changed from the first mode to the second mode; the display unit is further configured to display the second cursor in the second style and/or change the display mode of the second content from the third mode to the fourth mode when the second cursor moves onto the second content of the first interface, so that when the first cursor moves onto the content corresponding to the second content in the screen-casting interface, the first cursor is displayed in the second style and/or the display mode of the content corresponding to the second content in the screen-casting interface is changed from the third mode to the fourth mode.
  • the apparatus may further include: a sending unit, configured to send the cursor type of the first style to the first terminal after the second cursor is displayed in the first style, for the first terminal to display the first cursor so that the first cursor is displayed in the first style; the sending unit is further configured to send the cursor type of the second style to the first terminal after the second cursor is displayed in the second style, for the first terminal to display the first cursor so that the first cursor is displayed in the second style.
  • the transparency of the second cursor is greater than a threshold.
  • the receiving unit is further configured to receive the initial coordinate position of the first cursor entering the screen projection interface from the first terminal.
  • the apparatus may further include: a determining unit, configured to determine a starting position according to the initial coordinate position, the size of the screen projection interface and the resolution of the second terminal, where the starting position may be a coordinate position relative to the first corner of the display screen of the second terminal; the display unit is specifically configured to display the second cursor at the starting position.
  • the receiving unit is specifically configured to receive a first operation parameter from the first terminal, where the first operation parameter is an operation parameter in the first input event received by the first terminal while the user inputs the first operation using the input device of the first terminal after the first cursor has entered the screen projection interface, and the first operation parameter includes the relative displacement of the first cursor compared with the initial coordinate position;
  • the determining unit is further configured to determine the relative displacement of the second cursor compared with the starting position according to the relative displacement of the first cursor compared with the initial coordinate position;
  • the apparatus may further include: a simulation unit, configured to simulate the first input event according to the determined relative displacement of the second cursor compared with the starting position and the other parameters in the first operation parameter.
  • the display unit is further configured to display the animation of the movement of the second cursor on the display screen of the second terminal according to the first input event.
  • the receiving unit is further configured to receive a second operation input by the user using the input device of the first terminal when the second cursor moves onto the first content of the first interface; the display unit is further configured to display the second interface in response to the second operation; the projection unit is further configured to project and display the second interface onto the first terminal, so that the content of the screen-casting interface updated by the first terminal is a mirror image of the content of the second interface.
  • the receiving unit is specifically configured to receive a second operation parameter from the first terminal, where the second operation parameter is an operation parameter contained in the second input event intercepted by the first terminal after the user inputs the second operation using the input device of the first terminal while the first cursor is on the content corresponding to the first content in the screen-casting interface.
  • the receiving unit is further configured to receive first indication information from the first terminal, where the first indication information is used to indicate the start of sharing.
  • the receiving unit is further configured to receive second indication information from the first terminal, where the second indication information is used to indicate that sharing stops and is sent by the first terminal after it determines that the first cursor has moved out of the screen-casting interface.
  • an embodiment of the present application provides an interface display apparatus, which may include: a processor, and a memory for storing instructions executable by the processor, where the processor is configured to execute the instructions so that the interface display apparatus implements the method described in the first aspect or any possible implementation of the first aspect, or the method described in the second aspect or any possible implementation of the second aspect.
  • embodiments of the present application provide a computer-readable storage medium on which computer program instructions are stored; when the computer program instructions are executed by an electronic device, the electronic device implements the method described in the first aspect or any possible implementation of the first aspect, or the method described in the second aspect or any possible implementation of the second aspect.
  • an embodiment of the present application provides an electronic device, the electronic device including a display screen, one or more processors and a memory, where the display screen, the processors and the memory are coupled; the memory is used to store computer program code including computer instructions, and when the computer instructions are executed by the electronic device, the electronic device is caused to perform the method described in the first aspect or any possible implementation of the first aspect, or the method described in the second aspect or any possible implementation of the second aspect.
  • an embodiment of the present application provides a computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code; when the computer-readable code runs in an electronic device, a processor in the electronic device executes the method described in the first aspect or any possible implementation of the first aspect, or the method described in the second aspect or any possible implementation of the second aspect.
  • an embodiment of the present application provides an interface display system, where the interface display system may include a first terminal and a second terminal, and the first terminal is connected to the second terminal.
  • the second terminal is configured to display the first interface, and project and display the first interface to the first terminal, so that the first terminal displays the screen projection interface.
  • the first terminal is configured to display a screen projection interface on the display screen of the first terminal, where the content of the screen projection interface is a mirror image of the content of the first interface displayed on the display screen of the second terminal, and to receive a first operation input by the user using the input device of the first terminal, where the first operation is used to move the first cursor on the display screen of the first terminal.
  • when the first cursor moves onto the first content of the screen projection interface, the cursor style of the first cursor is the first style, and/or the display mode of the first content is changed from the first mode to the second mode; when the first cursor moves onto the second content of the screen projection interface, the cursor style of the first cursor is the second style, and/or the display mode of the second content is changed from the third mode to the fourth mode.
  • the screen projection interface is displayed on a partial area of the display screen of the first terminal; the first terminal is further configured to display, in response to the first operation, an animation of the first cursor moving on the display screen of the first terminal; the second terminal is further configured to display the second cursor on the first interface when the first cursor enters the screen-casting interface, and to receive the first operation input by the user using the input device of the first terminal, where the first operation is used to move the second cursor on the display screen of the second terminal; when the second cursor moves onto the content corresponding to the first content in the first interface, the second terminal displays the second cursor in the first style and sends the cursor type of the first style to the first terminal, and the first terminal is further configured to display the first cursor according to the cursor type of the first style; when the second cursor moves onto the content corresponding to the second content in the first interface, the second terminal displays the second cursor in the second style and sends the cursor type of the second style to the first terminal, and the first terminal is further configured to display the first cursor according to the cursor type of the second style.
  • the second terminal is further configured to change the display mode of the content corresponding to the first content in the first interface from the first mode to the second mode when the second cursor moves onto that content, and the first terminal is further configured to update the screen projection interface; the second terminal is further configured to change the display mode of the content corresponding to the second content in the first interface from the third mode to the fourth mode when the second cursor moves onto that content, and the first terminal is further configured to update the screen projection interface.
  • the first corner may be any one of the upper left corner, the lower left corner, the upper right corner and the lower right corner of the display screen.
  • for the beneficial effects that can be achieved by the interface display apparatus described in the third aspect and any possible implementation thereof, the interface display apparatus described in the fourth aspect and any possible implementation thereof, the interface display apparatus described in the fifth aspect, the computer-readable storage medium described in the sixth aspect, the terminal described in the seventh aspect, the computer program product described in the eighth aspect and the interface display system described in the ninth aspect, reference may be made to the beneficial effects of the first aspect or the second aspect and any possible implementations thereof, which will not be repeated here.
  • FIG. 1 is a schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 2 is a simplified schematic diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of an interface display method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 7 is another schematic diagram of a display interface provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a display coordinate system provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of another display coordinate system provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a cursor style provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of another display interface provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of the composition of an interface display device provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the composition of another interface display device provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of the composition of a chip system provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of the indicated technical features.
  • a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
  • "plural" means two or more.
  • Multi-screen collaboration can use the mirror projection method to project the interface displayed by one terminal to the display screen of another terminal for display.
  • the terminal that projects its display interface may be referred to as the projection source
  • the terminal that receives the projection from the projection source and displays the display interface of the projection source is referred to as the projection destination.
  • the interface projected by the screen-casting source and displayed on the screen-casting destination is called the screen-casting interface
  • the window used by the screencasting destination to display the screencasting interface is called the screencasting window.
  • for example, take the case where the source end of the screen projection is a mobile phone and the destination end of the screen projection is a PC.
  • the mobile phone can project the interface (such as the desktop 101 ) displayed on its display screen to the display screen of the PC.
  • the PC can display the interface projected by the mobile phone on the display screen of the PC, for example, the PC displays the projection interface 102 in the projection window.
  • the user can use the input device of the screen-casting destination to operate on the screen-casting interface, so as to realize the operation on the actual interface of the screen-casting source. For example, continue with reference to FIG. 1 , taking the input device as a mouse as an example.
  • the user can use the mouse of the PC to perform mouse operations such as mouse click and mouse movement in the screen projection interface 102 .
  • after the PC receives the corresponding mouse operation, it can convert the coordinates at which the user performed the mouse operation in the screen-casting interface 102 into coordinates in the original interface (such as the desktop 101) projected by the mobile phone, according to the size ratio between the screen-casting interface 102 and the original interface.
  • the PC sends the converted coordinates and the operation type (such as move or click) to the mobile phone, so that the mobile phone can generate a corresponding touch event, apply it to the actual interface (such as the desktop 101), and project the operated interface onto the PC.
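  • As a rough illustration of this size-ratio conversion (the function and variable names are hypothetical and the numbers are only an example), the mapping could look like this:

```python
# Hypothetical sketch of the size-ratio coordinate conversion performed by the
# screen-casting destination (the PC) before forwarding an operation to the phone.

def projection_to_phone(px: float, py: float,
                        projection_size: tuple[int, int],
                        phone_size: tuple[int, int]) -> tuple[int, int]:
    """Convert a point inside the screen-casting interface into the coordinate
    system of the original interface displayed on the mobile phone."""
    proj_w, proj_h = projection_size   # e.g. size of screen projection interface 102
    phone_w, phone_h = phone_size      # e.g. size of the phone's desktop 101
    return round(px * phone_w / proj_w), round(py * phone_h / proj_h)

# Example: a click at (200, 450) in a 540x1140 projection window maps to
# (400, 900) on a 1080x2280 phone screen.
x, y = projection_to_phone(200, 450, (540, 1140), (1080, 2280))
```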
  • generally, when a cursor moves onto an operable control, the control and/or the cursor will give corresponding visual feedback, such as the control showing a highlighted background, or the cursor changing from the normal selection style to the text selection style.
  • however, after the user moves the cursor onto a control in the screen projection interface using the input device of the screen-casting destination (such as a PC), the screen-casting destination usually gives no such visual feedback: for example, the controls in the screen projection interface do not display a highlighted background, and the cursor style does not change accordingly.
  • for example, after the user moves the cursor 104 onto the icon 103 of application (APP) 1 in the screen projection interface 102 by operating the mouse of the PC, the icon 103 gives no visual feedback (for example, no highlighted background is displayed), and the cursor 104 keeps the normal selection style without changing. This is not user friendly, and the user cannot visually know whether the icon 103, which corresponds to the icon 105 displayed on the mobile phone, can accept the next operation.
  • an embodiment of the present application provides an interface display method, which can be applied to the scenario in which multiple terminals are used collaboratively and the screen-casting source end projects the interface displayed on its display screen onto the display screen of the screen-casting destination end.
  • after the user moves the cursor onto content in the screen-casting interface using the input device of the screen-casting destination, such as a mouse or a touchpad, the content and/or the cursor in the screen-casting interface gives corresponding visual feedback; for example, the content in the screen projection interface presents a highlighted background, and the cursor style changes accordingly.
  • in this way, the user can visually determine whether the content in the screen-casting interface, which corresponds to the content displayed on the screen-casting source end, can accept the next operation, which improves the user experience.
  • the cursor described in this embodiment may also be referred to as a mouse pointer.
  • the cursor can be an image, it can be dynamic or static, and the cursor can be styled differently in different situations.
  • the content in this embodiment may be an operable element displayed in the interface such as a control, or may be an inoperable element displayed in the interface.
  • An element can include one or more of the following: text, buttons, icons, etc.
  • FIG. 2 is a simplified schematic diagram of a system architecture to which the above method can be applied, provided by an embodiment of the present application.
  • the system architecture may at least include: a first terminal 201 and a second terminal 202 .
  • the first terminal 201 is connected to the input device 201-1 (as shown in FIG. 2 ), or includes the input device 201-1 (not shown in FIG. 2 ).
  • the input device 201-1 may be a mouse, a touchpad, or the like.
  • the input device 201-1 is a mouse as an example.
  • the first terminal 201 and the second terminal 202 may establish a connection in a wired or wireless manner. Based on the established connection, the first terminal 201 and the second terminal 202 may be used together in cooperation.
  • the wireless communication protocol adopted when the first terminal 201 and the second terminal 202 establish a connection wirelessly may be the wireless fidelity (Wi-Fi) protocol, the Bluetooth protocol, the ZigBee protocol, the near field communication (NFC) protocol, and so on, or may be any of various cellular network protocols, which is not specifically limited here.
  • whichever of the first terminal 201 and the second terminal 202 acts as the screen-casting source end can project the interface displayed on its display screen onto the display screen of the screen-casting destination end for display.
  • taking the first terminal 201 as the screen-casting destination end and the second terminal 202 as the screen-casting source end as an example, the second terminal 202 may project the interface displayed on its display screen onto the display screen of the first terminal 201 for display.
  • the user can operate the actual interface displayed in the second terminal 202 by operating on the screen projection interface displayed on the display screen of the first terminal 201 using the input device 201-1 of the first terminal 201.
  • in this embodiment, when the user operates on the screen projection interface displayed on the display screen of the first terminal 201, the user operates the input device 201-1 of the first terminal 201, such as a mouse or a touchpad, to move the cursor onto content in the screen projection interface.
  • the first terminal 201 can then make the controls and/or the cursor in the screen-casting interface give corresponding visual feedback; for example, a control in the screen projection interface presents a highlighted background, and the cursor style changes accordingly, so that the user can visually know whether the control in the screen projection interface, which corresponds to the control displayed on the second terminal 202, can accept the next operation.
  • the terminals in the embodiments of the present application may be mobile phones, tablet computers, handheld computers, PCs, cellular phones, personal digital assistants (PDAs), wearable devices (such as smart watches), in-vehicle computers, game consoles, augmented reality (AR)/virtual reality (VR) devices, and so on.
  • the first terminal 201 is a PC and the second terminal 202 is a mobile phone as an example in FIG. 2 .
  • the technical solutions provided in this embodiment can be applied to other electronic devices, such as smart home devices (eg, TV sets), in addition to the above-mentioned terminals (or mobile terminals).
  • the terminal is a mobile phone as an example.
  • FIG. 3 is a schematic structural diagram of a mobile phone according to an embodiment of the present application. The methods in the following embodiments can be implemented in a mobile phone having the above-mentioned hardware structure.
  • The mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and the like.
  • the mobile phone may further include a mobile communication module 150, a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone.
  • the cell phone may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can be the nerve center and command center of the phone.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be retrieved directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142 , it can also supply power to the mobile phone through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 can also receive the input of the battery 142 to supply power to the mobile phone.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in a cell phone can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the mobile phone.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide solutions for wireless communication applied on the mobile phone, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the mobile phone can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the gyroscope sensor 180B can be used to determine the motion attitude of the mobile phone.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile phone can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone in various directions (generally three axes).
  • Distance sensor 180F for measuring distance.
  • the mobile phone can use the proximity light sensor 180G to detect the user holding the mobile phone close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • The fingerprint sensor 180H is used to collect fingerprints. The mobile phone can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys, or may be touch keys.
  • Motor 191 can generate vibrating cues. The motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the mobile phone.
  • the mobile phone can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the mobile phone interacts with the network through the SIM card to realize functions such as calls and data communication.
  • In some embodiments, the handset employs an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone and cannot be separated from the mobile phone.
  • With reference to FIG. 2, the software structures of the first terminal 201 and the second terminal 202 are exemplarily described in this embodiment of the present application by taking the software system of the first terminal 201 being the Windows system and the software system of the second terminal 202 being the Android system as an example.
  • FIG. 4 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
  • the software architecture of the first terminal 201 may include: an application layer and a windows system (windows shell).
  • the application layer may include various applications installed on the first terminal 201 . Applications at the application layer can directly interact with the Windows system.
  • the application layer may further include a screen projection service module.
  • the software system of the second terminal 202 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Take the software system of the second terminal 202 as an example of a layered architecture.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the second terminal 202 may include an application layer and a framework layer (framework, FWK).
  • the application layer can include a series of application packages.
  • an application package can include settings, calculator, camera, SMS, music player, etc. applications.
  • the application included in the application layer may be a system application of the second terminal 202 or a third-party application, which is not specifically limited in this embodiment of the present application.
  • the application layer may also include a screen projection service module.
  • the framework layer is mainly responsible for providing an application programming interface (API) and a programming framework for applications in the application layer.
  • the second terminal 202 may also include other layers, such as a kernel layer (not shown in FIG. 4 ).
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain at least display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • the first terminal 201 is used as the screen projection destination end, and the second terminal 202 is used as the screen projection source end as an example.
  • After the second terminal 202 projects the interface displayed on its display screen onto the display screen of the first terminal 201 for display, if the user operates the input device 201-1 of the first terminal 201, such as a mouse or a touchpad, to move the cursor onto content in the screen projection interface, such as a control, then, in the case that the control displayed on the second terminal 202 corresponding to that control can be operated, the first terminal 201 can, based on the above software architecture and with the help of the keyboard and mouse sharing technology, cause the control and/or the cursor in the screen projection interface to give corresponding visual feedback; for example, the control presents a highlighted background, and the cursor style changes accordingly.
  • the keyboard and mouse sharing technology may refer to a technology of realizing control of other terminals by using an input device (such as a mouse, a touchpad) of one terminal.
  • Taking the first terminal 201 being a PC, the second terminal 202 being a mobile phone, and the input device 201-1 being a mouse as an example, the interface display method provided by the embodiments of the present application is described in detail below with reference to the accompanying drawings.
  • FIG. 5 is a schematic flowchart of an interface display method provided by an embodiment of the present application. As shown in FIG. 5, the method may include the following S501-S511.
  • The terminal serving as the screen projection source end can project the interface displayed on its display screen onto the display screen of the terminal serving as the screen projection destination end for display. For example, take the mobile phone as the screen projection source end and the PC as the screen projection destination end as an example.
  • the phone establishes a connection with the PC. After that, the phone can project the interface displayed on its display to the display of the PC.
  • the PC can display the screen projection interface on the display screen of the PC.
  • connection between the mobile phone and the PC can be established in a wired manner.
  • a wired connection can be established between a mobile phone and a PC through a data cable.
  • connection between the mobile phone and the PC can be established wirelessly.
  • the connection information may be a device identifier of the terminal, such as an internet protocol (internet protocol, IP) address, a port number, or an account logged in by the terminal, and the like.
  • the account logged in by the terminal may be an account provided by the operator for the user, such as a Huawei account.
  • the account logged in by the terminal can also be an application account, such as WeChat Account, Youku account, etc.
  • the transmission capability of the terminal may be near-field communication capability or long-distance communication capability. That is to say, the wireless communication protocol used to establish a connection between terminals, such as a mobile phone and a PC, may be a near field communication protocol such as a Wi-Fi protocol, a Bluetooth protocol, or an NFC protocol, or a cellular network protocol.
  • the user can touch the NFC tag of the PC with the mobile phone, and the mobile phone reads the connection information stored in the NFC tag, for example, the connection information includes the IP address of the PC.
  • the mobile phone can establish a connection with the PC using the NFC protocol according to the IP address of the PC.
  • both the mobile phone and the PC have the Bluetooth function and the Wi-Fi function turned on.
  • the PC can broadcast a Bluetooth signal to discover surrounding terminals.
  • the PC can display a list of discovered devices, and the list of discovered devices can include the identifiers of the mobile phones discovered by the PC.
  • the PC can also exchange connection information, such as IP addresses, with the discovered devices.
  • the PC can establish a connection with the mobile phone by using the Wi-Fi protocol according to the IP address of the mobile phone.
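As an illustration of the IP-based case above, the following is a minimal sketch of how one terminal might open a transport connection to a peer once the peer's IP address has been exchanged. The use of plain TCP and the port number PROJECTION_PORT are assumptions made only for this example; the embodiment does not specify a transport or port.

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Hypothetical port; the embodiment does not define one. */
#define PROJECTION_PORT 7236

/* Connect to the peer whose IP address was obtained, e.g., read from an NFC tag. */
int connect_to_peer(const char *peer_ip)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) {
        perror("socket");
        return -1;
    }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(PROJECTION_PORT);
    if (inet_pton(AF_INET, peer_ip, &addr.sin_addr) != 1) {
        close(fd);
        return -1;
    }

    if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) {
        perror("connect");
        close(fd);
        return -1;
    }
    return fd; /* The caller exchanges screen projection data over this socket. */
}
```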
  • both the mobile phone and the PC are connected to the cellular network, and the mobile phone and the PC are logged into the same Huawei account. The mobile phone and the PC can establish a connection based on the cellular network according to the Huawei account.
  • a connection between a mobile phone and a PC is established wirelessly as an example.
  • the user can manually enable the screen projection service function of the PC.
  • the screen mirroring service function of the PC can also be automatically enabled, for example, it is automatically enabled when the PC is powered on.
  • the screen-casting service module of the PC application layer can start to monitor the network to monitor whether a terminal is connected to the PC.
  • the user can turn on the NFC switch of the mobile phone and touch the NFC tag of the PC with the mobile phone.
  • the mobile phone can read the IP address of the PC stored in the NFC tag.
  • the mobile phone and the PC will display a confirmation interface respectively to ask the user whether to confirm the projection of the mobile phone display interface to the PC for display.
  • the PC such as the screen-casting service module of the PC, can send a message notifying the screen-casting to the mobile phone (eg, the screen-casting service module of the mobile phone).
  • After the mobile phone receives the message notifying the screen casting, it can establish a connection with the PC according to the obtained IP address of the PC.
  • the mobile phone serving as the screencasting source can project the interface displayed on the mobile phone display to the display screen of the PC serving as the screencasting destination.
  • the PC displays the screen projection interface.
  • the content displayed in the screen projection interface is the same as the content of the interface (eg, the first interface) displayed on the display screen of the mobile phone, or the content in the screen projection interface is a mirror image of the content of the interface displayed on the display screen of the mobile phone.
  • a setting interface 601 is currently displayed on the display screen of the mobile phone.
  • the mobile phone can project the setting interface 601 on the display screen of the PC.
  • the PC displays the screen projection interface 602 . It can be seen that the content in the screen projection interface 602 is the same as the content in the setting interface 601 .
  • the window used by the PC to display the screen-casting interface may be called a screen-casting window.
  • the screen projection service module of the PC application layer can display the screen projection window.
  • the screen-casting service module of the PC can display the screen-casting window after the screen-casting service function of the PC is enabled, or after the screen-casting service function of the PC is enabled and the connection with other terminals (such as the above mobile phones) is successfully established.
  • the PC can display the screencast window on its entire display screen, that is, the screencast window occupies the entire display screen of the PC.
  • the PC may also display a screen projection window on a part of its display screen, that is, the screen projection interface in the screen projection window is only a part of the interface on the PC display screen, which is not specifically limited in this embodiment.
  • A specific implementation of the mobile phone projecting the interface displayed on the mobile phone display screen onto the display screen of the PC can be as follows: the mobile phone, for example, the screen projection service module of the mobile phone, can obtain the data corresponding to the current display interface of the mobile phone and send it to the PC. After the PC receives the data, it can display the screen-casting interface in the screen-casting window on the PC display screen according to the data. For example, the screen projection service module of the mobile phone can obtain the data corresponding to the current display interface of the mobile phone, such as screen recording data, through the display manager of the mobile phone (for example, the display manager is a module of the framework layer of the mobile phone) and send it to the PC, so that projection of the mobile phone display interface onto the PC display screen is realized.
  • In some embodiments, a distributed multimedia protocol (Distributed Multi-media Protocol, DMP) can be used to realize the projection display of the display interface of the mobile phone on the display screen of the PC.
  • the screen casting service module of the mobile phone can use the display manager (DisplayManager) of the mobile phone to create a virtual display (VirtualDisplay).
  • the screencasting service module of the mobile phone sends a request to create a VirtualDisplay to the display manager of the mobile phone.
  • the display manager of the mobile phone completes the creation of the VirtualDisplay, it can return the created VirtualDisplay to the screencasting service module of the mobile phone.
  • the screen projection service module of the mobile phone can move the drawing of the interface displayed on the display screen of the mobile phone to the VirtualDisplay.
  • the screen casting service module of the mobile phone can obtain the screen recording data.
  • the screen-recording data can be encoded and sent to the PC.
  • the screen projection service module of the PC can receive the corresponding data, and after decoding the data, the screen recording data can be obtained.
  • the screen projection service module of the PC cooperates with the frame layer of the PC to draw the corresponding interface according to the screen recording data and display it in the screen projection window.
  • the frame layer of the PC can provide a surfaceview to realize the projection display of the screen projection interface on the PC side.
  • In some other embodiments, wireless projection can also be used to realize the projection display of the display interface of the mobile phone on the display screen of the PC; that is, the mobile phone can obtain all layers of its display interface, integrate all the obtained layers into a video stream (or called screen recording data), encode it, and send it to the PC through the real time streaming protocol (RTSP).
  • the PC can decode and play it, so as to realize the projection display of the display interface of the mobile phone on the PC display screen.
  • In some other embodiments, the mobile phone can extract instructions from the mobile phone display interface to obtain an instruction stream, obtain layer information of the mobile phone display interface, and the like, and then send the instruction stream and the layer information to the PC, so that the PC can restore the interface displayed on the mobile phone display screen, thereby realizing the projection display of the mobile phone display interface on the PC.
  • the mobile phone creates a virtual input device.
  • The mobile phone serving as the screencasting source end can also create a virtual input device, which is used to simulate a corresponding input event on the mobile phone when the user operates the screencasting interface displayed on the PC by using the input device (such as a mouse) of the screencasting destination end, such as the PC.
  • the mobile phone can realize the control of the mobile phone by the PC input device by correspondingly responding to the simulated input event. That is to say, by using the input device of the screen projection destination, the user can not only control the projection destination, but also control the projection source, so as to realize keyboard and mouse sharing between the projection destination and the projection source.
  • When the keyboard and mouse sharing mode of the PC is enabled, keyboard and mouse sharing between the PC and the mobile phone can be realized; that is, the user can use the input device of the PC to control both the PC and the mobile phone.
  • the PC may display a pop-up window. This pop-up window is used to ask the user whether to enable the keyboard and mouse sharing mode. If an operation of selecting to enable the keyboard and mouse sharing mode is received from the user, the PC can enable the keyboard and mouse sharing mode.
  • After the PC has enabled the keyboard and mouse sharing mode, it can notify all the terminals that have established a connection with it, or the terminals that have established a connection with it and projected an interface to the PC, that the keyboard and mouse sharing mode has been turned on. If a connection is established between the PC and the mobile phone, and the mobile phone projects an interface to the PC, the PC notifies the mobile phone that the keyboard and mouse sharing mode is enabled. For example, the PC may send a notification message to the mobile phone, and the notification message may be used to indicate that the keyboard and mouse sharing mode of the PC is turned on. After receiving the notification, the mobile phone can create a virtual input device, which has the same function as conventional input devices such as a mouse and a touchpad, and can be used by the mobile phone to simulate corresponding input events.
  • The virtual input device created by the mobile phone has the same function as a conventional mouse. It can be regarded as a mouse shared by the PC with the mobile phone and can be used to simulate mouse events on the mobile phone, so as to realize the control of the mobile phone by the mouse of the PC.
  • the operating system of the mobile phone is the Android system.
  • The mobile phone can use the uinput capability of Linux to create the virtual input device, where uinput is a kernel layer module that can simulate input devices. By writing to the /dev/uinput (or /dev/input/uinput) device, a process can create a virtual input device with a specific function.
  • the virtual input device can simulate corresponding events.
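For reference, the following is a minimal sketch of the uinput mechanism described above, assuming a Linux environment with write access to /dev/uinput. The device name and the injected movement values are illustrative only and are not specified by the embodiment.

```c
#include <fcntl.h>
#include <linux/uinput.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* Write one input event to the virtual device. */
static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    /* Declare the capabilities of the virtual mouse: a button and relative axes. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);

    struct uinput_setup usetup;
    memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_VIRTUAL;
    strcpy(usetup.name, "virtual-shared-mouse"); /* illustrative name */
    ioctl(fd, UI_DEV_SETUP, &usetup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1); /* give user space a moment to pick up the new device */

    /* Simulate a mouse movement event: 5 units right, 5 units down. */
    emit(fd, EV_REL, REL_X, 5);
    emit(fd, EV_REL, REL_Y, 5);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}
```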
  • other terminals that have established a connection with the PC will also create virtual input devices according to the notification received.
  • If the operating system of the terminal receiving the notification is the Android system, the uinput capability of Linux can be used to create the virtual input device, or the human interface device (HID) protocol can be used to realize the creation of the virtual input device. If the operating system of the terminal receiving the notification is another operating system such as the iOS system or the Windows system, the HID protocol can be used to realize the creation of the virtual input device.
  • a pop-up window may also be displayed to ask the user whether he wants to use the input device of the PC to control the device. If it is received that the user chooses to use the input device of the PC to control the device, then create a virtual input device, otherwise, no virtual input device is created.
  • the PC after a connection is established between other terminals, such as a mobile phone and a PC, the PC automatically enables the keyboard and mouse sharing mode without requiring the user to manually enable it.
  • the virtual input device can also be created automatically without the need for the PC to send a notification.
  • a pop-up window may be displayed to ask the user whether he wants to use the input device of the PC to control the device. If it is received that the user chooses to use the input device of the PC to control the device, the virtual input device is automatically created, otherwise the virtual input device is not created.
  • In some other embodiments, after the terminal serving as the screencasting source end projects an interface to the PC serving as the screencasting destination end, the PC automatically turns on the keyboard and mouse sharing mode without requiring the user to manually turn it on.
  • the terminal that is the source of the screen projection can also create a virtual input device after projecting the interface to the PC, or after receiving a message from the PC informing the screen projection.
  • Generally, after the mouse, that is, the input device of the PC, is shared with other terminals such as the mobile phone, the PC temporarily still responds to the operation of the mouse; in other words, the user can temporarily control the PC by using the mouse.
  • the PC can trigger other terminals that have established a connection with the PC and created a virtual input device, such as a mobile phone, to respond to the operation of the mouse after determining that the mouse shuttle condition is satisfied. That is, the keyboard and mouse sharing between the PC and the mobile phone is triggered.
  • The mouse shuttle condition may be that the cursor displayed on the PC display screen, such as the cursor 1, slides into the screen projection interface displayed on the PC display screen.
  • the user can move the mouse so that the cursor 1 displayed on the PC display slides into the screen projection interface displayed on the PC display to trigger keyboard and mouse sharing between the PC and the mobile phone.
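The mouse shuttle condition itself is just a containment test. A minimal sketch, with hypothetical names since the embodiment does not define code for this, could look as follows, where the rectangle describes the screen projection window in the display coordinate system of the PC.

```c
#include <stdbool.h>

/* Rectangle occupied by the screen projection window in the PC display coordinate system. */
struct rect {
    int left;
    int top;
    int right;
    int bottom;
};

/* Returns true when the mouse shuttle condition is satisfied, i.e., the cursor
 * position lies inside the screen projection window. */
bool mouse_shuttle_condition_met(int cursor_x, int cursor_y, struct rect win)
{
    return cursor_x >= win.left && cursor_x < win.right &&
           cursor_y >= win.top  && cursor_y < win.bottom;
}
```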
  • the method further includes the following S503-S504.
  • the PC receives the mouse movement event, and displays the animation of the movement of the cursor 1 on the display screen of the PC according to the mouse movement event.
  • the cursor 1 may be the first cursor in this embodiment of the present application.
  • the PC sends first instruction information to the mobile phone, which is used to instruct the keyboard and mouse sharing to start.
  • When the user wants to operate on the screen-casting interface of the screen-casting destination end to realize an operation on the actual interface of the screen-casting source end, the user can operate the input device of the screen-casting destination end, for example, input a first operation, so that the cursor displayed on the screen-casting destination end moves into the screen-casting interface.
  • the screen projection destination end and the screen projection source end can start to share the keyboard and mouse.
  • the user can move the mouse of the PC so that the cursor 1 moves on the display screen of the PC.
  • the PC can determine whether the cursor 1 enters the screen projection interface displayed on the display screen of the PC. For example, as described above, the screen-casting interface is displayed in the screen-casting window of the PC, and the screen-casting window can be used to monitor whether the cursor 1 enters the screen-casting interface.
  • When the cursor 1 enters the screen-casting interface, the screen-casting window can detect a corresponding event, the event is used to indicate that the cursor 1 has entered the screen-casting window, and the PC can determine that the cursor 1 enters the screen-casting interface according to the event.
  • the PC can determine that the mouse shuttle condition is satisfied, and then the PC can start to share the keyboard and mouse with the mobile phone.
  • the PC may also send the above-mentioned first instruction information to the mobile phone to indicate to the mobile phone that the keyboard and mouse sharing starts.
  • When the user uses the input device of the PC to input a movement operation, for example, during the process of the user moving the mouse, the PC can receive a corresponding input event, such as a movement event, and the movement event may be called a mouse movement event.
  • the Windows system of the PC can draw an animation of the movement of the cursor 1 and display it on the display screen of the PC.
  • As shown in FIG. 7, along with the movement of the mouse 701, the PC correspondingly displays an animation of the movement of the cursor 703 on the display screen 702 of the PC.
  • the screen projection service module of the PC application layer can determine whether the cursor 703 enters the screen projection interface 705 through the screen projection window. For example, when the cursor 703 enters the screen-casting interface 705, the screen-casting window can detect an event for instructing the cursor 703 to enter the screen-casting window. After detecting the event, the screen-casting window can send a notification to the screen-casting service module of the PC application layer to notify the screen-casting service module of the PC that the cursor has entered the screen-casting interface 705 .
  • the screen projection service module of the PC determines that the cursor 703 enters the screen projection interface, it can be determined that the mouse shuttle condition is satisfied. After that, the PC and the mobile phone can start to share the keyboard and mouse.
  • the screen-casting service module of the PC may also send instruction information for instructing the start of keyboard-mouse sharing to the mobile phone (eg, the screen-casting service module of the mobile phone). After receiving the instruction information, the mobile phone can prepare for receiving input events from the PC, such as mouse events.
  • The above example is described by taking the case where communication between the PC and the mobile phone is implemented through the screen projection service module included in both as an example; that is to say, the screen projection service module has the function of communicating with other terminals.
  • the screen projection service module may not have the function of communicating with other terminals, and in this embodiment, the communication between the PC and the mobile phone may be implemented by other modules.
  • the PC and the mobile phone may further include a transmission management module, and communication between the two may be implemented through this module, which is not specifically limited in this embodiment.
  • the communication between the PC and the mobile phone is implemented by the screen projection service module as an example for description.
  • In order that the screen-casting source end can accurately locate the content, such as a control, on the screen-casting source end corresponding to the position of the user's operation on the screen-casting interface, in this implementation a cursor, such as a cursor 2, can be displayed on the screen projection source end, and the cursor 2 can be moved according to the user's operation on the input device of the screen projection destination end.
  • the cursor 2 may be the second cursor in this embodiment of the present application.
  • the method further includes the following S505-S508.
  • the PC sends the initial coordinate position of the cursor 1 when the cursor 1 enters the screen projection interface to the mobile phone.
  • The above initial coordinate position is the coordinate position, when the cursor 1 enters the screen projection interface, of the entry point relative to the origin of the screen projection interface (or the screen projection window), where the origin may be a corner of the screen projection interface (for example, called the first corner), such as the origin O1 shown in FIG. 7.
  • the mobile phone displays the cursor 2 on the display screen of the mobile phone according to the initial coordinate position.
  • the cursor 2 is an invisible cursor, and its transparency is greater than a threshold value, for example, the transparency of the cursor 2 is very high, or completely transparent.
  • The PC serving as the screen projection destination end can obtain the coordinate position of the entry point relative to the origin of the screen projection interface when the cursor 1 enters the screen projection interface (that is, obtain the above-mentioned initial coordinate position), and send the initial coordinate position to the mobile phone serving as the screen projection source end.
  • the PC can obtain the coordinate position of the entry point in the display coordinate system of the PC when the cursor 1 enters the screen projection interface, such as coordinate position 1 .
  • the PC can determine the above-mentioned initial coordinate position according to the coordinate position 1 and the coordinate position of the upper left corner of the screen projection interface in the display coordinate system of the PC, such as coordinate position 2 .
  • The display coordinate system of the PC may be a coordinate system that takes the upper left corner of the PC display screen as the coordinate origin (the position O2 shown in FIG. 8), with the X axis pointing from the coordinate origin O2 to the right edge of the PC display screen and the Y axis pointing from the coordinate origin O2 to the lower edge of the PC display screen.
  • The entry point at which the cursor 1 enters the screen projection interface is shown as the entry point 706 in FIG. 7.
  • the PC can obtain the coordinate position 1 of the entry point 706 in the display coordinate system of the PC.
  • the obtained coordinate position 1 is (a, b).
  • The position, in the display coordinate system of the PC, of the screen-casting window used to display the screen-casting interface is known, and the position of the screen-casting window is the position of the screen-casting interface; that is, the coordinate position of the upper left corner of the screen-casting interface in the display coordinate system of the PC is known, for example, the coordinate position 2 is (a, c).
  • the PC can determine the coordinate position of the entry point 706 relative to the coordinate origin O1, that is, determine the initial coordinate position.
  • the initial coordinate position as determined by PC is (0, b-c).
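To make the arithmetic above concrete, the subtraction performed by the PC can be sketched as follows; the names are illustrative, and the embodiment describes the computation only in prose. With the entry point at (a, b) and the upper left corner of the screen projection interface at (a, c), the function returns (0, b-c).

```c
/* A 2D point in some coordinate system. */
struct point {
    int x;
    int y;
};

/* cursor_in_display: coordinate position 1, i.e., the entry point in the display
 *                    coordinate system of the PC, e.g., (a, b).
 * window_origin:     coordinate position 2, i.e., the upper left corner of the
 *                    screen projection interface in the same coordinate system,
 *                    e.g., (a, c).
 * Returns the initial coordinate position relative to the origin O1. */
struct point initial_coordinate_position(struct point cursor_in_display,
                                         struct point window_origin)
{
    struct point p;
    p.x = cursor_in_display.x - window_origin.x;
    p.y = cursor_in_display.y - window_origin.y;
    return p;
}
```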
  • the initial position where the cursor 2 appears on the mobile phone can be determined according to the initial coordinate position.
  • the phone can display cursor 2 at this starting position.
  • the window for displaying the screen projection interface is the screen projection window, and the size of the screen projection interface is determined by the size of the screen projection window, for example, the size of the screen projection interface is the same as the size of the screen projection window.
  • The size of the screen projection window may be predefined, and may be the same as or different from the resolution of the mobile phone. For example, if the size of the screen projection window is different from the resolution of the mobile phone, the content of the screen projection interface in the screen projection window is the same as that of the interface projected by the mobile phone, but the screen projection interface is an interface obtained after stretching and/or compressing the interface projected by the mobile phone.
  • In order to make the starting position where the cursor 2 is displayed on the mobile phone consistent with the position where the cursor 1 enters the screen projection interface, the mobile phone can convert the initial coordinate position into the starting position where the cursor 2 appears on the mobile phone according to the resolution of the mobile phone and the size of the screen projection interface. That is to say, after the mobile phone receives the initial coordinate position from the PC, the starting position where the cursor 2 appears on the mobile phone can be determined according to the resolution of the mobile phone, the size of the screen projection interface (or the size of the screen projection window), and the initial coordinate position.
  • the starting position is the coordinate position of the cursor 2 relative to the origin of the display screen of the mobile phone (the origin may be a corner (eg, the first corner) of the display screen of the mobile phone).
  • the size of the screen projection window may be sent from the PC to the mobile phone during the process of establishing a connection between the PC and the mobile phone, or after the connection is successfully established.
  • For example, take the following as an example: the point at which the cursor 1 enters the screen projection interface is point 1, the position at which the corresponding cursor 2 should be displayed on the display screen of the mobile phone is the position indicated by point 2, the size of the screen projection window is A1*B1, the resolution of the mobile phone is A2*B2, the coordinates of point 1 in coordinate system 1 are (x1, y1), and the coordinates of point 2 in coordinate system 2 are (x2, y2).
  • The coordinate system 1 takes the upper left corner of the screen projection window as the coordinate origin (O1 in FIG. 9), with the X axis pointing from the coordinate origin O1 to the right edge of the screen projection window and the Y axis pointing from the coordinate origin O1 to the lower edge of the screen projection window. The coordinate system 2 takes the upper left corner of the mobile phone display screen as the coordinate origin (O2 in FIG. 9), with the X axis pointing from the coordinate origin O2 to the right edge of the mobile phone display screen and the Y axis pointing from the coordinate origin O2 to the lower edge of the mobile phone display screen.
  • According to the above example, the conversion relationship 1 may be x2 = (A2/A1)*x1, and the conversion relationship 2 may be y2 = (B2/B1)*y1, where A2/A1 may be called the conversion ratio value 1 and B2/B1 may be called the conversion ratio value 2. For example, if A2 is twice A1 and B2 is twice B1, the conversion ratio value 1 is equal to 2 and the conversion ratio value 2 is equal to 2.
  • In a possible implementation manner, after receiving the initial coordinate position from the PC, the mobile phone can determine the starting position of the cursor 2 on the mobile phone according to the above conversion relationship (eg, conversion relationship 1 and/or conversion relationship 2). In another possible implementation manner, the mobile phone may pre-determine the above-mentioned conversion ratio value 1 and conversion ratio value 2; after receiving the initial coordinate position, the starting position where the cursor 2 appears on the mobile phone can be determined according to the pre-determined conversion ratio value 1 and/or conversion ratio value 2. For example, with reference to the example shown in FIG. 8, if the initial coordinate position is (0, b-c), then the starting position of the cursor 2 on the mobile phone determined by the mobile phone is (0, (B2/B1)*(b-c)).
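The conversion from the initial coordinate position in the screen projection window to the starting position of the cursor 2 on the mobile phone can be sketched as follows. The function name and the floating-point scaling are assumptions made for illustration; the ratios A2/A1 and B2/B1 follow the conversion relationships given above.

```c
/* A 2D point in some coordinate system. */
struct point {
    int x;
    int y;
};

/* window_w, window_h: size of the screen projection window, A1 and B1.
 * phone_w, phone_h:   resolution of the mobile phone, A2 and B2.
 * initial:            initial coordinate position relative to O1, e.g., (0, b-c).
 * Returns the starting position of the cursor 2 relative to the origin of the
 * mobile phone display screen, e.g., (0, (B2/B1)*(b-c)). */
struct point to_phone_start_position(struct point initial,
                                     int window_w, int window_h,
                                     int phone_w, int phone_h)
{
    struct point start;
    start.x = (int)((double)phone_w / window_w * initial.x);
    start.y = (int)((double)phone_h / window_h * initial.y);
    return start;
}
```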
  • the mobile phone can display the cursor 2 on the display screen of the mobile phone according to the determined starting position. As shown in conjunction with FIG. 7 , the cursor 2 displayed by the mobile phone on the display screen of the mobile phone is shown as 707 in FIG. 7 . It can be seen that the starting position of cursor 2 displayed on the mobile phone is consistent with the position of the entry point of cursor 1 on the PC when it enters the screen projection interface.
  • the cursor 2 displayed on the mobile phone may be an invisible cursor whose transparency is greater than a threshold. For example, if the transparency of the cursor 2 is very high or completely transparent, the cursor 2 can also be said to be invisible to the user.
  • the cursor 2 may not be an invisible cursor and is visible to the user, which is not limited in this embodiment. For convenience of description, in the drawings of the embodiments of the present application, it is shown as an example that the cursor 2 is visible to the user.
  • the above embodiment is described by taking the initial coordinate position obtained by the projection destination terminal and sent to the projection source terminal as an example.
  • In some other embodiments, the screen projection destination end can also determine, according to the initial coordinate position, the starting position where the cursor 2 appears on the screen projection source end, and then send the starting position to the screen projection source end, so that the screen projection source end displays the cursor 2 at that starting position.
  • the specific determination process is the same as the determination process in which the screen projection source determines the starting position where the cursor 2 appears, and will not be described in detail here.
  • the device resolution of the screen-casting source end can be sent to the screen-casting destination during the process of establishing a connection with the screen-casting destination, or after the connection is successfully established.
  • the mobile phone can directly display the cursor 2 on the mobile phone according to the initial coordinate position without conversion processing.
  • the PC sends the mouse operation parameter 1 included in the mouse movement event to the mobile phone.
  • the mobile phone receives the mouse operation parameter 1, and simulates a mouse movement event according to the mouse operation parameter 1.
  • the mobile phone displays an animation of the movement of the cursor 2 on the mobile phone display screen according to the mouse movement event.
  • the mouse movement event may be the first input event in this embodiment of the present application.
  • the mouse operation parameter 1 may be the first operation parameter in this embodiment of the present application.
  • The user may continue to operate the input device of the screen-casting destination end so that the cursor 1 moves to the desired position in the screen-casting interface. Since the cursor 1 has entered the screen projection interface, keyboard and mouse sharing has already started. After the keyboard and mouse sharing starts, the screen projection destination end may not respond to the input events received after the user operates the input device, but instead sends the operation parameters in the input events to the screen projection source end that shares the keyboard and mouse, so that the screen projection source end responds to the input events.
  • the input event may include a mouse move event, a mouse press event, a mouse lift event, and the like.
  • In this embodiment, the screen projection interface projected by the mobile phone onto the PC does not include a cursor, and the cursor 1 is displayed by the PC. Therefore, when the user moves the mouse, the cursor, such as the cursor 1, can move with the movement of the mouse.
  • The screen casting destination end not responding to the input events received after the user operates the input device specifically means that the screen casting destination end does not respond to mouse events other than mouse movement events, such as mouse press events and mouse lift events, but does respond to mouse movement events, so that when the user moves the mouse, the cursor 1 moves with it on the PC display screen.
  • As an example, the screen projection destination end, such as the PC, can mount a HOOK, and the mounted HOOK can be used to intercept (or block) input events other than mouse movement events after the keyboard and mouse sharing starts.
  • the mounted HOOK can also be used to obtain (or capture) the operation parameters contained in the corresponding input events (including mouse movement events and other input events) after the keyboard and mouse sharing starts.
  • For example, if the input device is a mouse, the input event may be a mouse event. That is to say, after the cursor enters the screen projection interface, keyboard and mouse sharing starts, and thereafter the PC can use the mounted HOOK to intercept input events other than mouse movement events.
  • The PC can also use the mounted HOOK to capture the operation parameters in the received mouse events, such as mouse operation parameters, and send the captured operation parameters to the screen projection source end that created the virtual input device, so that the screen projection source end can use the created virtual input device to simulate corresponding input events, such as mouse events, and then respond to them.
  • In this way, not only can the screen projection destination end respond to operations input by the input device, but the screen projection source end can also respond to operations input by the input device. For input events other than mouse movement events, since the mounted HOOK intercepts them, the screen projection destination end does not respond to them; instead, the screen projection source end responds to the operation input by the input device according to the operation parameters sent by the screen projection destination end.
  • The above-mentioned mouse operation parameters may include: a mouse button flag bit (used to indicate which operation the user performed: pressing, lifting, moving, or scrolling the wheel of the mouse), coordinate information (used to indicate the X coordinate and the Y coordinate by which the cursor moves when the user moves the mouse), wheel information (used to indicate the X-axis distance and the Y-axis distance by which the wheel scrolls when the user operates the mouse wheel), and key information (used to indicate which of the left, middle, or right key the user operated), as sketched below.
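When the captured mouse operation parameters are sent from the screen projection destination end to the screen projection source end, they could be carried in a small structure such as the following sketch. The field names, types, and enumeration values are illustrative assumptions rather than a format defined by the embodiment.

```c
#include <stdint.h>

/* Which mouse action the flag bit describes. */
enum mouse_action {
    MOUSE_MOVE   = 0,
    MOUSE_PRESS  = 1,
    MOUSE_LIFT   = 2,
    MOUSE_SCROLL = 3
};

/* One set of mouse operation parameters captured by the HOOK. */
struct mouse_operation_param {
    uint8_t button_flag; /* mouse button flag bit: press, lift, move, or scroll      */
    int32_t rel_x;       /* coordinate information: X displacement of the cursor     */
    int32_t rel_y;       /* coordinate information: Y displacement of the cursor     */
    int32_t wheel_x;     /* wheel information: X-axis scroll distance (may be empty) */
    int32_t wheel_y;     /* wheel information: Y-axis scroll distance (may be empty) */
    uint8_t key;         /* key information: left, middle, or right key              */
};
```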
  • The mounted HOOK can determine whether an input event is a mouse movement event according to the mouse button flag bit in the above mouse operation parameters; if it is a mouse movement event, it is not intercepted, and if it is not a mouse movement event, it is intercepted.
  • the interception of input events and the capture of operation parameters therein can also be implemented in other ways (such as registering RAWINPUT in the PC).
  • The interception of input events and the capture of the operation parameters therein can also be implemented in different ways. For example, taking the mouse as the input device, after the keyboard and mouse sharing mode is enabled, the PC can both mount the HOOK and register RAWINPUT. After the keyboard and mouse sharing starts, the mounted HOOK can be used to intercept mouse events other than mouse movement events, and the registered RAWINPUT can be used to capture the parameters in the mouse events.
  • This embodiment does not limit the specific implementation of the interception of mouse events and the capture of parameters therein.
  • the implementation of the interception of input events and the capture of operation parameters therein by mounting a HOOK is used as an example for introduction.
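For reference, a low-level mouse HOOK of the kind described above can be sketched on Windows as follows. The forwarding function forward_to_source is a hypothetical placeholder for sending the captured operation parameters to the screen projection source end; the sketch only illustrates letting mouse movement events through while blocking other mouse events after the keyboard and mouse sharing starts.

```c
#include <windows.h>
#include <stdbool.h>

static HHOOK g_hook;
static bool g_sharing_started = true; /* in the embodiment, set once cursor 1 enters the projection interface */

/* Hypothetical placeholder: serialize the parameters and send them to the
 * screen projection source end over the established connection. */
static void forward_to_source(WPARAM action, const MSLLHOOKSTRUCT *info)
{
    (void)action;
    (void)info;
}

static LRESULT CALLBACK mouse_proc(int code, WPARAM wparam, LPARAM lparam)
{
    if (code == HC_ACTION && g_sharing_started) {
        const MSLLHOOKSTRUCT *info = (const MSLLHOOKSTRUCT *)lparam;
        forward_to_source(wparam, info);   /* capture the operation parameters */
        if (wparam != WM_MOUSEMOVE)
            return 1;                      /* intercept: block non-movement events locally */
    }
    return CallNextHookEx(g_hook, code, wparam, lparam); /* let movement events through */
}

int main(void)
{
    g_hook = SetWindowsHookEx(WH_MOUSE_LL, mouse_proc, GetModuleHandle(NULL), 0);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0)) { /* a message loop is needed for the hook callbacks */
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWindowsHookEx(g_hook);
    return 0;
}
```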
  • Take the case where the input device is a mouse and the user wants to open the Bluetooth setting interface of the mobile phone as an example.
  • the user moves the mouse of the PC to make the cursor 1 enter the screen projection interface 1001
  • the user continues to move the mouse to move the cursor 1 to the position of the Bluetooth option 1002 on the screen projection interface 1001 .
  • the keyboard and mouse sharing has already started. After keyboard and mouse sharing starts, the mounted HOOK can intercept other mouse events except mouse movement events.
  • the PC can receive a corresponding input event, such as a mouse movement event, and the mouse movement event will not be intercepted by the mounted HOOK, and the event will be transmitted to the Windows system of the PC.
  • the Windows system of the PC can continue to draw the animation of the movement of the cursor 1 and display it on the display screen of the PC.
  • the PC displays the animation of the movement of the cursor 1004 on the display screen of the PC.
  • the movement track of the cursor 1004 is shown as the track 1005.
  • the cursor 1004 moves to the position of the Bluetooth option 1002 on the screen projection interface 1001 .
  • the cursor 1004 is displayed on an element of the screen-casting interface, and the element can be the first content in this embodiment. It can be seen that, on the first content, the cursor style of the cursor 1004 is style 1 (the style 1 may be the first style in this embodiment), that is, the normal selection style.
  • the mounted HOOK can capture the operation parameters in the input event. Therefore, when the user continues to move the mouse, the PC, such as the screen-casting service module of the PC application layer, can use the mounted HOOK to capture the operation parameters in the received mouse movement event, such as the mouse operation parameter 1, and send the mouse operation parameter 1 to the mobile phone at the screen projection source end.
  • the mouse operation parameter 1 may be: a mouse button flag used to indicate that the user has moved the mouse, coordinate information used to indicate the X coordinate and the Y coordinate of the movement of the cursor (eg, the cursor 1), wheel information (the value is empty), and key information (the value is empty).
  • the coordinate information is the relative displacement of the cursor 1, during the movement of the mouse, compared with the position at which the cursor 1 entered the screen projection interface.
  • after the mobile phone receives the mouse operation parameter 1, it can use the created virtual input device to simulate the corresponding input event, such as a mouse movement event, according to the mouse operation parameter 1, so that the cursor 2 on the mobile phone is moved correspondingly on the actual interface displayed by the mobile phone.
  • the size of the screen projection window may be different from the resolution of the mobile phone. Therefore, after the user moves the mouse of the PC, in order for the cursor 2 to also move to the position of the Bluetooth option in the actual interface, the mobile phone may convert the coordinate information in the mouse operation parameter 1 according to the resolution of the mobile phone and the size of the screen projection interface, so as to obtain the relative displacement of the cursor 2 on the mobile phone compared with its starting position.
  • the size of the screen projection window is A1*B1 and the resolution of the mobile phone is A2*B2.
  • the relative displacement of cursor 1 in coordinate system 1 compared to the entry point is (X3, Y3)
  • the relative displacement of cursor 2 in coordinate system 2 compared to the starting position is (X4, Y4) as an example.
  • the mobile phone can determine, according to the size of the screen projection window and the resolution of the mobile phone, the conversion relationship between the relative displacement of the cursor 1 in coordinate system 1 compared with the entry point and the relative displacement of the cursor 2 in coordinate system 2 compared with the starting position after the mouse moves.
  • the conversion ratio value 1 is equal to 2
  • the conversion ratio value 2 is equal to 2
  • the distances that the cursor 1 moves along the X axis and the Y axis are doubled when converted to the mobile phone.
  • the relative displacement (X3, Y3) of the cursor 1 in the coordinate system 1 compared to the entry point can be determined according to the coordinate information in the mouse operation parameter 1.
  • after receiving the mouse operation parameter 1 from the PC, the mobile phone (such as the screen projection service module of the mobile phone application layer) can determine the relative displacement of the cursor 2 on the mobile phone compared with the starting position according to the coordinate information in the mouse operation parameter 1 and the above conversion relationship (such as conversion relationship 3 and/or conversion relationship 4).
  • the mobile phone can pre-determine the above-mentioned conversion ratio value 1 and conversion ratio value 2.
  • the mobile phone can perform conversion on the coordinate information in the mouse operation parameter 1 using the pre-determined conversion ratio value 1 and/or conversion ratio value 2 to determine the relative displacement of the cursor 2 on the mobile phone compared with the starting position.
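  • The conversion described above can be expressed with two ratios between the size of the screen projection window (A1*B1) and the resolution of the mobile phone (A2*B2). The sketch below assumes that conversion ratio value 1 is A2/A1 for the X axis and conversion ratio value 2 is B2/B1 for the Y axis, which matches the "ratio equal to 2" example; the exact formulation is not fixed by the text.

```python
def conversion_ratios(window_size, phone_resolution):
    """Conversion ratio value 1 (X axis) and value 2 (Y axis), assumed to be A2/A1 and B2/B1."""
    (a1, b1), (a2, b2) = window_size, phone_resolution
    return a2 / a1, b2 / b1

def convert_displacement(rel_displacement, window_size, phone_resolution):
    """Convert cursor 1's relative displacement (X3, Y3) in the screen projection window
    into cursor 2's relative displacement (X4, Y4) on the mobile phone."""
    x3, y3 = rel_displacement
    ratio1, ratio2 = conversion_ratios(window_size, phone_resolution)
    return x3 * ratio1, y3 * ratio2

# Example: a 1080x720 projection window mirroring a 2160x1440 phone display doubles
# every displacement, matching the "conversion ratio value equal to 2" case above.
print(convert_displacement((100, 40), (1080, 720), (2160, 1440)))  # -> (200.0, 80.0)
```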
  • the mobile phone (such as the framework layer of the mobile phone) can use the created virtual input device to simulate the corresponding input event, such as a mouse movement event.
  • the framework layer of the mobile phone can then draw the animation of the movement of the cursor 2 and display it on the display screen of the mobile phone.
  • the mobile phone can display an animation of the movement of the cursor 1006 on the display screen of the mobile phone correspondingly.
  • the coordinate information is the transformed coordinate
  • the cursor 1006 moves to the position of the Bluetooth option 1009 on the actual interface 1008 .
  • the user can not only move the cursor 1 on the PC display screen to the desired operation position in the screen projection interface, but also move the cursor 2 on the mobile phone display screen to the corresponding position.
  • the mobile phone can convert the received key code of the mouse operation parameter 1 into a key code that can be recognized by the mobile phone according to the preset mapping relationship. Afterwards, the mobile phone can simulate an input event that the mobile phone can recognize, such as a mouse movement event, according to the mouse operation parameter 1 after converting the key code by using the created virtual input device, so as to respond accordingly.
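  • The "preset mapping relationship" for key codes can be as simple as a lookup table from the identifiers received from the PC to key codes the mobile phone recognizes. The concrete values below are invented for illustration and do not correspond to real key codes on either platform.

```python
# Hypothetical mapping from PC-side key identifiers to phone-recognizable key codes.
PC_TO_PHONE_KEYCODE = {
    "left": 1001,
    "right": 1002,
    "middle": 1003,
}

def convert_keycode(pc_key):
    """Translate a received PC-side key identifier into a key code the phone can recognize.

    Raising KeyError for an unmapped key makes missing entries visible instead of
    silently injecting a wrong code.
    """
    return PC_TO_PHONE_KEYCODE[pc_key]
```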
  • the mobile phone sends the cursor type of style 2 to the PC.
  • the PC displays the cursor 1 on the display screen of the PC according to the cursor type, and updates the screen projection interface.
  • the control and/or cursor will have corresponding visual feedback.
  • when the cursor 1 moves to the position of a control in the screen-casting interface, if the control on the mobile phone corresponding to that control is a control on which a next operation can be performed, the control in the screen-casting interface and/or the cursor 1 will give corresponding visual feedback. For example, the control presents a highlighted background, or the style of the cursor 1 changes.
  • the cursor 2 on the display screen of the mobile phone also moves to the position of the corresponding control in the display interface of the mobile phone.
  • the cursor style of the cursor 2 will change, for example, the cursor style of the cursor 2 is changed from style 1 to style 2.
  • the cursor 2 moves from one content in the interface (for example, the content may be the first content in this embodiment) to another content (for example, the content may be the second content in this embodiment), then the The cursor style of the cursor 2 is changed from style 1 (the style 1 may be the first style in this embodiment) to style 2 (the style 2 may be the second style in this embodiment).
  • FIG. 11 is a schematic diagram of cursor styles and corresponding cursor types provided in this embodiment.
  • the cursor style may include: normal selection style, link selection style, text selection style, movement style, vertical adjustment style, and the like.
  • the corresponding cursor types include: normal selection, link selection, text selection, move and vertical adjustment, etc.
  • for different contents, the cursor style of the cursor can be different or the same.
  • the corresponding relationship between the specific control and the cursor style can be pre-defined by the third-party application developer or device manufacturer and stored in the mobile phone.
  • the mobile phone can change the cursor style of the cursor 2 according to the pre-stored correspondence.
  • the cursor 2, that is, the cursor 1006, moves to the position of the Bluetooth option 1009 on the actual interface 1008; that is, the cursor 1006 moves from the position of the first content displayed on the actual interface 1008 to the second content, namely the position of the Bluetooth option 1009.
  • the first content is an element displayed in the interface that cannot be operated.
  • the cursor style is the normal selection style
  • the cursor style is the link selection style as an example.
  • the cursor style of the cursor 2 is changed from the normal selection style to the link selection style.
  • the mobile phone can send the cursor type of the changed cursor style to the PC.
  • a cursor style listener can be registered in the framework layer of the mobile phone. In this way, after the cursor style of the cursor 2 is changed, the cursor style listener can monitor the event of the cursor style change.
  • the framework layer of the mobile phone can obtain the changed cursor style, such as the cursor type of style 2, and send it to the PC through the screen projection service module of the mobile phone application layer.
  • the cursor 1 can be displayed on the display screen of the PC according to the cursor type. For example, continuing with the example shown in FIG. 10 , the cursor 1, that is, the cursor 1004, is changed from the normal selection style at the first content of the screen projection interface to the link selection style. In this way, the user is presented with a visual effect that the cursor style is changed when the cursor 1 moves to the position of the control in the screen projection interface.
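  • The round trip described in the preceding items, in which a listener on the source end notices that the style of the cursor 2 has changed, only the cursor type is sent to the destination end, and the destination end redraws the cursor 1 in that style, can be sketched as follows. The registration hook, the message format, and the `set_cursor_style` call are assumptions; the embodiment only states that a cursor style listener exists in the framework layer.

```python
class CursorStyleListener:
    """Source-end listener: forwards cursor 2's style changes to the destination end."""

    def __init__(self, channel):
        self.channel = channel                 # channel to the screen projection destination end
        self.last_style = "normal_selection"

    def on_cursor_style_changed(self, new_style):
        # Called by the (assumed) framework-layer hook whenever cursor 2's style
        # changes, e.g. from "normal_selection" to "link_selection".
        if new_style != self.last_style:
            self.last_style = new_style
            self.channel.send({"type": "cursor_type", "style": new_style})

def on_message_from_source(message, pc_display):
    """Destination end: redraw cursor 1 with the cursor type received from the source end."""
    if message.get("type") == "cursor_type":
        pc_display.set_cursor_style(message["style"])
```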
  • when the cursor 2 moves to the position of the corresponding content in the display interface of the mobile phone, the content may give corresponding visual feedback.
  • the display mode of the first content may be changed from the first mode to the second mode
  • the display mode of the second content may be changed from the third mode to the fourth mode.
  • the display manners before different contents are changed may be the same or different.
  • the display modes after different contents are changed may also be the same or different. For example, if the content is a control, the first mode and the third mode may be the same, that is, no highlighted background is presented, and the second mode and the fourth mode may be the same, that is, a highlighted background is presented.
  • when the cursor 2 moves to the position of the corresponding control in the display interface of the mobile phone, the control changes from presenting no highlighted background to presenting a highlighted background. It is understandable that, during multi-screen collaboration, the mobile phone projects the interface displayed on the mobile phone display to the PC display in real time, so after the control presents a highlighted background, this change is also projected to the PC display. In this way, the user is shown the effect that, when the cursor 1 moves to the position of the control in the screen projection interface, the control gives corresponding visual feedback.
  • both the Bluetooth option and the cursor 2 have visual feedback as an example.
  • the Bluetooth option 1203 in the screen projection interface of the PC presents a highlighted background, and the cursor 1 is changed from the normal selection style to the link selection style.
  • the cursor 2 on the mobile phone also moves to the position of the Bluetooth option 1204 in the actual interface 1202.
  • the Bluetooth option 1204 in the actual interface 1202 of the mobile phone has a highlighted background, and the cursor 2 is changed from the normal selection style to the link selection style. It should be noted that the cursor 2 may be invisible to the user.
  • the user can use the input device of the PC, such as the mouse of the PC, to input a pressing operation (the pressing operation may be the second operation in the embodiment of this application; the second operation may also be another operation).
  • the user can press the left mouse button.
  • the PC can receive corresponding input events, such as mouse down events. Since the mouse press event is received after the keyboard and mouse sharing is turned on, the mounted HOOK will intercept the mouse press event, so that the Windows system of the PC does not respond to it.
  • the PC (such as the screen projection service module of the PC application layer) can obtain the operation parameters in the mouse press event, such as the mouse operation parameter 2 (for example, by using the mounted HOOK to capture the mouse operation parameter 2), and send the mouse operation parameter 2 to the mobile phone.
  • the mouse pressing event may be the second input event in the embodiment of the application
  • the mouse operation parameter 2 may be the second operation parameter in the embodiment of the application.
  • the mouse operation parameter 2 may include: a mouse button flag used to indicate that the user has pressed the mouse, coordinate information (the value is empty), wheel information (the value is empty), and key position information used to indicate that the user operated the left button of the mouse.
  • if the value of the coordinate information in the received mouse operation parameter is not empty, the mobile phone needs to convert the coordinate information before simulating the corresponding input event (as described for the corresponding content in S508 above); if the value of the coordinate information in the parameter is empty, conversion processing is not required, and the corresponding input event can be simulated directly according to the received mouse operation parameters.
  • the framework layer of the mobile phone can convert the received key code in the mouse operation parameter 2 into a key code that the mobile phone can recognize according to the preset mapping relationship.
  • the mobile phone can then use the created virtual input device to simulate an input event that the mobile phone can recognize, such as a mouse press event, according to the mouse operation parameter 2 after the key code is converted.
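  • Combining the preceding rules, the handling of a received mouse operation parameter on the source end can be sketched as: convert the coordinate information only when it is present, convert the key code, and then inject the event through the created virtual input device. The sketch reuses the `convert_displacement` and `convert_keycode` helpers shown earlier, and the `virtual_input_device.inject` call is a placeholder for whatever injection interface the platform actually provides.

```python
def handle_operation_params(params, virtual_input_device, window_size, phone_resolution):
    """Source-end handling of mouse operation parameters received from the PC."""
    coords = params.coordinates
    if coords is not None:
        # Movement-style parameters: scale the relative displacement first.
        coords = convert_displacement(coords, window_size, phone_resolution)

    # Key codes are translated with the preset mapping relationship when present.
    key = convert_keycode(params.key) if params.key is not None else None

    # Simulate the corresponding input event so the phone responds as if the
    # operation had been performed locally.
    virtual_input_device.inject(
        button_flag=params.button_flag,
        coordinates=coords,
        wheel=params.wheel,
        key=key,
    )
```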
  • the mobile phone can respond accordingly, such as displaying a second interface.
  • the mobile phone displays a Bluetooth setting interface 1302 .
  • the mobile phone projects the Bluetooth setting interface 1302 displayed on the display screen of the mobile phone to the display screen of the PC, and the PC displays the screen projection interface 1301 .
  • the controls at the position of the cursor 2 will also change.
  • the mobile phone can use the cursor style listener to monitor the style of the cursor 2 in real time. If it is determined that the style of the cursor 2 changes, the cursor type of the changed cursor style is sent to the PC, so that the PC can change the cursor style of the cursor 1 accordingly. For example, referring to FIG. 13, after the mobile phone displays the Bluetooth setting interface 1302, there is no control at the position of the cursor 1304, so the cursor 1304 is changed from the link selection style to the normal selection style. The mobile phone can send the cursor type of the normal selection style to the PC, so that the PC displays the cursor 1303 in the normal selection style in the screen projection window.
  • the user may move the cursor 1 out of the screen-casting interface by operating an input device of the PC, such as moving a mouse of the PC.
  • the screen-casting window used for displaying the screen-casting interface can also be used to monitor whether the cursor 1 moves out of the screen-casting interface. For example, when the cursor 1 moves out of the screen projection interface, the screen projection window can detect a corresponding event, the event is used to instruct the cursor 1 to move out of the screen projection window, and the PC can determine that the cursor 1 moves out of the screen projection interface according to the event. After the cursor 1 moves out of the screen projection interface, the PC can determine that the keyboard and mouse sharing with the mobile phone is stopped.
  • the PC may send the second instruction information to the mobile phone to instruct the mobile phone to stop sharing the keyboard and mouse.
  • the PC can also unload the HOOK (or close the HOOK), that is, cancel the interception of input events, such as mouse events, and the capture of operation parameters therein.
  • the PC will not intercept the received input event, but will send the received input event to the Windows system of the PC, so that the Windows system of the PC responds to the input event; that is, the user can use the mouse of the PC to control the PC.
  • the invisible cursor 2 on the mobile phone also moves to the edge of the display screen of the mobile phone.
  • the display of the cursor 2 can be restored, that is, the cursor 2 can be set to be visible, so as to ensure that the cursor can be displayed normally on the display screen of the mobile phone after the mobile phone is directly connected to the mouse.
  • the above description takes the determination of whether to stop the keyboard and mouse sharing by the PC as an example. In some other embodiments, the mobile phone may also determine whether to stop the keyboard and mouse sharing.
  • the mobile phone can monitor whether the cursor 2 moves out of the edge of the mobile phone display during the movement of the cursor 2. After determining that the cursor 2 moves out of the edge of the mobile phone display, the mobile phone can determine that the keyboard and mouse sharing between the mobile phone and the PC is stopped. The mobile phone may send the above-mentioned second indication information to the PC to instruct the PC to stop sharing the keyboard and mouse. After receiving the second indication information, the PC can uninstall the HOOK. The mobile phone can also restore the display of the cursor 2 after it is determined that the cursor 2 has moved out of the edge of the display screen of the mobile phone.
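  • Whichever side detects that the cursor has left, the shutdown sequence described above is the same: decide that sharing has stopped, notify the peer with the second indication information, cancel the interception on the PC, and restore the visibility of the cursor 2 on the mobile phone. The following sketch shows both variants with assumed object and method names.

```python
def on_cursor_left_projection_window(pc, phone_channel):
    """PC-side variant: the screen projection window reports that cursor 1 moved out."""
    pc.keyboard_mouse_sharing = False
    # Second indication information: tells the peer that keyboard and mouse sharing stops.
    phone_channel.send({"type": "indication", "value": "stop_sharing"})
    pc.unload_hook()  # cancel the interception; the PC responds to input events locally again

def on_cursor_left_phone_display(phone, pc_channel):
    """Phone-side variant: cursor 2 moved out of the edge of the mobile phone display."""
    phone.keyboard_mouse_sharing = False
    pc_channel.send({"type": "indication", "value": "stop_sharing"})
    # Restore cursor 2 so that a mouse connected directly to the phone still shows a cursor.
    phone.set_cursor_visible(True)
```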
  • the above illustration takes as an example the case where the cursor style of the cursor 2 displayed at the starting position on the mobile phone does not change, that is, it is still the normal selection style.
  • the mobile phone can send the changed cursor type to the PC, so that the PC correspondingly changes the cursor style of the cursor 1 at the entry point.
  • the above description takes as an example the case where the mobile phone displays the invisible cursor 2 and, after the style of the cursor 2 is changed, sends the changed cursor type to the PC so that the PC correspondingly changes the style of the cursor 1, thereby implementing visual feedback of the cursor in the screen projection interface of the PC.
  • the PC can hide the cursor on its display screen, such as cursor 1, and the mobile phone can display the visible cursor 2. In this way, when the cursor 2 on the mobile phone moves to a control that can perform the next operation, the style of the cursor 2 can change accordingly, and/or the control can give visual feedback.
  • since the interface on the mobile phone is projected to the display screen of the PC in real time, when the style of the cursor 2 changes and/or the controls give visual feedback, the corresponding content in the screen projection interface projected to the display screen of the PC also changes accordingly.
  • in this way, the user can also be given the effect that the control and/or the cursor in the screen-casting interface provide corresponding visual feedback.
  • the specific implementation is similar to the description in the above embodiment. The differences are that, after the cursor 1 slides into the screen projection window, the cursor 1 on the PC is hidden and the visible cursor 2 is displayed on the mobile phone, and that, after keyboard and mouse sharing starts, the mounted HOOK intercepts all input events. Other descriptions are the same as those in the foregoing embodiments, and are not repeated here.
  • the input device is a mouse.
  • the input device may also be a touch pad.
  • the user can use a key (left key or right key) of the touchpad to input a pressing operation, and slide a finger on the touchpad to input a moving operation.
  • the specific implementation of the user using the touch pad to input the operation to realize the method of this embodiment is similar to the specific implementation of using the mouse to input the operation to implement the method of this embodiment, and will not be repeated here.
  • the control and/or the cursor in the screen-casting interface will respond accordingly.
  • the controls in the screen projection interface have a highlighted background, and the cursor style changes accordingly.
  • the user can visually determine whether the control in the screen-casting interface corresponding to the control displayed on the screen-casting source end can perform the next operation, which improves the user's use experience.
  • FIG. 14 is a schematic diagram of the composition of an interface display device provided by an embodiment of the present application.
  • the apparatus can be applied to a first terminal (such as the above-mentioned PC), the first terminal is connected to the second terminal, and the apparatus can include: a display unit 1401 and an input unit 1402 .
  • the display unit 1401 is configured to display a screen projection interface on the display screen of the first terminal, where the content of the screen projection interface is a mirror image of the content of the first interface displayed on the display screen of the second terminal.
  • the input unit 1402 is configured to receive a first operation input by the user using the input device of the first terminal, where the first operation is used to move the first cursor on the display screen of the first terminal.
  • the cursor style of the first cursor is the first style, and/or the display mode of the first content is changed from the first mode to the second mode;
  • the cursor style of the first cursor is the second style, and/or the display mode of the second content is changed from the third mode to the fourth mode.
  • the above-mentioned screen projection interface is displayed on a partial area of the display screen of the first terminal.
  • the display unit 1401 is further configured to display the animation of the movement of the first cursor on the display screen of the first terminal in response to the first operation.
  • the apparatus may further include: a sending unit 1403, configured to: when it is determined that the first cursor enters the screen projection interface during the movement of the first cursor on the display screen of the first terminal, send to the second terminal the initial coordinate position at which the first cursor enters the screen projection interface, and send the data of the first operation to the second terminal.
  • the initial coordinate position is the coordinate position of the first cursor relative to the first corner of the screen-casting interface when the first cursor enters the screen-casting interface, and is used by the second terminal to display the second cursor on the display screen of the second terminal; the data of the first operation is used to move the second cursor on the display screen of the second terminal, so that when the first cursor moves to the first content, the second cursor moves to the content corresponding to the first content on the first interface and the cursor style of the second cursor is the first style, and so that when the first cursor moves to the second content, the second cursor moves to the content corresponding to the second content on the first interface and the cursor style of the second cursor is the second style.
  • the apparatus may further include: a receiving unit 1404 .
  • the receiving unit 1404 is configured to receive the cursor type of the first style from the second terminal when the first cursor moves to the first content of the screen projection interface.
  • the display unit 1401 is further configured to display the first cursor according to the cursor type of the first style, so that the first cursor is displayed in the first style.
  • the receiving unit 1404 is further configured to receive the cursor type of the second style from the second terminal when the first cursor moves to the second content on the screen projection interface.
  • the display unit 1401 is further configured to display the first cursor according to the cursor type of the second style, so that the first cursor is displayed in the second style.
  • the display mode of the content corresponding to the first content in the first interface is changed from the first mode to the second mode.
  • the display unit 1401 is further configured to update the screen-casting interface after the first cursor is moved to the first content of the screen-casting interface.
  • the display mode of the first content in the screen-casting interface before the update is the first mode, and the display mode of the first content in the updated screen-casting interface is the second mode.
  • the display mode of the content corresponding to the second content in the first interface is changed from the third mode to the fourth mode.
  • the display unit 1401 is further configured to update the screen-casting interface after the first cursor is moved to the second content of the screen-casting interface.
  • the display mode of the second content in the screen-casting interface before the update is the third mode, and the display mode of the second content in the updated screen-casting interface is the fourth mode.
  • the transparency of the second cursor is greater than a threshold.
  • the apparatus may further include: an obtaining unit 1405, configured to obtain, after the first cursor enters the screen-casting interface and during the process in which the user inputs the first operation using the input device of the first terminal, the first operation parameter in the received first input event, where the first input event is a movement event corresponding to the first operation;
  • the sending unit 1403 is specifically configured to send the first operation parameter to the second terminal, where the first operation parameter is used by the second terminal to simulate the first input event and further to move the second cursor.
  • the input unit 1402 is further configured to receive a second operation input by the user using the input device of the first terminal when the first cursor moves to the first content on the screen-casting interface; the sending unit 1403 is further configured to send the data of the second operation to the second terminal, where the data of the second operation is used by the second terminal to display the second interface; the display unit 1401 is further configured to update the screen projection interface, where the content of the updated screen projection interface is a mirror image of the content of the second interface.
  • the obtaining unit 1405 is further configured to, after the user inputs the second operation using the input device of the first terminal, intercept the second input event corresponding to the second operation and obtain the second operation parameter in the second input event; the sending unit 1403 is specifically configured to send the second operation parameter to the second terminal, where the second operation parameter is used by the second terminal to simulate the second input event and further to display the second interface.
  • the first operation corresponds to a movement event.
  • the acquiring unit 1405 is further configured to enable interception of input events, and is configured to intercept other input events except the movement event.
  • the sending unit 1403 is further configured to send first indication information to the second terminal, where the first indication information is used to indicate the start of sharing.
  • the sending unit 1403 is further configured to send second indication information to the second terminal, where the second indication information is used to instruct the sharing to stop; the obtaining unit 1405 is also used to cancel the interception of input events.
  • FIG. 15 is a schematic diagram of the composition of another interface display device provided by an embodiment of the present application.
  • the apparatus can be applied to a second terminal (such as the above-mentioned mobile phone), and the second terminal is connected to the first terminal.
  • the apparatus may include: a display unit 1501 , a projection unit 1502 and a receiving unit 1503 .
  • the display unit 1501 is used to display the first interface.
  • the projection unit 1502 is configured to project and display the first interface to the first terminal, so that the first terminal displays the screen projection interface.
  • the display unit 1501 is further configured to display the second cursor on the first interface when the first cursor of the first terminal enters the screen projection interface.
  • the receiving unit 1503 is configured to receive a first operation input by the user using the input device of the first terminal, where the first operation is used to move the second cursor on the display screen of the second terminal.
  • the display unit 1501 is further configured to display the second cursor in the first style when the second cursor moves to the first content of the first interface, and/or change the display mode of the first content from the first mode to the second mode, so that when the first cursor moves to the content corresponding to the first content on the screen-casting interface, the first cursor is displayed in the first style, and/or the display mode of the content corresponding to the first content on the screen-casting interface is changed from the first mode to the second mode; the display unit 1501 is further configured to display the second cursor in the second style when the second cursor moves to the second content of the first interface, and/or change the display mode of the second content from the third mode to the fourth mode, so that when the first cursor moves to the content corresponding to the second content on the screen projection interface, the first cursor is displayed in the second style, and/or the display mode of the content corresponding to the second content on the screen projection interface is changed from the third mode to the fourth mode.
  • the apparatus may further include: a sending unit 1504, configured to send the cursor type of the first style to the first terminal after displaying the second cursor in the first style, for the first terminal to display the first cursor so that the first cursor is displayed in the first style; the sending unit 1504 is further configured to send the cursor type of the second style to the first terminal after displaying the second cursor in the second style, for the first terminal to display the first cursor so that the first cursor is displayed in the second style.
  • the transparency of the second cursor is greater than a threshold.
  • the receiving unit 1503 is further configured to receive the initial coordinate position of the first cursor entering the screen projection interface from the first terminal.
  • the apparatus may further include: a determining unit 1505, configured to determine a starting position according to the initial coordinate position, the size of the screen projection interface, and the resolution of the second terminal, where the starting position may be a coordinate position relative to the first corner of the display screen of the second terminal; the display unit 1501 is specifically configured to display the second cursor at the starting position.
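  • Determining the starting position can be done with the same kind of scaling as the displacement conversion: the initial coordinate position, which is relative to the first corner of the screen projection interface, is scaled by the ratio between the size of the screen projection interface and the resolution of the second terminal. This is only one plausible reading of how the three inputs are combined; the embodiment does not spell out the formula.

```python
def starting_position(initial_pos, projection_size, phone_resolution):
    """Map the point where the first cursor entered the projection interface to a
    coordinate position relative to the first corner of the second terminal's display."""
    (x, y), (w1, h1), (w2, h2) = initial_pos, projection_size, phone_resolution
    return x * w2 / w1, y * h2 / h1

# Example: entering at (540, 0) in a 1080x720 projection interface that mirrors a
# 2160x1440 display places the second cursor at (1080.0, 0.0).
print(starting_position((540, 0), (1080, 720), (2160, 1440)))
```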
  • the receiving unit 1503 is specifically configured to receive the first operation parameter from the first terminal, where the first operation parameter is an operation parameter in the first input event received by the first terminal during the process in which the user inputs the first operation using the input device of the first terminal after the first cursor enters the screen projection interface, and the first operation parameter includes the relative displacement of the first cursor compared with the initial coordinate position; the determining unit 1505 is further configured to determine the relative displacement of the second cursor compared with the starting position according to the relative displacement of the first cursor compared with the initial coordinate position; the apparatus may further include: a simulation unit 1506, configured to simulate the first input event according to the determined relative displacement of the second cursor compared with the starting position and the other parameters in the first operation parameter.
  • the display unit 1501 is further configured to display the animation of the movement of the second cursor on the display screen of the second terminal according to the first input event.
  • the receiving unit 1503 is further configured to receive a second operation input by the user using the input device of the first terminal when the second cursor moves to the first content of the first interface; the display unit 1501 is further configured to display the second interface in response to the second operation; the projection unit 1502 is further configured to project and display the second interface to the first terminal, so that the content of the screen-casting interface updated by the first terminal is a mirror image of the content of the second interface.
  • the receiving unit 1503 is specifically configured to receive a second operation parameter from the first terminal, where the second operation parameter is an operation parameter included in the second input event intercepted by the first terminal after the user inputs the second operation using the input device of the first terminal when the first cursor moves to the content corresponding to the first content on the screen-casting interface; the simulation unit 1506 is configured to simulate the second input event according to the second operation parameter, and the second input event is used to display the second interface.
  • the receiving unit 1503 is further configured to receive first indication information from the first terminal, where the first indication information is used to indicate the start of sharing.
  • the receiving unit 1503 is further configured to receive second indication information from the first terminal, where the second indication information is used to instruct the sharing to stop, and the second indication information is that the first terminal is determining the first Sent after the cursor moves out of the screen projection interface.
  • An embodiment of the present application further provides an interface display apparatus, and the apparatus can be applied to an electronic device, such as the first terminal or the second terminal in the foregoing embodiments.
  • the apparatus may include: a processor; and a memory for storing instructions executable by the processor; where the processor, when executing the instructions, is configured to enable the interface display apparatus to implement the functions or steps performed by the mobile phone or the PC in the above method embodiments.
  • An embodiment of the present application further provides an electronic device (the electronic device may be a terminal, such as the first terminal or the second terminal in the above-mentioned embodiments), and the electronic device may include: a display screen, a memory, and one or more processors.
  • the display screen, memory and processor are coupled.
  • the memory is used to store computer program code comprising computer instructions.
  • the processor executes the computer instructions, the electronic device can execute various functions or steps executed by the mobile phone or PC in the foregoing method embodiments.
  • the electronic device includes but is not limited to the above-mentioned display screen, memory and one or more processors.
  • the structure of the electronic device may refer to the structure of the mobile phone shown in FIG. 3 .
  • the chip system includes at least one processor 1601 and at least one interface circuit 1602 .
  • the processor 1601 may be the processor in the above electronic device.
  • the processor 1601 and the interface circuit 1602 may be interconnected by wires.
  • the processor 1601 may receive and execute computer instructions from the memory of the above electronic device through the interface circuit 1602 .
  • when the computer instructions are executed by the processor 1601, the electronic device can be made to execute the steps executed by the mobile phone or the PC in the above-mentioned embodiments.
  • the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
  • Embodiments of the present application further provide a computer-readable storage medium, which is used to store computer instructions run by the above-mentioned electronic device, such as the terminal (eg, a mobile phone or a PC).
  • an electronic device such as a computer instruction run by the above-mentioned terminal (eg, a mobile phone or a PC).
  • Embodiments of the present application further provide a computer program product, including computer instructions run by the above-mentioned electronic device, such as the terminal (eg, a mobile phone or a PC).
  • a computer program product including an electronic device, such as computer instructions run by the above-mentioned terminal (eg, a mobile phone or a PC).
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling, direct coupling, or communication connection may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place, or may be distributed to multiple different places . Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • in essence, the technical solutions of the embodiments of the present application, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product that is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides an interface display method and device, and relates to the field of electronic devices. When a user uses the mouse of a PC to move a cursor onto content of a screen projection interface, the content and/or the cursor can give corresponding visual feedback. A first terminal displays a screen projection interface on the display screen of the first terminal, where the content of the screen projection interface is a mirror image of the content of a first interface displayed on the display screen of a second terminal; the first terminal receives a first operation input by the user using an input device of the first terminal, the first operation being used to move a first cursor on the display screen of the first terminal; and when the first cursor moves onto content of the screen projection interface, the cursor style of the first cursor is changed, and/or the display mode of the content is changed.

Description

An interface display method and device
This application claims priority to the Chinese patent application with application No. 202010873983.5, entitled "一种界面显示方法及设备" ("An interface display method and device"), filed with the China National Intellectual Property Administration on August 26, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of electronic devices, and in particular, to an interface display method and device.
Background
At present, a single user may own more and more terminals at the same time, such as a mobile phone, a tablet computer, and a personal computer (PC). In scenarios where multiple terminals need to be used cooperatively, such as collaborative office work, the user can connect multiple terminals and use them together.
For example, a user who owns a PC and a mobile phone can connect the PC and the mobile phone in a wireless or wired manner and use them together, so as to realize collaborative office work between the PC and the mobile phone. In the scenario of collaborative office work between the PC and the mobile phone, multi-screen collaboration uses mirror projection to project the interface displayed by the mobile phone onto the display screen of the PC. For ease of description, the interface projected by the mobile phone and displayed on the PC may be referred to as a screen projection interface; for example, in FIG. 1 the screen projection interface 102 is the desktop 101 of the mobile phone. The user can then use the mouse of the PC to perform mouse operations, such as mouse clicks and mouse movements, in the screen projection interface, so as to operate the actual interface displayed on the mobile phone.
However, while the user performs mouse operations in the screen projection interface with the mouse of the PC, when the cursor moves onto a control of the screen projection interface there is usually no visual feedback: for example, the control gives no visual feedback, and the cursor style does not change correspondingly.
发明内容
本申请实施例提供一种界面显示方法及设备,用户利用PC的鼠标在将光标移动到投屏界面的控件上时,控件和/或光标可进行相应的视觉反馈。
为达到上述目的,本申请采用如下技术方案:
第一方面,本申请实施例提供一种界面显示方法,应用于第一终端,该第一终端与第二终端连接,该方法可以包括:
第一终端在第一终端的显示屏上显示投屏界面,该投屏界面的内容为第二终端显示屏上显示的第一界面内容的镜像;第一终端接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第一终端的显示屏上的第一光标;其中,在第一光标移动到投屏界面的第一内容上时,第一光标的光标样式为第一样式,和/或,第一内容的显示方式由第一方式变更为第二方式;在第一光标移动到投屏界面的第二内容上时,第一光标的光标样式为第二样式,和/或,第二内容的显示方式由第三方式变更为第四方式。
其中,本实施例中投屏界面的内容,可以是指显示在投屏界面的元素。在本实施例中,第一内容与第二内容是显示在投屏界面中的不同元素,在第一光标移动到第一内容上时的光标样式与移动到第二内容上的光标样式不同,即第一样式与第二样式不同。另外,上述第一方式与第三方式可以相同,也可以不同。第二方式与第四方式可以相同,也可以不同。
采用上述技术方案,在用户通过操作投屏目的端如上述第一终端的输入设备,如鼠标或触摸板将光标移动到投屏界面的内容上时,投屏界面中的该内容和/或光标会进行相应的视觉反馈,如投屏界面中的内容呈现高亮的背景,光标样式进行相应变化。这样,用户可从视觉上确定投屏界面中的该控件对应在投屏源端上显示的控件是否可以进行下一步操作,提高了用户的使用体验。
在一种可能的实现方式中,上述投屏界面显示在第一终端显示屏的部分区域上;该方法还可以包括:响应于第一操作,第一终端在第一终端的显示屏上显示第一光标移动的动画;在第一光标在第一终端的显示屏上移动的过程中,第一终端在确定第一光标进入投屏界面时,向第二终端发送第一光标进入投屏界面的初始坐标位置,向第二终端发送第一操作的数据;其中,初始坐标位置为第一光标进入投屏界面时相对于投屏界面的第一角的坐标位置,用于第二终端在第二终端的显示屏上显示第二光标;第一操作的数据用于移动第二终端的显示屏上的第二光标,以使得在第一光标移动到第一内容上时,第二光标移动到第一界面的与第一内容对应的内容上,第二光标移动到与第一内容对应的内容上时,第二光标的光标样式为第一样式,还使得在第一光标移动到第二内容上时,第二光标移动到第一界面与第二内容对应的内容上,第二光标移动到与第二内容对应的内容上时,第二光标的光标样式为第二样式。
在第一光标移动到投屏界面的第一内容上时,第一终端接收来自第二终端的第一样式的光标类型,根据第一样式的光标类型显示第一光标,以便第一光标显示为第一样式;在第一光标移动到投屏界面的第二内容上时,第一终端接收来自第二终端的第二样式的光标类型,根据第二样式的光标类型显示第一光标,以便第一光标显示为第二样式。
在第一光标进入投屏界面后,第一终端通过将对应的操作数据发送给第二终端,以便第二终端可根据该操作的数据移动第二终端的光标,并将光标样式反馈给第一终端,使得第一终端上的第一光标的光标样式可进行相应的改变,给用户在光标移动到投屏界面对应内容上后,光标进行视觉反馈的视觉效果。
在另一种可能的实现方式中,第二光标移动到第一内容对应的内容上时,第一界面中第一内容对应的内容的显示方式由第一方式变更为第二方式;该方法还可以包括:在第一光标移动到投屏界面的第一内容上后,第一终端更新投屏界面,更新前的投屏界面中第一内容的显示方式为第一方式,更新后的投屏界面中第一内容的显示方式为第二方式。
在另一种可能的实现方式中,第二光标移动到第二内容对应的内容上时,第一界面中第二内容对应的内容的显示方式由第三方式变更为第四方式;该方法还可以包括:在第一光标移动到投屏界面的第二内容上后,第一终端更新投屏界面,更新前的投屏界面中第二内容的显示方式为第三方式,更新后的投屏界面中第二内容的显示方式为第四方式。
在第一光标进入投屏界面后,第一终端通过将对应的操作数据发送给第二终端,以便第二终端可根据该操作的数据移动第二终端的光标到对应内容上,以便该内容进行相应的视觉反馈,第一终端通过更新投屏界面,可使得第一终端上的对应内容的显示方式也进行相应的改变,给用户在光标移动到投屏界面对应内容上后,该内容进行视觉反馈的视觉效果。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,上述向第二终端发送第一操作的数据,可以包括:在 第一光标进入投屏界面后,用户使用第一终端的输入设备输入第一操作的过程中,第一终端获取接收到的第一输入事件中的第一操作参数,第一输入事件为第一操作对应的移动事件;第一终端向第二终端发送第一操作参数,该第一操作参数用于第二终端模拟第一输入事件,进而用于移动第二光标。
在另一种可能的实现方式中,该方法还可以包括:在第一光标移动到投屏界面的第一内容上时,第一终端接收用户使用第一终端的输入设备输入的第二操作;第一终端向第二终端发送第二操作的数据,该第二操作的数据用于第二终端显示第二界面;第一终端更新投屏界面,更新后的投屏界面的内容为第二界面内容的镜像。在光标移动到投屏界面的内容上后,如果用户对该内容进行了操作,第一终端通过将对应的操作数据发送给第二终端,以便第二终端进行相应的响应。第一终端通过更新投屏界面,以便第二终端更新后的界面可对应投射到第一终端上。
在另一种可能的实现方式中,第一终端向第二终端发送第二操作的数据,可以包括:在用户使用第一终端的输入设备输入第二操作后,第一终端拦截第二操作对应的第二输入事件;第一终端获取并向第二终端发送第二输入事件中的第二操作参数,第二操作参数用于第二终端模拟第二输入事件,进而用于显示第二界面。
在另一种可能的实现方式中,第一操作对应移动事件;在第一光标进入投屏界面后,该方法还可以包括:第一终端开启输入事件的拦截,用于拦截除移动事件外的其他输入事件;第一终端向第二终端发送第一指示信息,该第一指示信息用于指示共享开始。
在另一种可能的实现方式中,在第一光标移出投屏界面后,该方法还可以包括:第一终端取消输入事件的拦截;第一终端向第二终端发送第二指示信息,该第二指示信息用于指示共享停止。
在另一种可能的实现方式中,在第一光标进入投屏界面后,第一终端可调整第一光标的透明度,调整后的第一光标的透明度大于阈值;第一终端还可拦截第一输入事件,并将第一输入事件的第一操作参数发送给第二终端,以便第二终端根据该第一操作参数模拟第一输入事件,进而移动第二光标。这样,在第二光标移动到第一内容对应在第一界面的内容上时,第一终端更新投屏界面;其中,更新后的界面中光标移动到投屏界面的第一内容上,更新后的投屏界面中光标的光标样式为第一样式;和/或,更新前的第一内容的显示方式为第一方式,更新后的第一内容的显示方式为第二显示方式。在第二光标移动到第二内容对应在第一界面的内容上后,第一终端更新投屏界面;其中,更新后的界面中光标移动到投屏界面的第二内容上,更新后的投屏界面中光标的光标样式为第二样式;和/或,更新前的第二内容的显示方式为第一方式,更新后的第二内容的显示方式为第二显示方式。可以理解的是,第一终端通过更新投屏界面,可给用户光标在移动到投屏界面的内容上时,投屏界面中的该内容和/或光标进行相应的视觉反馈的视觉效果。
第二方面,本申请实施例提供一种界面显示方法,应用于第二终端,第二终端与第一终端连接,该方法可以包括:
第二终端显示第一界面,将第一界面投射显示到第一终端,以使得第一终端显示投屏界面;第二终端在第一终端的第一光标进入投屏界面时,在第一界面上显示第二光标;第二终端接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第二终端的显示屏上的第二光标;在第二光标移动到第一界面的第一内容上时,第二终端将第二光标显示为第一样式,和/或,将第一内容的显示方式由第一方式变更为第二方式, 以便在第一光标移动到投屏界面与第一内容对应的内容上时,第一光标显示为第一样式,和/或,投屏界面与第一内容对应的内容的显示方式由第一方式变更为第二方式;在第二光标移动到第一界面的第二内容上时,第二终端将第二光标显示为第二样式,和/或,将第二内容的显示方式由第三方式变更为第四方式,以便在第一光标移动到投屏界面与第二内容对应的内容上时,第一光标显示为第二样式,和/或,投屏界面与第二内容对应的内容的显示方式由第一方式变更为第二方式。
在一种可能的实现方式中,在第二终端将第二光标显示为第一样式之后,该方法还可以包括:第二终端向第一终端发送第一样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第一样式;在第二终端将第二光标显示为第二样式之后,该方法还可以包括:第二终端向第一终端发送第二样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第二样式。第二终端可通过将第二光标的光标样式反馈给第一终端,使得第一终端上的第一光标的光标样式可进行相应的改变,给用户在光标移动到投屏界面对应内容上后,光标进行视觉反馈的视觉效果。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,第二终端在第一终端的第一光标进入投屏界面时,在第一界面上显示第二光标,可以包括:第二终端接收来自第一终端的第一光标进入投屏界面的初始坐标位置;第二终端根据初始坐标位置,投屏界面的尺寸和第二终端的分辨率确定起始位置,该起始位置可以为相对于第二终端显示屏的第一角的坐标位置;第二终端在起始位置显示第二光标。
在另一种可能的实现方式中,第二终端接收用户使用第一终端的输入设备输入的第一操作,可以包括:第二终端接收来自第一终端的第一操作参数,该第一操作参数是第一光标进入投屏界面后,用户使用第一终端的输入设备输入第一操作的过程中第一终端接收到的第一输入事件中的操作参数,第一操作参数包括第一光标相较于初始坐标位置的相对位移;第二终端根据第一光标相当于初始坐标位置的相对位移,确定第二光标相较于起始位置的相对位移;第二终端根据确定出的第二光标相较于起始位置的相对位移,及第一操作参数中的其他参数模拟第一输入事件。通过对接收到的操作参数中的相对位移进行换算,可使得在第一光标移动后,第二光标可移动到对应内容上。
该方法还可以包括:第二终端根据第一输入事件在第二终端的显示屏上显示第二光标移动的动画。
在另一种可能的实现方式中,该方法还可以包括:在第二光标移动到第一界面的第一内容上时,第二终端接收用户使用第一终端的输入设备输入的第二操作;响应于第二操作,第二终端显示第二界面,将第二界面投射显示到第一终端,以使得第一终端更新后的投屏界面的内容为第二界面内容的镜像。
在另一种可能的实现方式中,第二终端接收用户使用第一终端的输入设备输入的第二操作,可以包括:第二终端接收来自第一终端的第二操作参数,该第二操作参数是在第一光标移动到投屏界面与第一内容对应的内容上时,用户使用第一终端的输入设备输入第二操作后,第一终端拦截到的第二输入事件中包括的操作参数;第二终端根据第二操作参数模拟第二输入事件,第二输入事件用于显示第二界面。
在另一种可能的实现方式中,在第一终端的第一光标进入投屏界面时,该方法还可以包括:第二终端接收来自第一终端的第一指示信息,第一指示信息用于指示共享开始。
在另一种可能的实现方式中,该方法还可以包括:第二终端接收来自第一终端的第 二指示信息,第二指示信息用于指示共享停止,第二指示信息是第一终端在确定第一光标移出投屏界面后发送的。
第三方面,本申请实施例提供一种界面显示装置,应用于第一终端,该第一终端与第二终端连接,该装置可以包括:
显示单元,用于在第一终端的显示屏上显示投屏界面,该投屏界面的内容为第二终端显示屏上显示的第一界面内容的镜像;输入单元,用于接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第一终端的显示屏上的第一光标;其中,在第一光标移动到投屏界面的第一内容上时,第一光标的光标样式为第一样式,和/或,第一内容的显示方式由第一方式变更为第二方式;在第一光标移动到投屏界面的第二内容上时,第一光标的光标样式为第二样式,和/或,第二内容的显示方式由第三方式变更为第四方式。
在一种可能的实现方式中,上述投屏界面显示在第一终端显示屏的部分区域上;显示单元,还用于响应于第一操作,在第一终端的显示屏上显示第一光标移动的动画;该装置还可以包括:发送单元,用于在第一光标在第一终端的显示屏上移动的过程中,在确定第一光标进入投屏界面时,向第二终端发送第一光标进入投屏界面的初始坐标位置,向第二终端发送第一操作的数据;其中,初始坐标位置为第一光标进入投屏界面时相对于投屏界面的第一角的坐标位置,用于第二终端在第二终端的显示屏上显示第二光标;第一操作的数据用于移动第二终端的显示屏上的第二光标,以使得在第一光标移动到第一内容上时,第二光标移动到第一界面与第一内容对应的内容上,第二光标移动到与第一内容对应的内容上时,第二光标的光标样式为第一样式,还使得在第一光标移动到第二内容上时,第二光标移动到第一界面与第二内容对应的内容上,第二光标移动到与第二内容对应的内容上时,第二光标的光标样式为第二样式。
该装置还可以包括:接收单元。
接收单元,用于在第一光标移动到投屏界面的第一内容上时,接收来自第二终端的第一样式的光标类型;显示单元,还用于根据第一样式的光标类型显示第一光标,以便第一光标显示为第一样式;接收单元,还用于在第一光标移动到投屏界面的第二内容上时,接收来自第二终端的第二样式的光标类型;显示单元,还用于根据第二样式的光标类型显示第一光标,以便第一光标显示为第二样式。
在另一种可能的实现方式中,第二光标移动到第一内容对应的内容上时,第一界面中第一内容对应的内容的显示方式由第一方式变更为第二方式;显示单元,还用于在第一光标移动到投屏界面的第一内容上后,更新投屏界面,更新前的投屏界面中第一内容的显示方式为第一方式,更新后的投屏界面中第一内容的显示方式为第二方式。
在另一种可能的实现方式中,第二光标移动到第二内容对应的内容上时,第一界面中第二内容对应的内容的显示方式由第三方式变更为第四方式;显示单元,还用于在第一光标移动到投屏界面的第二内容上后,更新投屏界面,更新前的投屏界面中第二内容的显示方式为第三方式,更新后的投屏界面中第二内容的显示方式为第四方式。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,该装置还可以包括:获取单元,用于在第一光标进入投屏界面后,用户使用第一终端的输入设备输入第一操作的过程中,获取接收到的第一输入事件中的第一操作参数,第一输入事件为第一操作对应的移动事件;发送单元,具体用于向第二终端发送第一操作参数,该第一操作参数用于第二终端模拟第一输入事件, 进而用于移动第二光标。
在另一种可能的实现方式中,输入单元,还用于在第一光标移动到投屏界面的第一内容上时,接收用户使用第一终端的输入设备输入的第二操作;发送单元,还用于向第二终端发送第二操作的数据,该第二操作的数据用于第二终端显示第二界面;显示单元,还用于更新投屏界面,更新后的投屏界面的内容为第二界面内容的镜像。
在另一种可能的实现方式中,获取单元,还用于在用户使用第一终端的输入设备输入第二操作后,拦截第二操作对应的第二输入事件,获取第二输入事件中的第二操作参数,发送单元,具体用于向第二终端发送第二操作参数,该第二操作参数用于第二终端模拟第二输入事件,进而用于显示第二界面。
在另一种可能的实现方式中,第一操作对应移动事件;获取单元,还用于开启输入事件的拦截,用于拦截除移动事件外的其他输入事件。发送单元,还用于向第二终端发送第一指示信息,该第一指示信息用于指示共享开始。
在另一种可能的实现方式中,在第一光标移出投屏界面后,获取单元,还用于取消输入事件的拦截。发送单元,还用于向第二终端发送第二指示信息,该第二指示信息用于指示共享停止。
第四方面,本申请实施例提供一种界面显示装置,应用于第二终端,第二终端与第一终端连接,该装置可以包括:
显示单元,用于显示第一界面;投射单元,用于将第一界面投射显示到第一终端,以使得第一终端显示投屏界面;显示单元,还用于在第一终端的第一光标进入投屏界面时,在第一界面上显示第二光标;接收单元,用于接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第二终端的显示屏上的第二光标;显示单元,还用于在第二光标移动到第一界面的第一内容上时,将第二光标显示为第一样式,和/或,将第一内容的显示方式由第一方式变更为第二方式,以便在第一光标移动到投屏界面与第一内容对应的内容上时,第一光标显示为第一样式,和/或,投屏界面与第一内容对应的内容的显示方式由第一方式变更为第二方式;显示单元,还用于在第二光标移动到第一界面的第二内容上时,将第二光标显示为第二样式,和/或,将第二内容的显示方式由第三方式变更为第四方式,以便在第一光标移动到投屏界面与第二内容对应的内容上时,第一光标显示为第二样式,和/或,投屏界面与第二内容对应的内容的显示方式由第三方式变更为第四方式。
在一种可能的实现方式中,该装置还可以包括:发送单元,用于在将第二光标显示为第一样式后,向第一终端发送第一样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第一样式;发送单元,还用于在将第二光标显示为第二样式之后,向第一终端发送第二样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第二样式。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,接收单元,还用于接收来自所述第一终端的所述第一光标进入所述投屏界面的初始坐标位置。该装置还可以包括:确定单元,用于根据初始坐标位置,投屏界面的尺寸和第二终端的分辨率确定起始位置,该起始位置可以为相对于第二终端显示屏的第一角的坐标位置;显示单元,具体用于在起始位置显示第二光标。
在另一种可能的实现方式中,接收单元,具体用于接收来自第一终端的第一操作参数,该第一操作参数是第一光标进入投屏界面后,用户使用第一终端的输入设备输入第 一操作的过程中第一终端接收到的第一输入事件中的操作参数,第一操作参数包括第一光标相较于初始坐标位置的相对位移;确定单元,还用于根据第一光标相当于初始坐标位置的相对位移,确定第二光标相较于起始位置的相对位移;该装置还可以包括:模拟单元,用于根据确定出的第二光标相较于起始位置的相对位移,及第一操作参数中的其他参数模拟第一输入事件。
显示单元,还用于根据第一输入事件在第二终端的显示屏上显示第二光标移动的动画。
在另一种可能的实现方式中,接收单元,还用于在第二光标移动到第一界面的第一内容上时,接收用户使用第一终端的输入设备输入的第二操作;显示单元,还用于响应于第二操作,显示第二界面;投射单元,还用于将第二界面投射显示到第一终端,以使得第一终端更新后的投屏界面的内容为第二界面内容的镜像。
在另一种可能的实现方式中,接收单元,具体用于接收来自第一终端的第二操作参数,该第二操作参数是在第一光标移动到投屏界面与第一内容对应的内容上时,用户使用第一终端的输入设备输入第二操作后,第一终端拦截到的第二输入事件中包括的操作参数;模拟单元,用于根据第二操作参数模拟第二输入事件,第二输入事件用于显示第二界面。
在另一种可能的实现方式中,接收单元,还用于接收来自第一终端的第一指示信息,第一指示信息用于指示共享开始。
在另一种可能的实现方式中,接收单元,还用于接收来自第一终端的第二指示信息,第二指示信息用于指示共享停止,第二指示信息是第一终端在确定第一光标移出投屏界面后发送的。
第五方面,本申请实施例提供一种界面显示装置,该装置可以包括:处理器;用于存储处理器可执行指令的存储器;其中,处理器被配置为执行指令时使得界面显示装置实现如第一方面或第一方面的可能的实现方式中任一项所述的方法,或者实现如第二方面或第二方面的可能的实现方式中任一项所述的方法。
第六方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序指令,计算机程序指令被电子设备执行时使得电子设备实现如第一方面或第一方面的可能的实现方式中任一项所述的方法,或者实现如第二方面或第二方面的可能的实现方式中任一项所述的方法。
第七方面,本申请实施例提供一种电子设备,该电子设备包括显示屏,一个或多个处理器和存储器;显示屏,处理器和存储器耦合;存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当计算机指令被电子设备执行时,使得该电子设备执行如第一方面或第一方面的可能的实现方式中任一项所述的方法,或者,使得该终端执行如第二方面或第二方面的可能的实现方式中任一项所述的方法。
第八方面,本申请实施例提供一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,电子设备中的处理器执行第一方面或第一方面的可能的实现方式中任一项所述的方法,或者执行第二方面或第二方面的可能的实现方式中任一项所述的方法。
第九方面,本申请实施例提供一种界面显示系统,该界面显示系统可以包括第一终端和第二终端,第一终端与第二终端连接。
第二终端,用于显示第一界面,将第一界面投射显示到第一终端,以使得第一终端 显示投屏界面。
第一终端,用于在第一终端的显示屏上显示投屏界面,投屏界面的内容为第二终端显示屏上显示的第一界面内容的镜像;接收用户使用第一终端的输入设备输入的第一操作,第一操作用于移动第一终端的显示屏上的第一光标。
其中,在第一光标移动到投屏界面的第一内容上时,第一光标的光标样式为第一样式,和/或,第一内容的显示方式由第一方式变更为第二方式;在第一光标移动到投屏界面的第二内容上时,第一光标的光标样式为第二样式,和/或,第二内容的显示方式由第三方式变更为第四方式。
在一种可能的实现方式中,投屏界面显示在第一终端显示屏的部分区域上;第一终端,还用于响应于第一操作,在第一终端的显示屏上显示第一光标移动的动画;第二终端,还用于在第一光标进入投屏界面时,在第一界面上显示第二光标;接收用户使用第一终端的输入设备输入的第一操作,第一操作用于移动第二终端的显示屏上的第二光标;在第二光标移动到第一界面与第一内容对应的内容上时,将第二光标显示为第一样式,向第一终端发送第一样式的光标类型;第一终端,还用于根据第一样式的光标类型显示第一光标;第二终端,还用于在第二光标移动到第一界面与第二内容对应的内容上时,将第二光标显示为第二样式,向第一终端发送第二样式的光标类型;第一终端,还用于根据第二样式的光标类型显示第一光标。
在另一种可能的实现方式中,第二终端,还用于在第二光标移动到第一界面与第一内容对应的内容上时,将第一界面与第一内容对应的内容的显示方式由第一方式变更为第二方式;第一终端,还用于更新投屏界面;第二终端,还用于在第二光标移动到第一界面与第二内容对应的内容上时,将第一界面与第二内容对应的内容的显示方式由第三方式变更为第四方式;第一终端,还用于更新投屏界面。
需要说明的是,在本实施例中,所述的第一角可以为显示屏的左上角,左下角,右上角和右下角中的任意一个。
可以理解地,上述提供的第三方面及其任一种可能的实现方式所述的界面显示装置,第四方面及其任一种可能的实现方式所述的界面显示装置,第五方面所述的界面显示装置,第六方面所述的计算机可读存储介质,第七方面所述的终端,第八方面所述的计算机程序产品及第九方面所述的界面显示系统所能达到的有益效果,可参考如第一方面或第二方面及其任一种可能的实现方式中的有益效果,此处不再赘述。
Brief Description of Drawings
FIG. 1 is a schematic diagram of a display interface provided by an embodiment of this application;
FIG. 2 is a simplified schematic diagram of a system architecture provided by an embodiment of this application;
FIG. 3 is a schematic structural diagram of a mobile phone provided by an embodiment of this application;
FIG. 4 is a schematic composition diagram of a software architecture provided by an embodiment of this application;
FIG. 5 is a schematic flowchart of an interface display method provided by an embodiment of this application;
FIG. 6 is a schematic diagram of another display interface provided by an embodiment of this application;
FIG. 7 is a schematic diagram of yet another display interface provided by an embodiment of this application;
FIG. 8 is a schematic diagram of a display coordinate system provided by an embodiment of this application;
FIG. 9 is a schematic diagram of another display coordinate system provided by an embodiment of this application;
FIG. 10 is a schematic diagram of yet another display interface provided by an embodiment of this application;
FIG. 11 is a schematic diagram of cursor styles provided by an embodiment of this application;
FIG. 12 is a schematic diagram of yet another display interface provided by an embodiment of this application;
FIG. 13 is a schematic diagram of yet another display interface provided by an embodiment of this application;
FIG. 14 is a schematic composition diagram of an interface display apparatus provided by an embodiment of this application;
FIG. 15 is a schematic composition diagram of another interface display apparatus provided by an embodiment of this application;
FIG. 16 is a schematic composition diagram of a chip system provided by an embodiment of this application.
具体实施方式
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
目前,为了提高办公效率,用户可将多个终端连接起来一起配合使用。例如,在两个终端连接后,利用多屏协同可实现这两个终端间的协同办公。多屏协同可利用镜像投屏方式,将一个终端显示的界面投射到另一个终端的显示屏上显示。在本实施例中,可以将投射其显示界面的终端称为投屏源端,接收投屏源端的投射并显示投屏源端显示界面的终端称为投屏目的端。将投屏目的端上显示的投屏源端投射的界面称为投屏界面,将投屏目标端用于显示投屏界面的窗口称为投屏窗口。
例如,结合图1,以投屏源端为手机,投屏目的端为PC为例。手机可以将其显示屏上显示的界面(如桌面101)投射到PC的显示屏上。PC可在PC的显示屏上显示手机投射的界面,如PC在投屏窗口中显示投屏界面102。之后,用户可使用投屏目的端的输入设备,通过在投屏界面上进行操作,以实现对投屏源端实际界面的操作。如,继续结合图1,以输入设备为鼠标为例。用户可利用PC的鼠标在投屏界面102中执行鼠标点击,鼠标移动等鼠标操作。PC接收到对应鼠标操作后,可根据投屏界面102和手机投射过来的原始界面(如桌面101)尺寸比例关系,将用户在投屏界面102中执行鼠标操作时的坐标转化为在手机原始界面中的坐标。PC通过将转化后的坐标和操作类型(如移动,点击)发送给手机,以便手机可生成对应的触摸事件来模拟操作实际界面(如桌面101),并将操作后的界面投射到PC上。
另外,一般情况下,对于PC上显示的原始界面中包含的内容,如控件而言,在用户通过操作鼠标将光标移动到该控件上时,为了能够让用户从视觉上得知该控件是否可以进行下一步操作,该控件和/或光标会有相应的视觉反馈,如控件呈现高亮的背景,又如光标由正常选择样式变为文本选择样式。但是,在上述多屏协同场景下,用户通过操作投屏目的端(如PC)的输入设备,如鼠标将光标移动到投屏界面的控件上时,通常情况下投屏目的端不会有视觉反馈,如投屏界面中的控件不会呈现高亮的背景,光标样式也无相应变化。如,继续结合图1,当用户通过操作PC的鼠标将光标移动到投屏界面102中应用(application,APP)1的图标103上时,图标103不会有视觉反馈(如不呈现高亮的背景),光标104也一直是正常选择样式,并无变化。这对用户来说并不太友好,用户无法从视觉上得知图标103对应在手机上显示的图标105是否可以进行下一步操作。
本申请实施例提供一种界面显示方法,该方法可以应用于多终端协同使用时,投屏源端将其显示屏上显示的界面投射到投屏目的端显示屏上显示的场景中。本实施例提供的方法,在用户通过操作投屏目的端的输入设备,如鼠标或触摸板将光标移动到投屏界 面的内容上时,投屏界面中的该内容和/或光标会进行相应的视觉反馈,如投屏界面中的内容呈现高亮的背景,光标样式进行相应变化。这样,用户可从视觉上确定投屏界面中的该内容对应在投屏源端上显示的内容是否可以进行下一步操作,提高了用户的使用体验。
需要说明的是,本实施例中所述的光标也可以称为鼠标指针。光标可以是一个图像,其可以是动态的也可以是静态的,在不同情况下光标的样式也可能有所不同。本实施例中的内容可以为控件等显示在界面中的可操作的元素,也可以为显示在界面中不可操作的元素。一个元素可以包括以下内容中的一种或多种:文字,按钮,图标等。
下面将结合附图对本申请实施例的实施方式进行详细描述。
请参考图2,为本申请实施例提供的一种可以应用上述方法的系统架构的简化示意图。如图2所示,该系统架构至少可以包括:第一终端201和第二终端202。
其中,第一终端201与输入设备201-1连接(如图2所示),或包括输入设备201-1(图2中未示出)。作为一种示例,该输入设备201-1可以为鼠标,触摸板等。图2中以输入设备201-1是鼠标为例示出。
在本实施例中,第一终端201和第二终端202可通过有线或无线的方式建立连接。基于建立的连接,第一终端201和第二终端202可配合一起使用。在本实施例中,第一终端201和第二终端202采用无线方式建立连接时采用的无线通信协议可以为无线保真(wireless fidelity,Wi-Fi)协议、蓝牙(Bluetooth)协议、ZigBee协议、近距离无线通信(Near Field Communication,NFC)协议等,还可以是各种蜂窝网协议,在此不做具体限制。
在第一终端201与第二终端202连接后,第一终端201和第二终端202中的投屏源端可将其显示屏上显示的界面投射到投屏目的端显示屏上显示。如,以第一终端201作为投屏目的端,第二终端202作为投屏源端为例。第二终端202可将其显示屏上显示的界面投射到第一终端201的显示屏上显示。之后,用户使用第一终端201的输入设备201-1,通过在第一终端201显示屏上显示的投屏界面上进行操作,便可实现对第二终端202中显示的实际界面的操作。
在本申请实施例中,在用户对第一终端201显示屏上显示的投屏界面进行操作的过程中,用户通过操作第一终端201的输入设备201-1,如鼠标或触摸板,将光标在第一终端201的显示屏上移动到投屏界面的内容,如控件上时,借助键鼠共享技术,第一终端201可使投屏界面中的控件和/或光标进行相应的视觉反馈,如投屏界面中的控件呈现高亮的背景,光标样式进行相应变化,以便用户能够从视觉上得知投屏界面中的该控件对应在第二终端202上显示的控件是否可以进行下一步操作。
需要说明的是,本申请实施例中的终端,如上述第一终端201,又如上述第二终端202,可以为手机,平板电脑,手持计算机,PC,蜂窝电话,个人数字助理(personal digital assistant,PDA),可穿戴式设备(如智能手表),车载电脑,游戏机,以及增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备等,本实施例对终端的具体形式不做特殊限制。其中,图2中以第一终端201为PC,第二终端202为手机为例示出。另外,本实施例提供的技术方案除了可以应用于上述终端(或者说移动终端)外,还可以应用于其他电子设备,如智能家居设备(如电视机)等。
在本实施例中,以终端为手机为例。请参考图3,为本申请实施例提供的一种手机的结构示意图。以下实施例中的方法可以在具有上述硬件结构的手机中实现。
如图3所示,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193以及显示屏194等。可选的,手机还可以包括移动通信模块150,用户标识模块(subscriber identification module,SIM)卡接口195等。
其中,传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本实施例示意的结构并不构成对手机的具体限定。在另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是手机的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,SIM接口,和/或USB接口等。
充电管理模块140用于从充电器接收充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为手机供电。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141也可接收电池142的输入为手机供电。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
当手机包括移动通信模块150时,移动通信模块150可以提供应用在手机上的包括 2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),NFC,红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机可以包括1个或N个显示屏194,N为大于1的正整数。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。当有触摸操作作用于显示屏194,手机根据压力传感器180A检测所述触摸操作强度。手机也可以根据压力传感器180A的检测信号计算触摸的位置。
陀螺仪传感器180B可以用于确定手机的运动姿态。气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。手机可以利用磁传感器180D检测翻盖皮套的开合。加速度传感器180E可检测手机在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。手机可以利用接近光传感器180G检测用户手持手机贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。环境光传感器180L用于感知环境光亮度。指纹传感器180H用于采集指纹。手机可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。温度传感器180J用于检测温度。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于手机的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
当手机包括SIM卡接口195时，SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195，或从SIM卡接口195拔出，实现和手机的接触和分离。手机可以支持1个或N个SIM卡接口，N为大于1的正整数。手机通过SIM卡和网络交互，实现通话以及数据通信等功能。在一些实施例中，手机采用eSIM，即：嵌入式SIM卡。eSIM卡可以嵌在手机中，不能和手机分离。
结合图2,本申请实施例以第一终端201的软件系统是windows系统,第二终端202的软件系统是Android系统为例,示例性说明第一终端201和第二终端202的软件结构。请参考图4,为本申请实施例提供的一种软件架构的组成示意图。
其中,如图4所示,第一终端201的软件架构可以包括:应用层和windows系统(windows shell)。在一些实施例中,应用层可以包括安装在第一终端201的各个应用。应用层的应用可直接与windows系统交互。示例性的,应用层还可以包括投屏服务模块。
第二终端202的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。以第二终端202的软件系统是分层架构为例。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,如图4所示,第二终端202可以包括应用层和框架层(framework,FWK)。应用层可以包括一系列应用程序包。例如,应用程序包可以包括设置,计算器,相机,短信息,音乐播放器等应用。应用层包括的应用可以是第二终端202的系统应用,也可以是第三方应用,本申请实施例在此不做具体限制。应用层还可以包括投屏服务模块。框架层主要负责为应用层的应用提供应用编程接口(application programming interface,API)和编程框架。当然,第二终端202还可以包括其他层,如内核层(图4中未示出)等。该内核层是硬件和软件之间的层。内核层至少可以包含显示驱动,摄像头驱动,音频驱动,传感器驱动等。
在本申请实施例中,以第一终端201作为投屏目的端,第二终端202作为投屏源端为例。在第二终端202将其显示屏上显示的界面投射到第一终端201的显示屏上显示之后,如果用户通过操作第一终端201的输入设备201-1,如鼠标或触摸板,将光标在第一终端201的显示屏上移动到投屏界面的内容,如控件上时,在该控件对应在第二终端202上显示的控件可以操作的情况下,基于上述软件架构,并借助键鼠共享技术,第一终端201能够使得投屏界面中的该控件和/或光标进行相应的视觉反馈,如该控件呈现高亮的背景,光标样式进行相应变化。这样,可使得用户能够从视觉上得知投屏界面中的控件对应在第二终端202上显示的控件是否可以进行下一步操作。其中,键鼠共享技术可以是指用一个终端的输入设备(如鼠标,触摸板),实现对其他终端控制的技术。
以下结合图2和图4，以第一终端201为PC，第二终端202为手机，输入设备201-1为鼠标为例，结合附图对本申请实施例提供的界面显示方法进行详细介绍。
图5为本申请实施例提供的一种界面显示方法的流程示意图。如图5所示,该方法可以包括以下S501-S511。
S501、在手机与PC建立连接后,手机将手机显示屏上显示的界面投射到PC显示屏上,PC显示投屏界面。
在用户想利用多屏协同实现多个终端之间的协同办公时,在这多个终端之间的连接建立后,作为投屏源端的终端可将其显示屏上显示的界面投射到作为投屏目的端的终端显示屏上显示。例如,以手机作为投屏源端,PC作为投屏目的端为例。手机与PC建立连接。之后,手机可以将其显示屏上显示的界面投射到PC的显示屏上显示。PC可在PC的显示屏上显示投屏界面。
其中,手机与PC建立连接的方式可以有多种。在一些实施例中,手机与PC可以采 用有线的方式建立连接。例如,手机与PC可通过数据线建立有线连接。
在其他一些实施例中，手机与PC可以采用无线的方式建立连接。其中，终端之间采用无线方式建立连接有两点要求，一个是终端之间互相知晓对端的连接信息，另一个是各终端具有传输能力。连接信息可以是终端的设备标识，如互联网协议（internet protocol，IP）地址，端口号或终端登录的账号等。终端登录的账号可以是运营商为用户提供的账号，如华为账号等。终端登录的账号还可以为应用账号，如微信账号、优酷账号等。终端具有传输能力可以是近场通信能力，也可以是长距离通信能力。也就是说，终端间，如手机与PC建立连接采用的无线通信协议可以是如Wi-Fi协议或蓝牙协议或NFC协议等近场通信协议，也可以是蜂窝网协议。例如，用户可使用手机触碰PC的NFC标签，手机读取该NFC标签中保存的连接信息，如该连接信息中包括PC的IP地址。之后，手机可根据PC的IP地址采用NFC协议与PC建立连接。又例如，手机与PC均打开了蓝牙功能和Wi-Fi功能。PC可广播蓝牙信号，以发现周围的终端，如PC可显示发现的设备列表，该发现设备列表中可包括PC发现的手机的标识。另外，在PC进行设备发现的过程中也可与发现的设备互相交换连接信息，如IP地址。之后，在PC接收到用户在显示的设备列表中选择该手机的标识的操作后，PC根据手机的IP地址，可采用Wi-Fi协议与该手机建立连接。再例如，手机和PC均接入了蜂窝网，手机与PC登录了同一华为账号。手机与PC可根据该华为账号基于蜂窝网建立连接。
示例性的,结合图2,在本实施例中,以手机与PC采用无线方式建立连接为例。在用户利用多屏协同实现手机和PC之间协同办公的场景下,用户可手动开启PC的投屏服务功能。PC的投屏服务功能也可以自动开启,如在PC开机启动时自动开启。在PC的投屏服务功能开启后,PC应用层的投屏服务模块可开始进行网络监听,以监听是否有终端连接PC。在用户想要将手机的显示界面投射到PC,使用手机与PC协同办公时,用户可打开手机的NFC开关,并使用手机碰触PC的NFC标签。手机可读取到该NFC标签中保存的PC的IP地址。之后,手机和PC会分别显示确认界面,以询问用户是否确认将手机显示界面投射到PC上显示。当用户在确认界面选择确认后,PC,如PC的投屏服务模块可向手机(如手机的投屏服务模块)发送通知投屏的消息。手机接收到该通知投屏的消息后,可根据上述获取到的PC的IP地址,与PC建立连接。
在连接建立成功后,作为投屏源端的手机可将手机显示屏上显示的界面投射到作为投屏目的端的PC的显示屏上。PC显示投屏界面。该投屏界面中显示的内容与手机显示屏上显示的界面(如第一界面)内容相同,或者说该投屏界面中的内容为手机显示屏上显示界面内容的镜像。例如,如图6所示,手机显示屏上当前显示设置界面601。在手机与PC的连接建立成功后,手机可将该设置界面601投射到PC的显示屏上。PC显示投屏界面602。可以看到的是,投屏界面602中的内容与设置界面601中的内容相同。
其中,PC用于显示投屏界面的窗口可以称为投屏窗口。例如,结合图2,在手机将手机显示屏上显示的界面投射到PC的显示屏之前,PC应用层的投屏服务模块可显示投屏窗口。如,PC的投屏服务模块可在PC的投屏服务功能开启后,或者在PC的投屏服务功能开启,且与其他终端(如上述手机)连接建立成功后,显示投屏窗口。PC可以在其整个显示屏上显示投屏窗口,即投屏窗口占据了PC显示屏的全部。PC也可以在其显示屏的部分区域上显示投屏窗口,即投屏窗口中的投屏界面只是PC显示屏上的部分界面,本实施例在此不做具体限制。
在本实施例中，手机将手机显示屏上显示的界面投射到PC的显示屏上显示的具体实现可以是：手机，如手机的投屏服务模块可获取手机当前显示界面对应的数据，并发送给PC。PC接收到该数据后，可根据该数据在PC显示屏上的投屏窗口中显示投屏界面。如，手机的投屏服务模块可通过手机的显示管理器（如该显示管理器是手机框架层的模块）获取手机当前显示界面对应数据，如录屏数据，并发送给PC，即可实现手机显示界面到PC显示屏上的投射显示。
在一些实施例中,可采用分布式多媒体协议(Distributed Multi-media Protocol,DMP)来实现手机显示界面到PC显示屏上的投射显示。例如,在手机接收到PC通知投屏的消息后,手机的投屏服务模块可使用手机的显示管理器(DisplayManager)创建虚拟显示(VirtualDisplay)。如手机的投屏服务模块向手机的显示管理器发送创建VirtualDisplay的请求,手机的显示管理器完成VirtualDisplay的创建后,可将创建的VirtualDisplay返回给手机的投屏服务模块。之后,手机的投屏服务模块可将手机显示屏上显示的界面的绘制移到该VirtualDisplay中。这样,手机的投屏服务模块可获得录屏数据。在手机的投屏服务模块获得录屏数据后,可将录屏数据进行编码后发送给PC。PC的投屏服务模块可接收到对应数据,对该数据进行解码后便可获得录屏数据。之后,PC的投屏服务模块与PC的框架层配合可根据录屏数据,绘制对应界面并显示在投屏窗口中。如PC的框架层可提供一个surfaceview来实现投屏界面在PC端的投射显示。
在其他一些实施例中,也可以采用无线投影(Miracast)实现手机显示界面在PC显示屏上的投射显示,即手机可获取手机显示界面的所有图层,然后将获得的所有图层整合成视频流(或者说称为录屏数据)并编码后通过实时流传输协议(real time streaming protocol,RTSP)协议发送给PC。PC在接收到视频流后可对其进行解码并播放,以实现手机显示界面在PC显示屏上的投射显示。或者,手机可以将手机显示界面进行指令抽取后获得指令流,并获取手机显示界面的层信息等,之后通过将指令流及层信息等发送给PC,用于PC恢复出手机显示屏上显示的界面,以实现手机显示界面在PC上的投射显示。
S502、手机创建虚拟输入设备。
在本实施例中,作为投屏源端的手机还可进行虚拟输入设备的创建,用于在用户使用投屏目的端,如PC的输入设备(如鼠标)对PC上显示的投屏界面进行操作时,可在手机端模拟对应输入事件。手机通过对模拟出的输入事件进行相应响应,可实现PC输入设备对手机的控制。也就是说,用户使用投屏目的端的输入设备不仅可以实现对投屏目的端的控制,还可实现对投屏源端的控制,实现投屏目的端和投屏源端之间的键鼠共享。
作为一种示例性的实现,在PC的键鼠共享模式开启的情况下,可实现PC和手机之间的键鼠共享,即用户可使用PC的输入设备实现对PC和手机两者的控制。
例如,在一些实施例中,在其他终端与PC成功建立连接后,PC可显示弹窗。该弹窗用于询问用户是否开启键鼠共享模式。如果接收到用户选择开启键鼠共享模式的操作,PC可开启键鼠共享模式。
PC在开启键鼠共享模式后,可通知与自身建立了连接的所有终端,或与自身建立了连接且投射界面到PC的终端,键鼠共享模式已开启。如PC与手机建立了连接,且手机投射界面到了PC,则PC会向手机通知键鼠共享模式已开启。如,PC可向手机发送通知消息,该通知消息可用于指示PC的键鼠共享模式已开启。手机在接收到该通知后,可创建一个虚拟输入设备,该虚拟输入设备与常规的如鼠标,触摸板等输入设备的作用相同,可用于手机模拟对应输入事件。例如,以输入设备为鼠标为例,手机创建的该虚拟输入设备与常规的鼠标作用相同,可以看作是PC共享给手机的鼠标,能够用于在手机端模拟 鼠标事件,以实现PC的鼠标对手机的控制。示例性的,以手机的操作系统是Android系统为例。手机可利用linux的uinput能力实现虚拟输入设备的创建。其中,uinput是一个内核层模块,可以模拟输入设备。通过写入/dev/uinput(或/dev/input/uinput)设备,进程可以创建具有特定功能的虚拟输入设备。一旦创建了该虚拟输入设备,其便可模拟对应的事件。类似的,其他与PC建立了连接的终端也会根据接收到通知,进行虚拟输入设备的创建。需要说明的是,如果接收到通知的终端的操作系统是Android系统,则可以利用linux的uinput能力实现虚拟输入设备的创建,或者可以使用人机交互设备(human interface device,HID)协议来实现虚拟输入设备的创建。如果接收到通知的终端的操作系统是IOS系统或windows系统等其他操作系统,则可使用HID协议来实现虚拟输入设备的创建。另外,上述实施例是以终端接收到用于通知PC的键鼠共享模式已开启的通知后,进行虚拟输入设备的创建为例进行说明的。在其他一些实施例中,在终端接收到上述通知后,也可以显示弹窗,以询问用户是否想要使用PC的输入设备实现对本设备的控制。如果接收到用户选择使用PC的输入设备实现对本设备的控制的操作,则再进行虚拟输入设备的创建,否则不创建虚拟输入设备。
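作为一个便于理解的极简示意（以C++编写，并非上述实施例的实际实现；设备名"virtual-mouse"及vendor、product取值均为示意性假设），通过写入/dev/uinput创建一个能够模拟鼠标按键与相对位移的虚拟输入设备，大致可以如下：

#include <fcntl.h>
#include <unistd.h>
#include <cstring>
#include <sys/ioctl.h>
#include <linux/uinput.h>

// 创建虚拟输入设备，返回其文件描述符；后续可向该fd写入input_event以模拟鼠标事件
int createVirtualMouse() {
    // 打开uinput设备节点（部分系统路径为/dev/input/uinput）
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) {
        return -1;  // 无权限或内核未启用uinput
    }

    // 声明该虚拟设备支持的事件类型：鼠标左右键按下/抬起，以及X/Y方向的相对位移
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_KEYBIT, BTN_RIGHT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);

    // 配置虚拟设备的基本信息并完成创建
    struct uinput_setup setup;
    std::memset(&setup, 0, sizeof(setup));
    setup.id.bustype = BUS_VIRTUAL;
    setup.id.vendor = 0x1234;   // 示意值
    setup.id.product = 0x5678;  // 示意值
    std::strcpy(setup.name, "virtual-mouse");
    ioctl(fd, UI_DEV_SETUP, &setup);
    ioctl(fd, UI_DEV_CREATE);
    return fd;
}

该示意中的UI_DEV_SETUP为较新内核提供的接口，旧版本内核可改用uinput_user_dev结构体配合write()完成等价的设备配置。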
又例如,在其他一些实施例中,也可以默认在其他终端,如手机与PC建立连接后,PC自动开启键鼠共享模式,无需用户手动打开。在其他终端,如上述手机与PC建立连接后,也可自动进行虚拟输入设备的创建,无需PC发送通知。或者,在其他终端与PC建立连接后,可以先显示弹窗询问用户是否想要使用PC的输入设备实现对本设备的控制。如果接收到用户选择使用PC的输入设备实现对本设备的控制的操作,则再进行虚拟输入设备的自动创建,否则不创建虚拟输入设备。或者,也可以默认在作为投屏源端的终端投射界面到作为投屏目的端的PC后,PC自动开启键鼠共享模式,无需用户手动打开。作为投屏源端的终端,也可以投射界面到PC后,或在接收到来自PC的通知投屏的消息后,进行虚拟输入设备的创建。
另外,结合图2,由于鼠标是PC的输入设备,在PC开启了键鼠共享模式,其他终端,如手机创建了虚拟输入设备后,一般情况下,暂时是由PC对鼠标的操作进行响应的,或者说用户使用鼠标暂时可实现对PC的控制。在本实施例中,在键鼠共享模式开启后,PC可在确定满足鼠标穿梭条件后,触发由与PC建立了连接的创建了虚拟输入设备的其他终端,如手机对鼠标的操作进行响应,即触发PC和手机之间的键鼠共享。
示例性的,鼠标穿梭条件可以是在PC显示屏上显示的光标,如称为光标1滑入PC显示屏上显示的投屏界面。以输入设备是鼠标为例,用户可通过移动鼠标,使得PC显示屏上显示的光标1滑入PC显示屏上显示的投屏界面,以触发PC和手机之间的键鼠共享。如,该方法还包括以下S503-S504。
S503、PC接收鼠标移动事件,根据鼠标移动事件在PC的显示屏上显示光标1移动的动画。
其中,光标1可以为本申请实施例中的第一光标。
S504、PC在确定光标1进入投屏界面时,向手机发送第一指示信息,用于指示键鼠共享开始。
在用户想通过在投屏目的端的投屏界面上进行操作,以实现对投屏源端实际界面的操作时,用户可通过操作投屏目的端的输入设备,如输入第一操作,使得投屏目的端显示的光标移动到投屏界面上。在本实施例中,当投屏目的端显示的光标进入投屏界面后,投屏目的端与投屏源端可开始进行键鼠共享。
继续以输入设备为鼠标,投屏目的端为PC,投屏源端为手机为例。用户可移动PC的鼠标,使得光标1在PC的显示屏上移动。在光标1在PC的显示屏上移动的过程中,PC可确定光标1是否进入PC显示屏上显示的投屏界面。例如,如上描述,投屏界面显示在PC的投屏窗口中,该投屏窗口可用于监测光标1是否进入投屏界面。如当光标1进入投屏界面时,该投屏窗口可检测到对应事件,该事件用于指示光标1进入投屏窗口,根据该事件PC可确定光标1进入投屏界面。在确定光标1进入PC显示屏上显示的投屏界面时,PC可确定满足鼠标穿梭条件,之后可与手机开始进行键鼠共享。另外,在确定鼠标穿梭开始后,PC还可向手机发送上述第一指示信息,以向手机指示键鼠共享开始。
例如,结合图2和图6,在用户使用PC的输入设备输入移动操作,如用户移动鼠标的过程中,PC可接收到对应的输入事件,如移动事件,该移动事件可以称为鼠标移动事件。根据接收到的鼠标移动事件,PC的windows系统可绘制光标1移动的动画并在PC的显示屏上显示。如图7所示,随着鼠标701的移动,PC在PC的显示屏702上对应显示光标703移动的动画,如图7中所示光标703的移动轨迹如轨迹704所示。在光标703在PC的显示屏702上移动的过程中,PC应用层的投屏服务模块可通过投屏窗口确定光标703是否进入投屏界面705。如,当光标703进入投屏界面705时,该投屏窗口可检测到用于指示光标703进入投屏窗口的事件。在检测到该事件后,投屏窗口可向PC应用层的投屏服务模块发送通知,以通知PC的投屏服务模块光标已进入投屏界面705。在PC的投屏服务模块确定光标703进入投屏界面后,可确定满足鼠标穿梭条件。之后,PC可与手机开始进行键鼠共享。PC的投屏服务模块还可向手机(如手机的投屏服务模块)发送用于指示键鼠共享开始的指示信息。手机接收到该指示信息后,可为接收来自PC的输入事件,如鼠标事件做准备。需要说明的是,以上示例是以PC和手机间的通信通过两者包括的投屏服务模块为例进行说明的,也就是说,该投屏服务模块具有与其他终端通信的功能。在其他一些实施例中,投屏服务模块也可以不具有与其他终端通信的功能,则在该实施例中,PC和手机间的通信可通过其他模块来实现。如PC和手机可还包括传输管理模块,两者之间可通过该模块来实现通信,本实施例在此并不做具体限制。为了便于描述,以下实施例中以PC和手机之间的通信通过投屏服务模块实现为例进行说明。
另外,为了在用户使用投屏目的端的输入设备对投屏界面进行操作时,投屏源端能够准确定位到用户在投屏界面上操作位置对应在投屏源端的内容,如控件,在本实施例中,可以在投屏源端显示光标,如称为光标2,并使该光标2可随用户对投屏目的端的输入设备的操作而移动。其中,光标2可以为本申请实施例中的第二光标。如,该方法还包括以下S505-S508。
S505、PC向手机发送光标1进入投屏界面时的初始坐标位置。
其中,上述初始坐标位置是光标1进入投屏界面时进入点相对于投屏界面(或者说投屏窗口)原点(该原点可以是投屏界面的一个角(如称为第一角),如图7中所示的原点O1)的坐标位置。
S506、手机根据初始坐标位置在手机显示屏上显示光标2。
其中,该光标2为隐形光标,其透明度大于阈值,如光标2的透明度很高,或者完全透明。
在确定鼠标穿梭开始后,作为投屏目的端的PC可获取光标1进入投屏界面时进入点相对于投屏界面原点的坐标位置(即获得上述初始坐标位置),并将该初始坐标位置发送给作为投屏源端的手机。
作为一种示例,在光标1进入投屏界面后,PC可获取光标1进入投屏界面时进入点在PC的显示坐标系中的坐标位置,如称为坐标位置1。PC根据该坐标位置1和投屏界面的左上角在PC的显示坐标系中的坐标位置,如称为坐标位置2,可确定出上述初始坐标位置。
例如，请参考图8，PC的显示坐标系可以为以PC显示屏的左上角为坐标原点（图8中所示的位置O2），X轴从坐标原点O2指向PC显示屏右边缘，Y轴从坐标原点O2指向PC显示屏下边缘的坐标系。结合图7，光标1进入投屏界面的进入点如图7中的进入点706所示。在确定光标1进入投屏界面后，PC可获取到进入点706在PC的显示坐标系中的坐标位置1，如获得的坐标位置1为（a，b）。在显示投屏界面时，用于显示该投屏界面的投屏窗口在PC显示坐标系中的位置对PC而言是已知的，投屏窗口的位置即为投屏界面的位置，则投屏界面左上角在PC的显示坐标系中的坐标位置（如称为坐标位置2）也是已知的，如坐标位置2为（a，c）。PC根据该坐标位置1和坐标位置2可确定进入点706相对于坐标原点O1的坐标位置，即确定初始坐标位置。如PC确定的初始坐标位置为（0，b-c）。
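上述初始坐标位置的计算可以用如下极简示意表示（C++，函数名与变量名均为示意性假设）：

#include <utility>

// 进入点与投屏窗口左上角均为PC显示坐标系中的坐标（单位：像素）
// 初始坐标位置 = 进入点坐标 - 投屏窗口左上角坐标，即进入点相对于原点O1的坐标
std::pair<int, int> initialPosition(int entryX, int entryY, int winX, int winY) {
    return { entryX - winX, entryY - winY };
}

// 例如：进入点为（a，b），投屏窗口左上角为（a，c），则初始坐标位置为（0，b-c）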
手机接收到该初始坐标位置之后,可根据该初始坐标位置,确定出手机上光标2出现的起始位置。手机可在该起始位置显示光标2。
可以理解的是，用于显示投屏界面的窗口为投屏窗口，投屏界面的尺寸由该投屏窗口的尺寸决定，如投屏界面的尺寸与投屏窗口的尺寸相同。该投屏窗口的尺寸可以是预定义的，其与手机的分辨率可以相同，也可以不同。如，投屏窗口的尺寸与手机的分辨率不同，则投屏窗口中的投屏界面与手机投射的界面内容相同，但投屏界面是对手机投射来的界面做拉伸和/或压缩处理后的界面。
在投屏窗口的尺寸与手机分辨率不同的情况下,为了能够使得手机上显示光标2的起始位置与光标1进入投屏界面的位置一致,手机可根据手机的分辨率和投屏界面的尺寸,以初始坐标位置为依据,换算出在手机上光标2出现的起始位置。也就是说,手机接收到来自PC的初始坐标位置后,可根据手机的分辨率,投屏界面的尺寸(或者说投屏窗口的尺寸)和该初始坐标位置,确定出在手机上光标2出现的起始位置。该起始位置为光标2相对于手机显示屏原点(该原点可以是手机显示屏的一个角(如称为第一角))的坐标位置。
其中,投屏窗口的尺寸可以是PC与手机建立连接的过程中,或连接建立成功后PC发送给手机的。
示例性的,如图9所示,假设光标1在投屏界面上显示在点1所示的位置时,对应光标2应显示在手机显示屏上的位置为点2所示的位置。以投屏窗口的尺寸为A1*B1,手机分辨率为A2*B2,点1在坐标系1中的坐标为(x1,y1),点2在坐标系2中的坐标为(x2,y2)为例。其中,坐标系1是以投屏窗口左上角为坐标原点(如图9中的O1),X轴从坐标原点O1指向投屏窗口右边缘,Y轴从坐标原点O1指向投屏窗口下边缘的坐标系。坐标系2是以手机显示屏的左上角为坐标原点(如图9中的O2),X轴从坐标原点O2指向手机显示屏右边缘,Y轴从坐标原点O2指向手机显示屏下边缘的坐标系。手机可以根据投屏窗口的尺寸和手机分辨率,确定光标1在坐标系1中的坐标位置与光标2在坐标系2中的坐标位置的换算关系。如,光标1在坐标系1中X轴的坐标位置与光标2在坐标系2中X轴的坐标位置应满足x1/A1=x2/A2,则在光标1在坐标系1中X轴上的坐标位置已知时,光标2在坐标系2中X轴的坐标位置x2=(A2/A1)*x1,其中,该公式可称为换算关系1,A2/A1可称为换算比例值1。类似的,光标1在坐标系1中Y 轴的坐标位置与光标2在坐标系2中Y轴的坐标位置应满足y1/B1=y2/B2,则在光标1在坐标系1中Y轴上的坐标位置已知时,光标2在坐标系2中Y轴的坐标位置y2=(B2/B1)*y1,其中,该公式可称为换算关系2,B2/B1可称为换算比例值2。如:手机分辨率为2340*1080,投屏窗口的尺寸为1170*540,则换算比例值1等于2,换算比例值2等于2。
在一种可能的实现方式中,手机在接收到来自PC的初始坐标位置后,可根据上述换算关系(如换算关系1和/或换算关系2)确定在手机上光标2出现的起始位置。在另一种可能的实现方式中,手机可预先确定好上述换算比例值1和换算比例值2。在接收到初始坐标位置后,可根据预先确定好的换算比例值1和/或换算比例值2,确定在手机上光标2出现的起始位置。例如,结合图8所示示例,初始坐标位置为(0,b-c),则手机确定出的在手机上光标2出现的起始位置为(0,(B2/B1)*(b-c))。手机根据确定的该起始位置,可在手机显示屏上显示光标2,如结合图7所示,手机在手机显示屏上显示的光标2如图7中的707所示。可以看到的是,手机显示光标2的起始位置与PC上光标1进入投屏界面时进入点的位置是一致的。
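上述换算关系1和换算关系2可以用如下极简示意表示（C++；其中投屏窗口尺寸A1*B1与手机分辨率A2*B2的取值以设备间实际协商结果为准，此处仅为示例）：

#include <utility>

// 将光标1在投屏窗口坐标系（坐标系1）中的坐标，换算为光标2在手机显示坐标系
// （坐标系2）中的坐标：x2 = (A2/A1)*x1，y2 = (B2/B1)*y1
std::pair<double, double> toPhonePosition(double x1, double y1,
                                          double A1, double B1,   // 投屏窗口的宽和高
                                          double A2, double B2) { // 手机分辨率的宽和高
    double scaleX = A2 / A1;  // 换算比例值1
    double scaleY = B2 / B1;  // 换算比例值2
    return { x1 * scaleX, y1 * scaleY };
}

// 例如：投屏窗口尺寸为1170*540，手机分辨率为2340*1080，初始坐标位置为（0，b-c），
// 则换算后的起始位置为（0，2*(b-c)），与前文描述一致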
其中,手机显示的该光标2可以为一个隐形光标,其透明度大于阈值,如光标2的透明度很高,或者完全透明,也可以说该光标2对于用户来说是不可见的。该光标2也可以不是隐形光标,对用户可见,本实施例在此不做限制。为了便于描述,本申请实施例的附图中以光标2对用户可见为例示出。
另外,以上实施例是以投屏目的端获得初始坐标位置后发送给投屏源端,由投屏源端根据初始坐标位置确定投屏源端光标2出现的起始位置为例进行说明的。在其他一些实施例中,投屏目的端获得初始坐标位置后,也可由投屏目的端根据初始坐标位置确定出投屏源端光标2出现的起始位置后,将该起始位置发送给投屏源端,用于投屏源端显示光标2。具体确定过程与投屏源端确定光标2出现的起始位置的确定过程相同,此处不再详细赘述。投屏源端的设备分辨率可以是与投屏目的端建立连接的过程中,或连接建立成功后发送给投屏目的端的。当投屏窗口的尺寸与手机分辨率相同时,则手机可直接根据初始坐标位置在手机上显示光标2,无需进行换算处理。
S507、PC向手机发送鼠标移动事件包含的鼠标操作参数1。
S508、手机接收鼠标操作参数1,根据鼠标操作参数1模拟鼠标移动事件。手机根据鼠标移动事件在手机显示屏上显示光标2移动的动画。
其中,鼠标移动事件可以为本申请实施例中的第一输入事件。鼠标操作参数1可以为本申请实施例中的第一操作参数。
在光标1进入投屏界面后,用户可能会继续操作投屏目的端的输入设备,以使得光标1移动到投屏界面中想要操作的位置。由于在光标1进入投屏界面后,键鼠共享已经开始。在键鼠共享开始后,投屏目的端可以不对用户操作输入设备后接收到的输入事件进行响应,而是通过将输入事件中的操作参数发送给键鼠共享的投屏源端,以便投屏源端对该输入事件进行响应。
其中，当输入设备为鼠标时，输入事件可以包括鼠标移动事件，鼠标按下事件，鼠标抬起事件等。需要特别说明的是，当显示在手机上的光标2是隐形光标时，手机投射到PC的投屏界面中不包含光标，且PC上显示有光标1，因此，为了在用户移动鼠标时，光标（如光标1）可随鼠标的移动而移动，在键鼠共享开始后，投屏目的端不对用户操作输入设备后接收到的输入事件进行响应，具体可以为：投屏目的端对除鼠标移动事件外的其他鼠标事件，如鼠标按下事件和鼠标抬起事件不做响应，而对鼠标移动事件进行响应，以便在用户移动鼠标后，光标1可随之在PC显示屏上移动。
作为一种示例性的实现,投屏目的端,如PC可在开启键鼠共享模式后,挂载钩子(HOOK)。挂载的HOOK在键鼠共享开始后可用于拦截(或者说屏蔽)除鼠标移动事件外的其他输入事件。挂载的HOOK在键鼠共享开始后还可用于获取(或者说捕获)对应输入事件(包括鼠标移动事件和其他输入事件)包含的操作参数。如以输入设备是鼠标为例,该输入事件可以是鼠标事件。也就是说,在光标进入投屏界面后,键鼠共享开始,之后,PC可利用挂载的HOOK拦截除鼠标移动事件外的其他输入事件。PC还可利用挂载的HOOK捕获接收到的鼠标事件中的操作参数,如称为鼠标操作参数,并通过将捕获到的操作参数发送给创建了虚拟输入设备的投屏源端,以便该投屏源端可利用创建的虚拟输入设备模拟对应输入事件,如鼠标事件,进而对其进行响应。这样,对于鼠标移动事件,可使得不仅投屏目的端可以对输入设备输入的操作进行响应,投屏源端也可以对输入设备输入的操作进行响应。对于除鼠标移动事件外的其他输入事件,由于挂载的HOOK会将其拦截,因此投屏目标端不对其进行响应,而是由投屏源端根据投屏目的端发送的操作参数对输入设备输入的操作进行响应。
其中,上述鼠标操作参数可以包括:鼠标按键标志位(用于指示用户对鼠标进行了按下、抬起、移动或滚轮滚动中的何种操作)、坐标信息(在用户移动了鼠标时,用于指示光标移动的X坐标和Y坐标)、滚轮信息(在用户操作了鼠标的滚轮时,用于指示滚轮滚动的X轴距离和Y轴距离)、键位信息(用于指示用户对鼠标的左键、中键或右键中的哪个键进行了操作)。作为一种示例,挂载的HOOK在键鼠共享开始后可根据上述鼠标操作参数中的鼠标按键标志位确定输入事件是否是鼠标移动事件,如果是鼠标移动事件,则不拦截,如果不是鼠标移动事件,则拦截。
当然,也可以通过其他方式(如在PC中注册RAWINPUT)来实现输入事件的拦截和其中操作参数的捕获。或者,还可以通过不同的方式来分别实现输入事件的拦截和其中操作参数的捕获。例如,以输入设备是鼠标为例,PC在开启键鼠共享模式后,可挂载HOOK,并注册RAWINPUT,其中,在键鼠共享开始后,挂载的HOOK可用于拦截除鼠标移动事件外的其他鼠标事件,注册的RAWINPUT可用于捕获鼠标事件中的参数。本实施例在此对鼠标事件的拦截和其中参数的捕获的具体实现不做限制。为了便于描述,以下实施例中以通过挂载HOOK来实现输入事件的拦截和其中操作参数的捕获为例进行介绍。
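以windows平台的低级鼠标钩子（WH_MOUSE_LL）为例，下面给出"放行鼠标移动事件、拦截其他鼠标事件并捕获其中操作参数"这一思路的极简示意（C++；其中sendToSource()为示意性假设的桩函数，操作参数的序列化与发送方式由PC与手机之间的连接通道决定，并非本申请的实际实现）：

#include <windows.h>

static HHOOK g_mouseHook = nullptr;
static bool  g_sharingActive = false;  // 键鼠共享是否已开始（即光标1是否位于投屏界面内）

// 示意性桩函数：实际实现中应将消息类型与info中的坐标、滚轮等操作参数发送给投屏源端
static void sendToSource(WPARAM msg, const MSLLHOOKSTRUCT* info) {
    (void)msg;
    (void)info;
}

static LRESULT CALLBACK LowLevelMouseProc(int nCode, WPARAM wParam, LPARAM lParam) {
    if (nCode == HC_ACTION && g_sharingActive) {
        const MSLLHOOKSTRUCT* info = reinterpret_cast<const MSLLHOOKSTRUCT*>(lParam);
        if (wParam == WM_MOUSEMOVE) {
            // 鼠标移动事件：捕获操作参数但不拦截，windows系统仍可据此移动光标1
            sendToSource(wParam, info);
        } else {
            // 其他鼠标事件（按下、抬起、滚轮）：捕获操作参数并拦截，交由投屏源端响应
            sendToSource(wParam, info);
            return 1;  // 返回非零值即屏蔽该事件
        }
    }
    return CallNextHookEx(g_mouseHook, nCode, wParam, lParam);
}

// 键鼠共享模式开启后挂载HOOK
static void installHook() {
    g_mouseHook = SetWindowsHookExW(WH_MOUSE_LL, LowLevelMouseProc,
                                    GetModuleHandleW(nullptr), 0);
}

// 光标1移出投屏界面后卸载HOOK
static void uninstallHook() {
    if (g_mouseHook != nullptr) {
        UnhookWindowsHookEx(g_mouseHook);
        g_mouseHook = nullptr;
    }
}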
例如,结合图2和图7,以输入设备为鼠标为例。以用户想要打开手机的蓝牙设置界面为例。如图10所示,在用户通过移动PC的鼠标,使得光标1进入投屏界面1001后,用户继续移动鼠标,以使得光标1移动到投屏界面1001上蓝牙选项1002的位置。由于光标1进入投屏界面1001后,键鼠共享已经开始。键鼠共享开始后,挂载的HOOK可拦截除鼠标移动事件外的其他鼠标事件。因此,在用户继续移动鼠标后,PC可接收到对应输入事件,如鼠标移动事件,且该鼠标移动事件不会被挂载的HOOK拦截,该事件会被传输给PC的windows系统。根据接收到的鼠标移动事件,PC的windows系统可继续绘制光标1移动的动画并在PC的显示屏上显示。如图10所示,随着鼠标1003的移动,PC在PC的显示屏上对应显示光标1004移动的动画,如图10中所示光标1004的移动轨迹如轨迹1005所示,可以看到的是,光标1004移动到了投屏界面1001上蓝牙选项1002的位置。其中,光标1004进入投屏界面后,移动到蓝牙选项1002之前,可以认为光标1004显示在投屏界面的一个元素上,该元素可以是本实施例中的第一内容,可以看到的 是,在该第一内容上,光标1004的光标样式为样式1(该样式1可以为本实施例中的第一样式),即正常选择样式。
另外,在键鼠共享开始后,挂载的HOOK可捕获输入事件中的操作参数,因此,在用户继续移动鼠标的过程中,PC,如PC应用层的投屏服务模块可利用挂载的HOOK捕获接收到的鼠标移动事件中的操作参数,如称为鼠标操作参数1,并将该鼠标操作参数1发送给投屏源端的手机。作为一种示例,该鼠标操作参数1可以是:用于指示用户对鼠标进行了移动的鼠标按键标志位,用于指示光标(如光标1)移动的X坐标和Y坐标的坐标信息,滚轮信息(取值为空)和键位信息(取值为空)。其中,该坐标信息是鼠标移动过程中,光标1相较于光标1进入投屏界面时光标1所处位置的相对位移。
手机接收到鼠标操作参数1后,可根据该鼠标操作参数1利用创建的虚拟输入设备模拟出对应的输入事件,如鼠标移动事件,以便手机上的光标2也可以移动到手机显示的实际界面上蓝牙选项的位置。
其中,如S506中的描述,投屏窗口的尺寸与手机分辨率可能不同,因此,在用户移动了PC的鼠标后,为了使得光标2也能够在实际界面中移动到蓝牙选项的位置,手机可根据手机的分辨率和投屏界面的尺寸,以鼠标操作参数1中的坐标信息为依据,换算出在手机上光标2相较于起始位置的相对位移。
例如,类似于图9中的描述,以投屏窗口的尺寸为A1*B1,手机分辨率为A2*B2,用户继续移动鼠标后,光标1在坐标系1中相较于进入点的相对位移为(X3,Y3),光标2在坐标系2中相较于起始位置的相对位移为(X4,Y4)为例。手机可以根据投屏窗口的尺寸和手机分辨率,确定鼠标移动后,光标1在坐标系1中相较于进入点的相对位移与光标2在坐标系2中相较于起始位置的相对位移间的换算关系。如,光标1在坐标系1中X轴的相对位移与光标2在坐标系2中X轴的相对位移应满足X3/A1=X4/A2,则在光标1在坐标系1中X轴上的相对位移已知时,光标2在坐标系2中X轴的相对位移X4=(A2/A1)*X3,其中,该公式可称为换算关系3,A2/A1可称为换算比例值1。类似的,光标1在坐标系1中Y轴的相对位置与光标2在坐标系2中Y轴的相对位移应满足Y3/B1=Y4/B2,则在光标1在坐标系1中Y轴上的相对位移已知时,光标2在坐标系2中Y轴的相对位移Y4=(B2/B1)*Y3,其中,该公式可称为换算关系4,B2/B1可称为换算比例值2。如:手机分辨率为2340*1080,投屏窗口的尺寸为1170*540,则换算比例值1等于2,换算比例值2等于2,也就是说,在投屏窗口上光标1在X轴和Y轴移动的距离,换算到手机上均为原来的2倍。其中,光标1在坐标系1中相较于进入点的相对位移(X3,Y3),可根据鼠标操作参数1中的坐标信息来确定。
在一种可能的实现方式中,手机(如手机应用层的投屏服务模块)在接收到来自PC的鼠标操作参数1后,可根据鼠标操作参数1中的坐标信息和上述换算关系(如换算关系3和/或换算关系4)确定在手机上光标2相较于起始位置的相对位移。或者,手机可预先确定好上述换算比例值1和换算比例值2,在接收到来自PC的鼠标操作参数1后,可根据鼠标操作参数1中的坐标信息,及预先确定好的换算比例值1和/或换算比例值2,确定在手机上光标2相较于起始位置的相对位移。基于确定的相对位移和鼠标操作参数1中的其他参数(如,鼠标按键标志位,滚轮信息和键位信息),手机(如手机的框架层)可利用创建的虚拟输入设备模拟出对应输入事件,如鼠标移动事件。根据该鼠标移动事件,手机的框架层可绘制光标2移动的动画并在手机的显示屏上显示。继续参考图10所示,随着鼠标1003的移动,手机可在手机的显示屏上对应显示光标1006移动的动 画,如图10中所示光标1006的移动轨迹如轨迹1007所示,可以看到的是,由于坐标信息是经过转换后的坐标,因此光标1006移动到了实际界面1008上蓝牙选项1009的位置。这样,用户通过操作PC的鼠标,不仅可使PC显示屏上的光标1移动到投屏界面中想要操作的位置,手机显示屏上的光标2也移动到了对应的位置。
可以理解的是，如S506中的描述，当手机显示的该光标2是一个隐形光标时，在用户通过移动PC的鼠标，使得手机和PC上的光标均移动时，手机上光标2移动的动画对用户来说是不可见的。图10中仅是为了方便理解，示出了光标2移动的动画。
另外,需要说明的是,由于PC与手机的操作系统不同,两者鼠标事件中鼠标操作参数的键值存在差异。因此,手机接收到鼠标操作参数1后,可根据预设映射关系,将接收到的鼠标操作参数1的键位码转换成手机能够识别的键位码。之后,手机利用创建的虚拟输入设备根据转换键位码后的鼠标操作参数1可模拟出手机能够识别的输入事件,如鼠标移动事件,以进行相应的响应。
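在手机侧，基于前文示意中创建的虚拟输入设备，根据换算后的相对位移模拟一次鼠标移动事件的过程可以用如下极简示意表示（C++；fd为createVirtualMouse()返回的文件描述符，emitEvent()为示意性封装）：

#include <unistd.h>
#include <cstring>
#include <linux/uinput.h>

// 向虚拟输入设备写入一条input_event
static void emitEvent(int fd, int type, int code, int value) {
    struct input_event ev;
    std::memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    if (write(fd, &ev, sizeof(ev)) < 0) {
        // 示意代码，省略错误处理
    }
}

// 模拟一次相对位移为(dx, dy)的鼠标移动事件，dx、dy为换算后的相对位移
void injectMouseMove(int fd, int dx, int dy) {
    emitEvent(fd, EV_REL, REL_X, dx);
    emitEvent(fd, EV_REL, REL_Y, dy);
    emitEvent(fd, EV_SYN, SYN_REPORT, 0);  // 同步事件，表示一次完整的输入上报
}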
S509、在光标2移动到手机当前显示界面的控件上时,手机将光标2由样式1变更为样式2。
S510、手机向PC发送样式2的光标类型。
S511、PC根据该光标类型在PC的显示屏上显示光标1,并更新投屏界面。
一般的,对于显示界面中的内容,如控件,为了能够让用户从视觉上得知其是否可以进行下一步操作,在用户通过操作输入设备,如鼠标将光标移动到其上时,对于可以进行下一步操作的控件,该控件和/或光标会有相应的视觉反馈。在本实施例中,当光标1移动到投屏界面中某控件的位置时,如果该控件对应在手机上的控件是可以进行下一步操作的控件,则在投屏界面的该控件和/或光标1会进行相应的视觉反馈。如该控件呈现高亮的背景,又如光标1的样式发生改变。
如S507-S508的描述,在用户通过操作PC的鼠标将光标1移动到投屏界面的某控件上时,手机显示屏上的光标2也移动到了手机显示界面中对应控件的位置。对于手机而言,在光标2移动到显示界面中该控件的位置时,如果该控件可以进行下一步操作,则光标2的光标样式会发生变化,如光标2的光标样式由样式1变更为样式2。如,光标2从界面中的一个内容(如该内容可以为本实施例中的第一内容),移到了另一个内容(如该内容可以为本实施例中的第二内容)上,则该光标2的光标样式为由样式1(该样式1可以为本实施例中的第一样式),变更为了样式2(该样式2可以为本实施例中的第二样式)。
可以理解的是,光标有多种光标样式,不同光标样式的光标类型不同。如图11所示,为本实施例提供的一种光标样式及其对应光标类型的示意图。其中,光标样式可以包括:正常选择样式,链接选择样式,文本选择样式,移动样式和垂直调整样式等。对应的光标类型包括:正常选择,链接选择,文本选择,移动和垂直调整等。当光标处于不同控件上时,光标的光标样式可以不同,也可以相同,具体控件与光标样式的对应关系可以是由第三方应用开发者或设备厂商预先定义并存储在手机中,只要光标移动到控件上时光标样式发生变化即可。在光标2移动到显示界面中某控件的位置时,如果该控件可以进行下一步操作,则手机可根据预存的对应关系,改变光标2的光标样式。例如,继续结合图10所示示例,光标2,即光标1006移动到了实际界面1008上蓝牙选项1009的位置,即光标1006从显示在实际界面1008的第一内容的位置,移到了第二内容,即蓝牙选项1009的位置。第一内容为显示在界面中不能操作的元素。如以预定义光标2在手 机上不能操作的位置时光标样式是正常选择样式,预先定义的光标2在蓝牙选项1009上时光标样式是链接选择样式为例。当光标2从第一内容的位置移动到实际界面1008上蓝牙选项1009的位置时,光标2的光标样式由正常选择样式变更为了链接选择样式。
在手机上光标2的光标样式改变后,手机可将改变后的光标样式的光标类型发送给PC。例如,在手机显示光标2后,可在手机的框架层注册光标样式监听器。这样,在光标2的光标样式发生改变后,该光标样式监听器可监听到光标样式改变的事件。手机的框架层可获得改变后的光标样式,如样式2的光标类型,并通过手机应用层的投屏服务模块发送给PC。
在PC,如PC应用层的投屏服务模块接收到样式2的光标类型后,可根据该光标类型在PC的显示屏上显示光标1。例如,继续结合图10所示示例,光标1,即光标1004由在投屏界面的第一内容处的正常选择样式变更为了链接选择样式。这样,给用户呈现出光标1移动到投屏界面中控件的位置时,光标样式发生了改变的视觉效果。
另外，在光标2移动到手机显示界面中对应内容的位置时，该内容可能会有相应的视觉反馈。示例性的，第一内容的显示方式可以由第一方式变更为第二方式，第二内容的显示方式可以由第三方式变更为第四方式。其中，不同内容变更前的显示方式可以相同，也可以不同。不同内容变更后的显示方式可以相同，也可以不同。如以内容是控件，第一方式和第三方式相同，为未呈现高亮背景，第二方式和第四方式相同，均为呈现高亮背景为例。这样，在光标2移动到手机显示界面中对应控件的位置时，该控件由未呈现高亮背景变更为呈现高亮的背景。可以理解的是，在多屏协同时，手机会实时将手机显示屏上显示的界面投射到PC显示屏上，因此在控件呈现高亮的背景后，该改变也会投射到PC的显示屏上。这样，给用户呈现出光标1移动到投屏界面中控件的位置时，控件给出了相应视觉反馈的效果。
例如,结合图10,以光标2移动到手机显示界面中某蓝牙选项的位置时,该蓝牙选项和光标2均有视觉反馈为例。如图12所示,当用户通过移动PC的鼠标,使得光标1移动到投屏界面1201中蓝牙选项1203的位置时,在PC的投屏界面中蓝牙选项1203呈现高亮背景,且光标1由正常选择样式变更为了链接选择样式。另外,手机上光标2也移动到实际界面1202中蓝牙选项1204的位置,手机实际界面1202中蓝牙选项1204呈现高亮背景,且光标2由正常选择样式变更为了链接选择样式。需要说明的是,光标2对用户可以是不可见的。
之后,继续结合图12,如果用户想要对手机的蓝牙进行设置,则可以使用PC的输入设备,如PC的鼠标输入按下操作(该按下操作可以为本申请实施例中的第二操作,该第二操作也可以是其他操作)。如用户可按下鼠标左键。PC可接收到对应的输入事件,如鼠标按下事件。由于该鼠标按下事件是在键鼠共享开启之后接收到的,因此挂载的HOOK会将该鼠标按下事件拦截,以便PC的windows系统不对其进行响应。另外,PC(如PC应用层的投屏服务模块)可获取该鼠标按下事件中的操作参数,如称为鼠标操作参数2(如利用挂载的HOOK捕获该鼠标操作参数2),并将该鼠标操作参数2发送给手机。其中,鼠标按下事件可以为本申请实施例中的第二输入事件,鼠标操作参数2可以为本申请实施例中的第二操作参数。该鼠标操作参数2可以包括:用于指示用户对鼠标进行了按下的鼠标按键标志位,坐标信息(取值为空),滚轮信息(取值为空)和用于指示用户对鼠标的左键进行了操作的键位信息。需要说明的是,只有在鼠标操作参数中坐标信息取值不为空时,手机需要在模拟对应输入事件之前将该坐标信息进行转换处 理(如上述S508中对应内容的具体描述),如果鼠标操作参数中坐标信息取值为空,则无需进行转换处理,可根据接收到的鼠标操作参数模拟对应输入事件。
手机(如手机应用层的投屏服务模块)接收到该鼠标操作参数2后,可由手机的框架层根据预设映射关系,将接收到的鼠标操作参数2的键位码转换成手机能够识别的键位码后,利用创建的虚拟输入设备根据转换键位码后的鼠标操作参数2模拟出手机能够识别的输入事件,如鼠标按下事件。根据该鼠标按下事件,手机可进行相应的响应,如显示第二界面。如图13所示,手机显示蓝牙设置界面1302。手机将手机显示屏上显示的蓝牙设置界面1302投射到PC显示屏上,则PC显示投屏界面1301。另外,可以理解的是,在手机的显示界面发生变化后,光标2所在位置处的控件也会发生变化,手机可利用光标样式监测器,实时监测光标2的样式,如果确定光标2的样式发生变化,则将变化后的光标样式的光标类型发送给PC,以便PC可对光标1的光标样式进行相应的变更。例如,结合图13,在手机显示蓝牙设置界面1302后,光标1304所在位置处没有控件,则光标1304由链接选择样式变更为了正常选择样式。手机可将正常选择样式的光标类型发送给PC,以便PC在投屏窗口上显示正常选择样式的光标1303。
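其中"预设映射关系"的一种可能形式是将PC侧的键位信息映射为linux输入子系统的按键码，下面给出一个极简示意（C++；映射表中PC侧取值0、1、2仅为假设，实际键位码及映射关系由具体实现预先定义）：

#include <unordered_map>
#include <linux/input.h>

// 示意：将来自PC的键位信息转换为手机（linux输入子系统）能够识别的按键码，
// 未命中映射时返回-1
int toLinuxButtonCode(int pcKeyInfo) {
    static const std::unordered_map<int, int> kButtonMap = {
        {0, BTN_LEFT},    // 假设0表示左键
        {1, BTN_RIGHT},   // 假设1表示右键
        {2, BTN_MIDDLE},  // 假设2表示中键
    };
    auto it = kButtonMap.find(pcKeyInfo);
    return it != kButtonMap.end() ? it->second : -1;
}

// 转换后即可配合前文示意中的emitEvent()模拟按下/抬起事件，例如：
// emitEvent(fd, EV_KEY, BTN_LEFT, 1); emitEvent(fd, EV_SYN, SYN_REPORT, 0);  // 按下
// emitEvent(fd, EV_KEY, BTN_LEFT, 0); emitEvent(fd, EV_SYN, SYN_REPORT, 0);  // 抬起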
在用户在投屏界面中的操作完成后,用户可能会通过操作PC的输入设备,如移动PC的鼠标,使得光标1移出投屏界面。类似的,用于显示投屏界面的投屏窗口还可用于监测光标1是否移出投屏界面。如当光标1移出投屏界面时,该投屏窗口可检测到对应事件,该事件用于指示光标1移出投屏窗口,根据该事件PC可确定光标1移出投屏界面。在光标1移出投屏界面后,PC可确定与手机之间的键鼠共享停止。PC可向手机发送第二指示信息,以向手机指示键鼠共享停止。PC还可卸载HOOK(或者说关闭HOOK),也即取消对输入事件,如鼠标事件的拦截及其中操作参数的捕获。之后,如果用户对PC的输入设备进行了操作,PC便不会拦截接收到输入事件,而是会将接收到的输入事件发送给PC的windows系统,以便PC的windows系统对该输入事件进行响应,也即使得用户可使用PC的鼠标实现对PC的控制。另外,可以理解的是,当用户移动PC的鼠标,使得光标1移出投屏界面的同时,手机上不可见的光标2也会移动到手机显示屏的边缘。在本实施例中,手机在接收到上述第二指示信息后,可恢复光标2的显示,即将光标2设置为可见,这样能确保在手机直接连接鼠标后,可在手机显示屏上正常显示光标。以上是以由PC确定是否停止键鼠共享为例进行说明的,在其他一些实施例中,也可以由手机确定是否停止键鼠共享。如上述实施例中的描述,在用户移动PC的鼠标后,不仅PC显示屏上的光标1可随之移动,手机显示屏上的光标2也可随之移动。因此,手机可以在光标2的移动过程中,监测光标2是否移出了手机显示屏的边缘,在确定光标2移出手机显示屏的边缘后,手机可确定与PC之间的键鼠共享停止。手机可向PC发送上述第二指示信息,以向PC指示键鼠共享停止。PC接收到该第二指示信息后,可卸载HOOK。手机也可以在确定出光标2移出了手机显示屏的边缘后,恢复光标2的显示。
需要说明的是,以上实施例是以光标1进入投屏界面时(如光标1在进入点时),对应在手机上起始位置显示的光标2的光标样式没有发生改变,即仍是正常选择样式为例进行说明的。在其他一些实施例中,如果在光标1进入投屏界面时,对应在手机上的光标2的样式便发生了改变,如由正常选择样式变为了链接选择样式,则手机可以将变更后的光标类型发送给PC,以便PC对应更改在进入点时光标1的光标样式。
以上实施例是以在键鼠共享开始后,手机显示不可见的光标2,并在该光标2的样式发生改变后,通过将改变后的光标类型发送给PC,用于PC对应更改光标1的样式, 来实现PC的投屏界面中的光标进行视觉反馈为例进行说明的。在其他一些实施例中,在键鼠共享开始后,PC可将其显示屏上的光标,如光标1隐藏,手机显示可见的光标2。这样,在手机上的光标2移动到可进行下一步操作的控件上时,该光标2的样式可发生相应变化,和/或该控件可进行视觉反馈。由于手机上的界面会实时投射到PC的显示屏上显示,因此在光标2的样式发生变化,和/或控件进行视觉反馈时,投射到PC显示屏上的投屏界面中对应内容也会相应发生改变,这样,也可以给用户以投屏界面中的控件和/或光标进行相应的视觉反馈的效果。具体实现与上述实施例中的描述类似,区别在于,在光标1滑入投屏窗口后,隐藏PC上的光标1,手机上显示可见的光标2;挂载的HOOK在键鼠共享开始后,拦截所有输入事件。其他描述与上述实施例中的描述相同,此处不再一一赘述。
另外,以上实施例是以输入设备是鼠标为例进行说明的,在本实施例中,输入设备也可以是触摸板。当输入设备是触摸板时,用户可使用触摸板的按键(左按键或右按键)输入按下操作,通过手指在触摸板上滑动输入移动操作。用户使用触摸板输入操作以实现本实施例方法的具体实现与使用鼠标输入操作实现本实施例方法的具体实现类似,此处不再一一赘述。
本实施例提供的技术方案,在用户通过操作投屏目的端的输入设备,如鼠标或触摸板将光标移动到投屏界面的控件上时,投屏界面中的该控件和/或光标会进行相应的视觉反馈,如投屏界面中的控件呈现高亮的背景,光标样式进行相应变化。这样,用户可从视觉上确定投屏界面中的该控件对应在投屏源端上显示的控件是否可以进行下一步操作,提高了用户的使用体验。
图14为本申请实施例提供的一种界面显示装置的组成示意图。如图14所示,该装置可以应用于第一终端(如上述PC),该第一终端与第二终端连接,该装置可以包括:显示单元1401和输入单元1402。
显示单元1401,用于在第一终端的显示屏上显示投屏界面,该投屏界面的内容为第二终端显示屏上显示的第一界面内容的镜像。
输入单元1402,用于接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第一终端的显示屏上的第一光标。
其中,在第一光标移动到投屏界面的第一内容上时,第一光标的光标样式为第一样式,和/或,第一内容的显示方式由第一方式变更为第二方式;在第一光标移动到投屏界面的第二内容上时,第一光标的光标样式为第二样式,和/或,第二内容的显示方式由第三方式变更为第四方式。
进一步的,上述投屏界面显示在第一终端显示屏的部分区域上。
显示单元1401,还用于响应于第一操作,在第一终端的显示屏上显示第一光标移动的动画。
该装置还可以包括:发送单元1403,用于在第一光标在第一终端的显示屏上移动的过程中,在确定第一光标进入投屏界面时,向第二终端发送第一光标进入投屏界面的初始坐标位置,向第二终端发送第一操作的数据。
其中，初始坐标位置为第一光标进入投屏界面时相对于投屏界面的第一角的坐标位置，用于第二终端在第二终端的显示屏上显示第二光标；第一操作的数据用于移动第二终端的显示屏上的第二光标，以使得在第一光标移动到第一内容上时，第二光标移动到第一界面与第一内容对应的内容上，第二光标移动到与第一内容对应的内容上时，第二光标的光标样式为第一样式，还使得在第一光标移动到第二内容上时，第二光标移动到第一界面与第二内容对应的内容上，第二光标移动到与第二内容对应的内容上时，第二光标的光标样式为第二样式。
该装置还可以包括:接收单元1404。
接收单元1404,用于在第一光标移动到投屏界面的第一内容上时,接收来自第二终端的第一样式的光标类型。
显示单元1401,还用于根据第一样式的光标类型显示第一光标,以便第一光标显示为第一样式。
接收单元1404,还用于在第一光标移动到投屏界面的第二内容上时,接收来自第二终端的第二样式的光标类型。
显示单元1401,还用于根据第二样式的光标类型显示第一光标,以便第一光标显示为第二样式。
进一步的,第二光标移动到第一内容对应的内容上时,第一界面中第一内容对应的内容的显示方式由第一方式变更为第二方式。显示单元1401,还用于在第一光标移动到投屏界面的第一内容上后,更新投屏界面,更新前的投屏界面中第一内容的显示方式为第一方式,更新后的投屏界面中第一内容的显示方式为第二方式。
在另一种可能的实现方式中,第二光标移动到第二内容对应的内容上时,第一界面中第二内容对应的内容的显示方式由第三方式变更为第四方式。显示单元1401,还用于在第一光标移动到投屏界面的第二内容上后,更新投屏界面,更新前的投屏界面中第二内容的显示方式为第三方式,更新后的投屏界面中第二内容的显示方式为第四方式。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,该装置还可以包括:获取单元1405,用于在第一光标进入投屏界面后,用户使用第一终端的输入设备输入第一操作的过程中,获取接收到的第一输入事件中的第一操作参数,第一输入事件为第一操作对应的移动事件;发送单元1403,具体用于向第二终端发送第一操作参数,该第一操作参数用于第二终端模拟第一输入事件,进而用于移动第二光标。
在另一种可能的实现方式中,输入单元1402,还用于在第一光标移动到投屏界面的第一内容上时,接收用户使用第一终端的输入设备输入的第二操作;发送单元1403,还用于向第二终端发送第二操作的数据,该第二操作的数据用于第二终端显示第二界面;显示单元1401,还用于更新投屏界面,更新后的投屏界面的内容为第二界面内容的镜像。
在另一种可能的实现方式中,获取单元1405,还用于在用户使用第一终端的输入设备输入第二操作后,拦截第二操作对应的第二输入事件,获取第二输入事件中的第二操作参数,发送单元1403,具体用于向第二终端发送第二操作参数,该第二操作参数用于第二终端模拟第二输入事件,进而用于显示第二界面。
在另一种可能的实现方式中,第一操作对应移动事件。获取单元1405,还用于开启输入事件的拦截,用于拦截除移动事件外的其他输入事件。发送单元1403,还用于向第二终端发送第一指示信息,该第一指示信息用于指示共享开始。
在另一种可能的实现方式中,在第一光标移出投屏界面后,发送单元1403,还用于向第二终端发送第二指示信息,该第二指示信息用于指示共享停止;获取单元1405,还用于取消输入事件的拦截。
图15为本申请实施例提供的另一种界面显示装置的组成示意图。该装置可以应用于第二终端（如上述手机），第二终端与第一终端连接。如图15所示，该装置可以包括：显示单元1501，投射单元1502和接收单元1503。
显示单元1501,用于显示第一界面。
投射单元1502,用于将第一界面投射显示到第一终端,以使得第一终端显示投屏界面。
显示单元1501,还用于在第一终端的第一光标进入投屏界面时,在第一界面上显示第二光标。
接收单元1503,用于接收用户使用第一终端的输入设备输入的第一操作,该第一操作用于移动第二终端的显示屏上的第二光标。
显示单元1501,还用于在第二光标移动到第一界面的第一内容上时,将第二光标显示为第一样式,和/或,将第一内容的显示方式由第一方式变更为第二方式,以便在第一光标移动到投屏界面与第一内容对应的内容上时,第一光标显示为第一样式,和/或,投屏界面与第一内容对应的内容的显示方式由第一方式变更为第二方式;显示单元1501,还用于在第二光标移动到第一界面的第二内容上时,将第二光标显示为第二样式,和/或,将第二内容的显示方式由第三方式变更为第四方式,以便在第一光标移动到投屏界面与第二内容对应的内容上时,第一光标显示为第二样式,和/或,投屏界面与第二内容对应的内容的显示方式由第三方式变更为第四方式。
在一种可能的实现方式中,该装置还可以包括:发送单元1504,用于在将第二光标显示为第一样式后,向第一终端发送第一样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第一样式;发送单元1504,还用于在将第二光标显示为第二样式之后,向第一终端发送第二样式的光标类型,用于第一终端显示第一光标,以便第一光标显示为第二样式。
在另一种可能的实现方式中,第二光标的透明度大于阈值。
在另一种可能的实现方式中,接收单元1503,还用于接收来自第一终端的第一光标进入投屏界面的初始坐标位置。该装置还可以包括:确定单元1505,用于根据初始坐标位置,投屏界面的尺寸和第二终端的分辨率确定起始位置,该起始位置可以为相对于第二终端显示屏的第一角的坐标位置;显示单元1501,具体用于在起始位置显示第二光标。
在另一种可能的实现方式中，接收单元1503，具体用于接收来自第一终端的第一操作参数，该第一操作参数是第一光标进入投屏界面后，用户使用第一终端的输入设备输入第一操作的过程中第一终端接收到的第一输入事件中的操作参数，第一操作参数包括第一光标相较于初始坐标位置的相对位移；确定单元1505，还用于根据第一光标相较于初始坐标位置的相对位移，确定第二光标相较于起始位置的相对位移；该装置还可以包括：模拟单元1506，用于根据确定出的第二光标相较于起始位置的相对位移，及第一操作参数中的其他参数模拟第一输入事件。
显示单元1501,还用于根据第一输入事件在第二终端的显示屏上显示第二光标移动的动画。
在另一种可能的实现方式中,接收单元1503,还用于在第二光标移动到第一界面的第一内容上时,接收用户使用第一终端的输入设备输入的第二操作;显示单元1501,还用于响应于第二操作,显示第二界面;投射单元1502,还用于将第二界面投射显示到第一终端,以使得第一终端更新后的投屏界面的内容为第二界面内容的镜像。
在另一种可能的实现方式中,接收单元1503,具体用于接收来自第一终端的第二操 作参数,该第二操作参数是在第一光标移动到投屏界面与第一内容对应的内容上时,用户使用第一终端的输入设备输入第二操作后,第一终端拦截到的第二输入事件中包括的操作参数;模拟单元1506,用于根据第二操作参数模拟第二输入事件,第二输入事件用于显示第二界面。
在另一种可能的实现方式中,接收单元1503,还用于接收来自第一终端的第一指示信息,第一指示信息用于指示共享开始。
在另一种可能的实现方式中,接收单元1503,还用于接收来自第一终端的第二指示信息,第二指示信息用于指示共享停止,第二指示信息是第一终端在确定第一光标移出投屏界面后发送的。
本申请实施例还提供一种界面显示装置,该装置可以应用于电子设备,如上述实施例中的第一终端或第二终端。该装置可以包括:处理器;用于存储处理器可执行指令的存储器;其中,处理器被配置为执行指令时使得该界面显示装置实现上述方法实施例中手机或PC执行的各个功能或者步骤。
本申请实施例还提供一种电子设备(该电子设备可以是终端,如可以为上述实施例中的第一终端或第二终端),该电子设备可以包括:显示屏、存储器和一个或多个处理器。该显示屏、存储器和处理器耦合。该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令。当处理器执行计算机指令时,电子设备可执行上述方法实施例中手机或PC执行的各个功能或者步骤。当然,该电子设备包括但不限于上述显示屏、存储器和一个或多个处理器。例如,该电子设备的结构可以参考图3所示的手机的结构。
本申请实施例还提供一种芯片系统,该芯片系统可以应用于电子设备,如前述实施例中的终端(如第一终端或第二终端)。如图16所示,该芯片系统包括至少一个处理器1601和至少一个接口电路1602。该处理器1601可以是上述电子设备中的处理器。处理器1601和接口电路1602可通过线路互联。该处理器1601可以通过接口电路1602从上述电子设备的存储器接收并执行计算机指令。当计算机指令被处理器1601执行时,可使得电子设备执行上述实施例中手机或PC执行的各个步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机可读存储介质,用于存储电子设备,如上述终端(如手机或PC)运行的计算机指令。
本申请实施例还提供一种计算机程序产品,包括电子设备,如上述终端(如手机或PC)运行的计算机指令。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示 的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (23)

  1. 一种界面显示方法,其特征在于,应用于第一终端,所述第一终端与第二终端连接,所述方法包括:
    所述第一终端在所述第一终端的显示屏上显示投屏界面,所述投屏界面的内容为所述第二终端显示屏上显示的第一界面内容的镜像;
    所述第一终端接收用户使用所述第一终端的输入设备输入的第一操作,所述第一操作用于移动所述第一终端的显示屏上的第一光标;
    其中,在所述第一光标移动到所述投屏界面的第一内容上时,所述第一光标的光标样式为第一样式,和/或,所述第一内容的显示方式由第一方式变更为第二方式;在所述第一光标移动到所述投屏界面的第二内容上时,所述第一光标的光标样式为第二样式,和/或,所述第二内容的显示方式由第三方式变更为第四方式。
  2. 根据权利要求1所述的方法,其特征在于,所述投屏界面显示在所述第一终端显示屏的部分区域上;
    所述方法还包括:
    响应于所述第一操作,所述第一终端在所述第一终端的显示屏上显示所述第一光标移动的动画;
    在所述第一光标在所述第一终端的显示屏上移动的过程中,所述第一终端在确定所述第一光标进入所述投屏界面时,向所述第二终端发送所述第一光标进入所述投屏界面的初始坐标位置,向所述第二终端发送所述第一操作的数据;
    其中,所述初始坐标位置为所述第一光标进入所述投屏界面时相对于所述投屏界面的第一角的坐标位置,用于所述第二终端在所述第二终端的显示屏上显示第二光标;所述第一操作的数据用于移动所述第二终端的显示屏上的所述第二光标,以使得在所述第一光标移动到所述第一内容上时,所述第二光标移动到所述第一界面与所述第一内容对应的内容上,所述第二光标移动到与所述第一内容对应的内容上时,所述第二光标的光标样式为所述第一样式,还使得在所述第一光标移动到所述第二内容上时,所述第二光标移动到所述第一界面与所述第二内容对应的内容上,所述第二光标移动到与所述第二内容对应的内容上时,所述第二光标的光标样式为所述第二样式;
    在所述第一光标移动到所述投屏界面的所述第一内容上时,所述第一终端接收来自所述第二终端的所述第一样式的光标类型,根据所述第一样式的光标类型显示所述第一光标,以便所述第一光标显示为所述第一样式;
    在所述第一光标移动到所述投屏界面的所述第二内容上时,所述第一终端接收来自所述第二终端的所述第二样式的光标类型,根据所述第二样式的光标类型显示所述第一光标,以便所述第一光标显示为所述第二样式。
  3. 根据权利要求2所述的方法,其特征在于,所述第二光标移动到所述第一内容对应的内容上时,所述第一界面中所述第一内容对应的内容的显示方式由所述第一方式变更为所述第二方式;
    所述方法还包括:
    在所述第一光标移动到所述投屏界面的所述第一内容上后,所述第一终端更新所述投屏界面,更新前的所述投屏界面中所述第一内容的显示方式为所述第一方式,更新后的所述投屏界面中所述第一内容的显示方式为所述第二方式;
    所述第二光标移动到所述第二内容对应的内容上时,所述第一界面中所述第二内容对应的内容的显示方式由所述第三方式变更为所述第四方式;
    所述方法还包括:
    在所述第一光标移动到所述投屏界面的所述第二内容上后,所述第一终端更新所述投屏界面,更新前的所述投屏界面中所述第二内容的显示方式为所述第三方式,更新后的所述投屏界面中所述第二内容的显示方式为所述第四方式。
  4. 根据权利要求2或3所述的方法,其特征在于,所述第二光标的透明度大于阈值。
  5. 根据权利要求2-4中任一项所述的方法,其特征在于,所述向所述第二终端发送所述第一操作的数据,包括:
    在所述第一光标进入所述投屏界面后,用户使用所述第一终端的输入设备输入所述第一操作的过程中,所述第一终端获取接收到的第一输入事件中的第一操作参数,所述第一输入事件为所述第一操作对应的移动事件;
    所述第一终端向所述第二终端发送所述第一操作参数,所述第一操作参数用于所述第二终端模拟所述第一输入事件,进而用于移动所述第二光标。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述方法还包括:
    在所述第一光标移动到所述投屏界面的所述第一内容上时,所述第一终端接收用户使用所述第一终端的输入设备输入的第二操作;
    所述第一终端向所述第二终端发送所述第二操作的数据,所述第二操作的数据用于所述第二终端显示第二界面;
    所述第一终端更新所述投屏界面,更新后的所述投屏界面的内容为所述第二界面内容的镜像。
  7. 根据权利要求6所述的方法,其特征在于,所述第一终端向所述第二终端发送所述第二操作的数据,包括:
    在用户使用所述第一终端的输入设备输入所述第二操作后,所述第一终端拦截所述第二操作对应的第二输入事件;
    所述第一终端获取并向所述第二终端发送所述第二输入事件中的第二操作参数,所述第二操作参数用于所述第二终端模拟所述第二输入事件,进而用于显示所述第二界面。
  8. 根据权利要求2-7中任一项所述的方法,其特征在于,所述第一操作对应移动事件;
    在所述第一光标进入所述投屏界面后,所述方法还包括:
    所述第一终端开启输入事件的拦截,用于拦截除所述移动事件外的其他输入事件;
    所述第一终端向所述第二终端发送第一指示信息,所述第一指示信息用于指示共享开始。
  9. 根据权利要求8所述的方法,其特征在于,在所述第一光标移出所述投屏界面后,所述方法还包括:
    所述第一终端取消输入事件的拦截;
    所述第一终端向所述第二终端发送第二指示信息,所述第二指示信息用于指示共享停止。
  10. 一种界面显示方法,其特征在于,应用于第二终端,所述第二终端与第一终端连接,所述方法包括:
    所述第二终端显示第一界面，将所述第一界面投射显示到所述第一终端，以使得所述第一终端显示投屏界面；
    所述第二终端在所述第一终端的第一光标进入所述投屏界面时,在所述第一界面上显示第二光标;
    所述第二终端接收用户使用所述第一终端的输入设备输入的第一操作,所述第一操作用于移动所述第二终端的显示屏上的所述第二光标;
    在所述第二光标移动到所述第一界面的第一内容上时,所述第二终端将所述第二光标显示为第一样式,和/或,将所述第一内容的显示方式由第一方式变更为第二方式,以便在所述第一光标移动到所述投屏界面与所述第一内容对应的内容上时,所述第一光标显示为所述第一样式,和/或,所述投屏界面与所述第一内容对应的内容的显示方式由所述第一方式变更为所述第二方式;
    在所述第二光标移动到所述第一界面的第二内容上时,所述第二终端将所述第二光标显示为第二样式,和/或,将所述第二内容的显示方式由第三方式变更为第四方式,以便在所述第一光标移动到所述投屏界面与所述第二内容对应的内容上时,所述第一光标显示为所述第二样式,和/或,所述投屏界面与所述第二内容对应的内容的显示方式由所述第三方式变更为所述第四方式。
  11. 根据权利要求10所述的方法,其特征在于,在所述第二终端将所述第二光标显示为第一样式之后,所述方法还包括:所述第二终端向所述第一终端发送所述第一样式的光标类型,用于所述第一终端显示所述第一光标,以便所述第一光标显示为所述第一样式;
    在所述第二终端将所述第二光标显示为第二样式之后,所述方法还包括:所述第二终端向所述第一终端发送所述第二样式的光标类型,用于所述第一终端显示所述第一光标,以便所述第一光标显示为所述第二样式。
  12. 根据权利要求10或11所述的方法,其特征在于,所述第二光标的透明度大于阈值。
  13. 根据权利要求10-12中任一项所述的方法,其特征在于,所述第二终端在所述第一终端的第一光标进入所述投屏界面时,在所述第一界面上显示第二光标,包括:
    所述第二终端接收来自所述第一终端的所述第一光标进入所述投屏界面的初始坐标位置;
    所述第二终端根据所述初始坐标位置,所述投屏界面的尺寸和所述第二终端的分辨率确定起始位置,所述起始位置为相对于所述第二终端显示屏的第一角的坐标位置;
    所述第二终端在所述起始位置显示所述第二光标。
  14. 根据权利要求13所述的方法,其特征在于,所述第二终端接收用户使用所述第一终端的输入设备输入的第一操作,包括:
    所述第二终端接收来自所述第一终端的第一操作参数,所述第一操作参数是所述第一光标进入所述投屏界面后,用户使用所述第一终端的输入设备输入所述第一操作的过程中所述第一终端接收到的第一输入事件中的操作参数,所述第一操作参数包括所述第一光标相较于所述初始坐标位置的相对位移;
    所述第二终端根据所述第一光标相较于所述初始坐标位置的相对位移，确定所述第二光标相较于所述起始位置的相对位移；
    所述第二终端根据确定出的所述第二光标相较于所述起始位置的相对位移,及所述第一操作参数中的其他参数模拟所述第一输入事件;
    所述方法还包括:所述第二终端根据所述第一输入事件在所述第二终端的显示屏上显示所述第二光标移动的动画。
  15. 根据权利要求10-14中任一项所述的方法,其特征在于,所述方法还包括:
    在所述第二光标移动到所述第一界面的所述第一内容上时,所述第二终端接收用户使用所述第一终端的输入设备输入的第二操作;
    响应于所述第二操作,所述第二终端显示第二界面,将所述第二界面投射显示到所述第一终端,以使得所述第一终端更新后的投屏界面的内容为所述第二界面内容的镜像。
  16. 根据权利要求15所述的方法,其特征在于,所述第二终端接收用户使用所述第一终端的输入设备输入的第二操作,包括:
    所述第二终端接收来自所述第一终端的第二操作参数,所述第二操作参数是在所述第一光标移动到所述投屏界面与所述第一内容对应的内容上时,用户使用所述第一终端的输入设备输入所述第二操作后,所述第一终端拦截到的第二输入事件中包括的操作参数;
    所述第二终端根据所述第二操作参数模拟所述第二输入事件,所述第二输入事件用于显示所述第二界面。
  17. 根据权利要求10-16中任一项所述的方法,其特征在于,在所述第一终端的所述第一光标进入所述投屏界面时,所述方法还包括:
    所述第二终端接收来自所述第一终端的第一指示信息,所述第一指示信息用于指示共享开始。
  18. 根据权利要求17所述的方法,其特征在于,所述方法还包括:
    所述第二终端接收来自所述第一终端的第二指示信息,所述第二指示信息用于指示共享停止,所述第二指示信息是所述第一终端在确定所述第一光标移出所述投屏界面后发送的。
  19. 一种界面显示装置,其特征在于,包括:处理器;用于存储所述处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令时使得所述界面显示装置实现如权利要求1-9中任一项所述的方法,或者实现如权利要求10-18中任一项所述的方法。
  20. 一种计算机可读存储介质,其上存储有计算机程序指令,其特征在于,所述计算机程序指令被电子设备执行时使得所述电子设备实现如权利要求1-9中任一项所述的方法,或者实现如权利要求10-18中任一项所述的方法。
  21. 一种界面显示系统,其特征在于,包括第一终端和第二终端,所述第一终端与所述第二终端连接;
    所述第二终端,用于显示第一界面,将所述第一界面投射显示到所述第一终端,以使得所述第一终端显示投屏界面;
    所述第一终端,用于在所述第一终端的显示屏上显示所述投屏界面,所述投屏界面的内容为所述第二终端显示屏上显示的所述第一界面内容的镜像;接收用户使用所述第一终端的输入设备输入的第一操作,所述第一操作用于移动所述第一终端的显示屏上的第一光标;
    其中，在所述第一光标移动到所述投屏界面的第一内容上时，所述第一光标的光标样式为第一样式，和/或，所述第一内容的显示方式由第一方式变更为第二方式；在所述第一光标移动到所述投屏界面的第二内容上时，所述第一光标的光标样式为第二样式，和/或，所述第二内容的显示方式由第三方式变更为第四方式。
  22. 根据权利要求21所述的系统,其特征在于,所述投屏界面显示在所述第一终端显示屏的部分区域上;
    所述第一终端,还用于响应于所述第一操作,在所述第一终端的显示屏上显示所述第一光标移动的动画;
    所述第二终端,还用于在所述第一光标进入所述投屏界面时,在所述第一界面上显示第二光标;接收用户使用所述第一终端的输入设备输入的所述第一操作,所述第一操作用于移动所述第二终端的显示屏上的所述第二光标;在所述第二光标移动到所述第一界面与所述第一内容对应的内容上时,将所述第二光标显示为所述第一样式,向所述第一终端发送所述第一样式的光标类型;
    所述第一终端,还用于根据所述第一样式的光标类型显示所述第一光标;
    所述第二终端,还用于在所述第二光标移动到所述第一界面与所述第二内容对应的内容上时,将所述第二光标显示为所述第二样式,向所述第一终端发送所述第二样式的光标类型;
    所述第一终端,还用于根据所述第二样式的光标类型显示所述第一光标。
  23. 根据权利要求22所述的系统,其特征在于,
    所述第二终端,还用于在所述第二光标移动到所述第一界面与所述第一内容对应的内容上时,将所述第一界面与所述第一内容对应的内容的显示方式由所述第一方式变更为所述第二方式;所述第一终端,还用于更新所述投屏界面;
    所述第二终端,还用于在所述第二光标移动到所述第一界面与所述第二内容对应的内容上时,将所述第一界面与所述第二内容对应的内容的显示方式由所述第三方式变更为所述第四方式;所述第一终端,还用于更新所述投屏界面。
PCT/CN2021/114825 2020-08-26 2021-08-26 一种界面显示方法及设备 WO2022042656A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/042,688 US20230333703A1 (en) 2020-08-26 2021-08-26 Interface Display Method and Device
EP21860490.8A EP4195008A4 (en) 2020-08-26 2021-08-26 INTERFACE DISPLAY METHOD AND DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010873983.5 2020-08-26
CN202010873983.5A CN114115629A (zh) 2020-08-26 2020-08-26 一种界面显示方法及设备

Publications (1)

Publication Number Publication Date
WO2022042656A1 true WO2022042656A1 (zh) 2022-03-03

Family

ID=80352704

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/114825 WO2022042656A1 (zh) 2020-08-26 2021-08-26 一种界面显示方法及设备

Country Status (4)

Country Link
US (1) US20230333703A1 (zh)
EP (1) EP4195008A4 (zh)
CN (1) CN114115629A (zh)
WO (1) WO2022042656A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115243082A (zh) * 2022-07-18 2022-10-25 海信视像科技股份有限公司 一种显示设备及终端控制方法
WO2023179129A1 (zh) * 2022-03-24 2023-09-28 海信视像科技股份有限公司 显示设备、投屏设备及基于轨迹提取的设备控制方法

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115373553A (zh) * 2021-05-19 2022-11-22 华为技术有限公司 一种显示方法及终端设备
WO2023068521A1 (ko) * 2021-10-18 2023-04-27 삼성전자주식회사 확장 화면을 공유하는 소스 장치, 싱크 장치 및 그 동작 방법
CN114760291B (zh) * 2022-06-14 2022-09-13 深圳乐播科技有限公司 一种文件处理方法及装置
CN117492671A (zh) * 2022-07-26 2024-02-02 华为技术有限公司 一种投屏方法及设备
CN115185414B (zh) * 2022-07-26 2024-04-19 Vidaa国际控股(荷兰)公司 显示设备及光标控制方法
CN117785085A (zh) * 2022-09-21 2024-03-29 北京字跳网络技术有限公司 虚拟终端设备的信息提示方法、装置、设备、介质及产品
CN118227063A (zh) * 2022-12-21 2024-06-21 华为终端有限公司 一种显示方法、计算机可读存储介质及电子设备
CN117707450A (zh) * 2023-07-11 2024-03-15 荣耀终端有限公司 屏幕协同和键鼠共享的并发方法、设备及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160092154A1 (en) * 2014-09-30 2016-03-31 International Business Machines Corporation Content mirroring
CN111327769A (zh) * 2020-02-25 2020-06-23 北京小米移动软件有限公司 多屏互动方法及装置、存储介质
CN111399789A (zh) * 2020-02-20 2020-07-10 华为技术有限公司 界面布局方法、装置及系统

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120046071A1 (en) * 2010-08-20 2012-02-23 Robert Craig Brandis Smartphone-based user interfaces, such as for browsing print media
KR101621524B1 (ko) * 2012-11-02 2016-05-31 삼성전자 주식회사 디스플레이장치 및 그 제어방법
CN105512086B (zh) * 2016-02-16 2018-08-10 联想(北京)有限公司 信息处理设备以及信息处理方法
CN105843470A (zh) * 2016-03-18 2016-08-10 联想(北京)有限公司 一种信息处理方法及电子设备
CN106873846A (zh) * 2016-12-29 2017-06-20 北京奇虎科技有限公司 一种pc端控制移动设备的方法及系统
US11422765B2 (en) * 2018-07-10 2022-08-23 Apple Inc. Cross device interactions
CN110417992B (zh) * 2019-06-20 2021-02-12 华为技术有限公司 一种输入方法、电子设备和投屏系统
CN110515576B (zh) * 2019-07-08 2021-06-01 华为技术有限公司 显示控制方法及装置
CN110602805B (zh) * 2019-09-30 2021-06-15 联想(北京)有限公司 信息处理方法、第一电子设备和计算机系统
CN110806831A (zh) * 2019-09-30 2020-02-18 华为技术有限公司 一种触摸屏的响应方法及电子设备
CN111221491A (zh) * 2020-01-09 2020-06-02 Oppo(重庆)智能科技有限公司 交互控制方法及装置、电子设备、存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160092154A1 (en) * 2014-09-30 2016-03-31 International Business Machines Corporation Content mirroring
CN111399789A (zh) * 2020-02-20 2020-07-10 华为技术有限公司 界面布局方法、装置及系统
CN111327769A (zh) * 2020-02-25 2020-06-23 北京小米移动软件有限公司 多屏互动方法及装置、存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4195008A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023179129A1 (zh) * 2022-03-24 2023-09-28 海信视像科技股份有限公司 显示设备、投屏设备及基于轨迹提取的设备控制方法
CN115243082A (zh) * 2022-07-18 2022-10-25 海信视像科技股份有限公司 一种显示设备及终端控制方法
CN115243082B (zh) * 2022-07-18 2024-08-23 海信视像科技股份有限公司 一种显示设备及终端控制方法

Also Published As

Publication number Publication date
US20230333703A1 (en) 2023-10-19
CN114115629A (zh) 2022-03-01
EP4195008A4 (en) 2024-02-21
EP4195008A1 (en) 2023-06-14

Similar Documents

Publication Publication Date Title
WO2022042656A1 (zh) 一种界面显示方法及设备
WO2022048500A1 (zh) 一种显示方法及设备
WO2021052279A1 (zh) 一种折叠屏显示方法及电子设备
EP3905786B1 (en) Method and device for controlling networking of smart home device
WO2022022490A1 (zh) 一种跨设备的对象拖拽方法及设备
CN111190748B (zh) 数据共享方法、装置、设备及存储介质
CN115793916A (zh) 显示多窗口的方法、电子设备和系统
CN112130788A (zh) 一种内容分享方法及其装置
WO2021057699A1 (zh) 具有柔性屏幕的电子设备的控制方法及电子设备
CN115657918A (zh) 一种跨设备的对象拖拽方法及设备
WO2022135163A1 (zh) 一种投屏显示方法及电子设备
WO2021197354A1 (zh) 一种设备的定位方法及相关装置
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
CN114143906B (zh) 一种电子设备连接方法及电子设备
WO2022143310A1 (zh) 一种双路投屏的方法及电子设备
WO2022048453A1 (zh) 解锁方法及电子设备
CN111580892B (zh) 一种业务组件调用的方法、装置、终端和存储介质
WO2022206848A1 (zh) 一种应用小部件的显示方法及设备
WO2022111690A1 (zh) 一种共享输入设备的方法、电子设备及系统
WO2022152174A1 (zh) 一种投屏的方法和电子设备
CN115407885A (zh) 一种手写笔的连接控制方法和电子设备
WO2023071590A1 (zh) 输入控制方法及电子设备
CN112035823B (zh) 数据获取方法、装置、终端及存储介质
WO2023025059A1 (zh) 一种通信系统及通信方法
WO2022179273A1 (zh) 一种分布式音频播放方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860490

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021860490

Country of ref document: EP

Effective date: 20230308

NENP Non-entry into the national phase

Ref country code: DE