WO2022135210A1 - Enhanced screen sharing method and system, and electronic device - Google Patents

Enhanced screen sharing method and system, and electronic device

Info

Publication number
WO2022135210A1
WO2022135210A1 (PCT/CN2021/137463)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
mobile phone
event
information
interface
Prior art date
Application number
PCT/CN2021/137463
Other languages
English (en)
French (fr)
Inventor
苏航
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to EP21909207.9A (published as EP4242826A4)
Publication of WO2022135210A1
Priority to US18/336,885 (published as US20230333803A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72406User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by software upgrading or downloading
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present application relates to the field of communications, and in particular, to an enhanced screen sharing method and system, and an electronic device.
  • Screen sharing and interaction refers to technology in which electronic devices with a screen display function, acting as a screen sharing initiator device (hereinafter referred to as the initiator device) and a screen sharing receiver device (hereinafter referred to as the receiver device), share and interact with screen content, media content, and other information between electronic devices.
  • the recipient device can receive relatively simple operations on the shared screen from the user. For example, the recipient device can receive operations such as graffiti on the shared screen by the user.
  • Embodiments of the present application provide an enhanced screen sharing method, system, and electronic device.
  • the receiver device of the screen sharing can receive more types of operations from the user on the sharing interface of the screen sharing, so as to achieve better interaction between the receiver device and the initiator device of the screen sharing and improve the user experience.
  • an implementation of the embodiments of the present application provides an enhanced screen sharing method, including: a first electronic device displays a first sharing interface, and sends interface data of the first sharing interface to a second electronic device; the second electronic device receives the interface data, and displays a second sharing interface corresponding to the first sharing interface according to the interface data; the second electronic device enables a linkage mode with the first electronic device; in the linkage mode, the second electronic device detects the user's trigger operation on the second sharing interface, and determines event information corresponding to the trigger operation, where the event information includes the event type and operation area information corresponding to the trigger operation; the second electronic device sends the event information to the first electronic device; the first electronic device receives the event information, and performs a corresponding operation on the first electronic device according to the event information.
  • the second electronic device can, through the detected trigger operation of the user on the second sharing interface, cause the first electronic device to perform a corresponding operation on the first electronic device, so as to realize the linkage between the second electronic device and the first electronic device. This better realizes the interaction between the second electronic device and the first electronic device, improves the usability and user-friendliness of the sharing interface, and improves the user's experience.
  • the event type may be, for example, a single-click input event, a double-click input event, a long-press input event, a sliding input event, or another input event; the operation area information may be the position coordinate information of the user's trigger position (such as the coordinates of where the user clicked).
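  • As a minimal illustration (not part of the claimed method), the event information described above could be modeled as a small data structure; the names EventType, OperationArea, and EventInfo below are hypothetical and only reflect the fields this text mentions.

```kotlin
// Hypothetical model of the event information: an event type plus operation area
// information (here, the position coordinates of the trigger position).
enum class EventType { SINGLE_CLICK, DOUBLE_CLICK, LONG_PRESS, SLIDE }

data class OperationArea(val x: Float, val y: Float)  // position coordinate information

data class EventInfo(
    val type: EventType,      // event type corresponding to the trigger operation
    val area: OperationArea   // operation area information (e.g., where the user clicked)
)
```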
  • the first shared interface and the second shared interface may be completely identical, and may be collectively referred to as a shared interface.
  • the first shared interface and the second shared interface may also be partially the same.
  • the first electronic device performs a corresponding operation on the first electronic device according to the event information, including: the first electronic device determines a first operation type according to the event type; the first electronic device performs a corresponding operation on the first electronic device according to the first operation type and the operation area information.
  • if the first operation type is an application determination operation, the first electronic device determines, according to the operation area information, the application identification information of the application corresponding to the operation area information on the first shared interface; the first electronic device generates event response information, where the event response information includes the application identification information; the first electronic device sends the event response information to the second electronic device; the second electronic device receives the event response information, and if it determines according to the event response information that the application corresponding to the application identification information is not currently installed on the second electronic device, it downloads or downloads and installs the application corresponding to the application identification information.
  • the first input event is a double-click input event or a long-press input event.
  • through the linkage between the second electronic device and the first electronic device, the second electronic device can download and install the application corresponding to the application on the first electronic device, so that the interaction between the two can be better realized, and the usability and user-friendliness of the shared interface can be improved, thereby improving the user's experience.
  • the method further includes: the first electronic device determines a second operation type according to the first operation type, where the second operation type is an application download or a download-and-install operation; the first electronic device generates event response information, where the event response information includes the application identification information and the second operation type; if the second electronic device determines according to the event response information that the application corresponding to the application identification information is not currently installed on the second electronic device, it downloads or downloads and installs the application corresponding to the application identification information.
  • the linkage between the second electronic device and the first electronic device allows the second electronic device to download and install the application corresponding to the application on the first electronic device, so the interaction between the two can be better realized, and the usability and user-friendliness of the shared interface can be improved, thereby improving the user's experience.
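  • A hedged sketch of the initiator-side logic described above, reusing the illustrative EventInfo types from earlier. The mapping of event types to operation types is an assumption for illustration, and resolveAppIdAt is a hypothetical helper that looks up which application sits under the given coordinates on the first sharing interface.

```kotlin
// Hypothetical first-operation-type and second-operation-type values.
enum class OperationType { APP_DETERMINATION, OBJECT_TRIGGER }
enum class SecondOperationType { DOWNLOAD, DOWNLOAD_AND_INSTALL }

data class EventResponse(
    val appId: String,                         // application identification information
    val secondOp: SecondOperationType? = null  // optional second operation type
)

// Assumed mapping: double-click / long-press -> application determination;
// single-click / slide -> operation object triggering (handled locally, no response built).
fun buildEventResponse(
    event: EventInfo,
    resolveAppIdAt: (OperationArea) -> String
): EventResponse? = when (event.type) {
    EventType.DOUBLE_CLICK, EventType.LONG_PRESS ->
        EventResponse(resolveAppIdAt(event.area), SecondOperationType.DOWNLOAD_AND_INSTALL)
    EventType.SINGLE_CLICK, EventType.SLIDE ->
        null
}
```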
  • if the first operation type is an operation object triggering operation, the first electronic device determines, according to the operation area information, the operation object corresponding to the operation area information on the first sharing interface; the first electronic device executes the operation of triggering the operation object.
  • the second input event is a click input event or a sliding input event.
  • the linkage between the second electronic device and the first electronic device enables the second electronic device to make the first electronic device trigger operation objects and perform similar operations, which can better realize the interaction between the two and improve the usability and efficiency of the shared interface.
  • the method further includes: the second electronic device obtains initial operation area information according to the trigger operation; the second electronic device adjusts the initial operation area information according to the first screen resolution and the second screen resolution to obtain the operation area information; or the first electronic device adjusts the received operation area information according to the first screen resolution and the second screen resolution, and performs the corresponding operation according to the adjusted operation area information and the first operation type.
  • the operation area information of the area where the user performs the trigger operation can be accurately determined.
  • the operation area information is position coordinate information.
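  • One plausible way to perform the resolution adjustment mentioned above is a simple proportional scaling of the coordinates, sketched below under the assumption that both devices display the sharing interface full-screen; the function name is illustrative.

```kotlin
// Scale a coordinate reported on the second device's screen into the first device's
// screen space. Assumes both sharing interfaces fill their screens.
fun adjustOperationArea(
    area: OperationArea,
    secondWidth: Int, secondHeight: Int,  // resolution where the trigger operation happened
    firstWidth: Int, firstHeight: Int     // resolution where the operation will be performed
): OperationArea = OperationArea(
    x = area.x * firstWidth / secondWidth,
    y = area.y * firstHeight / secondHeight
)
```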
  • the second electronic device enables the linkage mode with the first electronic device, including: the first electronic device displays a first linkage control; if the first electronic device detects the user's activation trigger operation on the first linkage control, the first electronic device enables the linkage mode with the second electronic device; or the second electronic device displays a second linkage control; if the second electronic device detects the user's activation trigger operation on the second linkage control, the second electronic device enables the linkage mode with the first electronic device.
  • the linkage mode can be turned on by the first electronic device or by the second electronic device, which can be set as required.
  • if the second electronic device detects a user's activation trigger operation on the second linkage control, the second electronic device activates the linkage mode with the first electronic device, including: if the second electronic device detects the user's activation trigger operation on the second linkage control, the second electronic device sends a linkage request to the first electronic device; the first electronic device receives the linkage request and displays a linkage determination control; if the first electronic device detects the user's trigger operation on the linkage determination control, the first electronic device generates and sends a linkage response agreeing to the linkage to the second electronic device; the second electronic device receives the linkage response and enables the linkage mode with the first electronic device.
  • the method further includes: the second electronic device enables a linkage mode with the first electronic device, and generates linkage operation prompt information; and the second electronic device displays the linkage operation prompt information.
  • the user experience can be improved.
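  • A rough sketch of the request/response handshake for enabling the linkage mode from the receiver side, under the assumptions stated in the comments; the message and class names are hypothetical.

```kotlin
// Hypothetical signaling: the second device sends a linkage request, the first device
// displays its linkage determination control and answers with a linkage response.
sealed class LinkageMessage {
    object LinkageRequest : LinkageMessage()                              // from the second device
    data class LinkageResponse(val accepted: Boolean) : LinkageMessage()  // from the first device
}

// Receiver-side (second device) state handling; `send` stands in for the transport channel.
class LinkageSession(private val send: (LinkageMessage) -> Unit) {
    var linkageEnabled = false
        private set

    fun onLinkageControlTapped() = send(LinkageMessage.LinkageRequest)

    fun onLinkageResponse(response: LinkageMessage.LinkageResponse) {
        linkageEnabled = response.accepted
        // If enabled, the second device could generate and display the linkage
        // operation prompt information here.
    }
}
```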
  • an implementation of the embodiments of the present application provides an enhanced screen sharing method, which is applied to a first electronic device, including: the first electronic device displays a first sharing interface, and sends interface data of the first sharing interface to the second electronic device, so that the second electronic device displays a second shared interface corresponding to the first shared interface according to the interface data; the first electronic device receives event information sent by the second electronic device, and performs a corresponding operation on the first electronic device according to the event information; the event information is information determined by the second electronic device, after it enables the linkage mode with the first electronic device, according to the user's trigger operation on the shared interface in the linkage mode, and the event information includes the event type and operation area information corresponding to the trigger operation.
  • the second electronic device can, through the detected trigger operation of the user on the second sharing interface, cause the first electronic device to perform a corresponding operation on the first electronic device, so as to realize the linkage between the second electronic device and the first electronic device. This better realizes the interaction between the second electronic device and the first electronic device, improves the usability and user-friendliness of the sharing interface, and improves the user's experience.
  • the first electronic device performs a corresponding operation on the first electronic device according to the event information, including: the first electronic device determines a first operation type according to the event type; the first electronic device performs a corresponding operation on the first electronic device according to the first operation type and the operation area information.
  • an implementation of the embodiments of the present application provides an enhanced screen sharing method, which is applied to a second electronic device, including: the second electronic device receives interface data of a first shared interface sent by a first electronic device, and displays a second sharing interface corresponding to the first sharing interface according to the interface data, where the first sharing interface is an interface displayed by the first electronic device; the second electronic device enables the linkage mode with the first electronic device; in the linkage mode, if the second electronic device detects the user's trigger operation on the second sharing interface, it determines event information corresponding to the trigger operation, where the event information includes the event type and operation area information corresponding to the trigger operation; the second electronic device sends the event information to the first electronic device, so that the first electronic device performs a corresponding operation on the first electronic device according to the event information.
  • the second electronic device can, through the detected trigger operation of the user on the second sharing interface, cause the first electronic device to perform a corresponding operation on the first electronic device, so as to realize the linkage with the first electronic device. This better realizes the interaction with the first electronic device, improves the usability and user-friendliness of the sharing interface, and improves the user's experience.
  • implementations of the embodiments of the present application provide an enhanced screen sharing system, including: a first electronic device and a second electronic device; wherein the first electronic device is used to display a first sharing interface and send interface data of the first shared interface to the second electronic device; the second electronic device is used to receive the interface data, and display a second shared interface corresponding to the first shared interface according to the interface data; the second electronic device is also used to enable the linkage mode with the first electronic device; in the linkage mode, the second electronic device is used to determine event information corresponding to a trigger operation when the user's trigger operation on the second sharing interface is detected, where the event information includes the event type and operation area information corresponding to the trigger operation; the second electronic device is further configured to send the event information to the first electronic device; the first electronic device is configured to receive the event information and perform a corresponding operation on the first electronic device according to the event information.
  • the enhanced screen sharing system provided by this implementation includes a first electronic device and a second electronic device that execute the aforementioned enhanced screen sharing method, and therefore can also provide the effects of the enhanced screen sharing method of the aforementioned first aspect or of a possible implementation of the first aspect.
  • implementations of the embodiments of the present application provide an electronic device, including: a memory for storing a computer program, where the computer program includes program instructions; and a processor for executing the program instructions, so that the electronic device executes the aforementioned enhanced screen sharing method.
  • an implementation of the embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, the computer program includes program instructions, and the program instructions are executed by an electronic device to cause the electronic device to execute the aforementioned enhanced screen sharing method.
  • an embodiment of the present application provides a computer program product that, when the computer program product runs on an electronic device, enables the electronic device to execute the aforementioned method for collaboration between electronic devices.
  • FIG. 1 is a schematic diagram of a communication system provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 3 is a block diagram of a software structure of a mobile phone according to an embodiment of the present application.
  • FIGS. 4A-4N are schematic diagrams of some interfaces for interaction between a mobile phone 100 and a mobile phone 200 according to an embodiment of the present application;
  • FIGS. 5A-5C are schematic diagrams of some interfaces in a process of video playback linkage between a mobile phone 100 and a mobile phone 200 according to an embodiment of the present application;
  • FIGS. 6A-6B are schematic diagrams of some interfaces in the process of performing music playback linkage between a mobile phone 100 and a mobile phone 200 according to an embodiment of the present application;
  • FIGS. 7A-7B are schematic diagrams of some interfaces in the process of picture display linkage between the mobile phone 100 and the mobile phone 200 according to an embodiment of the present application;
  • FIGS. 8A and 8B are schematic diagrams of other shared control controls provided by the embodiments of the present application.
  • FIG. 8C is a schematic interface diagram of a graffiti operation between the mobile phone 100 and the mobile phone 200 according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of an interface in a process of linkage between a mobile phone 100 and a tablet computer 300 according to an embodiment of the present application;
  • FIG. 10 is a schematic diagram of a process of application download and installation linkage between a mobile phone 100 and a mobile phone 200 according to an embodiment of the present application;
  • FIG. 11 is a schematic diagram of a process of application opening linkage between a mobile phone 100 and a mobile phone 200 according to an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram showing an electronic device according to some embodiments of the present application.
  • FIG. 13 is a schematic structural diagram of a system on a chip (SoC) according to some embodiments of the present application.
  • FIG. 1 is a schematic diagram of a communication system provided by an embodiment of the present application.
  • the communication system includes a mobile phone 100 and a mobile phone 200.
  • both the mobile phone 100 and the mobile phone 200 are installed with a video calling application with a video calling function and a screen sharing function.
  • the video calling application can use the user's account information and password to log in, thereby realizing the interconnection between the mobile phone 100 and the mobile phone 200, and realizing the interaction through the video calling function and the screen sharing function.
  • logging in reference may be made to the prior art, which is not limited in this embodiment of the present application.
  • the communication system further includes a cloud server 300, and the cloud server 300 is a server corresponding to the video calling application. Communication between the mobile phone 100 and the mobile phone 200 can be performed through the cloud server 300, such as video calls and screen sharing.
  • the user of the mobile phone 100 is the user U1
  • the user of the mobile phone 200 is the user U2.
  • User U1 opens the video call application in mobile phone 100 and initiates a video call request to user U2. If user U2 operates mobile phone 200 to accept the video call request, mobile phone 100 and mobile phone 200 can conduct a video call.
  • the video call interfaces displayed on the mobile phones 100 and 200 can respectively provide more functional controls, such as screen sharing controls. If the user U1 enables the screen sharing mode (or it may be called the screen sharing function, or the interface sharing mode) through the screen sharing control, and the user U2 operates the mobile phone 200 to accept the request to enable the screen sharing mode, the mobile phone 100 may use its real-time interface as the first sharing interface, and the mobile phone 100 sends the display content of the first sharing interface as interface data to the mobile phone 200 in the form of a data stream (or it may also be referred to as a video data stream).
  • the mobile phone 200 receives the interface data of the first sharing interface, and displays a second sharing interface corresponding to the first sharing interface on its own screen according to the interface data, so that the mobile phone 100 can share the screen content of the mobile phone 100 with the mobile phone 200 .
  • the mobile phone 100 is used as a screen sharing initiator device for screen sharing
  • the mobile phone 200 is used as a screen sharing receiver device for screen sharing.
  • the interfaces of the mobile phone 100 and the mobile phone 200 can respectively provide more functional controls, such as linkage controls (which may also be referred to as linkage switches). If the user U2 enables the linkage mode (or it can also be called the linkage function) between the mobile phone 100 and the mobile phone 200 through the linkage control displayed on the mobile phone 200, then in the linkage mode, after the mobile phone 200 receives the trigger operation of the user U2 on the second sharing interface displayed by the mobile phone 200, the mobile phone 200 sends the event information corresponding to the trigger operation of the user U2 to the mobile phone 100.
  • the event information includes the event type and operation area information corresponding to the trigger operation of the user U2, where the event type can be, for example, a single-click input event, a double-click input event, a long-press input event, a sliding input event, or another input event, and the operation area information can be the position coordinate information of the user's trigger location (for example, the coordinates of the location where the user clicks).
  • After receiving the event information sent by the mobile phone 200, the mobile phone 100 can determine, according to the event information, the operation type of the operation that the mobile phone 100 needs to perform, and execute the corresponding operation.
  • In this way, the mobile phone 100 performs an operation corresponding to the trigger operation of the user U2, so as to achieve the purpose of linking the mobile phone 200 with the mobile phone 100. That is, the linkage between the mobile phone 200 and the mobile phone 100 means that the mobile phone 100 can perform, on the mobile phone 100, an operation corresponding to the event information sent by the mobile phone 200.
  • If the mobile phone 200 detects that the user U2 clicks the application icon of an application in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 performs corresponding actions and feedback, such as opening the application, according to the received event information.
  • If the mobile phone 200 detects that the user U2 double-clicks the application icon of an application in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 performs corresponding actions and feedback, such as application download and installation, according to the received event information.
  • If the mobile phone 200 detects that the user U2 clicks a control in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 executes corresponding actions and feedback, such as triggering the control, according to the received event information.
  • the mobile phone 100 displays a first sharing interface
  • the mobile phone 200 displays a second sharing interface corresponding to the first sharing interface sent by the mobile phone 100 .
  • the mobile phone 200 detects the click operation of the user U2 on the second sharing interface displayed by the mobile phone 200
  • the mobile phone 200 determines that the event type corresponding to the click operation of the user U2 is a click input event.
  • the mobile phone 200 sends the event type and the position coordinate information at the position where the user U2 performs the click operation to the mobile phone 100 as event information.
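  • As a hedged sketch of this receiver-side step, a standard Android GestureDetector could classify the user's trigger operation on the second sharing interface and report it as event information; sendEventInfo is a hypothetical transport function, and EventInfo/EventType/OperationArea are the illustrative types sketched earlier, not types named in this text.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Classify single-click, double-click, and long-press trigger operations on the shared
// interface view and forward them as event information (event type + coordinates).
fun attachLinkageListener(
    context: Context,
    sharedInterfaceView: View,
    sendEventInfo: (EventInfo) -> Unit
) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
            sendEventInfo(EventInfo(EventType.SINGLE_CLICK, OperationArea(e.x, e.y)))
            return true
        }

        override fun onDoubleTap(e: MotionEvent): Boolean {
            sendEventInfo(EventInfo(EventType.DOUBLE_CLICK, OperationArea(e.x, e.y)))
            return true
        }

        override fun onLongPress(e: MotionEvent) {
            sendEventInfo(EventInfo(EventType.LONG_PRESS, OperationArea(e.x, e.y)))
        }
    })
    sharedInterfaceView.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
}
```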
  • the mobile phone 100 determines, according to the event type, that the operation type corresponding to the click input event is the operation object triggering operation.
  • the mobile phone 100 determines, according to the position coordinate information, the operation object corresponding to the click operation position of the user U2 on the first sharing interface displayed by the mobile phone 100, and performs the operation of triggering the operation object. For example, if the operation object is an application, the operation of opening the application is performed; if the operation object is a control in the application, the operation of triggering the control is performed.
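  • A rough sketch of this initiator-side step: locate the operation object under the (already resolution-adjusted) coordinates on the first sharing interface and trigger it. This walks the view hierarchy directly for illustration; a real implementation might instead inject a MotionEvent into the window, and the function name is hypothetical.

```kotlin
import android.graphics.Rect
import android.view.View
import android.view.ViewGroup

// Find the topmost clickable view under (x, y) in screen coordinates and trigger it,
// e.g., opening an application when its icon is hit, or triggering a control in an app.
fun triggerOperationObjectAt(root: View, x: Int, y: Int): Boolean {
    val bounds = Rect()
    if (!root.getGlobalVisibleRect(bounds) || !bounds.contains(x, y)) return false
    if (root is ViewGroup) {
        // Search children from topmost to bottommost.
        for (i in root.childCount - 1 downTo 0) {
            if (triggerOperationObjectAt(root.getChildAt(i), x, y)) return true
        }
    }
    return if (root.isClickable) root.performClick() else false
}
```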
  • the mobile phone 200 determines that the event type corresponding to the double-click operation of the user U2 is a double-click input event.
  • the mobile phone 200 sends the event type and the position coordinate information at the position where the user U2 performs the double-click operation to the mobile phone 100 as event information.
  • After the mobile phone 100 receives the event information, it determines according to the event type that the operation type corresponding to the double-click input event is an application determination operation; the mobile phone 100 then determines, according to the position coordinate information, the application name of the application corresponding, on the first sharing interface displayed by the mobile phone 100, to the position where the user U2 performed the double-click operation.
  • The mobile phone 100 sends the application name to the mobile phone 200 as the event response information. After the mobile phone 200 receives the event response information, it determines according to the application name whether the corresponding application has been installed in the mobile phone 200. If the corresponding application is not installed, the mobile phone 200 directly downloads the corresponding application from the application market application in the mobile phone 200 according to the application name, and completes the application installation. If the corresponding application is already installed in the mobile phone 200, the mobile phone 200 does not perform the application download and installation operation; or the mobile phone 200 may display prompt information for informing the user U2 that the corresponding application is currently installed on the mobile phone 200.
  • After the mobile phone 100 determines the application name, it can also determine, according to the operation type "application determination operation", that the corresponding operation the mobile phone 200 needs to perform is application download and installation. The mobile phone 100 then generates the event response information according to the application name and the operation type that the mobile phone 200 needs to perform; that is, the event response information includes the application name and the operation type that the mobile phone 200 needs to perform.
  • After the mobile phone 200 receives the event response information, it can conveniently determine according to the operation type that the application download and installation operation needs to be performed; the mobile phone 200 then determines according to the application name whether the corresponding application has been installed in the mobile phone 200, and when the corresponding application is not installed, downloads the corresponding application from the application market application in the mobile phone 200 and completes the application installation.
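  • A hedged sketch of this receiver-side handling on Android, under the assumption that the application identification information is a package name (the text speaks of an "application name"; the exact form is not specified here). If the app is missing, the application market is opened to its detail page for download and installation.

```kotlin
import android.content.Context
import android.content.Intent
import android.content.pm.PackageManager
import android.net.Uri

fun handleAppDeterminationResponse(context: Context, packageName: String) {
    // Check whether the application is already installed on this device.
    val installed = try {
        context.packageManager.getPackageInfo(packageName, 0)
        true
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }
    if (!installed) {
        // Jump to the application market's detail page to download and install the app.
        val marketIntent = Intent(Intent.ACTION_VIEW, Uri.parse("market://details?id=$packageName"))
            .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        context.startActivity(marketIntent)
    } else {
        // Already installed: the device could display prompt information instead (omitted).
    }
}
```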
  • The mobile phone 200 may also link with the mobile phone 100 to make the mobile phone 100 perform other operations according to other trigger operations received from the user U2 on the shared screen displayed by the mobile phone 200, and is not limited to the aforementioned input event types (such as single-click input events, double-click input events, long-press input events, etc.) or operation objects (such as interface controls, applications, etc.).
  • the aforementioned single-click operation may also be a sliding operation, that is, the input event is a sliding input event; if the mobile phone 200 detects that the user U2 slides the application icon of an application in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 executes corresponding actions and feedback such as opening the application.
  • the double-click operation can also be a long-press operation, that is, the input event is a long-press input event; if the mobile phone 200 detects that the user U2 long-presses the application icon of an application in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 executes corresponding actions and feedback such as application download and installation.
  • the input event may also be a triple-click input event; if the mobile phone 200 detects the triple-click operation by the user U2 on the application icon of an application in the second sharing interface displayed by the mobile phone 200, the mobile phone 100 performs corresponding actions and feedback such as application uninstallation.
  • In this way, the mobile phone 200 can realize linkage with the mobile phone 100 through the trigger operation of the user U2 on the second sharing interface, so that the mobile phone 200 can download and install the application corresponding to the application in the mobile phone 100, or make the mobile phone 100 trigger an operation object in linkage, which can better realize the interaction with the mobile phone 100, improve the usability and user-friendliness of the sharing interface, and improve the user's experience.
  • FIG. 2 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • the mobile phone may be the aforementioned mobile phone 100 or the mobile phone 200 .
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown, some components may be combined or split, or the components may be arranged differently.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the processor can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the processor 110 is configured to make the mobile phone execute the enhanced screen sharing method provided by the embodiments of the present application.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • the mobile communication module 150 may provide wireless communication functions such as 2G/3G/4G/5G applied on the mobile phone.
  • the wireless communication module 160 can provide wireless communication solutions applied to the mobile phone, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), and the like.
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize the shooting function and realize the video calling function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1. Further, the mobile phone includes at least one camera 193 located on the same side as the display screen 194 .
  • Video codecs are used to compress or decompress digital video.
  • a phone can support one or more video codecs.
  • the mobile phone can play or record videos in various encoding formats, such as: Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the mobile phone can realize the packaging and playback of audio and video data and screen recording data, etc.
  • the mobile phone can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as video calls, music playback, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the handset can make video calls, listen to music, or listen to hands-free calls through speaker 170A.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the earphone jack 170D is used to connect wired earphones.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • When a touch operation (which may also be called a trigger operation) acts on the display screen 194, the mobile phone detects the intensity of the touch operation according to the pressure sensor 180A.
  • the mobile phone can also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • The touch sensor 180K, also called a "touch device", may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch panel (Touch Panel, TP), also called a "touch screen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor 180K may pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone, which is different from the location where the display screen 194 is located.
  • the mobile phone can detect the user's trigger operation on the display screen 194 according to the pressure sensor 180A and the touch sensor 180K, can also detect the user's voice input according to the receiver 170B, can detect the user's gesture input according to the camera 193, or can detect the user's input according to another input module such as an input keyboard, which is not limited in this embodiment.
  • FIG. 3 is a software structural block diagram of a mobile phone provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the system of the mobile phone is divided into three layers, which are an application layer, an application framework (Framework) layer and a kernel layer from top to bottom.
  • the system also includes a software development kit (Software Development Kit, SDK).
  • the kernel layer includes a sensor (Sensor) module and a touch screen.
  • the sensor module may be, for example, the aforementioned pressure sensor 180A, which is used to generate a trigger operation electronic signal according to a user's physical trigger operation (such as a user's click operation) on the touch screen.
  • the trigger operation electronic signal may include click input event information corresponding to the user's click operation, and the trigger operation electronic signal may also include operation area information of the position where the user performs the click operation, such as position coordinate information.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer includes an input event reading module, an input event dispatching module and a window management module.
  • the input event reading module may be, for example, InputReader
  • the input event distribution module may be, for example, InputDispatcher
  • the window management module may be, for example, a window manager (Window Manager Service, WMS).
  • the InputReader is used to receive the triggering operation electronic signal sent by the pressure sensor 180A, and continuously fetch events from the EventHub (not shown in the figure) through the thread loop for event translation to determine the input event, and encapsulate the input event and send it to the InputDispatcher.
  • The InputDispatcher holds the information of all windows from the WMS. After the InputDispatcher receives an input event from the InputReader, it finds the appropriate window among the saved windows for event distribution. In the embodiment of the present application, when the mobile phone is used as the receiver device, the InputDispatcher mainly distributes input events to the InCallUI APP in the application layer, so as to trigger further operations of the InCallUI APP.
  • WMS is used to manage the window program.
  • The WMS can obtain the size of the display screen of the mobile phone and provide the position of each window, so that the InputDispatcher can correctly dispatch input events to the specified window.
  • the application layer may include a series of applications, such as including a video call interface application and a video call application.
  • the video call interface application may be, for example, the InCallUI APP
  • the video call application may be, for example, the VoipService APP or the HwVoipService APP.
  • The InCallUI APP is responsible for receiving the input events sent by the InputDispatcher, and for identifying and judging the input events. In addition, when the mobile phone is used as the initiator device, the InCallUI APP is also responsible for checking the validity of the input event and parsing the application information of the application corresponding to the input event.
  • The application interface provided by the InCallUI APP includes some functional controls, which are displayed on the screen in the form of touch controls and can be displayed according to the different scenarios in which the application is located.
  • the function controls may include one or more of controls such as "video call", "screen sharing", "linkage", "graffiti", etc., and the display content of a function control on the screen includes an icon and/or text.
  • the mobile phone opens the video call application, and after entering the video call application interface, the mobile phone displays the "video call” control. If the mobile phone detects the user's trigger operation on the "video call” control, a video call is established between the mobile phone and the peer mobile phone.
  • After the phone establishes a video call, the phone displays the "screen sharing" control. If the mobile phone detects the user's trigger operation on the "screen sharing" control, the mobile phone sends its own current screen as the shared screen to the peer mobile phone in the video call for display.
  • After the mobile phone establishes screen sharing, the mobile phone displays the "linkage" control. If the mobile phone detects the user's trigger operation on the "linkage" control, the mobile phone starts the linkage mode between itself and the peer mobile phone with which it is making a video call and performing screen sharing.
  • the InCallUI APP is also responsible for the display and adaptation of the communication interface.
  • the InCallUI APP includes the interface switching entry of the aforementioned video calling, screen sharing, linkage and other business functions for the display of the corresponding interface.
  • the HwVoipService APP is an application that supports video calls and screen sharing provided by this embodiment of the application, and has a visual user interface.
  • HwVoipService is responsible for business logic control, including providing audio calls, video calls, device discovery, message services, business function switching and other functions. Its capabilities are encapsulated into service APIs for use by InCallUI APP.
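  • The service APIs mentioned above are not specified in this text; purely as an illustration, an interface reflecting the listed capabilities might look like the sketch below. All names are hypothetical and are not the actual HwVoipService API.

```kotlin
// Hypothetical service API surface: audio calls, video calls, device discovery,
// message services, and business function switching, for use by the InCallUI APP.
interface VoipServiceApi {
    fun startAudioCall(contactId: String)
    fun startVideoCall(contactId: String)
    fun discoverDevices(): List<String>
    fun sendMessage(contactId: String, text: String)
    fun switchBusinessFunction(name: String)  // e.g., screen sharing or linkage
}
```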
  • the HwVoipService APP is also responsible for interacting with the SDK.
  • the SDK includes an SDK module, and the SDK module may be, for example, the CaasKit SDK.
  • the CaasKit SDK is responsible for sending signaling, parsing signaling, and interacting with the application layer.
  • application layer can also include applications such as camera, gallery, calling, music, video, etc.
  • the application framework layer can also include an activity manager (Activity Manager Service, AMS), a view system (VIEW system), a content provider, a phone manager, a resource manager, etc.; in addition, it can also include MSDP (Multicast Source Discovery Protocol, Multicast source discovery protocol) perception service and multi-path access service, etc., which are not limited in this embodiment of the present application.
  • AMS, WMS, and VIEW provide capability support for the linkage in this embodiment of the present application.
  • AMS provides basic UX interaction capabilities; WMS is responsible for window area calculation and other capabilities, and provides operation area information.
  • The VIEW system provides capabilities such as operation object monitoring. According to the response or callback of the upper-layer application, it can capture user operation behavior, record operation objects, and provide operation objects and other information.
  • The MSDP perception service and the multi-path access service are basic components of existing audio and video calls, and the kernel layer is based on existing audio and video calls; these are the same as the existing technology and will not be repeated here.
  • the kernel layer is the underlying system, and the underlying system also includes the underlying display system for providing display services. For example, it may also include a surface manager (surface manager), a media library (Media Libraries), and the like.
  • the Surface Manager is used to manage the display system and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes a display driver, a camera driver, an audio driver, and the like, which are not limited in this embodiment of the present application; that is, it mainly includes hardware codecs, the MSDP virtualization devices corresponding to the aforementioned MSDP perception service, and the like.
  • the following first describes the application scenario of the enhanced screen sharing method and the interaction process between devices.
  • the mobile phone 100 and the mobile phone 200 each have only one physical screen, and their respective main desktops are displayed on the screens (as shown in FIG. 4A ).
  • On the main desktop of the mobile phone 100, application names and application icons of applications such as the application market application A1, the video calling application A2, the video playing application A3, and the music application A4 are displayed.
  • Application names and application icons of applications such as the application market application A1 and the video calling application A2 are displayed on the main desktop of the mobile phone 200 .
  • if the mobile phone 100 detects the click operation of the user U1 on the video calling application A2 on the mobile phone 100, the mobile phone 100 starts the video calling application A2 and displays the contact selection interface as shown in FIG. 4B.
  • if the mobile phone 100 detects that the user U1 clicks the video call control 10 corresponding to "user U2" in the contact list, the mobile phone 100 determines that a video call is to be conducted.
  • the mobile phone 100 sends a video call request to the device mobile phone 200 corresponding to the user U2 to request to establish a video call.
  • the mobile phone 200 displays a prompt message 20 "received a video call request". If the mobile phone 200 detects that the user U2 clicks the "accept" control, the mobile phone 200 sends the mobile phone 100 a response for establishing a video call.
  • a video call is established between the mobile phone 100 and the mobile phone 200, and the mobile phone 100 and the mobile phone 200 respectively display the video call interface.
  • the mobile phone 100 usually displays the video call image of the user U1 in the upper right corner of the screen by default (for example, the video call image shown with the girl's avatar), and displays the video call image of the user U2 (for example, the video call image shown with the boy's avatar) in the main area of the screen. In addition, a "more" control 11, a "switch camera" control, and a "hang up" control are displayed at the lower part of the screen of the mobile phone 100. Of course, the mobile phone 100 may also display other controls.
  • the mobile phone 200 displays the video call image of the user U2 in the upper right corner of the screen, and displays the video call image of the user U1 in the main area of the screen. And the mobile phone 200 displays a "more” control 21, a "switch camera” control and a "hang up” control at the lower part of the screen.
  • the mobile phone 100 detects the click operation of the “more” control 11 by the user U1 , the mobile phone 100 displays the “share screen” control 12 . If the mobile phone 100 detects the user U1's click operation on the "share screen” control 12 , the mobile phone 100 sends a screen sharing request to the mobile phone 200 .
  • the mobile phone 200 after receiving the screen sharing request sent by the mobile phone 100 , the mobile phone 200 displays a prompt message 22 “The other party is requesting to share the screen with you”. If the mobile phone 200 detects that the user U2 clicks the "accept" control, the mobile phone 200 sends a response to the mobile phone 100 for agreeing to screen sharing.
  • the first shared interface and the second shared interface may be the same, and are collectively referred to as shared interfaces.
  • the screen sharing mode is enabled between the mobile phone 100 and the mobile phone 200 to perform screen sharing.
  • the video calling application A2 in the mobile phone 100 is automatically switched to run in the background, and the mobile phone 100 displays the main desktop of the mobile phone 100 .
  • the mobile phone 100 sends its own main desktop interface to the mobile phone 200 as a shared interface.
  • the video call application A2 in the mobile phone 200 is also automatically switched to run in the background, and the mobile phone 200 displays the sharing interface sent by the mobile phone 100 .
  • the mobile phone 100 is the initiator of the screen sharing
  • the mobile phone 200 is the receiver of the screen sharing.
  • the mobile phone 100 also displays the initiator-side sharing control control 13 .
  • the initiator-side sharing control control 13 includes sharing prompt information 130 , a linkage control 131 , and an exit sharing control 132 .
  • the sharing prompt information 130 includes the screen sharing status prompt information "sharing" and the time information "00:00" indicating how long the screen sharing has been performed; the sharing time information displays a corresponding value in real time according to the duration of screen sharing between the mobile phone 100 and the mobile phone 200.
  • the mobile phone 100 can enable or disable the linkage mode between the mobile phone 100 and the mobile phone 200 according to the detected click operation of the user U1 on the linkage control 131 .
  • the mobile phone 100 can exit the screen sharing mode and the linkage mode between the mobile phone 100 and the mobile phone 200 according to the detected click operation of the user U1 on the exit sharing control 132 . After the screen sharing mode is exited between the mobile phone 100 and the mobile phone 200, the mobile phone 100 and the mobile phone 200 can display a video call interface as shown in FIG. 4C.
  • the receiver side sharing control control 23 is displayed on the interface of the mobile phone 200 , and the receiver side sharing control control 23 includes sharing prompt information 230 , a linkage control 231 and an exit sharing control 232 .
  • the sharing prompt information 230 includes the sharing state prompt information "the counterparty is sharing” and the time information "00:00" when the screen has been shared.
  • the sharing time information displays a corresponding value in real time according to the time of screen sharing between the mobile phone 100 and the mobile phone 200 .
  • the mobile phone 200 can enable or disable the linkage mode between the mobile phone 100 and the mobile phone 200 according to the detected click operation of the user U2 on the linkage control 231 .
  • the mobile phone 200 can exit the screen sharing mode and the linkage mode between the mobile phone 100 and the mobile phone 200 according to the detected click operation of the user U2 on the exit sharing control 232 .
  • the trigger states of the shared control control 13 on the initiator side of the mobile phone 100 and the shared control control 23 on the receiver side of the mobile phone 200 are synchronized with each other. That is, either of the mobile phone 100 and the mobile phone 200 can start and end the linkage mode according to the corresponding user's click operation on the control. For example, if the mobile phone 100 detects a click operation on the linkage control 131 by the user U1 , the linkage mode is enabled between the mobile phone 100 and the mobile phone 200 . After that, if the mobile phone 200 detects that the user U2 clicks the linkage control 231 , the linkage mode is turned off between the mobile phone 100 and the mobile phone 200 , and the mobile phone 200 ends the linkage with the mobile phone 100 .
  • if the mobile phone 200 detects a click operation on the linkage control 231 by the user U2, the mobile phone 200 sends a linkage request to the mobile phone 100. After receiving the linkage request, the mobile phone 100 displays a prompt message 14 "the counterparty requests linkage". If the mobile phone 100 detects that the user U1 clicks the "OK" control serving as the linkage confirmation control, the mobile phone 100 sends a response agreeing to the linkage to the mobile phone 200.
  • the linkage mode is enabled between the mobile phone 100 and the mobile phone 200 .
  • the linkage control 131 of the mobile phone 100 can display the text “Linkage” to remind the user U1 that the current mobile phone 100 and the mobile phone 200 have enabled the linkage mode.
  • if the mobile phone 200 detects the click operation on the linkage control 231 by the user U2 for the first time, the mobile phone 200 displays the linkage operation prompt information.
  • the linked operation prompt information displayed by the mobile phone 200 includes operation introduction graphic information 24 , description information 25 and confirmation control 26 .
  • the operation introduction graphic information 24 includes a plurality of schematic areas, which are used to indicate that the user can click the corresponding areas to complete the corresponding linkage with the mobile phone 100.
  • the description information 25 is "Double-tap the shared screen with a finger to trigger the installation of the initiator's specific application; single-tap the shared screen with a finger to trigger the initiator's operation”.
  • Confirmation control 26 may display "Got it".
  • the linked operation prompt information may also only include the operation introduction graphic information 24 and the confirmation control 26 , or only the description information 25 and the confirmation control 26 .
  • the linked operation prompt information may also be other information for explaining the function of the linked mode to the user.
  • the mobile phone 200 detects that the user clicks the confirmation control 26 .
  • the mobile phone 200 displays the interface shown in FIG. 4H.
  • the linkage control 231 also displays the text "linkage" to remind the user U2 that the mobile phone 200 and the mobile phone 100 are currently in the linkage mode.
  • the interface shown in FIG. 4H can be directly displayed, that is, the linkage control 231 directly displays the text “Linkage” , instead of displaying the operation instruction interface shown in FIG. 4G .
  • when the mobile phone 200 detects the click operation on the linkage control 231 by the user U2 for the first time, the mobile phone 200 can display the linkage operation prompt information to remind the user how to perform the linkage operation between the mobile phone 200 and the mobile phone 100, which can effectively improve the user experience.
  • the mobile phone 200 may also periodically display the linkage operation prompt information, or display the linkage operation prompt information according to other needs or a user's trigger operation.
  • if the mobile phone 200 detects a double-click operation by the user U2 on the sharing interface, the mobile phone 200 determines that the event type input by the user is a double-click input event.
  • the mobile phone 200 generates event information according to the position coordinate information and the event type at the position where the user U2 performs the double-click operation, that is, the event information includes the event type (double-click input) and the position coordinate information of the user's double-click operation.
  • the mobile phone 200 sends the event information to the mobile phone 100.
  • after the mobile phone 100 receives the event information, it determines according to the event type that the operation type of the operation the mobile phone 100 needs to perform is an application determination operation, and the mobile phone 100 determines, according to the position coordinate information, the application name "video playback application A3" of the application corresponding, on the sharing interface, to the position where the user U2 performed the double-click operation. The mobile phone 100 also determines according to this operation type that, after the mobile phone 200 receives the event response information, the operation type the mobile phone 200 needs to perform is an application download and installation operation. Then the mobile phone 100 sends the application name "video playback application A3" and the operation type "application download and installation" to the mobile phone 200 as event response information.
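  • The exchange just described is carried by two small messages: the event information sent by the receiver (event type plus operation-area coordinates) and the event response returned by the initiator (operation type plus application name). A minimal Kotlin sketch follows, with illustrative field names and enum values; the embodiment specifies what the messages carry, not how they are encoded.

```kotlin
// Hypothetical message types for the linkage exchange; names are illustrative only.
enum class EventType { CLICK, DOUBLE_CLICK, LONG_PRESS, SLIDE }
enum class OperationType { APP_DOWNLOAD_AND_INSTALL, TRIGGER_OPERATION_OBJECT }

// Sent by the receiver (mobile phone 200) to the initiator (mobile phone 100).
data class EventInfo(
    val eventType: EventType,          // e.g. DOUBLE_CLICK for the double-tap described above
    val x: Float,                      // position coordinate information of the operation
    val y: Float
)

// Returned by the initiator after it resolves the coordinates against its shared interface.
data class EventResponse(
    val operationType: OperationType,  // e.g. APP_DOWNLOAD_AND_INSTALL
    val appName: String                // e.g. "video playback application A3"
)
```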
  • after receiving the event response information sent by the mobile phone 100, the mobile phone 200 determines according to the operation type and the application name that the video playback application A3 needs to be downloaded and installed, and the mobile phone 200 further determines whether the video playback application A3 is currently installed in the mobile phone 200.
  • if the video playback application A3 is currently installed in the mobile phone 200, the mobile phone 200 does not perform the download and installation operation of the video playback application A3.
  • the mobile phone 200 may display a prompt message "The video playback application A3 is currently installed, please confirm” (not shown in the figure) to inform the user U2 that the mobile phone 200 has currently installed the video playback application A3.
  • if the video playback application A3 is not currently installed in the mobile phone 200, the mobile phone 200 searches for and downloads the corresponding video playback application A3 from the application market application A1 in the mobile phone 200, and completes the installation of the video playback application A3.
  • the mobile phone 200 may display a prompt message 27 "The video playback application A3 has been successfully installed" to remind the user U2.
  • the prompt information 27 may display "application downloading", etc., to remind the user U2, which can be set as required.
  • the mobile phone 200 can directly display the main desktop of the mobile phone 200 .
  • the main desktop includes the application icon and application name of the video playback application A3 installed on the mobile phone 200.
  • the upper right corner of the screen of the mobile phone 200 may also display the video call image of the user U1.
  • the mobile phone 100 displays the main desktop of the mobile phone 100, and the mobile phone 100 can also display the video call image of the user U2 at the upper right of the screen.
  • in this way, the linkage mode is enabled between the mobile phone 200 and the mobile phone 100, and the mobile phone 200 can perform linkage with the mobile phone 100 through the received double-click operation of the user U2 on an application on the sharing interface, obtain the application name through the mobile phone 100, and conveniently install the application corresponding to the position of the user's double-click operation, which enriches the interaction types between the mobile phone 200 and the mobile phone 100 and improves the user experience.
  • in other implementations of enabling the linkage mode between the mobile phone 100 and the mobile phone 200, after the interface shown in FIG. 4E is displayed, if the mobile phone 200 detects the click operation on the linkage control 231 by the user U2, the mobile phone 200 may also not need to send a linkage request to the mobile phone 100; instead, the linkage mode can be automatically and directly enabled between the mobile phone 200 and the mobile phone 100, and the interface shown in FIG. 4G is displayed.
  • in other implementations, when the mobile phone 200 searches for, downloads, and installs the video playback application A3, the mobile phone 200 may not display the prompt information 27 shown in FIG. 4I, which can be set as required.
  • application market applications installed in the mobile phone 100 and the mobile phone 200 respectively may be the same application market application A1, or may be different applications providing an application download function.
  • the mobile phone 100 can display the main desktop, and the mobile phone 200 can display the sharing interface.
  • neither the linkage control 131 nor the linkage control 231 displays the text "linkage" to remind the user that the mobile phone 100 and the mobile phone 200 are not currently in the linkage mode.
  • the mobile phone 100 may also continue to display the video call interface, and the video call interface further includes a call interface zoom-out control 15. The mobile phone 100 sends the video call interface to the mobile phone 200 as a shared interface, and the mobile phone 200 displays the sharing interface shared by the mobile phone 100.
  • if the mobile phone 200 detects a click operation by the user U2 on the sharing interface, the mobile phone 200 determines that the event type corresponding to the click operation of the user U2 is a click input event.
  • the mobile phone 200 sends the position coordinate information and the event type at the position where the user U2 performs the click operation to the mobile phone 100 as event information.
  • after receiving the event information, the mobile phone 100 determines according to the event type that the operation type is the operation object triggering operation. Then, according to the position coordinate information, the mobile phone 100 determines that the operation object corresponding to the click operation position of the user U2 on the sharing interface is the call interface zoom-out control 15. The mobile phone 100 then executes the operation of triggering the call interface zoom-out control 15, that is, performs the call interface zoom-out operation, and displays the interface shown in FIG. 4N.
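  • The initiator's first step is always the same mapping from the received event type to the operation type it has to perform; only then do the coordinates come into play. A minimal sketch of that dispatch follows, assuming the click/slide versus double-click/long-press split described elsewhere in this application; the enum names are illustrative only.

```kotlin
// Hypothetical mapping from the received event type to the "first operation type"
// that the initiator (mobile phone 100) performs; names are illustrative only.
enum class ReceivedEventType { CLICK, DOUBLE_CLICK, LONG_PRESS, SLIDE }
enum class FirstOperationType { OPERATION_OBJECT_TRIGGER, APPLICATION_DETERMINATION }

fun firstOperationFor(eventType: ReceivedEventType): FirstOperationType = when (eventType) {
    // A click or a slide triggers the operation object under the reported coordinates.
    ReceivedEventType.CLICK, ReceivedEventType.SLIDE ->
        FirstOperationType.OPERATION_OBJECT_TRIGGER
    // A double click or long press asks the initiator to determine the application
    // at the reported position, so that the receiver can download and install it.
    ReceivedEventType.DOUBLE_CLICK, ReceivedEventType.LONG_PRESS ->
        FirstOperationType.APPLICATION_DETERMINATION
}
```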
  • the mobile phone 200 and the mobile phone 100 can also respectively display the video call interface as shown in FIG. 4C .
  • the mobile phone 100 and the mobile phone 200 enable the screen sharing mode and the linkage mode. If the mobile phone 200 detects the click operation of the video playback application A3 by the user U2, the mobile phone 200 determines that the event type corresponding to the click operation of the user U2 is a click input event. The mobile phone 200 sends the position coordinate information and the event type at the position where the user U2 performs the click operation to the mobile phone 100 as event information.
  • after receiving the event information, the mobile phone 100 determines according to the event type that the operation type is the operation object triggering operation, and the mobile phone 100 determines according to the position coordinate information that the operation object corresponding to the click operation position of the user U2 on the sharing interface is the video playback application A3. The mobile phone 100 performs the operation of opening the video playback application A3, and displays the video application interface shown in FIG. 5B.
  • if the mobile phone 200 detects a click operation by the user U2 on "video one" on the sharing interface, the mobile phone 200 determines that the event type corresponding to the click operation of the user U2 is a click input event.
  • the mobile phone 200 sends the position coordinate information and the event type at the position where the user U2 performs the click operation to the mobile phone 100 as event information.
  • after receiving the event information, the mobile phone 100 determines according to the event type that the operation type is the operation object triggering operation, and the mobile phone 100 determines according to the position coordinate information that the operation object corresponding to the click operation position of the user U2 on the sharing interface is "video one". The mobile phone 100 performs the operation of opening "video one", and displays the video playing interface shown in FIG. 5C.
  • the mobile phone 100 and the mobile phone 200 enable the screen sharing mode and the linkage mode.
  • the mobile phone 100 displays the application interface of the music application A4. If the mobile phone 200 detects the click operation of the user U2 on "Song 3", the mobile phone 200 determines that the event type corresponding to the click operation of the user U2 is a click input event.
  • the mobile phone 200 sends the position coordinate information and the event type at the position where the user U2 performs the click operation to the mobile phone 100 as event information.
  • after receiving the event information, the mobile phone 100 determines according to the event type that the operation type is the operation object triggering operation, and the mobile phone 100 determines according to the position coordinate information that the operation object corresponding to the click operation position of the user U2 on the sharing interface is "Song 3". The mobile phone 100 performs the operation of playing "Song 3", and displays the music playing interface shown in FIG. 6B.
  • the mobile phone 100 and the mobile phone 200 enable the screen sharing mode and the linkage mode.
  • the mobile phone 100 displays a picture in the photo album application. If the mobile phone 200 detects a sliding zoom operation by the user U2 on the picture (where spreading two fingers apart is a sliding zoom-in operation, which may also be called a drag zoom operation, and pinching two fingers together is a sliding zoom-out operation), the mobile phone 200 determines that the event type corresponding to the sliding zoom operation of the user U2 is a sliding zoom input event.
  • the mobile phone 200 sends the initial position coordinate information, the termination position coordinate information and the event type at the position where the user U2 performs the sliding zoom operation to the mobile phone 100 as event information.
  • the mobile phone 200 records the sliding tracks of the user's two fingers, determines the initial position coordinates and the end position coordinates of each finger as the position coordinate information, and maps this information to the mobile phone 100 through the event information.
  • after receiving the event information, the mobile phone 100 determines according to the event type that the operation type is the operation object triggering operation. Then the mobile phone 100 determines the enlargement ratio of the picture according to the initial position coordinate information and the end position coordinate information, performs the picture enlargement operation, and displays the picture display interface shown in FIG. 7B.
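  • The enlargement (or reduction) ratio can be derived from how far apart the two fingers were at the start and at the end of the slide. A minimal sketch follows, assuming the ratio is simply the change in distance between the two touch points; the embodiment does not spell out the exact formula.

```kotlin
import kotlin.math.hypot

// One finger's start and end coordinates, as carried in the sliding-zoom event information.
data class FingerTrack(val x0: Float, val y0: Float, val x1: Float, val y1: Float)

// Hypothetical zoom-factor calculation: ratio of the final to the initial finger spacing.
// > 1 means the fingers spread apart (zoom in); < 1 means they pinched together (zoom out).
fun zoomFactor(a: FingerTrack, b: FingerTrack): Float {
    val startDistance = hypot(a.x0 - b.x0, a.y0 - b.y0)
    val endDistance = hypot(a.x1 - b.x1, a.y1 - b.y1)
    return if (startDistance > 0f) endDistance / startDistance else 1f
}
```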
  • in another implementation, the initiator-side sharing control control 13 of the mobile phone 100 serving as the initiator device may include only the sharing prompt information 130 and the exit sharing control 132, and may not include the aforementioned linkage control 131.
  • the receiver-side sharing control control 23 of the mobile phone 200 as the receiver's device may include a linkage control 231 . That is, in this implementation manner, the on and off of the linkage mode between the mobile phone 200 and the mobile phone 100 can only be triggered by the linkage control 231 .
  • the icon of the linkage control 131 in the sharing control control 13 on the initiator side may also be as shown in FIG. 8B .
  • the linkage control 131 can remind the user whether the linkage mode is enabled or disabled by changing the shape or format of the icon, or by changing its color.
  • the sharing control control 13 on the initiator side may further include a graffiti control 133 for the user U1 to perform graffiti operations on the shared screen through the graffiti control 133 .
  • the receiver-side sharing control control 23 in the mobile phone 200 may also be as shown in FIG. 8B , which will not be repeated here.
  • the mobile phone 100 displays the oval graffiti traces shown in the figure, and the corresponding mobile phone 200 also displays the corresponding graffiti traces.
  • the mobile phone 200 displays the square graffiti traces shown in the figure, and the corresponding mobile phone 100 also displays the corresponding graffiti traces.
  • the initiator-side shared control control 13 and the recipient-side shared control control 23 may also be other types and formats of controls such as side toolbars, which can be set as required.
  • the mobile phone 100 may also enable the linkage mode in the case of screen sharing corresponding to scenarios such as voice calls, remote assistance, and device collaboration with the mobile phone 200 .
  • the resolutions of the screens of the mobile phone 100 and the mobile phone 200 may be the same or different. If the screen resolutions of the mobile phone 100 and the mobile phone 200 are the same, the mobile phone 200 directly sends the obtained position coordinate information to the mobile phone 100 . If the resolutions of the screens of the mobile phone 100 and the mobile phone 200 are different, the mobile phone 200 may send the obtained position coordinate information to the mobile phone 100 after coordinate transformation according to the corresponding relationship between the resolutions of the mobile phone 200 and the mobile phone 100 .
  • in other implementations, the mobile phone 200 directly sends the obtained position coordinate information to the mobile phone 100, and the mobile phone 100 performs coordinate transformation on the received position coordinate information according to the correspondence between the resolutions of the mobile phone 200 and the mobile phone 100, and then performs the aforementioned operations such as application determination and operation object determination.
  • two-way screen sharing may also be performed between the mobile phone 100 and the mobile phone 200, that is, the mobile phone 100 sends its own screen as the first shared screen to the mobile phone 200 for display, and the mobile phone 200 also sends its own screen as the second shared screen to the mobile phone 100 for display.
  • since the mobile phone 100 and the mobile phone 200 are devices with only one physical screen, when two-way screen sharing is performed between the mobile phone 100 and the mobile phone 200, the mobile phone 100 and the mobile phone 200 each perform a screen splitting operation, and each can use one screen area after the split to display its own interface and another screen area to display the shared interface shared by the other party.
  • if the mobile phone 100 and the mobile phone 200 are each devices with two or more physical screens, when two-way screen sharing is performed between the mobile phone 100 and the mobile phone 200, each of the mobile phone 100 and the mobile phone 200 selects one physical screen to display its own interface and another physical screen to display the sharing interface shared by the other party.
  • the mobile phone 100 can operate the screen of the mobile phone 200 in linkage, and the mobile phone 200 can operate the screen of the mobile phone 100 in linkage to realize the two-way linkage between the mobile phone 100 and the mobile phone 200.
  • the mobile phone 100 can also establish a video call with the tablet computer 300 , and enable the screen sharing mode and the linkage mode.
  • the screen resolution of the tablet computer 300 is different from the screen resolution of the mobile phone 100 .
  • the coordinate adjustment parameter can be determined according to the ratio between the screen resolution F1 of the mobile phone 100 and the screen resolution F3 of the tablet computer 300, and the initial position coordinate information obtained by the tablet computer 300 is adjusted according to the coordinate adjustment parameter to obtain the position coordinate information.
  • the tablet computer 300 sends the event information including the adjusted position coordinate information to the mobile phone 100 , so that the mobile phone 100 can accurately determine the corresponding application or control on the shared screen displayed by the mobile phone 100 for the position coordinate information.
  • Z1 = Sa × Z0, where Z0 is the initial position coordinate, such as (x0, y0), Sa is the coordinate adjustment parameter, and Z1 is the position coordinate obtained after adjustment.
  • F1 is (F11 × F12), where F11 is the number of pixels of the screen of the mobile phone 100 in the horizontal direction (which may also be called the screen width direction, or the x-axis direction), and F12 is the number of pixels of the screen of the mobile phone 100 in the vertical direction (which may also be called the screen length direction, or the y-axis direction).
  • F3 is (F31 × F32), where F31 is the number of pixels of the screen of the tablet computer 300 in the horizontal direction (which may also be called the screen width direction, or the x-axis direction), and F32 is the number of pixels of the screen of the tablet computer 300 in the vertical direction (which may also be called the screen length direction, or the y-axis direction).
  • the position coordinate Z1 obtained after the above adjustment may be (Sx1·x0, Sy1·y0).
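  • Concretely, the adjustment parameter can be taken as the per-axis ratio between the two resolutions, so that a point reported in the tablet's coordinate space is mapped into the phone's coordinate space. A minimal sketch follows, assuming Sx1 = F11/F31 and Sy1 = F12/F32; the embodiment only states that Sa is determined from the ratio between F1 and F3.

```kotlin
// Screen resolution as (horizontal pixels, vertical pixels), e.g. F1 = (F11, F12).
data class Resolution(val widthPx: Int, val heightPx: Int)

// Maps a coordinate obtained on the sender's screen (resolution `from`, e.g. the tablet 300)
// into the initiator's coordinate space (resolution `to`, e.g. the mobile phone 100):
// Z1 = Sa * Z0 with Sa = (to.width / from.width, to.height / from.height).
fun adjustCoordinate(x0: Float, y0: Float, from: Resolution, to: Resolution): Pair<Float, Float> {
    val sx = to.widthPx.toFloat() / from.widthPx
    val sy = to.heightPx.toFloat() / from.heightPx
    return Pair(sx * x0, sy * y0)
}

// Example: a tap at (540, 1200) on a 1600x2560 tablet maps to (364.5, 1125) on a 1080x2400 phone.
```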
  • in other implementations, the mobile phone 100 may also enable the sharing mode and the linkage mode with other mobile phones whose screen resolutions differ from the screen resolution of the mobile phone 100 (for example, a large-screen mobile phone and a small-screen mobile phone).
  • the mobile phone 100 can also establish a video call with other devices with a screen display function, such as a TV, and enable the screen sharing mode and the linkage mode.
  • the embodiments of the present application relate to an enhanced screen sharing method applied to an electronic device, where the electronic device may be a mobile phone, a tablet computer, a TV, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable device, a virtual reality device, or another electronic device.
  • the electronic device is an electronic device that can wirelessly communicate with other electronic devices, and the electronic device has a screen, and the electronic device has a video call function, a screen sharing function, and a linkage function.
  • the following takes as an example that a video call is performed between the mobile phone 100 and the mobile phone 200, screen sharing is performed, and, after the linkage mode is enabled between the mobile phone 100 and the mobile phone 200, the mobile phone 200 downloads and installs an application in linkage.
  • the mobile phone 100 includes a video call application A2 and an SDK module 105 .
  • the mobile phone 200 includes a sensor module 201 , an input event reading module 202 , an input event distribution module 203 , a video call interface application 204 , a video call application A2 and an SDK module 205 .
  • the process of application download linkage between the mobile phone 100 and the mobile phone 200 includes the following steps:
  • the sensor module 201 in the mobile phone 200 sends the double-click operation information corresponding to the detected double-click operation of the user U2 to the input event reading module 202 in the mobile phone 200 .
  • the sensor module 201 sends the first click operation information corresponding to the first click operation to the input event reading module 202 after detecting the first click operation of the user U2; after detecting the second click operation of the user U2, the sensor module 201 sends the second click operation information corresponding to the second click operation to the input event reading module 202.
  • the click operation information may be an electronic signal of the trigger operation, including the input event type corresponding to the trigger operation of the user U2 (for example, the event type corresponding to a click operation of the user U2 is "click input event") and the position coordinate information of the click position of the user U2.
  • after receiving the first click operation information and the second click operation information, the input event reading module 202 determines that the time interval between receiving the first click operation information and the second click operation information is less than a preset double-click operation determination time threshold; the input event reading module 202 then retrieves the event from the EventHub through a thread to perform event translation, and can determine that the input event corresponding to the operation of the user U2 is a double-click input event. The input event reading module 202 sends the double-click input event and the position coordinate information of the double-click position of the user U2 to the input event distribution module 203 as event information.
  • the value range of the time threshold for the determination of the double-click operation may be, for example, 0.1s ⁇ 0.6s, such as 0.1s, 0.25s, 0.5s, 0.6s, and the like.
  • the time threshold for determining the double-click operation may also be other values.
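  • The double-click determination therefore reduces to comparing the arrival times of two consecutive click events against the configured threshold. A minimal sketch follows, assuming timestamps in milliseconds and a 300 ms threshold; any value in the 0.1 s–0.6 s range mentioned above would work the same way.

```kotlin
// Hypothetical double-click detector matching the behaviour of the input event reading module 202.
class DoubleClickDetector(private val thresholdMs: Long = 300) {
    private var lastClickTimeMs: Long? = null

    // Returns true when the click at `nowMs` forms a double-click with the previous click.
    fun onClick(nowMs: Long): Boolean {
        val previous = lastClickTimeMs
        val isDoubleClick = previous != null && nowMs - previous <= thresholdMs
        // After a double-click the sequence restarts; otherwise remember this click.
        lastClickTimeMs = if (isDoubleClick) null else nowMs
        return isDoubleClick
    }
}
```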
  • the input event distribution module 203 performs event distribution according to the received event information, and sends the event information to the video call interface application 204 .
  • the video call interface application 204 transmits the event information of the double-click input event to the video call application A2.
  • the video call application A2 judges the time interval between two clicks in the event information of the double-click input event, so as to achieve the purpose of event identification and event validity analysis. If the video calling application A2 determines that the interval between two clicks in the event information of the double-click input event is less than the preset double-click operation determination time threshold, the video calling application A2 determines that the input event is a double-click input event.
  • the video call application A2 calls the SDK module 205 to send event information.
  • the SDK module 205 sends the event information to the SDK module 105 in the mobile phone 100 .
  • the event information includes the event type "double-click input event” and the operation area information "position coordinate information of the user's double-click position".
  • the event information sent by the SDK module 205 may be sent in a signaling manner through a signaling transmission channel established between the mobile phone 200 and the mobile phone 100 .
  • in other implementations, the SDK module 205 in the mobile phone 200 may first send the event information to the cloud server 300 corresponding to the video call application A2, and the cloud server 300 then sends it to the SDK module 105 in the mobile phone 100.
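  • The event information itself is small, so it can be carried as a short signaling message either directly over the signaling channel between the two phones or relayed via the cloud server of the video call application. A minimal sketch of that choice follows, assuming a trivial text encoding and abstract channel interfaces; the actual CaasKit SDK signaling format is not described in this embodiment.

```kotlin
// Abstract signaling channels; the concrete transports are outside this sketch.
interface SignalingChannel { fun send(message: String) }

// Hypothetical text encoding of the event information, e.g. "DOUBLE_CLICK;351.0;902.5".
fun encodeEventInfo(eventType: String, x: Float, y: Float): String = "$eventType;$x;$y"

// The receiver-side SDK (module 205) either sends directly over the signaling channel
// established with the peer SDK (module 105), or hands the message to the cloud server,
// which forwards it to the initiator.
fun sendEventInfo(eventType: String, x: Float, y: Float,
                  direct: SignalingChannel?, cloudRelay: SignalingChannel) {
    (direct ?: cloudRelay).send(encodeEventInfo(eventType, x, y))
}
```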
  • the SDK module 105 parses the event information, and obtains the event type "double-click input event” and the operation area information "position coordinate information of the user's double-click position".
  • the SDK module 105 sends the parsed event information to the video calling application A2. That is, the event type "double-click input event” and the operation area information "position coordinate information of the user's double-click position" are sent to the video calling application A2.
  • the video call application A2 may first determine, according to the event type "double-click input event", that the mobile phone 100 needs to perform an application determination operation. Then, based on the operation area information "position coordinate information of the user's double-click position", the video call application A2 determines whether the position coordinate information is valid. For example, the video call application A2 determines whether the current interface of the mobile phone 100 contains an application icon and/or application name; if so, it determines whether the user's double-click position is at the location of a desktop application icon and/or application name; if so, the position coordinate information is considered valid.
  • the video call application A2 determines the corresponding area on the interface of the mobile phone 100 according to the operation area information "position coordinate information of the user's double-click position", and determines the application name of the application corresponding to the area, such as video playback application A3. And the video call application A2 determines that the corresponding operation type to be performed by the mobile phone 200 after receiving the name is "application download and installation".
  • the video call application A2 calls the SDK module 105 to return event response information, where the event response information includes the operation type "application download and install” and the application name "video playback application A3".
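  • Resolving the application under the double-tap position is essentially a hit test against the desktop icon and application-name regions; the resolved name is then paired with the operation type "application download and installation" in the event response. A minimal sketch follows, assuming a rectangle-based description of the icon regions; in the embodiment this information would come from the view system rather than a hand-built list.

```kotlin
// Hypothetical description of one desktop application's icon/name region on the shared interface.
data class DesktopApp(
    val appName: String,                    // e.g. "video playback application A3"
    val left: Float, val top: Float,
    val right: Float, val bottom: Float
)

// Validity check plus application determination: the double-tap position is valid only if it
// falls on an icon or application-name region; the resolved name goes into the event response.
fun resolveAppName(apps: List<DesktopApp>, x: Float, y: Float): String? =
    apps.firstOrNull { x in it.left..it.right && y in it.top..it.bottom }?.appName
```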
  • the SDK module 105 sends the event response information to the SDK module 205 in the mobile phone 200 .
  • the event information sent by the SDK module 105 may be sent in a signaling manner through a signaling transmission channel established between the mobile phone 100 and the mobile phone 200 .
  • in other implementations, the SDK module 105 in the mobile phone 100 may first send the event response information to the cloud server 300 corresponding to the video call application A2, and the cloud server 300 then sends it to the SDK module 205 in the mobile phone 200.
  • the SDK module 205 parses the event response information to obtain the operation type and the application name.
  • the SDK module 205 sends the parsed event response information to the video call application A2, that is, the SDK module 205 sends the operation type and the application name to the video call application A2.
  • the video call application A2 determines according to the operation type "application download and installation" that an application download-and-install operation needs to be performed, and determines according to the application name "video playback application A3" that the application to be downloaded is the video playback application A3. The video call application A2 then first determines, according to the application name "video playback application A3", whether the video playback application A3 has been installed in the mobile phone 200. If the video playback application A3 is not installed in the mobile phone 200, the mobile phone 200 searches for and downloads the video playback application A3 from the application market application A1 and completes the application installation; if the video playback application A3 is installed in the mobile phone 200, the mobile phone 200 does not download and install the application.
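  • On the receiver side the event response therefore boils down to one decision: if the named application is already installed, do nothing beyond an optional prompt; otherwise fetch it from the application market. A minimal sketch over abstract helpers follows, since the embodiment does not fix how the installed check or the market download is implemented; the prompt texts echo the ones shown around FIG. 4I.

```kotlin
// Abstract helpers; in practice these would be backed by the system package manager
// and the application market application A1.
interface InstalledApps { fun isInstalled(appName: String): Boolean }
interface AppMarket { fun downloadAndInstall(appName: String) }

// Handles an event response of type "application download and installation" with an app name.
fun handleDownloadResponse(appName: String, installed: InstalledApps, market: AppMarket,
                           showPrompt: (String) -> Unit) {
    if (installed.isInstalled(appName)) {
        // Already present: no download, just inform the user.
        showPrompt("$appName is currently installed, please confirm")
    } else {
        market.downloadAndInstall(appName)
        showPrompt("$appName has been successfully installed")
    }
}
```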
  • thus, in the process of screen sharing between the mobile phone 200 and the mobile phone 100, after the linkage mode between the mobile phone 200 and the mobile phone 100 is enabled, the mobile phone 200 can, according to the trigger operation of the user U2 on the mobile phone 200, download the application corresponding to an application in the mobile phone 100.
  • the interaction types between the mobile phone 100 and the mobile phone 200 for screen sharing can be enriched, and the user experience can be improved.
  • in other implementations, when the mobile phone 100 generates the event response information, it is also possible to send only the application name to the mobile phone 200 as the event response information, without determining the operation type that the mobile phone 200 needs to perform. Then, after receiving the event response information, the mobile phone 200 can directly determine according to the application name whether to download and install the corresponding application.
  • the process of performing application opening linkage between the mobile phone 100 and the mobile phone 200 includes the following steps:
  • the sensor module 201 in the mobile phone 200 sends the detected click operation information corresponding to the click operation of the user U2 to the input event reading module 202 in the mobile phone 200 .
  • the click operation information includes the input event type corresponding to the trigger operation of the user U2, for example, the event type corresponding to the click operation of the user U2 is "click input event", and the position coordinate information of the click position of the user U2.
  • the input event reading module 202 receives the single-click operation information, extracts the event from the EventHub through a thread to perform event translation, and can determine that the input event of the user U2 is a single-click input event.
  • the input event reading module 202 sends the click input event to the input event distribution module 203 .
  • the value range of the time threshold for the determination of the double-click operation may be, for example, 0.1s to 0.6s, such as 0.1s, 0.35s, 0.5s, 0.6s, and the like.
  • the time threshold for determining the double-click operation may also be other values.
  • the input event distribution module 203 performs event distribution according to the received click input event, and sends the event information of the click input event to the video call interface application 204 .
  • the video call interface application 204 transmits the event information of the click input event to the video call application A2.
  • the video call application A2 judges the click time in the event information of the click input event, so as to achieve the purpose of event identification and event validity analysis. If the video call application A2 determines that the next input event is not received within the preset interval time threshold after the click input event, the video call application A2 determines that the input event is a click input event.
  • the value range of the interval time threshold may be, for example, 0.1s ⁇ 0.6s, such as 0.1s, 0.35s, 0.5s, 0.6s, and the like. Of course, the interval time threshold can also be other values.
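  • Conversely, a click only counts as a single-click input event once the interval time threshold has elapsed without a further input event arriving. A minimal sketch of that check follows, assuming millisecond timestamps and the same 0.1 s–0.6 s range for the threshold.

```kotlin
// Returns true when the click at `clickTimeMs` can be classified as a single-click input event,
// i.e. no further input event arrived within the interval time threshold.
fun isSingleClick(clickTimeMs: Long, nextEventTimeMs: Long?, intervalThresholdMs: Long = 300): Boolean =
    nextEventTimeMs == null || nextEventTimeMs - clickTimeMs > intervalThresholdMs
```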
  • the video call application A2 calls the SDK module 205 to send event information.
  • the SDK module 205 sends the event information to the SDK module 105 in the mobile phone 100 .
  • the event information includes the event type "click input event” and the operation area information "position coordinate information of the user's click position".
  • the SDK module 105 parses the event information, and obtains the event type "click input event” and the operation area information "position coordinate information of the user's click position".
  • the SDK module 105 sends the parsed event information to the video calling application A2. That is, the event type "click input event" and the operation area information "position coordinate information of the user's click position" are sent to the video calling application A2.
  • the video call application A2 may first determine, according to the event type "click input event", that the mobile phone 100 needs to perform an operation object triggering operation. Then, according to the operation area information "position coordinate information of the user's click position", the video call application A2 determines whether the position coordinate information is valid. For example, the video call application A2 determines whether the current interface of the mobile phone 100 contains an application icon and/or application name; if so, it determines whether the user's click position is at the location of a desktop application icon and/or application name; if so, the position coordinate information is considered valid.
  • the video call application A2 determines the corresponding area on the interface of the mobile phone 100 according to the operation area information "position coordinate information of the user's click position", and determines the operation object corresponding to the area.
  • the operation object is the video playback application A3.
  • the video calling application A2 determines the opening operation of the video playing application A3 that needs to be performed by the mobile phone 100.
  • the mobile phone 100 performs the opening operation of the video playing application A3 to open the video playing application A3.
  • the video call application A2 may send the video playback application A3 opening event, through the window management module (not shown in the figure) in the mobile phone 100, to the corresponding video call interface application (not shown in the figure), to trigger the video call interface application to perform the operation of opening the video playback application A3.
  • thus, in the process of screen sharing between the mobile phone 200 and the mobile phone 100, after the linkage mode is enabled, the mobile phone 200 can trigger the corresponding operation object in the mobile phone 100 according to the trigger operation of the user U2 on the mobile phone 200. The interaction types between the mobile phone 100 and the mobile phone 200 for screen sharing can thereby be enriched, and the user experience can be improved.
  • if the screen resolutions of the mobile phone 100 and the mobile phone 200 are the same, the video call interface application 204 directly transfers the received event information of the click input event sent by the input event distribution module 203 to the video call application A2, that is, the mobile phone 200 directly sends the obtained position coordinate information to the mobile phone 100.
  • the video call application A2 in the mobile phone 100 can directly determine the corresponding operation object according to the position coordinate information.
  • if the screen resolutions of the mobile phone 100 and the mobile phone 200 are different, the mobile phone 100 and the mobile phone 200 need to convert the position coordinate information.
  • the sensor module 201 in the mobile phone 200 obtains the initial position coordinate information according to the click operation of the user U2.
  • the video call interface application 204 can receive the initial position coordinate information in the click input event sent by the input event distribution module 203, and adjust the initial position coordinate information according to the coordinate adjustment parameter determined from the first screen resolution F1 of the mobile phone 100 and the second screen resolution F2 of the mobile phone 200, to obtain the position coordinate information.
  • the video call interface application 204 sends the event information including the adjusted position coordinate information to the video call application A2.
  • F1 is (F11 × F12), where F11 is the number of pixels of the screen of the mobile phone 100 in the horizontal direction (which may also be called the screen width direction, or the x-axis direction), and F12 is the number of pixels of the screen of the mobile phone 100 in the vertical direction (which may also be called the screen length direction, or the y-axis direction).
  • F2 is (F21 × F22), where F21 is the number of pixels of the screen of the mobile phone 200 in the horizontal direction (which may also be called the screen width direction, or the x-axis direction), and F22 is the number of pixels of the screen of the mobile phone 200 in the vertical direction (which may also be called the screen length direction, or the y-axis direction).
  • Z2 = Sb × Z0, where Sb is the coordinate adjustment parameter, Z0 is the initial position coordinate information (x0, y0), and Z2 is the position coordinate information obtained after adjustment. For example, Z2 may be (Sx2·x0, Sy2·y0).
  • the mobile phone 200 may also send the event information to the mobile phone 100 as in the foregoing steps S201 to S207.
  • in step S210, after the video call application A2 in the mobile phone 100 obtains the event information, it can send the event information to the video call interface application 104 in the mobile phone 100.
  • the video call interface application 104 adjusts the position coordinate information according to the coordinate adjustment parameter determined from the first screen resolution F1 of the mobile phone 100 and the second screen resolution F2 of the mobile phone 200 to obtain the adjusted position coordinate information; the coordinate adjustment process of the mobile phone 100 will not be repeated here. Then, the video call interface application 104 sends the adjusted position coordinate information to the video call application A2, and the video call application A2 determines the corresponding operation object according to the adjusted position coordinate information.
  • the application name of the application may also be an application such as the name of the application package corresponding to the application, or may also be other types of application identification information that can be used to identify the application.
  • the user may operate the screen through the aforementioned touch operations such as single click, double click, and slide, and may also operate the screen through voice, gesture, or the like.
  • the operation area information may also be image information such as a trajectory corresponding to the user operation.
  • the mobile phone 100 and the mobile phone 200 may also perform screen sharing in scenarios such as voice calls and device collaboration, and enable the linkage mode during the screen sharing process.
  • the first sharing interface and the second sharing interface may be partially identical.
  • the second electronic device may display only a part of the first shared interface according to the interface data of the first shared interface, or the second electronic device may display, according to the interface data of the first shared interface, an interface with the same content as the first shared interface but with different typesetting of characters and the like, which can be set as required.
  • the aforementioned application download and installation operation may also be just an application download operation.
  • FIG. 12 is a schematic structural diagram of an electronic device 900 provided according to an implementation manner of an embodiment of the present application.
  • Electronic device 900 may include one or more processors 901 coupled to controller hub 904 .
  • the controller hub 904 communicates with the processor 901 via a multidrop bus such as a front side bus (FSB), a point-to-point interface such as a Quick Path Interconnect (QPI), or a similar connection.
  • Processor 901 executes instructions that control general types of data processing operations.
  • the controller hub 904 includes, but is not limited to, a graphics memory controller hub (GMCH) (not shown) and an input/output hub (IOH) (which may be on separate chips) (not shown), wherein the GMCH includes a memory and a graphics controller and is coupled to the IOH.
  • Electronic device 900 may also include a coprocessor 906 and memory 902 coupled to controller hub 904 .
  • in one possible implementation, the memory 902 and the GMCH may be integrated within the processor 901 (as described in the embodiments of the present application), with the memory 902 and the coprocessor 906 directly coupled to the processor 901, and the controller hub 904 and the IOH located in a single chip.
  • the coprocessor 906 is a special-purpose processor, and optional properties of the coprocessor 906 are indicated in FIG. 12 by dashed lines.
  • the electronic device 900 may further include a network interface (NIC) 903 .
  • Network interface 903 may include a transceiver for providing a radio interface for electronic device 900 to communicate with any other suitable device (eg, front-end modules, antennas, etc.).
  • network interface 903 may be integrated with other components of electronic device 900 .
  • the network interface 903 can realize the function of the communication unit in the above-mentioned embodiment.
  • Electronic device 900 may further include input/output (I/O) device 905 .
  • Figure 12 is exemplary only. That is, although FIG. 12 shows that the electronic device 900 includes devices such as the processor 901, the controller hub 904, and the memory 902, in practical applications a device using the methods in the embodiments of the present application may include only some of the components of the electronic device 900, for example only the processor 901 and the NIC 903. The properties of the optional devices in FIG. 12 are shown in dashed lines.
  • One or more tangible, non-transitory computer-readable media for storing data and/or instructions may be included in the memory of the electronic device 900 .
  • the computer-readable storage medium stores instructions, in particular temporary and permanent copies of the instructions.
  • the electronic device 900 may be a mobile phone, and the instructions stored in the memory of the electronic device may include instructions that, when executed by at least one unit in the processor, cause the mobile phone to implement the enhanced screen sharing method mentioned above.
  • FIG. 13 is a schematic structural diagram of a SoC (System on Chip, system on chip) 1000 provided according to an embodiment of the present application.
  • similar components have the same reference numerals.
  • the dotted box is an optional feature of the more advanced SoC 1000.
  • the SoC 1000 can be used in any electronic device according to the present application, and can implement corresponding functions according to different devices and different instructions stored therein.
  • the SoC 1000 includes: an interconnect unit 1002 coupled to the processor 1001; a system agent unit 1006; a bus controller unit 1005; an integrated memory controller unit 1003; a set of one or more coprocessors 1007, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; an SRAM (Static Random Access Memory) unit 1008; and a DMA (Direct Memory Access) unit 1004.
  • the coprocessor 1007 includes a special purpose processor, such as, for example, a network or communications processor, a compression engine, a GPGPU, a high throughput MIC processor, an embedded processor, or the like.
  • One or more computer-readable media for storing data and/or instructions may be included in the SRAM cell 1008 .
  • the computer-readable storage medium may have instructions stored thereon, in particular, temporary and permanent copies of the instructions.
  • the instructions may include instructions that, when executed by at least one unit in the processor, cause the electronic device to implement the enhanced screen sharing method as mentioned above.
  • Embodiments of the mechanism disclosed in this application may be implemented in software, hardware, firmware, or a combination of these implementation methods.
  • Embodiments of the present application may be implemented as a computer program or program code executed on a programmable system, where the programmable system includes at least one processor and a storage system (including volatile and non-volatile memory and/or storage units).

Abstract

An enhanced screen sharing method, system, and electronic device. The method includes: a first electronic device displays a first shared interface and sends interface data of the first shared interface to a second electronic device; the second electronic device receives the interface data and displays a second shared interface corresponding to the first shared interface; the second electronic device enables a linkage mode with respect to the first electronic device, and, upon detecting a trigger operation by a user on the second shared interface, determines event information corresponding to the trigger operation, the event information including an event type and operation area information corresponding to the trigger operation; the second electronic device sends the event information to the first electronic device; and the first electronic device receives the event information and performs a corresponding operation according to the event information. The second electronic device and the first electronic device can thus interact better, improving the usability and user-friendliness of the shared interface and improving the user experience.

Description

Enhanced screen sharing method and system, and electronic device
This application claims priority to Chinese Patent Application No. 202011518228.1, entitled "Enhanced Screen Sharing Method and System, and Electronic Device", filed with the Chinese Patent Office on December 21, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of communications, and in particular to an enhanced screen sharing method and system, and an electronic device.
Background
With the continuous development of information technology and the popularization and diversification of electronic devices such as mobile phones and tablet computers, the functions of electronic devices are becoming increasingly rich. For example, technologies for screen sharing and interaction between electronic devices have become increasingly mature. Screen sharing and interaction refers to a technology in which electronic devices with a screen display function, acting respectively as a screen sharing initiator device (hereinafter referred to as the initiator device) and a screen sharing receiver device (hereinafter referred to as the receiver device), share and interact with information such as screen content and media content between the electronic devices.
Currently, when the initiator device and the receiver device perform screen sharing and interaction, the operations on the shared screen that the receiver device can accept from the user are rather limited; for example, the receiver device may accept operations such as the user doodling on the shared screen. As a result, good interaction cannot be achieved between the receiver device and the initiator device, which affects the user experience.
Summary of the Invention
Embodiments of the present application provide an enhanced screen sharing method and system, and an electronic device. During screen sharing between electronic devices, the receiver device of the screen sharing can accept more types of user operations on the shared interface, so as to achieve better interaction between the receiver device and the initiator device of the screen sharing and improve the user experience.
To solve the above technical problem, in a first aspect, an implementation of an embodiment of the present application provides an enhanced screen sharing method, including: a first electronic device displays a first shared interface and sends interface data of the first shared interface to a second electronic device; the second electronic device receives the interface data and displays, according to the interface data, a second shared interface corresponding to the first shared interface; the second electronic device enables a linkage mode with respect to the first electronic device; in the linkage mode, the second electronic device detects a trigger operation by a user on the second shared interface and determines event information corresponding to the trigger operation, the event information including an event type and operation area information corresponding to the trigger operation; the second electronic device sends the event information to the first electronic device; and the first electronic device receives the event information and performs a corresponding operation on the first electronic device according to the event information.
Through the detected trigger operation by the user on the second shared interface, the second electronic device can cause the first electronic device to perform a corresponding operation on the first electronic device, so as to realize linkage between the second electronic device and the first electronic device. The interaction between the second electronic device and the first electronic device can thus be better realized, and the usability and user-friendliness of the shared interface are improved, improving the user experience.
In a possible implementation of the above first aspect, the event type may be, for example, an input event such as a click input event, a double-click input event, a long-press input event, or a slide input event, and the operation area information may be position coordinate information of the position triggered by the user (for example, the coordinates of the position where the user performs a click operation).
In a possible implementation of the above first aspect, the first shared interface and the second shared interface may be completely identical, and may be collectively referred to as the shared interface.
In another possible implementation of the above first aspect, the first shared interface and the second shared interface may also be partially identical.
In a possible implementation of the above first aspect, the first electronic device performing the corresponding operation on the first electronic device according to the event information includes: the first electronic device determines a first operation type according to the event type; and the first electronic device performs the corresponding operation on the first electronic device according to the first operation type and the operation area information.
In a possible implementation of the above first aspect, if the event type is a first input event, the first operation type is an application determination operation; the first electronic device determines, according to the operation area information, application identification information of the application corresponding to the operation area information on the first shared interface; the first electronic device generates event response information, the event response information including the application identification information; the first electronic device sends the event response information to the second electronic device; and the second electronic device receives the event response information, and, if it determines according to the event response information that the application corresponding to the application identification information is not currently installed on the second electronic device, downloads, or downloads and installs, the application corresponding to the application identification information.
In a possible implementation of the above first aspect, the first input event is a double-click input event or a long-press input event.
With the linkage between the second electronic device and the first electronic device, the second electronic device can download and install an application corresponding to an application on the first electronic device. The interaction between the two can thus be better realized, and the usability and user-friendliness of the shared interface are improved, improving the user experience.
In a possible implementation of the above first aspect, the method further includes: the first electronic device determines a second operation type according to the first operation type, the second operation type being an application download operation or a download-and-install operation; the first electronic device generates event response information, the event response information including the application identification information and the second operation type; and if the second electronic device determines according to the event response information that the application corresponding to the application identification information is not currently installed on the second electronic device, it downloads, or downloads and installs, the application corresponding to the application identification information.
With the linkage between the second electronic device and the first electronic device, the second electronic device can download and install an application corresponding to an application on the first electronic device. The interaction between the two can thus be better realized, and the usability and user-friendliness of the shared interface are improved, improving the user experience.
在上述第一方面的一种可能的实现中,若事件类型为第二输入事件,第一操作类型为操作对象触发操作;第一电子设备根据操作区域信息,确定操作区域信息在第一共享界面上所对应的操作对象;第一电子设备执行触发操作对象的操作。
在上述第一方面的一种可能的实现中,第二输入事件为单击输入事件或滑动输入事件。
第二电子设备与第一电子设备之间进行联动,可以实现第二电子设备联动第一电子设备进行操作对象触发等操作,可以更好地实现二者之间的交互,提高共享界面的可用性和用户使用友好性,以提高用户的体验。
在上述第一方面的一种可能的实现中,若第一电子设备的第一屏幕分辨率和第二电子设备的第二屏幕分辨率不相同,该方法还包括:第二电子设备根据触发操作得到初始操作区域信息;第二电子设备根据第一屏幕分辨率和第二屏幕分辨率,对初始操作区域信息进行调整得到操作区域信息;或者第一电子设备根据第一屏幕分辨率和第二屏幕分辨率,对接收到的操作区域信息进行调整,并根据调整后的操作区域信息和第一操作类型执行相应操作。
通过分辨率调整,可以准确地确定用户进行触发操作的区域的操作区域信息。
在上述第一方面的一种可能的实现中,操作区域信息为位置坐标信息。
在上述第一方面的一种可能的实现中,第二电子设备开启对第一电子设备的联动模式,包括:第一电子设备显示第一联动控件;若第一电子设备检测到用户对第一联动控件的开启触发操作,第一电子设备开启与第二电子设备之间的联动模式;或者第二电子设备显示第二联动控件;若第二电子设备检测 到用户对第二联动控件的开启触发操作,第二电子设备开启对第一电子设备的联动模式。
联动模式可以由第一电子设备开启,也可以由第二电子设备开启,其可以根据需要设置。
在上述第一方面的一种可能的实现中,若第二电子设备检测到用户对第二联动控件的开启触发操作,第二电子设备开启对第一电子设备的联动模式,包括:若第二电子设备检测到用户对第二联动控件的开启触发操作,第二电子设备向第一电子设备发送联动请求;第一电子设备接收联动请求,显示联动确定控件;若第一电子设备检测到用户对联动确定控件的触发操作,第一电子设备生成并发送同意联动的联动应答给第二电子设备;第二电子设备接收联动应答,开启对第一电子设备的联动模式。
在上述第一方面的一种可能的实现中,该方法还包括:第二电子设备开启对第一电子设备的联动模式,且生成联动操作提示信息;第二电子设备显示联动操作提示信息。以便于用户方便地进行联动操作,提高用户的体验。
第二方面,本申请实施例的实施方式提供了一种增强的屏幕共享方法,应用于第一电子设备,包括:第一电子设备显示第一共享界面,并向第二电子设备发送第一共享界面的界面数据,以使第二电子设备根据界面数据显示与第一共享界面对应的第二共享界面;第一电子设备接收第二电子设备发送来的事件信息,根据事件信息在第一电子设备上执行相应操作;事件信息为第二电子设备开启对第一电子设备的联动模式,并在联动模式下,根据用户对第二共享界面的触发操作确定的信息,事件信息包括触发操作对应的事件类型和操作区域信息。
第二电子设备可以通过检测到的用户对第二共享界面的触发操作,使得第一电子设备在第一电子设备上执行相应操作,以实现第二电子设备与第一电子设备之间的联动,可以更好地实现第二电子设备与第一电子设备之间的交互,提高共享界面的可用性和用户使用友好性,以提高用户的体验。
在上述第二方面的一种可能的实现中,第一电子设备根据事件信息在第一电子设备上执行相应操作,包括:第一电子设备根据事件类型确定第一操作类型;第一电子设备根据第一操作类型和操作区域信息在第一电子设备上执行相应操作。
第三方面,本申请实施例的实施方式提供了一种增强的屏幕共享方法,应用于第二电子设备,包括:第二电子设备接收第一电子设备发送来的第一共享界面的界面数据,根据界面数据显示与第一共享界面对应的第二共享界面;第一共享界面为第一电子设备显示的界面;第二电子设备开启对第一电子设备的联动模式;在联动模式下,第二电子设备检测到用户对第二共享界面的触发操作,确定触发操作对应的事件信息,事件信息包括触发操作对应的事件类型和操作区域信息;第二电子设备将事件信息发送给第一电子设备,以使第一电子设备根据事件信息在第一电子设备上执行相应操作。
第二电子设备可以通过检测到的用户对第二共享界面的触发操作,使得第一电子设备在第一电子设备上执行相应操作,以实现与第一电子设备之间的联动,可以更好地实现与第一电子设备之间的交互,提高共享界面的可用性和用户使用友好性,以提高用户的体验。
第四方面,本申请实施例的实施方式提供了一种增强的屏幕共享系统,包括:第一电子设备和第二电子设备;其中,第一电子设备用于显示第一共享界面,并向第二电子设备发送第一共享界面的界面数据;第二电子设备用于接收界面数据,根据界面数据显示与第一共享界面对应的第二共享界面;第二电子设备还用于开启对第一电子设备的联动模式;在联动模式下,第二电子设备用于在检测到用户对第二共享界面的触发操作时,确定触发操作对应的事件信息,事件信息包括触发操作对应的事件类型和操作区域信息;第二电子设备还用于将事件信息发送给第一电子设备;第一电子设备用于接收事件信息,并根据事件信息在第一电子设备上执行相应操作。
本实现方式提供的增强的屏幕共享系统,包括执行上述的增强的屏幕共享方法的第一电子设备和第二电子设备,因此也可以实现上述第一方面或者第一方面的一种可能的实现提供的增强的屏幕共享方法的效果。
第五方面,本申请实施例的实施方式提供了一种电子设备,包括:存储器,用于存储计算机程序,计算机程序包括程序指令;处理器,用于执行程序指令,以使该电子设备执行前述的增强的屏幕共享方法。
第六方面,本申请实施例的实施方式提供了一种计算机可读取存储介质,计算机可读取存储介质存储有计算机程序,计算机程序包括程序指令,程序指令被电子设备运行以使电子设备执行前述的增强的屏幕共享方法。
第七方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行前述的增强的屏幕共享方法。
可以理解的是,上述第二方面至第七方面的有益效果可以参见上述第一方面中的相关描述,在此不再赘述。
附图说明
为了更清楚地说明本申请实施例的技术方案,下面将对实施例描述中所使用的附图作简单介绍。
图1为本申请实施例提供的一种通信系统的示意图;
图2为本申请实施例提供的一种手机的结构示意图;
图3为本申请实施例提供的一种手机的软件结构框图;
图4A-4N为本申请实施例提供的一种手机100和手机200之间进行交互的一些界面示意图;
图5A-5C为本申请实施例提供的一种手机100和手机200之间进行视频播放联动过程中的一些界面示意图;
图6A-6B为本申请实施例提供的一种手机100和手机200之间进行音乐播放联动过程中的一些界面示意图;
图7A-7B为本申请实施例提供的一种手机100和手机200之间进行图片显示联动过程中的一些界面示意图;
图8A和8B为本申请实施例提供的另一些共享控制控件的示意图;
图8C为本申请实施例提供的手机100和手机200之间进行涂鸦操作的界面示意图;
图9为本申请实施例提供的一种手机100和平板电脑300之间进行联动的过程中的一种界面示意图;
图10为本申请实施例提供的一种手机100和手机200之间进行应用下载并安装联动的过程示意图;
图11为本申请实施例提供的一种手机100和手机200之间进行应用打开联动的过程示意图;
图12是根据本申请的一些实施例,示出了一种电子设备的结构示意图;
图13是根据本申请的一些实施例,示出了一种片上系统(SoC)的结构示意图。
具体实施方式
下面将结合附图对本申请实施例的实施方式作进一步地详细描述。
请参见图1,图1是本申请实施例提供的一种通信系统的示意图。
该通信系统包括手机100和手机200,在本申请实施例的一种实现方式中,手机100和手机200中均安装有具有视频通话功能和屏幕共享功能的视频通话应用。该视频通话应用可使用用户的账户信息和密码进行登录,借此实现手机100和手机200之间的互联,并且实现借助视频通话功能和屏幕共享功能的交互。登录的具体方式可参考现有技术,本申请实施例对此不做限定。
该通信系统还包括云服务器300,云服务器300为视频通话应用对应的服务器。手机100和手机200之间通过云服务器300可以进行视频通话、屏幕共享等通信。
手机100的用户为用户U1,手机200的用户为用户U2。
用户U1开启手机100中的视频通话应用,并向用户U2发起视频通话的请求,若用户U2操作手机200接受了该视频通话的请求,手机100和手机200二者之间可以进行视频通话。
在手机100和200进行视频通话的过程中,手机100和200显示的视频通话的界面分别可提供更多的功能控件,例如屏幕共享控件。若用户U1通过屏幕共享控件开启了屏幕共享模式(或者可以称为屏幕共享功能,或者界面共享模式),并且用户U2操作手机200接受了开启屏幕共享模式的请求,则手机100可以将手机100的实时界面作为第一共享界面,并且手机100将第一共享界面的显示内容作为界面数据以数据流(或者也可以称为视频数据流)的形式发送给手机200。手机200接收第一共享界面的界面数据,根据界面数据在自己的屏幕上显示与第一共享界面对应的第二共享界面,实现手机100向手机200共享手机100的屏幕内容。其中手机100作为进行屏幕共享的屏幕共享发起方设备,手机200作为进行屏幕共享的屏幕共享接收方设备。
本申请实施例提供的增强的屏幕共享方法中,手机100和手机200进行屏幕共享的过程中,手机100和手机200的界面也分别可提供更多的功能控件,例如联动控件(或者也可以称为联动开关)。若用户U2通过手机200显示的联动控件开启了手机100和手机200之间的联动模式(或者也可以称为联动功能),在联动模式下,手机200在接收到用户U2对手机200显示的第二共享界面的触发操作后,手机200将与用户U2的触发操作对应的事件信息发送给手机100。事件信息包括用户U2的触发操作对应的事件类型和操作区域信息,其中事件类型例如可以是单击输入事件、双击输入事件、长按输入事件、滑动输入事件等输入事件,操作区域信息可以是用户触发位置的位置坐标信息(例如用户进行单击操作的位置的坐标)。
手机100接收到手机200发送来的事件信息后,可以根据事件信息确定手机100需要执行操作的操作类型,并执行相应的操作。使得手机100执行与用户U2的触发操作对应的操作,达到手机200对手机100进行联动的目的。即手机200对手机100进行联动是指,手机100可以根据手机200发送来的事件信息在手机100中执行与该事件信息对应的操作。例如,手机200检测到用户U2单击手机200显示的第二共享界面中某应用的应用图标的操作,则手机100根据接收到的事件信息执行打开应用等相应动作和反馈。手机200检测到用户U2双击手机200显示的第二共享界面中某应用的应用图标的操作,手机100根据接收到的事件信息执行应用下载并安装等相应动作和反馈。手机200检测到用户U2单击手机200显示的第二共享界面中的某控件的操作,手机100根据接收到的事件信息执行触发控件等相应动作和反馈。
例如,手机100显示第一共享界面,手机200显示与手机100发送来第一共享界面对应的第二共享界面。若手机200检测到用户U2对手机200显示的第二共享界面的单击操作,则手机200确定用户U2的单击操作对应的事件类型为单击输入事件。手机200将事件类型和用户U2进行单击操作位置处的位置坐标信息作为事件信息发送给手机100。手机100接收到事件信息后,根据事件类型确定单击输入事件对应的操作类型为操作对象触发操作。则手机100根据位置坐标信息确定用户U2进行单击操作位置 在手机100显示的第一共享界面上对应的操作对象,并且执行触发操作对象的操作。例如,操作对象为应用,则执行打开应用的操作;若操作对象为应用中的控件,则进行触发控件的操作等。
若手机200检测到用户U2对手机200显示的第二共享界面的双击操作,则手机200确定用户U2的双击操作对应的事件类型为双击输入事件。手机200将事件类型和用户U2进行双击操作位置处的位置坐标信息作为事件信息发送给手机100。手机100接收到事件信息后,根据事件类型确定与双击输入事件对应的操作类型为应用确定操作,则手机100根据位置坐标信息确定用户U2进行双击操作的位置在手机100显示的第一共享界面上对应应用的应用名称。手机100将应用名称作为事件应答信息发送给手机200。手机200接收到事件应答信息后,根据应用名称确定手机200中是否已安装对应应用,若未安装对应应用,手机200直接根据应用名称从手机200中的应用市场应用中下载相应的应用,并完成应用安装。若手机200中已安装对应应用,则手机200不执行应用下载并安装操作;或者手机200可以显示用于告知用户U2当前手机200已安装对应应用的提示信息。
另外,手机100确定应用名称后,还可以根据操作类型“应用确定操作”确定手机200需要执行的对应操作为应用下载并安装,则手机100确定手机200接收到事件应答信息后需要执行操作的操作类型为“应用下载并安装”。并且手机100根据应用名称和手机200需要执行操作的操作类型生成事件应答信息,即事件应答信息包括应用名称和手机200需要执行操作的操作类型。手机200接收到事件应答信息后,可以根据该操作类型方便地确定需要执行应用下载并安装操作,则手机200根据应用名称确定手机200中是否已安装对应应用,以及在未安装对应应用时,从手机200中的应用市场应用中下载相应的应用,并完成应用安装。
当然,手机200也可以根据接收到的用户U2对手机200显示的共享屏幕的其他触发操作,联动手机100以使手机100执行其他的操作。本申请实施例提供的增强的屏幕共享方法中,可以根据需要设置用户的输入事件类型(例如单击输入事件,双击输入事件,长按输入事件等)和操作对象(例如界面控件,应用等)的组合方式,以使手机100执行相应的操作。
例如,前述的单击操作也可以是滑动操作,即输入事件为滑动输入事件;若手机200检测到用户U2滑动手机200显示的第二共享界面中某应用的应用图标的操作,手机100执行打开应用等相应动作和反馈。双击操作也可以是长按操作,即输入事件为长按输入事件;若手机200检测到用户U2长按手机200显示的第二共享界面中某应用的应用图标的操作,手机100执行应用下载并安装等相应动作和反馈。或者,输入事件也可以是三连击输入事件,若手机200检测到用户U2对手机200显示的第二共享界面中某应用的应用图标的三连击操作,手机100执行应用卸载等相应动作和反馈。
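作为对上述事件类型与操作类型组合方式的补充说明,下面给出一个示意性的代码草图(以Java为例;其中的类名EventInfo、OperationMapping以及方法名toOperationType等均为本示例的假设,并非对实际实现的限定):

    // 事件信息:由接收方设备(如手机200)根据检测到的触发操作生成,并发送给发起方设备(如手机100)
    enum EventType { SINGLE_TAP, DOUBLE_TAP, LONG_PRESS, SWIPE }

    class EventInfo {
        EventType eventType;   // 事件类型,例如单击输入事件、双击输入事件、长按输入事件、滑动输入事件
        float x;               // 操作区域信息:用户触发位置的横坐标
        float y;               // 操作区域信息:用户触发位置的纵坐标
    }

    // 发起方设备根据事件类型确定需要执行操作的操作类型的一种示意性映射
    class OperationMapping {
        static String toOperationType(EventType type) {
            switch (type) {
                case SINGLE_TAP:
                case SWIPE:
                    return "操作对象触发操作";   // 例如打开应用、触发控件
                case DOUBLE_TAP:
                case LONG_PRESS:
                    return "应用确定操作";       // 例如确定应用名称后触发应用下载并安装
                default:
                    return "未定义操作";
            }
        }
    }

该映射关系仅为一种可能的设置方式,实际可以根据需要调整输入事件与操作类型的对应关系。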
本申请实施例提供的增强的屏幕共享方法中,手机200可以通过用户U2对第二共享界面的触发操作,实现与手机100之间的联动,以实现手机200下载并安装与手机100中的应用对应应用的操作,或者联动手机100进行操作对象触发等操作,可以更好地实现与手机100之间的交互,提高共享界面的可用性和用户使用友好性,以提高用户的体验。
请参见图2,图2是本申请实施例提供的一种手机的结构示意图。该手机可以是前述的手机100,也可以是手机200。
其中,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模 块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对手机的限定。在本申请实施例另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。例如处理器110用于使得手机执行本申请实施例提供的增强的屏幕共享方法。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
移动通信模块150可以提供应用在手机上的包括2G/3G/4G/5G等无线通信功能。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT)等无线通信的解决方案。
在一些实施例中,手机的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机可以通过无线通信技术与网络以及其他设备通信。
应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。在一些实施例中,手机可以包括1个或N个显示屏194,N为大于1的正整数。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能,以及实现视频通话功能。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。进一步地,手机至少包括一个与显示屏194位于同一侧的摄像头193。
视频编解码器用于对数字视频压缩或解压缩。手机可以支持一种或多种视频编解码器。这样,手机可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等,另外,手机可以实现音视频数据和录屏数据的封装和播放等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如视频通话、音乐播放等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机可以通过扬声器170A进行视频通话,收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。
耳机接口170D用于连接有线耳机。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。当有触摸操作(或者可以称为触发操作)作用于显示屏194,手机根据压力传感器180A检测所述触摸操作强度。手机也可以根据压力传感器180A的检测信号计算触摸的位置。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏(Touch Panel,TP),也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器180K可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于手机的表面,与显示屏194所处的位置不同。
本申请实施例中,手机可以根据压力传感器180A、触摸传感器180K检测用户的在显示屏194上的触发操作,还可以根据受话器170B检测用户的语音输入,也可以是根据摄像头193检测用户的手势输入,或者根据其他输入键盘等输入模块检测用户的输入,本实施例对此不做限定。
请参见图3,图3是本申请实施例提供的一种手机的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,手机的系统分为三层,从上至下分别为应用程序层,应用程序框架(Framework)层和内核层。另外,该系统还包括软件开发工具包(Software Development Kit,SDK)。
其中,内核层包括传感器(Sensor)模块和触摸屏。其中,传感器模块例如可以是前述的压力传感器180A,用于根据用户在触摸屏上的物理触发操作(比如用户的单击操作),产生触发操作电子信号。触发操作电子信号中可以包括用户的单击操作对应的单击输入事件信息,触发操作电子信号中还可以包括用户进行单击操作位置的操作区域信息,例如位置坐标信息。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
应用程序框架层包括输入事件读取模块、输入事件分发模块和窗口管理模块。其中,输入事件读取模块例如可以是InputReader,输入事件分发模块例如可以是InputDispatcher,窗口管理模块例如可以是窗口管理器(Window Manager Service,WMS)。
InputReader用于接收压力传感器180A发送来的触发操作电子信号,并通过线程循环不断从EventHub(图中未示出)中取出事件进行事件翻译以确定输入事件,并将输入事件进行封装发送给InputDispatcher。
InputDispatcher保管有来自WMS的所有窗口的信息。InputDispatcher接收到来自InputReader的输入事件后,在保管的窗口寻找合适窗口进行事件分发。本申请实施例中,InputDispatcher主要是在手机作为接收方设备时,将输入事件分发给应用程序层中的InCallUI APP,以触发InCallUI APP的进一步操作。
WMS用于管理窗口程序。WMS可以获取手机的显示屏大小,以及提供窗口位置,可以使InputDispatcher正确分发输入事件到指定窗口。
应用程序层可以包括一系列应用程序,如包括视频通话界面应用和视频通话应用。其中,视频通话界面应用例如可以是InCallUI APP,视频通话应用例如可以是VoipService APP或者HwVoipService APP。
InCallUI APP负责接收InputDispatcher发送来的输入事件,进行输入事件的识别判断。另外,手机作为发起方设备时,InCallUI APP还负责判断输入事件的有效性,以及解析输入事件对应应用的应用信息。
InCallUI APP提供的应用界面上包括一些功能控件,该功能控件以屏幕上的触控件的形式进行显示,其可以根据应用所处的不同场景进行显示。本申请实施例中,该功能控件可以包括“视频通话”、“屏幕共享”、“联动”、“涂鸦”等控件中的一个或多个,且该功能控件在屏幕上的显示内容包括图标和/或文字。
例如,手机开启视频通话应用,进入视频通话应用界面后,手机显示“视频通话”控件。若手机检测到用户对“视频通话”控件的触发操作,则手机与对端手机之间建立视频通话。
手机建立视频通话后,手机显示“屏幕共享”控件。若手机检测到用户对“屏幕共享”控件的触发操作,则手机将自己的当前屏幕作为共享屏幕发送至进行视频通话的对端手机进行显示。
手机建立屏幕共享后,手机显示“联动”控件。若手机检测到用户对“联动”控件的触发操作,手机与进行视频通话、且进行屏幕共享的对端手机之间开启联动模式。
InCallUI APP还负责通信界面的显示及适配,例如InCallUI APP包括前述视频通话、屏幕共享、联动等业务功能的界面的切换入口,以用于相应界面的显示。
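作为一种示意,接收方设备侧的界面应用可以基于Android提供的GestureDetector在共享界面对应的视图上区分单击、双击、长按等触发操作(以Java为例;其中sharedView、sendEventInfo为本示例假设的视图对象与发送方法,需要引入android.view.GestureDetector、MotionEvent、View等类):

    // 在第二共享界面对应的视图sharedView上注册手势监听的示意性代码
    void setupLinkageGestureListener(Context context, View sharedView) {
        GestureDetector detector = new GestureDetector(context,
                new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onSingleTapConfirmed(MotionEvent e) {
                sendEventInfo("单击输入事件", e.getX(), e.getY());   // 封装事件类型与位置坐标信息后发送给发起方设备
                return true;
            }

            @Override
            public boolean onDoubleTap(MotionEvent e) {
                sendEventInfo("双击输入事件", e.getX(), e.getY());
                return true;
            }

            @Override
            public void onLongPress(MotionEvent e) {
                sendEventInfo("长按输入事件", e.getX(), e.getY());
            }
        });

        // 将触摸事件交给GestureDetector处理
        sharedView.setOnTouchListener((v, event) -> detector.onTouchEvent(event));
    }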
HwVoipService APP为本申请实施例所提供的支持视频通话和屏幕共享的应用,具有可视化的用户界面。
HwVoipService用于负责业务逻辑控制,包括提供音频通话、视频通话、设备发现、消息服务、业务功能切换等功能的实现,其能力通过封装成服务API给InCallUI APP使用。
HwVoipService APP还负责与SDK进行交互。
SDK包括SDK模块,SDK模块例如可以是CaasKit SDK。CaasKit SDK负责发送信令,解析信令,与应用程序层进行交互。
需要说明的是,应用程序层还可以包括相机,图库,通话,音乐,视频等应用程序。
应用程序框架层还可以包括活动管理器(Activity Manager Service,AMS),视图系统(VIEW system)、内容提供器,电话管理器,资源管理器等;另外,还可以包括MSDP(Multicast Source Discovery Protocol,组播源发现协议)感知服务和多路径接入服务等,本申请实施例对此不作任何限制。
其中,AMS、WMS和VIEW为本申请实施例中的联动提供能力支持。AMS提供基础UX交互能力;WMS负责窗口区域计算等能力,提供操作区域信息。VIEW提供操作对象监听等能力,根据上层应用的响应或回调,可以捕获用户操作行为,记录操作对象,提供操作对象等信息。
MSDP感知服务和多路径接入服务为已有音频通话、视频通话的基础组件,内核层基于已有的音频通话、视频通话,其与现有技术相同,此处不再赘述。
内核层为底层系统,底层系统中还包括用于提供显示服务的底层显示系统。例如还可以包括表面管理器(surface manager),媒体库(Media Libraries)等。表面管理器用于对显示系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动等,本申请实施例对此不做任何限制。即其主要包括硬编码,以及与前述MSDP感知服务对应的MSDP虚拟化设备等。
对于手机100和手机200基于图3所示的软件结构的系统内部工作流程将在后文进行详细说明。
以下先将对该增强的屏幕共享方法的应用场景和设备间的交互过程进行说明。
手机100和手机200之间建立视频通话的过程:
请参见图4A,在本申请实施例的一种实现方式中,手机100和手机200分别只具备一个物理屏幕,且屏幕上分别显示各自的主桌面(如图4A所示)。手机100的主桌面上显示有应用市场应用A1、视频通话应用A2、视频播放应用A3和音乐应用A4等应用的应用名称和应用图标。手机200的主桌面上显示有应用市场应用A1和视频通话应用A2等应用的应用名称和应用图标。
继续参见图4A,若手机100检测到用户U1对手机100上的视频通话应用A2的单击操作,手机100开启视频通话应用A2,手机100显示如图4B所示的联系人选择界面。
继续参见图4B,若手机100检测到用户U1对联系人列表中的"用户U2"对应的视频通话控件10的单击操作,手机100确定进行视频通话。手机100向用户U2对应的设备手机200发送视频通话请求,以请求建立视频通话。
继续参见图4B,手机200接收到手机100发送来的视频通话请求后,手机200显示提示信息20“接收到视频通话请求”。若手机200检测到用户U2对“接受”控件的单击操作,手机200向手机100发送建立视频通话的应答。
请参见图4C,手机100接收到手机200发送的建立视频通话的应答之后,手机100和手机200之间建立视频通话,并且手机100和手机200分别显示视频通话界面。
其中,手机100通常默认在屏幕的右上角显示用户U1的视频通话图像(例如女生头像所示的视频通话图像),以及在屏幕的主区域显示用户U2的视频通话图像(例如男生头像所示的视频通话图像)。并且手机100屏幕的下方显示“更多”控件11,以及“切换摄像头”控件和“挂断”控件。当然,手机100也可以显示其他控件。
相应的,手机200在屏幕的右上角显示用户U2的视频通话图像,及在屏幕的主区域显示用户U1的视频通话图像。并且手机200在屏幕的下方显示“更多”控件21,以及“切换摄像头”控件和“挂断”控件。
手机100和手机200之间进行屏幕共享的过程:
请参见图4D,若手机100检测到用户U1对“更多”控件11的单击操作,手机100显示“共享屏幕”控件12。若手机100检测到用户U1对“共享屏幕”控件12的单击操作,手机100向手机200发送屏幕共享请求。
继续参见图4D,手机200接收到手机100发送来的屏幕共享请求后,显示提示信息22“对方正在请求向您共享屏幕”。若手机200检测到用户U2对“接受”控件的单击操作,手机200向手机100发送同意屏幕共享的应答。
在本申请的一种实现方式中,第一共享界面和第二共享界面可以相同,并且统一称为共享界面。
请参见图4E,手机100接收到手机200发送来的同意屏幕共享的应答之后,手机100和手机200之间开启屏幕共享模式,进行屏幕共享。手机100中的视频通话应用A2自动切换至在后台运行,手机100显示手机100的主桌面。手机100将自己的主桌面界面作为共享界面发送给手机200。相应的,手机200中的视频通话应用A2也自动切换至在后台运行,手机200显示手机100发送来的共享界面。
其中,手机100作为屏幕共享发起方,手机200作为屏幕共享接收方。
另外,手机100还显示发起方侧共享控制控件13,在本申请实施例的一种实现方式中,发起方侧共享控制控件13包括共享提示信息130、联动控件131和退出共享控件132。
其中共享提示信息130包括屏幕共享状态提示信息“共享中”和已进行屏幕共享的时间信息“00:00”,共享时间信息实时地根据手机100和手机200之间进行屏幕共享的时间显示相应的数值。手机100可以根据检测到的用户U1对联动控件131的单击操作,开启或者关闭手机100和手机200之间的联动模式。手机100可以根据检测到的用户U1对退出共享控件132的单击操作,退出手机100和手机200之间的屏幕共享模式以及联动模式。手机100和手机200之间退出屏幕共享模式后,手机100和手机200可以显示如图4C所示的视频通话界面。
另外,手机200的界面上显示接收方侧共享控制控件23,接收方侧共享控制控件23包括共享提示信息230、联动控件231和退出共享控件232。
其中,共享提示信息230包括共享状态提示信息“对方共享中”和已进行屏幕共享的时间信息“00:00”。共享时间信息实时地根据手机100和手机200之间进行屏幕共享的时间显示相应的数值。手机200可以根据检测到的用户U2对联动控件231的单击操作,开启或者关闭手机100和手机200之间的联动模式。手机200可以根据检测到的用户U2对退出共享控件232的单击操作,退出手机100和手机200之间的屏幕共享模式以及联动模式。
需要说明的是,手机100的发起方侧共享控制控件13和手机200的接收方侧共享控制控件23的触发状态彼此同步。即手机100和手机200双方任何一方均可以根据对应用户对该控件的点击操作来开始和结束联动模式。例如,若手机100检测到用户U1对联动控件131的单击操作,手机100和手机200之间开启联动模式。之后,若手机200检测到用户U2对联动控件231的单击操作,手机100和手机200之间关闭联动模式,手机200结束对手机100的联动。
请参见图4F,在本申请实施例的一种实现方式中,若手机200检测到用户U2对联动控件231的单击操作,手机200向手机100发送联动请求。手机100接收到联动请求后,显示提示信息14“对方请求联动”。若手机100检测到用户U1对作为联动确定控件的“确定”控件的单击操作,手机100向手机200发送同意联动的应答。
手机200接收到手机100发送来的同意联动的应答后,手机100和手机200之间开启联动模式。
请参见图4G,手机100和手机200之间开启联动模式之后,手机100的联动控件131可以显示文字“联动”,以提示用户U1当前手机100和手机200已开启联动模式。
若手机200为首次检测到用户U2对联动控件231的单击操作,手机200显示联动操作提示信息。
继续参见图4G,手机200显示的联动操作提示信息包括操作介绍图示信息24、说明信息25和确认控件26。其中,操作介绍图示信息24包括多个示意区域,用于指示说明用户单击相应区域即可完成对手机100的相应联动。说明信息25为“手指双击共享屏幕可触发安装发起方特定应用;手指单击共享屏幕可触发发起方操作”。确认控件26可以显示“知道了”。
联动操作提示信息也可以只包括操作介绍图示信息24和确认控件26,或者只包括说明信息25和确认控件26。当然,联动操作提示信息也可以是其他用于向用户说明联动模式的功能的信息。
若手机200检测到用户对确认控件26的单击操作。手机200显示如图4H所示的界面。其中,联动控件231也显示文字“联动”,以提醒用户U2当前手机200和手机100处于联动模式。
若手机200为非首次(如第二次、第三次等)检测到用户U2对联动控件231的单击操作,可以直接显示图4H所示的界面,即联动控件231直接显示文字"联动",而不用显示图4G所示的操作说明界面。
本申请实施例中,手机200可以在首次检测到用户U2对联动控件231的单击操作时,显示联动操作提示信息,以提醒用户如何进行手机200和手机100之间的联动操作,可以有效地提高用户的体验。当然,手机200也可以周期性地显示联动操作提示信息,或者根据其他需要或者用户的触发操作显示联动操作提示信息。
继续参见图4H,若手机200检测到用户U2对共享界面上的视频播放应用A3的应用图标或者应用名称的双击操作,手机200确定用户输入的事件类型为双击输入事件。手机200根据用户U2进行双击操作位置处的位置坐标信息和事件类型生成事件信息,即事件信息包括事件类型(双击输入)和用户进行双击操作的位置坐标信息。手机200将事件信息发送给手机100。
手机100接收到事件信息后,根据事件类型确定手机100需要执行操作的操作类型为应用确定操作,手机100则根据位置坐标信息确定用户U2进行双击操作位置在共享界面上对应应用的应用名称“视频播放应用A3”。并且手机100根据操作类型确定手机200接收到事件应答信息后,需要执行的操作类型为应用下载并安装操作。则手机100将应用名称“视频播放应用A3”和操作类型“应用下载并安装”作为事件应答信息发送给手机200。
手机200接收到手机100发送来的事件应答信息后,根据该操作类型和应用名称确定需要下载并安装视频播放应用A3,则手机200进一步确定当前手机200中是否已安装视频播放应用A3。
若手机200已安装视频播放应用A3,则手机200不执行视频播放应用A3下载并安装操作。或者,手机200可以显示提示信息“当前已安装视频播放应用A3,请确认”(图中未示出),以告知用户U2手机200当前已安装了视频播放应用A3。
若手机200未安装视频播放应用A3,则手机200从手机200中的应用市场应用A1中搜索下载相应的视频播放应用A3,并完成视频播放应用A3的安装。
请参见图4I,当手机200完成视频播放应用A3的安装后,手机200可以显示提示信息27“已成功安装视频播放应用A3”,以用于提醒用户U2。当然,手机200下载应用的过程中,提示信息27可以显示“应用下载中”等,以用于提醒用户U2,其可以根据需要设置。
继续参见图4I,手机200安装视频播放应用A3之后,若手机200检测到用户U2对共享退出控件232的单击操作,手机200和手机100结束联动模式和屏幕共享模式,退出共享界面。
请参见图4J,手机200和手机100结束联动模式和屏幕共享模式后,手机200可以直接显示手机200的主桌面,此时主桌面包括了手机200已安装的视频播放应用A3的应用图标和应用名称。另外,手机200的屏幕的右上方还可以显示用户U1的视频通话图像。
另外,手机100结束联动模式和屏幕共享模式后,手机100显示手机100的主桌面,且手机100在屏幕的右上方还可以显示用户U2的视频通话图像。
本实现方式中,手机200与手机100之间开启联动模式,手机200可以通过接收到的用户U2对共享界面上的应用的双击操作,与手机100之间进行联动,通过手机100获取该应用的应用名称,并方便地安装用户进行双击操作位置对应的应用,丰富了手机200和手机100之间的交互类型,提高了用户体验。
在本申请实施例的另一种实现方式中,手机100和手机200开启屏幕共享模式,显示如图4E所示的界面之后,若手机200检测到用户U2对联动控件231的单击操作,手机200也可以不用向手机100发送联动请求,手机200和手机100之间可以自动直接开启联动模式,显示图4G所示的界面。
在本申请实施例的另一种实现方式中,手机200在搜索下载并安装视频播放应用A3的过程中,手机200也可以不显示图4I所示的提示信息27,其可以根据需要设置。
需要说明的是,手机100和手机200中分别安装的应用市场应用可以是相同的应用市场应用A1,也可以是不相同的提供应用下载功能的应用。
请参见图4K,在本申请实施例的另一种实现方式中,若手机200在与手机100开启了联动模式之后,若手机200再次检测到用户对联动控件231的单击操作,手机200和手机100退出联动模式。
请参见图4L,手机200和手机100退出联动模式之后,手机100可以显示主桌面,手机200显示共享界面。并且联动控件131和联动控件231上都不显示文字“联动”,以用于提醒用户当前手机100和手机200未处于联动模式。
请参见图4M,在本申请实施例的另一种实现方式中,手机100和手机200之间建立视频通话的过程中,手机100和手机200进行屏幕共享时,手机100可以继续显示视频通话界面,视频通话界面上还包括通话界面缩小控件15。并且手机100将视频通话界面作为共享界面发送给手机200。手机200显示手机100共享的共享界面。
若手机100和手机200之间开启联动模式,且手机200检测到用户U2对手机200显示的共享界面上的通话界面缩小控件15的单击操作,则手机200确定用户U2的单击操作对应的事件类型为单击输入事件。手机200将用户U2进行单击操作位置处的位置坐标信息和事件类型作为事件信息发送给手机100。
手机100接收到事件信息后,根据事件类型确定操作类型为操作对象触发操作。则手机100根据位置坐标信息确定用户U2进行单击操作位置在共享界面上对应的操作对象为通话界面缩小控件15。则手机100执行触发通话界面缩小控件15的操作,即执行通话界面缩小操作,显示如图4N所示的界面。
在本申请的另一种实现方式中,若手机200和手机100结束联动操作模式和屏幕共享模式,退出共享界面之后,手机200和手机100也可以分别显示如图4C所示的视频通话界面。
手机100和手机200之间进行视频播放联动的场景和过程:
请参见图5A,手机100和手机200开启屏幕共享模式和联动模式。若手机200检测到用户U2对视频播放应用A3的单击操作,则手机200确定用户U2的单击操作对应的事件类型为单击输入事件。手机200将用户U2进行单击操作位置处的位置坐标信息和事件类型作为事件信息发送给手机100。
手机100接收到事件信息后,根据事件类型确定操作类型为操作对象触发操作,并且手机100根据位置坐标信息确定用户U2进行单击操作位置在共享界面上对应的操作对象为视频播放应用A3。手机100执行打开视频播放应用A3的操作,显示如图5B所示的视频应用界面。
若手机200检测到用户对“视频一”的单击操作,则手机200确定用户U2的单击操作对应的事件类型为单击输入事件。手机200将用户U2进行单击操作位置处的位置坐标信息和事件类型作为事件信息发送给手机100。
手机100接收到事件信息后,根据事件类型确定操作类型为操作对象触发操作,并且手机100根据位置坐标信息确定用户U2进行单击操作位置在共享界面上对应的操作对象为“视频一”。手机100执行打开“视频一”的操作,显示如图5C所示的视频播放界面。
手机100和手机200之间进行音乐播放联动的场景和过程:
请参见图6A,手机100和手机200开启屏幕共享模式和联动模式。手机100显示音乐应用A3的应用界面。若手机200检测到用户U2对“歌曲三”的单击操作,则手机200确定用户U2的单击操作对应 的事件类型为单击输入事件。手机200将用户U2进行单击操作位置处的位置坐标信息和事件类型作为事件信息发送给手机100。
手机100接收到事件信息后,根据事件类型确定操作类型为操作对象触发操作,并且手机100根据位置坐标信息确定用户U2进行单击操作位置在共享界面上对应的操作对象为“歌曲三”。手机100执行播放“歌曲三”的操作,显示如图6B所示的视频播放界面。
手机100和手机200之间进行图片浏览联动的场景和过程:
请参见图7A,手机100和手机200开启屏幕共享模式和联动模式。手机100显示相册应用中的图片。若手机200检测到用户U2对图片的滑动放大操作(其中用户双指张开为滑动放大操作,或者也可以称为拖动放大操作;用户双指捏合为滑动缩小操作),则手机200确定用户U2的滑动放大操作对应的事件类型为滑动放大输入事件。手机200将用户U2进行滑动放大操作位置处的初始位置坐标信息、终止位置坐标信息和事件类型作为事件信息发送给手机100。以用户双指张开为例,各个手指的初始位置坐标和终止位置坐标都不同,则手机200绘制用户双指的滑动轨迹,确定各个手指的初始位置坐标和终止位置坐标作为位置坐标信息通过事件信息映射到手机100。
手机100接收到事件信息后,根据事件类型确定操作类型为操作对象触发操作。则手机100根据初始位置坐标信息、终止位置坐标信息确定对图片的放大比例,并且执行图片放大操作,显示如图7B所示的图片显示界面。
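作为一种示意,手机100可以根据双指的初始位置坐标和终止位置坐标估算放大比例(以Java为例,方法名zoomScale与参数命名均为本示例的假设):

    // 根据双指初始间距与终止间距估算缩放比例的示意性代码
    static double zoomScale(double x1Start, double y1Start, double x2Start, double y2Start,
                            double x1End, double y1End, double x2End, double y2End) {
        double startDistance = Math.hypot(x2Start - x1Start, y2Start - y1Start);   // 双指初始间距
        double endDistance = Math.hypot(x2End - x1End, y2End - y1End);             // 双指终止间距
        return endDistance / startDistance;   // 大于1表示放大,小于1表示缩小
    }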
请参见图8A,在本申请实施例的另一种实现方式中,在手机100和手机200开启屏幕共享模式之后,作为发起方设备的手机100,其发起方侧共享控制控件13也可以只包括共享提示信息130和退出共享控件132,而不包括前述的联动控件131。作为接收方设备的手机200的接收方侧共享控制控件23可以包括联动控件231。即本实现方式中,手机200与手机100之间的联动模式的开启和关闭只能通过联动控件231进行触发。
请参见图8B,在本申请实施例的另一种实现方式中,发起方侧共享控制控件13中的联动控件131的图标也可以如图8B所示。另外,联动控件131可以通过改变图标的形状和格式,或者进行颜色变化等方式以提醒用户联动模式的开启和关闭。
另外,在本申请实施例的另一种实现方式中,发起方侧共享控制控件13还可以包括涂鸦控件133,以用于用户U1通过涂鸦控件133对共享屏幕进行涂鸦操作。
相应的,手机200中的接收方侧共享控制控件23也可以如图8B所示,此处不再赘述。
请参见图8C,手机100接收到用户U1对“应用市场应用A1”的涂鸦操作后,显示图示的椭圆形的涂鸦痕迹,相应的手机200也显示对应的涂鸦痕迹。另外,手机200接收到用户U2对“音乐应用A4”的涂鸦操作后,显示图示的方形的涂鸦痕迹,相应的手机100也显示对应的涂鸦痕迹。
在本申请实施例的另一些实施方式中,发起方侧共享控制控件13和接收方侧共享控制控件23也可以是其他例如侧边工具条等类型、格式的控件,其可以根据需要设置。
在本申请实施例的另一些实现方式中,手机100也可以在与手机200进行语音通话、远程辅助、设备协同等场景对应的屏幕共享的情况下,开启联动模式。
需要说明的是,本申请实施例中,手机100和手机200的屏幕的分辨率可以是相同的,也可以是不相同的。若手机100和手机200的屏幕的分辨率是相同的,则手机200直接将得到的位置坐标信息发送给手机100。若手机100和手机200的屏幕的分辨率是不相同的,则手机200可以将得到的位置坐标信息根据手机200与手机100的分辨率的对应关系进行坐标转换后,发送给手机100。或者手机200直接 将得到的位置坐标信息发送给手机100,手机100将得到的位置坐标信息根据手机200与手机100的分辨率的对应关系进行坐标转换后再进行前述的应用确定、操作对象确定等操作。
在本申请实施例的另一些实现方式中,手机100和手机200之间可以进行屏幕双向共享,即手机100将自己的屏幕作为第一共享屏幕发送给手机200进行显示,手机200也将自己的屏幕作为第二共享屏幕发送给手机100进行显示。
若手机100和手机200分别为只具有一块物理屏幕的设备,则手机100和手机200之间进行屏幕双向共享的时候,手机100和手机200分别进行屏幕分屏操作。并且可以选择分屏后的一处屏幕区域显示自己的界面,在另一处屏幕区域显示对方分享来的共享界面。
若手机100和手机200分别为具有两块或者两块以上的物理屏幕的设备,则手机100和手机200之间进行屏幕双向共享的时候,手机100和手机200分别选择一块物理屏幕显示自己的界面,另一块物理屏幕显示对方分享来的共享界面。
手机100和手机200进行屏幕双向共享之后,手机100可以联动操作手机200的屏幕,并且手机200可以联动操作手机100的屏幕,实现手机100和手机200之间的双向联动。
请参见图9,在本申请实施例的另一种实现方式中,手机100也可以与平板电脑300建立视频通话,并且开启屏幕共享模式和联动模式。平板电脑300的屏幕分辨率和手机100的屏幕分辨率不同。则平板电脑300在确定事件信息时,可以根据手机100的屏幕分辨率F1和平板电脑300的屏幕分辨率F3的比值确定坐标调整参数。并根据坐标调整参数调整平板电脑300得到的初始位置坐标信息得到位置坐标信息。然后平板电脑300将包括调整得到的位置坐标信息的事件信息发送给手机100,以使手机100可以准确地确定该位置坐标信息在手机100显示的共享屏幕上相应的应用或者控件。
例如,Z1=Sa×Z0,其中Z0为初始位置坐标,例如(x0, y0),Sa为坐标调整参数,Z1为调整后得到的位置坐标。
本实现方式中,F1为(F11×F12),F11为手机100的屏幕在水平方向(或者也可以称为屏幕宽度方向,或者x轴方向)的像素点数,F12为手机100的屏幕在垂直方向(或者也可以称为屏幕长度方向,或者y轴方向)的像素点数。F3为(F31×F32),F31为平板电脑300的屏幕在水平方向(或者也可以称为屏幕宽度方向,或者x轴方向)的像素点数,F32为平板电脑300的屏幕在垂直方向(或者也可以称为屏幕长度方向,或者y轴方向)的像素点数。
则坐标调整参数Sa为(Sx1, Sy1),其中,Sx1=F11/F31,Sy1=F12/F32。
上述的调整后得到的位置坐标Z1则可以是(Sx1×x0, Sy1×y0)。
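上述分辨率调整过程可以用如下示意性代码表示(以Java为例,方法名adjustCoordinate为本示例的假设):

    // 按屏幕分辨率比例调整位置坐标的示意性代码
    // f11、f12为手机100屏幕在水平、垂直方向的像素点数,f31、f32为平板电脑300屏幕在水平、垂直方向的像素点数
    static float[] adjustCoordinate(float x0, float y0, int f11, int f12, int f31, int f32) {
        float sx = (float) f11 / f31;   // 水平方向坐标调整参数Sx1
        float sy = (float) f12 / f32;   // 垂直方向坐标调整参数Sy1
        return new float[] { sx * x0, sy * y0 };   // 调整后的位置坐标Z1
    }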
当然,手机100也可以是与其他屏幕分辨率和手机100的屏幕分辨率不同的手机(例如大屏手机和小屏手机)之间开启共享模式和联动模式。或者手机100也可以与电视等其他具备屏幕显示功能的设备建立视频通话,并且开启屏幕分享模式和联动模式。
本申请实施例涉及的应用于电子设备的增强的屏幕共享方法,该电子设备可以是手机、平板电脑、电视、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、个人数字助理(personal digital assistant,PDA)、可穿戴设备、虚拟现实设备等电子设备。
需要说明的是,本申请实施例中,该电子设备为可以与其他电子设备进行无线通信的电子设备,并且该电子设备具有屏幕,以及该电子设备具备视频通话功能、屏幕共享功能和联动功能。
以下对于手机100和手机200基于前述的图3所示的软件结构的系统内部工作流程进行详细说明。
本申请实施例提供的增强的屏幕共享方法中,手机100和手机200之间进行视频通话,且进行屏幕共享,在手机100和手机200之间开启联动模式之后,手机200可以进行应用下载并安装。
请参见图10,手机100包括视频通话应用A2和SDK模块105。手机200包括传感器模块201、输入事件读取模块202、输入事件分发模块203、视频通话界面应用204、视频通话应用A2和SDK模块205。
手机100和手机200之间进行应用下载联动的过程包括以下步骤:
S101,手机200中的传感器模块201将检测到的用户U2的双击操作对应的双击操作信息发送给手机200中的输入事件读取模块202。
需要说明的是,若用户U2的触发操作为双击操作,则传感器模块201在检测到用户U2的第一次单击操作后,将与第一次单击操作对应的第一单击操作信息发送给输入事件读取模块202。传感器模块201在检测到用户U2的第二次单击操作后,将与第二次单击操作对应的第二单击操作信息发送给输入事件读取模块202。
单击操作信息可以是触发操作电子信号,包括用户U2的触发操作对应的输入事件类型,例如用户U2的单击操作对应的事件类型为“单击输入事件”,以及用户U2单击位置的位置坐标信息。
S102,输入事件读取模块202接收到第一单击操作信息和第二单击操作信息之后,确定接收到第一单击操作信息和第二单击操作信息的时间间隔小于预设的双击操作确定时间阈值,输入事件读取模块202通过线程从EventHub取出事件以进行事件翻译,可以确定用户U2的操作对应的输入事件为双击输入事件。输入事件读取模块202将双击输入事件和用户U2进行双击位置的位置坐标信息作为事件信息发送给输入事件分发模块203。
双击操作确定时间阈值的取值范围例如可以是0.1s~0.6s,例如0.1s、0.25s、0.5s、0.6s等。当然,双击操作确定时间阈值也可以是其他值。
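上述根据两次单击的时间间隔确定双击输入事件的判断逻辑,可以用如下示意性代码表示(以Java为例,阈值取值与方法名均为本示例的假设):

    // 判断两次单击是否构成双击输入事件的示意性代码
    static final long DOUBLE_TAP_THRESHOLD_MS = 250;   // 双击操作确定时间阈值,例如0.25s

    static boolean isDoubleTap(long firstTapTimeMs, long secondTapTimeMs) {
        return (secondTapTimeMs - firstTapTimeMs) < DOUBLE_TAP_THRESHOLD_MS;
    }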
S103,输入事件分发模块203接收到事件信息后进行事件分发,将事件信息发送给视频通话界面应用204。
S104,视频通话界面应用204将双击输入事件的事件信息传递给视频通话应用A2。
S105,视频通话应用A2对该双击输入事件的事件信息中的两次单击的时间间隔进行判断,以达到事件识别和事件有效性分析的目的。若视频通话应用A2确定双击输入事件的事件信息中的两次单击的间隔时间小于预设的双击操作确定时间阈值,则视频通话应用A2确定输入事件为双击输入事件。
S106,视频通话应用A2调用SDK模块205下发事件信息。
S107,SDK模块205将事件信息发送给手机100中的SDK模块105。事件信息包括事件类型“双击输入事件”和操作区域信息“用户双击位置的位置坐标信息”。
SDK模块205发送事件信息可以是以信令的方式通过手机200和手机100之间建立的信令传输通道进行发送。
需要说明的是,手机200中的SDK模块205可以先将事件信息发送给视频通话应用A2对应的云服务器300。然后由云服务器300再发送给手机100中的SDK模块105。
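作为一种示意,事件信息在信令传输通道中可以采用类似如下的结构化封装(以Java和Android平台自带的org.json为例;字段名eventType、x、y以及发送接口sendSignal均为本示例的假设,并非实际信令格式):

    // 将事件信息封装为信令消息的示意性代码,需要引入org.json.JSONObject与org.json.JSONException
    String buildEventSignal(String eventType, float x, float y) throws JSONException {
        JSONObject msg = new JSONObject();
        msg.put("eventType", eventType);   // 事件类型,例如"双击输入事件"
        msg.put("x", x);                   // 操作区域信息:用户触发位置的横坐标
        msg.put("y", y);                   // 操作区域信息:用户触发位置的纵坐标
        return msg.toString();             // 由SDK模块通过信令传输通道发送,例如sdk.sendSignal(...)
    }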
S108,SDK模块105解析事件信息,得到事件类型“双击输入事件”和操作区域信息“用户双击位置的位置坐标信息”。
S109,SDK模块105将解析后的事件信息发送给视频通话应用A2。即将事件类型“双击输入事件”和操作区域信息“用户双击位置的位置坐标信息”发送给视频通话应用A2。
S110,视频通话应用A2根据事件类型“双击输入事件”可以先确定手机100需要进行应用确定操作。然后,视频通话应用A2根据操作区域信息“用户双击位置的位置坐标信息”确定位置坐标信息是 否有效。例如手机100的当前界面是否包含应用的应用图标和/或应用名称,若包含,则视频通话应用A2判断用户双击位置是否在桌面应用图标和/或应用名称所在位置,若在,则认为位置坐标信息有效。视频通话应用A2根据操作区域信息“用户双击位置的位置坐标信息”确定在手机100的界面上所对应的区域,并确定该区域对应应用的应用名称,例如为视频播放应用A3。且视频通话应用A2确定手机200在收到名称后需要执行的相应的操作类型为“应用下载并安装”。
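上述根据位置坐标信息在界面上确定对应应用的过程,可以用如下示意性代码表示(以Java为例;其中应用图标区域与应用名称的对应关系假设由窗口管理等模块提供,方法名findAppNameAt为本示例的假设,需要引入android.graphics.Rect与java.util.Map):

    // 根据位置坐标在当前界面上确定所对应应用的示意性代码
    String findAppNameAt(float x, float y, Map<Rect, String> iconRegions) {
        for (Map.Entry<Rect, String> entry : iconRegions.entrySet()) {
            if (entry.getKey().contains((int) x, (int) y)) {
                return entry.getValue();   // 例如返回"视频播放应用A3"
            }
        }
        return null;   // 坐标不在任何应用图标区域内,可认为位置坐标信息无效
    }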
S111,视频通话应用A2调用SDK模块105返回事件应答信息,事件应答信息包括操作类型“应用下载并安装”和应用名称“视频播放应用A3”。
S112,SDK模块105将事件应答信息发送给手机200中的SDK模块205。
SDK模块105发送事件信息可以是以信令的方式通过手机100和手机200之间建立的信令传输通道进行发送。
需要说明的是,手机100中的SDK模块105可以先将事件应答信息发送给视频通话应用A2对应的云服务器300。然后由云服务器300再发送给手机200中的SDK模块205。
S113,SDK模块205解析事件应答信息,得到操作类型和应用名称。
S114,SDK模块205将解析后的事件应答信息发送给视频通话应用A2,即SDK模块205将操作类型和应用名称发送给视频通话应用A2。
S115,视频通话应用A2根据操作类型“应用下载并安装”确定需要执行应用下载并安装操作,且视频通话应用A2根据应用名称“视频播放应用A3”确定需要下载的应用为视频播放应用A3。则视频通话应用A2先根据应用名称“视频播放应用A3”确定手机200中是否已安装视频播放应用A3,若未安装视频播放应用A3,则视频通话应用A2直接根据应用名称从手机200中的应用市场应用A1中下载视频播放应用A3,并完成应用安装。若手机200中已安装视频播放应用A3,则手机200不执行应用下载并安装操作;或者手机200可以显示用于告知用户U2当前手机200已安装视频播放应用A3的提示信息。
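上述判断手机200中是否已安装对应应用的步骤,可以用如下示意性代码表示(以Java和Android的PackageManager为例;由应用名称确定应用包名的方式为本示例的假设,需要引入android.content.Context与android.content.pm.PackageManager):

    // 判断指定包名的应用是否已安装的示意性代码
    boolean isInstalled(Context context, String packageName) {
        try {
            context.getPackageManager().getPackageInfo(packageName, 0);
            return true;    // 已安装,可不执行下载并安装操作,或提示用户当前已安装
        } catch (PackageManager.NameNotFoundException e) {
            return false;   // 未安装,可进一步从应用市场应用中下载并安装
        }
    }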
本实施例提供的增强的屏幕共享方法中,手机200和手机100之间在屏幕共享的过程中,开启手机200对手机100的联动模式后,手机200可以根据用户U2对手机200的触发操作,下载手机100中对应的应用。可以丰富进行屏幕共享的手机100和手机200之间的交互类型,提高用户的体验。
在实施例的另一实现方式中,对于前述的S110,手机100生成事件应答信息时,也可以不用确定手机200需要执行操作的操作类型,只将应用名称作为事件应答信息发送给手机200。则后续手机200在接收到事件应答信息后,可以直接根据应用名称确定是否下载并安装对应的应用。
请参见图11,在本申请实施例的另一种实现方式中,手机100和手机200之间进行应用打开联动的过程包括以下步骤:
S201,手机200中的传感器模块201将检测到的用户U2的单击操作对应的单击操作信息发送给手机200中的输入事件读取模块202。
单击操作信息中包括用户U2的触发操作对应的输入事件类型,例如用户U2的单击操作对应的事件类型为“单击输入事件”,以及用户U2单击位置的位置坐标信息。
S202,输入事件读取模块202接收单击操作信息,通过线程从EventHub取出事件进行事件翻译,可以确定用户U2的输入事件为单击输入事件。输入事件读取模块202将单击输入事件和用户U2进行单击位置的位置坐标信息作为事件信息发送给输入事件分发模块203。
S203,输入事件分发模块203根据接收到的单击输入事件进行事件分发,将单击输入事件的事件信息发送给视频通话界面应用204。
S204,视频通话界面应用204将单击输入事件的事件信息传递给视频通话应用A2。
S205,视频通话应用A2对该单击输入事件的事件信息中的单击时间进行判断,以达到事件识别和事件有效性分析的目的。若视频通话应用A2确定单击输入事件后在预设的间隔时间阈值内未收到下一个输入事件,则视频通话应用A2确定输入事件为单击输入事件。
间隔时间阈值的取值范围例如可以是0.1s~0.6s,例如0.1s、0.35s、0.5s、0.6s等。当然,间隔时间阈值也可以是其他值。
S206,视频通话应用A2调用SDK模块205下发事件信息。
S207,SDK模块205将事件信息发送给手机100中的SDK模块105。事件信息包括事件类型“单击输入事件”和操作区域信息“用户单击位置的位置坐标信息”。
S208,SDK模块105解析事件信息,得到事件类型“单击输入事件”和操作区域信息“用户单击位置的位置坐标信息”。
S209,SDK模块105将解析后的事件信息发送给视频通话应用A2。即将事件类型"单击输入事件"和操作区域信息"用户单击位置的位置坐标信息"发送给视频通话应用A2。
S210,视频通话应用A2根据事件类型"单击输入事件"可以先确定手机100需要进行操作对象触发操作。然后,视频通话应用A2根据操作区域信息"用户单击位置的位置坐标信息"确定位置坐标信息是否有效。例如手机100的当前界面是否包含应用的应用图标和/或应用名称,若包含,则视频通话应用A2判断用户单击位置是否在桌面应用图标和/或应用名称所在位置,若在,则认为位置坐标信息有效。视频通话应用A2根据位置坐标信息"用户单击位置的位置坐标信息"确定在手机100的界面上所对应的区域,并确定该区域对应的操作对象。例如操作对象为视频播放应用A3。则视频通话应用A2确定手机100需要执行视频播放应用A3打开操作。手机100执行视频播放应用A3打开操作以打开视频播放应用A3。
手机100执行视频播放应用A3打开操作以打开视频播放应用A3可以是,视频通话应用A2通过手机100中的窗口管理模块(图中未示出)将视频播放应用A3打开事件发送给视频通话应用A2对应的视频通话界面应用(图中未示出),触发视频通话界面应用执行打开视频播放应用A3的操作。
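上述打开视频播放应用A3的操作,可以用如下示意性代码表示(以Java和Android的启动Intent为例;由应用名称确定应用包名的方式为本示例的假设,需要引入android.content.Context与android.content.Intent):

    // 发起方设备根据应用包名打开对应应用的示意性代码
    void launchApp(Context context, String packageName) {
        Intent intent = context.getPackageManager().getLaunchIntentForPackage(packageName);
        if (intent != null) {
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);   // 例如打开视频播放应用A3
        }
    }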
本实施例提供的增强的屏幕共享方法中,手机200和手机100之间在屏幕共享的过程中,开启联动模式后,手机200可以根据用户U2对手机200的触发操作,触发手机100中对应的操作对象。可以丰富进行屏幕共享的手机100和手机200之间的交互类型,提高用户的体验。
在本申请实施例的一种实现方式中,若手机100和手机200的屏幕分辨率相同,则对于前述的步骤S204,视频通话界面应用204将收到的输入事件分发模块203发送来的单击输入事件的事件信息直接传递给视频通话应用A2,即手机200直接将得到的位置坐标信息发送给手机100。对于前述的S210,手机100中的视频通话应用A2可以直接根据位置坐标信息确定对应的操作对象。
在本申请实施例的一种实现方式中,若手机100和手机200的屏幕分辨率不相同,则手机100和手机200需要进行位置坐标信息的转换。
例如,对于步骤S201,手机200中的传感器模块201根据用户U2的单击操作得到初始位置坐标信息。则对于步骤S204,视频通话界面应用204可以对收到的输入事件分发模块203发送来的单击输入事件中的初始位置坐标信息,根据通过手机100的第一屏幕分辨率F1和手机200的第二屏幕分辨率F2确定的坐标调整参数,进行初始位置坐标信息调整得到位置坐标信息。然后视频通话界面应用204将包括调整得到的位置坐标信息的事件信息发送给视频通话应用A2。
本实现方式中,F1为(F11×F12),F11为手机100的屏幕在水平方向(或者也可以称为屏幕宽度方向,或者x轴方向)的像素点数,F12为手机100的屏幕在垂直方向(或者也可以称为屏幕长度方向,或者y轴方向)的像素点数。F2为(F21×F22),F21为手机200的屏幕在水平方向(或者也可以称为屏幕宽度方向,或者x轴方向)的像素点数,F22为手机200的屏幕在垂直方向(或者也可以称为屏幕长度方向,或者y轴方向)的像素点数。
则Z2=Sb×Z0,其中Z0为初始位置坐标信息(x0, y0),Z2为调整后的位置坐标信息。
坐标调整参数Sb为(Sx2, Sy2),其中,Sx2=F11/F21,Sy2=F12/F22。
则Z2可以是(Sx2×x0, Sy2×y0)。
在本申请实施例的一种实现方式中,手机200也可以如前述的步骤S201~S207将事件信息发送给手机100。对于步骤S210,手机100中的视频通话应用A2得到事件信息后,可以将事件信息发送给手机100中的视频通话界面应用104。视频通话界面应用104根据手机100的第一屏幕分辨率F1和手机200的第二屏幕分辨率F2确定的坐标调整参数调整位置坐标信息得到调整后的位置坐标信息。对于手机100进行坐标调整的过程,此处不再赘述。然后视频通话界面应用104将调整后的位置坐标信息发送给视频通话应用A2,视频通话应用A2根据调整后的位置坐标信息确定对应的操作对象。
在本申请实施例的另一种实现方式中,应用的应用名称也可以是应用对应的应用包的名称等信息,或者还可以是其他类型的可以用于标识应用的应用标识信息。
在本申请实施例的另一种实现方式中,用户可以通过前述的单击、双击、滑动等触摸操作以操作屏幕,也可以通过语音、手势等方式操作屏幕。
在本申请实施例的另一种实现方式中,操作区域信息也可以是用户操作对应的轨迹等图像信息。
在本申请实施例的另一种实现方式中,手机100和手机200也可以是在进行语音通话、设备协同等场景时,进行屏幕共享,并在屏幕共享的过程中开启联动模式。
在本申请的一种实现方式中,第一共享界面和第二共享界面可以部分相同。例如,第二电子设备可以根据第一共享界面的界面数据只显示第一显示界面的部分界面,或者第二电子设备根据第一共享界面的界面数据显示文字等内容与第一显示界面相同、但是文字的排版等与第一显示界面不相同的界面等,其可以根据需要设置。
在本申请的一种实现方式中,前述的应用下载并安装操作,也可以只是应用下载操作。
请参见图12,图12所示为根据本申请实施例的一实施方式提供的电子设备900的结构示意图。电子设备900可以包括耦合到控制器中枢904的一个或多个处理器901。对于至少一个实施例,控制器中枢904经由诸如前端总线(FSB)之类的多分支总线、诸如快速通道互连(QPI)之类的点对点接口、或者类似的连接与处理器901进行通信。处理器901执行控制一般类型的数据处理操作的指令。在一实施例中,控制器中枢904包括,但不局限于,图形存储器控制器中枢(GMCH)(图中未示出)和输入/输出中枢(IOH)(其可以在分开的芯片上)(图中未示出),其中GMCH包括存储器和图形控制器并与IOH耦合。
电子设备900还可包括耦合到控制器中枢904的协处理器906和存储器902。或者,存储器902和GMCH中的一个或两者可以被集成在处理器901内(如本申请实施例中所描述的),存储器902和协处理器906直接耦合到处理器901以及控制器中枢904,控制器中枢904与IOH处于单个芯片中。
在一个实施例中,协处理器906是专用处理器,协处理器906的任选性质用虚线表示在图12中。
在一个实施例中,电子设备900可以进一步包括网络接口(NIC)903。网络接口903可以包括收发器,用于为电子设备900提供无线电接口,进而与任何其他合适的设备(如前端模块,天线等)进行通信。在各种实施例中,网络接口903可以与电子设备900的其他组件集成。网络接口903可以实现上述实施例中的通信单元的功能。
电子设备900可以进一步包括输入/输出(I/O)设备905。
值得注意的是,图12仅是示例性的。即虽然图12中示出了电子设备900包括处理器901、控制器中枢904、存储器902等多个器件,但是,在实际的应用中,使用本申请实施例各方法的设备,可以仅包括电子设备900各器件中的一部分器件,例如,可以仅包含处理器901和NIC903。图12中可选器件的性质用虚线示出。
在该电子设备900的存储器中可以包括用于存储数据和/或指令的一个或多个有形的、非暂时性计算机可读介质。计算机可读存储介质中存储有指令,具体而言,存储有该指令的暂时和永久副本。
本申请实施例中,该电子设备900可以是手机,该电子设备的存储器中存储的指令可以包括:由处理器中的至少一个单元执行时导致手机实施如前述提到的增强的屏幕共享方法的指令。
请参见图13,图13所示为根据本申请的一实施方式提供的SoC(System on Chip,片上系统)1000的结构示意图。在图13中,相似的部件具有同样的附图标记。另外,虚线框是更先进的SoC 1000的可选特征。该SoC 1000可以被用于根据本申请的任一电子设备,根据其所在的设备不同以及其内所存储的指令的不同,可以实现相应的功能。
在图13中,SoC1000包括:互连单元1002,其被耦合至处理器1001;系统代理单元1006;总线控制器单元1005;集成存储器控制器单元1003;一组或一个或多个协处理器1007,其可包括集成图形逻辑、图像处理器、音频处理器和视频处理器;SRAM(静态随机存取存储器)单元1008;DMA(直接存储器存取)单元1004。在一个实施例中,协处理器1007包括专用处理器,诸如例如网络或通信处理器、压缩引擎、GPGPU、高吞吐量MIC处理器、或嵌入式处理器等等。
SRAM单元1008中可以包括用于存储数据和/或指令的一个或多个计算机可读介质。计算机可读存储介质中可以存储有指令,具体而言,存储有该指令的暂时和永久副本。该指令可以包括:由处理器中的至少一个单元执行时导致电子设备实施如前述所提到的增强的屏幕共享方法的指令。
本申请公开的机制的各实施例均可以以软件、硬件、固件或这些实现方法的组合等方式实现。本申请的实施例可实现为在可编程系统上执行的计算机程序或程序代码,该可编程系统包括至少一个处理器、存储器(或存储系统,包括易失性和非易失性存储器和/或存储单元)。
需要说明的是,术语“第一”、“第二”等仅用于区分描述,而不能理解为指示或暗示相对重要性。
需要说明的是,在附图中,可以以特定布置和/或顺序示出一些结构或方法特征。然而,应该理解,可能不需要这样的特定布置和/或排序。而是,在一些实施方式中,这些特征可以以不同于说明性附图中所示的方式和/或顺序来布置。另外,在特定图中包括结构或方法特征并不意味着暗示在所有实施方式中都需要这样的特征,并且在一些实施方式中,可以不包括这些特征或者可以与其他特征组合。
虽然通过参照本申请的某些优选实施方式,已经对本申请进行了图示和描述,但本领域的普通技术人员应该明白,以上内容是结合具体的实施方式对本申请所作的进一步详细说明,不能认定本申请的具体实施只局限于这些说明。本领域技术人员可以在形式上和细节上对其作各种改变,包括做出若干简单推演或替换,而不偏离本申请的精神和范围。

Claims (18)

  1. 一种增强的屏幕共享方法,其特征在于,包括:
    第一电子设备显示第一共享界面,并向第二电子设备发送所述第一共享界面的界面数据;
    所述第二电子设备接收所述界面数据,根据所述界面数据显示与所述第一共享界面对应的第二共享界面;
    所述第二电子设备开启对所述第一电子设备的联动模式;
    在所述联动模式下,所述第二电子设备检测到用户对所述第二共享界面的触发操作,确定所述触发操作对应的事件信息,所述事件信息包括所述触发操作对应的事件类型和操作区域信息;
    所述第二电子设备将所述事件信息发送给所述第一电子设备;
    所述第一电子设备接收所述事件信息,并根据所述事件信息在所述第一电子设备上执行相应操作。
  2. 如权利要求1所述的增强的屏幕共享方法,其特征在于,所述第一电子设备根据所述事件信息在所述第一电子设备上执行相应操作,包括:
    所述第一电子设备根据所述事件类型确定第一操作类型;
    所述第一电子设备根据所述第一操作类型和所述操作区域信息在所述第一电子设备上执行相应操作。
  3. 如权利要求2所述的增强的屏幕共享方法,其特征在于,
    若所述事件类型为第一输入事件,所述第一操作类型为应用确定操作;
    所述第一电子设备根据所述操作区域信息,确定所述操作区域信息在所述第一共享界面上所对应应用的应用标识信息;
    所述第一电子设备生成事件应答信息,所述事件应答信息包括所述应用标识信息;
    所述第一电子设备将所述事件应答信息发送给所述第二电子设备;
    所述第二电子设备接收所述事件应答信息,若根据所述事件应答信息确定所述第二电子设备当前未安装所述应用标识信息对应的应用,则下载或者下载并安装所述应用标识信息对应的应用。
  4. 如权利要求3所述的增强的屏幕共享方法,其特征在于,所述第一输入事件为双击输入事件或长按输入事件。
  5. 如权利要求3所述的增强的屏幕共享方法,其特征在于,所述方法还包括:
    所述第一电子设备根据所述第一操作类型确定第二操作类型,所述第二操作类型为应用下载或者下载并安装操作;
    所述第一电子设备生成事件应答信息,所述事件应答信息包括所述应用标识信息和所述第二操作类型;
    若所述第二电子设备根据所述事件应答信息确定所述第二电子设备当前未安装所述应用标识信息对应的应用,则下载或者下载并安装所述应用标识信息对应的应用。
  6. 如权利要求2所述的增强的屏幕共享方法,其特征在于,
    若所述事件类型为第二输入事件,所述第一操作类型为操作对象触发操作;
    所述第一电子设备根据所述操作区域信息,确定所述操作区域信息在所述第一共享界面上所对应的操作对象;
    所述第一电子设备执行触发所述操作对象的操作。
  7. 如权利要求6所述的增强的屏幕共享方法,其特征在于,所述第二输入事件为单击输入事件或滑动输入事件。
  8. 如权利要求1-7任一项所述的增强的屏幕共享方法,其特征在于,若所述第一电子设备的第一屏幕分辨率和所述第二电子设备的第二屏幕分辨率不相同,所述方法还包括:
    所述第二电子设备根据所述触发操作得到初始操作区域信息;所述第二电子设备根据所述第一屏幕分辨率和所述第二屏幕分辨率,对所述初始操作区域信息进行调整得到所述操作区域信息;或者
    所述第一电子设备根据所述第一屏幕分辨率和所述第二屏幕分辨率,对接收到的所述操作区域信息进行调整,并根据调整后的所述操作区域信息和所述第一操作类型执行相应操作。
  9. 如权利要求1-8任一项所述的增强的屏幕共享方法,其特征在于,所述操作区域信息为位置坐标信息。
  10. 如权利要求1-9任一项所述的增强的屏幕共享方法,其特征在于,所述第二电子设备开启对所述第一电子设备的联动模式,包括:
    所述第一电子设备显示第一联动控件;若所述第一电子设备检测到用户对所述第一联动控件的开启触发操作,所述第一电子设备开启与所述第二电子设备之间的联动模式;或者
    所述第二电子设备显示第二联动控件;若所述第二电子设备检测到用户对所述第二联动控件的开启触发操作,所述第二电子设备开启对所述第一电子设备的联动模式。
  11. 如权利要求10所述的增强的屏幕共享方法,其特征在于,若所述第二电子设备检测到用户对所述第二联动控件的开启触发操作,所述第二电子设备开启对所述第一电子设备的联动模式,包括:
    若所述第二电子设备检测到用户对所述第二联动控件的开启触发操作,所述第二电子设备向所述第一电子设备发送联动请求;
    所述第一电子设备接收所述联动请求,显示联动确定控件;
    若所述第一电子设备检测到用户对所述联动确定控件的触发操作,所述第一电子设备生成并发送同意联动的联动应答给所述第二电子设备;
    所述第二电子设备接收所述联动应答,开启对所述第一电子设备的联动模式。
  12. 如权利要求1-11任一项所述的增强的屏幕共享方法,其特征在于,所述方法还包括:
    所述第二电子设备开启对所述第一电子设备的联动模式,且生成联动操作提示信息;
    所述第二电子设备显示所述联动操作提示信息。
  13. 一种增强的屏幕共享方法,应用于第一电子设备,其特征在于,包括:
    所述第一电子设备显示第一共享界面,并向第二电子设备发送所述第一共享界面的界面数据,以使所述第二电子设备根据所述界面数据显示与所述第一共享界面对应的第二共享界面;
    所述第一电子设备接收所述第二电子设备发送来的事件信息,根据所述事件信息在所述第一电子设备上执行相应操作,所述事件信息为所述第二电子设备开启对所述第一电子设备的联动模式,并在所述联动模式下,根据用户对所述第二共享界面的触发操作确定的信息,所述事件信息包括所述触发操作对应的事件类型和操作区域信息。
  14. 如权利要求13所述的增强的屏幕共享方法,其特征在于,所述第一电子设备根据所述事件信息在所述第一电子设备上执行相应操作,包括:
    所述第一电子设备根据所述事件类型确定第一操作类型;
    所述第一电子设备根据所述第一操作类型和所述操作区域信息在所述第一电子设备上执行相应操作。
  15. 一种增强的屏幕共享方法,应用于第二电子设备,其特征在于,包括:
    所述第二电子设备接收第一电子设备发送来的第一共享界面的界面数据,根据所述界面数据显示与所述第一共享界面对应的第二共享界面,所述第一共享界面为所述第一电子设备显示的界面;
    所述第二电子设备开启对所述第一电子设备的联动模式;
    在所述联动模式下,所述第二电子设备检测到用户对所述第二共享界面的触发操作,确定所述触发操作对应的事件信息,所述事件信息包括所述触发操作对应的事件类型和操作区域信息;
    所述第二电子设备将所述事件信息发送给所述第一电子设备,以使所述第一电子设备根据所述事件信息在所述第一电子设备上执行相应操作。
  16. 一种增强的屏幕共享系统,其特征在于,包括:第一电子设备和第二电子设备;其中,
    所述第一电子设备用于显示第一共享界面,并向所述第二电子设备发送所述第一共享界面的界面数据;
    所述第二电子设备用于接收所述界面数据,根据所述界面数据显示与所述第一共享界面对应的第二共享界面;
    所述第二电子设备还用于开启对所述第一电子设备的联动模式;
    在所述联动模式下,所述第二电子设备用于在检测到用户对所述第二共享界面的触发操作时,确定所述触发操作对应的事件信息,所述事件信息包括所述触发操作对应的事件类型和操作区域信息;
    所述第二电子设备还用于将所述事件信息发送给所述第一电子设备;
    所述第一电子设备用于接收所述事件信息,并根据所述事件信息在所述第一电子设备上执行相应操作。
  17. 一种电子设备,其特征在于,包括:
    存储器,用于存储计算机程序,所述计算机程序包括程序指令;
    处理器,用于执行所述程序指令,以使所述电子设备执行如权利要求1-12任一项所述的增强的屏幕共享方法,或以使所述电子设备执行如权利要求13或14所述的增强的屏幕共享方法,或以使所述电子设备执行如权利要求15所述的增强的屏幕共享方法。
  18. 一种计算机可读取存储介质,其特征在于,所述计算机可读取存储介质存储有计算机程序,所述计算机程序包括程序指令,所述程序指令被电子设备运行以使电子设备执行如权利要求1-12任一项所述的增强的屏幕共享方法,或以使所述电子设备执行如权利要求13或14所述的增强的屏幕共享方法,或以使所述电子设备执行如权利要求15所述的增强的屏幕共享方法。
PCT/CN2021/137463 2020-12-21 2021-12-13 一种增强的屏幕共享方法和系统、电子设备 WO2022135210A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21909207.9A EP4242826A4 (en) 2020-12-21 2021-12-13 IMPROVED SCREEN SHARING METHOD AND SYSTEM AND ELECTRONIC DEVICE
US18/336,885 US20230333803A1 (en) 2020-12-21 2023-06-16 Enhanced Screen Sharing Method and System, and Electronic Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011518228.1A CN114647390B (zh) 2020-12-21 2020-12-21 一种增强的屏幕共享方法和系统、电子设备
CN202011518228.1 2020-12-21

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/336,885 Continuation US20230333803A1 (en) 2020-12-21 2023-06-16 Enhanced Screen Sharing Method and System, and Electronic Device

Publications (1)

Publication Number Publication Date
WO2022135210A1 true WO2022135210A1 (zh) 2022-06-30

Family

ID=81991200

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137463 WO2022135210A1 (zh) 2020-12-21 2021-12-13 一种增强的屏幕共享方法和系统、电子设备

Country Status (4)

Country Link
US (1) US20230333803A1 (zh)
EP (1) EP4242826A4 (zh)
CN (1) CN114647390B (zh)
WO (1) WO2022135210A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115297344B (zh) * 2022-06-27 2024-03-22 青岛海尔科技有限公司 屏端设备协同交互方法、装置、存储介质及电子装置
WO2024065449A1 (zh) * 2022-09-29 2024-04-04 京东方科技集团股份有限公司 一种数据共享显示的方法及智能显示系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080125104A1 (en) * 2006-07-04 2008-05-29 Samsung Electronics Co., Ltd. Apparatus and method for sharing video telephony screen in mobile communication terminal
CN101582020A (zh) * 2008-05-14 2009-11-18 北京帮助在线信息技术有限公司 一种用于在线帮助的双工和单工协调的设备和方法
CN101582021A (zh) * 2008-05-14 2009-11-18 北京帮助在线信息技术有限公司 一种用于在线帮助的双工操作的设备和方法
CN101789955A (zh) * 2009-12-25 2010-07-28 宇龙计算机通信科技(深圳)有限公司 一种桌面共享控制方法、装置及移动终端
CN103685389A (zh) * 2012-09-13 2014-03-26 卓望数码技术(深圳)有限公司 一种远程控制数据终端实现用户界面交互的系统及方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050056041A (ko) * 2003-12-09 2005-06-14 주식회사 아데나코리아 이기종 장치간 자원 공유를 위한 객체 제어 방법 및 시스템
KR101685364B1 (ko) * 2010-01-05 2016-12-12 엘지전자 주식회사 휴대 단말기, 휴대 단말기 시스템 및 그 동작 제어방법
KR20120081368A (ko) * 2011-01-11 2012-07-19 주식회사 엔씨소프트 모바일 플랫폼에서의 채팅을 통한 게임 초대 방법
US8913026B2 (en) * 2012-03-06 2014-12-16 Industry-University Cooperation Foundation Hanyang University System for linking and controlling terminals and user terminal used in the same
CN103685440B (zh) * 2012-09-14 2017-10-10 南京邻动网络科技有限公司 用于在多个移动电子设备之间共享文件的方法
JP6097679B2 (ja) * 2013-02-28 2017-03-15 エルジー アプラス コーポレーション 端末間機能共有方法及びその端末
KR20150069155A (ko) * 2013-12-13 2015-06-23 삼성전자주식회사 전자 장치의 터치 인디케이터 디스플레이 방법 및 그 전자 장치
WO2016036603A1 (en) * 2014-09-02 2016-03-10 Apple Inc. Reduced size configuration interface
CN106339192B (zh) * 2016-08-24 2019-12-06 腾讯科技(深圳)有限公司 区域共享方法、装置及系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080125104A1 (en) * 2006-07-04 2008-05-29 Samsung Electronics Co., Ltd. Apparatus and method for sharing video telephony screen in mobile communication terminal
CN101582020A (zh) * 2008-05-14 2009-11-18 北京帮助在线信息技术有限公司 一种用于在线帮助的双工和单工协调的设备和方法
CN101582021A (zh) * 2008-05-14 2009-11-18 北京帮助在线信息技术有限公司 一种用于在线帮助的双工操作的设备和方法
CN101789955A (zh) * 2009-12-25 2010-07-28 宇龙计算机通信科技(深圳)有限公司 一种桌面共享控制方法、装置及移动终端
CN103685389A (zh) * 2012-09-13 2014-03-26 卓望数码技术(深圳)有限公司 一种远程控制数据终端实现用户界面交互的系统及方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4242826A4

Also Published As

Publication number Publication date
US20230333803A1 (en) 2023-10-19
EP4242826A4 (en) 2024-05-08
EP4242826A1 (en) 2023-09-13
CN114647390A (zh) 2022-06-21
CN114647390B (zh) 2024-03-26

Similar Documents

Publication Publication Date Title
US20220342850A1 (en) Data transmission method and related device
CN112394895B (zh) 画面跨设备显示方法与装置、电子设备
WO2020134872A1 (zh) 一种消息处理的方法、相关装置及系统
CN112558825A (zh) 一种信息处理方法及电子设备
US20150019694A1 (en) Method for Screen Sharing, Related Device, and Communications System
EP4040277A1 (en) Method for displaying multiple windows, and electronic device and system
WO2021143182A1 (zh) 游戏的处理方法、装置、电子设备及计算机可读存储介质
WO2022135210A1 (zh) 一种增强的屏幕共享方法和系统、电子设备
WO2016127795A1 (zh) 业务处理方法、服务器及终端
WO2021185244A1 (zh) 一种设备交互的方法和电子设备
WO2021121052A1 (zh) 一种多屏协同方法、系统及电子设备
WO2021147779A1 (zh) 配置信息分享方法、终端设备及计算机可读存储介质
CN112398855B (zh) 应用内容跨设备流转方法与装置、电子设备
WO2022105445A1 (zh) 基于浏览器的应用投屏方法及相关装置
CN112527174B (zh) 一种信息处理方法及电子设备
WO2018107941A1 (zh) 一种ar场景下的多屏联动方法和系统
US20230138804A1 (en) Enhanced video call method and system, and electronic device
JP7168671B2 (ja) サービス処理方法及び移動通信端末
CN112527222A (zh) 一种信息处理方法及电子设备
WO2022048500A1 (zh) 一种显示方法及设备
CN113613064B (zh) 视频处理方法、装置、存储介质及终端
WO2023030099A1 (zh) 跨设备交互的方法、装置、投屏系统及终端
WO2022179405A1 (zh) 一种投屏显示方法及电子设备
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
WO2021190353A1 (zh) 交互方法和显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21909207

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021909207

Country of ref document: EP

Effective date: 20230607

NENP Non-entry into the national phase

Ref country code: DE