WO2024017296A1 - Sharing method, electronic device and system

Sharing method, electronic device and system

Info

Publication number
WO2024017296A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
interface
electronic device
sharing
user
Prior art date
Application number
PCT/CN2023/108156
Other languages
English (en)
French (fr)
Inventor
贾银元
张利
王良
许浩维
王志峰
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2024017296A1


Classifications

    • H: Electricity; H04: Electric communication technique; H04L: Transmission of digital information, e.g. telegraphic communication
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications; H04L 67/01: Protocols; H04L 67/10: Protocols in which an application is distributed across nodes in the network
    • H04L 67/104: Peer-to-peer [P2P] networks
    • H04L 67/1074: Peer-to-peer [P2P] networks for supporting data block transmission mechanisms
    • H04L 67/1078: Resource delivery mechanisms
    • H04L 67/1095: Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • H04N 7/14: Television systems; systems for two-way working
    • H04W 4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • the present application relates to the field of computer technology, and in particular, to a sharing method, electronic device and system.
  • This application discloses a sharing method, electronic device and system, which enable users to realize real-time sharing functions such as watching together, listening together, playing together and editing together with at least one call counterparty, nearby users, etc., in a simpler and faster way.
  • this application provides a sharing method, applied to a first device.
  • the method includes: displaying a first interface, where the first interface is used to indicate an ongoing operator call with a second device; displaying a second interface of a first application while on the operator call with the second device; receiving a first user operation on the second interface; and sending first data to the second device, where the first data is used by the second device to output multimedia data related to the second interface.
  • when the first device and the second device are on an operator call, the first device can send the first data related to the first application (a foreground application) to the second device according to the received first user operation.
  • This enables the second device to output multimedia data related to the interface of the first application, which solves the problem of being unable to share multimedia data streams in real time during operator calls, allowing users to watch and listen together with the other party through simpler and faster operations, meeting user needs and improving the user experience.
  • the first interface and the second interface include a first floating window; the first user operation is a user operation acting on a sharing control in the first floating window, or a user operation of sliding along a first trajectory.
  • that is, multiple forms of the first user operation can trigger sending the first data to the second device.
  • the user can choose which form of the first user operation to perform as needed, meeting different user needs and improving the user experience.
  • before sending the first data to the second device, the method further includes: when displaying the second interface, capturing multimedia data related to the second interface, where the first data includes the captured multimedia data related to the second interface.
  • the first data includes multimedia data, such as audio streams and video streams related to the second interface, captured by the first device as it outputs them. Therefore, after receiving the first data, the second device can directly output the multimedia data related to the second interface, and real-time sharing is achieved without installing the first application or adapting to the first application, broadening application scenarios and improving the user experience.
  • sending the first data to the second device includes: sending the call data of the operator call and the first data to the second device through the main link of the operator call.
  • sending the first data to the second device includes: sending the call data of the operator call to the second device through the main link of the operator call, and sending the first data to the second device through the data path of the operator call.
  • sending the first data to the second device includes: sending the call data of the operator call to the second device through the main link of the operator call, and sending the first data to the second device over an auxiliary link.
  • the auxiliary link is a network address translation (NAT) traversal link or a relay link.
  • the physical channel of the auxiliary link is a cellular communication link, a wireless fidelity (Wi-Fi) link, a Bluetooth (BT) link, a device-to-device (D2D) link or a satellite link.
  • when the first device and the second device are on an operator call, the first device can send the first data shared in real time through the main link, the data path, or an associated auxiliary link of the operator call. Therefore, the first device and the second device do not need to install chat applications, conferencing applications or other applications for real-time sharing of multimedia data; users can quickly share multimedia data in real time based on the current operator call, broadening application scenarios and improving the user experience.
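  • As a non-limiting illustration of the routing options above, the following Python sketch separates call data on the operator call's main link from shared first data on a selectable route (data path or auxiliary link). All class and method names are hypothetical, not from the patent or any real SDK.

```python
# Hedged sketch: route call audio over the operator-call main link, and route
# the shared multimedia stream ("first data") over a chosen secondary route.
from dataclasses import dataclass
from enum import Enum, auto

class Link(Enum):
    MAIN = auto()          # main link of the operator call (call data)
    DATA_PATH = auto()     # data path of the operator call
    AUXILIARY = auto()     # NAT traversal or relay link

@dataclass
class Packet:
    payload: bytes
    link: Link

class SharingSender:
    def __init__(self, transport):
        self.transport = transport  # callable taking a Packet

    def send_call_audio(self, frame: bytes):
        # Call data always travels over the main link.
        self.transport(Packet(frame, Link.MAIN))

    def send_first_data(self, frame: bytes, route: Link = Link.AUXILIARY):
        # First data may travel over the main link, the data path,
        # or an auxiliary link, matching the three variants above.
        self.transport(Packet(frame, route))
```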
  • before sending the first data to the second device, the method further includes: sending a first request message to a network device, where the first request message includes identification information of the second device; receiving a session identifier of the second device sent by the network device based on the first request message; and establishing the auxiliary link with the second device according to the session identifier of the second device.
  • the identification information includes a phone number, an over-the-top (OTT) identification (ID), and a network account number.
  • even if the first device does not originally store the session identifier of the second device, it can obtain the session identifier through the identification information of the second device that it already has, thereby establishing an auxiliary link with the second device.
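  • The exchange above can be pictured with the following hedged sketch; the network object and its methods are assumptions for illustration only.

```python
# Hedged sketch: obtain the peer's session identifier from a network device
# using identification info already on hand (phone number / OTT ID / account),
# then establish the auxiliary link by NAT traversal, falling back to a relay.
def establish_auxiliary_link(network, second_device_id: str):
    request = {"type": "first_request", "peer": second_device_id}
    session_id = network.resolve_session_id(request)  # network device replies
    if session_id is None:
        raise RuntimeError("second device not reachable")
    # Both NAT traversal links and relay links are valid auxiliary links here.
    return network.nat_traverse(session_id) or network.relay(session_id)
```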
  • sending the first data to the second device includes: displaying a third interface, where the third interface includes information on multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
  • the shared device (i.e., the second device with which the first device shares in real time) can be determined in response to the user's operation, which makes usage more flexible and improves the user experience.
  • the second device is a device connected to the first device through near field communication. This solves the problem of being unable to share multimedia data streams in real time in near field communication scenarios, allowing users to watch and listen together with nearby devices in a simpler and faster way, meeting user needs and improving user experience.
  • the multiple devices include at least one of the following: a discovered device, a connected device, a device with which an operator call was most recently made, a device whose identification information is stored, and a device identified based on a captured image.
  • the connected device includes the second device, a device connected through near field communication, and a device connected through far field communication.
  • the discovered devices include devices discovered through near field communication and devices discovered through far field communication.
  • the captured images include images captured by the first device, and/or images captured by a device connected to the first device.
  • sending the first data to the second device includes: displaying a fourth interface, where the fourth interface includes information of multiple windows; receiving a third user operation acting on a first window among the multiple windows, where the first window includes the content of the second interface; and sending the first data to the second device.
  • the content (i.e., the first data) shared by the first device in real time can be determined in response to the user's operation, making usage more flexible and improving the user experience.
  • the multiple windows include at least one of the following: windows of foreground applications, windows of background applications, and windows of applications that are installed but not running on the first device.
  • the content to be shared that can be selected by the user can be the multimedia data of a foreground application, of a background application, or of an application that is installed but not running on the first device, so as to satisfy the user's needs for sharing different multimedia data in real time and improve the user experience.
  • sending the first data to the second device includes: displaying a fifth interface, where the fifth interface includes multiple sharing modes; receiving a fourth user operation acting on a first mode among the multiple sharing modes; displaying a sixth interface, where the sixth interface includes information about multiple windows and multiple devices, the multiple windows and the multiple devices being determined according to the first mode; receiving a fifth user operation acting on a second window among the multiple windows, and receiving a sixth user operation acting on the second device among the multiple devices, where the second window includes the content of the second interface; and sending the first data to the second device according to the fifth user operation and the sixth user operation.
  • the first mode is watching together, the multiple windows include windows of a video application, and the multiple devices include devices equipped with display screens (for example, mobile phones, tablet computers).
  • the first mode is listening together, the multiple windows include windows of a music application, and the multiple devices include devices equipped with speakers (for example, headphones, speakers).
  • the shared devices and the content to be shared that the first device displays for the user to select may be determined according to the sharing mode selected by the user, thereby filtering out shared devices and content that the user is unlikely to select, preventing such information from affecting the user's choice and improving the user experience.
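  • A hedged sketch of this mode-based filtering follows; the mode names, window kinds and device attributes are illustrative assumptions.

```python
# Hedged sketch: filter candidate windows and devices by the selected sharing
# mode (watch together: video windows and devices with displays; listen
# together: music windows and devices with speakers).
def candidates_for_mode(mode, windows, devices):
    if mode == "watch_together":
        return ([w for w in windows if w["kind"] == "video"],
                [d for d in devices if d["has_display"]])
    if mode == "listen_together":
        return ([w for w in windows if w["kind"] == "music"],
                [d for d in devices if d["has_speaker"]])
    return windows, devices  # other modes: unfiltered in this sketch

windows = [{"kind": "video", "id": 1}, {"kind": "music", "id": 2}]
devices = [{"has_display": True, "has_speaker": False, "name": "tablet"},
           {"has_display": False, "has_speaker": True, "name": "speaker"}]
print(candidates_for_mode("watch_together", windows, devices))
```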
  • before sending the first data to the second device, the method further includes: receiving a seventh user operation; in response to the seventh user operation, determining that the type of shared data is a first type; where, when the first type is audio, the first data includes audio data related to the second interface; when the first type is image, the first data includes video data related to the second interface; and when the first type is audio and image, the first data includes audio data and video data related to the second interface.
  • the user can select the type of content to be shared, that is, select whether the first data is audio, image, or audio and image, meeting users' personalized needs and improving the user experience.
  • the first data includes video data related to the second interface; the method further includes: receiving an eighth user operation that acts on the second interface and slides along a second trajectory; and sending second data to the second device, where the second data includes audio data related to the second interface.
  • the first trajectory is a W-shaped trajectory.
  • the second trajectory is an L-shaped trajectory.
  • users can trigger sharing of different types of content by performing different user operations, making the operation simpler and more convenient and improving user experience.
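  • The gesture-to-content mapping in this example can be sketched as follows; the trajectory strings and the sender object are hypothetical.

```python
# Hedged sketch: a W-shaped slide triggers sharing of interface video (first
# data); a later L-shaped slide additionally shares interface audio (second
# data), per the example trajectories above.
def on_gesture(trajectory: str, sender):
    if trajectory == "W":
        sender.share(kind="video")   # first data: interface-related video
    elif trajectory == "L":
        sender.share(kind="audio")   # second data: interface-related audio
    # The patent leaves the concrete trajectory set open; other shapes could
    # map to other content types.
```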
  • before sending the first data to the second device, the method further includes: receiving a ninth user operation for selecting a first area in the second interface, where the first data includes multimedia data related to the first area.
  • the user can choose to share the multimedia data related to some areas in the second interface, which allows the user to quickly share the multimedia data of any area to meet the user's personalized needs and improve the user experience.
  • before sending the first data to the second device, the method further includes: receiving a tenth user operation for selecting a first layer in the second interface, where the first data includes multimedia data related to the first layer.
  • the user can choose to share the multimedia data related to some layers in the second interface, which allows the user to quickly share the multimedia data of any layer to meet the user's personalized needs and improve the user experience.
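  • A minimal sketch of area-restricted sharing follows, modeling a frame as a nested list of pixels; a real implementation would crop at the compositor or encoder instead.

```python
# Hedged sketch: restrict the shared stream to the user-selected first area by
# cropping each frame before it is encoded and sent.
def crop_region(frame, region):
    top, left, bottom, right = region
    return [row[left:right] for row in frame[top:bottom]]

frame = [[(x, y) for x in range(8)] for y in range(8)]
shared = crop_region(frame, (2, 2, 6, 6))  # only this area is shared
```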
  • sending the first data to the second device includes: when the first application is not a preset application, sending the first data to the second device, where the security level of a preset application is higher than a first level.
  • the preset application includes an application determined by the first device in response to a user operation.
  • the preset application includes an application determined by the first device according to preset rules.
  • the preset applications include banking applications and/or payment applications.
  • the first device may not share multimedia data of preset applications whose security level is higher than the first level, effectively ensuring the privacy and security of the user.
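  • A hedged sketch of this security gate follows; the level threshold and the app list are illustrative assumptions, not values from the patent.

```python
# Hedged sketch: refuse to share multimedia data of preset applications whose
# security level exceeds the first level (e.g. banking or payment apps).
FIRST_LEVEL = 3
PRESET_APPS = {"bank_app": 5, "payment_app": 4}  # app name -> security level

def may_share(app_name: str) -> bool:
    return PRESET_APPS.get(app_name, 0) <= FIRST_LEVEL

assert not may_share("bank_app")   # blocked: level 5 > first level
assert may_share("video_player")   # not a preset app, sharing allowed
```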
  • sending the first data to the second device includes: identifying that the security level of data related to a second area in the second interface is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
  • the data related to the second area includes data determined by the first device in response to a user operation.
  • the data related to the second area includes data determined by the first device according to preset rules.
  • the data related to the second area includes user name, password, account name, login name, ID number, bank card number, and account balance.
  • the first device may not share data with a security level higher than the second level, effectively ensuring the privacy and security of the user.
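  • Excluding the second area can be sketched as blanking those pixels before encoding; this is a hedged illustration, not the patent's mechanism.

```python
# Hedged sketch: blank a high-security region (e.g. a password or bank-card
# field) so the first data sent to the second device does not include it.
def mask_region(frame, region, fill=0):
    top, left, bottom, right = region
    for y in range(top, bottom):
        for x in range(left, right):
            frame[y][x] = fill  # overwrite sensitive pixels
    return frame
```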
  • displaying the second interface of the first application includes: receiving broadcast data of the first channel sent by a network device; and displaying the second interface according to the broadcast data of the first channel.
  • the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel; receiving an eleventh user operation; and sending the broadcast data of the second channel to a third device, where the broadcast data of the second channel is used by the third device to output audio and/or video of the second channel.
  • the first device may not output the received broadcast data of the second channel, but directly sends the broadcast data of the second channel to the third device in response to the user operation; the application processor of the first device does not need to wake up to process the broadcast data of the second channel, thereby reducing device power consumption.
  • the first device does not need to have the ability to decode and play broadcast data, broadening application scenarios and providing a better user experience.
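  • A hedged sketch of this pass-through behavior follows; the decoder and device objects are assumptions for illustration.

```python
# Hedged sketch: decode and display channel 1 locally, but forward channel 2
# broadcast data to the third device as an opaque payload, so the local
# application processor never has to decode channel 2.
def handle_broadcast(channel: int, payload: bytes, decoder, third_device):
    if channel == 1:
        decoder.render(payload)     # channel 1 backs the displayed interface
    elif channel == 2:
        third_device.send(payload)  # channel 2: pure pass-through
```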
  • sending the first data to the second device includes: sending the first data and third data to the second device, where the third data includes audio data collected through the microphone of the first device and/or image data collected through the camera of the first device.
  • the multimedia data sent by the first device to the second device can be superimposed with the audio data collected by the microphone and/or the image data collected by the camera.
  • the user using the second device can watch/listen to the application data while seeing the other party's situation and/or listening to the other party's explanation, meeting the user's personalized needs and improving the user experience.
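  • Audio superposition of this kind can be sketched as naive sample-wise mixing; real systems would mix in the audio pipeline with proper resampling.

```python
# Hedged sketch: superimpose microphone audio (third data) onto the shared
# application audio (first data) by summing 16-bit samples with clamping.
def mix(app_samples, mic_samples):
    return [max(-32768, min(32767, a + m))  # clamp to the 16-bit range
            for a, m in zip(app_samples, mic_samples)]

print(mix([1000, -2000, 32000], [500, -500, 32000]))  # last sample clamps
```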
  • the method further includes: receiving a twelfth user operation; in response to the twelfth user operation, determining not to grant the second device permission to save the first data and permission to forward the first data; receiving a second request message sent by the second device, where the second request message is used to request to save and/or forward the first data; and displaying first prompt information according to the second request message.
  • the first device can be set not to allow the second device to save or forward the first data.
  • when the second device needs to save the first data or forward the first data, it can request permission from the first device, which prevents the second device from re-propagating the first data shared by the first device without authorization, improving the protection of the user's privacy and security.
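  • A hedged sketch of this permission flow follows; the class and return values are illustrative assumptions.

```python
# Hedged sketch: the sharing side withholds save/forward permission; a later
# request from the shared side surfaces as prompt information rather than
# being granted silently.
class SharePolicy:
    def __init__(self):
        self.allow_save = False     # set by the twelfth user operation
        self.allow_forward = False

    def handle_request(self, wants_save: bool, wants_forward: bool) -> str:
        if (wants_save and not self.allow_save) or \
           (wants_forward and not self.allow_forward):
            return "PROMPT_USER"    # show the prompt to the sharing user
        return "GRANTED"
```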
  • the method further includes: receiving a third request message sent by the second device, where the third request message is used to request real-time sharing of multimedia data to the first device; displaying second prompt information according to the third request message; receiving a thirteenth user operation, where the thirteenth user operation is used to accept the request indicated by the third request message; receiving fourth data sent by the second device; and outputting the fourth data.
  • when the first device shares the first data with the second device, the second device can also share multimedia data with the first device, that is, two-way sharing is realized, meeting the user's personalized real-time sharing needs and improving the user experience.
  • outputting the fourth data includes: displaying a seventh interface according to the fourth data, where, when the first device displays the seventh interface, the second device displays the content of the second interface; or outputting the fourth data includes: displaying the second interface and an eighth interface in split screen, where the eighth interface is determined based on the fourth data.
  • when the first device displays the content shared by the second device, the second device can also display the content shared by the first device (that is, "you see mine, I see yours"), or the first device can display, in split screen, the content shared by this device and the content shared by the second device.
  • the display methods are flexible and diverse to meet the different needs of users in different scenarios.
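  • The two display options can be sketched as follows; the view objects and layout keys are hypothetical.

```python
# Hedged sketch: either show only the peer's shared content ("you see mine,
# I see yours") or show local and peer content side by side in split screen.
def compose(local_view, peer_view, split_screen: bool):
    if split_screen:
        return {"left": local_view, "right": peer_view}
    return {"full": peer_view}  # each side shows what the other shared
```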
  • the method further includes: receiving a fourteenth user operation; and sending the fourth data to a fourth device, so that the fourth device outputs the fourth data.
  • the first device can share the fourth data shared by the second device with other devices to meet the user's personalized real-time sharing needs and improve the user experience.
  • sending the first data to the second device includes: sending the first data to the second device through a first link and a second link, where the first link is a cellular communication link or an auxiliary link, and the second link includes at least one of the following: a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, a device-to-device (D2D) link, a cellular communication link and an auxiliary link, where the first link and the second link are different.
  • the first device can transmit the first data over different transmission paths using different communication methods, for example, transmitting the first data once through the first link and once again through the second link. It can be understood that this achieves redundant packet compensation, avoiding the situation where one unstable link prevents the second device from receiving valid first data, and improving transmission quality.
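  • A hedged sketch of redundant dual-link transmission with receiver-side de-duplication follows; the link objects are assumptions.

```python
# Hedged sketch: send each packet over two different links; the receiver keeps
# the first copy of each sequence number and drops the duplicate, so one
# unstable link cannot prevent delivery of valid first data.
def send_redundant(seq: int, payload: bytes, link_a, link_b):
    for link in (link_a, link_b):   # same packet, two different links
        link.send((seq, payload))

class Deduplicator:
    def __init__(self):
        self.seen = set()

    def on_packet(self, seq: int, payload: bytes):
        if seq in self.seen:
            return None             # duplicate arriving via the other link
        self.seen.add(seq)
        return payload
```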
  • the method further includes: displaying a ninth interface, where the ninth interface includes information on multiple user interfaces run by the first device; receiving a fifteenth user operation acting on a first control in the ninth interface, where the first control is related to a tenth interface among the multiple user interfaces; and sending fifth data to a fifth device, where the fifth data is used by the fifth device to output multimedia data related to the tenth interface.
  • the ninth interface is a user interface of a multi-task list.
  • the user can trigger sharing of multimedia data related to one of the tasks (the tenth interface) from the user interface of the multi-task list. There are thus various ways to trigger real-time sharing, which can meet the user's different needs in different scenarios and improve the user experience.
  • the method further includes: displaying an eleventh interface, where the eleventh interface includes information on multiple functions of the control center; receiving a sixteenth user operation acting on a second control in the eleventh interface, where the second control is related to a sharing function among the multiple functions; and sending sixth data to a sixth device, where the sixth data is used by the sixth device to output multimedia data of the foreground application of the first device.
  • the eleventh interface is a user interface of the control center displayed by the first device in response to a user operation of sliding down from the upper edge of the screen.
  • users can trigger real-time sharing based on the user interface of the control center.
  • this application provides yet another sharing method, applied to a first device.
  • the method includes: displaying a first interface, where the first interface includes information on multiple windows running on the first device; receiving a first user operation acting on a first control in the first interface, where the first control is related to a first window of a first application among the multiple windows; and sending first data to a second device, where the first data is used by the second device to output multimedia data related to the first window.
  • the first interface is a user interface of a multi-task list.
  • the second device is a device that conducts operator calls with the first device.
  • the second device is a device connected to the first device through near field communication.
  • the second device is a device connected to the first device through far field communication.
  • the user can trigger the sharing of multimedia data related to one of the tasks (the first window) based on the user interface of the multi-task list.
  • the second device being shared with can be the call counterparty or a nearby device, which solves the problem of being unable to share multimedia data streams in real time in operator call and near field communication scenarios, allowing users to watch and listen together with the other party, nearby devices, and far-field devices through simpler and faster operations, meeting user needs and improving the user experience.
  • sending the first data to the second device includes: displaying a second interface, where the second interface includes information of multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
  • the multiple devices include at least one of the following: a discovered device, a connected device, a device with which an operator call was most recently made, a device whose identification information is stored, and a device identified based on a captured image.
  • the connected devices include devices currently making calls with the operator, devices connected through near field communication, and devices connected through far field communication.
  • before displaying the second interface, the method further includes: displaying a third interface, where the third interface includes multiple sharing modes; and receiving a third user operation acting on a first mode among the multiple sharing modes, where the multiple devices are determined according to the first mode.
  • the first mode is watching together, and the multiple devices include devices equipped with display screens (such as mobile phones and tablet computers).
  • the first mode is listening together, and the multiple devices include devices equipped with speakers (for example, headphones, speakers).
  • before sending the first data to the second device, the method further includes: receiving a fourth user operation; in response to the fourth user operation, determining that the type of shared data is a first type; where, when the first type is audio, the first data includes audio data related to the first window; when the first type is image, the first data includes video data related to the first window; and when the first type is audio and image, the first data includes audio data and video data related to the first window.
  • before sending the first data to the second device, the method further includes: receiving a fifth user operation acting on a first area in the first window, where the first data includes multimedia data related to the first area.
  • before sending the first data to the second device, the method further includes: receiving a sixth user operation acting on a first layer in the first window, where the first data includes multimedia data related to the first layer.
  • sending the first data to the second device includes: when the first application is not a preset application, sending the first data to the second device, where the security level of a preset application is higher than the first level.
  • sending the first data to the second device includes: identifying that the security level of data related to a second area in the first window is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
  • displaying the first interface includes: receiving broadcast data of a first channel sent by a network device; and displaying the first window in the first interface according to the broadcast data of the first channel.
  • the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel; receiving a seventh user operation; and sending the broadcast data of the second channel to the second device, where the broadcast data of the second channel is used by the second device to output audio and/or video of the second channel.
  • sending the first data to the second device includes: sending the first data and second data to the second device, where the second data includes audio data collected through the microphone of the first device and/or image data collected through the camera of the first device.
  • the method further includes: receiving an eighth user operation; in response to the eighth user operation, determining not to grant the second device permission to save the first data and permission to forward the first data; receiving a first request message sent by the second device, where the first request message is used to request to save and/or forward the first data; and displaying first prompt information according to the first request message.
  • the method further includes: receiving a second request message sent by the second device, where the second request message is used to request real-time sharing; displaying second prompt information according to the second request message; receiving a ninth user operation, where the ninth user operation is used to accept the request indicated by the second request message; receiving third data sent by the second device; and outputting the third data.
  • outputting the third data includes: displaying a fourth interface according to the third data, where, when the first device displays the fourth interface, the second device displays the content of the first window; or outputting the third data includes: displaying a fifth interface and a sixth interface in split screen, where the fifth interface includes the content of the first window and the sixth interface is determined based on the third data.
  • the method further includes: receiving a tenth user operation; and sending the third data to a third device, so that the third device outputs the third data.
  • sending the first data to the second device includes: sending the first data to the second device through a first link and a second link, where the first link and the second link include at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, and a device-to-device (D2D) link, and the first link and the second link are different.
  • the present application provides yet another sharing method, applied to a first device.
  • the method includes: displaying a first interface, where the first interface includes information on multiple functions of the control center; receiving a first user operation acting on a first control in the first interface, where the first control is related to a sharing function among the multiple functions; and sending first data to a second device.
  • the first interface is a user interface of the control center displayed by the first device in response to a user operation of sliding down from the upper edge of the screen.
  • the second device is a device that conducts operator calls with the first device.
  • the second device is a device connected to the first device through near field communication.
  • the second device is a device connected to the first device through far field communication.
  • the user can trigger real-time sharing from the user interface of the control center, and the second device being shared with can be the call counterparty or a nearby device, which solves the problem of being unable to share multimedia data streams in real time in operator call and near field communication scenarios, allowing users to watch and listen together with the other party, nearby devices, and far-field devices through simpler and faster operations, meeting user needs and improving the user experience.
  • sending the first data to the second device includes: displaying a second interface, where the second interface includes information of multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
  • the multiple devices include at least one of the following: a discovered device, a connected device, a device with which an operator call was most recently made, a device whose identification information is stored, and a device identified based on a captured image.
  • the connected devices include devices currently making calls with the operator, devices connected through near field communication, and devices connected through far field communication.
  • sending the first data to the second device includes: displaying a third interface, where the third interface includes information of multiple windows; receiving a third user operation acting on a first window among the multiple windows, where the first data includes multimedia data related to the first window; and sending the first data to the second device.
  • the multiple windows include at least one of the following: windows of foreground applications, windows of background applications, and windows of applications that are installed but not running on the first device.
  • before sending the first data to the second device, the method further includes: receiving a fourth user operation acting on a first area in the first window, where the first data includes multimedia data related to the first area.
  • before sending the first data to the second device, the method further includes: receiving a fifth user operation acting on a first layer in the first window, where the first data includes multimedia data related to the first layer.
  • sending the first data to the second device includes: when the application corresponding to the first data is not a preset application, sending the first data to the second device, where the security level of a preset application is higher than the first level.
  • sending the first data to the second device includes: identifying that the security level of data related to a second area in the first window is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
  • sending the first data to the second device includes: displaying a fourth interface, where the fourth interface includes multiple sharing modes; receiving a sixth user operation acting on a first mode among the multiple sharing modes; displaying a fifth interface, where the fifth interface includes information on multiple windows and multiple devices, the multiple windows and the multiple devices being determined according to the first mode; receiving a seventh user operation acting on a second window among the multiple windows, and receiving an eighth user operation acting on the second device among the multiple devices, where the first data includes multimedia data related to the second window; and sending the first data to the second device according to the seventh user operation and the eighth user operation.
  • before sending the first data to the second device, the method further includes: receiving a ninth user operation; in response to the ninth user operation, determining that the type of shared data is a first type; where, when the first type is audio, the first data includes audio data; when the first type is image, the first data includes video data; and when the first type is audio and image, the first data includes audio data and video data.
  • before sending the first data to the second device, the method further includes: receiving broadcast data of a first channel sent by the network device; and displaying a sixth interface according to the broadcast data of the first channel, where the first data includes multimedia data related to the sixth interface.
  • the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel; the first data includes the broadcast data of the second channel; and the first data is used by the second device to output audio and/or video of the second channel.
  • sending the first data to the second device includes: sending the first data and second data to the second device, where the second data includes audio data collected through the microphone of the first device and/or image data collected through the camera of the first device.
  • the method further includes: receiving a tenth user operation; in response to the tenth user operation, determining not to grant the second device permission to save the first data and permission to forward the first data; receiving a first request message sent by the second device, where the first request message is used to request to save and/or forward the first data; and displaying first prompt information according to the first request message.
  • the method further includes: receiving a second request message sent by the second device, where the second request message is used to request real-time sharing; displaying second prompt information according to the second request message; receiving an eleventh user operation, where the eleventh user operation is used to accept the request indicated by the second request message; receiving third data sent by the second device; and outputting the third data.
  • outputting the third data includes: displaying a seventh interface according to the third data, where, when the first device displays the seventh interface, the second device displays the video data included in the first data; or outputting the third data includes: displaying an eighth interface and a ninth interface in split screen, where the eighth interface is determined based on the first data and the ninth interface is determined based on the third data.
  • the method further includes: receiving a twelfth user operation; sending the third data to a third device, so that the third device outputs the third data.
  • sending the first data to the second device includes: sending the first data to the second device through a first link and a second link, where the first link and the second link include at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, and a device-to-device (D2D) link, and the first link and the second link are different.
  • the application provides an electronic device, including a transceiver, a processor, and a memory.
  • the memory is used to store a computer program.
  • the processor calls the computer program to execute the sharing method in any of the possible implementations of any of the above aspects.
  • this application provides a computer storage medium that stores a computer program; when the computer program is executed, the sharing method in any of the possible implementations of any of the above aspects is implemented.
  • the present application provides a computer program product that, when run on an electronic device, causes the electronic device to execute the sharing method in any of the possible implementations of any of the above aspects.
  • the present application provides an electronic device, which includes an apparatus for executing the method described in any implementation of the present application.
  • the above-mentioned electronic device is, for example, a chip.
  • Figure 1A is a schematic architectural diagram of a sharing system provided by this application.
  • Figure 1B is an architectural schematic diagram of another sharing system provided by this application.
  • FIG. 1C is an architectural schematic diagram of another sharing system provided by this application.
  • FIG. 2A is a schematic diagram of the hardware structure of an electronic device provided by this application.
  • Figure 2B is a schematic diagram of the software architecture of an electronic device provided by this application.
  • FIG. 2C is a schematic diagram of the software architecture of another electronic device provided by this application.
  • Figure 2D is a schematic diagram of the software architecture of another electronic device provided by this application.
  • FIG. 2E is an architectural schematic diagram of another sharing system provided by this application.
  • Figure 3, Figures 4A-4C, Figures 5A-5D, Figures 6A-6D, Figures 7A-7C, Figures 8A-8C, Figures 9A-9C, Figures 10A-10B, Figures 11A-11D, Figures 12A-12D, Figure 13, Figures 14A-14D, Figures 15A-15D, Figures 16A-16E, Figures 17A-17I, Figures 18A-18D, Figures 19A-19G, Figures 20A-20D, Figures 21A-21E, Figures 22A-22E, Figures 23A-23C and Figures 24A-24C are schematic diagrams of some user interfaces provided by this application.
  • Figure 25 is a schematic flow chart of a sharing method provided by this application.
  • Figure 26A is a schematic diagram of an audio transmission method provided by this application.
  • Figure 26B is a schematic diagram of another audio transmission method provided by this application.
  • Figure 26C is a schematic diagram of another audio transmission method provided by this application.
  • Figure 27 is an architectural schematic diagram of another sharing system provided by this application.
  • Figure 28 is a schematic flow chart of an auxiliary link establishment process provided by this application.
  • Figure 29 is a schematic diagram of a communication map provided by this application.
  • Figure 30 is a schematic flow chart of a predictive link building process provided by this application.
  • Figure 31 is a schematic diagram of data transmission provided by this application.
  • Figure 32A is an architectural schematic diagram of an audio stream and/or video stream transmission provided by this application.
  • Figure 32B is a schematic diagram of a data packet provided by this application.
  • Figure 33 is a schematic diagram of another data transmission provided by this application.
  • Figure 34 is an architectural schematic diagram of another sharing system provided by this application.
  • Figure 35 is a schematic flow chart of device discovery and connection provided by this application.
  • FIG. 36 is a schematic diagram of another data transmission provided by this application.
  • Figure 37 is a schematic flow chart of a multicast group member leaving provided by this application.
  • Figure 38 is a schematic flow chart of another multicast group member departure provided by this application.
  • Figure 39 is a schematic diagram of another data packet provided by this application.
  • Figure 40 is an architectural schematic diagram of another sharing system provided by this application.
  • Figure 41 is an architectural schematic diagram of another sharing system provided by this application.
  • Figure 42 is a schematic diagram of another data transmission provided by this application.
  • Figure 43 is a schematic flow chart of a password transmission process provided by this application.
  • Figure 44 is a schematic flow chart of a multi-device synchronization process provided by this application.
  • Figures 45A-45D are schematic diagrams of some multi-level sharing scenarios provided by this application.
  • Figures 46A-46C are schematic architectural diagrams of some new radio (NR) access communication systems provided by this application.
  • "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of this application, unless otherwise specified, "plurality" means two or more.
  • the current sharing function can be implemented in the following three ways:
  • Method 1: During a phone call (also called an operator call), mobile phone users often share the content they see on the phone by describing it verbally, and the other party cannot view the content.
  • Method 2: Users achieve file-based sharing through near field communication technologies such as Bluetooth and near field communication (NFC). For example, picture files can be shared with nearby users, but multimedia data streams such as audio streams/video streams cannot be shared in real time; moreover, shared files can be propagated onward, so user privacy and security cannot be effectively protected.
  • Method 3: Users share multimedia data streams of other applications in real time through chat applications or conferencing applications installed on electronic devices.
  • both the sharing device and the shared device need to install a chat application or a conferencing application, as well as the application to be shared, and may even require the shared device to register and/or log in to the application to be shared.
  • the application to be shared also needs to be adapted to the chat application or conferencing application.
  • the multimedia data stream of the unadapted application cannot be shared in real time.
  • the application scenarios are limited and cannot meet the needs of users.
  • This application provides a sharing method with a more concise and convenient operation sequence, allowing a sharing device and shared devices such as one or more call counterparties, nearby devices and far-field devices to realize real-time sharing and collaboration functions such as watching together, listening together, playing together and editing together, solving the problem of being unable to share in real time in operator call and near field communication scenarios.
  • Moreover, there is no need to install chat applications or conferencing applications, to install the application to be shared, or to adapt the application to be shared.
  • the application greatly broadens the application scenarios, allowing users to quickly share multimedia data streams in any application and any area, effectively meeting user needs and improving user experience.
  • real-time sharing can reduce the possibility of secondary transmission and improve the protection of user privacy and security.
  • real-time sharing may mean that a sharing device/sharing user shares data such as a multimedia data stream with at least one shared device/shared user, and the sharing device/sharing user and the at least one shared device/shared user can watch/listen to the multimedia data stream together.
  • the multimedia data stream may include image data (multiple frames of images may be called a video stream) and audio data (multiple frames of audio may be called an audio stream).
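  • Such a stream can be pictured with the following hedged sketch; the frame type and interleaving are illustrative assumptions, not the patent's wire format.

```python
# Hedged sketch: a multimedia data stream as image frames (a video stream)
# plus audio frames (an audio stream), interleaved by timestamp for
# real-time sharing.
from dataclasses import dataclass
from typing import List, Literal

@dataclass
class MediaFrame:
    kind: Literal["image", "audio"]
    timestamp_ms: int
    payload: bytes

def interleave(video: List[MediaFrame], audio: List[MediaFrame]) -> List[MediaFrame]:
    return sorted(video + audio, key=lambda f: f.timestamp_ms)
```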
  • a sharing device is a device that initiates real-time sharing, and may also be called a sharing initiator.
  • the sharing device may provide the shared content (also called shared data, such as the multimedia data stream of any application or any area mentioned above).
  • the shared device is a device that receives the real-time sharing initiated as described above; it can also be called a sharing receiver.
  • the shared device can receive shared content and output shared content.
  • Sharing users can use sharing devices to share data in real time with shared users who use shared devices.
  • for the sharing device/sharing user, the shared device/shared user may be referred to as the sharing object.
  • the sharing in this application is real-time sharing, described from the perspective of the sharing device.
  • sharing device/sharing user and shared device/shared user are relative role concepts rather than physical concepts.
  • a device/user can have different roles in different sharing scenarios.
  • device 1/user 1 can serve as a sharing device/user to share multimedia data streams in real time with other devices/users at time 1, and can serve as a shared device at time 2 to receive multimedia data streams shared by other sharing devices in real time.
  • device 1/user 1 can share a multimedia data stream to device 2/user 2 in real time.
  • device 2/user 2 can also share a multimedia data stream to device 3/user 3.
  • for device 1, device 2 is a shared device, but for device 3, device 2 is a sharing device.
  • watching together, listening together, playing together and editing together can be four different real-time sharing methods.
  • watching together can be a real-time sharing of content that can be viewed (such as images of a video application)
  • listening together can be a real-time sharing of content that can be listened to (such as audio of a music application)
  • playing together can be real-time sharing of game-related content (such as images and/or audio of game applications).
  • editing together can be real-time sharing of document-related content (documents that can be edited, such as word format, table (excel) format, and presentation (powerpoint, PPT) format, etc.).
  • the user can choose a real-time sharing mode, but it is understandable that the mode selected by the user will not limit the actual shared content. For example, the user may first choose the watching-together mode; however, during actual real-time sharing, the user can use the sharing device to send listenable content, game-related content and/or document-related content, such as audio streams and video streams of video applications, to other shared devices.
  • the electronic device can also determine the real-time sharing method by itself, for example, setting a real-time sharing method by default, or determining the real-time sharing method according to preset rules. Not limited to the above example, there can also be other real-time sharing methods. This application does not limit the specific content and determination method of the real-time sharing method.
  • the electronic device can run at least one application.
  • the user-visible and interactive application in the at least one application can be called a foreground application.
  • the electronic device can display the user interface of the foreground application, which can also be called the electronic device in the foreground.
  • the application in at least one application that is invisible and non-interactive to the user can be called a background application.
  • the electronic device will not display the user interface of the background application, but will still run the background application. It can also be called the electronic device running the application in the background.
  • foreground applications and background applications are role concepts, not physical concepts. An application can have different roles in different scenarios.
  • when the electronic device displays the user interface of application 1 (at this time, application 1 is the foreground application and application 2 is a background application), it can display the user interface of application 2 in response to a user operation (at this time, application 2 becomes the foreground application and application 1 a background application).
  • nearby devices are electronic devices that can communicate through near field communication technologies such as Bluetooth, wireless local area network (WLAN) (for example, wireless fidelity (Wi-Fi)), device-to-device communication (D2D), near field communication (NFC), ultra wide band (UWB) and infrared.
  • Nearby devices may include devices that are discovered by the electronic device but not connected to, and/or devices that are connected to the electronic device. This application does not limit the specific content of near field communication technology.
  • a far-field device is an electronic device that can communicate through far-field communication technologies such as WLAN, satellite, and cellular communications.
  • Far-field devices may include devices that are discovered but not connected to by the electronic device, and/or devices that are connected to the electronic device. This application does not limit the specific content of far field communication technology.
  • Touch operations in this application may include, but are not limited to: single-click, double-click, long press, single-finger long press, multi-finger long press, single-finger slide, multi-finger slide, knuckle slide, and other forms.
  • the sliding touch operation may be referred to as a sliding operation.
  • the sliding operation may be, for example, but not limited to, sliding left and right, sliding up and down, sliding to a first specific position, sliding along a specific trajectory, etc. This application does not limit the trajectory of the sliding operation.
  • the touch operation may be performed on a second specific location on the electronic device.
  • the above-mentioned specific location can be located on the display screen of the electronic device, such as where controls such as icons are located or on the edge of the display screen; alternatively, the specific location can be located on the side, back or other parts of the electronic device, such as the positions of keys like the volume keys and the power key.
  • the above-mentioned specific position is preset by the electronic device, or the specific position is determined by the electronic device in response to a user operation.
  • the above-mentioned specific trajectory is preset by the electronic device, or the specific trajectory is determined by the electronic device in response to a user operation.
  • the sharing system 10 involved in the embodiment of this application is introduced below.
  • FIG. 1A illustrates an architectural schematic diagram of a sharing system 10 .
  • the sharing system 10 may include an electronic device 11, and the electronic device 11 may communicate with different electronic devices through different communication methods. Specific examples are as follows:
  • the electronic device 11 may communicate with at least one electronic device through a cellular communication network (which may also be referred to as via cellular communication), optionally implementing operator calls (i.e., phone calls).
  • FIG. 1A takes the at least one electronic device including electronic device 12 as an example for illustration.
  • the electronic device 11, the cellular communication network and the at least one electronic device may constitute a cellular communication system.
  • the cellular communication system may be, for example, but not limited to, a global system for mobile communications (GSM), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), new radio (NR) or other future network systems.
  • the cellular communication network includes, for example but is not limited to, base stations, core networks and communication lines.
  • a base station is a device deployed in a radio access network (RAN) to provide wireless communication functions.
  • the name of the base station may be different in different systems, for example but not limited to: a base transceiver station (BTS) in GSM or CDMA, a node B (NB) in WCDMA, an evolved node B (eNodeB) in LTE, and a next generation node B (gNB) in NR.
  • the core network is a key control node in the cellular communication system and is mainly responsible for signaling processing functions, such as but not limited to implementing access control, mobility management, session management and other functions.
  • Core network equipment includes, but is not limited to, access and mobility management function (AMF) entities, session management function (SMF) entities, user plane function (UPF) entities, etc.
  • Communication lines include, but are not limited to, twisted pairs, coaxial cables, and optical fibers.
  • the electronic device 11 can be connected to base station 1 in the cellular communication network through an air interface, the electronic device 12 can be connected to base station 2 in the cellular communication network through an air interface, and base station 1 and base station 2 can be connected to the core network.
  • base station 1 and base station 2 may also be the same base station.
  • the electronic device 11 can communicate with at least one electronic device through near field communication technology.
  • Near field communication technology includes, but is not limited to, Bluetooth, WLAN (such as Wi-Fi), D2D, NFC, UWB, infrared, etc.
  • FIG. 1A takes the at least one electronic device including electronic device 13, electronic device 14 and electronic device 15 as an example to illustrate.
  • the electronic device 11 communicates with the electronic device 13 through WLAN, with the electronic device 14 through Bluetooth, and with the electronic device 15 through D2D.
  • An example of the communication between the electronic device 11 and the electronic device 15 can be seen in Figure 1B below.
  • Near-field WLAN communication includes, for example, peer-to-peer (P2P) direct connection; alternatively, two devices connected to the same WLAN signal source (and thus in the same LAN) can communicate through near-field WLAN. It is not limited to this: in other examples, WLAN can also be a far-field communication method, for example, two devices belonging to different local area networks can communicate through far-field WLAN.
  • the electronic device 11 can also communicate with at least one vehicle through vehicle wireless communication (vehicle to X, V2X) technology.
  • FIG. 1A takes the at least one vehicle including the vehicle 16 as an example for illustration.
  • the electronic device 11 can communicate with the vehicle 16 through a cellular communication network, which can be understood as realizing V2X through the cellular communication network.
  • electronic device 11 may communicate directly with vehicle 16 .
  • the electronic device 11 can also communicate with other devices such as vehicle-mounted devices through V2X technology.
  • the electronic device 11 can also communicate with at least one electronic device through satellites. Satellite systems include, but are not limited to, Beidou, Tiantong, Starlink, etc.
  • FIG. 1A takes the at least one electronic device including the electronic device 12 as an example for illustration.
  • the electronic device 11 can connect to a satellite, then connect to a cellular communication network through the satellite, and finally connect to the electronic device 12 through the cellular communication network. See the example shown in Figure 1C below.
  • the electronic device 11 can also implement an over-the-top (OTT) call with at least one electronic device.
  • the OTT call can be provided across operators based on the open Internet, implementing video and other data services, for example, through Wi-Fi.
  • OTT calls can also be implemented based on the operator's cellular data services.
  • FIG. 1B exemplarily shows an architectural schematic diagram of yet another sharing system 10 .
  • the sharing system 10 includes an electronic device 11 and an electronic device 15.
  • the electronic device 11 and the electronic device 15 implement D2D communication based on an air interface (such as PC5) and a communication link (such as a sidelink), where, unlike cellular communication links that distinguish between uplink and downlink, the sidelink reflects the peer-to-peer relationship between the two ends of the communication.
  • D2D communication provides a direct discovery function and a direct communication function.
  • Direct discovery can provide electronic device A with the function of discovering that there is an electronic device B around it that can be directly connected.
  • Direct communication can provide the function of data exchange between electronic device A and a surrounding electronic device B.
  • For example, electronic device A is the electronic device 11 and electronic device B is the electronic device 15, or electronic device A is the electronic device 15 and electronic device B is the electronic device 11.
  • direct connection discovery and direct communication can be performed between the electronic device 11 and the electronic device 15 through D2D technology, thereby realizing real-time sharing functions such as watching together, listening together, playing together, and editing together.
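  • For illustration, the two D2D primitives above (direct discovery, then direct communication) can be summarized at the application layer by the following hypothetical Java interface; sidelink (PC5) APIs are not standardized at this layer, so all names here are assumptions.

      import java.util.function.Consumer;

      // Hypothetical application-layer view of D2D.
      public interface D2dService {
          // Direct discovery: report each nearby device B that device A
          // can connect to directly (identified by an opaque address).
          void startDiscovery(Consumer<String> onDeviceFound);

          // Direct communication: exchange data with a discovered device
          // over the sidelink, without routing through a base station.
          void send(String deviceAddress, byte[] data);
          void setReceiver(Consumer<byte[]> onDataReceived);
      }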
  • FIG. 1C exemplarily shows an architectural schematic diagram of yet another sharing system 10 .
  • the sharing system 10 includes the electronic device 11, a satellite, a ground receiving station, base station 1, core network device 1, a data network (Data Network), core network device 2, base station 2 and the electronic device 12.
  • the electronic device 11 and the electronic device 12 can realize real-time sharing functions such as watching together, listening together, playing together, editing together, etc. through the sharing system 10.
  • the following takes the electronic device 11 as the sharing device and the electronic device 12 as the shared device as an example for illustration:
  • the electronic device 11 can be connected to the satellite and send the shared content to the satellite. Satellites can send shared content to ground receiving stations.
  • the ground receiving station can access the core network device 1 via the base station 1 and send the shared content to the core network device 1 through the base station 1.
  • the ground receiving station can also be directly connected to the core network device 1 and directly send the shared content to the core network device 1.
  • core network device 1 can send the shared content to core network device 2 through Data Network.
  • the electronic device 12 can access the core network device 2 via the base station 2 , and the core network device 2 can send the shared content to the electronic device 12 for output through the base station 2 .
  • the ground receiving station may alternatively be connected to the core network device 1 through at least one gateway device for access conversion.
  • there may also be more or fewer devices between the satellite and the electronic device 12.
  • the electronic device 12 may also access the Data Network not through cellular communication network equipment (such as the base station 2 and the core network device 2) but through WLAN (such as Wi-Fi).
  • multiple connections can be implemented between the sharing device and the shared device through multiple communication methods, for example, redundant packet supplementation through different transmission paths of different communication methods, thereby ensuring the transmission quality of real-time sharing (such as real-time performance and/or stability).
  • the above-mentioned multiple communication methods include, but are not limited to, the communication methods described in Figure 1A, Figure 1B and Figure 1C.
  • the supplementary packet in this application can mean that, when a certain data packet is transmitted, part or all of the content of the data packet is additionally transmitted at least once.
  • the content of each transmission can be the same or different (covering three situations: exactly the same, partially the same and completely different), and the time of each transmission can be the same or different.
  • For example, the entire content of packet 1 is transmitted via satellite at time 1, fragment 1 of packet 1 is transmitted via cellular communication, fragment 2 of packet 1 is transmitted via satellite at time 2, and packet 1 is transmitted via cellular communication at time 3.
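  • As a non-authoritative sketch of the supplementary packet idea above, the following Java fragment transmits a whole packet on one path and re-sends fragments of it on other paths; the Path interface and the RedundantSender class are hypothetical.

      import java.util.Arrays;
      import java.util.List;

      // Hypothetical multi-path sender: the whole packet goes out on one
      // path and fragments are re-sent on the remaining paths as
      // supplementary packets, so the receiver can repair losses.
      public class RedundantSender {
          public interface Path { void transmit(byte[] data); } // e.g. satellite, cellular

          public void share(byte[] packet, List<Path> paths) {
              paths.get(0).transmit(packet); // entire packet on the first path
              int half = packet.length / 2;
              byte[] frag1 = Arrays.copyOfRange(packet, 0, half);
              byte[] frag2 = Arrays.copyOfRange(packet, half, packet.length);
              for (int i = 1; i < paths.size(); i++) { // partially identical re-sends
                  paths.get(i).transmit(frag1);
                  paths.get(i).transmit(frag2);
              }
          }
      }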
  • the above base station can also be other access network equipment, such as user equipment (UE), an access point (AP), a transmission and reception point (TRP), a relay device, or other network equipment with the function of a base station.
  • the electronic device 100 may be any electronic device in the sharing system 10 .
  • the electronic device 100 may be a mobile phone, a tablet computer, a handheld computer, a desktop computer, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), a smart home device such as a smart large screen or a smart speaker, a wearable device such as a smart bracelet, a smart watch or smart glasses, an extended reality (XR) device such as an augmented reality (AR), virtual reality (VR) or mixed reality (MR) device, a vehicle-mounted device, or a smart city device.
  • FIG. 2A exemplarily shows a schematic diagram of the hardware structure of an electronic device 100.
  • the electronic device 100 shown in FIG. 2A is only an example; the electronic device 100 may have more or fewer components than shown in FIG. 2A, two or more components may be combined, or the components may be configured differently.
  • the various components shown in Figure 2A may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , Antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193 , display screen 194, and subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access, reduces the waiting time of the processor 110 and thus improves the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor, for example, transmitting real-time shared audio streams/video streams.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G/6G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency Signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), D2D, V2X, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, etc., such as displaying a real-time shared video stream.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 (also called a screen) is used to display images, videos, etc.
  • Display 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through the ISP, camera 193, video codec, GPU, display screen 194, application processor, etc., for example, it can capture portraits for real-time sharing with the application's video stream to other devices.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, etc. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In one implementation, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
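  • As a minimal sketch (not the claimed method), decoding a real-time shared H.264 video stream on Android could be set up with the platform MediaCodec API as follows; the class name SharedStreamDecoder and the fixed H.264 choice are assumptions.

      import android.media.MediaCodec;
      import android.media.MediaFormat;
      import android.view.Surface;

      // Create an H.264 decoder for a real-time shared video stream and
      // render its output to a Surface (e.g. backed by the display 194).
      public class SharedStreamDecoder {
          private MediaCodec codec;

          public void start(Surface output, int width, int height) throws Exception {
              MediaFormat format = MediaFormat.createVideoFormat(
                      MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
              codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
              codec.configure(format, output, null, 0); // decode straight to the Surface
              codec.start();
              // compressed input buffers would then be queued from the link modules
          }
      }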
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, such as saving music, video and other files in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and at least one application program required for a function (such as a sound playback function or an image playback function). The data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device or universal flash storage (UFS). The processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions, such as playing a real-time shared audio stream, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In one implementation, the audio module 170 may be disposed in the processor 110 , or some functional modules of the audio module 170 may be disposed in the processor 110 .
  • Speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music or other real-time shared audio streams through the speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • Microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the electronic device 100 may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function.
  • the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the audio collected by the microphone 170C in real time can be shared with other devices in real time together with the audio stream of the application.
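  • A minimal sketch of this microphone path on Android, assuming the RECORD_AUDIO permission is granted; the class name MicCapture and the 16 kHz mono format are illustrative assumptions.

      import android.media.AudioFormat;
      import android.media.AudioRecord;
      import android.media.MediaRecorder;

      // Capture PCM from the microphone 170C so it can be mixed with the
      // application's audio stream and shared in real time.
      public class MicCapture {
          public AudioRecord open() {
              int rate = 16000;
              int minBuf = AudioRecord.getMinBufferSize(rate,
                      AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
              AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                      rate, AudioFormat.CHANNEL_IN_MONO,
                      AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
              rec.startRecording();
              return rec; // the caller reads PCM frames with rec.read(...)
          }
      }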
  • the headphone interface 170D is used to connect wired headphones.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be disposed on the display screen 194 .
  • there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors and capacitive pressure sensors.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • Touch sensor 180K is also known as a "touch device".
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 in a position different from that of the display screen 194 .
  • the pressure sensor 180A and/or the touch sensor 180K are used to detect touch operations on or near the pressure sensor 180A. Pressure sensor 180A and/or touch sensor 180K may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through display screen 194 .
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • Air pressure sensor 180C is used to measure air pressure.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may utilize the magnetic sensor 180D to detect opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • Distance sensor 180F for measuring distance.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • Temperature sensor 180J is used to detect temperature.
  • Bone conduction sensor 180M can acquire vibration signals.
  • the buttons 190 include a power button, a volume button, etc.
  • the button 190 may be a mechanical button, or a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate vibration prompts.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the layered architecture software system may be an Android system, a Harmony operating system (OS) or other software systems.
  • FIG. 2B exemplarily shows a schematic diagram of the software architecture of the electronic device 100 .
  • FIG. 2B takes the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device 100 .
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system library, and kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as address book, gallery, Bluetooth, WLAN, call, short message, browser, music, sharing, short video and video.
  • the sharing application can provide real-time sharing functions such as watching together, listening together, editing together and playing together with shared devices such as one or more call counterparties, nearby devices and far-field devices. Sharing can be an independent application, or a functional component encapsulated by other applications such as call, Bluetooth and WLAN; this application does not limit this.
  • the application package can also be replaced by other forms of software such as applets.
  • the following embodiments are described by taking the sharing functional component integrated in call, Bluetooth and WLAN as an example.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a sharing module, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • Said data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 .
  • for example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also present notifications in the status bar at the top of the system in the form of charts or scroll bar text (such as notifications of applications running in the background), or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or the indicator light flashes.
  • the sharing module can be used to implement real-time sharing functions such as watching together, listening together, editing together, and playing together, including but not limited to user experience (UX) display, user interaction functions (such as receiving and responding to user input operations), business functions and service logic, etc.
  • The UX display includes, for example but is not limited to: a display interface for initiating real-time sharing operations such as watching together, listening together, editing together, and playing together (including controls that trigger real-time sharing operations), a display interface for playing the real-time shared multimedia data stream, a display interface for selecting the shared content, and a display interface for selecting the shared device/shared user (also called the sharing object).
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one is the functional functions that need to be called by the Java language, and the other is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as video encoding formats like MPEG4, H.264 and H.265; audio encoding formats like MP3, AAC, AMR, SBC, LC3, aptX, LDAC, L2HC, WAV and FLAC; and image encoding formats like JPG, PNG, BMP and GIF.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the following exemplifies the workflow of the software and hardware of the electronic device 100 in conjunction with the scene of answering a call.
  • when the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, and other information). Raw input events are stored at the kernel level.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Assume that the touch operation is a touch click operation, and the control corresponding to the click operation is the answering control of the call application.
  • the call application calls the interface of the application framework layer, then starts the audio driver by calling the kernel layer, and plays the call through the receiver 170B.
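  • The application-layer end of this workflow might look like the following hedged sketch: the answering control's click handler asks the platform to accept the ringing call, after which the lower layers route audio to the receiver 170B. TelecomManager.acceptRingingCall() exists on older Android API levels (it is deprecated in favor of InCallService); answerButton is hypothetical, and the ANSWER_PHONE_CALLS permission is assumed.

      import android.content.Context;
      import android.telecom.TelecomManager;
      import android.view.View;

      public class AnswerControlBinder {
          // Bind the answering control of the call application.
          public static void bind(View answerButton, Context context) {
              TelecomManager telecom =
                      (TelecomManager) context.getSystemService(Context.TELECOM_SERVICE);
              answerButton.setOnClickListener(v -> telecom.acceptRingingCall());
          }
      }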
  • the other party's voice information and/or the current user's voice information is obtained through the microphone 170C.
  • the software system of the electronic device 100 may include an application processor (AP) system and a wireless communication system, wherein:
  • the wireless communication system may include, but is not limited to, at least one of the following: cellular communication system (such as 2G/3G/4G/5G/6G, etc.), satellite system (such as Beidou, Tiantong, Starlink, etc.), Wi-Fi, BT, NFC, D2D, etc.
  • the wireless communication system may include a coprocessor (CP) and/or a digital signal processor (DSP), where the CP in a terminal may be a baseband chip plus a coprocessor or multimedia accelerator; the CP can include the digital components required to communicate with the network, such as an advanced RISC machines (ARM)-based reduced instruction set computer (RISC) microprocessor and a DSP.
  • the CP can have an operating system and can communicate with an application processor running Android, iOS, Windows or other operating systems through a high speed (HS) serial connection.
  • the CP can implement processing logic such as VR, AR, image processing, high fidelity (HiFi), high-speed data transmission (high data rate, HDR) and sensor management. It is not limited to this: the CP can also be a cellular processor (CP).
  • the application system is used to implement control logic such as rendering and presentation of the user interface, input and response to user operations, business functions, and playback of multimedia data such as audio/video.
  • FIG. 2C exemplarily shows a schematic diagram of the software architecture of yet another electronic device 100 .
  • the application system of the electronic device 100 includes a sharing module, a discovery module, a capture module, a NewTalk function module, a Wi-Fi function module, a BT function module, a D2D function module, a satellite function module, a NewTalk link (Link) module, a Wi-Fi link module, a BT link module, a D2D link module and a satellite link module, wherein:
  • the sharing module can be understood as a core functional module for real-time sharing such as viewing together (View), listening together (Listen), playing together (Play), and editing together (Edit).
  • the sharing module is called, for example, Together (View/Listen/Play/Edit).
  • the sharing module can be used for UX display, for example but not limited to: a display interface for initiating real-time sharing operations such as watching together, listening together, editing together, and playing together (including controls that trigger real-time sharing operations), a display interface for playing the real-time shared multimedia data stream, a display interface for selecting the shared content, and a display interface for selecting the shared device/shared user (also called the sharing object).
  • the sharing module can also be used to provide user interaction functions for real-time sharing, provide related business functions for real-time sharing, and implement service logic for real-time sharing, etc. This application does not limit this.
  • the discovery module is used to discover nearby devices through near field communication technologies such as Wi-Fi, BT, and D2D.
  • the discovery module is called, for example, Nearby. It is not limited to this: devices can also be discovered through far-field communication technologies such as cellular communication and satellite. This application does not limit the communication technology used to discover devices.
  • the capture module is used to capture the shared data.
  • the capture module can obtain, based on the application and/or system interface, the decoded multimedia data stream (which can be played directly) or the multimedia data stream before decoding (such as the generated original data); for example, the decoded multimedia data stream is data processed for the specific electronic device 100 and can be played directly.
  • the capture module can capture the multimedia data stream before decoding for real-time sharing.
  • the capture module can directly capture the multimedia data stream before decoding at the system layer. For example, after the electronic device 100 receives the broadcast data sent by the base station through the 3G/4G/5G/6G broadcast module, the broadcast data can be reported to the system layer through the cellular communication network card (not shown) at the kernel layer and captured there by the capture module.
  • the electronic device 100 may not play the broadcast data, but the capture module obtains the broadcast data for real-time sharing.
  • the NewTalk function module is used to implement real-time sharing based on NewTalk, where NewTalk can be, but is not limited to, operator calls and/or OTT calls; NewTalk can be implemented, for example but not limited to, through cellular communication. In one implementation, the NewTalk function module can realize real-time sharing based on a NewTalk that is in a call (referred to as the in-call state for short); in another implementation, it can realize real-time sharing based on a NewTalk that is not in a call (referred to as the non-call state for short).
  • the Wi-Fi function module is used to achieve real-time sharing based on Wi-Fi, where Wi-Fi communication can be implemented using transmission methods such as unicast, broadcast or multicast.
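  • For illustration, Wi-Fi multicast transmission of one shared media frame could be sketched with standard Java sockets as follows; the group address 239.255.10.1 and port 50000 are placeholders, not values from this application.

      import java.net.DatagramPacket;
      import java.net.InetAddress;
      import java.net.MulticastSocket;

      // Send one shared media frame to every member that joined the group.
      public class WifiMulticastSender {
          public static void sendFrame(byte[] frame) throws Exception {
              InetAddress group = InetAddress.getByName("239.255.10.1"); // example group
              try (MulticastSocket socket = new MulticastSocket()) {
                  socket.send(new DatagramPacket(frame, frame.length, group, 50000));
              }
          }
      }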
  • the BT function module is used to realize real-time sharing based on BT, in which BT communication can be realized using unicast, broadcast or multicast transmission methods.
  • the D2D function module is used to realize real-time sharing based on D2D.
  • the satellite function module is used to achieve real-time sharing based on communication satellites.
  • the NewTalk link module is used to manage NewTalk links, including but not limited to link establishment, release, data transmission, etc.
  • the NewTalk link may include a primary link and an auxiliary link.
  • the Wi-Fi link module is used to manage Wi-Fi links, such as but not limited to link establishment, release, data transmission, etc.
  • the BT link module is used to manage BT links, including but not limited to link establishment, release, data transmission, etc.
  • the D2D link module is used to manage D2D links, including but not limited to link establishment, release, data transmission, etc.
  • the satellite link module is used to manage communication satellite links, including but not limited to link establishment, release, data transmission, etc.
  • the wireless communication system of the electronic device 100 includes a cellular communication module, a Wi-Fi communication module, a BT communication module and a satellite communication module, wherein:
  • The cellular communication module includes an internet protocol (IP) multimedia subsystem (IMS) communication module, a circuit switched (CS) communication module and a 3G/4G/5G/6G broadcast module, where the IMS communication module can, but is not limited to, implement IMS-protocol-based calls such as voice over LTE (VoLTE), video over LTE (ViLTE), voice over NR (VoNR), video over NR (ViNR), voice over Wi-Fi (VoWiFi), video over Wi-Fi (ViWiFi) and evolved packet system fallback (EPS-Fallback).
  • the CS communication module can provide the CS Fallback function.
  • the 3G/4G/5G/6G broadcast module can be used to monitor the 3G/4G/5G/6G broadcast channel.
  • the electronic device 100 may be within the coverage area of at least one base station, and any one of the at least one base station may send broadcast data (such as audio stream/video stream, etc.) to the electronic device (including the electronic device 100) in the coverage area through a broadcast channel.
  • any base station can maintain at least one channel, and the broadcast data corresponding to different channels can be different.
  • the user can select a channel corresponding to the received broadcast data through the electronic device 100 .
  • the electronic device 100 can receive broadcast data sent by the base station through the 3G/4G/5G/6G broadcast module, and the 3G/4G/5G/6G broadcast module can report the data through the cellular communication network card (not shown) at the kernel layer to the system layer for processing.
  • the electronic device 100 can play the received broadcast data through a system application (such as a call) or a third-party application (such as a chat application, conferencing application), and the electronic device 100 can share the played content with other devices.
  • the electronic device 100 may not play the received broadcast data, but directly share the received broadcast data with other devices, or share the processed broadcast data with other devices.
  • the Wi-Fi communication module may include hardware modules for WiFi communication, such as firmware and chips.
  • the BT communication module may include hardware modules for BT communication, such as firmware and chips.
  • the satellite communication module may include hardware modules for satellite communication, such as firmware and chips.
  • the real-time sharing function based on far-field communication methods such as NewTalk and satellite and near-field communication methods such as Wi-Fi, BT, and D2D is implemented by the sharing module.
  • the functional modules of the various radio access technologies (RATs) of the near field and far field communication methods may only be responsible for the management of communication links; that is, the link modules of these communication methods are responsible for link management, while some service functions (such as but not limited to security and encoding/decoding) can be implemented by the sharing module. It is not limited to this.
  • some service functions (such as but not limited to security, encoding/decoding, etc.) can also be implemented by functional modules of corresponding communication methods.
  • It is not limited to the software architecture shown in Figure 2C.
  • the software architecture schematic diagram of the electronic device 100 can be seen in Figure 2D.
  • Figure 2D is similar to Figure 2C; the difference is that in Figure 2D, far-field communication methods such as NewTalk and satellite and near-field communication methods such as Wi-Fi, BT, and D2D each have independent real-time sharing functions such as watching together, listening together, playing together, and editing together.
  • the functional modules of these communication methods can be integrated with sharing modules respectively.
  • FIG. 2E exemplarily shows an architectural schematic diagram of yet another sharing system 10 .
  • the sharing system 10 may include an electronic device 100 , an electronic device 200 and a network device 300 .
  • the electronic device 100 and the electronic device 200 may perform real-time sharing such as watching together, listening together, playing together and editing together.
  • the network device 300 may include at least one server.
  • the network device 300 is a server cluster composed of multiple servers. Among them, any server can be a hardware server or a cloud server, such as a web server, a backend server, an application server, a download server, etc.
  • the description of the electronic device 200 is similar.
  • the application system (AP) of the electronic device 100 can be divided into three layers, which from top to bottom are the application framework layer (FW), the hardware abstraction layer (HAL) and the kernel layer (kernel).
  • the application framework layer includes the sharing module, the discovery module, the capture module, the NewTalk function module, the Wi-Fi function module, the BT function module, the D2D function module and the satellite function module.
  • the sharing module can include a viewing together (View) function module, a listening together (Listen) function module, a playing together (Play) function module, an editing together (Edit) function module, a link management module, a security module, a member management module, a quality module, an encoding and decoding module, a stream capture module, a transmission module, a data processing module and a playback module, wherein:
  • the Link Manager module is used to uniformly manage the links of far-field communication methods such as NewTalk and satellite, and near-field communication methods such as Wi-Fi, BT, and D2D.
  • the one or more physical links managed by the link management module may include at least one of the following links: NewTalk's main link, NewTalk's auxiliary link, a satellite link, a D2D link, a BT broadcast link, a BT unicast link, a Wi-Fi broadcast link and a Wi-Fi unicast link; a unified sketch of this management follows.
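  • A hedged sketch of how the link management module might unify these physical links; the enum values mirror the list above, while the Link interface and the preference-order fallback are assumptions, not the claimed implementation.

      import java.util.EnumMap;
      import java.util.Map;

      // Hypothetical unified view of the physical links listed above.
      public class LinkManager {
          public enum LinkType {
              NEWTALK_MAIN, NEWTALK_AUX, SATELLITE, D2D,
              BT_BROADCAST, BT_UNICAST, WIFI_BROADCAST, WIFI_UNICAST
          }

          public interface Link {
              boolean isEstablished();
              void send(byte[] data);
          }

          private final Map<LinkType, Link> links = new EnumMap<>(LinkType.class);

          public void register(LinkType type, Link link) { links.put(type, link); }

          // Try links in the caller's preference order until one succeeds.
          public boolean send(byte[] data, LinkType... preference) {
              for (LinkType t : preference) {
                  Link l = links.get(t);
                  if (l != null && l.isEstablished()) { l.send(data); return true; }
              }
              return false;
          }
      }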
  • the security module can be used to implement, but is not limited to, certificate authentication, encryption/decryption and other security functions.
  • the Member Manager module is used to manage members (devices/users) for real-time sharing.
  • members for real-time sharing can be added and deleted.
  • the member management module can, but is not limited to, manage members who share in real time through identification information such as device address information and user name information.
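  • A minimal sketch of such member management, keyed by device address with an associated user name; the class and method names are hypothetical.

      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;

      // Track the members (devices/users) of a real-time sharing session
      // by identification information such as the device address.
      public class MemberManager {
          private final Map<String, String> members = new ConcurrentHashMap<>();

          public void add(String deviceAddress, String userName) { members.put(deviceAddress, userName); }
          public void remove(String deviceAddress) { members.remove(deviceAddress); }
          public boolean isMember(String deviceAddress) { return members.containsKey(deviceAddress); }
      }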
  • the quality module is used to control the quality of experience (QoE) of users who perform real-time sharing.
  • the codec module (Codec) is used to encode and decode data such as audio, video, and speech.
  • the capture stream (CaptureStream) module is an adaptation module for the stream capture function, which can be used, but is not limited to, to capture data streams such as audio, video, and voice.
  • the transmission module is used to manage the transmission functions of far-field communication methods such as NewTalk and satellite, and near-field communication methods such as Wi-Fi, BT, and D2D.
  • the data processing module can implement at least one data processing strategy, including but not limited to slicing, aggregation, and redundancy.
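  • The slicing strategy, for example, could look like the following sketch (aggregation being its inverse and redundancy a re-send of selected slices); sliceSize and the class name are assumptions.

      import java.util.ArrayList;
      import java.util.Arrays;
      import java.util.List;

      // Split a media frame into fixed-size slices before transmission.
      public class SlicingStrategy {
          public static List<byte[]> slice(byte[] frame, int sliceSize) {
              List<byte[]> slices = new ArrayList<>();
              for (int off = 0; off < frame.length; off += sliceSize) {
                  int end = Math.min(off + sliceSize, frame.length);
                  slices.add(Arrays.copyOfRange(frame, off, end));
              }
              return slices;
          }
      }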
  • the play (PlayStream) module is an adaptation module for the play function and can be used for, but is not limited to, playing audio, video, voice and other data streams.
  • HAL can include NewTalk service module, Wi-Fi protocol stack, D2D protocol stack, BT protocol stack, satellite service module and auxiliary link module.
  • the Wi-Fi protocol stack can realize Wi-Fi unicast, multicast and broadcast communications.
  • the BT protocol stack can implement BT unicast, multicast and broadcast communications.
  • the auxiliary link module may include a terminal-side service module for network address translation (NAT) traversal and/or relay, for example called NATService, where session traversal utilities for NAT (STUN) can be understood as a P2P technology used for direct communication between two points.
  • the auxiliary link module may also include a real time communication (RTC) service module, which realizes data transmission of the auxiliary link through real time networks (RTN), for example, to further improve transmission efficiency and quality.
  • the kernel layer can include a transmission protocol stack, a Wi-Fi network interface controller (NIC), a Wi-Fi driver, a cellular communication network card, an A-core data service (ADS), a D2D driver, a Bluetooth driver and a satellite driver.
  • the transmission protocol stack may include, but is not limited to, a transmission control protocol (TCP)/internet protocol (IP) stack.
  • the full English name of the cellular communication network card can be remote (wireless wide area) network, abbreviated as RMNET.
  • RMNET can be a remote network card provided to the operating system by a modem or other external device; it can form a virtual network card device in the operating system kernel.
  • in one implementation, the modem chip can adopt this end-side networking method and network card device.
  • the Bluetooth driver is, for example, a Bluetooth low energy (bluetooth low energy, BLE) control (Control) module, which is used to control BLE signaling.
  • the network device 300 may include an addressing (wiseFunction) module, a NAT traversal (STUN) module and a NAT relay (TURN) module, wherein:
  • the addressing module is used to perform identity authentication and addressing for establishing a link.
  • the NewTalk function module of the electronic device 100 can implement access token (AT) authentication and the exchange of NAT traversal session identity documents (Session IDs) through the addressing module of the network device 300, whereby the electronic device 100 can obtain the Session ID of the electronic device 200.
  • the NewTalk function module of the electronic device 200 can also implement the exchange of Session IDs for AT authentication and NAT traversal through the addressing module of the network device 300, and the electronic device 200 can obtain the Session ID of the electronic device 100.
• the Session ID can be used to establish links, such as NAT traversal links or NAT relay links.
• the NAT traversal module is used to realize the establishment of NAT traversal links and signaling transmission over them.
• the auxiliary link module of the electronic device 100 and the auxiliary link module of the electronic device 200 can establish a P2P traversal link (auxiliary link) through the NAT traversal module of the network device 300 and perform signaling transmission based on this link.
• the NAT relay module is used to realize the establishment of NAT relay links and signaling transmission over them.
• the auxiliary link module of the electronic device 100 and the auxiliary link module of the electronic device 200 can establish a relay link (auxiliary link) through the NAT relay module of the network device 300 and perform signaling transmission based on this link; a minimal sketch of establishing the auxiliary link follows.
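• As a minimal Kotlin sketch of the auxiliary-link setup described above, under the assumption that a P2P traversal link is attempted before falling back to a relay link (this application allows traversal and/or relay; the ordering and every function name below are illustrative assumptions, not APIs of this application):

```kotlin
// Hypothetical auxiliary-link setup: Session IDs are exchanged via the
// addressing module, then a traversal link is tried before a relay link.
sealed class AuxLink {
    data class P2p(val peerAddress: String) : AuxLink()    // NAT traversal link
    data class Relay(val relayServer: String) : AuxLink()  // NAT relay link
}

// Placeholder stubs standing in for the addressing / STUN / TURN modules.
fun exchangeSessionIds(peerNumber: String): String = TODO("addressing module: AT auth + Session ID exchange")
fun tryNatTraversal(sessionId: String): AuxLink.P2p? = TODO("STUN module: P2P traversal attempt")
fun openRelayLink(sessionId: String): AuxLink.Relay = TODO("TURN module: relay establishment")

fun establishAuxiliaryLink(peerNumber: String): AuxLink {
    val peerSessionId = exchangeSessionIds(peerNumber)  // obtain the peer's Session ID
    return tryNatTraversal(peerSessionId)               // prefer direct P2P transmission
        ?: openRelayLink(peerSessionId)                 // fall back to the relay link
}
```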
  • the communication link between the electronic device 100 and the electronic device 200 may include at least one of the following:
• Link 1: NewTalk link, where the NewTalk link can include an IMS communication link and a CS communication link.
• the IMS communication link can be, but is not limited to, a multimedia QoS class identifier (QCI) 1/QCI 2 channel, or a data channel.
  • the NewTalk link can be established through the cellular communication module of the electronic device 100 and the cellular communication module of the electronic device 200.
• the cellular communication module of the electronic device 100 is connected to base station 1, base station 1 is connected to base station 2, and base station 2 is connected to the cellular communication module of the electronic device 200.
  • the NewTalk link is a communication link between the cellular communication module of the electronic device 100 and the cellular communication module of the electronic device 200 .
• NewTalk links are used to enable carrier calls (e.g., via cellular communications as described above) and/or OTT calls.
• Link 2: Wi-Fi link, where the Wi-Fi link may include a unicast link, a multicast link, and/or a broadcast link.
  • a Wi-Fi link may be established through the Wi-Fi communication module of electronic device 100 and the Wi-Fi communication module of electronic device 200 .
  • a Wi-Fi link is used to enable Wi-Fi communications.
• Link 3: BT link, where the BT link may include a unicast link, a multicast link and/or a broadcast link.
  • the BT link may be established through the BT communication module of electronic device 100 and the BT communication module of electronic device 200 .
  • BT links are used to implement BT communications.
• Link 4: D2D link.
  • the D2D link may be established through the cellular communication module of electronic device 100 and the cellular communication module of electronic device 200 .
  • the D2D link may be established through the Wi-Fi communication module of the electronic device 100 and the Wi-Fi communication module of the electronic device 200 .
• the D2D link may be established through a D2D communication module (not shown in FIG. 2E) in the wireless communication system of the electronic device 100 and a D2D communication module (not shown in FIG. 2E) in the wireless communication system of the electronic device 200.
  • D2D links are used to implement D2D communications.
• Link 5: Satellite link.
  • the satellite link may be established through the satellite communication module of electronic device 100 and the satellite communication module of electronic device 200 .
  • satellite links are used to enable satellite communications.
  • the auxiliary link can be NAT traversal (P2P direct transmission) and/or NAT relay.
• In some examples, the auxiliary link is established in the call state; in other examples, the auxiliary link is established in the non-call state.
  • the physical channel of the auxiliary link can be, but is not limited to, NewTalk link, Wi-Fi link, BT link, D2D link, satellite link and other communication links.
  • the auxiliary link is used to enable carrier calls and/or OTT calls.
• the electronic device 100 and the electronic device 200 can choose to establish at least one link (any link or a combination of multiple links) among the above-mentioned links 1 to 5 according to the requirements of the transmission scenario; for example, when the electronic device 100 and the electronic device 200 are close to each other, link 3 and link 4 can be established. Establishing multiple links can prevent communication from becoming impossible, or communication quality from degrading, when one link is abnormal, and can improve communication stability; a minimal link-selection sketch follows.
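• A minimal Kotlin sketch of one possible link-selection policy consistent with the paragraph above (the enum values mirror links 1 to 5; the predicates and the treatment of the satellite link as a fallback are illustrative assumptions rather than rules stated by this application):

```kotlin
// Hypothetical link-selection policy: keep several links for stability,
// preferring near-field links (Wi-Fi, BT, D2D) when the peer is nearby.
enum class Link { NEWTALK, WIFI, BT, D2D, SATELLITE }

fun selectLinks(peerIsNearby: Boolean, cellularUp: Boolean, satelliteUp: Boolean): Set<Link> {
    val links = mutableSetOf<Link>()
    if (cellularUp) links += Link.NEWTALK                       // link 1
    if (peerIsNearby) {
        links += Link.WIFI                                      // link 2
        links += Link.BT                                        // link 3
        links += Link.D2D                                       // link 4
    }
    if (links.isEmpty() && satelliteUp) links += Link.SATELLITE // link 5 (assumed fallback)
    return links  // multiple links let another link compensate when one is abnormal
}
```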
• The following exemplifies the upstream/downstream data flows of different communication links.
  • the following example takes the electronic device 100 as the sharing device and the electronic device 200 as the shared device.
• Example 1: upstream data flow of the NewTalk link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> NewTalk function module -> NewTalk service module -> transmission protocol stack -> cellular communication network card -> ADS -> cellular communication module -> air interface.
• Downstream data flow of the NewTalk link (data flow direction in the software system of the electronic device 200): air interface -> cellular communication module -> ADS -> cellular communication network card -> transmission protocol stack -> NewTalk service module -> NewTalk function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
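• A minimal Kotlin sketch of the uplink ordering of Example 1, modeling each module as a transform chained in the listed order (capture -> encode -> packetize -> distribute). The stage functions are placeholders for the stateful system modules, and all names are illustrative assumptions:

```kotlin
// Hypothetical staged pipeline mirroring Example 1's uplink ordering.
typealias Stage = (ByteArray) -> ByteArray

fun encode(raw: ByteArray): ByteArray = TODO("codec module: compress the frame")
fun packetize(encoded: ByteArray): ByteArray = TODO("data processing module: slice/pack")
fun distribute(packets: ByteArray): ByteArray = TODO("transmission module: hand to NewTalk stack")

// Compose stages left to right, in the order the data flow lists them.
fun pipeline(vararg stages: Stage): Stage = { input ->
    stages.fold(input) { data, stage -> stage(data) }
}

val uplink: Stage = pipeline(
    { frame -> frame },  // stream capture module: raw audio/video frame
    ::encode,            // codec module
    ::packetize,         // data processing module
    ::distribute         // transmission module -> NewTalk function/service modules
)
```

• The downlink on the shared device mirrors this chain in reverse (aggregate -> unpack -> decode -> play), as the downstream data flow above lists.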
• Example 2: upstream data flow of the Wi-Fi link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> Wi-Fi function module -> Wi-Fi protocol stack -> transmission protocol stack -> Wi-Fi network card -> Wi-Fi driver -> Wi-Fi communication module -> air interface.
• Downstream data flow of the Wi-Fi link (data flow direction in the software system of the electronic device 200): air interface -> Wi-Fi communication module -> Wi-Fi driver -> Wi-Fi network card -> transmission protocol stack -> Wi-Fi protocol stack -> Wi-Fi function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
• Example 3: upstream data flow of the BT link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> BT function module -> BT protocol stack -> BT driver -> BT communication module -> air interface.
• Downstream data flow of the BT link (data flow direction in the software system of the electronic device 200): air interface -> BT communication module -> BT driver -> BT protocol stack -> BT function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
• Example 4: upstream data flow of the D2D link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> D2D function module -> D2D protocol stack -> D2D driver -> cellular communication module/Wi-Fi communication module -> air interface.
• Downstream data flow of the D2D link (data flow direction in the software system of the electronic device 200): air interface -> cellular communication module/Wi-Fi communication module -> D2D driver -> D2D protocol stack -> D2D function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
• In some examples, the D2D driver in the upstream data stream of the D2D link can also be replaced with: transmission protocol stack -> cellular communication network card -> ADS, in which case the cellular communication module/Wi-Fi communication module is specifically the cellular communication module.
• In some examples, the D2D driver in the downlink data stream of the D2D link can also be replaced with: ADS -> cellular communication network card -> transmission protocol stack, in which case the cellular communication module/Wi-Fi communication module is specifically the cellular communication module.
• In some examples, the D2D driver in the upstream data stream of the D2D link can also be replaced with: transmission protocol stack -> Wi-Fi network card -> Wi-Fi driver, in which case the cellular communication module/Wi-Fi communication module is specifically the Wi-Fi communication module.
• In some examples, the D2D driver in the downlink data stream of the D2D link can also be replaced with: Wi-Fi driver -> Wi-Fi network card -> transmission protocol stack, in which case the cellular communication module/Wi-Fi communication module is specifically the Wi-Fi communication module.
• In some examples, the cellular communication module/Wi-Fi communication module in the uplink/downlink data flow of the D2D link can be replaced with a D2D communication module (not shown in Figure 2E), and the D2D communication module can include hardware modules for D2D communication, such as firmware and chips.
• Example 5: uplink data flow of the satellite link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> satellite function module -> satellite service module -> satellite driver -> satellite communication module -> air interface.
• Downstream data flow of the satellite link (data flow direction in the software system of the electronic device 200): air interface -> satellite communication module -> satellite driver -> satellite service module -> satellite function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
• Example 6: upstream data flow of the auxiliary link (data flow direction in the software system of the electronic device 100): capture module -> sharing module (stream capture module -> codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for distribution)) -> NewTalk function module -> NewTalk service module -> auxiliary link module -> NewTalk/Wi-Fi/BT/D2D/satellite transmission module -> air interface.
• Downlink data flow of the auxiliary link (data flow direction in the software system of the electronic device 200): air interface -> NewTalk/Wi-Fi/BT/D2D/satellite transmission module -> auxiliary link module -> NewTalk service module -> NewTalk function module -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding) -> playback module).
• In some examples, the physical channel of the auxiliary link is a NewTalk link. In this case, the NewTalk transmission module in the upstream data flow of the auxiliary link is: transmission protocol stack -> cellular communication network card -> ADS -> cellular communication module, and the NewTalk transmission module in the downlink data flow of the auxiliary link is: cellular communication module -> ADS -> cellular communication network card -> transmission protocol stack.
• In some examples, the physical channel of the auxiliary link is a Wi-Fi link. In this case, the Wi-Fi transmission module in the upstream data stream of the auxiliary link is: transmission protocol stack -> Wi-Fi network card -> Wi-Fi driver -> Wi-Fi communication module, and the Wi-Fi transmission module in the downstream data stream of the auxiliary link is: Wi-Fi communication module -> Wi-Fi driver -> Wi-Fi network card -> transmission protocol stack.
• In some examples, the physical channel of the auxiliary link is a BT link. In this case, the BT transmission module in the upstream data stream of the auxiliary link is: BT driver -> BT communication module, and the BT transmission module in the downstream data stream of the auxiliary link is: BT communication module -> BT driver.
• In some examples, the physical channel of the auxiliary link is a D2D link. In this case, the D2D transmission module in the uplink data stream of the auxiliary link is: D2D driver -> cellular communication module/Wi-Fi communication module/D2D communication module, and the D2D transmission module in the downlink data stream of the auxiliary link is: cellular communication module/Wi-Fi communication module/D2D communication module -> D2D driver. The D2D driver can also be replaced with the other modules described in Example 4 above; for details, see the description in Example 4 above.
• In some examples, the physical channel of the auxiliary link is a satellite link. In this case, the satellite transmission module in the uplink data stream of the auxiliary link is: satellite driver -> satellite communication module, and the satellite transmission module in the downlink data stream of the auxiliary link is: satellite communication module -> satellite driver. The mapping from physical channel to transmission-module chain is sketched below.
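• The per-channel module chains above can be summarized as a simple mapping from the auxiliary link's physical channel to its uplink transmission-module chain; the Kotlin map below merely restates the chains already listed and is illustrative, not an API:

```kotlin
// Uplink transmission-module chains of the auxiliary link, keyed by its
// physical channel (restating the examples above; illustrative only).
val auxUplinkChains: Map<String, List<String>> = mapOf(
    "NewTalk" to listOf("transmission protocol stack", "cellular communication network card", "ADS", "cellular communication module"),
    "Wi-Fi" to listOf("transmission protocol stack", "Wi-Fi network card", "Wi-Fi driver", "Wi-Fi communication module"),
    "BT" to listOf("BT driver", "BT communication module"),
    "D2D" to listOf("D2D driver", "cellular/Wi-Fi/D2D communication module"),
    "Satellite" to listOf("satellite driver", "satellite communication module")
)
```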
  • the following example takes the electronic device 100 as a sharing device as an example.
  • Figure 3 exemplarily shows a schematic diagram of a call interface.
• A NewTalk call, such as an operator call/OTT call, can be conducted between the electronic device 100 (user A, phone number 1) and the electronic device 200 (user B, phone number 2).
  • the electronic device 100 can display a user interface 310 of the call application (which may be referred to as the call interface 310 for short).
  • the call interface 310 includes call information 311 and a floating window 312 .
• the call information 311 includes information about the call counterparty (contact name "User B" and communication number "Phone number 2") and the call duration "1 second".
  • the floating window 312 includes multiple options, such as an option to switch the call mode 312A, an option to send location information 312B, an option to send a file 312C, and a sharing option 312D.
  • the electronic device 200 can display the user interface 320 of the call application.
  • the user interface 320 is similar to the call interface 310 and also includes call information 321 and a floating window 322.
• the call information 321 includes the information of the call counterparty (contact name "User A" and communication number "Phone number 1") and the call duration "1 second". The floating window 322 is consistent with the floating window 312 and also includes the sharing option 322A.
• In some examples, after the floating window 312 stays on the call interface 310 for a preset period of time, the electronic device 100 can cancel the display of the detailed information of the floating window 312 and instead display the floating window 312 in the form of an icon (which may be called a retracted floating window 312); for example, it can be retracted against the left, right, upper or lower edge of the screen. For a specific example, see the user interface 330 shown in (C) of FIG. 3, in which the floating window 312 is displayed in the form of an icon on the left edge of the screen.
• the electronic device 100 may display detailed information of the floating window 312 in response to a touch operation (such as a click operation) on the floating window 312 in the user interface 330 shown in (C) of FIG. 3, such as displaying the call interface 310 shown in (A) of FIG. 3.
• the electronic device 100 in the call state can display the user interface of another application, and the multimedia data stream of that application can be shared in real time with the call counterparty or with nearby devices; see Figure 4A for a specific example (Figure 4A takes real-time sharing of the multimedia data stream of a short video application to the call counterparty, i.e., the electronic device 200, as an example).
  • the electronic device 100 can display a user interface 410 of a short video application.
  • the user interface 410 can include a call control 411 at the top, a short video playback window 412 and a floating window 312 .
  • the call control 411 can indicate that the electronic device 100 is currently in a call state, and the call duration is 33 seconds.
  • the play window 412 is used to display the played short video, for example, the short video 1 named "Topic 1" posted by "User 1" is currently being played.
  • the sharing option 312D in the floating window 312 is used to trigger real-time sharing of the multimedia data stream of the foreground application (a short video application is used as an example in Figure 4A) to the call counterparty (electronic device 200/user B).
  • the electronic device 100 may send a sharing request to the electronic device 200 in response to a touch operation (eg, a click operation) for the sharing option 312D in the user interface 410 shown in FIG. 4A .
• the electronic device 100 can send the audio stream (such as the audio of the short video 1 and/or the audio collected by the microphone) and/or the video stream (such as the images of the short video 1 and/or the images collected by the camera) related to the play window 412 to the electronic device 200 through cellular communication.
  • the electronic device 100 may display the user interface 420 shown in FIG. 4B.
• the user interface 420 does not include the floating window 312, and the play window 412 in the user interface 420 is in a selected state, which can indicate that the audio stream and/or video stream related to the play window 412 (that is, the shared content) is currently being shared.
  • the user interface 420 also includes a sharing control option 421.
  • the sharing control option 421 is used to trigger the display of a sharing menu.
  • the sharing menu includes, for example, but is not limited to, pause/exit sharing, change sharing content, change shared devices and other functional options.
  • the communication method by which the electronic device 100 sends the sharing request to the electronic device 200 may be a cellular communication method. In other examples, it may also be a near field communication method or other communication methods.
• the electronic device 100 may display a sharing menu in response to a touch operation (e.g., a click operation) on the sharing control option 421 in the user interface 420 shown in FIG. 4B, such as displaying the user interface 430 shown in FIG. 4C.
  • the user interface 430 also includes a sharing menu 431, which may include a plurality of options, such as option 431A, option 431B, option 431C, option 431D, option 431E, and option 431F.
• the option 431A includes the characters "Only the current application (picture + audio)", which is used to set the shared content to the images and audio of the foreground application (Figure 4B takes the short video application as an example), such as the images and audio of the short video 1 played in the play window 412.
  • Option 431B includes the characters “Only current application (audio)", which is used to set the shared content to the audio of the foreground application (for example, the audio of short video 1 played by the play window 412).
  • Option 431C includes the characters “Only current application (screen)", which is used to set the shared content to the image of the foreground application (for example, the image of short video 1 played in the play window 412).
  • the option 431D includes the characters "entire screen” and is used to set the shared content to be the display content of the screen of the electronic device 100 (eg, images/audio related to the user interface 430).
  • Option 431E includes the characters “Pause Sharing” for canceling/pausing/stopping real-time sharing.
  • Option 431F includes the character “more”, which is used to trigger the display of more functional options, such as whether to share audio collected by the microphone, whether to share images collected by the camera, whether to allow saving, whether to allow forwarding, etc.
• After the electronic device 200 receives the sharing request, prompt information may be displayed, for example, the user interface 510 shown in FIG. 5A is displayed.
  • the user interface 510 is similar to the user interface 320 shown in (B) of FIG. 3 , except that the user interface 510 does not include the floating window 322 but includes prompt information 511 .
• the prompt information 511 includes the icon 511A of the short video application to which the shared content belongs (taking the audio stream/video stream of the short video 1 played in the play window 412 in the user interface 410 shown in FIG. 4A as an example), the characters "User A invites you to watch together", and an accept control 511B.
• the electronic device 200 may, in response to a touch operation (such as a click operation) on the acceptance control 511B in the user interface 510 shown in FIG. 5A, receive the shared content sent by the electronic device 100 through cellular communication and display the shared content, for example, displaying the user interface 520 shown in FIG. 5B.
  • the user interface 520 may include a call control 521 located at the top, a play window 522 for sharing content, a sharing control option 523 and a prompt box 524.
  • the call control 521 can indicate that the electronic device 200 is currently in a call state, and the call duration is 35 seconds.
  • the prompt box 524 includes the characters "Watching content shared by user A.”
  • the play window 522 is used to display shared content, such as the image displayed by the play window 412 in the user interface 410 shown in FIG. 4A.
• The sharing control option 523 is used to trigger display of the sharing menu, which may include, for example but not limited to, options to pause/exit playback of the shared content.
  • the electronic device 200 may display the user interface 530 shown in FIG. 5C in response to a touch operation (eg, a click operation) for the sharing control option 523 in the user interface 520 shown in FIG. 5B .
  • the user interface 530 also includes a sharing menu 531, which may include multiple options, such as option 531A and option 531B.
  • option 531A includes the characters "Exit viewing", which is used to pause/exit the playback interface of the shared content.
  • Option 531B includes the characters "more” for triggering the display of more functional options, such as for triggering the option of sharing the audio stream/video stream to other users in real time.
  • the electronic device 200 may exit viewing the currently played shared content in response to a touch operation (eg, a click operation) for option 531A, for example, displaying the user interface 320 shown in (B) of FIG. 3 .
  • the electronic device 200 may respond to the user operation, accept the sharing request, and play the shared content again, such as displaying the user interface 520 shown in FIG. 5B .
  • the electronic device 200 may return to display the call interface in response to a touch operation (such as a click operation) on the call control 521 in the user interface 520 shown in FIG. 5B , such as displaying the call interface shown in (B) of FIG. 3 User interface 320 shown.
  • the electronic device 200 can, but is not limited to, work in the following three situations:
• Case 1: After receiving the touch operation on the call control 521 in the user interface 520 shown in FIG. 5B, the electronic device 200 does not send a notification message to the electronic device 100; therefore, the electronic device 100 continues to send the shared content to the electronic device 200. In some examples, the electronic device 200 may run the playback interface of the shared content in the background based on the received shared content.
• Case 2: After receiving a touch operation on the call control 521 in the user interface 520 shown in FIG. 5B, the electronic device 200 sends a notification message to the electronic device 100. After receiving the notification message, the electronic device 100 does not send the shared content to the electronic device 200.
• Case 3: After receiving a touch operation on the call control 521 in the user interface 520 shown in FIG. 5B, the electronic device 200 sends a notification message (for example, related to the resolution and/or frame rate) to the electronic device 100. After receiving the notification message, the electronic device 100 reduces the transmission bandwidth of the shared content (for example, by reducing the resolution, frame rate, code rate, etc. of the shared content), thereby saving device power consumption and transmission resources.
  • the electronic device 200 may run the playback interface of the shared content in the background based on the received shared content after reducing the transmission bandwidth.
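• A minimal Kotlin sketch of the Case 3 behavior: when the shared device reports that playback has moved to the background, the sharing device lowers encoder parameters to cut transmission bandwidth, and restores them when playback returns to the foreground. The EncoderConfig type and all parameter values are illustrative assumptions:

```kotlin
// Hypothetical encoder reconfiguration on the sharing device (Case 3).
data class EncoderConfig(val width: Int, val height: Int, val fps: Int, val bitrateKbps: Int)

fun onPeerPlaybackVisibilityChanged(foreground: Boolean): EncoderConfig =
    if (foreground) {
        EncoderConfig(1280, 720, 30, 2500)  // full quality while actively watched
    } else {
        EncoderConfig(640, 360, 10, 300)    // background: lower resolution/frame rate/code rate
    }
```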
• In other examples, the electronic device 200 may, in response to a touch operation (such as a click operation) on the call control 521 in the user interface 520 shown in FIG. 5B, display the call interface (such as the user interface 320 shown in (B) of FIG. 3) and the playback interface of the shared content (such as the user interface 520 shown in FIG. 5B) in split screen, but this is not limiting.
• In still other examples, the electronic device 200 can, in response to the touch operation (such as a click operation) on the call control 521 in the user interface 520 shown in Figure 5B, display the call interface with the playback interface of the shared content displayed in the form of a small floating window on the call interface. This application does not limit the specific display method.
  • the electronic device 200 can, but is not limited to, work according to the above three situations.
• In some examples, after the electronic device 200 receives the shared content, it needs to process the shared content first (for example, lower the resolution, lower the frame rate, etc.), and then display the processed shared content in a split-screen format or a floating small window.
• In some examples, the electronic device 200 may, in response to a touch operation (such as a click operation) on the sharing option 322A in the floating window 322 shown in the user interface 320, replay the shared content, such as displaying the user interface 520 shown in FIG. 5B.
• In some examples, the electronic device 200 may display a user interface of a multi-tasking window/multi-tasking list in response to a touch operation (for example, sliding from bottom to top) on the above-mentioned call interface, for example, the user interface 540 shown in FIG. 5D.
  • the user interface 540 is used to display a window list.
  • the window list includes at least one window running on the electronic device 200, such as a short message application window 541, a real-time sharing window 542 and a call application window 543.
  • the icon of the real-time sharing function and the characters "Watch Together" 542A are displayed on the real-time sharing window 542, and the real-time sharing window 542 is used to indicate the playback window of the shared content.
  • the electronic device 200 may replay the shared content in response to a touch operation (eg, a click operation) on the window 542 . It can be understood that when the electronic device 200 returns to display the call interface, the electronic device 200 runs the playback interface of the shared content in the background.
• When replaying the shared content, the electronic device 200 may switch the playback interface of the shared content to run in the foreground.
  • the electronic device 200 can directly replay the shared content based on the received shared content.
• In other examples, the electronic device 200 may, in response to the above touch operation on the sharing option 322A in the floating window 322 shown in the user interface 320, or the above touch operation on the window 542, send a notification message to the electronic device 100. After receiving the notification message, the electronic device 100 sends the shared content to the electronic device 200 for the electronic device 200 to replay the shared content.
• In still other examples, the electronic device 200 may, in response to the above touch operation on the sharing option 322A in the floating window 322 shown in the user interface 320, or the above touch operation on the window 542, send a notification message to the electronic device 100. After receiving the notification message, the electronic device 100 increases the transmission bandwidth of the shared content (for example, by increasing the resolution, frame rate, code rate, etc. of the shared content), and the electronic device 200 can replay the shared content based on the shared content received after the transmission bandwidth is increased.
• In some examples, when the electronic device 200 displays the call interface and the playback interface of the shared content in split screen, it can, in response to a user operation on the playback interface of the shared content (such as dragging the drag bar between the call interface and the playback interface of the shared content in the split-screen interface), display the playback interface of the shared content in full screen, such as displaying the user interface 520 shown in Figure 5B. The specific example is similar to the above-mentioned example after the electronic device 200 returns to display the call interface, and will not be described again.
• In some examples, when the electronic device 200 displays the playback interface of the shared content in the form of a small floating window on the call interface, the electronic device 200 may display the playback interface of the shared content in full screen in response to a user operation on the floating small window, for example, displaying the user interface 520 shown in Figure 5B. The specific example is similar to the above-mentioned example after the electronic device 200 returns to display the call interface, and will not be described again.
• In some examples, the real-time sharing function can also be triggered through a sliding operation, for example, sliding up and down, sliding left and right, or sliding along a specific trajectory, etc.
  • the user interface 610 is similar to the user interface 410 shown in Figure 4A. The difference is that the floating window 312 in the user interface 610 is in a collapsed state, for example, displayed in the form of an icon on the left edge of the screen.
• the electronic device 100 may, in response to a sliding operation on the user interface 610 (FIG. 6A takes a knuckle sliding along the specific "W" trajectory as an example), display a selection interface for the sharing content and the sharing object, for example, displaying the user interface 620 shown in FIG. 6B.
  • the user interface 620 includes a list 621 of selectable sharing content and a list 622 of selectable sharing objects.
  • list 621 may include option 621A, option 621B, and option 621C.
  • the characters "Share short video application” are displayed under option 621A, which is used to indicate the window of the foreground application ( Figure 6B takes a short video application as an example).
  • the characters “Share Screen” are displayed under option 621B, which is used to indicate the display content of the screen of the electronic device 100 .
  • the characters “Share Video Application” are displayed under option 621C, which is used to indicate a window of a background application (see Figure 6B, which takes a video application as an example).
• the electronic device 100 may have fewer or more background applications. For example, if the electronic device 100 does not run a video application, the list 621 does not include the option 621C; or, if the electronic device 100 is also running other background applications (such as a short message application), the list 621 may also include an option indicating the window of the short message application. In some examples, the electronic device 100 may display other options included in the list 621 in response to a touch operation (e.g., sliding left or right) on the list 621.
  • the electronic device 100 may respond to a touch operation (such as a click operation) on any option in the list 621, select the audio stream/video stream related to the option as the sharing content, or cancel the selection.
  • the electronic device 100 may select the audio stream/video stream of the short video application as the shared content in response to a touch operation (such as a click operation) on option 621A.
• In this case, option 621A may be in the selected state shown in FIG. 6B, and option 621B and option 621C may be in the unselected state shown in FIG. 6B.
  • prompt information 623 may be displayed on the list 621 .
  • the prompt information 623 may indicate the number of selected shared contents.
• The electronic device 100 may, in response to a touch operation (such as a click operation) for option 621B, select the display content of the screen of the electronic device 100 (and optionally the playback content of the speaker of the electronic device 100) as the shared content. In this case, when the electronic device 100 displays the user interface 610 shown in FIG. 6A, the audio stream/video stream related to the user interface 610 is shared in real time; when the electronic device 100 switches the display interface to the call interface 310 shown in (A) of FIG. 3 in response to a user operation, the audio stream/video stream related to the call interface 310 is shared in real time.
  • the user can select multiple shared contents based on the list 621, and the electronic device 100 can send the multiple shared contents selected by the user to the shared device.
  • the shared device can display the multiple shared contents mentioned above in split screen.
  • the interface example is similar to Figure 21E.
• In other examples, the shared device can determine the shared content to be displayed in response to the user's operation. For example, the shared device can display one of the multiple shared contents mentioned above by default and, upon receiving an operation from the user for switching the shared content, display another of the multiple shared contents mentioned above.
  • the shared device can display the above-mentioned multiple shared contents together through the connected device.
  • the shared device can display one shared content, and the device connected to the shared device can display another shared content.
  • This application does not limit the way in which the shared device displays multiple shared contents.
• In some examples, the electronic device 100 may respond to the user operation by sending the audio data of N pieces of the above-mentioned shared content to the shared device, and not sending the audio data of the other shared contents to the shared device, where N is a positive integer.
• After the shared device receives the audio data of multiple shared contents sent by the electronic device 100, it can, in response to a user operation, play the audio data of M shared contents, where M is a positive integer, thereby preventing multiple audio streams played together from degrading the user experience; a minimal selection-and-mix sketch follows.
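• A minimal Kotlin sketch of this selection rule: of the shared contents whose audio arrived, only the M streams the user picked are mixed and played. The types and the naive additive mix are illustrative assumptions:

```kotlin
// Hypothetical selection-and-mix of M audio streams on the shared device.
data class SharedAudio(val contentId: Int, val samples: ShortArray)

fun mixSelected(received: List<SharedAudio>, selectedIds: Set<Int>): ShortArray {
    val chosen = received.filter { it.contentId in selectedIds }  // the M user-picked streams
    if (chosen.isEmpty()) return ShortArray(0)
    val out = ShortArray(chosen.maxOf { it.samples.size })
    for (stream in chosen) {
        for (i in stream.samples.indices) {
            val sum = out[i] + stream.samples[i]                  // Short + Short promotes to Int
            out[i] = sum.coerceIn(Short.MIN_VALUE.toInt(), Short.MAX_VALUE.toInt()).toShort()
        }
    }
    return out
}
```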
  • the list 622 includes an option 622A indicating the other party of the call (i.e., user B/electronic device 200 ) and a plurality of options indicating nearby devices.
  • the option 622A includes the characters "Phone Number 2 (on call)", where the telephone number 2 is the communication number of the other party.
  • the plurality of options indicating nearby devices include, for example, option 622B, option 622C, option 622D, option 622E, and option 622F.
  • Option 622B includes the characters "user C's mobile phone", which is used to indicate a nearby device with a device type of "mobile phone" and an associated user name of "user C”.
• Option 622C includes the characters "My Notebook", which is used to indicate a nearby device whose device type is "notebook" and whose associated user is user A, the user of the electronic device 100.
• Option 622D includes the characters "User D's tablet", option 622E includes the characters "User C's headset", and option 622F includes the characters "User E's speaker".
• List 622 also includes option 622G, which is used to trigger the display of more functional options, such as viewing more nearby devices, selecting all options in the list 622 (that is, setting the nearby devices indicated by these options as sharing objects), etc.
• In some examples, the list 622 may include more or fewer options.
  • the electronic device 100 may respond to a touch operation (such as a click operation) on any option in the list 622, select the device indicated by the option as the sharing object, or cancel the selection.
• the electronic device 100 may, in response to a touch operation (such as a click operation) on option 622A in the list 622, select the call counterparty (i.e., the electronic device 200) indicated by option 622A as the sharing object. After the electronic device 100 receives a touch operation (such as a click operation) for option 622A, option 622A may be in a selected state.
• the electronic device 100 may, in response to a touch operation (such as a click operation) for option 622A, send a sharing request to the call counterparty indicated by option 622A (i.e., the electronic device 200).
• The subsequent implementation is similar to that shown in FIGS. 4A-4C and FIGS. 5A-5D.
• In some examples, after the electronic device 100 sends a sharing request to the electronic device 200, it can continue to display the selection interface for the sharing content and the sharing object, for example, displaying the selection interface shown in Figure 6C. In other examples, the electronic device 100 may return to display the upper-level interface, such as the user interface 610 shown in FIG. 6A.
• In some examples, the sharing content selected by the user based on the list 621 is the audio stream/video stream of a background application (assumed to be a video application); the electronic device 100 can switch the video application to run in the foreground for display, and share the audio stream/video stream of the video application to the electronic device 200.
  • the electronic device 100 may display the user interface of the video application in response to the above-mentioned touch operation for the retract option 631 in the user interface 630 .
  • the user can select multiple sharing objects based on the list 622, and the electronic device 100 can send the sharing content to the multiple sharing objects selected by the user.
  • the electronic device 100 may receive touch operations (eg, click operations) for options 622A, 622B, 622C, 622D, 622E, and 622F in the list 622 in sequence.
• the electronic device 100 may display the user interface 640 shown in FIG. 6D.
• In the user interface 640, option 622A, option 622B, option 622C, option 622D, option 622E and option 622F are all in the selected state, which can indicate that the user has selected the devices indicated by option 622A, option 622B, option 622C, option 622D, option 622E and option 622F as the sharing objects.
• The characters included in option 622A are "User B (phone number 2) is watching", the characters included in option 622B are "User C (mobile phone) is watching", the characters included in option 622C are "My notebook is playing", the characters included in option 622D are "User D (tablet computer) is watching", the characters included in option 622E are "User C (headphones) is listening", and the characters included in option 622F are "User E (speaker) is listening".
  • the real-time sharing function can also be triggered through the user interface of the multi-task list/multi-task window.
• For example, the electronic device 100 can respond to a touch operation (for example, sliding from bottom to top) on the user interface 410 shown in FIG. 4A to display the user interface of the multi-task list/multi-task window; for a specific example, see the user interface 710 shown in FIG. 7A.
  • the user interface 710 is used to display a window list.
  • the window list includes at least one window running on the electronic device 100 , such as a call application window 711 , a short video application window 712 and a video application window 713 .
• The icon and name of an application may be displayed on any window, as well as a sharing control used to trigger real-time sharing of the audio stream/video stream of that application. For example, the icon and name "Short video" 712A of the short video application, and a sharing control 712B, may be displayed on the window 712 of the short video application.
  • the electronic device 100 may display a list of selectable sharing objects in response to a touch operation (eg, a click operation) on the sharing control 712B, such as displaying the user interface 720 shown in FIG. 7B .
• the user interface 720 is similar to the user interface 710 shown in Figure 7A. The difference is that the user interface 720 also includes a list 721 of selectable sharing objects, the window 712 of the short video application is in a selected state, and the sharing control 712B is in a selected state.
  • the list 721 is similar to the list 622 in the user interface 620 shown in FIG. 6B, and includes an option 721A indicating the call party (ie, user B/electronic device 200) and multiple options indicating nearby devices, such as Includes option 721B (including the characters "user C's mobile phone"), option 721C (including the characters "my notebook"), and option 721D (including the characters "user D's tablet”).
  • the electronic device 100 can respond to a touch operation (such as a click operation) on any option in the list 721, select the device indicated by the option as the sharing object, or cancel the selection.
• The specific examples are similar to the examples of the electronic device 100 responding to a touch operation for any option in the list 622 included in the user interface 620 shown in FIG. 6B.
  • the user can select multiple sharing objects based on the list 721, and the electronic device 100 can send the sharing content to the multiple sharing objects selected by the user.
  • the electronic device 100 may display the user interface 730 shown in FIG. 7C .
• In the user interface 730, option 721A, option 721B, option 721C and option 721D in the list 721 are all in the selected state, which may indicate that the user has selected the devices indicated by option 721A, option 721B, option 721C and option 721D as the sharing objects.
  • Option 721A includes the characters “User B (phone number 2) is watching”.
  • the characters included in option 721B are “User C (mobile phone) is watching”
  • the characters included in option 721C are "My laptop is playing”
  • the characters included in option 721D are "User D (tablet) is watching”.
  • the electronic device 100 may receive a touch operation (such as a click operation) on the window 712 of the short video application in the user interface 720 shown in FIG. 7B or the user interface 730 shown in FIG. 7C, Display the playback interface of the shared content, for example, display the user interface 420 shown in Figure 4B.
• In some examples, the multi-task list/multi-task window displayed by the shared device also includes a display window for real-time shared content, and a sharing control may also be displayed on that window; the sharing control is used to trigger sharing of the above real-time shared content to other devices.
• For example, the display window of the real-time shared content may be the window 542 in the user interface 540 shown in FIG. 5D, and a sharing control like the sharing control 712B in the user interface 710 shown in FIG. 7A may be displayed on the window 542.
  • the real-time sharing function can also be triggered through the notification interface.
  • the electronic device 100 can display the notification interface in response to a touch operation (such as sliding from top to bottom) on the user interface 410 shown in FIG. 4A , for a specific example, see user interface 810 shown in Figure 8A.
  • the user interface 810 includes a notification bar 811 of a background application (FIG. 8A takes a video application as an example), a Wi-Fi function control 812, a Bluetooth function control 813 and a menu 814.
• the control 812 can be used to turn on or off the Wi-Fi function of the electronic device 100, and can also be used to select a connected Wi-Fi signal source (Figure 8A takes the connected Wi-Fi signal source named "Signal Source 1" as an example).
  • the control 813 can be used to turn on or off the Bluetooth function of the electronic device 100, and can also be used to select a device to which the electronic device 100 is connected via Bluetooth (FIG. 8A takes the connected device named "Headphone 1" as an example).
• the menu 814 may include controls for multiple functions, such as a flashlight control, a flight mode control, a mobile data control 814A, an automatic rotation control, an instant sharing control 814B, a positioning function control, a screenshot control, a mute control, a screen recording control, an NFC control, etc.
  • the control 814A is used to turn on or off the mobile data of the electronic device 100 (which can also be called turning on or off the cellular communication function).
  • the characters "instant sharing" 814C and a control 814D are displayed under the control 814B.
  • the control 814B can be used to turn on or off the instant sharing function of the electronic device 100.
• the control 814D can trigger the display of more functional information of the instant sharing function, such as selecting the manner of instant sharing.
• Controls 812, 813 and 814A in the user interface 810 are all in the on state, and the status information 815 at the top of the user interface 810 includes the logos of "5G", Wi-Fi and Bluetooth, which can indicate that the electronic device 100 currently has mobile data, the Wi-Fi function and the Bluetooth function turned on.
  • the electronic device 100 may display a selection interface for sharing content and sharing objects in response to a touch operation (eg, a click operation) on the control 814B, such as displaying the user interface 620 shown in FIG. 6B .
• As described above, real-time sharing functions such as watching together and listening together during a call can be triggered through buttons in the floating window, sliding operations, the multi-task list/multi-task window, and buttons in the notification interface, which is easy and flexible to use and provides a good user experience.
• In some examples, multiple devices in communication can also collect the user's face image through the camera and share the collected image with other users, where the image can be an image currently collected by the device, or an image collected by the device earlier (for example, collected before the communication), which is not limited in this application.
• The electronic device used by the sharing user can display at least one window, and each window can display an image of a user, such as the control 1541 in the user interface 1540 shown in Figure 15C.
  • the sharing user can select at least one window displayed by the electronic device, thereby selecting the device/user corresponding to the at least one window as the sharing object.
  • the above-mentioned list of selectable sharing objects may include a window displaying the user's image.
• In some examples, when device 1 discovers any other device (assumed to be device 2), device 2 can respond to device 1, and when responding it can send the avatar of the user using device 2 (for example, the avatar in the contacts, the avatar in instant sharing, or the image in a chat application, etc.) to device 1.
  • the list of sharing objects displayed by device 1 can include the avatar, and the avatar can be used to trigger real-time sharing to device 2.
  • This application does not limit the display method of sharing objects.
  • the above list of selectable shared content may also include icons, and this application does not limit the display method of the shared content.
  • the user can also customize the sharing object through the scanning function of the electronic device 100.
• the electronic device 100 may display an option to select a device in response to a touch operation for option 622G in the user interface 620 shown in FIG. 6B, and the electronic device 100 may, in response to a touch operation for that option, photograph nearby electronic devices and/or users through a camera, and select electronic devices and/or users from the captured images as sharing objects for real-time sharing. For specific examples, see Figure 8B and Figure 8C.
  • the electronic device 100 may display a user interface 820 .
  • the user interface 820 may include an image 821 captured through the scan function.
  • the image 821 may include a user 821A and a user 821B selected by the user of the electronic device 100 .
• The user interface 820 may also include the specific devices recognized by the electronic device 100 according to the selected users: the device 822 corresponding to the user 821A (including the characters "User M's mobile phone") and the device 823 corresponding to the user 821B (including the characters "User N's mobile phone"). In some examples, the electronic device 100 can respond to a touch operation for any device/user included in the user interface 820 and share in real time to the corresponding device of that device/user.
  • the user interface 820 also includes a scan control 824, which can trigger the re-capturing of images through the camera.
• Before the electronic device 100 identifies the corresponding device according to the selected user in the captured image, the user (such as the above-mentioned user 821A) needs to enter human body feature information (such as the face) into the electronic device (such as the above-mentioned device 822), or the electronic device collects and extracts the user's human body feature information (such as the face) in real time/periodically (for example, twice a day)/irregularly (for example, every time the user uses the camera).
• the electronic device 100 can identify the characteristic information of the at least one user, such as, but not limited to: gender, hair length, predicted age, skin color, whether glasses are worn, clothing type, clothing color, face data, etc.
• the electronic device 100 can broadcast (for example, through Wi-Fi or BT) the original data or key data of the identified characteristic information. After receiving the broadcast message, other devices can match their stored human body characteristic information with the data in the broadcast message; if the match is successful, a response message is sent to the broadcast sender (i.e., the electronic device 100).
• the electronic device 100 can display, according to the response messages, the devices corresponding to the selected users (such as the above-mentioned device 822 and device 823) for the user to select sharing objects. Understandably, broadcasting only key data can reduce the amount of data transmitted and identify the devices corresponding to the selected users more efficiently; a minimal sketch of this broadcast-and-match flow follows.
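• A minimal Kotlin sketch of this broadcast-and-match flow: the sharing device broadcasts key data derived from the selected user's characteristic information, and a nearby device answers with its identity only when the key data matches its stored owner profile. The feature encoding (a plain hash) and all names are illustrative assumptions; a real system would use a robust biometric match rather than an exact digest:

```kotlin
// Hypothetical key-data derivation and receiver-side matching.
data class UserFeatures(val gender: String, val hairLength: String, val wearsGlasses: Boolean)

// Key data: a compact digest instead of raw features, reducing broadcast size.
fun keyData(f: UserFeatures): Int =
    listOf(f.gender, f.hairLength, f.wearsGlasses.toString()).hashCode()

// Receiver side: reply with this device's name only if the digest matches
// the stored human body characteristic information of the device's owner.
fun onBroadcastReceived(receivedKey: Int, stored: UserFeatures, deviceName: String): String? =
    if (keyData(stored) == receivedKey) deviceName else null
```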
  • the electronic device 100 may also identify the device corresponding to the selected user through a third-party device (such as a nearby electronic device, a network device such as a server).
  • the electronic device 100 may send the selected user's characteristic information and/or the location information of the electronic device 100 (for example, but not limited to including positioning information, cell information, Wi-Fi ID, etc.) to a third-party device.
• The third-party device can perform a matching query based on the received information and return the queried device information that matches the selected user to the electronic device 100.
  • the electronic device 100 may deselect any user in response to a touch operation for the user in the user interface 820 shown in FIG. 8B .
  • the electronic device 100 may delete any device in the user interface 820 shown in FIG. 8B in response to a touch operation on the device. For example, when the user 821A is deselected or the device 822 is deleted, the electronic device 100 cancels the display of the circle outside the user 821A in the image 821 and simultaneously cancels the display of the device 822 in the user interface 820 .
  • the electronic device 100 may display a user interface 830.
  • the user interface 830 may include an image 831 captured through the scan function.
  • the image 831 may include a device 831A and a device 831B selected by the user of the electronic device 100.
• The user interface 830 may also include the specific devices recognized according to the selected devices: the device 832 corresponding to the device 831A (including the characters "User S's notebook") and the device 833 corresponding to the device 831B (including the characters "User T's glasses").
  • the electronic device 100 may share in real time to any device included in the user interface 830 in response to a touch operation on the device.
  • the user interface 830 also includes a scan control 834 that can trigger re-capturing of images through the camera.
• When the electronic device 100 identifies the corresponding specific device based on the selected device in the captured image, it can identify at least one of the following: the type of the selected device in the image (for example, a notebook or a mobile phone), the device manufacturer/brand of the selected device (for example, identified by the trademark (logo) of the selected device in the image), and the appearance characteristics of the device (such as color).
  • the electronic device 100 can broadcast the identified characteristics or conduct a matching query through a third-party device to obtain and display the specific device corresponding to the selected device (such as the above-mentioned device 832 and device 833) for the user to select a sharing object.
  • the electronic device 100 can also respond to a touch operation on any device in the user interface 830 shown in FIG. 8C to deselect/delete the device.
• The specific instructions are similar to those in FIG. 8B and will not be repeated here.
• User A can also select an electronic device and/or a user as the sharing object from images captured by another electronic device (assumed to be the electronic device 200) communicating with the electronic device 100 for real-time sharing, so that even if the distance between user A and the selected sharing object is far, user A can still customize the sharing object through the image captured by the electronic device 200. For example, when user A uses the electronic device 100 and user B uses the electronic device 200 for NewTalk, user B can operate the electronic device 200 to turn on the camera and capture nearby electronic devices and/or users, and the captured image can be shared with the electronic device 100 for display.
  • the image captured by the electronic device 200 is displayed through the control 1541 in the user interface 1540 shown in FIG. 15C.
• The electronic device 100 can send the sharing data to the electronic device 200, and the electronic device 200 then forwards the sharing data to the electronic devices used by user C and user D.
• User A can also obtain information about nearby electronic devices and/or users through the touch function of the electronic device 100 (for example, implemented through NFC), and, based on the obtained information, add at least one device as a sharing object for real-time sharing.
  • This application does not limit the method of customizing the sharing object.
• The above-mentioned touch operation for the sharing option 312D in the floating window 312 included in the user interface 410 shown in FIG. 4A, the above-mentioned sliding operation for the user interface 610 shown in FIG. 6A (in the example of FIG. 6A, a knuckle sliding along a specific trajectory of "W"), the above-mentioned touch operation for the sharing control 712B in the user interface 710 shown in FIG. 7A, and the above-mentioned touch operation for the control 814B in the user interface 810 shown in FIG. 8A can be collectively referred to as a user operation for triggering the real-time sharing function.
• The user operation for triggering the real-time sharing function can also take other forms, such as a touch operation (such as a click operation) for the sharing option 312D in the call interface 310 shown in (A) of FIG. 3, voice input, gestures, etc.; this application does not limit this.
  • the electronic device 100 can also receive a user operation for triggering a real-time sharing function in a non-call state.
  • the real-time sharing function can be implemented through near field communication technology.
  • the electronic device 100 may display a selection interface for sharing content and sharing objects in response to the user operation.
  • the electronic device may display the user interface 910 shown in FIG. 9A .
  • the user interface 910 is similar to the user interface 620 shown in FIG. 6B .
• The difference is that the status bar at the top of the user interface 910 does not include a call icon, indicating that the electronic device 100 is currently in a non-call state, and the list 911 of selectable sharing objects in the user interface 910 does not include an option indicating the call party, but only includes multiple options indicating nearby devices. It is not limited to this.
• The selection interface for sharing content and sharing objects can also be the user interface 920 shown in FIG. 9B. The user interface 920 is similar to the user interface 720 shown in FIG. 7B. The difference is that the status bar at the top of the user interface 920 does not include a call icon, indicating that the electronic device 100 is currently in a non-call state, and the list 921 of selectable sharing objects in the user interface 920 does not include an option indicating the call party, but only includes multiple options indicating nearby devices.
• The electronic device 100 may receive a touch operation for any one of the multiple options indicating nearby devices (taking option 622B as an example), and send a sharing request to the electronic device 400 indicated by option 622B (i.e., the "mobile phone" of "User C").
  • the electronic device 400 may display prompt information, for example, display the user interface 930 shown in FIG. 9C.
  • the user interface 930 may be the desktop of the electronic device 400.
• The status information 931 at the top of the user interface 930 includes the logos of "5G" and Bluetooth, which may indicate that the electronic device 400 has currently enabled mobile data and Bluetooth functions.
  • the user interface 930 also includes prompt information 932.
• The prompt information 932 includes the icon 932A of the short video application to which the shared content (taking the audio stream/video stream of the short video 1 played in the play window 412 in the user interface 410 shown in FIG. 4A as an example) belongs, the characters "User A invites you to watch together", and an acceptance control 932B.
  • the electronic device 400 may, in response to a touch operation (such as a click operation) on the acceptance control 932B, receive the shared content sent by the electronic device 100 through a near field communication technology (such as Bluetooth), and display the shared content, such as Play window 522 in user interface 520 shown in Figure 5B.
• The communication method through which the electronic device 100 sends the sharing request to the electronic device 400 may be Bluetooth. In other examples, it may also be another communication method such as Wi-Fi or cellular communication. That is to say, the communication method used by the electronic device 100 to send the sharing request to the electronic device 400 and the communication method used by the electronic device 100 to send the shared content to the electronic device 400 may be the same or different.
  • the multimedia data streams of the shared content sent to different shared devices may be the same or different.
• For example, the electronic device 100 may send the audio stream of the short video application to at least one shared device connected via Bluetooth, and send both the audio stream and the video stream of the short video application to at least one shared device connected via Wi-Fi.
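• A minimal sketch of this per-link stream selection follows, with illustrative link labels and a hypothetical send_stream() callback:

```python
def streams_for_link(link_type: str) -> list:
    # Bluetooth links carry only the audio stream; other links carry both.
    if link_type == "bluetooth":
        return ["audio"]
    return ["audio", "video"]

def share_to_devices(devices: list, send_stream) -> None:
    # The stream composition can differ per shared device, depending on its link.
    for dev in devices:
        for stream in streams_for_link(dev["link_type"]):
            send_stream(dev["id"], stream)
```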
  • the electronic device 100 can also implement the real-time sharing function through near field communication technology during a call, which is not limited in this application.
  • the above example can realize real-time sharing functions such as watching together and listening together in near field communication scenarios such as nearby Bluetooth devices.
  • the application scenarios are wider and the user experience is better.
  • real-time sharing functions such as watching together and listening together can also be implemented in satellite, D2D, V2X and other communication scenarios. This application does not limit the communication method for realizing the real-time sharing function.
  • the sharing device can, but is not limited to, determine the sharing object and content in any of the following ways:
• Method 1: preset sharing objects and preset sharing content.
• The electronic device 100 directly sets the call party (i.e., the electronic device 200) as the sharing object, and sets the audio stream/video stream of the foreground application (i.e., the short video application) as the shared content.
• Method 2: preset sharing objects, and determine the sharing content based on received user operations.
• The electronic device 100 displays a shared content selection interface in response to a touch operation (such as a click operation) on the sharing option 312D in the user interface 410 shown in FIG. 4A, for example, displays the list 621 of selectable shared content in the user interface 620 shown in FIG. 6B. The electronic device 100 can determine the shared content according to the operation input by the user on the shared content selection interface, and the electronic device 100 can directly set the call counterparty (i.e., the electronic device 200) as the sharing object.
• Method 3: preset sharing content, and determine the sharing object based on the received user operation.
• For example, the electronic device 100 can directly set the audio stream/video stream of the short video application to which the user operation for triggering the real-time sharing function is applied as the shared content, and determine the sharing object based on a received user operation.
• Method 4: determine the sharing content and sharing objects based on received user operations.
• For example, the electronic device 100 can determine the sharing content and sharing objects based on the operation input by the user on the selection interface (i.e., the user interface 620 shown in FIG. 6B).
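• The four methods can be summarized in a small dispatch sketch; ask_user() is a hypothetical stand-in for the selection interfaces:

```python
def determine_sharing(method: int, call_party, foreground_app, ask_user):
    # Methods 1-4 above; ask_user() stands in for the selection interfaces.
    if method == 1:                                   # preset object + content
        return call_party, foreground_app
    if method == 2:                                   # preset object only
        return call_party, ask_user("content")
    if method == 3:                                   # preset content only
        return ask_user("object"), foreground_app
    return ask_user("object"), ask_user("content")    # method 4
```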
  • the display timing of the floating window used to trigger the real-time sharing function may be, but is not limited to, the following situations.
• The floating window here (which can also be understood as a display form) may be the detailed information of the floating window, such as the floating window 312 shown in (A) of FIG. 3, or it may be an icon of the floating window, such as the icon shown in (C) of FIG. 3.
  • the electronic device 100 can switch the display mode when displaying the floating window. For specific examples, please refer to the description of FIG. 3(A) and FIG. 3(C) .
• Case 1: the electronic device 100 can display a floating window in the call state.
  • the call state may be the call state of an operator. See Figure 3 for an interface example.
  • the call state may be a call state of an OTT call, and the interface example is similar to Figure 3 (for example, the interface of a voice/video call of a social application is displayed at this time).
• Case 2: the electronic device 100 may display a floating window when displaying a conversation interface.
  • the session interface may be an operator session interface (ie, a short message session interface).
  • the conversational interface may be an interface for an OTT conversation (such as a conversational interface of an instant messaging application, and the conversation may have one or more objects).
• Case 3: the electronic device 100 can display a floating window when displaying information about a certain call partner (not in the call state or conversation state at this time), which can also be understood as providing a floating window when the user browses a certain call partner.
  • the electronic device 100 may display a floating window when displaying detailed information of a contact.
  • the contact may be a contact in a preset application.
• The preset application may be used to implement operator calls and/or operator sessions, and may also be used to implement OTT calls and/or OTT sessions.
  • the electronic device 100 can display a floating window when displaying information about a certain communication identifier.
  • the communication identifier can be used to identify the call partner, and different call parties have different communication identifiers.
• The call partner corresponding to the communication identifier displayed by the electronic device 100 may be a call partner that has not been recorded/stored by the electronic device 100, or may be a call partner that has been recorded/stored by the electronic device 100 (i.e., a contact).
  • the communication identifier is, for example, a communication identifier for an operator call (such as a phone number), or a communication identifier for an OTT call (such as a personal number or personal name of an online chat application).
• For example, the electronic device 100 may display a floating window.
• Case 4: the electronic device 100 can display a floating window when displaying a preset interface of a preset application (not in the call state or conversation state at this time, and not displaying information about only one call partner).
  • the preset application can be used to implement operator calls and/or operator sessions, and can also be used to implement OTT calls and/or OTT sessions.
  • the preset interface includes a conversation list of short messages.
  • the preset interface includes call records/chat records.
  • the preset interface includes a list of contacts (such as the user interface 1140 shown in Figure 11D below).
  • the preset interface includes a list of OTT sessions (such as instant messaging sessions).
• Case 5: the electronic device 100 may display a floating window when displaying a specific interface (not in a call state or conversation state, and not displaying information about only one call partner).
  • the specific interface is the desktop.
• The above Cases 1, 2 and 3 can be understood as displaying a floating window when there is a specific call object. The call object here can be an object with which a call/conversation is in progress, or an object with which a call/conversation is intended (for example, Case 3).
  • the electronic device 100 can first establish a link with the call partner, and then display the floating window after the link is successfully established.
• Alternatively, the electronic device 100 can first display the floating window, and when receiving a user operation on the floating window (used to trigger the real-time sharing function), establish a link with the sharing object (which may or may not be the above-mentioned call object).
• When establishing the link, NewTalk can be pulled up (the main link and/or the auxiliary link can be used for conversation), or NewTalk may not be pulled up.
• The above Cases 4 and 5 can be understood as displaying a floating window when there is no specific call partner.
  • the electronic device 100 may first display a floating window, and then establish a link with the sharing object when receiving a user operation on the floating window (used to trigger the real-time sharing function).
  • the sharing object may be selected by the user.
• For example, when the electronic device 100 displays a floating window while displaying the interface of application A, the system may display the contacts of system applications (such as a calling application/short message application); when application A has contacts, the contacts of application A can be displayed, and the displayed contacts are used by the user to select sharing objects.
• When establishing the link, NewTalk can be pulled up (the main link and/or the auxiliary link can be used for conversation), or NewTalk may not be pulled up.
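• Taken together, Cases 1 to 5 amount to a display-timing predicate for the floating window; the following rough sketch uses illustrative state flags, not real system state:

```python
def should_show_floating_window(state: dict) -> bool:
    return bool(
        state.get("in_call")                   # Case 1: operator/OTT call
        or state.get("in_conversation")        # Case 2: SMS/IM session interface
        or state.get("viewing_call_partner")   # Case 3: contact / communication ID page
        or state.get("in_preset_interface")    # Case 4: conversation/contact/call lists
        or state.get("on_specific_interface")  # Case 5: e.g. the desktop
    )
```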
• When the electronic device 100 is used as a sharing device for real-time sharing, it can manage the shared devices. In one embodiment, when the electronic device 100 is used as a sharing device for real-time sharing, it can change the shared content. Specific examples are as follows:
• The electronic device 100 may display a sharing menu in response to a touch operation (such as a click operation) on the sharing control option 421 in the user interface 420, such as the user interface 1010 shown in FIG. 10A.
  • the user interface 1010 also includes a sharing menu 1011, which may include multiple options, such as option 1011A, option 1011B, and option 1011C.
• Option 1011A includes the characters "Change sharing content/sharing objects".
  • Option 1011B includes the characters "Pause Sharing" for canceling/pausing/stopping real-time sharing.
  • Option 1011C includes the characters "more", which are used to trigger the display of more functional options.
  • the electronic device 100 may display the shared content and/or a management interface for the shared content in response to a touch operation (eg, a click operation) for option 1011A, such as displaying the user interface 640 shown in FIG. 6D .
• The electronic device 100 may, in response to a touch operation (e.g., a click operation) on option 621A (in a selected state) included in the list 621 in the user interface 640 shown in FIG. 6D, cancel sharing the audio stream/video stream of the short video application indicated by option 621A.
• The electronic device 100 may, in response to a touch operation (such as a click operation) on option 621C in the list 621, select the audio stream/video stream of the video application indicated by option 621C as the shared content, and the electronic device 100 may share the selected sharing content with the selected sharing objects (that is, the devices indicated by the options in a selected state included in the list 622 in the user interface 640).
  • the electronic device 100 changes the shared content from the audio stream/video stream of the short video application to the audio stream/video stream of the video application in response to the user operation.
• The electronic device 100 may, in response to a touch operation (such as a click operation) on any option in a selected state in the list 622, cancel sending the shared content to the device indicated by that option. For example, if the option is option 622A in the list 622, the electronic device 100 will no longer send the shared content to the call party indicated by option 622A (i.e., the electronic device 200).
  • the electronic device 100 deletes the existing shared device in response to the user operation. For example, after the above process, the electronic device 100 can display the user interface 1020 shown in Figure 10B.
  • the user interface 1020 is similar to the user interface 620 shown in Figure 6B. The difference is that in the user interface 1020, the option 621A in the list 621 is in an unselected state, option 621C is in a selected state, and option 622A in list 622 is in an unselected state.
  • the above example implements member management and content management in the real-time sharing process, which can meet the personalized needs of users and improve user experience.
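• The member management and content management described above can be sketched as a small session object; the names are illustrative only:

```python
class SharingSession:
    """Tracks the shared content and the set of shared devices mid-session."""

    def __init__(self, content, devices):
        self.content = content
        self.devices = set(devices)

    def change_content(self, new_content):
        # e.g. switch from the short video application to the video application
        self.content = new_content

    def remove_device(self, device):
        # e.g. deselect option 622A: stop sending to the electronic device 200
        self.devices.discard(device)

    def add_device(self, device):
        self.devices.add(device)

session = SharingSession("short_video_av", {"electronic_device_200"})
session.change_content("video_app_av")
session.remove_device("electronic_device_200")
```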
• The electronic device 100 can conduct NewTalk such as operator calls/OTT calls with multiple electronic devices, and the electronic device 100 can share audio streams/video streams in real time to at least one of the multiple electronic devices.
  • the electronic device 100 may share the audio stream/video stream of the foreground application to multiple call parties (ie, the above-mentioned multiple electronic devices) in response to a user operation for triggering the real-time sharing function.
• The electronic device 100 may, in response to a user operation for triggering the real-time sharing function, display the information of the above-mentioned multiple electronic devices on the selection interface of the sharing object, so that the user can choose whether to share in real time to at least one of these devices.
  • the electronic device 100 may display the user interface 1110 shown in FIG. 11A .
  • the user interface 1110 is similar to the user interface 620 shown in FIG. 6B .
• The difference is that the list 1111 of selectable sharing objects also includes option 1111A. Option 1111A includes the characters "Phone number 3 (on call)", which is used to indicate the call party whose communication number is "phone number 3".
• Option 622A and option 1111A in the list 1111 may represent that the electronic device 100 is currently conducting NewTalk such as operator calls/OTT calls with the device whose communication number is "phone number 2" and the device whose communication number is "phone number 3", respectively.
  • the electronic device 100 may share audio in real time to the device with the communication number "Phone Number 2" and/or the device with the communication number "Phone Number 3" in response to a touch operation (such as a click operation) for option 622A and/or option 1111A.
• That is to say, this application can implement not only the real-time sharing function of the unicast type (one shared device), but also the real-time sharing function of the broadcast or multicast type (multiple shared devices).
  • the electronic device 100 may also receive a user operation for triggering the real-time sharing function in a non-call state.
  • the electronic device 100 may, in response to the user operation, display at least one device with recent communication on the selection interface of the sharing object for the user to select whether to share the audio stream/video stream to the at least one device in real time.
• The at least one device may be a device that has communicated with the electronic device 100 within a preset time range (such as 1 hour, 1 day, or 1 week). The number of the at least one device may be preset by the electronic device 100, for example, less than or equal to 3.
• The at least one device may be a device that communicates with the electronic device 100 through a preset application, where the preset application is, for example, an application for implementing operator calls, OTT calls, and/or network chats. This application does not limit the specific type of the at least one recently communicated device.
  • the electronic device 100 may display the user interface 1120 shown in FIG. 11B.
• The user interface 1120 is similar to the user interface 620 shown in FIG. 6B. The difference is that in the user interface 1120, the list 1121 of selectable sharing objects does not include option 621A of the user interface 620, and the list 1121 also includes option 1121A.
• Option 1121A includes the characters "Phone number 2 (recent contact)", which is used to indicate the device with the communication number "phone number 2" with which the electronic device 100 has recently conducted NewTalk such as an operator call/OTT call.
• The electronic device 100 may, in response to a touch operation (such as a click operation) on option 1121A, send a NewTalk call request to the device with the communication number "phone number 2". After the device with the communication number "phone number 2" accepts the call request, the electronic device 100 can perform NewTalk with the device, and the electronic device 100 can share the audio stream/video stream with the device in real time based on the NewTalk.
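• A possible sketch of building the "recent communication" list, assuming a 1-day window and a cap of 3 entries as in the examples above:

```python
import time

def recent_devices(comm_log, window_s=86400, cap=3):
    # Keep devices contacted within the preset window (here 1 day), newest
    # first, and truncate to the preset maximum number (here 3).
    now = time.time()
    fresh = [e for e in comm_log if now - e["last_contact"] <= window_s]
    fresh.sort(key=lambda e: e["last_contact"], reverse=True)
    return [e["device"] for e in fresh[:cap]]
```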
• The electronic device 100 may, in response to a user operation for triggering the real-time sharing function, display icons of contacts in the selection area of the sharing object, for the user to choose whether to share the audio stream/video stream in real time with at least one contact stored in the electronic device 100.
  • the at least one contact may be a contact in a preset application.
• The preset application is an application that implements operator calls, OTT calls and/or online chats. This application does not limit the specific types of contacts.
  • the electronic device 100 may display the user interface 1130 shown in FIG. 11C.
  • the user interface 1130 is similar to the user interface 620 shown in FIG. 6B.
• The difference is that the list 1131 of selectable sharing objects does not include option 621A of the user interface 620, and the list 1131 also includes option 1131A, which includes the characters "Contacts".
  • the electronic device 100 may display information of at least one contact stored by the electronic device 100 in response to a touch operation (eg, a click operation) for option 1131A, such as displaying the user interface 1140 shown in FIG. 11D .
  • User interface 1140 may include a title 1141 ("Contacts"), a search box 1142, a contact list 1143, and a determination control 1144.
  • the contact list 1143 may include information about multiple contacts, such as information 1143A about a contact named "Friends and Relatives 1".
• A selection control 1143B is also displayed on the right side of the information 1143A. The selection control 1143B is used to select the contact "Friends and Relatives 1" indicated by the information 1143A or to cancel this selection. The information of other contacts is similar and will not be repeated.
• The electronic device 100 may, in response to a touch operation (such as a click operation) on the determination control 1144, send a NewTalk call request to the device corresponding to the selected contact in the contact list 1143 (such as the contact "Friends and Relatives 1" indicated by the information 1143A). After the device accepts the call request, the electronic device 100 can perform NewTalk with the device, and the electronic device 100 can share the audio stream/video stream with the device in real time based on the NewTalk.
• The electronic device 100 can use the identification information of stored contacts (such as the above-mentioned recent contacts or contacts in the contact list), for example, the above-mentioned phone number or network chat account, to obtain the communication ID of the device corresponding to the contact, for example, through addressing via the network device 300. After the electronic device 100 and the device corresponding to the contact complete addressing, a connection can be established based on the obtained communication ID of the other party. The electronic device 100 can share the audio stream/video stream in real time with the device corresponding to the contact based on the established connection.
  • the connection established above may be, for example, but not limited to, a Bluetooth connection, a Wi-Fi connection, or a NewTalk connection.
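• The addressing flow can be sketched as follows; resolve() and connect() are hypothetical helpers standing in for the network device 300 and the link setup:

```python
def share_to_contact(contact_id, resolve, connect, stream_chunks):
    # resolve(): addressing, e.g. via the network device 300, turning a phone
    # number / chat account into the peer's communication ID.
    comm_id = resolve(contact_id)
    if comm_id is None:
        return False
    link = connect(comm_id)        # e.g. Bluetooth / Wi-Fi / NewTalk connection
    for chunk in stream_chunks:    # real-time audio/video over the link
        link.send(chunk)
    return True
```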
  • the electronic device 100 may determine a real-time sharing method in response to a user operation, for example, select a real-time sharing method such as watching together, listening together, editing together, or playing together.
  • the following example illustrates the real-time sharing methods that can be selected, including watching together and listening together.
• The electronic device 100 may display a real-time sharing mode selection interface in response to a user operation for triggering the real-time sharing function (such as a touch operation for the control 814D in the user interface 810 shown in FIG. 8A), such as displaying the user interface 1210 shown in FIG. 12A. The user interface 1210 includes a prompt box 1211.
  • the prompt box 1211 includes a watch together option 1211A and a listen together option 1211B.
• When the real-time sharing method selected by the user is different, the selection interface of the sharing content and/or the sharing object displayed by the electronic device 100 may also be different.
• The electronic device 100 may display the user interface 1220 shown in FIG. 12B in response to a touch operation (e.g., a click operation) on the watch-together option 1211A in the user interface 1210.
  • the user interface 1220 is similar to the user interface 620 shown in FIG. 6B .
• The difference is that the list 1221 of selectable shared content in the user interface 1220 includes a plurality of options indicating shared content that can be watched, such as the option 621A of sharing the image of the short video application, the option 621B of sharing the display content of the screen of the electronic device 100, and the option 621C of sharing the image of the video application.
• The list 1222 of selectable sharing objects in the user interface 1220 includes a plurality of options indicating devices that can display images, such as option 622A indicating the electronic device 200 (such as a mobile phone) with the communication number "phone number 2", option 622B indicating the "mobile phone" of "user C", option 622C indicating the "notebook" of "user A", and option 622D indicating the "tablet" of "user D".
• The electronic device 100 may display the user interface 1230 shown in FIG. 12C in response to a touch operation (e.g., a click operation) on the listen-together option 1211B in the user interface 1210.
• The user interface 1230 is similar to the user interface 620 shown in FIG. 6B. The difference is that the list 1231 of selectable shared content in the user interface 1230 includes a plurality of options indicating shared content that can be listened to, such as the option 621A of sharing the audio of the short video application, the option 621C of sharing the audio of the video application, and the option 1231A of sharing the audio of the music application.
• The list 1232 of selectable sharing objects in the user interface 1230 includes a plurality of options indicating devices that can play audio, such as option 622A indicating the electronic device 200 (such as a mobile phone) with the communication number "phone number 2", option 622E indicating the "headphones" of "user C", and option 622F indicating the "speaker" of "user E".
  • the list 1232 of selectable sharing objects in the user interface 1230 also includes option 622B, option 622C, and option 622D, which is not limited in this application.
  • the electronic device 100 may determine the real-time sharing method according to the user operation for triggering the real-time sharing function. That is to say, different user operations for triggering the real-time sharing function correspond to different real-time sharing methods.
• For example, the electronic device 100 may display the user interface 1220 shown in FIG. 12B in response to a first sliding operation on the user interface 610 shown in FIG. 6A (for example, the knuckle shown in FIG. 6A sliding along a specific trajectory of "W"), and may display the user interface 1230 shown in FIG. 12C in response to a second sliding operation on the user interface 610 shown in FIG. 6A (for example, the knuckle sliding along a specific trajectory of "L" in the user interface 1240 shown in FIG. 12D).
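• A tiny sketch of mapping gesture trajectories to sharing modes, assuming the trajectory recognizer itself already exists:

```python
GESTURE_TO_MODE = {
    "W": "watch_together",   # knuckle sliding along a "W" trajectory
    "L": "listen_together",  # knuckle sliding along an "L" trajectory
}

def mode_for_gesture(trajectory):
    # Unknown trajectory shapes map to None (no sharing mode triggered).
    return GESTURE_TO_MODE.get(trajectory)
```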
• When the sharing device performs real-time sharing, not only can the audio stream/video stream of a running application (such as the foreground application and/or a background application) be shared in real time, but the audio stream/video stream of an application that is not running can also be shared in real time.
• For example, in FIG. 12C, the list 1231 of selectable sharing content includes the option 621A of sharing the audio of the short video application (a foreground application), the option 621C of sharing the audio of the video application (a background application), and the option 1231A of sharing the audio of the music application (an application that is not running). The electronic device 100 may, in response to a touch operation (e.g., a click operation) on option 1231A, launch the music application and share the audio stream/video stream of the music application to the selected sharing object in real time.
  • the electronic device 100 may determine the type of selectable sharing objects in response to a user operation.
  • the electronic device 100 may display a user interface for selecting a type of sharing object in response to a user operation for triggering a real-time sharing function, and then display a selection interface for a sharing object consistent with the selected type.
• The electronic device 100 may first display the user interface 1310 shown in FIG. 13. The user interface 1310 includes a prompt box 1311. The prompt box 1311 includes option 1311A (including the characters "Share with contacts"), option 1311B (including the characters "Share with Wi-Fi device"), and option 1311C (including the characters "Share with Bluetooth device").
• The selectable sharing objects displayed by the electronic device 100 in response to a touch operation (such as a click operation) on option 1311A are devices that communicate with the electronic device 100 through NewTalk such as operator calls/OTT calls, such as the device indicated by option 622A in the user interface 620 shown in FIG. 6B.
• The selectable sharing objects displayed by the electronic device 100 in response to a touch operation (such as a click operation) on option 1311B are devices that communicate with the electronic device 100 through Wi-Fi, such as the devices indicated by option 622C and option 622D in the user interface 620 shown in FIG. 6B.
• The selectable sharing objects displayed by the electronic device 100 in response to a touch operation (such as a click operation) on option 1311C are devices that communicate with the electronic device 100 through Bluetooth, such as the devices indicated by option 622B, option 622E, and option 622F in the user interface 620 shown in FIG. 6B.
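• A sketch of filtering candidate sharing objects by the selected type; the transport labels are assumptions:

```python
def sharing_objects(candidates, selected_type):
    # selected_type corresponds to options 1311A-C:
    # "newtalk" (contacts), "wifi", or "bluetooth".
    return [d for d in candidates if d["transport"] == selected_type]

nearby = [
    {"name": "phone number 2", "transport": "newtalk"},
    {"name": "user A's notebook", "transport": "wifi"},
    {"name": "user C's headphones", "transport": "bluetooth"},
]
print(sharing_objects(nearby, "bluetooth"))  # -> only user C's headphones
```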
• In some embodiments, the prompt information corresponding to the sharing request can be played through an audio module such as a speaker. This application does not limit the way in which the electronic device outputs the prompt information.
• For example, the shared device is a headset.
• The electronic device 100 can display a user interface 1410. The user interface 1410 is similar to the user interface 1230 shown in FIG. 12C. The difference is that option 622E in the user interface 1410 is in a selected state, which can represent that the electronic device 500 indicated by option 622E (i.e., the "headphones" of "user C") is the selected sharing object.
  • Option 621A in the user interface 1410 is also in a selected state, which may indicate that the audio of the short video application indicated by option 621A is the selected sharing content.
  • the electronic device 100 can send a sharing request to the electronic device 500.
  • the electronic device 500 can play a prompt sound, such as "beep beep beep" shown in (2) of Figure 14A.
• The electronic device 500 can accept the sharing request in response to a user operation (such as a click operation). After that, the electronic device 500 can receive the shared content sent by the electronic device 100 and play the shared content, that is, the audio of the above-mentioned short video application; see (3) of FIG. 14A for a specific example.
• For another example, the shared device is a speaker.
• The electronic device 100 can display a user interface 1420. The user interface 1420 is similar to the user interface 1230 shown in FIG. 12C. The difference is that option 622F in the user interface 1420 is in a selected state, which can represent that the electronic device 600 indicated by option 622F (i.e., the "speaker" of "user E") is the selected sharing object.
  • Option 621A in the user interface 1420 is also in a selected state, which may indicate that the audio of the short video application indicated by option 621A is the selected sharing content.
  • the electronic device 100 can send a sharing request to the electronic device 600.
  • the electronic device 600 can play a prompt sound, such as "User A invites you to listen to the audio" shown in (2) of Figure 14B.
• The electronic device 600 can accept the sharing request in response to a user operation (such as a click operation on the play button of the electronic device 600). After that, the electronic device 600 can receive the shared content sent by the electronic device 100 and play the shared content, that is, the audio of the above short video application. For a specific example, see (3) in FIG. 14B.
  • the shared device may not output the prompt information, but directly accept the sharing request.
• The electronic device 100 can display a user interface 1430. The user interface 1430 is similar to the user interface 1220 shown in FIG. 12B, except that option 622C in the user interface 1430 is in a selected state, which can represent that the electronic device 700 (i.e., "my" "notebook") indicated by option 622C is the selected sharing object, where the login account of the electronic device 700 is the same as the login account of the electronic device 100 (i.e., the account named "User A").
  • Option 621A in the user interface 1430 is also in a selected state, which may indicate that the image of the short video application indicated by option 621A is the selected sharing content.
• The electronic device 100 can send a sharing request to the electronic device 700. After receiving the sharing request, the electronic device 700 can directly accept it, and receive and display the shared content sent by the electronic device 100. For a specific example, see (2) of FIG. 14C.
  • the electronic device 700 may display a user interface 1440, which is used to display images of the above short video application.
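• The "same login account, accept directly" behavior can be sketched as a simple policy check; prompt_user() is a hypothetical fallback:

```python
def handle_share_request(local_account, request, prompt_user):
    # Same login account on both devices: accept directly without a prompt.
    if request["account"] == local_account:
        return True
    # Otherwise fall back to asking the user to accept or reject.
    return prompt_user(request)
```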
• The sharing device may send a sharing request for the shared device to another device connected to the shared device, where the sharing request is used to request real-time sharing of the audio stream/video stream to the shared device. After receiving the sharing request, the above-mentioned other device can output prompt information, and the user can accept or reject the above-mentioned sharing request for the shared device through the above-mentioned other device. This application does not limit the way the sharing device sends the sharing request.
• The electronic device 100 may display a user interface 1450, which is similar to the user interface 620 shown in FIG. 6B. The difference is that option 622E in the user interface 1450 is in a selected state, which may indicate that the electronic device 500 (i.e., the "headphones" of "user C") indicated by option 622E is the selected sharing object.
• Option 621A in the user interface 1450 is also in a selected state, which may indicate that the audio of the short video application indicated by option 621A is the selected sharing content.
• The electronic device 100 can send a sharing request for the electronic device 500 to the electronic device 400. After receiving the sharing request, the electronic device 400 may display prompt information, for example, display the user interface 1460 shown in (2) of FIG. 14D.
  • the user interface 1460 may be the desktop of the electronic device 400 and may include prompt information 1461.
• The prompt information 1461 includes prompt language 1461A (including the characters "User A invites you to listen together through headphones", where "headphones" refers to the electronic device 500), a determination control 1461B (for accepting the above-mentioned sharing request for the electronic device 500), and a cancel control 1461C (for rejecting the above-mentioned sharing request for the electronic device 500).
  • the electronic device 400 may accept the above sharing request for the electronic device 500 in response to a touch operation (eg, a click operation) on the determination control 1461B.
  • the electronic device 500 can receive and play the shared content sent by the electronic device 100, that is, the audio of the above-mentioned short video application. For a specific example, see (3) of Figure 14D.
• That is to say, the sharing device can send the shared content for the shared device to the shared device through another device connected to the shared device, which can be understood as forwarding data through a "third-party device" (that is, the above-mentioned other device).
  • the audio of the short video application can be sent to the electronic device 400 connected to the electronic device 500.
  • the electronic device 400 may forward the received audio of the short video application to the electronic device 500, and the electronic device 500 may play it.
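• A minimal sketch of this third-party forwarding, with forward() as a hypothetical send function toward the accessory:

```python
def relay_stream(incoming_chunks, forward):
    # Runs on the intermediate device (e.g. the electronic device 400):
    # each received chunk of shared audio is forwarded to the accessory
    # (e.g. the headphones, the electronic device 500), which plays it.
    for chunk in incoming_chunks:
        forward(chunk)
```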
• In some embodiments, the type of shared content (such as audio, image, or audio and image) can also be set through more options in the sharing menu. This application does not limit the type setting method.
  • the electronic device 100 may display a sharing menu in response to a touch operation (eg, a click operation) for the sharing control option 421 in the user interface 420 shown in FIG. 4B , for example, displaying the sharing menu shown in (1) of FIG. 15A User interface 1510.
• The sharing menu 1511 in the user interface 1510 may include multiple options, for example, option 1511A for sharing the audio stream/video stream of the current application (FIG. 15A takes a short video application as an example), option 1511B for sharing the display content of the screen of the electronic device 100, option 1511C for canceling/pausing/stopping real-time sharing, and option 1511D for triggering more functional options.
  • the electronic device 100 may display the user interface 1520 shown in (2) of FIG. 15A in response to a touch operation (for example, a click operation) on the option 1511D.
  • the user interface 1520 may include a setting window 1521, which may include a setting name 1521A (including the characters "audio and video settings") and a plurality of setting options, including, for example, option 1521B, option 1521C, and option 1521D.
  • option 1521B includes the characters "audio + picture", which is used to set the type of shared content to image and audio.
  • Option 1521C includes the characters "audio” and is used to set the type of shared content to audio.
  • Option 1521D includes the character "picture", which is used to set the type of shared content to image.
  • the user interface 1520 also includes a reset control 1522 and a save control 1523.
  • the reset control 1522 is used to set the preset options (such as option 1521B) in the setting window 1521 to a selected state.
• The save control 1523 is used to save the current setting content of the setting window 1521. For example, option 1521B in the setting window 1521 shown in the user interface 1520 is in a selected state, and the electronic device 100 can, in response to a touch operation (such as a click operation) on the save control 1523, set the shared content to the image and audio indicated by option 1521B.
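• A sketch of how the saved option could select the transmitted streams; the option strings mirror options 1521B to 1521D:

```python
CONTENT_TYPES = {
    "audio + picture": {"audio", "video"},  # option 1521B (the preset default)
    "audio": {"audio"},                     # option 1521C
    "picture": {"video"},                   # option 1521D
}

def streams_to_send(saved_option):
    # Fall back to the preset option when nothing has been saved yet.
    return CONTENT_TYPES.get(saved_option, CONTENT_TYPES["audio + picture"])
```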
  • the electronic device 100 can set whether to simultaneously share the audio captured by the microphone and/or the image captured by the camera when sharing the audio stream/video stream of the system and/or application in real time.
  • the electronic device 100 may display the user interface 1530 shown in FIG. 15B in response to a touch operation (eg, a click operation) for option 1511D in the user interface 1510 shown in (1) of FIG. 15A .
• The user interface 1530 may include a setting window 1531, a reset control 1532, and a save control 1533. The setting window 1531 may include a setting name 1531A (including the characters "Mixing Picture Settings") and a plurality of setting options, including, for example, option 1531B, option 1531C, option 1531D, and option 1531E.
• Option 1531B includes the characters "no mixing", which is used to set that only the audio stream/video stream of the system and/or application is shared in real time, and that neither the audio collected by the microphone nor the image collected by the camera is shared in real time.
• Option 1531C includes the characters "overlay MIC", which is used to set that the audio collected by the microphone is also shared in real time along with the audio stream of the system and/or application, while the image collected by the camera is not shared in real time along with the video stream of the system and/or application.
• Option 1531D includes the characters "overlay camera", which is used to set that the audio collected by the microphone is not shared in real time along with the audio stream of the system and/or application, while the image collected by the camera is also shared in real time along with the video stream of the system and/or application.
• Option 1531E includes the characters "overlay MIC and camera", which is used to set that the audio collected by the microphone is also shared in real time along with the audio stream of the system and/or application, and that the image collected by the camera is also shared in real time along with the video stream of the system and/or application.
• The reset control 1532 is used to set the preset option (such as option 1531B) in the setting window 1531 to a selected state, and the save control 1533 is used to save the current content of the setting window 1531.
  • the sharing device can send the shared content and the audio collected by the microphone of the sharing device to the shared device, and the shared device can play the shared content and the audio collected by the microphone of the sharing device at the same time.
• Similarly, the sharing device can send the shared content and the image captured by the camera of the sharing device to the shared device, and the shared device can simultaneously display the shared content and the image captured by the camera of the sharing device.
  • the user interface 1540 shown in FIG. 15C can be displayed.
  • the user interface 1540 is similar to the user interface 520 shown in FIG. 5B , except that the user interface 1540 also includes a control 1541 , and the control 1541 is used to display the face image collected by the camera of the electronic device 100 .
  • the sharing device can also be preset to share or not share the audio captured by the microphone and/or the image captured by the camera by default when performing real-time sharing.
• Alternatively, when the sharing device receives a user operation for triggering real-time sharing, it may first display the user interface 1530 shown in FIG. 15B; this is not limited in this application.
  • the sharing user and the shared user can also have a conversation to meet the user's personalized needs and provide a better experience.
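• The mixing options 1531B to 1531E can be sketched as presets selecting which extra sources are mixed in; all names are illustrative:

```python
MIX_PRESETS = {
    "no mixing": (False, False),             # option 1531B (the preset default)
    "overlay MIC": (True, False),            # option 1531C
    "overlay camera": (False, True),         # option 1531D
    "overlay MIC and camera": (True, True),  # option 1531E
}

def compose_outgoing(app_audio, app_video, mic_audio, camera_image, preset):
    # Returns the lists of audio and video sources to send to the shared device.
    use_mic, use_camera = MIX_PRESETS[preset]
    audio_sources = [app_audio, mic_audio] if use_mic else [app_audio]
    video_sources = [app_video, camera_image] if use_camera else [app_video]
    return audio_sources, video_sources
```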
  • the electronic device 100 can set relevant permissions of the shared device based on the shared content.
  • the relevant permissions include saving permissions, such as the permission to record/take screenshots, and/or the permission to save files of shared content.
• The relevant permissions include secondary propagation permissions, such as an instant propagation permission and/or a delayed propagation permission, where the instant propagation permission is the permission of the shared device to forward the real-time shared content to other devices while playing the content shared by the sharing device in real time, and the delayed propagation permission is the permission of the shared device to forward the saved shared content to other devices after saving the shared content sent by the sharing device.
  • the electronic device 100 may display the user interface 1550 shown in FIG. 15D in response to a touch operation (eg, a click operation) for option 1511D in the user interface 1510 shown in (1) of FIG. 15A .
  • the user interface 1550 may include a setting window 1551, a reset control 1552, and a save control 1553.
• The setting window 1551 may include a setting name 1551A (including the characters "permission settings") and a plurality of setting options. The plurality of setting options include, for example, option 1551B, option 1551C, and option 1551D.
• Option 1551B includes the characters "Destroy after reading (cannot be saved, cannot be forwarded)", which is used to set that the shared device is granted neither the saving permission nor the secondary propagation permission.
• Option 1551C includes the characters "can save, can take screenshots", which is used to set that the shared device is granted the saving permission but not the secondary propagation permission.
• Option 1551D includes the characters "can be forwarded", which is used to set that the shared device is granted the secondary propagation permission but not the saving permission.
  • the reset control 1552 is used to set the preset options (such as option 1551B) in the setting window 1551 to a selected state, and the save control 1553 is used to save the current content of the setting window 1551.
• The permission settings in the setting window 1551 can also be more detailed, for example but not limited to including at least one of the following setting options: option 1 for setting that the shared device is granted neither the saving permission nor the secondary propagation permission (in this case, the instant propagation permission); option 2 for setting that the shared device is not granted the saving permission but is granted the secondary propagation permission (in this case, the instant propagation permission); option 3 for setting that the shared device is granted the saving permission but not the secondary propagation permission (including the instant propagation permission and the delayed propagation permission); option 4 for setting that the shared device is granted the saving permission and the delayed propagation permission but not the instant propagation permission; option 5 for setting that the shared device is granted the saving permission and the secondary propagation permission (including the instant propagation permission and the delayed propagation permission); option 6, etc. This application does not limit the specific settings.
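• The permission model above reduces to three independent flags; a sketch follows, with options 1551B to 1551D expressed as particular combinations (the mapping of "can be forwarded" to both propagation flags is an assumption):

```python
from dataclasses import dataclass

@dataclass
class SharePermissions:
    can_save: bool             # saving permission (record/screenshot/save file)
    can_instant_forward: bool  # instant propagation permission
    can_delayed_forward: bool  # delayed propagation permission

BURN_AFTER_READING = SharePermissions(False, False, False)  # option 1551B
SAVE_ONLY = SharePermissions(True, False, False)            # option 1551C
FORWARD_ONLY = SharePermissions(False, True, True)          # option 1551D
```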
• When a shared device with the instant propagation permission plays content 1 shared by the sharing device in real time, it can share content 1 to other devices in real time in response to a user operation for triggering the real-time sharing function.
• The specific process is similar to that of the electronic device 100 serving as a sharing device to share in real time with other shared devices, and will not be described again.
• The following describes the case where a shared device that does not have the instant propagation permission plays content 1 shared by the sharing device in real time.
  • the electronic device 200 may display the user interface 520 shown in FIG. 5B , and the user interface 520 is used to play the audio stream/video stream of the short video application shared by the electronic device 100 in real time.
  • the electronic device 200 may display the user interface 1610 shown in FIG. 16A in response to a sliding operation on the user interface 520 (for example, a finger joint slides along a specific trajectory of “W”).
  • the user interface 1610 includes a prompt box 1611.
• The prompt box 1611 includes prompt information 1611A (including the characters "No permission for others to watch/listen together, do you want to ask the other party for authorization?"), a request control 1611B, and a cancel control 1611C.
• The electronic device 200 may cancel the real-time sharing of the audio stream/video stream of the short video application to other devices in response to a touch operation (e.g., a click operation) on the cancel control 1611C, for example, return to displaying the user interface 520 shown in FIG. 5B.
• The electronic device 200 may, in response to a touch operation (such as a click operation) on the request control 1611B, send a request message to the sharing device to request the instant propagation permission of the currently playing shared content (referred to as the current shared content, that is, the audio stream/video stream of the short video application).
• Examples of the interfaces displayed by the electronic device 100 and the electronic device 200 at this time can be seen in FIG. 16B.
  • the electronic device 200 can display a user interface 1620, and the user interface 1620 includes prompt information 1621 (including the characters "waiting for authorization").
  • the electronic device 100 can display a user interface 1630.
• The user interface 1630 can include a prompt box 1631. The prompt box 1631 includes prompt information 1631A (including the characters "Whether to authorize user B to let others watch/listen together"), an agree control 1631B, and a reject control 1631C.
• The agree control 1631B is used to grant the electronic device 200 the instant propagation permission of the currently shared content, and the reject control 1631C is used to refuse to grant the electronic device 200 the instant propagation permission of the currently shared content.
• In other examples, the agree control 1631B can also be used to grant the electronic device 200 the instant propagation permission of any shared content, or to grant the electronic device 200 the instant propagation permission and the delayed propagation permission of the currently shared content.
  • the electronic device 100 may send a response message to the electronic device 200 in response to the user operation.
• For example, the electronic device 100, in response to a touch operation (e.g., a click operation) on the agree control 1631B in the user interface 1630 shown in (2) of FIG. 16B, sends a response message indicating acceptance of the request to the electronic device 200. After receiving the response message, the electronic device 200 can share the current shared content (i.e., the audio stream/video stream of the short video application) to other devices in real time; for example, the electronic device 200 outputs prompt information indicating successful authorization.
• The process of the electronic device 200 serving as a sharing device to share audio streams/video streams in real time with other devices is similar to the process of the electronic device 100 serving as a sharing device to share in real time; for example, the electronic device 200 can display the selection interface for sharing objects and/or sharing content.
• For another example, the electronic device 100 sends a response message indicating rejection of the request to the electronic device 200 in response to a touch operation (e.g., a click operation) on the reject control 1631C in the user interface 1630 shown in (2) of FIG. 16B. After receiving the response message, the electronic device 200 can cancel the real-time sharing of the current shared content (i.e., the audio stream/video stream of the short video application) to other devices; for example, the electronic device 200 outputs a prompt message indicating authorization failure.
• In other embodiments, after receiving the request message, the electronic device 100 may not output the prompt information but directly reject or accept the request message according to preset rules, where the preset rules may be preset by the electronic device 100 or determined in response to a user operation, which is not limited in this application.
• In other embodiments, the shared device may display prompt information in response to a user operation for triggering the real-time sharing function, where the prompt information indicates that the shared device does not have the instant propagation permission, for example, including the characters "No permission for others to watch/listen together". It is not limited to this; the shared device may alternatively not respond to the user operation for triggering the real-time sharing function. This application does not limit this.
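• The authorization exchange of FIG. 16A and FIG. 16B, seen from the shared device, can be sketched as a simple request/response handshake; the message fields and callbacks are assumptions:

```python
def request_instant_forward(send_request, wait_response, on_granted):
    # Shared-device side of the handshake: ask the sharing device for the
    # instant propagation permission of the currently playing shared content.
    send_request({"type": "instant_forward", "scope": "current_content"})
    response = wait_response()     # the "waiting for authorization" state
    if response.get("accepted"):
        on_granted()               # may now re-share the content in real time
        return True
    return False                   # keep playing, but re-sharing stays blocked
```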
  • the playback interface of the shared content displayed by the shared device may include a save control, and the save control is used to save the shared content to the shared device.
  • the electronic device 200 may display the user interface 1640 shown in FIG. 16C in response to a touch operation (eg, a click operation) for the sharing control option 523 in the user interface 520 shown in FIG. 5B.
• The user interface 1640 includes a sharing menu 1641, which may include multiple options, such as option 1641A for pausing/exiting the playback interface of the shared content, option 1641B for saving the shared content, and option 1641C for triggering more functional options.
• The electronic device 200 with the saving permission can, in response to a touch operation (such as a click operation) on option 1641B, save the shared content that has been sent by the electronic device 100 (such as the audio stream/video stream of the currently played short video application); at this time, for example, a prompt message indicating that the save is successful is displayed.
• the electronic device 200 that does not have the saving permission may display prompt information (indicating that the electronic device 200 does not have the saving permission) in response to a touch operation (such as a click operation) for option 1641B, or may request the electronic device 100 to grant the saving permission for the currently shared content.
• the specific example is similar to FIG. 16A and FIG. 16B and will not be described again. It is not limited to this: the electronic device 200 may also simply not respond to the touch operation on option 1641B, which is not limited in this application.
  • saving the shared content can also be triggered by other operations, such as voice input, specific sliding operations, etc. This application does not limit this.
• the electronic device 200 may also choose to save only the shared content that has already been played (which may be all or part of the shared content sent by the electronic device 100). This application does not specifically limit which shared content is saved.
• the electronic device 200 may display a sharing interface for the file of the shared content, such as the user interface 1650 shown in FIG. 16D.
• the user interface 1650 includes file information 1651, and the file information 1651 includes the characters "Content 1 shared by User A", which is used to indicate file 1 of the content shared by the electronic device 100 in real time.
• a selection control 1652 is also displayed on the left side of the file information 1651, which is used to select the file indicated by the file information 1651 or cancel the selection. When the selection control 1652 is in the selected state, the prompt information 1653 in the user interface 1650 may include the characters "1 item selected".
  • the user interface 1650 also includes a cancel control 1654 and a selection box 1655 for the sharing mode of the selected file.
  • the cancel control 1654 is used to cancel sending the selected file 1 to other devices.
• Selection box 1655 may include multiple options indicating different sharing methods, such as an option 1655A including the characters "instant sharing" (indicating a sharing method based on instant sharing), an option 1655B including the characters "recent contacts (phone number 4)" (indicating a sharing method based on NewTalk such as operator calls/OTT calls, where the sharing object is the device whose communication number is phone number 4), an option 1655C including the characters "WLAN direct connection" (indicating a WLAN-based sharing method), an option 1655D including the characters "Bluetooth" (indicating a Bluetooth-based sharing method), an option 1655E including the characters "Send to Friend", and an option 1655F including the characters "Email" (indicating a mailbox/email-based sharing method).
• the electronic device 200 with the delayed propagation permission can respond to a touch operation (such as a click operation) on any one of the above multiple options, and send the above-selected file 1 to other devices through the sharing method indicated by the option.
• the electronic device 200 that does not have the delayed propagation permission may display prompt information (indicating that the electronic device 200 does not have the delayed propagation permission) in response to a touch operation (such as a click operation) for any one of the above multiple options.
• the electronic device 200 that does not have the delayed propagation permission can also request the electronic device 100 to grant the delayed propagation permission for the selected file 1.
• the specific example is similar to FIG. 16A and FIG. 16B and will not be described again.
• Alternatively, the electronic device 200 that does not have the delayed propagation permission may not respond to the touch operation for any one of the above multiple options.
  • the file of the shared content saved by the electronic device 200 without delayed propagation permission may be encrypted, and the key used to decrypt the file is a dynamic key. Each time the electronic device 200 opens the file, it needs to request the electronic device 100 to obtain a dynamic key.
  • the dynamic key is time-sensitive (for example, valid within 1 minute or valid for the first 3 times, etc.).
• the electronic device 200 must decrypt the file with the obtained dynamic key before the file can be played. Even if the electronic device 200 successfully sends the file of the shared content to other devices (taking the electronic device 400 as an example), since the electronic device 400 cannot obtain the dynamic key, it cannot decrypt and play the file, thereby achieving the effect of protecting the privacy and security of the sharing user.
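• The dynamic-key scheme described above can be sketched as follows. This is a minimal illustration that assumes AES-GCM as the file cipher and models time sensitivity as an expiry instant plus a use counter; the cipher choice and the DynamicKey/requestKey names are assumptions, not specified by this application.

    import java.time.Instant
    import javax.crypto.Cipher
    import javax.crypto.spec.GCMParameterSpec
    import javax.crypto.spec.SecretKeySpec

    class DynamicKey(val keyBytes: ByteArray, val expiresAt: Instant, var usesLeft: Int)

    fun openSharedFile(ciphertext: ByteArray, iv: ByteArray, requestKey: () -> DynamicKey): ByteArray {
        // Each open requests a fresh dynamic key from the sharing device.
        val key = requestKey()
        require(Instant.now().isBefore(key.expiresAt)) { "dynamic key expired" }  // e.g. valid within 1 minute
        require(key.usesLeft > 0) { "dynamic key use count exhausted" }           // e.g. valid for the first 3 uses
        key.usesLeft--
        val cipher = Cipher.getInstance("AES/GCM/NoPadding")
        cipher.init(Cipher.DECRYPT_MODE, SecretKeySpec(key.keyBytes, "AES"), GCMParameterSpec(128, iv))
        return cipher.doFinal(ciphertext) // decrypted media, used for playback only
    }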
• In other examples, the file of the shared content saved by the electronic device 200 that does not have the delayed propagation permission may be encrypted, and the key used to decrypt the file is derived by using the device ID of the electronic device 200 as one of the factors. Therefore, only the electronic device 200 can use the key to decrypt the file.
• the device ID is, for example but not limited to, a media access control (MAC) address, a serial number (SN), or an international mobile equipment identity (IMEI). This application does not limit how to prohibit electronic devices that do not have the delayed propagation permission from sending files of shared content to other devices.
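• A plausible sketch of the device-bound key derivation follows, using a SHA-256 digest with the device ID as one input factor. The actual key derivation function is not specified by this application, so deriveFileKey and its inputs are illustrative only.

    import java.security.MessageDigest

    // Bind the file key to the saving device: the decryption key is derived with
    // the device ID (e.g. MAC/SN/IMEI) as one factor, so another device that
    // receives the file cannot reproduce the key.
    fun deriveFileKey(sharedSecret: ByteArray, deviceId: String): ByteArray {
        val digest = MessageDigest.getInstance("SHA-256")
        digest.update(sharedSecret)           // factor 1: secret agreed during sharing
        digest.update(deviceId.toByteArray()) // factor 2: the local device ID
        return digest.digest().copyOf(16)     // 128-bit AES key
    }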
• For example, when the electronic device 200 plays the saved file of the shared content, the user interface 1660 shown in FIG. 16E is displayed.
  • the user interface 1660 may include a title 1661 and a play box 1662.
• the title 1661 includes the name of the file played in the play box 1662.
• the play box 1662 may include a play/pause control 1662A and a progress bar control 1662B.
• the electronic device 200 with the delayed propagation permission can respond to a user operation for triggering real-time sharing (FIG. 16E takes the user operation as sliding a knuckle on the display screen along a specific "W"-shaped trajectory as an example).
• the description of the electronic device 200 serving as a sharing device to share the audio stream/video stream in real time to other devices is similar to the description of the electronic device 100 serving as a sharing device to share the audio stream/video stream in real time.
• For example, the electronic device 200 displays the sharing object selection interface in response to the user operation for triggering real-time sharing.
• the electronic device 200 that does not have the delayed propagation permission can, in response to a user operation for triggering real-time sharing (FIG. 16E takes the user operation as sliding a knuckle on the display screen along a specific "W"-shaped trajectory as an example), display a prompt message (indicating that the electronic device 200 does not have the delayed propagation permission).
  • the electronic device 200 that does not have the delayed propagation permission can also request the electronic device 100 to obtain the delayed propagation permission of the currently played file.
• the specific example is similar to FIG. 16A and FIG. 16B and will not be described again.
• Alternatively, the electronic device 200 that does not have the delayed propagation permission may not respond to the user operation for triggering real-time sharing, which is not limited in this application.
• the electronic device 100 can also automatically identify whether the shared data meets preset conditions. When the shared data meets the preset conditions, the electronic device 200 is not granted the saving permission and/or secondary propagation permission for the shared data.
  • the above-mentioned preset condition is that the shared data is application data of a preset application.
• the electronic device 100 can preset information of preset applications (which can be understood as a blacklist).
• the blacklist may include at least one of the following items of application information: application type, application name, package name, application ID, etc.
• the preset condition that the shared data is the application data of a preset application may include: the application information corresponding to the shared data is consistent with the application information in the blacklist.
  • the preset applications may include applications determined in response to user operations, or may include automatically recognized applications.
• For example, the electronic device 100 may identify the type of an application and set banking, payment, and other such types of applications as preset applications.
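• The blacklist check described above might look like the following sketch; AppInfo, Blacklist, and the matching rule (any one field consistent with a blacklist entry) are illustrative assumptions, not part of this application.

    // If the shared data belongs to a preset (blacklisted) application, the
    // saving and secondary-propagation permissions are withheld.
    data class AppInfo(val type: String, val name: String, val packageName: String, val appId: String)

    class Blacklist(private val entries: List<AppInfo>) {
        fun matches(app: AppInfo): Boolean = entries.any {
            it.type == app.type || it.packageName == app.packageName || it.appId == app.appId
        }
    }

    // Returns (savingPermission, secondaryPropagationPermission) for the shared data.
    fun grantPermissions(app: AppInfo, blacklist: Blacklist): Pair<Boolean, Boolean> {
        val restricted = blacklist.matches(app)
        return !restricted to !restricted
    }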
  • the above-mentioned preset condition is that the shared data includes preset content, and the preset content may include content determined in response to user operations, or may include automatically recognized content.
  • the preset content is, for example but not limited to, text type, picture type or video type.
• the preset content is, for example but not limited to, a user name, password, account name, login name, ID number, bank card number, account balance, etc.
  • the above example can realize the permission management of the shared device based on the shared content, effectively ensuring the privacy and security of the sharing user.
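• As a sketch of automatically recognizing preset content in text to be shared, simple pattern matching could be used; the patterns below (bank card number, ID number, password keywords) are illustrative examples only and are not specified by this application.

    // Patterns for preset (sensitive) content; illustrative only.
    val presetPatterns = listOf(
        Regex("""\b\d{16,19}\b"""),                       // bank card number
        Regex("""\b\d{17}[\dXx]\b"""),                    // ID number
        Regex("password|login name", RegexOption.IGNORE_CASE)
    )

    fun containsPresetContent(text: String): Boolean =
        presetPatterns.any { it.containsMatchIn(text) }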
• the electronic device 100 may determine at least one area on the display screen in response to a user operation, and the audio stream/video stream related to the determined area is used for real-time sharing.
  • the electronic device 100 may display a sharing menu, such as the user interface 1710 shown in FIG. 17A , in response to a touch operation (eg, a click operation) for the sharing control option 421 in the user interface 420 shown in FIG. 4B .
• the sharing menu 1711 in the user interface 1710 may include multiple options, for example, an option 1711A for sharing the audio stream/video stream of the current application (FIG. 17A takes a short video application as an example), an option 1711B for selecting an area for real-time sharing, an option 1711C for sharing the content displayed on the screen of the electronic device 100, an option 1711D for canceling/pausing/stopping real-time sharing, and an option 1711E for triggering more function options.
  • Option 1711B includes the characters "select area (grid)", which is used to select an area for real-time sharing through a grid method.
  • the electronic device 100 may display the user interface 1720 shown in FIG. 17B in response to a touch operation (eg, a click operation) for option 1711B in the user interface 1710 shown in FIG. 17A .
  • the electronic device 100 may move the dividing line in the playback window of the shared content in response to user operations.
  • the electronic device 100 may move the vertical dividing line 1721A to the left or right in response to a touch operation on the dividing line 1721A in the playback window 1721 included in the user interface 1720 shown in FIG. 17B .
• the user interface 1730 shown in FIG. 17C takes the touch operation as sliding to the left as an example, and shows the dividing line 1721A before movement and the dividing line 1721A after movement.
• Not limited to this, the horizontal dividing line in the play window 1721 can also be moved upward or downward.
  • the electronic device 100 may add a dividing line in the play window of the shared content in response to the user operation.
• the electronic device 100 may add a vertical dividing line and move it to the right or left in response to a touch operation on the left edge or the right edge of the playback window 1721 included in the user interface 1730 shown in FIG. 17C. For a specific example, see the user interface 1740 shown in Figure 17D.
  • Figure 17D takes the touch operation as sliding from the right edge of the screen to the middle of the screen (sliding to the left) as an example.
• the user interface 1740 shows the newly added vertical dividing line 1721D.
  • a horizontal dividing line may also be added in response to a touch operation on the upper edge or lower edge of the playback window 1721 .
• the electronic device 100 may delete a dividing line in the play window of the shared content in response to a user operation. For example, the electronic device 100 may move the vertical dividing line 1721A to the left or right edge of the screen in response to a touch operation (such as sliding left or right) on the dividing line 1721A in the playback window 1721 included in the user interface 1720 shown in FIG. 17B. At this time, the play window 1721 may no longer display the dividing line 1721A, which can be understood as deleting the dividing line 1721A. Not limited to this, a horizontal dividing line in the playback window can also be moved to the upper or lower edge of the screen to delete that dividing line.
  • the electronic device 100 may select any grid (as a real-time sharing area) in the playback window of the shared content in response to the user operation.
  • the electronic device 100 may select the grid in response to a touch operation (eg, a click operation, a double-click operation, or a long press operation) on a grid located in the middle of the playback window 1721 included in the user interface 1740 shown in FIG. 17D .
  • the electronic device 100 can display the user interface 1750 shown in FIG. 17E , and the middle grid 1721E in the playback window 1721 shown in the user interface 1750 is in a selected state.
  • the user interface 1750 also includes a completion control 1751, which is used to save the currently selected grid (ie, the above-mentioned grid 1721E) as a real-time sharing area.
• After the dividing lines are moved or added, the size and/or number of the grids included in the playback window will change. For example, the size of the six grids in the user interface 1720 shown in Figure 17B (before the dividing line is moved as shown in Figure 17C and a dividing line is added as shown in Figure 17D) is different from the size of the six grids in the user interface 1750 shown in Figure 17E (after the dividing line is moved as shown in Figure 17C and a dividing line is added as shown in Figure 17D).
• the electronic device 100 may select multiple grids (as areas for real-time sharing) in the playback window of the shared content in response to user operations. For example, the electronic device 100 may sequentially receive touch operations (such as a click operation, a double-click operation, or a long press operation) for the three grids at the bottom of the playback window 1721 included in the user interface 1740 shown in FIG. 17D, and select these three grids in response to these touch operations.
  • the electronic device 100 may also first receive a touch operation (such as a click operation, a double-click operation or a long press operation) for any one of the three grids, and select the grid.
• the electronic device 100 may display the user interface 1760 shown in Figure 17F, and the grid 1721F in the user interface 1760 is selected. As shown in FIG. 17F, after performing the above touch operation, the user can keep the finger touching the display screen of the electronic device 100 and slide left to the grid 1721G adjacent to the grid 1721F in the user interface 1760. The electronic device 100 can select the grid 1721G in response to the above user operation; at this time, the user interface 1770 shown in FIG. 17G can be displayed. The grid 1771 in the user interface 1770 is in a selected state, and the grid 1771 is obtained by merging the grid 1721F and the grid 1721G.
• As shown in FIG. 17G, the user can continue to keep the finger touching the display screen of the electronic device 100 and slide left to the grid 1721H adjacent to the grid 1721G.
  • the electronic device 100 can select the grid 1721H in response to the above user operation.
  • the user interface 1780 shown in Figure 17H can be displayed.
• the grid 1781 in the user interface 1780 is in a selected state, and the grid 1781 is obtained by merging the grid 1721F, the grid 1721G, and the grid 1721H.
• After the real-time sharing area is determined, the video stream/audio stream related to the area can be shared to other devices in real time.
• For example, the electronic device 100 may set the selected grid 1721E and grid 1781 as areas for real-time sharing in response to a touch operation (eg, a click operation) for the completion control 1751 in the user interface 1780 shown in FIG. 17H, and share the relevant video stream/audio stream to the electronic device 200 in real time.
  • the electronic device 200 can display the user interface 1790 shown in Figure 17I.
• the user interface 1790 is similar to the user interface 520 shown in Figure 5B. The difference is that the playback window 522 of the shared content shown in the user interface 1790 only displays the content 1791 in the above-selected grid 1721E and grid 1781, and does not display the content in other areas.
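• The grid selection described above can be illustrated with a small sketch: the dividing lines partition the play window into cells, and pixels outside the selected cells are blanked before the frame is shared. The Rect/gridCells/maskFrame helpers are hypothetical names, and the ARGB IntArray frame model is an assumption made for illustration.

    data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
        fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
    }

    // Partition a width x height window into grid cells using the positions of
    // the vertical and horizontal dividing lines.
    fun gridCells(width: Int, height: Int, vLines: List<Int>, hLines: List<Int>): List<Rect> {
        val xs = listOf(0) + vLines.sorted() + listOf(width)
        val ys = listOf(0) + hLines.sorted() + listOf(height)
        return ys.zipWithNext().flatMap { (top, bottom) ->
            xs.zipWithNext().map { (left, right) -> Rect(left, top, right, bottom) }
        }
    }

    // Blank every pixel that is not inside a selected cell (ARGB frame as IntArray).
    fun maskFrame(frame: IntArray, width: Int, height: Int, selected: List<Rect>) {
        for (y in 0 until height) for (x in 0 until width) {
            if (selected.none { it.contains(x, y) }) frame[y * width + x] = 0xFF000000.toInt()
        }
    }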
• In other examples, the electronic device 100 may display the user interface 1810 shown in Figure 18A in response to a touch operation (eg, a click operation) on the option 1711B in the user interface 1710 shown in FIG. 17A.
  • the user interface 1810 is similar to the user interface 420 shown in FIG. 4B .
• the short video play window 412 in the user interface 1810 also displays a check box 1811.
• the check box 1811 includes all the content displayed in the play window 412 by default.
• the area where the check box 1811 is located can be used as a real-time sharing area.
  • the electronic device 100 may adjust the size and/or position of the check box in the playback window of the shared content.
• For example, the electronic device 100 may receive a touch operation (such as sliding up and down, sliding left and right, or sliding diagonally) for the lower right corner of the check box 1811 in the user interface 1810 shown in FIG. 18A. FIG. 18B takes the touch operation as sliding from the lower right corner toward the upper left corner as an example, and the electronic device 100 can shrink the check box 1811 in response to the touch operation. The user interface 1820 shown in FIG. 18B shows the check box 1811 before adjustment and the check box 1811 after adjustment.
• the electronic device 100 may continue to receive a touch operation (such as sliding up and down, sliding left and right, or sliding diagonally) for the upper left corner of the check box 1811 in the user interface 1820 shown in FIG. 18B.
• FIG. 18C takes the touch operation as sliding from the upper left corner toward the lower right corner as an example, and the electronic device 100 can shrink the check box 1811 in response to the touch operation. The user interface 1830 shown in FIG. 18C shows the check box 1811 before adjustment and the check box 1811 after adjustment.
  • the user interface 1830 also includes a completion control 1831.
• the completion control 1831 is used to save the area where the current check box 1811 (i.e., the adjusted check box 1811) is located as a real-time sharing area.
  • the electronic device 100 can share the video stream/audio stream related to the area to other devices in real time.
• the electronic device 100 may, in response to a touch operation (such as a click operation) on the completion control 1831 in the user interface 1830 shown in FIG. 18C, set the area where the check box 1811 is located as a real-time sharing area, and share the related video stream/audio stream to the electronic device 200 in real time.
  • the electronic device 200 can display the user interface 1840 shown in Figure 18D.
• the user interface 1840 is similar to the user interface 520 shown in Figure 5B. The difference is that the playback window 522 of the shared content shown in the user interface 1840 only displays the content 1841 in the check box 1811, and does not display the content in other areas.
• the electronic device 100 may display the user interface 1910 shown in Figure 19A.
  • the user interface 1910 is similar to the user interface 1710 shown in FIG. 17A .
• the difference is that the sharing menu 1911 in the user interface 1910 does not include the option 1711B, but includes an option 1911A.
• the option 1911A includes the characters "select area (hand-drawn)", which is used to select the area for real-time sharing by hand drawing.
• the electronic device 100 may display the user interface 1920 shown in FIG. 19B in response to a touch operation (eg, a click operation) for the option 1911A in the user interface 1910 shown in FIG. 19A.
• the user interface 1920 is similar to the user interface 420 shown in FIG. 4B.
• the electronic device 100 may respond to a touch operation on the play window 412 of the short video in the user interface 1920 and select an area related to the touch operation from the play window 412.
• Figure 19B takes the touch operation as sliding in a clockwise direction as an example, and the area related to the touch operation is the area 1921 in the user interface 1920.
  • the user interface 1920 also includes a return control 1922 and a completion control 1923.
  • the return control 1922 is used to cancel the latest operation result, such as canceling the selection of the above-mentioned area 1921.
• the completion control 1923 is used to save the currently selected area (for example, the area 1921) as a real-time sharing area. Not limited to this, the user can also select multiple areas. For example, after FIG. 19B, the electronic device 100 can respond to another touch operation on the play window 412 of the short video and select the area related to that touch operation from the play window 412. Figure 19C takes the touch operation as sliding in a clockwise direction as an example, and the area related to the touch operation is the area 1931 in the user interface 1930 shown in Figure 19C.
  • the electronic device 100 can share the video stream/audio stream related to the area to other devices in real time.
• For example, the electronic device 100 may, in response to a touch operation (such as a click operation) on the completion control 1923 in the user interface 1930 shown in FIG. 19C, set the area 1921 and the area 1931 selected by the user's hand drawing as areas for real-time sharing, and share the relevant video stream/audio stream to the electronic device 200 in real time.
• At this time, the electronic device 200 may display the user interface 1940 shown in Figure 19D. The user interface 1940 is similar to the user interface 520 shown in FIG. 5B. The difference is that the playback window 522 of the shared content shown in the user interface 1940 only displays the content 1941 in the above area 1921 and the content 1942 in the above area 1931, and does not display the content in other areas.
• In other examples, the electronic device 100 may display the user interface 1950 shown in Figure 19E in response to a touch operation (such as a click operation) for the option 1911A in the user interface 1910 shown in FIG. 19A.
• the user interface 1950 is similar to the user interface 420 shown in FIG. 4B and includes the short video play window 412.
• the electronic device 100 may then receive touch operations (such as click operations) for positions 1951 and 1952 in the user interface 1950, and in response to these touch operations, display on the user interface 1950 a boundary line 1953 (in solid line form) whose endpoints are the positions 1951 and 1952.
• the electronic device 100 may receive a touch operation for the position 1954 in the user interface 1950, and in response to the touch operation, display a boundary line 1955 (in dotted line form, representing that the boundary line can still be adjusted) whose endpoints are the position 1952 and the position 1954.
  • the user can keep the finger touching the display screen and move from the position 1954 to the position 1956.
• the electronic device 100 can cancel the display of the boundary line 1955 on the user interface 1950 and display a boundary line 1957 (in solid line form) whose endpoints are the position 1952 and the position 1956, which can be understood as adjusting the boundary line 1955 to the boundary line 1957.
• the electronic device 100 may sequentially receive touch operations (eg, click operations) for positions 1961, 1962, and 1951 in the user interface 1960 shown in FIG. 19F, and in response to these touch operations, display on the user interface 1960: a boundary line 1963 (in solid line form) whose endpoints are the positions 1956 and 1961, a boundary line 1964 (in solid line form) whose endpoints are the positions 1961 and 1962, and a boundary line 1965 (in solid line form) whose endpoints are the positions 1962 and 1951.
  • the above-mentioned boundary lines 1953, 1957, 1963, 1964, and 1965 may constitute a (user-selected) area 1966 in the user interface 1960.
  • the user interface 1960 also includes a return control 1967 and a completion control 1968.
  • the return control 1967 is used to cancel the latest operation result, such as canceling the display of the above-mentioned boundary line 1965.
  • the electronic device 100 may respond to a touch operation (such as a click operation) on the completion control 1968, set the area 1966 selected by the user's hand drawing as a real-time sharing area, and share the relevant video stream/audio stream to the electronic device 200 in real time.
  • the electronic device 200 can display the user interface 1970 shown in Figure 19G.
• the user interface 1970 is similar to the user interface 520 shown in Figure 5B. The difference is that the playback window 522 of the shared content shown in the user interface 1970 only displays the content 1971 in the above area 1966, and does not display the content in other areas.
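• For the hand-drawn area, one plausible implementation is to close the traced boundary into a polygon and keep only the pixels inside it, using a standard ray-casting point-in-polygon test. The Point type and insidePolygon helper below are illustrative names, not part of this application.

    data class Point(val x: Double, val y: Double)

    // Ray casting: count how many polygon edges a horizontal ray from p crosses;
    // an odd count means p is inside the hand-drawn area.
    fun insidePolygon(p: Point, poly: List<Point>): Boolean {
        var inside = false
        var j = poly.lastIndex
        for (i in poly.indices) {
            val (xi, yi) = poly[i]
            val (xj, yj) = poly[j]
            val crosses = (yi > p.y) != (yj > p.y) &&
                p.x < (xj - xi) * (p.y - yi) / (yj - yi) + xi
            if (crosses) inside = !inside
            j = i
        }
        return inside
    }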
• In some embodiments, the electronic device 100 may not share any application data of a preset application in real time (for example, the interface used to display the shared content on the shared device is black).
• the electronic device 100 may preset information of preset applications (which can be understood as a blacklist). The blacklist may include at least one of the following items of application information: application name, package name, application ID, etc. When the electronic device 100 recognizes that the shared data corresponds to an application in the blacklist, the shared data may not be shared in real time (for example but not limited to: the electronic device 100 outputs a prompt message indicating that real-time sharing cannot be performed, or the window used to display the shared data on the shared device is black).
  • the preset applications may include applications determined in response to user operations, or may include automatically recognized applications.
• For example, the electronic device 100 may identify the type of an application and set banking, payment, and other such types of applications as preset applications.
• In some embodiments, the electronic device 100 may not share a certain interface of an application in real time (for example, when the shared device displays the shared content, the displayed interface is black if the video stream related to that interface is played, and the displayed interface is normal if other video streams are played); for example, when the electronic device 100 recognizes that the user interface to be shared includes preset content, the interface is not shared in real time.
• Similarly, the electronic device 100 may not share a certain area of the user interface in real time (the specific examples are similar to FIG. 19D and the related figures above, and will not be described again).
  • the preset content may include content determined in response to user operations, or may include automatically recognized content.
  • the preset content is, for example but not limited to, text type, picture type or video type.
• the preset content is, for example but not limited to, a user name, password, account name, login name, ID number, bank card number, account balance, etc.
• the electronic device 100 may determine at least one layer in the user interface in response to a user operation, and the audio stream/video stream related to the determined layer is used for real-time sharing.
  • the electronic device 100 may display a sharing menu, such as the user interface 2010 shown in FIG. 20A, in response to a touch operation (eg, a click operation) for the sharing control option 421 in the user interface 420 shown in FIG. 4B.
• the user interface 2010 is similar to the user interface 1710 shown in Figure 17A. The difference is that the sharing menu 2011 in the user interface 2010 does not include the option 1711B, but includes an option 2011A.
• the option 2011A includes the characters "select layer", which is used to trigger the selection of layers for real-time sharing.
  • the electronic device 100 may display a layer selection interface, such as the user interface 2020 shown in FIG. 20B, in response to a touch operation (eg, a click operation) for option 2011A in the user interface 2010 shown in FIG. 20A.
  • the user interface 2020 may include a layer schematic interface 2021.
• the layer schematic interface 2021 may include a layer 2021A, a layer 2021B, and a layer 2021C. These layers may be obtained by performing layer division on the play window 412 of the short video in the user interface 420 shown in Figure 4B (assuming that a floating window 312 in a collapsed state is also displayed).
• the layer 2021A may include the content of the short video application and the content of the floating window 312; the layer 2021B may include the content of the short video application (which can be understood as the specific content of the short video 1 played by the play window 412); and the layer 2021C may include the content of the short video application (which can be understood as the relevant controls of the short video application).
  • the electronic device 100 may select the layer in response to a touch operation (such as a click operation, a double-click operation or a long press operation) on any layer in the layer representation interface 2021 .
  • the electronic device 100 can share the audio stream/video stream related to the selected layer to other devices, for example, share the audio stream/video stream of the short video 1 related to the layer 2021B to the electronic device 200.
• the electronic device 200, for example, displays the user interface 1970 shown in Figure 19G.
  • the play window 522 in the user interface 1970 only displays the content 1971 in the layer 2021B, and does not display the content of other layers.
  • the layer selection interface displayed by the electronic device 100 may also be the user interface 2030 shown in FIG. 20C.
  • the user interface 2030 may include a layer schematic interface 2031.
• the layer schematic interface 2031 may include a layer 2031A and a layer 2031B. These layers may be obtained by performing layer division on a playback interface of a video application (a floating window of a short message application is displayed on the playback interface).
• the layer 2031A may include the video content 2031C in the video application and the content 2031D in the short message application; the layer 2031B may include content such as playback controls and progress bars in the video application.
  • the electronic device 100 may respond to a touch operation (such as a click operation, a double-click operation or a long press operation) on any layer in the layer representation interface 2031 to select the layer.
  • the electronic device 100 can share the audio stream/video stream related to the selected layer to other devices.
  • the layer selection interface displayed by the electronic device 100 may also be the user interface 2040 shown in FIG. 20D.
  • the user interface 2040 may include a layer schematic interface 2041.
• the layer schematic interface 2041 may include a layer 2041A, a layer 2041B, and a layer 2041C. These layers may be obtained by performing layer division on the split-screen interface of a video application and a short message application.
  • layer 2041A can include the content of the video application and the content of the short message application (which can be understood as including the content of the entire split-screen interface, and can also be understood as including the content in layer 2041B and layer 2041C).
• the layer 2041B may include the video content in the video application, and the layer 2041C may include the short messages in the short message application.
  • the electronic device 100 may respond to a touch operation (such as a click operation, a double-click operation or a long press operation) on any layer in the layer representation interface 2041 to select the layer.
  • the electronic device 100 can share the audio stream/video stream related to the selected layer to other devices.
• It is not limited to this. For example, the layer schematic interface 2031 in the user interface 2030 shown in Figure 20C may also include a layer 2031D, and the layer 2031D includes the content of the short message application. This application does not limit the layer division method.
• a layer may also include the content of more or fewer applications. For example, a layer may include only the system content of an electronic device (excluding the content of any application), or a layer may include the content of two or more applications. This application does not limit the content included in a layer.
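• Layer-based sharing can be sketched as filtering the composed layers by the user's selection before encoding the shared stream. The Layer type and the compose order below are illustrative assumptions, not an actual system API.

    // Only the layers the user selected are composited into the shared video stream.
    data class Layer(val id: String, val zOrder: Int, val owner: String /* app or system */)

    fun layersToShare(all: List<Layer>, selectedIds: Set<String>): List<Layer> =
        all.filter { it.id in selectedIds }.sortedBy { it.zOrder } // composite bottom-up

    fun main() {
        val layers = listOf(
            Layer("2021A", 2, "floating window"),
            Layer("2021B", 0, "short video content"),
            Layer("2021C", 1, "short video controls")
        )
        // Share only the video content layer, as in the FIG. 20B example.
        println(layersToShare(layers, setOf("2021B")))
    }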
• It can be seen from the above examples that real-time sharing can be performed by any application, by any area (regular or irregular), in full screen, and in other ways. Foreground applications, background applications, and non-running applications can all be shared in real time. That is to say, there are no restrictions on the shared content and the usage scenarios are wider, effectively meeting user needs and improving user experience.
• while the electronic device 100 shares the first content as a sharing device, any one of the other devices mentioned above can also serve as a sharing device to share the second content with the electronic device 100 and other devices, which means that two-way sharing can be achieved.
• the description of other devices serving as sharing devices for real-time sharing is similar to the above description of the electronic device 100 serving as a sharing device for real-time sharing. Some scenarios are exemplarily shown below, but there may also be other scenarios similar to the real-time sharing performed by the electronic device 100 as a sharing device, and the examples should not constitute a limitation.
• the electronic device 200 can switch the playback interface of the content shared in real time by the electronic device 100 (such as the playback window 522 in the user interface 520 shown in FIG. 5B) to background display (which can also be called switching the application corresponding to the real-time shared content to run in the background), and display the user interface of another application in the foreground (which can also be called running the other application in the foreground), for example, the user interface 2110 of the short video application shown in Figure 21A. The user interface 2110 may include a call control 2111 at the top and a short video playback window 2112.
  • the call control 2111 may indicate that the electronic device 200 is currently in a call state and the call duration is 36 seconds.
  • the play window 2112 is used to display the played short video, for example, short video 2 named “Topic 2" published by “User 2" is currently being played.
  • the electronic device 200 may respond to a user operation for triggering real-time sharing, such as a user operation in which the knuckles slide along a specific trajectory of “W” as shown in FIG. 21A , and display a selection interface for sharing objects and sharing content, such as that shown in FIG. 21B User interface 2120.
  • the user interface 2120 may include a list 2121 of selectable sharing content and a list 2122 of selectable sharing objects.
  • the list 2121 may include an option 2121A for sharing display content of a foreground application (FIG. 21B takes a short video application as an example).
  • the list 2122 may include an option 2122A indicating the call party (ie, the electronic device 100 whose communication number is “phone number 1”) and a plurality of options indicating nearby devices.
• the list of selectable sharing content and the list of selectable sharing objects are not limited to those shown in FIG. 21B. For example, the list of selectable sharing content may also include an option for sharing the content shared by the electronic device 100 in real time. Optionally, the electronic device 200 with the instant propagation permission may display this option, and the electronic device 200 that does not have the instant propagation permission may not display this option.
  • the selection interface for sharing objects and sharing content displayed by the electronic device 200 is the user interface 2130 shown in FIG. 21C.
  • the user interface 2130 is similar to the user interface 2120 shown in FIG. 21B.
• The difference is that the list 2121 of selectable sharing content also includes an option 2121D, and the characters displayed under the option 2121D are: Share "Watch Together". The option 2121D is used to share the content shared by the electronic device 100 in real time, such as the audio stream/video stream of the short video 1 played by the play window 522 in the user interface 520 shown in Figure 5B.
• the electronic device 200 may, in response to a touch operation (eg, a click operation) for the option 2122A in the user interface 2120 shown in FIG. 21B or the user interface 2130 shown in FIG. 21C, share in real time, to the electronic device 100 indicated by the option 2122A, the audio stream/video stream of the short video application indicated by the selected option 2121A (specifically, the audio stream/video stream of the short video 2). That is to say, while the electronic device 100 shares the audio stream/video stream of the short video 1 to the electronic device 200 in real time, the electronic device 200 can also share the audio stream/video stream of the short video 2 to the electronic device 100 in real time.
  • the electronic device 200 may play the audio stream/video stream of the short video 1 shared by the electronic device 100 in real time in the foreground, for example, displaying the user interface 520 shown in FIG. 5B .
  • the electronic device 100 can also play the audio stream/video stream of the short video 2 shared by the electronic device 200 in real time in the foreground, for example, displaying the user interface 2140 shown in Figure 21D.
  • the user interface 2140 can include a prompt box 2141 and a play window 2142.
  • the prompt box 2141 includes the characters "Watching the content shared by user B", and the play window 2142 is used to display the shared content (for example, the image displayed by the play window 2112 in the user interface 2110 shown in FIG. 21A).
• In other examples, the electronic device 100 and the electronic device 200 can both play the audio stream/video stream of the short video 2 shared by the electronic device 200 in real time in the foreground. For example, the electronic device 100 displays the user interface 2140 shown in Figure 21D, and the electronic device 200 displays the user interface 2110 shown in FIG. 21A.
• In other examples, the electronic device 100 and the electronic device 200 can both play the audio stream/video stream of the short video 1 shared by the electronic device 100 in real time in the foreground. For example, the electronic device 100 displays the user interface 420 shown in Figure 4B, and the electronic device 200 displays the user interface 520 shown in Figure 5B.
• In other examples, the electronic device 100 or the electronic device 200 can display, in split screen, the content shared by the electronic device 100 in real time and the content shared by the electronic device 200 in real time. For example, the electronic device 200 displays the user interface 2150 shown in Figure 21E.
• the user interface 2150 may include, in split screen, a playback window 2151 for the short video and a playback window 2152 for the content shared in real time by the electronic device 100. A control 2153 may be displayed between the playback window 2151 and the playback window 2152, and the control 2153 can be used to adjust the sizes of the display areas of the playback window 2151 and the playback window 2152.
  • the play window 2151 is used to display the image displayed by the play window 2112 in the user interface 2110 shown in Figure 21A.
  • a control 2151A is displayed in the play window 2151.
• the play window 2152 is used to display the image displayed by the play window 522 in the user interface 520 shown in FIG. 5B, and a control 2152A is displayed in the play window 2152. For the control 2152A, please refer to the description of the sharing control option 523 in the user interface 520.
  • the interface displayed by the electronic device 100 is similar to the user interface 2150 shown in FIG. 21E and will not be described again. This application does not limit the specific display method of two-way sharing.
• In other embodiments, the electronic device 100 and the electronic device 200 can simultaneously serve as sharing devices to share audio streams and/or video streams to the electronic device 400 in real time. The electronic device 400 can display, according to any of the above examples or in other ways, the content shared by the electronic device 100 in real time and/or the content shared by the electronic device 200 in real time. For example, the electronic device 400 can display the user interface 2150 shown in FIG. 21E; at this time, the play window 2151 in the user interface 2150 also includes the prompt information "Watching the content shared by user B".
• The two-way sharing between any two devices is similar to the above description of the two-way sharing between the electronic device 100 and the electronic device 200, and will not be described again.
• In some embodiments, when the electronic device 200 serves as a sharing device to share the content shared by the electronic device 100 in real time to the electronic device 100 and/or other devices, it may perform different operations based on whether it has the instant propagation permission. For example, the electronic device 200 with the instant propagation permission can share the content indicated by the option 2121D in real time in response to a touch operation (such as a click operation) on the option 2121D in the user interface 2130 shown in FIG. 21C.
• the electronic device 200 that does not have the instant propagation permission may, in response to a touch operation (such as a click operation) for the option 2121D in the user interface 2130 shown in FIG. 21C, request the electronic device 100 to grant the instant propagation permission for the content indicated by the option 2121D.
• Alternatively, the electronic device 200 that does not have the instant propagation permission can also directly display a prompt message indicating that it does not have the instant propagation permission, or simply not respond to the above touch operation.
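• The instant propagation gate on the receiving device can be summarized as the following sketch; the NoPermissionPolicy options mirror the three behaviors described above (request the permission, show a hint, or ignore the operation), and all names are illustrative.

    // What happens when the user taps the re-share option on the receiving device.
    enum class NoPermissionPolicy { REQUEST_PERMISSION, SHOW_HINT, IGNORE }

    fun onReshareTapped(
        hasInstantPropagation: Boolean,
        policy: NoPermissionPolicy,
        share: () -> Unit,             // start real-time re-sharing
        requestPermission: () -> Unit, // ask the sharing device for the permission
        showHint: () -> Unit           // "no instant propagation permission" prompt
    ) {
        if (hasInstantPropagation) { share(); return }
        when (policy) {
            NoPermissionPolicy.REQUEST_PERMISSION -> requestPermission()
            NoPermissionPolicy.SHOW_HINT -> showHint()
            NoPermissionPolicy.IGNORE -> Unit // do not respond to the touch operation
        }
    }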
• In one implementation, the electronic device 100 may enable the real-time sharing function described above by default. In another implementation, the electronic device 100 can enable the real-time sharing function described above in response to user operations. The following illustrates some user interfaces for setting the real-time sharing function.
• the electronic device 100 may display a user interface 2210. The user interface 2210 includes a setting name 2211 (including the characters "watch/listen together"), which can indicate that the user interface 2210 is a setting interface for the real-time sharing function.
  • the electronic device 100 may display the user interface 2210 in response to a touch operation (such as a click operation) for the "Watch/Listen Together" option under the "More Connections" option in the settings menu.
  • the user interface 2210 may include a function name 2212 (including the characters "watch together, listen together”).
  • a corresponding switch control 2212A is also displayed on the right side of the function name 2212. The switch control 2212A is used to turn on or off the real-time sharing function indicated by the function name 2212.
  • the switch control 2212A can be understood as the master switch of the real-time sharing function.
  • the user interface 2210 also includes a setting menu of the sharing object: a plurality of setting options displayed under the title 2213 (including the characters "Share Menu"), such as setting options 2214 (including the characters "Allow watching/listening with the call party") and setting options 2215 (includes the characters "Allow watching/listening with nearby devices").
  • a corresponding switch control 2214A is also displayed on the right side of the setting option 2214.
  • the switch control 2214A is used to turn on or off the function of real-time sharing through NewTalk such as operator calls/OTT calls, etc. indicated by the setting option 2214.
  • a corresponding switch control 2215A is also displayed on the right side of the setting option 2215.
• the switch control 2215A is used to turn on or off the function, indicated by the setting option 2215, of real-time sharing through near field communication technology. It is not limited to this: the setting menu may also include an option indicating the function of real-time sharing through satellite, an option indicating the function of real-time sharing with vehicle-mounted devices, etc., or may include only the setting option 2214 or the setting option 2215, which is not limited in this application.
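• The switches described above amount to a master switch plus per-channel sub-switches; a minimal sketch of the corresponding state, with illustrative field names, follows. Sharing over a channel is allowed only when both the master switch and that channel's switch are on.

    data class ShareSettings(
        var master: Boolean = true,            // switch control 2212A
        var withCallParty: Boolean = true,     // switch control 2214A (NewTalk)
        var withNearbyDevices: Boolean = true  // switch control 2215A (near field)
    )

    fun canShare(settings: ShareSettings, viaNewTalk: Boolean): Boolean =
        settings.master && if (viaNewTalk) settings.withCallParty else settings.withNearbyDevices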
• the electronic device 100 may display a user interface 2220. The user interface 2220 includes a setting name 2221 (including the characters "New Talk"), which can indicate that the user interface 2220 is a setting interface for the NewTalk function.
  • the electronic device 100 may display the user interface 2220 in response to a touch operation (such as a click operation) for the "NewTalk" option in the setting menu under the "Phone” option.
  • the user interface 2220 may include a function name 2222 (including the characters "New Talk").
• a corresponding switch control 2222A is also displayed on the right side of the function name 2222. The switch control 2222A is used to turn on or off the NewTalk function indicated by the function name 2222, and can be understood as the master switch of the NewTalk function.
  • the user interface 2220 also includes information about multiple sub-functions in the NewTalk function, such as sub-function 2223, sub-function 2224 and sub-function 2225.
  • the sub-function 2223 includes a function name: "Smartly increases call quality", and a corresponding function description 2223A is also displayed under the sub-function 2223 (including the characters "allow the use of cellular mobile data to ensure a smooth call experience").
  • the sub-function 2224 includes a function name: "Information sharing during a call", and a corresponding function description 2224A is also displayed under the sub-function 2224 (including the characters "allow receiving information during a call, such as pictures, locations, files, links, etc.”).
  • the sub-function 2224 is implemented, for example, through options 312B and 312C included in the floating window 312 in the call interface 310 shown in FIG. 3(A).
• the sub-function 2225 includes a function name: "watch/listen together", and a corresponding function description 2225A is also displayed under the sub-function 2225 (including the characters "allow both parties who initiate or accept the call to watch and listen together, etc."), which is used to indicate the function of real-time sharing through NewTalk such as operator calls/OTT calls. The function is implemented, for example, through the option 312D included in the floating window 312 in the call interface 310 shown in (A) of FIG. 3.
  • any sub-function in the NewTalk function can be turned on or off independently.
• the electronic device 100 can display a user interface 2230, and the user interface 2230 includes a setting name 2231 (including the characters "New Talk"). For example, the electronic device 100 may display the user interface 2230 in response to a touch operation (such as a click operation) for the "New Talk" option in the settings menu under the "Phone" option.
  • the user interface 2230 may include options for multiple sub-functions in the NewTalk function, such as sub-function 2232, sub-function 2233 and sub-function 2234.
  • the sub-function 2232 includes a function name: "Call Quality Enhancement”, and a corresponding description is also displayed on the lower side. 2232A (including the characters "After turning on, calls are allowed to use cellular mobile data to ensure a smooth call experience. The actual traffic consumed is subject to operational statistics"), and the corresponding switch control 2232B is also displayed on the right. Switch control 2232B is used to turn on Or turn off sub-feature 2232.
• the sub-function 2233 includes the function name "Allow receiving information during calls", and a corresponding description 2233A is also displayed on the lower side (including the characters "When turned on, it will allow receiving information during calls, such as pictures, locations, files, links, etc."). A corresponding switch control 2233B is also displayed on the right side, and the switch control 2233B is used to turn on or off the sub-function 2233.
• the sub-function 2234 includes the function name "Allow both parties who initiate or receive the call to watch/listen together", and a corresponding description 2234A is also displayed on the lower side (including the characters "When turned on, both parties who initiate or receive the call will be allowed to watch/listen together, etc."). A corresponding switch control 2234B is also displayed on the right side, and the switch control 2234B is used to turn on or off the sub-function 2234, that is, the function of real-time sharing through NewTalk such as operator calls/OTT calls.
  • the electronic device 100 can display a user interface 2240.
• the user interface 2240 includes a setting name 2241 (including the characters "Huawei Share"), and a corresponding description 2241A is also displayed under the setting name 2241 (including the characters "No need for traffic, extremely fast sharing of pictures, videos, applications, files, etc. with nearby devices"), which can indicate that the user interface 2240 is the setting interface for the Huawei Share (which may also be called instant sharing in this application) function.
  • the electronic device 100 may display the user interface 2240 in response to a touch operation (such as a click operation) for the "Huawei Share” option under the "More Connections" option in the settings menu.
• the user interface 2240 may include a function name 2242 (including the characters "Huawei Share") and a function name 2243 (including the characters "Allow to obtain Huawei account permissions"). A corresponding function description 2242A is displayed under the function name 2242 (including the characters "This service uses Bluetooth and WLAN for data and multimedia streaming, uses NFC for device touch, calls storage permissions to read or save shared files, and watch and listen together. Even if Bluetooth, WLAN, and NFC are turned off, Huawei Share can continue to use the Bluetooth, WLAN, and NFC functions. By turning on the switch, you agree to the above content"), which can represent the Huawei Share function of sharing files and sharing audio streams/video streams in real time through near field communication technologies such as Bluetooth, WLAN, and NFC. A corresponding switch control 2242B is also displayed on the right side of the function name 2242. The switch control 2242B is used to turn on or off the Huawei Share function indicated by the function name 2242, and can be understood as the master switch of the Huawei Share function.
  • the Huawei sharing function indicated by the function name 2242 is implemented, for example, through the control 814B/control 814D in the user interface 810 shown in FIG. 8A .
• a corresponding function description 2243A is also displayed below the function name 2243 (including the characters "Allows the local Huawei account nickname and avatar to be obtained and cached in the sender's device so that the sender can more easily identify you"). A corresponding switch control 2243B is also displayed on the right side, and the switch control 2243B is used to turn on or off the function indicated by the function name 2243. The user name shown in the list 622 of selectable sharing objects in the user interface 620 shown in FIG. 6B can be obtained through the function indicated by the function name 2243.
  • the function of real-time sharing through Huawei Share can be turned on or off independently of the Huawei Share function.
• For example, the electronic device 100 can display a user interface 2250. The user interface 2250 is similar to the user interface 2240 shown in Figure 22D. The difference is that the function description 2251 displayed under the function name 2242 in the user interface 2250 is different.
• the function description 2251 includes the characters "This service uses Bluetooth and WLAN for data and multimedia streaming, uses NFC for device touch, and calls storage permissions to read or save shared files. Even if Bluetooth, WLAN, and NFC are turned off, Huawei Share can continue to use the Bluetooth, WLAN, and NFC functions."
• the user interface 2250 also includes a function name 2252 (including the characters "Allow watching and listening together with nearby devices"), and a corresponding function description 2252A is also displayed below the function name 2252 (including the characters "Allow watching and listening together with nearby devices through Huawei Share"), which can represent the function of real-time sharing through Huawei Share, such as the function of real-time sharing through near field communication technologies such as Bluetooth, WLAN, and NFC.
  • a corresponding switch control 2252B is also displayed on the right side of the function name 2252. The switch control 2252B is used to turn on or off the function indicated by the function name 2252.
  • the function indicated by the function name 2252 is implemented, for example, through the control 814B/control 814D in the user interface 810 shown in FIG. 8A.
  • the real-time sharing scenario may also include “play together”.
• The following describes the real-time sharing scenario of "playing together", taking the electronic device 100 and the electronic device 200 that are conducting NewTalk such as an operator call/OTT call as an example.
• the electronic device 100 may respond to a user operation for triggering real-time sharing, for example, a touch operation for the sharing option 312D in the call interface 310 shown in (A) of FIG. 3, and send a "play together" request to the call counterparty, i.e., the electronic device 200.
  • the electronic device 100 and the electronic device 200 can display the game interface at the same time.
• For example, the electronic device 100 can display the user interface 2310 shown in (1) of Figure 23A, and the electronic device 200 can display the user interface 2320 shown in (2) of FIG. 23A.
  • the user interface 2310 may include a call icon 2311 at the top and a game window 2312.
  • the call icon 2311 may indicate that the electronic device 100 is currently in a call state and the call duration is 33 seconds.
  • the keyboard 2312D may include a determination control 2312E, which is used to submit the content in the input box 2312C as the answer corresponding to the question information 2312B to the review device, so that the review device reviews whether the answer is correct.
• the game window 2312 may also include a control option 2313 and a switching option 2314. The control option 2313 is used to trigger the display of a control menu, and the control menu includes, for example but not limited to, an option for pausing/exiting "Play Together". The switching option 2314 is used to switch the content included in the question information 2312B.
  • the user interface 2320 is similar to the user interface 2310, except that the question information 2321A in the game window 2321 shown in the user interface 2320 is different from the question information 2312B in the user interface 2310.
  • In other examples, the question information displayed by the electronic device 100 and the electronic device 200 may be the same; for example, the question information in the user interface 2320 may be the question information 2312B in the user interface 2310.
  • the electronic device 100 may receive the character "38" input by the user in the input box 2312C in the user interface 2310 shown in (1) of FIG. 23A, and receive a touch operation (for example, a click operation) on the confirmation control 2312E in the user interface 2310. In response to the touch operation, the electronic device 100 may send the content in the input box 2312C (i.e., the character "38") to the server.
  • when the answer is correct, the server can instruct the electronic device 100 to display prompt information indicating that the current game is won, and instruct the electronic device 200 to display prompt information indicating that the current game is lost.
  • the electronic device 100 can display the user interface 2330 shown in (1) of Figure 23B
  • the electronic device 200 can display the user interface 2340 shown in (2) of Figure 23B.
  • the user interface 2330 is similar to the user interface 2310 shown in (1) of FIG. 23A .
  • the difference is that the input box 2312C in the game window 2312 shown in the user interface 2330 displays the character "38" input by the user, and prompt information 2331 is also displayed in the game window 2312.
  • the prompt information 2331 includes the characters "I win” and is used to indicate victory in the current game.
  • the user interface 2340 is similar to the user interface 2320 shown in (2) of Figure 23A.
  • the difference is that the game window 2321 shown in the user interface 2340 also displays prompt information 2341.
  • the prompt information 2341 includes the characters "the opponent wins" and is used to indicate that the current game is lost.
  • the electronic device 100 may request the server to obtain the game content of a new round of the game in response to a touch operation (e.g., a click operation) on the switching option 2314 in the user interface 2330 shown in (1) of FIG. 23B. After receiving the request, the server may send new game content, such as question information in the game window, to the electronic device 100 and the electronic device 200.
  • the electronic device 100 may also request the server to obtain the game content of a new round of the game within a preset time period (for example, 10 seconds) after displaying the prompt information 2331 in the user interface 2330 shown in (1) of FIG. 23B. It is not limited to the above examples.
  • the electronic device 200 may also request the server to obtain the game content of a new round of the game, which is not limited in this application.
  • In the above example, the review device is a server. In other examples, the review device may also be the electronic device 100, the electronic device 200, or another network device.
  • In one example, the review device is the electronic device 100.
  • the electronic device 100 can determine by itself whether the content in the input box 2312C (i.e., the character "38") in the user interface 2330 shown in (1) of Figure 23B is the answer corresponding to the question information 2312B in the user interface 2330. When the judgment result is yes, the electronic device 100 may display prompt information indicating that the current game is won, and instruct the electronic device 200 to display prompt information indicating that the current game is lost.
  • In another example, the review device is the electronic device 200.
  • the electronic device 100 can send the content in the input box 2312C (i.e., the character "38") in the user interface 2330 shown in (1) of FIG. 23B to the electronic device 200, and the electronic device 200 determines whether the content is the answer corresponding to the question information 2312B. When the judgment result is yes, the electronic device 200 can display prompt information indicating that the current game is lost, and instruct the electronic device 100 to display prompt information indicating that the current game is won.
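  • The review flow described above can be illustrated with a minimal sketch, shown below. The names Submission, ReviewDevice, and the notification callback are assumptions for illustration only; this application does not define such an interface.

```kotlin
// Hypothetical sketch of the review-device logic described above.
data class Submission(val deviceId: String, val questionId: String, val answer: String)

class ReviewDevice(private val answers: Map<String, String>) {
    // Checks a submitted answer and notifies both parties of the result.
    fun review(sub: Submission, opponentId: String, notify: (String, String) -> Unit) {
        if (answers[sub.questionId] == sub.answer) {
            notify(sub.deviceId, "I win")            // e.g. prompt information 2331
            notify(opponentId, "the opponent wins")  // e.g. prompt information 2341
        }
    }
}

fun main() {
    val reviewer = ReviewDevice(mapOf("q1" to "38"))
    reviewer.review(Submission("device 100", "q1", "38"), "device 200") { device, msg ->
        println("$device shows prompt: $msg")
    }
}
```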
  • the device that provides "play together” game content is a server. It can be understood that the server is a master device/sharing device, and the electronic device 100 and the electronic device 200 are slave devices/shared devices. Not limited thereto, in other examples, the device that provides "play together” game content may also be the electronic device 100, the electronic device 200, or other network devices.
  • the electronic device 100 may share the game content with the electronic device 200 in real time through, but not limited to, the following three sharing methods:
  • Method 1 Do not transmit multimedia data streams such as audio streams/video streams that can be directly output by the shared device, but only transmit game data.
  • the game data is, for example, the question information 2321A shown in Figure 23A and Figure 23B. It is not limited to this; the game data may also include game status data such as scores, which is not limited in this application.
  • Method 2 Transmit multimedia data streams such as audio streams/video streams that can be directly output by the shared device, but do not carry an obscuring canvas.
  • In this case, the electronic device 100 may display a new layer (which can be called a canvas) over the question information 2312B so that the user cannot see the question information 2312B, and the electronic device 100 cancels displaying the canvas after a preset transmission duration (a sketch of this timing appears after this list).
  • the preset transmission duration may be the sum of the delay for the electronic device 100 to send the multimedia data stream to the electronic device 200 and the processing delay before the electronic device 200 plays the multimedia data stream (for example, the decoding and rendering delay). The preset transmission duration is, for example, a measured value, an average value, or an estimated value obtained by the electronic device 100 within a preset measurement duration (e.g., during the latest transmission process).
  • Method 3 Transmit multimedia data streams such as audio streams/video streams that can be directly output by the shared device, and also carry an obscuring canvas.
  • In this case, the electronic device 100 and the electronic device 200 can display the canvas over the question information 2312B and the question information 2321A shown in FIG. 23A and FIG. 23B before the preset game start time, and cancel displaying the canvas at the preset game start time.
  • In Method 1, the data transmission volume of the device is small, and the requirements on the network environment, such as traffic and bandwidth, are low. Method 1 is therefore well suited to scenarios with little traffic or poor network quality, reducing the data transmission volume and the power consumption of the device.
  • the electronic device 100 may provide game data for the electronic device 200, and the electronic device 200 may provide game data for the electronic device 100.
  • The source of the game data is not limited in this application.
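  • The canvas timing of Method 2 above can be sketched as follows (this is the sketch referenced in Method 2). The class and callback names are hypothetical; a real implementation would hook into the device's rendering pipeline.

```kotlin
// Hypothetical sketch of Method 2's canvas timing: the sharing device covers
// the question locally and uncovers it after the estimated one-way delay, so
// that both parties see the question at roughly the same time.
import kotlin.concurrent.thread

class CanvasController(
    private val measuredSendDelaysMs: List<Long>, // delays measured during recent transmissions
    private val peerProcessingDelayMs: Long       // estimated decode + render delay on the peer
) {
    // Preset transmission duration = average measured send delay + peer processing delay.
    fun presetTransmissionDurationMs(): Long =
        measuredSendDelaysMs.average().toLong() + peerProcessingDelayMs

    fun showQuestionWithCanvas(showCanvas: () -> Unit, hideCanvas: () -> Unit) {
        showCanvas() // block the question information locally
        thread {
            Thread.sleep(presetTransmissionDurationMs())
            hideCanvas() // uncover once the stream should have reached the peer
        }
    }
}

fun main() {
    val controller = CanvasController(listOf(80L, 120L, 100L), peerProcessingDelayMs = 40L)
    controller.showQuestionWithCanvas(
        showCanvas = { println("canvas shown over the question information") },
        hideCanvas = { println("canvas hidden after ${controller.presetTransmissionDurationMs()} ms") }
    )
    Thread.sleep(300) // keep the demo alive until the timer fires
}
```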
  • the slave device/shared device can browse and operate the game, but it does not run the game; instead, it runs the "Play Together" playback window. It is not limited to this. In other examples, the slave device/shared device can also run the game based on the received game content, for example, when a complete game application is received, which is not limited in this application.
  • the game content is not limited to the above examples; in other examples, other games can also be "played together".
  • the electronic device 100 can display the user interface 2350 shown in (1) of Figure 23C.
  • the electronic device 200 may display the user interface 2360 shown in (2) of Figure 23C.
  • the game window 2351 in the user interface 2350 may include a window 2351A for displaying the game content of the electronic device 100, a window 2351B for displaying the game content of the game opponent (i.e., the electronic device 200), a game score 2351C, and prop information 2351D of the electronic device 100.
  • the game window 2361 in the user interface 2360 may include a window 2361A for displaying the game content of the electronic device 200, a window 2361B for displaying the game content of the game opponent (i.e., the electronic device 100), a game score 2361C, and prop information 2361D of the electronic device 200.
  • the electronic device 100 or the electronic device 200 can send the updated game content on the device to the call counterparty, so that the call counterparty can update the displayed user interface.
  • For example, the electronic device 200 may send the content of the window 2361A in the user interface 2360 to the electronic device 100, for the electronic device 100 to update the display content of the window 2351B in the user interface 2350.
  • the electronic device 200 can directly send the game content to the electronic device 100, or can first send the game content to the server, which then forwards it to the electronic device 100 (which can be called indirect transmission). Similarly, the electronic device 100 may directly or indirectly send the content of the window 2351A in the user interface 2350 to the electronic device 200, for the electronic device 200 to update the display content of the window 2361B in the user interface 2360. Not limited to this: for example, when the game score of the electronic device 100 changes, the latest game score can be sent directly or indirectly to the electronic device 200, for the electronic device 200 to update the display content of the game score 2361C in the user interface 2360. Similarly, when the game score of the electronic device 200 changes, the latest game score can be sent to the electronic device 100 directly or indirectly, so that the electronic device 100 updates the display content of the game score 2351C in the user interface 2350.
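  • The direct and indirect (server-forwarded) transmission of updated game content described above can be sketched as follows. Transport, DirectLink, ServerRelay, and GameUpdate are illustrative names, not interfaces defined by this application.

```kotlin
// Hypothetical sketch of direct vs. indirect game-state synchronization.
data class GameUpdate(val windowContent: String?, val score: Int?)

sealed interface Transport { fun send(update: GameUpdate) }

class DirectLink(private val deliver: (GameUpdate) -> Unit) : Transport {
    override fun send(update: GameUpdate) = deliver(update) // peer-to-peer
}

class ServerRelay(private val forwardToPeer: (GameUpdate) -> Unit) : Transport {
    // Indirect transmission: the server receives the update and forwards it.
    override fun send(update: GameUpdate) = forwardToPeer(update)
}

fun main() {
    val peerUi = { u: GameUpdate -> println("peer updates window/score: $u") }
    val direct: Transport = DirectLink(peerUi)
    val indirect: Transport = ServerRelay(peerUi)

    direct.send(GameUpdate(windowContent = "window 2361A frame", score = null))
    indirect.send(GameUpdate(windowContent = null, score = 120)) // e.g. a score change
}
```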
  • the real-time sharing scene may also include "editing together”.
  • the real-time sharing scenario of "editing together" will be explained below by taking the electronic device 100 and the electronic device 200 that are performing NewTalk, such as an operator call/OTT call, as an example.
  • the electronic device 100 (user A) can serve as a sharing device to share the content of a document (for example, in Word format) in real time with the call counterparty: the electronic device 200 (user B).
  • the electronic device 100 and the electronic device 200 can display the specific content of the document at the same time.
  • the electronic device 100 may display the user interface 2410 shown in (1) of FIG. 24A
  • the electronic device 200 may display the user interface 2420 shown in (2) of FIG. 24A .
  • the user interface 2410 may include an editing window 2411 of document 1.
  • the editing window 2411 may include the specific content of document 1 and an editing function list 2411A.
  • the editing function list 2411A may include, for example, a control for saving the document, a control for undoing the most recent input, a control for restoring the most recently undone input, a control for exiting editing, etc.
  • User interface 2420 is similar to user interface 2410 and also includes an editing window 2421 of document 1.
  • in response to a touch operation on the text 2411B ("Text 1") in the editing window 2411, the electronic device 100 may display a cursor 2411C on the right side of the text 2411B and an edit mark 2411D in the area where the text 2411B is located. The cursor 2411C and the edit mark 2411D are used to indicate that user A is currently using the electronic device 100 to edit the text 2411B.
  • Accordingly, the electronic device 200 can display an edit mark 2421A and prompt information 2421B (including the characters "User A is editing simultaneously") in the area where the text 2411B is located in the editing window 2421, to indicate that the other party (user A) is currently editing the text 2411B.
  • Similarly, when user B edits the text 2421C on the electronic device 200, a cursor 2421D can be displayed on the right side of the text 2421C in the editing window 2421, and an edit mark 2421E can be displayed in the area where the text 2421C is located.
  • Correspondingly, the area where the text 2421C is located in the editing window 2411 may display an edit mark 2411E and prompt information 2411F (including the characters "User B is editing simultaneously").
  • the electronic device 100 or the electronic device 200 can send the updated document content on the device to the call counterparty, so that the call counterparty can update the displayed document content.
  • For example, if user A changes the text 2411B in the user interface 2410 shown in (1) of Figure 24A from "Text 1" to "Text 1 includes", the text 2411B in the user interface 2420 shown in (2) of Figure 24A will also be updated to "Text 1 includes".
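  • A minimal sketch of the co-editing synchronization described above follows: a device sends its editing position (and the updated content, when the text actually changes), and the peer shows an edit mark/prompt and refreshes its copy. EditEvent and SharedDocument are hypothetical names.

```kotlin
// Hypothetical sketch of co-editing synchronization between two devices.
data class EditEvent(
    val userName: String,
    val position: String,        // e.g. an identifier for text 2411B
    val updatedContent: String?  // null when only the editing position changed
)

class SharedDocument(private val content: MutableMap<String, String>) {
    fun apply(event: EditEvent, showMark: (String) -> Unit) {
        // Show the edit mark / prompt for the remote editor.
        showMark("${event.userName} is editing simultaneously at ${event.position}")
        // Update the local copy only when content actually changed.
        event.updatedContent?.let { content[event.position] = it }
    }
    fun text(position: String): String? = content[position]
}

fun main() {
    val docOnPeer = SharedDocument(mutableMapOf("text2411B" to "Text 1"))
    // User A starts editing: position only, no content change yet.
    docOnPeer.apply(EditEvent("User A", "text2411B", null)) { println(it) }
    // User A changes the text; the peer updates its copy.
    docOnPeer.apply(EditEvent("User A", "text2411B", "Text 1 includes")) { println(it) }
    println(docOnPeer.text("text2411B")) // Text 1 includes
}
```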
  • the document may also be in table (excel) format.
  • the electronic device 100 may display a user interface 2440.
  • the user interface 2440 may include an editing window 2441 of Table 1. The editing window 2441 may include the specific content of Table 1 and an editing function list.
  • a cursor 2441B is displayed on the right side of the content 2441A in the editing window 2441, and an edit mark 2441C is displayed in the area where the content 2441A is located, which are used to indicate that user A is currently using the electronic device 100 to edit the content 2441A.
  • the area where the content 2441D in the editing window 2441 is located displays an edit mark 2441E and prompt information 2441F (including the characters "User B is editing simultaneously"), which are used to indicate that the other party (user B) is currently editing the content 2441D.
  • the interface displayed by the electronic device 200 is similar to the user interface 2440, and the specific description is similar to that of FIG. 24B, and will not be described again.
  • the document may also be in PPT format.
  • the electronic device 100 may display a user interface 2430.
  • the user interface 2430 may include an editing window 2431 of PPT1.
  • the editing window 2431 may include a display window 2432 of slide content and a list 2433 of slide content included in PPT1.
  • Option 2433A in the list 2433 is selected, which can indicate that the display window 2432 is used to display the slide content indicated by option 2433A.
  • a cursor 2432B is displayed on the right side of the content 2432A in the display window 2432, and an edit mark 2432C is displayed in the area where the content 2432A is located, which are used to indicate that user A is currently using the electronic device 100 to edit the content 2432A.
  • the area where the content 2432D in the display window 2432 is located displays an edit mark 2432E and prompt information 2432F (including the characters "User B is editing simultaneously"), which are used to indicate that the other party (user B) is currently editing the content 2432D.
  • the interface displayed by the electronic device 200 is similar to the user interface 2430, and the specific description is similar to that of Figure 24C, and will not be described again.
  • the electronic device 100 is used to provide "edited together" documents. It can be understood that the electronic device 100 is the master device/sharing device, and the electronic device 200 is the slave device/shared device.
  • the slave device/shared device can browse and edit the document, but it does not run the document; instead, it runs the "edit together" playback window. It is not limited to this. In other examples, the slave device/shared device can also run the document based on the received document content, for example, when a complete document is received, which is not limited in this application.
  • the electronic device 100 can also share drawings, whiteboards, annotations, etc. to the electronic device 200 in real time.
  • For example, user A can input content 1 in the drawing window/whiteboard displayed by the electronic device 100, and the drawing window/whiteboard displayed by the electronic device 200 can display the content 1 input by user A. It is not limited to this; content can also be deleted or modified.
  • This application does not limit the specific editing method.
  • For example, user A can add annotations to the video stream displayed by the electronic device 100, and the electronic device 100 can send the video stream and the annotation content together as sharing data to the electronic device 200 for display, to facilitate communication between the sharing user and the shared user.
  • This application does not limit the content shared.
  • The sharing data may also include user operation events (for example, touch operations) and related information (such as the time when a touch operation occurred).
  • the sharing device and the shared device may also be electronic devices configured with a foldable display screen (which may be called a folding screen); such devices may be referred to as foldable electronic devices.
  • For example, layer 2041B and layer 2041C in the user interface 2040 shown in Figure 20D can be displayed on the two display screens of the foldable electronic device respectively; for another example, the play window 2151 and the play window 2152 in the user interface 2150 shown in Figure 21E can be displayed on the two display screens of the foldable electronic device respectively.
  • Based on the above embodiments, the sharing method involved in this application is introduced below.
  • This method can be applied to the sharing system 10 shown in Figure 1A, Figure 1B, Figure 1C, or Figure 2E.
  • Figure 25 is a schematic flowchart of a sharing method provided by an embodiment of the present application.
  • The sharing device can, but is not limited to, perform the following steps:
  • S11 The sharing device displays the sharing entrance.
  • the sharing device can perform a real-time sharing process in response to a user operation on the sharing portal.
  • a user operation please refer to the description of S12-S17.
  • The user operation can be understood as a user operation used to trigger the real-time sharing function/real-time sharing process.
  • For example, the sharing portal is the sharing option 312D included in the call interface 310 shown in (A) of FIG. 3 or in the floating window 312 in the user interface 410 shown in FIG. 4A, and the user operation used to trigger the real-time sharing function is, for example, a touch operation (such as a click operation) on the sharing option 312D.
  • the sharing portal is the user interface 610 of the short video application shown in FIG. 6A.
  • the user operation for triggering the real-time sharing function is, for example, a touch operation on the user interface 610.
  • the touch operation is, for example, a sliding operation such as a single-finger slide, a multi-finger slide, or a knuckle slide (for example, the knuckle shown in Figure 6A slides along a specific "W"-shaped trajectory).
  • the sharing entrance is the sharing control 712B in the user interface 710 of the multitasking list/multitasking window shown in FIG. 7A
  • the user operation used to trigger the real-time sharing function is, for example, a touch operation on the sharing control 712B (for example, click operation).
  • the sharing portal is the real-time sharing control 814B or the control 814D in the user interface 810 shown in FIG. 8A
  • the user operation used to trigger the real-time sharing function is, for example, a touch operation (such as a click operation) on the control 814B or the control 814D.
  • S12 The sharing device selects the target sharing content.
  • the sharing device can determine the target sharing content according to preset rules.
  • the sharing device can determine the target sharing content according to the sharing portal: a multimedia data stream of an application related to the sharing portal.
  • For example, when the sharing device receives a touch operation on the sharing option 312D included in the floating window 312 in the user interface 410 shown in FIG. 4A, since the user interface 410 is a user interface of a short video application, the sharing device can determine the target sharing content to be the multimedia data stream of the short video application.
  • For another example, when the sharing device receives a touch operation on the user interface 610 shown in FIG. 6A, since the user interface 610 is a user interface of a short video application, the sharing device can determine the target sharing content to be the multimedia data stream of the short video application.
  • For another example, when the sharing device receives a touch operation on the sharing control 712B in the user interface 710 shown in FIG. 7A, since the sharing control 712B is a control related to the window 712 of the short video application in the user interface 710, the sharing device can determine the target sharing content to be the multimedia data stream of the short video application.
  • the sharing device can determine the target sharing content in response to the user operation.
  • the sharing device can display a selection interface for sharing content, and the sharing device can, in response to a user operation on any sharing content in the selection interface, determine that the sharing content is the target sharing content.
  • the user interface 620 shown in FIG. 6B is a selection interface for sharing content.
  • the list 621 in the user interface 620 shows multiple selectable options for sharing content.
  • The multiple sharing contents may be the multimedia data stream of a foreground application (such as a short video application), the display content on the screen of the electronic device 100 (sharing device), and the multimedia data stream of a background application (such as a video application).
  • the user interface 1230 shown in FIG. 12C is a selection interface for sharing content.
  • the list 1231 in the user interface 1230 shows multiple selectable options for sharing content.
  • The multiple sharing contents may be the multimedia data stream of a foreground application (such as a short video application), the multimedia data stream of a background application (such as a video application), and the multimedia data stream of an application (such as a music application) that is not running on the electronic device 100 (sharing device).
  • S13 The sharing device selects the target sharing object (i.e., the shared device).
  • the sharing device may first discover devices/objects that can be selected/shared in real time, and then select the target sharing object from the discovered devices/objects. Among them, the sharing device discovers selectable/real-time shareable devices/objects through communication technologies such as, but not limited to, cellular communication technology, near field communication technology, satellite communication technology, D2D, etc.
  • the sharing device can determine the target sharing object according to preset rules.
  • the sharing device can determine the target sharing object according to the sharing portal: a device related to the sharing portal.
  • For example, the electronic device 100 (sharing device) receives a touch operation on the sharing option 312D included in the floating window 312 in the user interface 410 shown in FIG. 4A. The floating window 312 is a control related to NewTalk (for details, please refer to the description of (A) of FIG. 3), and the electronic device 100 is currently performing NewTalk with the electronic device 200. Therefore, the electronic device 100 can determine that the target sharing object is the call counterparty: the electronic device 200.
  • the sharing device can determine the target sharing object in response to the user operation.
  • the sharing device can display a selection interface for the sharing object, and the selection interface can include the discovered devices/objects that are selectable/shareable in real time.
  • the sharing device may respond to a user operation on any sharing object in the selection interface and determine that the sharing object is the target sharing object.
  • the user interface 620 shown in FIG. 6B is a selection interface for sharing objects.
  • the list 622 in the user interface 620 shows the options of multiple selectable sharing objects.
  • The multiple sharing objects may include the call counterparty and at least one nearby device.
  • the user interface 1110 shown in FIG. 11A is a selection interface for sharing objects.
  • the list 1111 in the user interface 1110 shows the options of multiple selectable sharing objects.
  • The multiple sharing objects may include multiple call counterparties and at least one nearby device.
  • the user interface 1120 shown in FIG. 11B is a selection interface for sharing objects.
  • the list 1121 in the user interface 1120 shows the options of multiple selectable sharing objects.
  • The multiple sharing objects may include at least one recent contact and at least one nearby device.
  • the user interface 1130 shown in FIG. 11C is a selection interface for sharing objects.
  • the list 1131 in the user interface 1130 shows the options of multiple selectable sharing objects.
  • The multiple sharing objects may include contacts and at least one nearby device; a specific example of the contacts can be seen in Figure 11D.
  • The order of S12 and S13 is not limited; for example, they may be executed simultaneously.
  • the sharing device may first display a real-time sharing mode selection interface.
  • the sharing device may display a selection interface for sharing content and/or sharing objects in response to a user operation on any real-time sharing mode in the mode selection interface (the displayed sharing content and/or sharing objects are related to the real-time sharing mode).
  • the user interface 1210 shown in FIG. 12A is a real-time sharing mode selection interface.
  • the sharing device may display the user interface 1220 shown in FIG. 12B in response to the user operation for the watch together option 1211A in the user interface 1210, and the list 1221 in the user interface 1220 shows a plurality of options for sharing content that can be viewed.
  • list 1222 in user interface 1220 shows options for multiple devices that can display images.
  • the sharing device may display the user interface 1230 shown in FIG. 12C in response to the user operation for the listen together option 1211B in the user interface 1210, and the list 1231 in the user interface 1230 shows a plurality of options for sharing content that can be listened to.
  • list 1232 in user interface 1230 shows options for multiple devices that can play audio.
  • the sharing device may determine the real-time sharing mode based on the received user operation for triggering the real-time sharing function, and then display the selection interface for sharing content and/or sharing objects (where the displayed sharing content and/or sharing objects are related to the real-time sharing mode).
  • For example, the selection interface for sharing content is the user interface 1220 shown in Figure 12B; for another example, the selection interface for sharing content is the user interface 1230 shown in Figure 12C.
  • S14 The sharing device selects the target communication link.
  • the target communication link may include, but is not limited to, one or more of link 1 to link 6 and the V2X link shown in Figure 2E.
  • the sharing device can determine the target communication link according to preset rules.
  • the sharing device may determine the target communication link based on the target sharing object.
  • the target communication link may be a link related to the call link established between the electronic device 100 and the electronic device 200, for example, the NewTalk link or an auxiliary link.
  • the sharing device may determine the target communication link according to the sharing portal to be: a link related to the sharing portal.
  • the target communication link may be a link related to the instant sharing function, such as a Wi-Fi link or a BT link.
  • the sharing device may determine the target communication link in response to a user operation.
  • the sharing device may display the user interface 1310 shown in FIG. 13. The user interface 1310 may include an option 1311A to share with contacts, an option 1311B to share with a Wi-Fi device, and an option 1311C to share with a Bluetooth device.
  • the target communication link corresponding to option 1311A is, for example, a NewTalk link or an auxiliary link
  • the target communication link corresponding to option 1311B is, for example, a Wi-Fi link
  • the target communication link corresponding to option 1311C is, for example, a Bluetooth link.
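  • The selection of the target communication link in S14 can be sketched as follows, assuming a preset rule that derives the link from the selected sharing object; the enum values and the rule itself are illustrative assumptions.

```kotlin
// Hypothetical sketch of target-communication-link selection (S14).
enum class LinkType { NEWTALK_MAIN, AUXILIARY, WIFI, BLUETOOTH }

enum class ShareTarget { CALL_COUNTERPARTY, CONTACT, WIFI_DEVICE, BT_DEVICE }

// Preset rule: derive the link from the selected target sharing object.
fun selectTargetLink(target: ShareTarget, auxiliaryAvailable: Boolean): LinkType =
    when (target) {
        ShareTarget.CALL_COUNTERPARTY ->
            if (auxiliaryAvailable) LinkType.AUXILIARY else LinkType.NEWTALK_MAIN
        ShareTarget.CONTACT -> LinkType.NEWTALK_MAIN // e.g. option 1311A
        ShareTarget.WIFI_DEVICE -> LinkType.WIFI     // e.g. option 1311B
        ShareTarget.BT_DEVICE -> LinkType.BLUETOOTH  // e.g. option 1311C
    }

fun main() {
    println(selectTargetLink(ShareTarget.CALL_COUNTERPARTY, auxiliaryAvailable = true))
    println(selectTargetLink(ShareTarget.WIFI_DEVICE, auxiliaryAvailable = false))
}
```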
  • The order of S14 and S11-S13 is not limited.
  • S13 and S14 may be executed simultaneously.
  • S15 The sharing device and the shared device establish a target communication link.
  • The order of S15 and S11-S13 is not limited; for example, S15 may have been executed before S11.
  • For example, the target communication link is a far-field Wi-Fi link; in this case, a sharing device and a shared device in different local area networks can establish a far-field Wi-Fi link.
  • For another example, the target communication link is a near-field Wi-Fi link; in this case, a sharing device and a shared device connected to the same Wi-Fi signal source in the same local area network can establish a near-field Wi-Fi link.
  • S16 The sharing device captures the shared data.
  • the sharing device can capture sharing data related to the target sharing content.
  • the target sharing content is the multimedia data stream of Application 1
  • the sharing device can capture the layers and other content of Application 1 to generate the multimedia data stream (shared data) of Application 1 such as images and/or audio.
  • the target sharing content is the display content of the screen of the sharing device and/or related audio data
  • the sharing device can capture content such as the layers displayed on the sharing device to generate a system-level multimedia data stream (sharing data) such as images and/or audio.
  • the target sharing content may not be the data output by the device in the foreground or background, but the data not output by the device.
  • For example, the sharing device can receive the broadcast data of a channel sent by a base station through a 3G/4G/5G/6G broadcast channel and, without outputting the broadcast data, use the broadcast data as sharing data for real-time sharing.
  • the sharing device may not capture the application-level and/or system-level multimedia data of the device as sharing data, but may generate sharing data related to the target sharing content and send it to the shared device. For example, assuming that the type of the target shared content is a game, the sharing device can generate game-type sharing data and send it to the shared device.
  • the sharing device can also capture received user operation events and related information (such as occurrence time).
  • the sharing device can capture user operation events and related information through an interface provided by the system (for example, the interface is provided for application integration and calling).
  • The interface includes, for example, but is not limited to, at least one of the following: a discovery (Discovery) interface (for example, used to discover members (Member)), a link management (Link Manager, LinkMgr) interface, and a transmission (Transmit) interface (for example, used to send (send) and/or receive (receive, recv) data).
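  • A hypothetical Kotlin rendering of the system-provided interfaces named above (Discovery, LinkMgr, Transmit) follows; the method signatures are assumptions for illustration, as this application does not define them.

```kotlin
// Hypothetical signatures for the interfaces named in the text; these are
// illustrative assumptions, not APIs defined by this application.
interface Discovery {
    fun discoverMembers(): List<String>          // discover members (Member)
}

interface LinkMgr {
    fun establish(member: String): Boolean       // link management
    fun release(member: String)
}

interface Transmit {
    fun send(member: String, payload: ByteArray) // send
    fun recv(member: String): ByteArray?         // receive (recv)
}
```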
  • this application does not limit the specific content of the shared data.
  • After being captured, the sharing data can be encoded, packetized, and split into streams, and the processed sharing data can then be sent to the shared device, that is, used to perform S17.
  • S17 The sharing device sends sharing data to the shared device.
  • the sharing device may send the sharing data to the shared device through the target communication link. Understandably, since real-time sharing is performed between the sharing device and the shared device, the sharing data is actually a data stream. Therefore, during real-time sharing, the sharing device can continuously send the sharing data stream (such as an audio stream/video stream) to the shared device.
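  • The S16/S17 pipeline (capture, encode, packetize, send over the target communication link) can be sketched as follows. The encoding step is a stub; a real implementation would use platform media codecs. All names are illustrative.

```kotlin
// Hypothetical sketch of the S16/S17 pipeline on the sharing device side.
fun interface Link { fun send(packet: ByteArray) }

class SharePipeline(private val link: Link, private val mtu: Int = 1200) {

    private fun encode(frame: ByteArray): ByteArray = frame // codec stands in here

    // Packetize: split an encoded frame into MTU-sized packets.
    private fun packetize(encoded: ByteArray): List<ByteArray> =
        encoded.toList().chunked(mtu).map { it.toByteArray() }

    fun onFrameCaptured(frame: ByteArray) {
        packetize(encode(frame)).forEach(link::send) // S17: send to the shared device
    }
}

fun main() {
    val pipeline = SharePipeline({ packet -> println("sent ${packet.size} bytes") })
    pipeline.onFrameCaptured(ByteArray(3000)) // e.g. one captured video frame
}
```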
  • the sharing device can also capture multimedia data related to any area on the screen of the sharing device and send it to the shared device.
  • the sharing device can determine the area to be shared in response to user operations. For specific examples, see Figures 17A-17I, 18A-18D, and 19A-19G.
  • the sharing device can also capture multimedia data related to any layer on the screen of the sharing device and send it to the shared device.
  • the sharing device can determine the layer to be shared in response to the user operation. For specific examples, see FIG. 20A-FIG. 20D.
  • the shared device can, but is not limited to, perform the following steps:
  • S21 The shared device receives the sharing request.
  • the shared device may continuously monitor whether a sharing request is received.
  • In one implementation, the shared device can accept the sharing request according to preset rules. For example, when the sharing device is a device that the shared device is communicating with, has communicated with, or has discovered, the shared device can accept the sharing request by default. In another implementation, the shared device can also accept the sharing request in response to a user operation. For example, after the electronic device 200 (the shared device) receives the sharing request sent by the electronic device 100 (the sharing device), it can display the prompt information 511 in the user interface 510 shown in FIG. 5A, and the electronic device 200 can accept the sharing request in response to a touch operation (such as a click operation) on the acceptance control 511B in the prompt information 511. After the shared device accepts the sharing request sent by the sharing device, it can establish the target communication link with the sharing device.
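  • A minimal sketch of the S21 acceptance logic described above, assuming a preset rule for known peers and a user prompt otherwise; the names are hypothetical.

```kotlin
// Hypothetical sketch of S21: accept by preset rule, or else ask the user.
enum class PeerStatus { COMMUNICATING, COMMUNICATED, DISCOVERED, UNKNOWN }

fun handleSharingRequest(
    peer: PeerStatus,
    askUser: () -> Boolean // e.g. show prompt 511 and wait for control 511B
): Boolean = when (peer) {
    PeerStatus.COMMUNICATING, PeerStatus.COMMUNICATED, PeerStatus.DISCOVERED -> true
    PeerStatus.UNKNOWN -> askUser()
}

fun main() {
    val accepted = handleSharingRequest(PeerStatus.UNKNOWN) {
        println("show prompt information; user taps the acceptance control")
        true
    }
    println("sharing request accepted: $accepted")
}
```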
  • the order of S21 and any of the above-mentioned S11-S16 is not limited.
  • For example, the sharing device can determine the target sharing object to be the electronic device 200 (shared device) indicated by option 622A (i.e., perform S13) and send a sharing request to the electronic device 200, and the electronic device 200 can receive the sharing request (i.e., perform S21).
  • S22 The shared device establishes the target communication link with the sharing device. In one implementation, S22 and the above-mentioned S15 are executed simultaneously.
  • the order of S22 and S21 is not limited.
  • S23 The shared device receives the sharing data sent by the sharing device.
  • After the target communication link is established, the shared device may execute S23.
  • After the sharing data is received by the shared device, the sharing data can be aggregated, unpacked, and decoded. The processed sharing data can then be output to the user, that is, used to perform S24.
  • S24 The shared device outputs shared data.
  • the shared device can display the images in the shared data through the display screen and/or play the audio in the shared data through the speaker.
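  • The S23/S24 receive path (aggregate, unpack, decode, output) can be sketched as follows; the fixed frame size and the stub decoder are simplifying assumptions.

```kotlin
// Hypothetical sketch of the receive path on the shared device side.
class ReceivePipeline(
    private val frameSize: Int,
    private val output: (ByteArray) -> Unit // display on screen / play on speaker
) {
    private val buffer = mutableListOf<Byte>()

    private fun decode(encoded: ByteArray): ByteArray = encoded // codec stands in here

    fun onPacket(packet: ByteArray) {
        buffer.addAll(packet.toList())          // aggregate received packets
        while (buffer.size >= frameSize) {      // unpack one complete frame
            val frame = buffer.subList(0, frameSize).toByteArray()
            repeat(frameSize) { buffer.removeAt(0) }
            output(decode(frame))               // S24: output the shared data
        }
    }
}

fun main() {
    val pipeline = ReceivePipeline(frameSize = 3000) { println("output ${it.size} bytes") }
    repeat(3) { pipeline.onPacket(ByteArray(1200)) } // 3 packets -> 1 frame output
}
```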
  • For specific examples, see Figure 5B, Figure 14A, Figure 14B, and Figure 14C. This application does not limit the way the shared device outputs the sharing data.
  • the sharing device can also send a sharing request for the shared device to other devices connected to the shared device. After receiving the sharing request, the other devices can output prompt information, and the user can accept or reject the sharing request for the shared device through the above-mentioned other devices. For specific examples, see Figure 14D.
  • the sharing device and the shared device may not directly establish a communication link, but may establish a communication link through a third-party device "relay" and transmit the sharing data through the third-party device "relay". For specific examples, see Figure 14D.
  • any device performing real-time sharing can receive user operations when displaying the sharing data, and process the sharing data in response to the user operations, such as setting certain content to the editing state, updating content, etc.
  • the device can send processing information (such as the edited position, the updated content, and information related to the updated content) to the other devices performing real-time sharing, so that the other devices can update the sharing data they display.
  • For example, the shared data is game content.
  • When the electronic device 100 determines that the content in the input box 2312C (i.e., the character "38") is the answer corresponding to the question information 2312B in the user interface 2330, prompt information 2331 can be displayed, and information indicating that the current game is lost can be sent to the electronic device 200 (which can be understood as information related to the updated content).
  • Accordingly, the electronic device 200 can display the prompt information 2341 in the user interface 2340.
  • For another example, the shared data is game content.
  • the electronic device 100 may update the window 2351A and the game score 2351C in the user interface 2350 in response to the user operation, and send the updated content to the electronic device 200.
  • the electronic device 200 may display the window 2361B in the user interface 2360 according to the updated window 2351A, and display the game score 2361C in the user interface 2360 according to the updated game score 2351C.
  • For another example, the shared data is document 1 in Word format.
  • When user A edits the text 2411B, the electronic device 100 can send the current editing position (the text 2411B) to the electronic device 200 (since the characters included in the text 2411B are not currently modified, the updated content may not be sent). Therefore, the electronic device 200 may display the edit mark 2421A and the prompt information 2421B in the area where the text 2411B is located in the user interface 2420.
  • the shared data can also be documents in other formats. For specific examples, see Figure 24B and Figure 24C.
  • the sharing data may not be provided by the sharing device, but may be provided by a network device such as a server.
  • In this case, the sharing device can be understood as a device that initiates real-time sharing but does not provide the sharing data.
  • the sharing device can send a sharing request to the network device, and the network device sends the sharing data to the shared device based on the sharing request, where the network device is, for example, an application server of the application corresponding to the sharing data.
  • the network device can also send sharing data to the sharing device.
  • the sharing data sent by the network device to the sharing device and the sharing data sent to the shared device can be the same or different. For example, as shown in Figure 23A-Figure 23B, the server can send different question information to the electronic device 100 and the electronic device 200 respectively.
  • In this case, the game window 2312 shown in the user interface 2310 displayed by the electronic device 100 is different from the game window 2321 shown in the user interface 2320 displayed by the electronic device 200 (the question information therein is different).
  • the server can also serve as the review device to verify whether the answer sent by the electronic device 100 or the electronic device 200 is correct. This application does not limit the device that provides the sharing data.
  • the sharing device can manage the shared device, for example, cancel real-time sharing to a certain device (which can also be called deleting the device). For specific examples, see FIG. 10A-FIG. 10B.
  • the sharing device can change the shared content. For specific examples, see FIG. 10A-FIG. 10B.
  • the sharing device can set relevant permissions of the shared device based on the shared content, such as but not limited to saving permissions and forwarding permissions. For specific examples, see Figures 15A-15D and 16A-16E.
  • When the sharing device shares first content with the shared device in real time, the shared device can also share second content with the sharing device in real time; that is to say, two-way sharing can be achieved. The description of real-time sharing from the shared device to the sharing device is similar to the above description of real-time sharing from the sharing device to the shared device and will not be repeated. For specific examples, see Figure 21A-Figure 21E.
  • the electronic device can enable the real-time sharing function in the above implementation by default. In another implementation, the electronic device can enable the real-time sharing function in the above implementation in response to the user's operation. For specific examples, see FIG. 22A-FIG. 22E.
  • Figure 25 takes real-time sharing between one sharing device and one shared device as an example. In other examples, the sharing device can share with multiple shared devices in real time; for the description of real-time sharing between the sharing device and any one of the multiple shared devices, see the description of Figure 25.
  • This application allows the sharing device and one or more shared devices, such as call counterparties and nearby devices, to realize real-time sharing and collaboration functions such as watching together, listening together, playing together, and editing together through one user operation on the sharing portal. It provides a more concise and convenient sequence of user operations, solving the problem of being unable to share in real time in operator call and near field communication scenarios. There is no need to install chat applications, conference applications, or the applications to be shared, and there is no need to adapt the applications to be shared. This greatly broadens the application scenarios, allowing users to quickly share the multimedia data stream of any application, any area, and any layer, effectively meeting user needs and improving user experience. Moreover, real-time sharing can reduce the possibility of secondary transmission and improve the protection of user privacy and security.
  • the sharing device may send the first image/video captured by the camera and the second image/video shared in real time (which may be an application-level and/or system-level image/video) to the shared device for display/playback together, allowing the shared user to watch the real-time shared content and the actual scene of the other party at the same time, meeting the user's personalized needs.
  • the sharing device can send the first audio collected by the microphone and the second audio shared in real time (which can be application-level/system-level/background audio) to the shared device to play together, that is, to implement mixed playback, allowing the shared user to listen to the real-time shared audio and the other party's voice at the same time, meeting the user's personalized needs.
  • the transmission methods of the first audio and the second audio may include, but are not limited to, the following three methods:
  • Method 1 As shown in Figure 26A, on the sharing device side, after the sharing device collects the first audio through the microphone, it can perform 3A processing on the collected first audio to obtain the processed first audio, where the 3A processing can include acoustic echo cancellation (AEC), adaptive noise suppression (ANS), and automatic gain control (AGC).
  • the sharing device can also obtain the shared second audio (for example, capture and generate the second audio). This application does not limit the order in which the sharing device obtains the processed first audio and obtains the second audio.
  • the sharing device may mix the processed first audio and the acquired second audio, and uniformly encode the mixed audio (which may be referred to as mixed encoding) to obtain the third audio.
  • the sharing device can send the third audio to the shared device. On the side of the shared device, the shared device can directly decode and play the third audio without separating the third audio.
  • Method 2 As shown in Figure 26B, the processing method on the sharing device side is the same as in Method 1. The difference is that, on the shared device side, the shared device can separate and decode the third audio to obtain the first audio and the second audio, and the shared device can perform 3A processing on the first audio. The shared device can then play the 3A-processed first audio and the second audio at the same time.
  • Method 3 As shown in Figure 26C, the processing method on the sharing device side is similar to that in Method 1. The difference is that the sharing device does not mix and encode the processed first audio and the acquired second audio, but encodes them separately. Moreover, the separately encoded first audio and second audio can be transmitted to the shared device through different links.
  • Accordingly, the shared device can decode the received first audio and second audio respectively, perform 3A processing on the decoded first audio, and play the 3A-processed first audio and the decoded second audio at the same time.
  • the respectively encoded first audio and second audio can also be transmitted to the shared device through the same link.
  • the shared device can perform unified noise reduction on the received first audio and second audio (for example, on the third audio obtained by mixing and encoding the first audio and the second audio). In another implementation, the shared device may perform noise reduction only on the received first audio, but not on the second audio. This application does not limit the specific method of noise reduction.
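  • Method 1 above (3A processing, mixing, and mixed encoding on the sharing device side) can be sketched as follows. The AEC/ANS/AGC stages are stubs, since real 3A processing requires dedicated DSP algorithms; the sample-wise mixing with clamping is one common design choice, not a requirement of this application.

```kotlin
// Hypothetical sketch of Method 1: 3A-process the microphone audio, mix it
// with the shared second audio, then encode the mix once ("mixed encoding").
typealias Pcm = ShortArray

fun aec(x: Pcm): Pcm = x // acoustic echo cancellation (stub)
fun ans(x: Pcm): Pcm = x // adaptive noise suppression (stub)
fun agc(x: Pcm): Pcm = x // automatic gain control (stub)

fun threeA(mic: Pcm): Pcm = agc(ans(aec(mic)))

// Mix two PCM frames sample by sample, clamping to the 16-bit range.
fun mix(a: Pcm, b: Pcm): Pcm =
    Pcm(minOf(a.size, b.size)) { i ->
        (a[i] + b[i]).coerceIn(Short.MIN_VALUE.toInt(), Short.MAX_VALUE.toInt()).toShort()
    }

fun encode(frame: Pcm): ByteArray = ByteArray(frame.size) // codec stands in here

fun main() {
    val firstAudio = threeA(Pcm(160))   // microphone frame after 3A processing
    val secondAudio = Pcm(160)          // real-time shared (application) audio
    val thirdAudio = encode(mix(firstAudio, secondAudio)) // mixed encoding
    println("third audio: ${thirdAudio.size} bytes to send to the shared device")
}
```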
  • Figure 27 illustrates an architectural schematic diagram of yet another sharing system 10.
  • the sharing system 10 shown in FIG. 27 can be applied to the scenario of real-time sharing through NewTalk.
  • the following example uses real-time audio sharing as an example.
  • the electronic device 100 and the electronic device 200 in the sharing system 10 can perform real-time sharing through NewTalk such as watching together, listening together, playing together, and editing together. It is not limited to the unicast scenario in the above example. In other examples, more devices can be shared in real time, that is, it can be applied to multicast or broadcast scenarios, which is not limited in this application.
  • the description of the electronic device 200 is similar.
  • the application system of the electronic device 100 can be divided into three layers, which are the application framework layer, the hardware abstraction layer and the kernel layer from top to bottom.
  • the application framework layer includes sharing module, NewTalk function module, communication management module, audio framework module and multi-path transmission management module.
  • the hardware abstraction layer includes the radio interface layer (RIL), audio abstraction module, communication map and auxiliary link module.
  • the kernel layer includes a mobile interface module and an audio core module. Among them:
  • the communication management module is used to manage NewTalk's answering and hanging up functions. It is not limited to this. In some examples, the communication management module can also be used to manage related functions of short messages and Internet calls, which is not limited in this application.
  • the NewTalk function module can interact with the RIL through the communication management module to implement NewTalk between the electronic device 100 and the electronic device 200 .
  • RIL is an interface layer used to connect/interact with wireless communication systems.
  • the communication management module can interact with RIL.
  • the communication management module can interact with RIL through the NewTalk service module in the kernel layer.
  • the RIL may interact with the cellular communication system in the wireless communication system of the electronic device 100 through the mobile interface module.
  • the mobile interface module includes, for example, a mobile station modem (MSM) interface and a module for managing attention (AT) commands, where the attention command (AT) instruction set can be sent from a terminal equipment (TE) or data terminal equipment (DTE) to a terminal adapter (TA) or data circuit terminal equipment (DCE). The TE or DTE can control the functions of the mobile station (MS) and interact with network services by sending attention (AT) commands.
  • the audio framework module, audio abstraction module and audio core module are responsible for managing audio functions at the application framework layer, hardware abstraction layer and kernel layer respectively.
  • the audio framework module can interact with the audio core module through the audio abstraction module, and the audio core module can interact with the digital signal processing module in the wireless communication system to implement the audio processing process.
  • the audio framework module can also be called the audio framework (Audio Framework), and the audio abstraction module can also be called the audio hardware layer (Audio Hardware Layer, Audio HAL).
  • the audio core module can be the advanced sound architecture (ALSA) and/or the core (CORE) layer of the ALSA system on chip (ASoC).
  • ALSA can provide audio and musical instrument digital interface (MIDI) support.
  • ASoC can be built on ALSA.
  • ASoC can rely on the standard ALSA driven framework.
  • ALSA CORE can provide logical device system calls upwards and drive hardware devices downwards.
  • Logical devices include, but are not limited to, PCM devices, control (CTL) devices, MIDI devices, and timer (Timer) devices. Hardware devices include, but are not limited to, machine (Machine) devices, I2S devices, direct memory access (DMA) devices, and codec (codec) devices.
  • the digital signal processing module in the wireless communication system is, for example, an audio digital signal processing (ADSP) system (for example, used for audio decoding).
  • the digital signal processing module includes, for example, a PCM module.
  • the multi-path transmission management module can be responsible for establishing connections and transmitting data through multiple different paths (for example, it is called Quad Network+), and is responsible for efficiently transmitting data based on multiple paths (for example, it is called Huawei Public Cloud Network Plane (Huawei Open Network, HON), among which HON can be integrated into the future minimalist network of cloud services, integrating the synergy advantages of terminals, pipes, and cloud to build an optimal network communication experience).
  • Communication maps may include general communication maps and, optionally, personalized communication maps.
  • the communication map can be used for predictive link building, including but not limited to predicting whether to establish a communication link, the time to establish the communication link, the type of communication link to be established, the location to establish the communication link, etc.
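  • Predictive link building driven by the communication map can be sketched as follows: given a predicted location and time, the device looks up whether (and which) link to pre-establish. The map structure and the lookup rule are illustrative assumptions.

```kotlin
// Hypothetical sketch of communication-map-driven predictive link building.
data class MapEntry(val location: String, val hourOfDay: Int, val preferredLink: String)

class CommunicationMap(private val records: List<MapEntry>) {
    // Predict the link to establish for an expected location and time.
    fun predictLink(location: String, hourOfDay: Int): String? =
        records.firstOrNull { it.location == location && it.hourOfDay == hourOfDay }
            ?.preferredLink
}

fun main() {
    val map = CommunicationMap(
        listOf(MapEntry(location = "area A", hourOfDay = 20, preferredLink = "Wi-Fi"))
    )
    // Predicted to arrive in area A around time A (20:00): pre-establish Wi-Fi.
    val link = map.predictLink("area A", 20)
    println(link?.let { "pre-establish $it link" } ?: "establish link on demand")
}
```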
  • the NewTalk function module can use the audio framework module, audio abstraction module, audio core module, and digital signal processing module to process the real-time shared audio stream.
  • the processed real-time shared audio stream can be sent to the cellular communication module through the digital signal processing module, and the cellular communication module can simultaneously transmit the NewTalk call data stream and the real-time shared audio stream to the electronic device 200 .
  • the real-time shared audio stream can also be transmitted to the electronic device 200 through other communication modules such as a Bluetooth communication module, a satellite communication module, or a Wi-Fi communication module in the wireless communication module.
  • the NewTalk function module can interact with the auxiliary link module to establish an auxiliary link with the electronic device 200, and the auxiliary link can be used to transmit the real-time shared audio stream.
  • Network device 300 may include an authentication module.
  • the authentication module is used to provide identity information, which can be user-level identity information (such as access token (AT)) or device-level identity information (such as Huawei certificate).
  • the NewTalk function module of the electronic device 100 can obtain the identity information of the electronic device 100 through the authentication module of the network device 300.
  • For example, the authentication module of the network device 300 can provide corresponding identity information for the electronic device 100 logged in with a Huawei account. Not limited to this; in some examples, the authentication module is also used to wake up an electronic device in an idle state or a dormant state.
  • the NewTalk function module of the electronic device 100 can authenticate the identity information (such as the above-mentioned access token (AT) or Huawei certificate) through the addressing module of the network device 300. After the authentication is passed, the network device 300 can generate a P2P-TOKEN for the electronic device 100, and the P2P-TOKEN can be used for NAT traversal or NAT relay. It is not limited to this.
  • the addressing module of the network device 300 can also be used by both parties in the call to exchange their respective Session IDs.
  • the addressing module of the network device 300 can also be used to interface with a push (PUSH) server, and wake up electronic devices in idle or dormant states through the PUSH server.
  • the awakened electronic device can be connected to the network device 300, and the authentication and addressing of the identity information are implemented through the authentication module and addressing module of the network device 300.
  • the NewTalk data stream (also called call data stream) can be transmitted through the NewTalk link shown in Figure 27.
  • the NewTalk link can be called the main link.
  • the description of the NewTalk link can be found in the description of link 1 in Figure 2E.
  • the communication link used to transmit the real-time shared multimedia data stream may be the NewTalk link (main link). It is not limited thereto; in another implementation, it may also be a NewTalk data channel, and in yet another implementation, it may also be an auxiliary link. In some examples, the auxiliary link may be a NAT traversal link or a server transit link (for example, a NAT relay link); the description of the auxiliary link can be found in the description of link 6 in Figure 2E.
  • the following is an example of the discovery, link building, and transmission processes during real-time sharing through NewTalk.
  • Discovery: the sharing device/sharing initiator discovers one or more candidate shared devices/sharing receivers.
  • discovery may facilitate the sharing device/sharing initiator to initiate a real-time sharing process to a designated device among the one or more candidate shared devices/sharing recipients.
  • For example, in a NewTalk scenario, the discovery process has been completed when NewTalk is established: the call counterparty is the shared device/sharing receiver, or the other parties in the call can be candidate shared devices/sharing receivers.
  • Link building: establish a communication link for transmitting the real-time shared multimedia data stream.
  • Link building may include, but is not limited to, the following three situations: always building a link, predictive link building, and on-demand link building. Always building a link means, for example, that the communication link is established when NewTalk is established and is then maintained all the time. Predictive link building is to establish a communication link according to predicted content.
• for example, the communication link is established according to the predicted time A or the predicted arrival in area A.
  • the predicted content is obtained based on a communication map, for example.
  • On-demand link building is to establish a communication link when there is a need for data transmission.
  • the communication link established for transmitting real-time shared multimedia data streams may include one or more communication links.
• a low-power communication link may be maintained at all times, and a high-speed, stable communication link may be pulled up as needed.
  • the link establishment time may be, but is not limited to, any of the following situations:
• Case 1 After NewTalk starts, link building is initiated at any point in time before real-time sharing. For example, in the embodiment shown in FIGS. 6A-6C, after the electronic device 100 and the electronic device 200 make an operator call/OTT call, and before the electronic device 100 responds to the touch operation on option 622A in the user interface 620 shown in FIG. 6B, the electronic device 100 (sharing device) can initiate link building to the electronic device 200 (shared device); the electronic device 100 can then initiate real-time sharing to the electronic device 200 indicated by option 622A in response to the above touch operation.
  • Case 2 The sharing device selects the target sharing object and initiates link building.
• for example, the electronic device 100 (sharing device) responds to the touch operation on option 622A in the user interface 620 shown in FIG. 6B and determines that the electronic device 200 indicated by option 622A is the target sharing object; the electronic device 100 can then initiate link establishment to the electronic device 200 and perform real-time sharing based on the established link.
• Case 3 The sharing device initiates link building after selecting the target sharing object and the target sharing content.
• for example, the electronic device 100 (sharing device) responds to the touch operation on option 622A in the user interface 620 shown in FIG. 6B and determines that the electronic device 200 indicated by option 622A is the target sharing object; after the target sharing content is determined, the electronic device 100 can initiate link establishment to the electronic device 200 and share in real time based on the established link.
  • Case 4 When establishing a NewTalk link, a communication link for transmitting multimedia data streams for real-time sharing is also established.
  • communication links used to transmit real-time shared multimedia data streams include NewTalk links.
  • Case 5 Before establishing the NewTalk link, establish a communication link for transmitting real-time shared multimedia data streams.
  • Case 6 Since communication links have been established in communication scenarios such as call packet replenishment, file sharing, and link sharing, the established communication links can be directly used as communication links for transmitting real-time shared multimedia data streams.
• in this case, the link building time is the establishment time of the above-mentioned already-established communication link.
  • the link building is predicted based on the communication map and other information, and the link building time is determined based on the prediction results.
  • the link building method may be, but is not limited to, any of the following:
  • Method 1 reuse the NewTalk link (main link).
  • the call data stream and the real-time shared multimedia data stream can share the NewTalk link (main link) for transmission.
  • the call data stream can be transmitted through the NewTalk link (main link) first, and then the multimedia data stream for real-time sharing can be transmitted.
  • the header fields of the call data stream and the real-time shared multimedia data stream may be different.
• for example, NewTalk is a call based on the IMS protocol (which can be called an IMS call), so the original real-time transport protocol (RTP) packets can be extended, for example so that the RTP headers of the call data stream and the real-time shared multimedia data stream are different.
  • the core network is in transparent transmission mode and does not filter or transcode the packets of multimedia data streams shared in real time.
• Method 2 Use the NewTalk data channel. The data channel is based on an IMS proprietary bearer and is a data transmission channel other than the call signaling channel (QCI5) and the multimedia channels (QCI1/QCI2).
  • Method 3 Establish an auxiliary link.
• call data streams can be transmitted using the NewTalk link (main link), and real-time shared multimedia data streams can be transmitted using the auxiliary link.
  • link establishment negotiation is performed through messages transmitted on the NewTalk link (primary link) to establish the auxiliary link.
• for example, the sharing device can carry the information used to establish the auxiliary link in the real-time control protocol (RTCP) message transmitted over the main link, so as to request the other party in the call to establish the auxiliary link.
  • the sharing device can carry information for establishing the auxiliary link in the source description items (SDES) field included in the RTCP message.
  • the SDES field is, for example, used to describe the source that sends the RTCP message.
  • the sharing device can store the communication ID (such as SessionID), address information (such as IP address) and other information used for NAT penetration in the SDES field in a text encoding manner.
• the SDES field is, for example, the terminal identifier (canonical name, CNAME).
  • the sharing device can call the NAT interface for traversal or relay to establish a secondary link.
• the sharing device can also perform link establishment negotiation through session initiation protocol (SIP) messages to establish an auxiliary link.
• for example, the sharing device can carry information such as the communication ID (such as SessionID) in the invitation (INVITE) message to exchange respective communication IDs with the shared device (for subsequent establishment of the auxiliary link).
• alternatively, the sharing device can carry information such as the communication ID (such as SessionID) in the re-invite (reINVITE) message or update (UPDATE) message to exchange respective communication IDs with the shared device (for subsequent establishment of the auxiliary link).
  • This application does not limit this.
• in another implementation, the auxiliary link may not be established through the NewTalk link (main link); instead, the auxiliary link may be established through addressing via the network device 300.
• for example, either party in the call can perform parameter binding on the network device 300; specifically, identification information such as the phone number or OTT ID is bound to/set to be associated with the communication ID (such as SessionID).
  • other devices can address the communication ID of the device through the network device 300 based on the device's phone number, OTT ID and other identification information.
  • FIG. 28 is a schematic flowchart of an auxiliary link establishment process.
  • FIG. 28 illustrates an example in which the electronic device 100 and the electronic device 200 performing NewTalk are addressed through the network device 300 to establish an auxiliary link.
  • the establishment process may include, but is not limited to, the following steps:
  • the electronic device 100 binds the first identification information of the electronic device 100 to the first communication ID of the electronic device 100, registers and/or logs in to the network device 300 (which may be called a binding operation).
  • the first identification information is a phone number or a communication number such as an OTT ID.
  • the first communication ID is SessionID.
  • the network device 300 may perform identity authentication on the electronic device 100, such as verifying whether the access token (AT) or Huawei certificate of the electronic device 100 meets the requirements.
• if the identity authentication of the electronic device 100 passes, the network device 300 can generate a P2P-TOKEN for the electronic device 100.
• the P2P-TOKEN, for example, carries a key identification (key ID) and is signed with a private key.
  • the binding operation can be performed only after the identity authentication of the electronic device 100 passes.
  • the binding operation may include: the electronic device 100 sends the first identification information of the electronic device 100 to the network device 300, and the network device 300 may return the first communication ID of the electronic device 100.
• the first identification information may include one or more pieces of identification information of the electronic device 100. For example, if the first identification information includes phone number 1 and phone number 2, the electronic device 100 may send the hashed phone number 1 and phone number 2 to the network device 300, which can be characterized as: HASH(phone number 1) + HASH(phone number 2).
• the electronic device 100 can also perform a refresh (Refresh) operation. The refresh operation is similar to the binding operation, except that the identification information and communication ID being bound are the changed identification information and communication ID.
• the network device 300 may store the first identification information of the electronic device 100 and the first communication ID associated with the first identification information, which may also be referred to as establishing a binding relationship.
  • the electronic device 200 binds the second identification information of the electronic device 200 to the second communication ID of the electronic device 200, registers and/or logs in to the network device 300.
  • the electronic device 100 obtains the second communication ID of the electronic device 200 from the network device 300 according to the second identification information of the electronic device 200 (which can be called an addressing operation).
• optionally, the electronic device 100 may send a query request to the network device 300, where the query request is used to query the communication ID of the electronic device 200. The query request may carry at least one piece of identification information of the electronic device 200 known to the electronic device 100.
• the network device 300 may obtain the communication ID associated with the at least one piece of identification information (i.e., the second communication ID of the electronic device 200) and return it to the electronic device 100.
• the electronic device 100 can actively request release from the network device 300 through the provided session end interface, to release the binding relationship established in step 1 of Figure 28.
• alternatively, the network device 300 can automatically release the binding relationship established in step 1 of Figure 28 after a preset period of time (for example, 10 minutes), which can be called automatic cleaning of the binding relationship over time.
  • the electronic device 200 obtains the first communication ID of the electronic device 100 from the network device 300 according to the first identification information of the electronic device 100 .
  • the electronic device 100 and the electronic device 200 establish an auxiliary link according to the first communication ID and the second communication ID.
• for example, the electronic device 100 can complete the link establishment negotiation with the electronic device 200 through the second communication ID of the electronic device 200, such as but not limited to via IP direct connection, NAT traversal, or server relay (such as NAT relay), thereby establishing the auxiliary link.
• in some examples, the electronic device 200 may not perform step 2 (that is, it may not perform the binding operation); in this case, the network device 300 cannot match a communication ID associated with the second identification information of the electronic device 200.
  • the network device 300 can wake up the electronic device 200 through the connected PUSH server.
• the awakened electronic device 200 can connect to the network device 300 and perform identity authentication and addressing through the network device 300 (such as the authentication module and addressing module therein) (for example, to address the electronic device 100).
• in some examples, the network device 300 may not wake up the electronic device 200 and may therefore return addressing failure indication information to the electronic device 100 (for example, including the reason "not woken up"). In other examples, the network device 300 cannot successfully wake up the electronic device 200 and may therefore return addressing failure indication information to the electronic device 100 (for example, including the reason "wake-up failed"). It is not limited to this. In other examples, before the above-mentioned step 4, the electronic device 100 may not have performed the binding operation; the specific description is similar to the above and will not be repeated.
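• As a rough illustration of the binding, addressing and release flow described above, the following Python sketch models the binding table of a network device such as network device 300; the names (NetworkDevice, bind, address), the hash choice and the TTL-based automatic cleanup are hypothetical stand-ins, not the actual interfaces of this application:

```python
import hashlib
import time

class NetworkDevice:
    """Hypothetical sketch of the binding/addressing service of network device 300."""
    def __init__(self, ttl_seconds=600):
        self.bindings = {}       # HASH(identification info) -> (communication ID, bind time)
        self.ttl = ttl_seconds   # bindings auto-cleaned after e.g. 10 minutes

    @staticmethod
    def _hash(identifier: str) -> str:
        # Identification info (e.g. phone numbers) is sent hashed: HASH(phone number)
        return hashlib.sha256(identifier.encode()).hexdigest()

    def bind(self, identifiers, communication_id):
        """Binding operation: associate identification info with a communication ID (e.g. SessionID)."""
        now = time.time()
        for ident in identifiers:
            self.bindings[self._hash(ident)] = (communication_id, now)

    def address(self, identifier):
        """Addressing operation: look up the communication ID bound to an identifier."""
        key = self._hash(identifier)
        entry = self.bindings.get(key)
        if entry is None:
            return None  # peer never bound or was cleaned up; may trigger a PUSH wake-up
        comm_id, bound_at = entry
        if time.time() - bound_at > self.ttl:
            del self.bindings[key]  # automatic cleaning of the binding over time
            return None
        return comm_id

# Usage: device 100 binds a phone number to its SessionID; device 200 addresses it.
server = NetworkDevice()
server.bind(["+86-100-0001"], "session-id-100")
print(server.address("+86-100-0001"))  # -> "session-id-100"
```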
  • the auxiliary link may not be established with the help of the NewTalk link (main link), but the auxiliary link may be established through a peripheral device.
• the peripheral device may be, but is not limited to: a device that communicates with the shared device through near field communication, a device that communicates with the shared device through far-field communication methods such as cellular communication or satellite, a device that is known to the sharing device (for example, information about the device is stored), or a device that is unknown to the sharing device (for example, no information about the device is stored).
• in one implementation, the electronic device 100 is a device without addressing capability, and the electronic device 100 can establish an auxiliary link with the electronic device 200 through a peripheral device. For example, when the electronic device 100 is a smart watch (for example, with the modem powered off), a tablet computer (for example, without a SIM card interface), a smart speaker or headphones, the electronic device 100 can establish an auxiliary link with the electronic device 200 through a smartphone connected to the electronic device 100.
  • the auxiliary link established in the third method is an auxiliary link between the sharing device and the shared device.
• in another implementation, the auxiliary link established in the third method includes the link between the sharing device and the peripheral device and the link between the peripheral device and the shared device.
• in this implementation, the electronic device 100 is a device that does not have the ability to directly establish an auxiliary link, and the electronic device 100 can communicate with the electronic device 200 through a relay device. For example, when the electronic device 100 is a smart watch (for example, with the modem powered off), a tablet computer (for example, without a SIM card interface), a smart speaker or headphones, the electronic device 100 can establish an auxiliary link with the electronic device 200 through a smartphone connected to the electronic device 100.
  • a sharing device in the call state can use the above method one, two or three to establish a link
  • a sharing device in an idle state can use the above method two or three to establish a link.
  • the communication map may include a universal communication map
• the universal communication map may include crowdsourced data from multiple electronic devices, such as but not limited to at least one of the following data: service set identifier (SSID, also known as Wi-Fi ID), cell ID (CELLID), signal strength parameters (such as reference signal received power (RSRP)), signal quality parameters (such as reference signal received quality (RSRQ)), call QoE parameters (such as packet loss rate, delay, number of interruptions, etc.), link transmission quality parameters (such as packet loss rate, delay, jitter, etc.), time period, GNSS-positioned longitude and latitude information, GNSS-positioned absolute position information, GNSS-positioned indoor relative position information, and call partner information (such as phone number).
• predictive link building based on the universal communication map can be: performing data analysis on the cloud (such as a server) based on the crowdsourced data from multiple electronic devices to obtain the characteristics of communication links in space and time; the obtained characteristics can be used to determine at least one of the following: the time of link establishment, the location of link establishment, the type of link to establish, etc.
• the established link may include a physical link and/or a logical link, where the physical links established through different communication methods may be different.
  • Multiple logical links established through the same communication method may be different.
• for the same communication method, the logical links established through different ports of the electronic device may be different. For example, the relay link and the traversal link established through the Wi-Fi communication method can be different logical links.
• for example, electronic devices can use the cloud to determine whether a certain link established in a certain time period and at a certain location is stable, where stable means better communication quality and unstable means poorer communication quality.
• the communication quality is, for example but not limited to, determined by packet loss rate, delay, jitter, bandwidth, etc.
• an electronic device can use the communication status of other electronic devices to guide the link establishment behavior of its own device. For example, if during time period 1 the communication quality of the cellular communication links established by other electronic devices at location 1 is relatively high, the device can establish a cellular communication link when it is in time period 1 and at location 1, thereby ensuring call quality. The device does not need to learn link by link at each time and location, effectively saving power consumption.
• the communication map may include a personalized communication map; the personalized communication map may be used to predict user operations that may be performed later, by learning personal information such as usage habits and operating behaviors.
  • a personalized communication map may include crowdsourced data from multiple electronic devices as described above.
• the personalized communication map may include private data, such as but not limited to at least one of the following data: the intimacy level of each call partner (for example, determined through call duration, call period, notes/relationships marked in the address book, or the level of happiness during the call), the information shared in real time with each call partner (such as time, location, frequency, etc.), the situation of watching/listening to video/audio simultaneously during calls, the status of transmitted data such as links and files, the user's operating habits and behavior sequences during calls (such as commonly used keys, touch methods, touch locations, etc.), and the accuracy of prediction and link building for historical calls.
• the electronic device or the cloud can mark high-frequency real-time sharing objects (which may be referred to as high-frequency objects) according to the personalized communication map. For example, the N objects shared with most often in real time within a preset period (such as within a week) are marked as high-frequency objects, and/or the first M objects ordered from latest to earliest real-time sharing time are marked as high-frequency objects.
  • the high-frequency objects marked above can be used for predictive link building.
• the electronic device or the cloud can mark close objects according to the personalized communication map. For example, contacts in the address book whose notes/relationships are family members, leaders, friends, colleagues, etc. are marked as close objects, and/or contacts in the call history with frequent calls (for example, more calls and/or shorter call times) are marked as close objects.
• the above-marked close objects and the information shared in real time with the close objects can be used for predictive link building.
• the electronic device or the cloud can predict operating behaviors based on the personalized communication map, for example, predicting the user's operating behavior based on the situation of watching/listening to video/audio simultaneously during calls, the transmission of data such as links and files during calls, and the user's operating habits during calls (such as commonly used keys, touch methods, touch locations, etc.). The predicted operating behaviors can be used for predictive link building.
  • predictive link building can be used to implement at least one of the following functions:
• selection of the optimal link: for example, when multiple links can be established, at least one optimal/better/relatively optimal link among the multiple links can be selected and established, without establishing the other links among the multiple links.
  • the communication map in the above example may be divided into sections in the form of a grid, for example, a grid with a specification of 20 meters ⁇ 20 meters as shown in FIG. 29 .
  • the congestion status of various links in each grid and the optimal period for link construction can be obtained.
• after the personalized communication map filters out the user's private information locally on the electronic device or in the cloud, it can predict based on a model whether the user needs to establish a link and, optionally, when the link needs to be established. The grid is not limited to the above example: in other examples, it may also have an irregular shape, and in other examples the map may also carry three-dimensional information such as altitude. This application does not limit the specific form of the communication map.
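• A minimal sketch of how a grid-based communication map like the 20 m × 20 m example of Figure 29 could be populated and queried for link selection; the data layout, the use of packet loss as the sole quality metric, and all function names are illustrative assumptions:

```python
from collections import defaultdict

GRID = 20.0  # grid cell size in meters, as in the 20 m x 20 m example

def cell(x_m: float, y_m: float):
    """Map a position (meters, in some local frame) to a 20 m x 20 m grid cell."""
    return (int(x_m // GRID), int(y_m // GRID))

# (cell, hour-of-day, link type) -> observed quality samples (here: packet loss %)
comm_map = defaultdict(list)

def report(x_m, y_m, hour, link_type, packet_loss):
    """Crowdsourced report: one device's observed quality at a place and time."""
    comm_map[(cell(x_m, y_m), hour, link_type)].append(packet_loss)

def best_link(x_m, y_m, hour, candidates):
    """Pick the candidate link type with the lowest average packet loss here/now."""
    def avg_loss(link):
        samples = comm_map.get((cell(x_m, y_m), hour, link), [])
        return sum(samples) / len(samples) if samples else float("inf")
    return min(candidates, key=avg_loss)

report(105.0, 42.0, 9, "cellular", 0.5)
report(105.0, 42.0, 9, "wifi", 2.0)
print(best_link(110.0, 50.0, 9, ["cellular", "wifi"]))  # same cell/hour -> "cellular"
```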
  • Figure 30 illustrates a schematic flowchart of predictive link building.
  • the sharing device can record the operation behavior and sequence before initiating real-time sharing, for example, record the operation behavior and sequence during the call.
  • Sharing devices can predict and build links based on recorded operating behaviors and sequences, as well as communication maps.
  • the sharing device can determine whether the predicted link establishment is correct based on the predicted link establishment results and the actual results of whether real-time sharing is initiated. In some examples, when the result of predicting whether to build a link is yes, and the result of whether to actually initiate real-time sharing is yes, the prediction is correct. In other examples, when the result of predicting whether to build a link is yes, and the result of whether to actually initiate real-time sharing is no, the prediction is incorrect.
  • the sharing device may record the operation behavior and sequence of whether real-time sharing is actually initiated for subsequent predictive link building. In one implementation, the sharing device can record the result of whether the predicted link building is correct, and feed the result back to the predictive link building system for subsequent predictive link building.
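• The record, predict and feed-back loop of Figure 30 could be sketched as follows; the frequency-counting predictor, the 0.6 threshold and the operation names are hypothetical simplifications of whatever model the sharing device actually uses:

```python
from collections import defaultdict

# recent operation sequence -> [times sharing followed, times it did not]
stats = defaultdict(lambda: [0, 0])

def predict_build_link(recent_ops, threshold=0.6):
    """Predict whether to pre-build a link from past outcomes of this op sequence."""
    shared, not_shared = stats[tuple(recent_ops)]
    total = shared + not_shared
    return total > 0 and shared / total >= threshold

def feedback(recent_ops, did_share):
    """Record whether real-time sharing was actually initiated, for later predictions."""
    stats[tuple(recent_ops)][0 if did_share else 1] += 1

# Train on past calls, then predict on a new one:
for _ in range(8):
    feedback(["open_video_app", "mention_share"], did_share=True)
feedback(["open_video_app", "mention_share"], did_share=False)
print(predict_build_link(["open_video_app", "mention_share"]))  # True: pre-build
```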
  • predictive link building can be performed by cloud servers or other network devices to reduce processing pressure on electronic devices and save power consumption.
  • the above example takes the sharing device initiating the link establishment as an example.
  • the shared device may also initiate the link establishment, which is not limited in this application.
  • Transmission is the transmission of a real-time shared data stream between a sharing device and at least one shared device.
  • the sharing device and the shared device can transmit data streams directly point-to-point.
• for example, the sharing device can directly send data stream 1 to shared device 1, and the sharing device can directly send data stream 2 to shared device 2.
  • the data stream can be transmitted between the sharing device and the shared device through a relay device (such as a network device such as a server).
• for example, the sharing device can send data stream 3 to shared device 1 through the relay device; that is to say, data stream 3 can be relayed by the relay device.
  • Data flow 3 can pass through the link between the sharing device and the relay device, and the link between the relay device and the shared device.
  • the sharing device can send the data stream 4 to the shared device 2 through the relay device.
  • the specific description is similar to the above description and will not be described again.
• in one implementation, the data stream is transmitted in layers, such as the audio stream/video stream transmission architecture shown in Figure 32A. From top to bottom, the layers can be: the data layer (such as audio data/video data), the encoding layer (for example using audio/video coding standards such as H.265 and H.264, or sound coding formats such as Opus), the transport layer (for example using RTP or the real time streaming protocol (RTSP)), the network layer (for example using the TCP/IP protocol or the user datagram protocol (UDP)), and the physical layer (for example using the protocols of physical links such as cellular communication/Wi-Fi/BT/D2D/satellite).
  • the format of the transmitted data packet is, for example, as shown in Figure 32B.
• the data packet may include fields such as a network protocol header (such as IP Head), a transmission protocol header (such as RTP Head), an encoded information header (such as H.265 Head/Opus Head), and raw data (RAW Data).
• the sharing device can offload the real-time shared data streams according to preset transmission rules (for example, through the Four-network+ implementation in the multipath transmission management module); the transmission rules may include, but are not limited to, at least one of the following:
  • Audio streams and video streams are transmitted separately.
  • the audio stream and the video stream are encoded separately/independently.
  • the audio stream is transmitted through link A, and the video stream is transmitted through link B.
• for example, link A is a relatively stable communication link with low latency and/or low jitter, and link B is a communication link with large bandwidth and/or low or no tariff.
• in another rule, audio streams and video streams are also transmitted separately, where the application-level/system-level/background audio stream and the call data stream are mixed and encoded (for details, please refer to the description on the sharing device side of Figure 26A); the mixed-encoded audio stream is transmitted through link A, and the video stream is transmitted through link B.
• in another rule, the split into a basic data stream and a rich data stream is related to the coding (such as hierarchical coding): the data stream with a higher degree of encoding can be the rich data stream, and the data stream with a lower degree of encoding can be the basic data stream.
• in another rule, audio streams and video streams are transmitted together: audio streams and video streams with the same timestamp are encoded together. In some examples, they can be transmitted on the same link; in other examples, they can also be dynamically migrated to other links according to changes in link quality to ensure the optimal transmission effect.
  • Rule 6 Redundant supplementary packets for audio streams and/or video streams.
• in some examples, the supplementary packets may be transmitted on the same link, for example carrying the encoded data of two adjacent frames at a time; in other examples, the supplementary packets can be transmitted through at least one other link.
  • the audio stream and/or the video stream may be partially redundantly packed, and in other examples, the audio stream and/or the video stream may be fully redundantly packed.
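• As an illustration of the first rule above (audio over a stable link A, video over a large-bandwidth and/or low-tariff link B), the following sketch picks links from measured parameters; the Link fields and the selection criteria are assumptions made for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    jitter_ms: float
    bandwidth_mbps: float
    tariff: float  # relative cost; 0 = free (e.g. home Wi-Fi)

def pick_links(links):
    """Audio -> most stable (lowest-jitter) link A; video -> highest-bandwidth,
    cheapest link B, per the separate-transmission rule sketched above."""
    link_a = min(links, key=lambda l: l.jitter_ms)
    link_b = max(links, key=lambda l: (l.bandwidth_mbps, -l.tariff))
    return link_a, link_b

links = [Link("cellular", jitter_ms=5, bandwidth_mbps=50, tariff=1.0),
         Link("wifi", jitter_ms=12, bandwidth_mbps=300, tariff=0.0)]
audio_link, video_link = pick_links(links)
print(audio_link.name, video_link.name)  # audio -> cellular, video -> wifi
```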
• in order to ensure that the receiving end (the shared device) receives a set of data packets in the shortest time, the sharing device can, when sending a set of data packets, distribute the packets according to the transmission delays and jitter conditions of the multiple communication links.
  • the receiving end (the shared device) can group the data packets after receiving all the data packets sent by the sharing device.
• for a specific example, see Figure 33 below.
• there are differences between different links, so the transmission delays and jitter conditions of different links can be different. Optionally, there are also differences between logical links: for example, the transmission delays and jitter conditions of multiple Wi-Fi links established using different ports are different, and the transmission delay and jitter of a directly established Wi-Fi link are different from those of a Wi-Fi link established through relay equipment.
  • Figure 33 exemplarily shows a schematic diagram of split transmission.
• as shown in Figure 33, the sharing device may include port 1, port 2 and port 3, and the shared device may include port 4 and port 5, where link 1 is established between port 1 and port 4.
  • Link 2 is established between port 2 and port 4
  • link 3 is established between port 2 and port 5
  • link 4 is established between port 3 and port 5.
• if the four links in order from high to low delay are link 1, link 2, link 3 and link 4, then when sending the group consisting of data packet 1, data packet 2 and data packet 3, the sharing device can execute in sequence: sending data packet 1 through link 4, transmitting data packet 2 through link 3, transmitting data packet 2 through link 2, and transmitting data packet 3 through link 1, where the data packet 2 transmitted through link 2 can be a supplementary packet.
• in this way, the shared device can receive the above-mentioned data packet 1, data packet 2 and data packet 3 at similar times and group them together, which prevents some data packets from arriving at the shared device much later than other data packets, reduces the time it takes for the shared device to receive a complete set of data packets, and achieves higher transmission efficiency.
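• One way to read the Figure 33 example is that packets routed over high-delay links are released earlier (equivalently, low-delay links are held back) so that all packets of a group arrive at about the same time; the sketch below computes such send offsets under that assumption, with made-up delay values:

```python
def send_schedule(links, packets):
    """Offset each packet's send time by its link's delay so all arrive together.
    links: {link name: one-way delay in ms}; packets: {packet name: link name}."""
    max_delay = max(links.values())
    # The highest-delay link sends at offset 0; faster links wait out the difference.
    return {pkt: max_delay - links[link] for pkt, link in packets.items()}

links = {"link1": 80, "link2": 60, "link3": 40, "link4": 20}  # ms, high to low delay
packets = {"packet1": "link4", "packet2": "link3",
           "packet2-supplementary": "link2", "packet3": "link1"}
for pkt, offset in send_schedule(links, packets).items():
    print(f"{pkt}: send at t+{offset} ms")  # all arrive around t+80 ms
```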
  • Figure 34 illustrates an architectural schematic diagram of yet another sharing system 10.
  • the sharing system 10 shown in FIG. 34 can be applied to a scenario of real-time sharing through Wi-Fi.
  • the electronic device 100 and the electronic device 200 in the sharing system 10 can perform real-time sharing through Wi-Fi such as watching together, listening together, playing together, and editing together.
  • the electronic device 100 is a sharing device that sends a real-time shared data stream
  • the electronic device 200 is a shared device that receives a real-time shared data stream.
  • the following example takes a multicast scenario as an example.
• in this scenario, the electronic device 100 can also be called the multicast sender (source), and the electronic device 200 can also be called a multicast receiver (sink), where the electronic device 200 is any one of a plurality of multicast receivers. This is not limited to multicast: the method can also be applied to unicast or broadcast scenarios, which is not limited in this application.
  • the software system of the electronic device 100 can be divided into four layers, from top to bottom: application framework layer, kernel layer, firmware layer and hardware layer.
• the application framework layer includes the sharing module, discovery module and capture module.
  • the kernel layer includes transmission protocol stack, encoding module, multicast management protocol, multicast control algorithm and multicast key management.
  • the firmware layer includes multicast frame transmission and multicast frame encryption.
• the hardware layer includes the Wi-Fi baseband and radio frequency. Among these:
  • the capture module can be used to capture shared data, such as capturing application-level/system-level/background audio and/or images, and encoding the captured audio and/or images to generate audio/video source data.
  • the encoding module can be used to encode audio/video data packets (such as fountain coding) before sending the data packets, thereby improving the reliability of transmission and reducing the probability of packet loss on the air interface channel.
  • the multicast management protocol can be used to manage members of Wi-Fi multicast groups, such as members joining and leaving.
  • Multicast control algorithms can be used to dynamically control the aggregation scheduling of multicast messages, the signal modulation level of modulation and coding scheme, etc.
  • Multicast key management can be used to manage multicast keys, such as dynamic generation and distribution of multicast keys.
• Multicast frame transmission can be used to encapsulate audio/video data into Wi-Fi multicast data frames (which may be referred to as Wi-Fi multicast frames), and to send the Wi-Fi multicast frames through the air interface to the electronic device 200 and the other Wi-Fi multicast group members.
  • Multicast frame encryption can be used to encrypt Wi-Fi multicast frames based on the multicast key.
• optionally, the multicast frame sent over the air interface is specifically an encrypted Wi-Fi multicast frame.
• the Wi-Fi baseband and radio frequency are used to send/receive Wi-Fi multicast frames.
  • the software system of the electronic device 200 can be divided into four layers, from top to bottom: application framework layer, kernel layer, firmware layer and hardware layer.
  • the application framework layer includes sharing module, discovery module and playback module.
  • the kernel layer includes transmission protocol stack, decoding module, multicast management protocol and multicast key management.
  • the firmware layer includes multicast frame filtering and multicast frame decryption.
• the hardware layer includes the Wi-Fi baseband and radio frequency. The specific description is similar to the description of the software system of the electronic device 100 above; the following mainly describes the modules of the electronic device 200 that are different from the modules of the electronic device 100.
  • the playback module can be used to decode audio/video data and output the decoded audio/video data.
  • the decoding module can be used to decode received audio/video data packets (such as fountain decoding) to recover lost data packets.
• Multicast frame filtering can be used, after a Wi-Fi multicast frame is received on the air interface, to filter Wi-Fi multicast frames based on the address information of the multicast group that the electronic device 200 has joined, discarding the Wi-Fi multicast frames that do not belong to this multicast group and retaining the Wi-Fi multicast frames that belong to it.
  • Multicast frame decryption can be used to decrypt the received Wi-Fi multicast frame based on the multicast key after receiving the Wi-Fi multicast frame on the air interface.
  • the following is an example of the discovery, connection, transmission and departure processes during real-time sharing via Wi-Fi.
  • the sharing device can serve as the source device of the Wi-Fi multicast group and search for nearby devices through broadcast messages to complete device discovery.
• after the sharing device (multicast sender) completes device discovery, it can send a real-time sharing request to the shared device (multicast receiver). After the shared device accepts the request, it can negotiate the multicast address, multicast key and other information with the sharing device to complete the connection.
  • Figure 35 illustrates a schematic flowchart of device discovery and connection.
  • device discovery may include, but is not limited to, the following steps:
  • the electronic device 100 (sharing device/multicast sender) sends a broadcast message to the electronic device 200 (shared device/multicast receiver) to search for nearby devices.
  • the broadcast message is, for example, but not limited to, a Wi-Fi broadcast message or a Bluetooth broadcast message.
  • the electronic device 200 responds to the received broadcast message and sends the communication information of the electronic device 200 to the electronic device 100 .
  • the communication information includes, for example, but is not limited to, the ID, MAC address and other information of the electronic device 200 .
  • the discovery module of the electronic device 100 and the discovery module of the electronic device 200 can complete device discovery, such as 1-3 of Figure 35 .
  • device connection may, but is not limited to, include the following steps:
  • the electronic device 100 sends a real-time sharing request to the electronic device 200 .
  • the electronic device 100 sends a real-time sharing request to the electronic device 200 in response to the user operation for triggering the real-time sharing function described in the above embodiment.
  • the electronic device 200 accepts the real-time sharing request sent by the electronic device 100 .
  • the electronic device 200 responds to the user operation and accepts the real-time sharing request.
  • the electronic device 100 and the electronic device 200 transmit the multicast address and negotiate the multicast key.
  • the multicast management protocol of the electronic device 100 and the multicast management protocol of the electronic device 200 can complete the connection, such as 4-6 in Figure 35.
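• A compact sketch of the discovery and connection steps of Figure 35; the class names, the hard-coded multicast address and the automatic acceptance stand in for the actual broadcast messages and negotiation, and are assumptions for illustration only:

```python
import secrets

class MulticastReceiver:
    def __init__(self, dev_id, mac):
        self.dev_id, self.mac = dev_id, mac
    def respond(self):                 # reply with communication info (ID, MAC, ...)
        return {"id": self.dev_id, "mac": self.mac}
    def accept_request(self):          # e.g. the user accepts the sharing request
        return True
    def join(self, addr, key):         # receive the negotiated multicast address/key
        self.addr, self.key = addr, key

class MulticastSender:
    def __init__(self):
        self.members = {}
    def broadcast_search(self, nearby):    # broadcast message to search nearby devices
        return [dev.respond() for dev in nearby]
    def connect(self, dev):                # sharing request, acceptance, negotiation
        if dev.accept_request():
            addr, key = "239.1.2.3", secrets.token_bytes(16)
            self.members[dev.respond()["id"]] = dev
            dev.join(addr, key)
            return addr
        return None

sender = MulticastSender()
receiver = MulticastReceiver("device-200", "aa:bb:cc:dd:ee:ff")
print(sender.broadcast_search([receiver]))                  # discovered device info
print(sender.connect(receiver), receiver.key is not None)   # negotiated address/key
```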
  • a real-time shared data stream can be transmitted between the sharing device (multicast sender) and multiple shared devices (multicast receivers).
  • Figure 36 exemplarily shows a schematic diagram of a data flow shared in real time through Wi-Fi transmission.
• the transmission process may include, but is not limited to, the following steps:
  • the electronic device 100 captures and generates a real-time shared data stream (referred to as a shared data stream).
• for example, the electronic device 100 (e.g., the included capture module) captures images and/or audio of the application layer/system layer/background layer, and encodes the captured images and/or audio to generate audio/video source data (i.e., the above shared data stream).
  • the electronic device 100 slices the shared data stream and encapsulates it into a multicast data frame.
  • the electronic device 100 (eg, the included transport protocol stack) slices the audio/video source data and encapsulates it into a multicast data frame.
  • the electronic device 100 encodes the multicast data frame.
  • the electronic device 100 (eg, the included encoding module) performs fountain encoding on the multicast data frame to add redundant information.
  • the electronic device 100 encrypts the multicast data frame.
  • electronic device 100 (eg, includes multicast frame encryption) encrypts multicast data frames based on a negotiated multicast key.
  • the electronic device 100 sends the multicast data frame to the electronic device 200.
  • the electronic device 100 (eg, multicast frame transmission included) sends multicast frames to multicast group members such as the electronic device 200 over the air interface based on the Wi-Fi data multicast protocol.
  • the electronic device 200 filters the received multicast data frames.
• after the electronic device 200 receives the multicast data frames on the air interface, the electronic device 200 (for example, the included multicast frame filtering) can discard the multicast data frames that do not belong to the multicast group in which the electronic device 200 is located, and retain the multicast data frames belonging to this multicast group.
  • the electronic device 200 decrypts the multicast data frame.
  • electronic device 200 decrypts the multicast data frame based on the negotiated multicast key.
  • the electronic device 200 decodes the multicast data frame.
  • the electronic device 200 (eg, the included decoding module) performs fountain decoding on the multicast data frame and recovers the lost data frame based on the redundant information.
  • the electronic device 200 decapsulates and reassembles the multicast data frame to obtain the shared data stream.
  • the electronic device 200 decapsulates and reassembles the multicast data frame to restore an audio stream/video stream (ie, a shared data stream).
  • the electronic device 200 plays the shared data stream.
  • the electronic device 200 decodes the shared data stream and displays and/or plays the decoded video stream/audio stream in the foreground.
• in one implementation, the transmission flow of the real-time shared audio stream/video stream may be: source application/source system of the electronic device 100 (used to generate the real-time shared audio stream/video stream) -> capture module of the electronic device 100 -> transmission protocol stack of the electronic device 100 -> encoding module of the electronic device 100 -> multicast frame encryption of the electronic device 100 -> multicast frame transmission of the electronic device 100 -> multicast frame filtering of the electronic device 200 -> multicast frame decryption of the electronic device 200 -> decoding module of the electronic device 200 -> transmission protocol stack of the electronic device 200 -> playback module of the electronic device 200 -> target application/target system of the electronic device 200 (used to output the real-time shared audio stream/video stream).
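• The transmission steps above can be miniaturized as follows; XOR with the negotiated key stands in for real multicast frame encryption and a duplicated frame stands in for fountain-code redundancy, so this is only a shape-of-the-pipeline sketch, not the actual algorithms:

```python
import secrets

KEY = secrets.token_bytes(16)  # stands in for the negotiated multicast key

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def sender_pipeline(stream: bytes, mtu: int = 4):
    """Capture -> slice -> add redundancy -> encrypt -> 'send'."""
    frames = [stream[i:i + mtu] for i in range(0, len(stream), mtu)]  # slicing
    frames.append(frames[-1])              # duplicate frame stands in for fountain coding
    return [xor(f, KEY) for f in frames]   # per-frame encryption with the multicast key

def receiver_pipeline(frames):
    """Filter (omitted) -> decrypt -> decode -> reassemble -> play."""
    plain = [xor(f, KEY) for f in frames]
    return b"".join(plain[:-1])            # drop the redundant frame and reassemble

data = b"audio/video source data"
assert receiver_pipeline(sender_pipeline(data)) == data
```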
• any one of the sharing device (multicast sender) and the multiple shared devices (multicast receivers) can exit the current real-time sharing, which can be understood as the device leaving the current multicast group.
• for example, when any multicast receiver (for example, the electronic device 200) receives an instruction to exit the current real-time sharing, it can notify the multicast sender (the electronic device 100), and the multicast sender can delete the member (i.e., the above-mentioned multicast receiver) from the multicast group. See Figure 37 for a specific example.
  • Figure 37 illustrates a schematic flowchart of a multicast receiver leaving.
  • the departure process may include, but is not limited to, the following steps:
  • the electronic device 200 receives the instruction to exit real-time sharing. For example, the electronic device 200 receives a touch operation (eg, a click operation) for the option 531A of "Exit viewing" in the user interface 530 shown in FIG. 5C .
  • the electronic device 200 sends a leaving notification message to the electronic device 100 (multicast sender).
  • the notification message is a multicast signaling frame.
  • the electronic device 100 deletes the multicast group member: the electronic device 200.
  • the electronic device 100 sends a response message confirming the departure to the electronic device 200.
  • the response message is a multicast signaling frame.
• when the multicast sender receives an instruction to exit the current real-time sharing, it can notify the other multicast group members (multiple multicast receivers) to leave the current multicast group and delete the current multicast group. For a specific example, see Figure 38.
  • Figure 38 illustrates a schematic flow chart of a multicast sender leaving.
  • the departure process may include, but is not limited to, the following steps:
  • the multicast sender receives the instruction to exit real-time sharing.
• for example, the electronic device 100 receives a touch operation (such as a click operation) on the "pause sharing" option 431E in the user interface 430 shown in FIG. 4C.
• the multicast sender notifies all multicast group members to exit. Specifically, the multicast sender sends notification messages for exiting the multicast group to multiple multicast receivers (multicast receiver 1, ..., multicast receiver N, where N is a positive integer greater than 1). For example, the notification messages are multicast signaling frames.
• the multicast receivers return response messages confirming the exit; for example, the response messages are multicast signaling frames.
• the multicast management protocol of the electronic device 100 and the multicast management protocol of the electronic device 200 can complete the maintenance of multicast group members, such as realizing the multicast receiver leaving shown in Figure 37 and/or the multicast sender leaving shown in Figure 38.
  • the format of the multicast signaling frame is, for example, as shown in Figure 39.
• the multicast signaling frame may include the following fields: destination address (DestAddr), source address (SrcAddr), type/length (Type/Length), actual destination address (Actual DestAddr), actual source address (Actual SrcAddr), control number (Control ID), lower edge of the sending window (transport lower edge, TX LE), and payload (Payload). Among these:
  • the destination address (6 bytes (octets)) belongs to the multicast address and is the receiving address corresponding to the Ethernet header and MAC header of the multicast signaling frame.
  • the source address (6 bytes) belongs to the multicast address and is the sending address corresponding to the Ethernet header and MAC header of the multicast signaling frame.
  • using the destination address and source address belonging to the multicast address can prevent an attacker from obtaining the multicast key based on the actual source address and/or the actual destination address, thereby improving the security of data transmission.
• Type/length (2 bytes) can include the multicast type (Multicast Type) and subtype (Subtype), where the multicast frame type (10 bits) is used to characterize the type of the multicast frame (for example, the type field in a multicast signaling frame may be 0x1FF), and the subtype (6 bits) is used to characterize the subtype of the multicast frame.
  • the actual destination address (6 bytes) is the multicast MAC address that actually receives the multicast signaling frame.
• optionally, the multicast MAC address can be a multicast address segment covering the multiple devices that actually receive the multicast signaling frame; the MAC addresses of these devices can be within this multicast address segment.
  • the actual source address (6 bytes) is the MAC address of the device that actually sent the multicast signaling frame.
  • the control number (1 byte) is the encoding of the control signaling frame and can be used for retransmission.
  • the lower edge of the sending window (1 byte) is used to instruct the receiving end to shift the receiving window.
  • the payload is specific control signaling information.
  • the size of the payload can be different in different situations, which is a variable.
• in one implementation, the multicast signaling frame may be a raw multicast frame of the WLAN. It is not limited to this: in other examples, the multicast signaling frame can also be implemented through Huawei Magneto Link (HML), which can meet lower-power-consumption and WLAN-concurrency scenarios.
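• Under the field sizes listed above (6 + 6 + 2 + 6 + 6 + 1 + 1 bytes plus a variable payload), a multicast signaling frame could be packed as in the following sketch; the bit split of Type/Length (10-bit type, 6-bit subtype) follows the description above, while the sample addresses and payload are invented:

```python
import struct

def build_signaling_frame(dest, src, actual_dest, actual_src,
                          control_id, tx_le, payload,
                          mc_type=0x1FF, subtype=0):
    """Pack the Figure 39 fields: DestAddr, SrcAddr, Type/Length,
    Actual DestAddr, Actual SrcAddr, Control ID, TX LE, Payload."""
    type_len = (mc_type << 6) | (subtype & 0x3F)  # 10-bit type | 6-bit subtype
    header = struct.pack("!6s6sH6s6sBB",
                         dest, src, type_len,
                         actual_dest, actual_src,
                         control_id, tx_le)
    return header + payload

frame = build_signaling_frame(
    dest=bytes.fromhex("01005e000001"),         # multicast destination address
    src=bytes.fromhex("01005e000002"),          # multicast source address
    actual_dest=bytes.fromhex("01005e0000aa"),  # actual destination (multicast MAC)
    actual_src=bytes.fromhex("a0b1c2d3e4f5"),   # actual source MAC
    control_id=7, tx_le=0, payload=b"LEAVE")
print(len(frame))  # 28-byte header + 5-byte payload = 33
```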
  • the following is an example of how to implement real-time sharing via Bluetooth.
  • the following example uses real-time audio sharing as an example.
  • Figure 40 illustrates an architectural schematic diagram of yet another sharing system 10.
  • the sharing system 10 shown in FIG. 40 can be applied to a scenario of real-time sharing through Bluetooth.
  • the electronic device 100 and the electronic device 200 in the sharing system 10 can perform real-time sharing through Bluetooth, such as watching together, listening together, playing together, and editing together.
  • the electronic device 100 is a sharing device (also called a source device) that sends a real-time shared data stream
  • the electronic device 200 is a shared device (also called a receiving device) that receives the real-time shared data stream.
  • real-time sharing through Bluetooth can be applied to unicast, multicast or broadcast scenarios.
  • the electronic device 200 can be any one of multiple receiving devices.
  • the source device can send audio data to multiple receiving devices at the same time, and the multiple receiving devices play the audio data at the same time after receiving the audio data.
• the software system of the electronic device 100 can be divided, from top to bottom, into an application framework layer, a native layer (Native) and a kernel layer.
  • the application framework layer includes source applications/source systems, audio framework modules, sharing modules and sharing services.
  • the sharing services include device management and key management.
  • the native layer includes the audio abstraction module and the Bluetooth stack.
  • the Bluetooth stack includes the encoding module, Bluetooth protocol stack, transmission standard and timestamp synchronization.
• the kernel layer includes the Bluetooth chip/driver (for example, Hi110x), and the Bluetooth chip/driver includes a Bluetooth low energy controller (BLE Controller). Among these:
• Source applications/source systems are used to generate audio streams for real-time sharing, such as music applications, video applications or game applications.
  • the audio framework module (Audio Framework) and audio abstraction module (Audio HAL) are responsible for managing audio functions at the application framework layer and native layer respectively.
  • the audio data generated by the source application/source system can be sent to the audio framework module, and then sent to the audio abstraction module after being processed by the audio framework module.
  • the audio abstraction module can send the processed audio data to the Bluetooth stack for processing.
  • Device management can be used to manage devices for real-time sharing via Bluetooth, such as the joining and leaving of devices.
  • Key management can be used to manage Bluetooth keys, such as the generation of Bluetooth keys, such as broadcast isochronous streams (BIS) protocol keys.
  • the encoding module can be used to encode the audio data sent by the audio abstraction module, such as L3 encoding.
  • the Bluetooth protocol stack is, for example, the BIS protocol stack.
• the transmission standard may be a standard for transmitting audio unicast/multicast/broadcast configuration parameters, such as but not limited to the broadcast audio scan service (BASS), the basic audio profile (BAP) and the generic attribute profile (GATT).
  • Timestamp synchronization can be used to synchronize time with other receiving devices, so that multiple receiving devices can play the audio data at the same time after receiving the audio data.
  • the Bluetooth chip/driver can be used to send the audio data processed by the Bluetooth stack to the receiving device.
• the software system of the electronic device 200 may include a Bluetooth module and an audio module, where the Bluetooth module includes timestamp synchronization, key management, a broadcast module (for example, used to implement BIS broadcast), transmission standards and a Bluetooth low energy controller.
• the audio module includes an audio queue, a decoding module, audio synchronization and a codec. The specific description is similar to the description of the software system of the electronic device 100; the following mainly describes the modules of the electronic device 200 that are different from the modules of the electronic device 100.
  • the Bluetooth module can be used to receive and process audio data sent by the source device, and send the processed audio data to the audio module.
  • the audio queue can be used to cache audio data processed by the Bluetooth module.
  • the audio module can process the audio data in the audio queue.
  • the decoding module can be used to decode data in the audio queue, such as L3 decoding.
  • Audio synchronization can be used to agree with other receiving devices on the time to play audio data, so that multiple receiving devices can play the audio data at the same time at the above-mentioned agreed time after receiving the audio data.
  • Codecs can be used to decode audio/video data and obtain the original audio data.
  • the receiving device can play the original audio data at the time agreed with other devices, that is, multiple receiving devices can play the original audio data at the same time.
  • the transmission flow of the audio stream used for real-time sharing in the software system shown in Figure 40 is shown in Figure 41, for example.
• first, the source device (the electronic device 100) can establish a Bluetooth connection with the electronic device 200 and transmit the Bluetooth key.
  • the audio data of the above sound sources can be transmitted from the source application to the audio framework module, audio abstraction module, and encoding module in sequence.
  • the encoding module can encode the PCM original audio data of the above-mentioned audio source (such as L3 encoding).
  • the encoded audio data can be transmitted from the encoding module to the Bluetooth protocol stack.
• the Bluetooth protocol stack can encrypt the encoded audio data according to the Bluetooth key transmitted above; the encrypted audio data can be transmitted to the Bluetooth chip, and the electronic device 100 can send the audio data to the electronic device 200 through the Bluetooth chip.
  • the electronic device 200 can receive audio data through the Bluetooth chip and transmit the audio data to the Bluetooth protocol stack.
  • the Bluetooth protocol stack can decrypt the audio data based on the transmitted Bluetooth key.
  • the decrypted audio data can be sequentially transmitted to the decoding module (for example, using L3 decoding) and the codec for decoding, and the decoded original audio data can be used for playback (for example, through a speaker).
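• End to end, the Bluetooth audio path above amounts to encode-then-encrypt on the source side and decrypt-then-decode on the receiving side; the sketch below uses an identity "codec" and a one-byte XOR "key" purely as placeholders for the real L3 codec and Bluetooth link encryption:

```python
KEY = 0x5A  # stands in for the Bluetooth key transmitted over the connection

def encode(pcm):                 # placeholder for the encoding module (e.g. L3 encoding)
    return bytes(pcm)

def encrypt(data: bytes) -> bytes:   # Bluetooth protocol stack: encrypt with the key
    return bytes(b ^ KEY for b in data)

def source_device(pcm):
    """Source app -> audio framework -> audio HAL -> encoder -> BT stack -> BT chip."""
    return encrypt(encode(pcm))

def receiving_device(packet: bytes) -> bytes:
    """BT chip -> BT stack (decrypt) -> decoding module -> codec -> playback."""
    return bytes(b ^ KEY for b in packet)   # decrypt; decoding is identity here

pcm = [10, 20, 30, 40]
assert receiving_device(source_device(pcm)) == bytes(pcm)
```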
  • the source device can distribute audio data through Bluetooth broadcast (which may be referred to as data distribution).
  • data distribution can employ BLE Audio BIS technology.
  • the principle of data distribution is to repeatedly send audio data in a sequential or interleaved manner at a fixed transmission interval (such as the BIS transmission interval) to improve the success rate of broadcast reception.
• the receiving device does not need to establish a connection with the source device; it can unidirectionally receive and play the audio data sent by the source device. For specific examples, see Figure 42 below.
  • FIG. 42 exemplarily shows a schematic diagram of an audio data sending process.
  • Figure 42 illustrates this by taking the transmission interval (interval) as 20 milliseconds (ms) and each data packet being sent three times as an example.
  • when a transmission interval starts, the source device starts to send data packets, for example, one every 150 microseconds, and sequentially sends: the left-channel part of data packet 1 (which may be referred to as data packet 1-L), the right-channel part of data packet 1 (data packet 1-R), data packet 1-L, data packet 1-R, data packet 2-L, and data packet 2-R. That is to say, the source device transmits data packet 1 twice and data packet 2 once within 6.9 ms.
  • when the next transmission interval starts, the source device sends data packets again, for example, one every 150 microseconds, and sends in sequence: data packet 2-L, data packet 2-R, data packet 2-L, data packet 2-R, data packet 3-L, and data packet 3-R.
  • that is, the source device transmits data packet 2 twice and data packet 3 once within 6.9 ms.
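The repetition pattern of Figure 42 can be sketched as a schedule generator. This is a hypothetical reconstruction: the 20 ms interval and 150-microsecond sub-event spacing are taken from the text above (note that six sub-events at 150 microseconds span well under the quoted 6.9 ms, so the real schedule presumably includes additional spacing not modeled here), and the first packet appears only twice because it has no preceding interval in which to be pre-sent.

```python
INTERVAL_MS = 20.0     # BIS transmission interval from the example above
SUBEVENT_MS = 0.150    # spacing between consecutive sub-events

def bis_schedule(num_packets: int):
    """Each interval repeats the current packet's L/R parts twice and
    pre-sends the next packet's L/R parts once, so every packet (except
    the first) is on air three times in total."""
    events = []
    for n in range(1, num_packets + 1):
        start = (n - 1) * INTERVAL_MS
        burst = [(n, "L"), (n, "R"), (n, "L"), (n, "R"),
                 (n + 1, "L"), (n + 1, "R")]
        for i, (pkt, ch) in enumerate(burst):
            events.append((round(start + i * SUBEVENT_MS, 3), f"packet {pkt}-{ch}"))
    return events

for t, label in bis_schedule(2):
    print(f"{t:7.3f} ms  {label}")
```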
  • the source device can select multiple receiving devices to share the audio stream in real time, such as listening together.
  • the source device can transmit the encrypted audio data to the multiple receiving devices selected above.
  • for example, the source device can connect to the multiple receiving devices selected above through BLE and initiate security manager protocol (SMP) pairing for link encryption. Then, the source device can transmit the broadcast code (Broadcast Code) to the selected receiving devices through the encrypted links, one device at a time: after the current receiving device finishes receiving the broadcast code, the source device can disconnect from that receiving device in order to transmit the broadcast code to the next receiving device. For specific examples, see Figure 43 below.
  • Figure 43 exemplarily shows a schematic flow chart of a password transmission process.
  • Figure 43 takes as an example the source device selecting two devices among N devices (device 1, ..., device N, where N is a positive integer greater than 1) for real-time sharing.
  • the process may, but is not limited to, include the following steps:
  • 1. After receiving the real-time sharing instruction, the source device scans for devices with a high duty cycle.
  • 2. The N devices continuously send Bluetooth broadcast messages (such as BLE broadcasts) that the source device can receive.
  • 3. The source device selects device 1 and device N in response to a user operation.
  • 4. The source device requests device 1 to establish a Bluetooth connection.
  • 5. The source device and device 1 establish a Bluetooth connection (for example, a BLE connection).
  • 6. The source device pairs with device 1 based on SMP and encrypts the Bluetooth link between the source device and device 1.
  • 7. The source device sends the broadcast code to device 1 over the encrypted Bluetooth link.
  • then, the source device can transmit the broadcast code to the next receiving device (device N), that is, perform steps 8-11 in Figure 43.
  • the description of steps 8-11 is similar to that of steps 4-7 above and will not be repeated.
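The per-receiver hand-off of steps 4-11 amounts to a simple loop. In the following sketch the ble_* helpers are placeholders standing in for the real stack primitives (connect, SMP pairing, encrypted write, disconnect); they are assumptions for illustration, not APIs defined by this application.

```python
def ble_connect(dev):        print(f"connect {dev}"); return dev
def ble_smp_pair(link):      print(f"SMP pair + encrypt link to {link}")
def ble_write(link, data):   print(f"send {len(data)}-byte Broadcast Code to {link}")
def ble_disconnect(link):    print(f"disconnect {link}")

def distribute_broadcast_code(receivers, broadcast_code: bytes):
    """Hand the Broadcast Code to each selected receiver in turn."""
    for dev in receivers:
        link = ble_connect(dev)          # steps 4-5: request + establish BLE link
        ble_smp_pair(link)               # step 6: SMP pairing, encrypt the link
        ble_write(link, broadcast_code)  # step 7: code over the encrypted link
        ble_disconnect(link)             # release before the next receiver

distribute_broadcast_code(["device 1", "device N"], b"\x01" * 16)
```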
  • the synchronization mechanism of multiple receiving devices can also be implemented through Bluetooth broadcast (such as BIS broadcast); that is, multiple receiving devices play the audio data sent by the source device simultaneously.
  • for example, multiple receiving devices can obtain a first parameter (such as the presentation delay parameter Presentation_Delay) through the broadcast audio announcement service (BAP) of the source device and, after receiving the audio data sent by the source device, delay playing the audio data for a first duration, where the first duration may be determined based on the first parameter.
  • for example, the first duration is equal to the first parameter.
  • For specific examples, see Figure 44 below.
  • Figure 44 exemplarily shows a flow chart of a multi-device synchronization process.
  • Figure 44 takes multiple receiving devices as Device 1 and Device 2 as an example for illustration. The process may include, but is not limited to, the following steps:
  • 1. The source device sends Presentation_Delay to device 1 and device 2 based on BAP.
  • 2. The source device sends audio message 1 (also called broadcast audio message 1) to device 1 and device 2.
  • 3. After receiving audio message 1, device 1 delays by Presentation_Delay and then plays audio message 1.
  • 4. After receiving audio message 1, device 2 delays by Presentation_Delay and then plays audio message 1.
  • steps 3 and 4 in Figure 44 can be executed simultaneously.
  • in this way, multiple receiving devices can receive an audio data message sent by the source device at substantially the same time, and each of them delays playing the message for the first duration after receiving it, so that playback is synchronized and the user experience is better.
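The timing rule of Figure 44 reduces to "play at arrival time plus Presentation_Delay". A minimal sketch, with an assumed delay value and arrival timestamps:

```python
def playout_time(arrival_ms: float, presentation_delay_ms: float) -> float:
    """Each receiver plays a broadcast audio message this long after arrival."""
    return arrival_ms + presentation_delay_ms

presentation_delay = 40.0                            # first parameter, from BAP
arrivals = {"device 1": 100.0, "device 2": 100.1}    # ms, same broadcast message
for dev, t in sorted(arrivals.items()):
    print(f"{dev} plays at {playout_time(t, presentation_delay):.1f} ms")
```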
  • the above embodiments take a one-level sharing scenario (that is, a sharing device shares a first multimedia data stream with at least one shared device) as an example. In other embodiments, the method can also be applied to a multi-level sharing scenario, for example, a two-level sharing scenario: any one of the above at least one shared device can in turn serve as a sharing device and share a second multimedia data stream with at least one other device, where the second multimedia data stream and the first multimedia data stream can be the same or different.
  • for any level of sharing in a multi-level sharing scenario, refer to the description of one-level sharing shown in the above embodiments.
  • Figure 45A exemplarily shows a schematic diagram of a secondary sharing scenario.
  • Figure 45A takes the first level of sharing through Wi-Fi and the second level of sharing through Bluetooth as an example.
  • the electronic device 401 (the first-level device) can share in real time to second-level devices such as the electronic device 402 and the electronic device 403 through Wi-Fi broadcast, and the electronic device 402 can share the audio stream/video stream in real time to the electronic device 404 (a third-level device) through Bluetooth.
  • the electronic device 403 can share the audio stream/video stream in real time to the third-level device such as the electronic device 405 and the electronic device 406 through Bluetooth broadcast.
  • Figure 45B exemplarily shows a schematic diagram of yet another secondary sharing scenario.
  • Figure 45B illustrates this by taking the example of both first-level sharing and second-level sharing being implemented through Wi-Fi.
  • as shown in Figure 45B, the electronic device 411 (the first-level device) can share in real time to second-level devices such as the electronic device 412 and the electronic device 413 through Wi-Fi broadcast.
  • the electronic device 413 can share in real time to third-level devices such as the electronic device 414 and the electronic device 415 through Wi-Fi broadcast. This can be understood as a Wi-Fi cascade relay scenario.
  • Figure 45C exemplarily shows a schematic diagram of yet another secondary sharing scenario.
  • Figure 45C illustrates this by taking the example of both first-level sharing and second-level sharing being implemented through Bluetooth.
  • the electronic device 421 (the first-level device) can share in real time to second-level devices such as the electronic device 422 and the electronic device 423 through Bluetooth broadcast; the electronic device 422 can share in real time to the electronic device 424 (a third-level device) through Bluetooth unicast; and the electronic device 423 can share in real time to third-level devices such as the electronic device 425 and the electronic device 426 through Bluetooth broadcast.
  • Figure 45D exemplarily shows a schematic diagram of a three-level sharing scenario.
  • Figure 45D takes as an example first-level sharing through far-field communication methods such as NewTalk and satellite, and second-level and third-level sharing through near-field communication methods such as Wi-Fi, D2D, and BT.
  • the electronic device 431 (the first level device) can share in real time to the electronic device 432 (the second level device) via NewTalk link or auxiliary link unicast.
  • the electronic device 432 can share in real time to the electronic device 433 (third-level device) through D2D unicast, and can also share in real time to third-level devices such as the electronic device 434 through Bluetooth broadcast.
  • the electronic device 434 (the third-level device) can share in real time to the fourth-level devices such as the electronic device 435 and the electronic device 436 through Wi-Fi broadcast.
  • in other examples, the combination can also be far-field sharing + far-field sharing + near-field sharing; far-field sharing + near-field sharing + far-field sharing; near-field sharing + far-field sharing + near-field sharing; near-field sharing + near-field sharing + far-field sharing; or near-field sharing + far-field sharing + far-field sharing. This application does not limit this.
  • in other examples, the electronic device 401 (the first-level device) can also share in real time to the electronic device 402 (the second-level device) through Wi-Fi unicast.
  • that is, any level of sharing can be implemented through unicast, multicast, or broadcast.
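The cascades of Figures 45A-45D share one structure: every node plays what it receives and forwards it to its own downstream nodes. The sketch below abstracts away the link type (Wi-Fi/Bluetooth, unicast/broadcast) and uses the device numbering of Figure 45A purely as labels.

```python
class Node:
    """A device in a multi-level sharing tree."""
    def __init__(self, name):
        self.name = name
        self.downstream = []    # devices this node re-shares to

    def receive(self, frame):
        print(f"{self.name} plays {frame}")
        for child in self.downstream:   # a shared device acting as a sharer
            child.receive(frame)

d401, d402, d404 = Node("device 401"), Node("device 402"), Node("device 404")
d401.downstream = [d402]    # first-level share (e.g. Wi-Fi)
d402.downstream = [d404]    # second-level share (e.g. Bluetooth)
d401.receive("frame 1")
```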
  • in some embodiments, the electronic device performing real-time sharing can adjust the code rate for encoding/decoding the multimedia data shared in real time according to the network environment: for example, when the network bandwidth is larger, the code rate can be larger, and when the network bandwidth is smaller, the code rate can be smaller. That is, dynamic audio/video code rates are supported to adapt to the network. It is not limited to this.
  • the code rate for encoding/decoding the multimedia data shared in real time can also be adjusted according to the power/power consumption of the electronic device, the requirements for the output effect, and the like, which is not limited in this application. This can balance user experience and device power consumption in various scenarios and improve device availability.
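One way to read the adaptation rule above is as a small policy function: pick a rate from the measured bandwidth, then cap it for power or output-quality constraints. All thresholds below are illustrative assumptions, not values from this application.

```python
def pick_bitrate_kbps(bandwidth_kbps: float, battery_pct: int,
                      high_quality: bool) -> int:
    rate = min(320, int(bandwidth_kbps * 0.8))   # leave headroom on the link
    if battery_pct < 20:
        rate = min(rate, 96)                     # trade quality for power
    if not high_quality:
        rate = min(rate, 128)                    # relaxed output requirements
    return max(rate, 32)                         # floor to keep audio usable

print(pick_bitrate_kbps(1000, battery_pct=80, high_quality=True))   # -> 320
print(pick_bitrate_kbps(150, battery_pct=15, high_quality=True))    # -> 96
```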
  • in some embodiments, the sharing device can share received 3G/4G/5G/6G broadcast data, such as multicast and broadcast service (MBS) data, in real time, for example, through the 3G/4G/5G/6G broadcast module shown in Figure 2C to Figure 2E.
  • NR broadcast/multicast technology can transmit user services in a point-to-multipoint manner by sharing radio and transport resources, allowing one service flow to cover as many users as possible, thereby effectively improving network resource utilization while also improving users' service experience and reducing the poor experience caused by resource congestion.
  • Figure 46A is a schematic architectural diagram of an NR communication system provided by this application.
  • FIG. 46A exemplarily shows a communication scenario of NR broadcast/multicast and a communication scenario of NR unicast.
  • the NR communication system shown in FIG. 46A may include a broadcast platform 4611, a core network device 4612, a core network device 4613, a base station 4614, a base station 4615, a plurality of user equipments (UEs) 4616, and a plurality of UEs 4617.
  • the broadcast platform 4611, the core network equipment 4612, the base station 4614 and multiple UEs 4616 can implement unicast communication.
  • the broadcast platform 4611, core network equipment 4613, base station 4615 and multiple UEs 4617 can implement broadcast/multicast communication.
  • the broadcast platform 4611 may be a network device, such as a service server that provides 5G broadcast data and related services.
  • the broadcast platform 4611 may also be called a broadcast service server.
  • for the description of any UE, refer to the description of the electronic device shown in the above embodiments.
  • the broadcast platform 4611 can perform unicast communication with any one of the multiple UEs 4616 (taking UE4616A as an example for illustration) through the core network device 4612 and the base station 4614.
  • the broadcast platform 4611 can send data to the base station 4614 through the core network equipment 4612.
  • after receiving the data, the base station 4614 sends it to UE4616A, that is, performs downlink transmission in a point-to-point manner.
  • the broadcast platform 4611 can perform unicast communication with multiple UEs 4616 respectively.
  • the multiple UEs 4616 can use different bearers respectively. For example, 3 UEs 4616 use 3 bearers.
  • UE4616A can also perform uplink transmission with the base station 4614, the core network equipment 4612, or the broadcast platform 4611 in a point-to-point manner. The specific description is similar and will not be described again.
  • at least one device among the broadcast platform 4611, the core network device 4612, and the base station 4614 can sense the UE 4616A.
  • the broadcast platform 4611 can perform broadcast communication or multicast communication with the multiple UEs 4617 through the core network device 4613 and the base station 4615.
  • the broadcast platform 4611 can send data to the base station 4615 through the core network equipment 4613.
  • the base station 4615 can send data to multiple UEs 4617, that is, perform downlink transmission in a point-to-multipoint manner.
  • multiple UEs 4617 can use the same bearer; for example, 3 UEs 4617 share one bearer.
  • in some examples, during broadcast communication, uplink transmission may not be performed.
  • in this case, the broadcast platform 4611, the core network device 4613, and the base station 4615 may not sense the multiple UEs 4617.
  • in other examples, uplink transmission can be performed during multicast communication.
  • for example, UE4617A can perform uplink transmission with the base station 4615, the core network device 4613, or the broadcast platform 4611 in a point-to-point manner.
  • in this case, at least one device among the broadcast platform 4611, the core network device 4613, and the base station 4615 may sense the multiple UEs 4617.
  • any one of the multiple UEs 4617 performing broadcast/multicast communication with the broadcast platform 4611 can share the received broadcast/multicast data with at least one other UE in real time. For example, as shown in Figure 46A, UE4617A can share the broadcast/multicast data sent by the base station 4615 with UE4618 in real time.
  • UE4617A has the ability to receive 3G/4G/5G/6G broadcast data (for example, UE4617A includes a modem), while UE4618 may or may not have the ability to receive 3G/4G/5G/6G broadcast data.
  • UE4618 is, for example but not limited to, in any of the following situations:
  • Case 1: UE4618 has the ability to receive 3G/4G/5G/6G broadcast data, but the base station serving UE4618 does not support 3G/4G/5G/6G broadcast data.
  • Case 2: UE4618 has the ability to receive 3G/4G/5G/6G broadcast data, but the base station serving UE4618 and the base station serving UE4617A are different (for example, they belong to different operators).
  • Case 3: UE4618 has the ability to receive 3G/4G/5G/6G broadcast data, but the base station serving UE4618 and the base station serving UE4617A (which, for example, belong to the same operator) are far apart.
  • Case 4: UE4618 does not have the ability to receive 3G/4G/5G/6G broadcast data.
  • in the above cases, UE4618 cannot receive and play the channel data that UE4617A can receive, but UE4617A can share the received channel data with UE4618 in real time, so that the user of UE4618 can watch/listen to the channels that UE4617A can receive. The function is not limited by environment and equipment, the usage scenarios are broader, and the user experience is better.
  • next, the broadcast/multicast communication scenario is introduced as an example.
  • Figure 46B is an architectural schematic diagram of yet another NR communication system provided by this application.
  • the NR communication system shown in Figure 46B may include a broadcast service server (for example, the broadcast platform 4611 shown in Figure 46A), a core network, a radio access network (RAN) and n UEs (UE1, UE2, ..., UEn, n is a positive integer greater than 1).
  • the core network may include at least one core network device, for example, the core network device 4612 and the core network device 4613 shown in Figure 46A.
  • the RAN may include at least one access network device, including, for example, base station 4614 and base station 4615 shown in Figure 46A.
  • for the description of any UE, refer to the description of the electronic device shown in the above embodiments.
  • the broadcast service server can notify the core network of the broadcast start.
  • the core network can notify the RAN of the broadcast start, for example but not limited to by sending information such as the service ID and the cell list.
  • the RAN can return a broadcast response to the core network.
  • the RAN can configure broadcast channels for the n UEs through the multicast control channel (MCCH).
  • the configured channels may be one or more, the broadcast data corresponding to different channels may be different, and the UE may receive the broadcast data corresponding to the configured channels.
  • the broadcast service server can send the broadcast data corresponding to channel 1 (which may be referred to as channel 1 data) to the RAN through the core network, and the RAN can send the channel 1 data to the n UEs through the multicast traffic channel (MTCH).
  • the broadcast service server can also send channel 2 data to the RAN through the core network, and the RAN can send channel 2 data to n UEs through the MTCH.
  • the order of the broadcasting process of channel 1 data and that of channel 2 data is not limited. In some examples, the MTCH used by the RAN when transmitting channel 1 data to the n UEs and the MTCH used when transmitting channel 2 data may be different.
  • the channel data received by a UE can be determined in response to a user operation. For example, the user can choose to have the UE receive channel 1 data but not channel 2 data.
  • any one UE among the n UEs mentioned above can share the received channel data with at least one other UE in real time.
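The MCCH/MTCH flow above can be sketched as UEs filtering point-to-multipoint traffic by the channels their users selected. The class below is a toy model; channel numbers and payloads are illustrative.

```python
class UE:
    def __init__(self, name, wanted_channels):
        self.name = name
        self.wanted = set(wanted_channels)   # channels chosen by the user

    def on_mtch(self, channel, data):
        """Consume only broadcast data for channels the user selected."""
        if channel in self.wanted:           # e.g. receive channel 1, skip 2
            print(f"{self.name} consumes channel {channel}: {data}")

ues = [UE("UE1", [1]), UE("UE2", [1, 2])]
for ch, payload in [(1, "ch1-frame"), (2, "ch2-frame")]:
    for ue in ues:                           # point-to-multipoint delivery
        ue.on_mtch(ch, payload)
```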
  • Figure 46C is an architectural schematic diagram of another NR communication system provided by this application.
  • the NR communication system shown in Figure 46C may include UE4617A and UE4618 in Figure 46A, and UE4617A can share received 3G/4G/5G/6G broadcast data in real time.
  • the following description takes the case where the 3G/4G/5G/6G broadcast data is channel 1 data as an example.
  • UE4617A may include an application processor (AP), a modem processor (modem), and a wireless communication module.
  • the wireless communication module including a cellular communication module, a Wi-Fi communication module, a Bluetooth communication module, and a satellite communication module is taken as an example for illustration; in a specific implementation, the wireless communication module may include more or fewer communication modules. Specifically:
  • the application processor can include a broadcast/multicast application (APP), a sharing transmission module, a transport protocol stack, a broadcast/multicast network card (MBS network, MBSNET), an A-core data service (ADS), a display driver, a playback driver, and a capture module.
  • the broadcast/multicast APP may be an APP used to implement MBS (such as a call) and may include a module for providing user experience (UI/UX) display, a module for providing service logic, a transmission module, and a codec module, where the transmission module can be used to receive 3G/4G/5G/6G broadcast data from the transport protocol stack and send it to the codec module.
  • the codec module can be used to encode or decode the received 3G/4G/5G/6G broadcast data.
  • the decoded data can be played on the broadcast/multicast APP.
  • the sharing transmission module can be used to realize real-time sharing of multimedia data streams to other UEs.
  • the transport protocol stack is, for example, the TCP/IP protocol stack.
  • the display driver can be used to call display modules such as display screens to implement display functions.
  • the playback driver can be used to call audio modules such as speakers to implement audio playback functions.
  • the capture module can be used to capture decoded multimedia data streams that can be played directly, such as capturing multimedia data streams that are being played.
  • the modem processor may include an NR protocol stack, a C-core data service (CDS), and a broadcast/multicast service (MBS) module, where the NR protocol stack may include a medium access control (MAC) layer, a radio link control (RLC) layer, and a packet data convergence protocol (PDCP) layer.
  • UE4617A can receive channel 1 data through the 3G/4G/5G/6G broadcast module in the cellular communication module. The 3G/4G/5G/6G broadcast module can then send the channel 1 data to the modem processor. In the modem processor, the channel 1 data can be transmitted to the MAC layer, the RLC layer, and the PDCP layer in sequence. The PDCP layer then sends the channel 1 data to the CDS, and the CDS sends the channel 1 data to the application processor. In the application processor, the channel 1 data can be transmitted to the ADS, the MBSNET, and the transport protocol stack in sequence.
  • UE4617A can obtain channel 1 data from the application processor and share the channel 1 data to UE4618 in real time.
  • the methods by which UE4617A obtains the channel 1 data from the application processor may include, but are not limited to, the following three:
  • Method 1: UE4617A can obtain the decoded channel 1 data from the broadcast/multicast APP.
  • for example, the transport protocol stack can send the channel 1 data to the transmission module in the broadcast/multicast APP, and the transmission module then sends the channel 1 data to the codec module for decoding.
  • the decoded channel 1 data can be sent to the sharing transmission module, which shares it with UE4618 in real time.
  • Method 2: UE4617A can directly obtain the channel 1 data, before decoding, from the transport protocol stack.
  • for example, the transport protocol stack can send the channel 1 data to the sharing transmission module, and the sharing transmission module shares it with UE4618 in real time.
  • Method 3: UE4617A can capture the decoded and displayed and/or played channel 1 data through the capture module.
  • for example, the transport protocol stack can send the channel 1 data to the transmission module in the broadcast/multicast APP, and the transmission module then sends the channel 1 data to the codec module for decoding.
  • the decoded channel 1 data can be sent to the display driver and/or the playback driver for output (display and/or playback).
  • the capture module captures the output multimedia data stream and transmits it to the sharing transmission module, which shares it with UE4618 in real time.
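The three acquisition points can be expressed as one dispatch function. All stages here are stubs; decode and capture_output merely mark where the codec module and the capture module would act.

```python
def decode(data):         return f"decoded({data})"
def capture_output(data): return f"captured({data})"

def data_for_sharing(channel_data, method):
    """Pick the data that the sharing transmission module forwards to UE4618."""
    if method == 1:   # Method 1: decoded frames from the broadcast/multicast APP
        return decode(channel_data)
    if method == 2:   # Method 2: still-encoded data from the protocol stack
        return channel_data
    if method == 3:   # Method 3: frames grabbed by the capture module at output
        return capture_output(decode(channel_data))
    raise ValueError("method must be 1, 2 or 3")

for m in (1, 2, 3):
    print(m, data_for_sharing("ch1-frame", m))
```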
  • in addition, the modem processor can route, in the CDS, the channel 1 data handed over by the sharing transmission module, so as to send the channel 1 data to UE4618 through the corresponding communication method.
  • when UE4617A and UE4618 communicate through cellular communication, the channel 1 data can be sent to UE4618 using transmission mode 1 shown in Figure 46C.
  • for example, the CDS can perform IP packetization on the channel 1 data; the processed channel 1 data passes through the PDCP layer, the RLC layer, and the MAC layer in sequence, and finally the cellular communication module sends the channel 1 data to UE4618.
  • when UE4617A and UE4618 communicate through Wi-Fi, the channel 1 data can be sent to UE4618 using transmission mode 2 shown in Figure 46C.
  • for example, the CDS can send the channel 1 data to the Wi-Fi communication module (for example, including a Wi-Fi chip), and the Wi-Fi communication module can perform IP packetization on the channel 1 data, replace the IP header, and send the processed channel 1 data to UE4618.
  • alternatively, the CDS can perform IP packetization on the channel 1 data and replace the IP header, and then send the processed channel 1 data to the Wi-Fi communication module, which then sends it to UE4618.
  • similarly, in some examples, the channel 1 data can be sent to UE4618 using transmission mode 3 shown in Figure 46C.
  • when UE4617A and UE4618 communicate through Bluetooth, transmission mode 4 shown in Figure 46C can be used to send the channel 1 data to UE4618.
  • the descriptions of transmission mode 3 and transmission mode 4 are similar to that of transmission mode 2 and will not be repeated.
  • UE4617A and UE4618 can also communicate through other communication methods and can use the corresponding transmission modes to transmit the channel 1 data, which is not limited in this application.
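The mode selection of Figure 46C is essentially a routing switch keyed by how the two UEs are connected. In the sketch below the mode numbers follow the figure; mapping mode 3 to the satellite module is an assumption drawn from the module list above, since the text leaves mode 3's carrier unspecified.

```python
def send_channel_data(route: str, data: bytes) -> str:
    """Dispatch channel 1 data to the module that carries it to UE4618."""
    if route == "cellular":    # mode 1: CDS packetizes; PDCP -> RLC -> MAC -> radio
        return f"mode 1: cellular module carries {len(data)} bytes"
    if route == "wifi":        # mode 2: Wi-Fi module rewrites the IP header
        return f"mode 2: Wi-Fi module carries {len(data)} bytes"
    if route == "satellite":   # mode 3 (assumed carrier): analogous to mode 2
        return f"mode 3: satellite module carries {len(data)} bytes"
    if route == "bluetooth":   # mode 4: analogous to mode 2
        return f"mode 4: Bluetooth module carries {len(data)} bytes"
    raise ValueError(f"unsupported route: {route}")

print(send_channel_data("wifi", b"ch1"))
```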
  • in some embodiments, the application processor of the sharing device does not need to be woken up (for example, there is no need to decode the 3G/4G/5G/6G broadcast data, play the 3G/4G/5G/6G broadcast data, and so on); the received 3G/4G/5G/6G broadcast data is sent directly to the shared device through the modem processor, which provides a low-power transmission mode, reduces device power consumption, and improves device availability.
  • in addition, the shared device does not need to run the broadcast/multicast APP used to play 3G/4G/5G/6G broadcast data in the foreground, and does not need to have the ability to decode and play 3G/4G/5G/6G broadcast data, which broadens the application scenarios and makes the user experience better.
  • Scenario 1: During an operator call between friends, one party sees an interesting video (such as a movie, a TV show, or a short video) or audio (such as music) and wants to watch the video or listen to the audio together with the other party. That party can initiate real-time sharing so that both parties on the call can watch/listen and discuss at the same time. For specific examples, see the description of the watching-together real-time sharing scenario above. This solves the current problem that operators cannot share audio/video streams in real time during calls and improves user experience.
  • Scenario 2: During an operator call between a subordinate and a leader, the subordinate needs to report/share document materials (such as documents in word, excel, or PPT format) with the leader, explain the content of the document materials line by line/page by page, and make modifications based on the leader's opinions. The leader wants to see the modification results synchronously, but the subordinate does not want the leader to see the images/audio of other applications on the subordinate's mobile phone or other electronic device. Therefore, the subordinate can share only the application of the document materials in real time, so that the leader can not only see the modification results synchronously but also modify the document materials. For specific examples, see the description of the editing-together real-time sharing scenario above, which makes the use more flexible.
  • Scenario 3: When a consumer calls customer service to ask how to use a purchased item, the customer service agent can initiate real-time sharing and share guidance videos, pictures, or documents with the consumer, reducing the time and energy spent on the call.
  • Scenario 4: When their children are taking online classes, working parents may hope to watch the children's online classes during the company lunch break or on the way to and from work, so as to supervise the children remotely, see the class content, give explanations and comments on the parts the children have questions about, and guide the children remotely. The children can then share the audio stream/video stream of the online class application in real time through the call, and at the same time share the sound collected by the microphone and the face images collected by the camera. For specific examples, see Figure 15A and Figure 15B.
  • Scenario 5: Elderly people at home may not know how to use certain things; for example, they cannot access the Internet with their mobile phones, and their children living elsewhere need to guide them remotely through a call. The children can actively initiate a real-time sharing request, and after the elderly accept the request, the screen content of the elderly person's mobile phone is shared automatically, so the children can guide the elderly more conveniently and quickly.
  • Scenario 6: During a family dinner, a user can share the real-time images of an application with multiple nearby users through near-field communication methods such as Wi-Fi. This can be understood as a near-field one-to-any real-time sharing scenario.
  • Scenario 7: At a gathering of friends, a user can share the real-time images of a game application with multiple nearby users through near-field communication, and/or play the same game together (for specific examples, see Figure 23A-Figure 23C). This can be understood as a near-field one-to-any (1-to-any) real-time sharing scenario.
  • Scenario 8: When holding a business meeting in the same conference room, a user can share the content of document materials with multiple nearby users through near-field communication, and they can even edit the document materials together. This can be understood as a near-field one-to-any (1-to-any) real-time sharing scenario.
  • Scenario 9: In a near-field one-to-any real-time sharing scenario (taking Scenario 6 as an example), for reasons of privacy and security, the sharing user wants to share part or all of a video with other people but does not want to send the source file of the video to them. The sharing user can therefore play the video on his or her own device, share it in real time through near-field communication when the content to be shared is playing, and stop the real-time sharing when content that should not be shared is playing. Moreover, the sharing user may not allow the shared users to save or forward the video shared in real time. For specific examples, see Figure 15D and Figure 16A-Figure 16E.
  • Scenario 10: Audio can be shared in real time one-to-one, or one piece of audio can be shared with multiple headphones for one-to-many playback, allowing listening together without sending audio files. In addition, the sharing user can disallow the shared users from saving or forwarding the music shared in real time, to protect the copyright of the music.
  • Scenario 11: At an outdoor party or outdoor square dance, one piece of audio can be shared with multiple speaker devices through near-field broadcasting, avoiding the disturbance to others caused by one large speaker playing at high volume while keeping the atmosphere lively.
  • Scenario 12: A leader calls a subordinate and wants to share document materials in real time with the subordinate and with colleagues near the subordinate, but it is not convenient to send the document materials directly. The leader can therefore share the application of the document materials with the subordinate in real time, and the subordinate can then share it in real time with nearby colleagues, so that the nearby colleagues can watch through their own devices. There is no need for multiple people to gather around the subordinate and share the subordinate's device (such as a mobile phone or another small mobile terminal) to watch. For a specific example, see Figure 45C. This can greatly improve the users' viewing experience.
  • display in this application can be replaced by other output methods, such as playing through a speaker.
  • playback in this application can also be replaced by other output methods, such as display through a display screen.
  • output in this application includes not only output through modules such as the display screen of the device itself, but also output through modules such as the display screens of other devices connected to the device.
  • the microphone in this application can be replaced by other modules that can collect audio/voice/sound.
  • the camera in this application can be replaced by other modules that can shoot/collect images.
  • the methods provided by the embodiments of this application can be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • when the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a user equipment, or another programmable apparatus.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media.
  • the available media can be magnetic media (for example, floppy disks, hard disks, and tapes), optical media (for example, digital video discs (DVD)), or semiconductor media (for example, solid state disks (SSD)), and the like.

Abstract

This application provides a sharing method, an electronic device, and a system. The method is applied to a first device and includes: displaying a first interface, where the first interface is used to indicate that an operator call with a second device is currently in progress; displaying a second interface of a first application during the operator call with the second device; receiving a user operation acting on the second interface; and sending first data to the second device, where the first data is used by the second device to output multimedia data related to the second interface. This application enables users to realize real-time sharing functions such as watching together and listening together with the other party on a call in a simpler and faster operation manner, effectively meeting user needs and improving user experience.

Description

Sharing method, electronic device, and system
This application claims priority to the Chinese patent application No. 202210867898.7, entitled "Sharing method, electronic device and system" and filed with the Chinese Patent Office on July 22, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, and in particular to a sharing method, an electronic device, and a system.
Background
At present, the function of sharing multimedia data streams such as audio streams and video streams in real time cannot be realized. For example, a user on a phone call often shares content by describing it orally, and the other party on the call cannot watch/listen to the relevant content; alternatively, what a user achieves through near-field communication technologies such as Bluetooth and near field communication (NFC) is file-based sharing.
Summary
This application discloses a sharing method, an electronic device, and a system, which enable users to realize real-time sharing functions such as watching together, listening together, playing together, and editing together with at least one call counterpart, nearby users, and the like in a simpler and faster operation manner.
In a first aspect, this application provides a sharing method applied to a first device. The method includes: displaying a first interface, where the first interface is used to indicate that an operator call with a second device is currently in progress; displaying a second interface of a first application during the operator call with the second device; receiving a first user operation acting on the second interface; and sending first data to the second device, where the first data is used by the second device to output multimedia data related to the second interface.
In the above method, when the first device and the second device are conducting an operator call, the first device can, according to the received first user operation, send the first data related to the first application running in the foreground to the second device, so that the second device outputs multimedia data related to the interface of the first application. This solves the problem that multimedia data streams cannot be shared in real time during an operator call, lets users watch together and listen together with the other party on the call in a simpler and faster manner, meets user needs, and improves user experience.
In a possible implementation, the first interface and the second interface include a first floating window, and the first user operation is a user operation acting on a sharing control in the first floating window; alternatively, the first user operation is a user operation of sliding along a first trajectory.
In the above method, there can be multiple types of first user operations for triggering the sending of the first data to the second device, and the user can choose which type of first user operation to perform as needed, meeting different user needs and improving user experience.
In a possible implementation, before the sending the first data to the second device, the method further includes: while displaying the second interface, capturing multimedia data related to the second interface, where the first data includes the multimedia data related to the second interface.
In the above method, the first data includes the multimedia data, such as the audio stream and video stream, related to the second interface output by the first device and captured by the first device itself. Therefore, after receiving the first data, the second device can directly output the multimedia data related to the second interface without installing or adapting the first application; that is, multimedia data of any application can be shared in real time, which broadens the application scenarios and improves user experience.
In a possible implementation, the sending first data to the second device includes: sending the call data of the operator call and the first data to the second device through the main link of the operator call.
In a possible implementation, the sending first data to the second device includes: sending the call data of the operator call to the second device through the main link of the operator call, and sending the first data to the second device through the data path of the operator call.
In a possible implementation, the sending first data to the second device includes: sending the call data of the operator call to the second device through the main link of the operator call, and sending the first data to the second device through an auxiliary link.
For example, the auxiliary link is a network address translation (NAT) traversal link or a relay link.
For example, the physical channel of the auxiliary link is a cellular communication link, a wireless fidelity (Wi-Fi) link, a Bluetooth (BT) link, a device-to-device (D2D) link, or a satellite link.
In the above method, when the first device and the second device are conducting an operator call, the first device can send the first data shared in real time through the main link of the operator call, the data path, or a related auxiliary link. Therefore, the first device and the second device do not need to install applications such as chat applications or conference applications for realizing real-time sharing of multimedia data, and the user can quickly share multimedia data in real time based on the ongoing operator call, which broadens the application scenarios and improves user experience.
In a possible implementation, before the sending the first data to the second device, the method further includes: sending a first request message to a network device, where the first request message includes identification information of the second device; receiving a session identifier of the second device sent by the network device based on the first request message; and establishing the auxiliary link with the second device according to the session identifier of the second device.
For example, the identification information includes a phone number, an over-the-top (OTT) identifier (ID), or a network account.
In the above method, even if the first device has not stored the session identifier of the second device, it can obtain the session identifier of the second device through the existing identification information of the second device, thereby establishing the auxiliary link with the second device. The types of identification information are diverse, which increases the probability of successfully establishing the auxiliary link and broadens the application scenarios.
In a possible implementation, the sending first data to the second device includes: displaying a third interface, where the third interface includes information about multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
In the above method, the shared device (that is, the second device) that performs real-time sharing with the first device can be determined in response to a user operation, so the user can use the function more flexibly, improving user experience.
For example, the second device is a device connected to the first device through near-field communication. This solves the problem that multimedia data streams cannot be shared in real time in near-field communication scenarios, and lets users watch together and listen together with nearby devices in a simpler and faster manner, meeting user needs and improving user experience.
In a possible implementation, the multiple devices include at least one of the following: discovered devices, connected devices, the device with which the most recent operator call was conducted, devices whose identification information is stored, and devices recognized from captured images.
For example, the connected devices include the second device, devices connected through near-field communication, and devices connected through far-field communication. For example, the discovered devices include devices discovered through near-field communication and devices discovered through far-field communication. For example, the captured images include images captured by the first device and/or images captured by a device connected to the first device.
In the above method, the types of shared devices available for the user to select are diverse, meeting the user's need to share multimedia data in real time with different devices and improving user experience.
In a possible implementation, the sending first data to the second device includes: displaying a fourth interface, where the fourth interface includes information about multiple windows; receiving a third user operation acting on a first window among the multiple windows, where the first window includes the content of the second interface; and sending the first data to the second device.
In the above method, the content shared in real time by the first device (that is, the first data) can be determined in response to a user operation, so the user can use the function more flexibly, improving user experience.
In a possible implementation, the multiple windows include at least one of the following: a window of a foreground application, a window of a background application, and a window of an application that is installed on the first device but not running.
In the above method, the content to be shared that is available for the user to select can be multimedia data of a foreground application, multimedia data of a background application, or multimedia data of an application that is installed on the first device but not running, meeting the user's need to share different multimedia data in real time and improving user experience.
In a possible implementation, the sending first data to the second device includes: displaying a fifth interface, where the fifth interface includes multiple sharing modes; receiving a fourth user operation acting on a first mode among the multiple sharing modes; displaying a sixth interface, where the sixth interface includes information about multiple windows and multiple devices, and the multiple windows and the multiple devices are determined according to the first mode; receiving a fifth user operation acting on a second window among the multiple windows and a sixth user operation acting on the second device among the multiple devices, where the second window includes the content of the second interface; and sending the first data to the second device according to the fifth user operation and the sixth user operation.
For example, the first mode is watching together, the multiple windows include windows of video applications, and the multiple devices include devices configured with a display screen (such as mobile phones and tablet computers). For example, the first mode is listening together, the multiple windows include windows of music applications, and the multiple devices include devices configured with a speaker (such as earphones and sound boxes).
In the above method, the shared devices and the content to be shared displayed by the first device for the user to select can be determined according to the sharing mode selected by the user, thereby filtering out shared devices and content that the user does not want to select, preventing such information from affecting the user's selection and improving user experience.
In a possible implementation, before the sending the first data to the second device, the method further includes: receiving a seventh user operation; and in response to the seventh user operation, determining that the type of the shared data is a first type, where when the first type is audio, the first data includes audio data related to the second interface; when the first type is image, the first data includes video data related to the second interface; and when the first type is audio and image, the first data includes audio data and video data related to the second interface.
In the above method, the user can select the type of the shared content, that is, select whether the type of the first data is audio, image, or audio and image, meeting the user's personalized needs and improving user experience.
In a possible implementation, the first data includes the video data related to the second interface, and the method further includes: receiving an eighth user operation acting on the second interface and sliding along a second trajectory; and sending second data to the second device, where the second data includes the audio data related to the second interface.
For example, the first trajectory is a W-shaped trajectory, and the second trajectory is an L-shaped trajectory.
In the above method, the user can trigger the sharing of different types of content by performing different user operations, making the operation simpler and more convenient and improving user experience.
In a possible implementation, before the sending the first data to the second device, the method further includes: receiving a ninth user operation for selecting a first area in the second interface, where the first data includes multimedia data related to the first area.
In the above method, the user can choose to share multimedia data related to a partial area of the second interface, that is, the user can quickly share multimedia data of any area, meeting the user's personalized needs and improving user experience.
In a possible implementation, before the sending the first data to the second device, the method further includes: receiving a tenth user operation for selecting a first layer in the second interface, where the first data includes multimedia data related to the first layer.
In the above method, the user can choose to share multimedia data related to some layers of the second interface, that is, the user can quickly share multimedia data of any layer, meeting the user's personalized needs and improving user experience.
In a possible implementation, the sending first data to the second device includes: sending the first data to the second device when the first application is not a preset application, where the security level of the preset application is higher than a first level.
For example, the preset application includes an application determined by the first device in response to a user operation. For example, the preset application includes an application determined by the first device according to a preset rule. For example, the preset application includes a banking application and/or a payment application.
In the above method, the first device may refrain from sharing multimedia data of preset applications whose security level is higher than the first level, effectively protecting the user's privacy and security.
In a possible implementation, the sending first data to the second device includes: recognizing that the security level of data related to a second area in the second interface is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
For example, the data related to the second area includes data determined by the first device in response to a user operation. For example, the data related to the second area includes data determined by the first device according to a preset rule. For example, the data related to the second area includes a user name, a password, an account name, a login name, an ID card number, a bank card number, and an account balance.
In the above method, the first device may refrain from sharing data whose security level is higher than the second level, effectively protecting the user's privacy and security.
In a possible implementation, the displaying a second interface of a first application includes: receiving broadcast data of a first channel sent by a network device; and displaying the second interface according to the broadcast data of the first channel.
In a possible implementation, the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel; receiving an eleventh user operation; and sending the broadcast data of the second channel to a third device, where the broadcast data of the second channel is used by the third device to output the audio and/or video of the second channel.
In the above method, the first device may not output the received broadcast data of the second channel, but instead, in response to a user operation, directly send the broadcast data of the second channel to the third device. The application processor of the first device does not need to be woken up to process the broadcast data of the second channel, thereby reducing device power consumption. Moreover, the first device does not need to have the ability to decode and play broadcast data, which broadens the application scenarios and makes the user experience better.
In a possible implementation, the sending first data to the second device includes: sending the first data and third data to the second device, where the third data includes audio data collected by the first device through a microphone and/or image data collected by the first device through a camera.
In the above method, the multimedia data sent by the first device to the second device can be superimposed with the audio data collected by the microphone and/or the image data collected by the camera, so that the user of the second device can watch/listen to the application data while watching the other party and/or listening to the other party's explanation, meeting the user's personalized needs and improving user experience.
In a possible implementation, the method further includes: receiving a twelfth user operation; in response to the twelfth user operation, determining not to grant the second device the permission to save the first data or the permission to forward the first data; receiving a second request message sent by the second device, where the second request message is used to request to save and/or forward the first data; and displaying first prompt information according to the second request message.
In the above method, the first device can be set to disallow the second device from saving and forwarding the first data. When the second device needs to save or forward the first data, it can request permission from the first device, thereby preventing the second device from re-distributing the first data shared by the first device without the knowledge of the user of the first device, and improving the protection of the user's privacy and security.
In a possible implementation, the method further includes: receiving a third request message sent by the second device, where the third request message is used to request to share multimedia data with the first device in real time; displaying second prompt information according to the third request message; receiving a thirteenth user operation, where the thirteenth user operation is used to accept the request indicated by the third request message; receiving fourth data sent by the second device; and outputting the fourth data.
In the above method, when the first device shares the first data with the second device, the second device can also share multimedia data with the first device, that is, two-way sharing is realized, meeting users' personalized real-time sharing needs and improving user experience.
In a possible implementation, the outputting the fourth data includes: displaying a seventh interface according to the fourth data, where the second device displays the content of the second interface while the first device displays the seventh interface; alternatively, the outputting the fourth data includes: displaying the second interface and an eighth interface in split screen, where the eighth interface is determined according to the fourth data.
In the above method, while the first device displays the content shared by the second device, the second device can display the content shared by the first device, that is, "you watch mine, I watch yours"; alternatively, the first device can display the content shared by itself and the content shared by the second device in split screen. The display modes are flexible and diverse, meeting users' different needs in different scenarios.
In a possible implementation, after the receiving the fourth data sent by the second device, the method further includes: receiving a fourteenth user operation; and sending the fourth data to a fourth device, so that the fourth device outputs the fourth data.
In the above method, the first device can re-share the fourth data shared by the second device with other devices, meeting users' personalized real-time sharing needs and improving user experience.
In a possible implementation, the sending first data to the second device includes: sending the first data to the second device through a first link and a second link, where the first link is a cellular communication link or an auxiliary link, the second link includes at least one of the following: a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, a device-to-device (D2D) link, a cellular communication link, and an auxiliary link, and the first link is different from the second link.
In the above method, the first device can transmit the first data through different transmission paths of different communication methods together, for example, transmitting the first data once through the first link and once more through the second link. This can be understood as realizing redundant packet supplementation, preventing the second device from failing to receive valid first data due to the instability of a certain link, and improving transmission quality.
In a possible implementation, the method further includes: displaying a ninth interface, where the ninth interface includes information about multiple user interfaces run by the first device; receiving a fifteenth user operation acting on a first control in the ninth interface, where the first control is related to a tenth interface among the multiple user interfaces; and sending fifth data to a fifth device, where the fifth data is used by the fifth device to output multimedia data related to the tenth interface.
For example, the ninth interface is the user interface of a multi-task list.
In the above method, the user can trigger the sharing of multimedia data related to one of the tasks (the tenth interface) based on the user interface of the multi-task list. The ways of triggering real-time sharing are diverse, meeting users' different needs in different scenarios and improving user experience.
In a possible implementation, the method further includes: displaying an eleventh interface, where the eleventh interface includes information about multiple functions of the control center; receiving a sixteenth user operation acting on a second control in the eleventh interface, where the second control is related to a sharing function among the multiple functions; and sending sixth data to a sixth device, where the sixth data is used by the sixth device to output multimedia data of the foreground application of the first device.
For example, the eleventh interface is the user interface of the control center displayed by the first device in response to a user operation of sliding down from the upper edge of the screen.
In the above method, the user can trigger real-time sharing based on the user interface of the control center. The triggering modes are diverse, meeting users' different needs in different scenarios and improving user experience.
In a second aspect, this application provides another sharing method applied to a first device. The method includes: displaying a first interface, where the first interface includes information about multiple windows run by the first device; receiving a first user operation acting on a first control in the first interface, where the first control is related to a first window of a first application among the multiple windows; and sending first data to a second device, where the first data is used by the second device to output multimedia data related to the first window.
For example, the first interface is the user interface of a multi-task list.
For example, the second device is a device conducting an operator call with the first device. For example, the second device is a device connected to the first device through near-field communication. For example, the second device is a device connected to the first device through far-field communication.
In the above method, the user can trigger the sharing of multimedia data related to one of the tasks (the first window) based on the user interface of the multi-task list, and the second device being shared with can be the other party on a call or a nearby device. This solves the problem that multimedia data streams cannot be shared in real time in operator call and near-field communication scenarios, lets users watch together and listen together with the other party on the call, nearby devices, and far-field devices in a simpler and faster manner, meets user needs, and improves user experience.
In a possible implementation, the sending first data to a second device includes: displaying a second interface, where the second interface includes information about multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
In a possible implementation, the multiple devices include at least one of the following: discovered devices, connected devices, the device with which the most recent operator call was conducted, devices whose identification information is stored, and devices recognized from captured images.
For example, the connected devices include the device currently conducting an operator call, devices connected through near-field communication, and devices connected through far-field communication.
In a possible implementation, before the displaying a second interface, the method further includes: displaying a third interface, where the third interface includes multiple sharing modes; and receiving a third user operation acting on a first mode among the multiple sharing modes, where the multiple devices are determined according to the first mode.
For example, the first mode is watching together, and the multiple devices include devices configured with a display screen (such as mobile phones and tablet computers). For example, the first mode is listening together, and the multiple devices include devices configured with a speaker (such as earphones and sound boxes).
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a fourth user operation; and in response to the fourth user operation, determining that the type of the shared data is a first type, where when the first type is audio, the first data includes audio data related to the first window; when the first type is image, the first data includes video data related to the first window; and when the first type is audio and image, the first data includes audio data and video data related to the first window.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a fifth user operation acting on a first area in the first window, where the first data includes multimedia data related to the first area.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a sixth user operation acting on a first layer in the first window, where the first data includes multimedia data related to the first layer.
In a possible implementation, the sending first data to a second device includes: sending the first data to the second device when the first application is not a preset application, where the security level of the preset application is higher than a first level.
In a possible implementation, the sending first data to a second device includes: recognizing that the security level of data related to a second area in the first window is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
In a possible implementation, the displaying a first interface includes: receiving broadcast data of a first channel sent by a network device; and displaying the first window in the first interface according to the broadcast data of the first channel.
In a possible implementation, the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel; receiving a seventh user operation; and sending the broadcast data of the second channel to the second device, where the broadcast data of the second channel is used by the second device to output the audio and/or video of the second channel.
In a possible implementation, the sending first data to a second device includes: sending the first data and second data to the second device, where the second data includes audio data collected by the first device through a microphone and/or image data collected by the first device through a camera.
In a possible implementation, the method further includes: receiving an eighth user operation; in response to the eighth user operation, determining not to grant the second device the permission to save the first data or the permission to forward the first data; receiving a first request message sent by the second device, where the first request message is used to request to save and/or forward the first data; and displaying first prompt information according to the first request message.
In a possible implementation, the method further includes: receiving a second request message sent by the second device, where the second request message is used to request real-time sharing; displaying second prompt information according to the second request message; receiving a ninth user operation, where the ninth user operation is used to accept the request indicated by the second request message; receiving third data sent by the second device; and outputting the third data.
In a possible implementation, the outputting the third data includes: displaying a fourth interface according to the third data, where the second device displays the content of the first window while the first device displays the fourth interface; alternatively, the outputting the third data includes: displaying a fifth interface and a sixth interface in split screen, where the fifth interface includes the content of the first window, and the sixth interface is determined according to the third data.
In a possible implementation, after the receiving third data sent by the second device, the method further includes: receiving a tenth user operation; and sending the third data to a third device, so that the third device outputs the third data.
In a possible implementation, the sending first data to a second device includes: sending the first data to the second device through a first link and a second link, where the first link and the second link each include at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, and a device-to-device (D2D) link, and the first link is different from the second link.
In a third aspect, this application provides yet another sharing method applied to a first device. The method includes: displaying a first interface, where the first interface includes information about multiple functions of the control center; receiving a first user operation acting on a first control in the first interface, where the first control is related to a sharing function among the multiple functions; and sending first data to a second device, so that the second device outputs the first data.
For example, the first interface is the user interface of the control center displayed by the first device in response to a user operation of sliding down from the upper edge of the screen.
For example, the second device is a device conducting an operator call with the first device. For example, the second device is a device connected to the first device through near-field communication. For example, the second device is a device connected to the first device through far-field communication.
In the above method, the user can trigger real-time sharing based on the user interface of the control center, and the second device being shared with can be the other party on a call or a nearby device. This solves the problem that multimedia data streams cannot be shared in real time in operator call and near-field communication scenarios, lets users watch together and listen together with the other party on the call, nearby devices, and far-field devices in a simpler and faster manner, meets user needs, and improves user experience.
In a possible implementation, the sending first data to a second device includes: displaying a second interface, where the second interface includes information about multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first data to the second device.
In a possible implementation, the multiple devices include at least one of the following: discovered devices, connected devices, the device with which the most recent operator call was conducted, devices whose identification information is stored, and devices recognized from captured images.
For example, the connected devices include the device currently conducting an operator call, devices connected through near-field communication, and devices connected through far-field communication.
In a possible implementation, the sending first data to a second device includes: displaying a third interface, where the third interface includes information about multiple windows; receiving a third user operation acting on a first window among the multiple windows, where the first data includes multimedia data related to the first window; and sending the first data to the second device.
In a possible implementation, the multiple windows include at least one of the following: a window of a foreground application, a window of a background application, and a window of an application that is installed on the first device but not running.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a fourth user operation acting on a first area in the first window, where the first data includes multimedia data related to the first area.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a fifth user operation acting on a first layer in the first window, where the first data includes multimedia data related to the first layer.
In a possible implementation, the sending first data to a second device includes: sending the first data to the second device when the application corresponding to the first data is not a preset application, where the security level of the preset application is higher than a first level.
In a possible implementation, the sending first data to a second device includes: recognizing that the security level of data related to a second area in the first window is higher than a second level; and sending the first data to the second device, where the first data does not include the data related to the second area.
In a possible implementation, the sending first data to a second device includes: displaying a fourth interface, where the fourth interface includes multiple sharing modes; receiving a sixth user operation acting on a first mode among the multiple sharing modes; displaying a fifth interface, where the fifth interface includes information about multiple windows and multiple devices, and the multiple windows and the multiple devices are determined according to the first mode; receiving a seventh user operation acting on a second window among the multiple windows and an eighth user operation acting on the second device among the multiple devices, where the first data includes multimedia data related to the second window; and sending the first data to the second device according to the seventh user operation and the eighth user operation.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving a ninth user operation; and in response to the ninth user operation, determining that the type of the shared data is a first type, where when the first type is audio, the first data includes audio data; when the first type is image, the first data includes video data; and when the first type is audio and image, the first data includes audio data and video data.
In a possible implementation, before the sending first data to a second device, the method further includes: receiving broadcast data of a first channel sent by a network device; and displaying a sixth interface according to the broadcast data of the first channel, where the first data includes multimedia data related to the sixth interface.
In a possible implementation, the method further includes: receiving broadcast data of a second channel sent by the network device, where the user interface displayed by the first device is unrelated to the broadcast data of the second channel, the first data includes the broadcast data of the second channel, and the first data is used by the second device to output the audio and/or video of the second channel.
In a possible implementation, the sending first data to a second device includes: sending the first data and second data to the second device, where the second data includes audio data collected by the first device through a microphone and/or image data collected by the first device through a camera.
In a possible implementation, the method further includes: receiving a tenth user operation; in response to the tenth user operation, determining not to grant the second device the permission to save the first data or the permission to forward the first data; receiving a first request message sent by the second device, where the first request message is used to request to save and/or forward the first data; and displaying first prompt information according to the first request message.
In a possible implementation, the method further includes: receiving a second request message sent by the second device, where the second request message is used to request real-time sharing; displaying second prompt information according to the second request message; receiving an eleventh user operation, where the eleventh user operation is used to accept the request indicated by the second request message; receiving third data sent by the second device; and outputting the third data.
In a possible implementation, the outputting the third data includes: displaying a seventh interface according to the third data, where the second device displays the video data included in the first data while the first device displays the seventh interface; alternatively, the outputting the third data includes: displaying an eighth interface and a ninth interface in split screen, where the eighth interface is determined according to the first data and the ninth interface is determined according to the third data.
In a possible implementation, the method further includes: receiving a twelfth user operation; and sending the third data to a third device, so that the third device outputs the third data.
In a possible implementation, the sending first data to a second device includes: sending the first data to the second device through a first link and a second link, where the first link and the second link each include at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, and a device-to-device (D2D) link, and the first link is different from the second link.
In a fourth aspect, this application provides an electronic device, including a transceiver, a processor, and a memory, where the memory is used to store a computer program, and the processor invokes the computer program to perform the sharing method in any possible implementation of any of the above aspects.
In a fifth aspect, this application provides a computer storage medium storing a computer program that, when executed by a processor, implements the sharing method in any possible implementation of any of the above aspects.
In a sixth aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the sharing method in any possible implementation of any of the above aspects.
In a seventh aspect, this application provides an electronic device that includes the methods or apparatuses described in any implementation of this application. The electronic device is, for example, a chip.
It should be understood that the description of technical features, technical solutions, beneficial effects, or similar language in this application does not imply that all of the features and advantages can be realized in any single implementation. Rather, a description of a feature or beneficial effect means that at least one implementation includes the specific technical feature, technical solution, or beneficial effect. Therefore, descriptions of technical features, technical solutions, or beneficial effects in this specification do not necessarily refer to the same implementation. Furthermore, the technical features, technical solutions, and beneficial effects described in this application can be combined in any appropriate manner. A person skilled in the art will understand that this application can be implemented without one or more of the specific technical features, technical solutions, or beneficial effects of a particular implementation. In other implementations, additional technical features and beneficial effects may also be identified in specific implementations that do not embody all aspects of this application.
Brief Description of the Drawings
The drawings used in this application are introduced below.
Figure 1A is a schematic architectural diagram of a sharing system provided by this application;
Figure 1B is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 1C is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 2A is a schematic diagram of the hardware structure of an electronic device provided by this application;
Figure 2B is a schematic diagram of the software architecture of an electronic device provided by this application;
Figure 2C is a schematic diagram of the software architecture of yet another electronic device provided by this application;
Figure 2D is a schematic diagram of the software architecture of yet another electronic device provided by this application;
Figure 2E is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 3, Figures 4A-4C, 5A-5D, 6A-6D, 7A-7C, 8A-8C, 9A-9C, 10A-10B, 11A-11D, 12A-12D, 13, 14A-14D, 15A-15D, 16A-16E, 17A-17I, 18A-18D, 19A-19G, 20A-20D, 21A-21E, 22A-22E, 23A-23C, and 24A-24C are schematic diagrams of some user interfaces provided by this application;
Figure 25 is a schematic flowchart of a sharing method provided by this application;
Figure 26A is a schematic diagram of an audio transmission mode provided by this application;
Figure 26B is a schematic diagram of yet another audio transmission mode provided by this application;
Figure 26C is a schematic diagram of yet another audio transmission mode provided by this application;
Figure 27 is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 28 is a schematic flowchart of an auxiliary link establishment process provided by this application;
Figure 29 is a schematic diagram of a communication map provided by this application;
Figure 30 is a schematic flowchart of predictive link establishment provided by this application;
Figure 31 is a schematic diagram of data transmission provided by this application;
Figure 32A is a schematic architectural diagram of audio stream and/or video stream transmission provided by this application;
Figure 32B is a schematic diagram of a data packet provided by this application;
Figure 33 is a schematic diagram of yet another data transmission provided by this application;
Figure 34 is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 35 is a schematic flowchart of device discovery and connection provided by this application;
Figure 36 is a schematic diagram of yet another data transmission provided by this application;
Figure 37 is a schematic flowchart of a multicast group member leaving provided by this application;
Figure 38 is a schematic flowchart of yet another multicast group member leaving provided by this application;
Figure 39 is a schematic diagram of yet another data packet provided by this application;
Figure 40 is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 41 is a schematic architectural diagram of yet another sharing system provided by this application;
Figure 42 is a schematic diagram of yet another data transmission provided by this application;
Figure 43 is a schematic flowchart of a password transmission process provided by this application;
Figure 44 is a schematic flowchart of a multi-device synchronization process provided by this application;
Figures 45A-45D are schematic diagrams of some multi-level sharing scenarios provided by this application;
Figures 46A-46C are schematic architectural diagrams of some new radio (NR) communication systems provided by this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the drawings. In the description of the embodiments of this application, unless otherwise specified, "/" means "or"; for example, A/B can mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A alone, both A and B, and B alone. In addition, in the description of the embodiments of this application, "multiple" means two or more than two.
In the following, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as implying or indicating relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless otherwise specified, "multiple" means two or more.
The current sharing function can be implemented in the following three ways:
Way 1: A user using a mobile phone often shares content seen on the phone by describing it orally during a phone call (also called an operator call), and the other party on the call cannot watch the content.
Way 2: What a user achieves through near-field communication technologies such as Bluetooth and near field communication (NFC) is file-based sharing; for example, an image file can be shared with a nearby user, but multimedia data streams such as audio streams/video streams cannot be shared in real time. There is also a possibility of secondary distribution, so the user's privacy and security cannot be effectively protected.
Way 3: A user shares the multimedia data stream of another application in real time through a chat application or conference application installed on the electronic device. However, this still cannot realize real-time sharing in operator call and near-field communication scenarios. Moreover, both the sharing device and the shared device need to install the chat application or conference application as well as the application to be shared, and the shared device may even be required to register and/or log in to the application to be shared. The application to be shared also needs to be adapted to the chat application or conference application, and the multimedia data stream of an unadapted application cannot be shared in real time. The application scenarios are limited and cannot meet users' needs.
This application provides a sharing method that can provide a simpler and more convenient sequence of user operations, allowing a sharing device and one or more shared devices, such as call counterparts, nearby devices, and far-field devices, to realize real-time sharing and collaboration functions such as watching together, listening together, playing together, and editing together. It solves the problem that real-time sharing is impossible in operator call and near-field communication scenarios, requires no installation of a chat application, conference application, or the application to be shared, and requires no adaptation of the application to be shared, which greatly broadens the application scenarios and lets users quickly share the multimedia data stream of any application and any area, effectively meeting users' needs and improving user experience. Moreover, real-time sharing can reduce the possibility of secondary distribution and improve the protection of users' privacy and security.
In this application, real-time sharing can mean that a sharing device/sharing user shares data such as multimedia data streams with at least one shared device/shared user, and the sharing device/sharing user and the at least one shared device/shared user can watch/listen to the multimedia data stream together. The multimedia data stream can include image data (multiple frames of images can be called a video stream) and audio data (multiple frames of audio can be called an audio stream). The sharing device is the device that initiates real-time sharing and can also be called the sharing initiator; in one implementation, the sharing device can provide the shared content (also called shared data, for example, the multimedia data stream of any application and any area mentioned above). The shared device is the device that receives the initiated real-time sharing and can also be called the sharing receiver; the shared device can receive and output the shared content. The descriptions of the sharing user and the shared user are similar: the sharing user can use the sharing device to share data in real time with the shared user using the shared device. For the sharing device/sharing user, the shared device/shared user can be referred to as the sharing target for short. "Sharing in real time" in this application is part of real-time sharing and can be understood as describing real-time sharing from the perspective of the sharing device.
Understandably, sharing device/sharing user and shared device/shared user are relative role concepts rather than physical concepts, and one device/user can play different roles in different sharing scenarios. For example, device 1/user 1 can act as a sharing device/sharing user at time 1 to share a multimedia data stream in real time with other devices/users, and at time 2 can act as a shared device receiving a multimedia data stream shared in real time by another sharing device. For example, device 1/user 1 can share a multimedia data stream in real time with device 2/user 2, and at the same time device 2/user 2 can share a multimedia data stream with device 3/user 3; in this case, device 2 is a shared device from device 1's perspective but a sharing device from device 3's perspective.
In this application, watching together, listening together, playing together, and editing together can be four different real-time sharing modes. For example, watching together can be the real-time sharing of content that can be watched (such as images of a video application); listening together can be the real-time sharing of content that can be listened to (such as audio of a music application); playing together can be the real-time sharing of game-related content (such as images and/or audio of a game application); and editing together can be the real-time sharing of content related to documents (which can be edited, such as documents in word format, spreadsheet (excel) format, and presentation (PowerPoint, PPT) format). In one implementation, the user can select the real-time sharing mode, but understandably, the mode selected by the user does not limit the content actually shared in real time. For example, the user first selects the watching-together mode, but during the actual sharing the user can use the sharing device to send listenable content, game-related content, and/or document-related content, such as the audio stream and video stream of a video application, to the shared devices. In another implementation, the electronic device can also determine the real-time sharing mode by itself, for example, setting one mode by default or determining the mode according to a preset rule. Beyond the above examples, there can be other real-time sharing modes. This application does not limit the specific content or the determination method of the real-time sharing mode.
In this application, an electronic device can run at least one application. Among the at least one application, an application that is visible to and can interact with the user can be called a foreground application; the electronic device can display the user interface of the foreground application, which can also be described as the electronic device running the application in the foreground. Among the at least one application, an application that is invisible to and cannot interact with the user can be called a background application; the electronic device does not display the user interface of the background application but still runs it, which can also be described as the electronic device running the application in the background. Understandably, foreground application and background application are role concepts rather than physical concepts, and one application can play different roles in different scenarios. For example, while the electronic device displays the user interface of application 1 (application 1 is then the foreground application and application 2 a background application), it can display the user interface of application 2 in response to a user operation (application 2 then becomes the foreground application and application 1 a background application).
In this application, nearby devices are devices with which an electronic device can communicate through near-field communication technologies such as Bluetooth, wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi)), device-to-device communication (D2D), near field communication (NFC), ultra wide band (UWB), and infrared. Nearby devices can include devices that the electronic device has discovered but not connected to, and/or devices that the electronic device has connected to. This application does not limit the specific content of the near-field communication technology.
In this application, far-field devices are devices with which an electronic device can communicate through far-field communication technologies such as WLAN, satellite, and cellular communication. Far-field devices can include devices that the electronic device has discovered but not connected to, and/or devices that the electronic device has connected to. This application does not limit the specific content of the far-field communication technology.
The touch operations in this application can include, but are not limited to, forms such as a click, a double click, a long press, a single-finger long press, a multi-finger long press, a single-finger slide, a multi-finger slide, and a knuckle slide. A touch operation in slide form can be referred to as a slide operation for short. A slide operation is, for example but not limited to, sliding left/right, sliding up/down, sliding toward a first specific position, or sliding along a specific trajectory; this application does not limit the trajectory of the slide operation. In some implementations, the touch operation can act on a second specific position on the electronic device. The above specific position can be located on the display screen of the electronic device, for example, the position of a control such as an icon or the edge of the display screen; alternatively, the specific position can be located elsewhere, such as on the side or back of the electronic device, for example, the position of keys such as the volume key and the power key. The above specific position is preset by the electronic device, or determined by the electronic device in response to a user operation. The above specific trajectory is preset by the electronic device, or determined by the electronic device in response to a user operation.
The following introduces the sharing system 10 involved in the embodiments of this application.
Figure 1A exemplarily shows a schematic architectural diagram of a sharing system 10.
As shown in Figure 1A, the sharing system 10 can include an electronic device 11, and the electronic device 11 can communicate with different electronic devices through different communication methods. Specific examples are as follows:
In some embodiments, the electronic device 11 can communicate with at least one electronic device through a cellular communication network (which can also be described as communicating through cellular communication), optionally realizing an operator call (that is, a phone call). Figure 1A is illustrated with the at least one electronic device including an electronic device 12. The electronic device 11, the cellular communication network, and the at least one electronic device can constitute a cellular communication system, which is, for example but not limited to, global system for mobile communications (GSM), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), new radio (NR), or another future network system. The cellular communication network includes, for example but not limited to, base stations, a core network, and communication lines. A base station is a device deployed in a radio access network (RAN) to provide wireless communication functions. In different radio access systems, the base station may have different names, for example but not limited to a base transceiver station (BTS) in GSM or CDMA, a node B (NB) in WCDMA, an evolved node B (eNodeB) in LTE, a next-generation base station (gNB) in NR, or a base station in another future network system. The core network is a key control node in the cellular communication system and is mainly responsible for signaling processing functions, for example but not limited to access control, mobility management, and session management. Core network devices include, for example but not limited to, an access and mobility management function (AMF) entity, a session management function (SMF) entity, and a user plane function (UPF) entity. The communication lines include, for example but not limited to, twisted pairs, coaxial cables, and optical fibers. In some examples, the electronic device 11 can connect to base station 1 in the cellular communication network through an air interface, the electronic device 12 can connect to base station 2 in the cellular communication network through an air interface, and base station 1 and base station 2 can connect to the core network. Not limited to the above examples, in other examples base station 1 and base station 2 can also be the same base station.
In some embodiments, the electronic device 11 can communicate with at least one electronic device through near-field communication technologies, which include, for example but not limited to, Bluetooth, WLAN (such as Wi-Fi), D2D, NFC, UWB, and infrared. Figure 1A is illustrated with the at least one electronic device including an electronic device 13, an electronic device 14, and an electronic device 15: the electronic device 11 communicates with the electronic device 13 through WLAN, with the electronic device 14 through Bluetooth, and with the electronic device 15 through D2D. For an example of the communication between the electronic device 11 and the electronic device 15, see Figure 1B below.
Near-field WLAN includes, for example, peer-to-peer (P2P) direct connection; alternatively, two devices connected to the same WLAN signal source (and thus in the same local area network) can communicate through near-field WLAN. Not limited to this, in other examples WLAN can also be a far-field communication method; for example, two devices belonging to different local area networks can communicate through far-field WLAN.
In some embodiments, the electronic device 11 can also communicate with at least one vehicle through vehicle-to-everything (V2X) technology. Figure 1A is illustrated with the at least one vehicle including a vehicle 16. In some examples, the electronic device 11 can communicate with the vehicle 16 through the cellular communication network, which can be understood as realizing V2X through the cellular communication network. In other examples, the electronic device 11 can communicate with the vehicle 16 directly. Not limited to the above examples, in other examples the electronic device 11 can also communicate with other devices such as in-vehicle devices through V2X technology.
In some embodiments, the electronic device 11 can also communicate with at least one electronic device through satellites; satellite systems include, for example but not limited to, BeiDou, Tiantong, and Starlink. Figure 1A is illustrated with the at least one electronic device including the electronic device 12. In some examples, the electronic device 11 can connect to a satellite, then connect to the cellular communication network through the satellite, and finally connect to the electronic device 12 through the cellular communication network; see the example shown in Figure 1C below.
Not limited to this, in other embodiments the electronic device 11 can also realize an over-the-top (OTT) call with at least one electronic device. In some examples, an OTT call can be a service that bypasses operators and provides various data services such as video based on the open Internet, for example realized through Wi-Fi; in other examples, an OTT call can be realized based on the operator's cellular data service.
Figure 1B exemplarily shows a schematic architectural diagram of yet another sharing system 10.
As shown in Figure 1B, the sharing system 10 includes the electronic device 11 and the electronic device 15, and D2D communication between them is realized based on an air interface (such as PC5) and a communication link (such as a sidelink). Unlike a cellular communication link, which distinguishes between uplink and downlink, a sidelink can reflect the peer-to-peer nature of the two communicating ends. D2D communication provides a direct discovery function and a direct communication function: direct discovery can provide electronic device A with the function of discovering a nearby electronic device B that can be directly connected, and direct communication can provide the function of data exchange between electronic device A and the nearby electronic device B. For example, electronic device A is the electronic device 11 and electronic device B is the electronic device 15, or electronic device A is the electronic device 15 and electronic device B is the electronic device 11. In some embodiments, direct discovery and direct communication can be performed between the electronic device 11 and the electronic device 15 through D2D technology, thereby realizing real-time sharing functions such as watching together, listening together, playing together, and editing together.
Figure 1C exemplarily shows a schematic architectural diagram of yet another sharing system 10.
As shown in Figure 1C, the sharing system 10 includes the electronic device 11, a satellite, a ground receiving station, base station 1, core network device 1, a data network (Data Network), core network device 2, base station 2, and the electronic device 12. In some embodiments, the electronic device 11 and the electronic device 12 can realize real-time sharing functions such as watching together, listening together, playing together, and editing together through the sharing system 10. The following takes the electronic device 11 as the sharing device and the electronic device 12 as the shared device as an example:
The electronic device 11 can connect to the satellite and send the shared content to the satellite. The satellite can send the shared content to the ground receiving station. In one implementation, the ground receiving station can access core network device 1 via base station 1 and send the shared content to core network device 1 through base station 1; in another implementation, the ground receiving station can also connect directly to core network device 1 and send the shared content to it directly. Then, core network device 1 can send the shared content to core network device 2 through the Data Network. The electronic device 12 can access core network device 2 via base station 2, and core network device 2 can send the shared content to the electronic device 12 through base station 2 for output.
Not limited to the above examples, in another implementation there can be more or fewer devices between the satellite and the electronic device 12; for example, the ground receiving station can connect to core network device 1 through at least one gateway device for access conversion.
Not limited to the above examples, in another implementation the electronic device 12 may also access the Data Network not through cellular network devices (such as base station 2 and core network device 2) but through WLAN (such as Wi-Fi); this application does not limit the way the electronic device 12 accesses the Data Network.
In some embodiments, multiple connections can be established between the sharing device and the shared device through multiple communication methods, for example, performing redundant packet supplementation through different transmission paths of different communication methods, thereby guaranteeing the transmission quality (such as the real-time performance and/or stability) of real-time sharing. The above multiple communication methods include, for example but not limited to, the communication methods described in Figures 1A, 1B, and 1C above. Packet supplementation in this application can mean that when a data packet is transmitted, part or all of the content of the data packet is transmitted at least once more; the content transmitted each time can be the same or different (covering three cases: completely the same, partially the same, and completely different), and the transmission times can be the same or different. For example, at time 1 the entire content of data packet 1 is transmitted through satellite and fragment 1 of data packet 1 is transmitted through cellular communication; at time 2 fragment 2 of data packet 1 is transmitted through satellite; at time 3 the entire content of data packet 1 is transmitted through cellular communication and fragment 2 of data packet 1 is transmitted through Bluetooth.
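As a toy model of the redundant packet supplementation described above, the sketch below pushes copies of one packet over several lossy links and reports whether at least one complete copy got through. The link names and the loss rate are illustrative assumptions.

```python
import random

def send_redundant(packet: bytes, links, loss: float = 0.3) -> bool:
    """Return True if at least one full copy of `packet` survives some link."""
    delivered = [link for link in links if random.random() > loss]
    for link in delivered:
        print(f"copy of {packet!r} arrived via {link}")
    return bool(delivered)

random.seed(1)
ok = send_redundant(b"pkt-1", ["satellite", "cellular", "bluetooth"])
print("receiver has the packet:", ok)
```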
It should be noted that the forms and quantities of the electronic devices 11, 12, 13, 14, and 15, the vehicle 16, the satellite, the cellular communication network, the ground receiving station, base stations 1 and 2, core network devices 1 and 2, and the Data Network shown in Figures 1A, 1B, and 1C are only examples, and the embodiments of this application do not limit them.
Not limited to the above examples, in another implementation the above base station can also be another access network device, such as a user equipment (UE), an access point (AP), a transmission and reception point (TRP), a relay device, or another network device having the functions of a base station.
Next, an exemplary electronic device 100 provided by the embodiments of this application is introduced. In some embodiments, the electronic device 100 can be any one of the electronic devices in the sharing system 10.
In this application, the electronic device 100 can be a mobile phone, a tablet computer, a handheld computer, a desktop computer, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), a smart home device such as a smart large screen or smart speaker, a wearable device such as a smart band, a smart watch, or smart glasses, an extended reality (XR) device such as an augmented reality (AR), virtual reality (VR), or mixed reality (MR) device, a vehicle-mounted device, or a smart city device. The embodiments of this application place no particular restriction on the specific type of the electronic device.
Figure 2A exemplarily shows a schematic diagram of the hardware structure of an electronic device 100.
It should be understood that the electronic device 100 shown in Figure 2A is only an example: the electronic device 100 can have more or fewer components than shown in Figure 2A, can combine two or more components, or can have a different component configuration. The various components shown in Figure 2A can be implemented in hardware including one or more signal processing and/or application-specific integrated circuits, in software, or in a combination of hardware and software.
As shown in Figure 2A, the electronic device 100 can include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 can include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 can include more or fewer components than shown, combine certain components, split certain components, or have a different component arrangement. The illustrated components can be implemented in hardware, software, or a combination of software and hardware.
The processor 110 can include one or more processing units. For example, the processor 110 can include an application processor (AP), a modem processor (modem), a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units can be independent components or can be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation code and timing signals, completing the control of fetching and executing instructions.
A memory can also be provided in the processor 110 for storing instructions and data. In one implementation, the memory in the processor 110 is a cache, which can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, avoiding repeated access, reducing the waiting time of the processor 110, and thus improving the efficiency of the system.
In one implementation, the processor 110 can include one or more interfaces. The interfaces can include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The charging management module 140 is used to receive charging input from a charger. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141. The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现,例如传输实时分享的音频流/视频流。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G/6G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一种实施方式中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一种实施方式中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中，调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后，被传递给应用处理器。应用处理器通过音频设备（不限于扬声器170A，受话器170B等）输出声音信号，或通过显示屏194显示图像或视频。在一种实施方式中，调制解调处理器可以是独立的器件。在另一种实施方式中，调制解调处理器可以独立于处理器110，与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR),D2D,V2X等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一种实施方式中，电子设备100的天线1和移动通信模块150耦合，天线2和无线通信模块160耦合，使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统（global system for mobile communications，GSM），通用分组无线服务（general packet radio service，GPRS），码分多址接入（code division multiple access，CDMA），宽带码分多址（wideband code division multiple access，WCDMA），时分码分多址（time-division code division multiple access，TD-SCDMA），长期演进（long term evolution，LTE），BT，GNSS，WLAN，NFC，FM，和/或IR技术等。所述GNSS可以包括全球卫星定位系统（global positioning system，GPS），格洛纳斯卫星导航系统（global navigation satellite system，GLONASS），北斗卫星导航系统（beidou navigation satellite system，BDS），准天顶卫星系统（quasi-zenith satellite system，QZSS）和/或星基增强系统（satellite based augmentation systems，SBAS）。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能,例如显示实时共享的视频流。
GPU为图像处理的微处理器，连接显示屏194和应用处理器。GPU用于执行数学和几何计算，用于图形渲染。处理器110可包括一个或多个GPU，其执行程序指令以生成或改变显示信息。显示屏194（也可称为屏幕）用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏（liquid crystal display，LCD），有机发光二极管（organic light-emitting diode，OLED），有源矩阵有机发光二极体或主动矩阵有机发光二极体（active-matrix organic light emitting diode，AMOLED），柔性发光二极管（flex light-emitting diode，FLED），MiniLED，MicroLED，Micro-OLED，量子点发光二极管（quantum dot light emitting diodes，QLED）等。在一种实施方式中，电子设备100可以包括1个或N个显示屏194，N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能,例如可以拍摄人像,以用于和应用的视频流一起实时分享给其他设备。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度等进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一种实施方式中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一种实施方式中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码，所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中，存储程序区可存储操作系统，至少一个功能所需的应用程序（比如声音播放功能，图像播放功能等）等。存储数据区可存储电子设备100使用过程中所创建的数据（比如音频数据，电话本等）等。此外，内部存储器121可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件，闪存器件，通用闪存存储器（universal flash storage，UFS）等。处理器110通过运行存储在内部存储器121的指令，和/或存储在设置于处理器中的存储器的指令，执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能,例如播放实时分享的音频流。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一种实施方式中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐或其他实时分享的音频流,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一种实施方式中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一种实施方式中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。例如,麦克风170C实时采集的音频可以和应用的音频流一起实时分享给其他设备。
耳机接口170D用于连接有线耳机。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一种实施方式中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。在另一种实施方式中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
压力传感器180A和/或触摸传感器180K用于检测作用于其上或附近的触摸操作。压力传感器180A和/或触摸传感器180K可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。环境光传感器180L用于感知环境光亮度。指纹传感器180H用于采集指纹。温度传感器180J用于检测温度。骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。马达191可以产生振动提示。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口195用于连接SIM卡。
接下来示例性说明电子设备100的软件系统。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。例如,分层架构的软件系统可以是安卓(Android)系统,也可以是鸿蒙(harmony)操作系统(operating system,OS),或其它软件系统。
图2B示例性示出一种电子设备100的软件架构示意图。图2B以分层架构的Android系统为例,示例性说明电子设备100的软件架构。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一种实施方式中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图2B所示,应用程序包可以包括通讯录,图库,蓝牙,WLAN,通话,短信息,浏览器,音乐、分享、短视频和视频等应用程序。其中,分享应用可以提供和一个或多个通话对方、附近设备、远场设备等被分享设备一起看、一起听、一起编辑、一起玩等实时共享功能。分享可以为独立的应用程序,也可以是通话、蓝牙、WLAN等其他应用程序封装的功能组件,本申请对此不作限定。本申请中,应用程序包也可以替换为小程序等其他形式的软件。以下实施例以通话、蓝牙和WLAN集成了分享的功能组件为例进行说明。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2B所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器和分享模块等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
分享模块可以用于实现一起看、一起听、一起编辑、一起玩等实时共享功能,例如但不限于包括用户体验(user experience,UX)显示、提供用户交互功能(例如接收并响应用户输入的操作)、业务功能和服务逻辑等,UX显示例如但不限于包括:发起一起看、一起听、一起编辑、一起玩等实时分享操作的显示界面(包括触发实时分享操作的控件),播放实时共享的多媒体数据流的显示界面,选择分享内容的显示界面,选择被分享设备/被分享用户(也可称为分享对象)的显示界面。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4、H.264、H.265等视频编码格式,MP3、AAC、AMR、SBC、LC3、aptX、LDAC、L2HC、WAV、FLAC等音频编码格式,JPG、PNG、BMP、GIF等图片编码格式。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
下面结合接听电话的场景,示例性说明电子设备100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为通话应用的接听控件为例,通话应用调用应用框架层的接口,进而通过调用内核层启动音频驱动,通过受话器170B播放通话对方的语音信息和/或通过麦克风170C获取当前用户的语音信息。
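下面用一段示意性的Java代码概括上述“原始输入事件→识别控件→应用响应”的分发流程，其中的事件结构与控件接口均为便于说明而假设的简化模型，并非Android框架的真实接口：

    import java.util.LinkedHashMap;
    import java.util.Map;

    // 示意：把内核层上报的原始输入事件分发给命中的控件
    public class InputDispatcher {
        static class RawInputEvent { int x, y; long timestamp; } // 触摸坐标与时间戳
        interface Control { boolean contains(int x, int y); void onClick(); }

        private final Map<String, Control> controls = new LinkedHashMap<>();

        void register(String name, Control c) { controls.put(name, c); }

        // 应用程序框架层获取原始输入事件后，识别其对应的控件并回调
        void dispatch(RawInputEvent ev) {
            for (Control c : controls.values()) {
                if (c.contains(ev.x, ev.y)) { c.onClick(); return; }
            }
        }

        public static void main(String[] args) {
            InputDispatcher d = new InputDispatcher();
            d.register("接听控件", new Control() {
                public boolean contains(int x, int y) { return x < 100 && y < 100; }
                public void onClick() { System.out.println("启动音频驱动，播放通话对方的语音"); }
            });
            RawInputEvent ev = new RawInputEvent();
            ev.x = 50; ev.y = 50;
            d.dispatch(ev);
        }
    }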
在一些实施例中,电子设备100的软件系统可以包括应用处理器系统(application processor,AP)和无线通信系统。其中:
无线通信系统可以但不限于包括以下至少一项:蜂窝通信系统(例如2G/3G/4G/5G/6G等)、卫星系统(例如北斗、天通、星链等)、Wi-Fi、BT、NFC、D2D等。在一种实施方式中,无线通信系统可以包括协处理器(CoProcessor,CP)和/或数字信号处理器(digital signal processor,DSP),其中,CP在终端中可以为基带芯片加协处理器或者多媒体加速器,CP可以包括与网络通信所需的数字组件,CP可以包括一个基于精简指令集计算机(reduced instruction set computer,RISC)微处理器(advanced RISC machines,ARM)的处理器和一个DSP。CP可以具有操作系统,可以通过高速(high speed,HS)串行连接和运行Android、IOS、Windows等操作系统的应用处理器进行通信。CP可以实现VR、AR、图像处理、高保真(high fidelity,HiFi)、高速数据传输(high data rate,HDR)、传感器管理等处理逻辑。不限于此,CP也可以是蜂窝调制解调器(cellular processor,CP)。
应用系统用于实现用户界面的渲染和呈现,用户操作的输入和响应,业务功能以及音频/视频等多媒体数据的播放等控制逻辑。该软件系统的具体示例可参见下图2C-图2E。
图2C示例性示出又一种电子设备100的软件架构示意图。
如图2C所示,电子设备100的应用系统包括分享模块、发现模块、抓取模块、新通话(NewTalk)功能模块、Wi-Fi功能模块、BT功能模块、D2D功能模块、卫星(Satellite)功能模块、NewTalk链路(Link)模块、Wi-Fi链路模块、BT链路模块、D2D链路模块、卫星链路模块。其中:
分享模块可以理解为是一起看(View)、一起听(Listen)、一起玩(Play)和一起编辑(Edit)等实时共享的核心功能模块,分享模块例如称为Together(View/Listen/Play/Edit)。分享模块可以用于UX显示,例如但不限于包括:发起一起看、一起听、一起编辑、一起玩等实时分享操作的显示界面(包括触发实时分享操作的控件),播放实时共享的多媒体数据流的显示界面,选择分享内容的显示界面,选择被分享设备/被分享用户(也可称为分享对象)的显示界面。不限于此,分享模块还可以用于提供实时共享的用户交互功能、提供实时共享的相关业务功能和实现实时共享的服务逻辑等,本申请对此不作限定。
发现模块用于通过Wi-Fi、BT、D2D等近场通信技术发现附近设备,发现模块例如称为Nearby。不限于此,也可以通过蜂窝通信技术、卫星等远场通信技术发现设备,本申请对发现设备的通信技术不作限定。
抓取模块用于抓取分享数据。在一些示例中，抓取模块可以基于应用和/或系统的接口获取解码后的多媒体数据流（可直接播放）或者解码前的多媒体数据流（例如生成的原始数据），例如，解码后的多媒体数据流是针对特定的电子设备100处理得到的可直接播放的数据，为了保证分享数据在被分享设备上的播放效果，抓取模块可以抓取解码前的多媒体数据流以用于实时分享。在另一些示例中，抓取模块可以在系统层直接抓取解码前的多媒体数据流，例如，电子设备100通过3G/4G/5G/6G广播模块接收到基站发送的广播数据后，可以通过内核层的蜂窝通信网卡（未示出）上报到系统层，电子设备100可以不播放该广播数据，而是由抓取模块获取该广播数据以用于实时分享。
NewTalk功能模块用于基于NewTalk实现实时分享的功能,其中,NewTalk可以但不限于为运营商通话和/或OTT通话,NewTalk例如但不限于通过蜂窝通信方式实现。在一种实施方式中,NewTalk功能模块可以基于正在通话(可简称为通话态)的NewTalk实现实时共享,在另一种实施方式中,NewTalk功能模块可以基于未通话(可简称为非通话态)的NewTalk实现实时共享。
Wi-Fi功能模块用于基于Wi-Fi实现实时共享,其中,可以使用单播(unicast)、广播(broadcast)或组播(multicast)(也可称为多播)等传输方式实现Wi-Fi通信。
BT功能模块用于基于BT实现实时共享,其中,可以使用单播、广播或组播等传输方式实现BT通信。
D2D功能模块用于基于D2D实现实时共享。
卫星功能模块用于基于通信卫星实现实时共享。
NewTalk链路模块用于管理NewTalk的链路,例如但不限于包括链路的建立、释放、数据的传输等。 在一种实施方式中,NewTalk的链路可以包括主链路和辅助链路。
Wi-Fi链路模块用于管理Wi-Fi的链路,例如但不限于包括链路的建立、释放、数据的传输等。
BT链路模块用于管理BT的链路,例如但不限于包括链路的建立、释放、数据的传输等。
D2D链路模块用于管理D2D的链路,例如但不限于包括链路的建立、释放、数据的传输等。
卫星链路模块用于管理通信卫星的链路,例如但不限于包括链路的建立、释放、数据的传输等。
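上述各链路模块对链路进行建立、释放等管理的公共逻辑，可以用如下假设性的Java草图示意（链路类型枚举与方法名均为举例，真实实现还需要触发相应的建链/拆链信令）：

    import java.util.EnumMap;
    import java.util.Map;

    // 示意：统一记录远场/近场多种链路的建立与释放状态
    public class LinkManager {
        enum LinkType { NEWTALK, WIFI, BT, D2D, SATELLITE }

        private final Map<LinkType, Boolean> links = new EnumMap<>(LinkType.class);

        void establish(LinkType type) { links.put(type, Boolean.TRUE); } // 建立链路
        void release(LinkType type) { links.remove(type); }              // 释放链路
        boolean isAlive(LinkType type) { return links.getOrDefault(type, Boolean.FALSE); }

        public static void main(String[] args) {
            LinkManager m = new LinkManager();
            m.establish(LinkType.BT);
            System.out.println("BT alive: " + m.isAlive(LinkType.BT));
            m.release(LinkType.BT);
            System.out.println("BT alive: " + m.isAlive(LinkType.BT));
        }
    }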
如图2C所示,电子设备100的无线通信系统包括蜂窝通信模块、Wi-Fi通信模块、BT通信模块和卫星通信模块,其中:
蜂窝通信模块（modem）包括网际互联协议（internet protocol，IP）多媒体系统（IP multimedia subsystem，IMS）通信模块、电路交换（circuit switched，CS）通信模块和3G/4G/5G/6G广播模块，其中，IMS通信模块可以但不限于实现LTE语音通话（voice over LTE，VoLTE）、LTE视频通话（video over LTE，ViLTE）、NR语音通话（voice over NR，VoNR）、NR视频通话（video over NR，ViNR）、Wi-Fi语音通话（voice over Wi-Fi，VoWiFi）、Wi-Fi视频通话（video over Wi-Fi，ViWiFi）、演进分组系统回退（evolved packet system-Fallback，EPS-Fallback）等基于IMS协议的通话。CS通信模块可以提供CS Fallback的功能。
3G/4G/5G/6G广播模块可以用于监听3G/4G/5G/6G的广播信道。电子设备100可以处于至少一个基站的覆盖区域内,这至少一个基站中的任意一个基站可以通过广播信道向处于覆盖区域的电子设备(包括电子设备100)发送广播数据(例如音频流/视频流等多媒体数据),任意一个基站可以维护至少一个频道,不同频道对应的广播数据可以不同。在一些示例中,用户可以通过电子设备100选择接收的广播数据对应的频道。在一些示例中,电子设备100可以通过3G/4G/5G/6G广播模块接收基站发送的广播数据,3G/4G/5G/6G广播模块可以通过内核层的蜂窝通信网卡(未示出)上报到系统层进行处理。在一些示例中,电子设备100可以通过系统应用(例如通话)或第三方应用(例如聊天应用、会议应用)播放接收到的广播数据,电子设备100可以将播放的内容分享给其他设备。在另一些示例中,电子设备100可以不播放接收到的广播数据,而是将接收到的广播数据直接分享给其他设备,或者将处理后的广播数据分享给其他设备。
Wi-Fi通信模块可以包括Wi-Fi通信的硬件模块，例如固件和芯片。
BT通信模块可以包括BT通信的硬件模块,例如固件和芯片。
卫星通信模块可以包括卫星通信的硬件模块,例如固件和芯片。
如图2C所示,基于NewTalk、卫星等远场通信方式和Wi-Fi、BT、D2D等近场通信方式的实时共享功能统一由分享模块实现。在一种实施方式中,近场通信方式和远场通信方式的各种无线电接入技术(radio access technology,RAT)可以仅负责通信链路的管理,例如由这些通信方式的链路模块负责通信链路的管理,部分服务(service)功能(例如但不限于安全、编码/解码等)可以由分享模块实现。不限于此,在另一种实施方式中,部分服务(service)功能(例如但不限于安全、编码/解码等)也可以由对应的通信方式的功能模块实现。
不限于图2C所示的软件架构示意图,在另一些实施例中,电子设备100的软件架构示意图可参见图2D,图2D和图2C类似,区别在于,图2D中,基于NewTalk、卫星等远场通信方式和Wi-Fi、BT、D2D等近场通信方式的一起看、一起听、一起玩、一起编辑等实时共享功能各自独立,这些通信方式的功能模块可以分别集成分享模块。
以上说明了电子设备的硬件结构和软件架构,接下来结合图2E示例性说明分享设备和被分享设备的通信架构,图2E中的部分模块的功能和可能实现可参考前述实施例中的电子设备的软件架构的描述,例如图2C所示的电子设备100的说明。
图2E示例性示出又一种共享系统10的架构示意图。
如图2E所示,共享系统10可以包括电子设备100、电子设备200和网络设备300,其中,电子设备100和电子设备200之间可以进行一起看、一起听、一起玩和一起编辑等实时共享。网络设备300可以包括至少一个服务器,例如,网络设备300是多个服务器组成的服务器集群。其中,任意一个服务器可以为硬件服务器,也可以为云服务器,例如,网页服务器、后台服务器、应用服务器、下载服务器等。
以电子设备100为例说明电子设备的软件系统架构,电子设备200的说明类似。
在一种实施方式中，如图2E所示，电子设备100的应用系统（AP）可以分为三层，从上至下分别为应用程序框架层（framework，FW）、硬件抽象层（hardware abstraction layer，HAL）和内核层（kernel）。应用程序框架层包括分享模块、发现模块、抓取模块、NewTalk功能模块、Wi-Fi功能模块、BT功能模块、D2D功能模块、卫星功能模块。分享模块可以包括一起看（View）的功能模块、一起听（Listen）的功能模块、一起玩（Play）的功能模块、一起编辑（Edit）的功能模块、链路管理模块、安全模块、成员管理模块、质量模块、编解码模块、抓流模块、传输模块、数据处理模块和播放模块，其中：
链路管理(Link Manager)模块用于统一管理NewTalk、卫星等远场通信方式和Wi-Fi、BT、D2D等近场通信方式的链路,例如但不限于包括对一个或多个物理链路进行建立、维系、销毁等操作,这一个或多个物理链路可以包括以下至少一个链路:NewTalk的主链路、NewTalk的辅助链路、卫星链路、D2D链路、BT广播链路、BT单播链路、Wi-Fi广播链路、Wi-Fi单播链路。
安全(Security)模块可以但不限于用于实现证书认证、加密/解密等安全功能。
成员管理(Member Manager)模块用于管理进行实时共享的成员(设备/用户),在一些示例中,可以添加、删除进行实时共享的成员。例如,电子设备100为分享设备时,可以选择对哪些设备/用户进行分享、查看正在播放分享内容的设备/用户,取消分享给某些设备/用户等。成员管理模块可以但不限于通过设备的地址信息、用户的名称信息等标识信息来管理进行实时共享的成员。
质量模块用于管控进行实时共享的用户的体验质量(quality of experience,QoE)。
编解码模块(Codec)用于实现音频(Audio)、视频(Video)、语音(Speech)等数据的编码和解码。
抓流(CaptureStream)模块为抓流功能的适配模块,可以但不限于用于抓取音频、视频和语音等数据流。
传输模块用于管理NewTalk、卫星等远场通信方式和Wi-Fi、BT、D2D等近场通信方式的传输功能。
数据处理模块可以实现至少一种数据处理策略,例如但不限于包括分片(Slice)、聚合(Aggregation)和冗余(Redundancy)。
播放(PlayStream)模块为播放功能的适配模块,可以但不限于用于播放音频、视频和语音等数据流。
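作为数据处理模块中分片（Slice）与接收端聚合的一个示意（按固定MTU切分、不含包头均为简化假设，实际封包通常还携带序号等头部信息）：

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // 示意：发送端把多媒体数据按MTU分片，接收端按序聚合重组
    public class SliceAndAggregate {
        static List<byte[]> slice(byte[] payload, int mtu) {
            List<byte[]> slices = new ArrayList<>();
            for (int off = 0; off < payload.length; off += mtu) {
                slices.add(Arrays.copyOfRange(payload, off, Math.min(off + mtu, payload.length)));
            }
            return slices;
        }

        static byte[] aggregate(List<byte[]> slices) {
            int total = slices.stream().mapToInt(s -> s.length).sum();
            byte[] out = new byte[total];
            int off = 0;
            for (byte[] s : slices) { System.arraycopy(s, 0, out, off, s.length); off += s.length; }
            return out;
        }

        public static void main(String[] args) {
            byte[] frame = new byte[3000];
            List<byte[]> slices = slice(frame, 1400); // 假设MTU为1400字节
            System.out.println("slices=" + slices.size() + ", restored=" + aggregate(slices).length);
        }
    }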
HAL可以包括NewTalk服务模块、Wi-Fi协议栈、D2D协议栈、BT协议栈、卫星服务模块和辅助链路模块。其中,Wi-Fi协议栈可以实现Wi-Fi单播、组播和广播通信。BT协议栈可以实现BT单播、组播和广播通信。在一种实施方式中,辅助链路模块可以包括网络地址转换(network address translation,NAT)穿越和/或中继的端侧服务模块,例如称为NATService,其中,穿越(session traversal utilities for NAT,STUN)可以理解为是一种P2P技术,用于在两点之间直接通信。中继(traversal using relays around NAT,TURN)可以由服务器等网络设备负责通信双方的数据的转发,从而实现两点之间的通信。不限于此,辅助链路模块还可以包括即时通信(real time communication,RTC)的服务模块,该服务模块例如通过即时网络(real time networks,RTN)实现辅助链路的数据传输,进一步提升传输效率和质量。
内核层可以包括传输协议栈、Wi-Fi网卡(network interface controller,NIC)、Wi-Fi驱动(driver)、蜂窝通信网卡、A核数据服务(A-core data service,ADS)、D2D驱动、蓝牙驱动和卫星驱动。其中,传输协议栈可以但不限于包括传输控制协议(transmission control protocol,TCP)/IP协议栈。蜂窝通信网卡的英文全称可以是remote(wireless wide area)network,可以简称为RMNET。RMNET可以是一种modem或其他外部设备作为操作系统提供的远程网卡,可以在操作系统内核形成虚拟网卡设备,例如,modem芯片可以采用这种端侧组网方式和网卡设备。蓝牙驱动例如为蓝牙低能耗(bluetooth low energy,BLE)控制(Control)模块,用于控制BLE的信令。
网络设备300可以包括寻址(wiseFunction)模块、(NAT)穿越(STUN)模块和(NAT)中继(TURN)模块。其中:
寻址模块用于为建立链路进行身份认证和寻址,例如,电子设备100的NewTalk功能模块可以通过网络设备300的寻址模块实现访问令牌(access token,AT)认证和NAT穿越的会话身份(Session)标识(identity document,ID)的交换,电子设备100可以获取到电子设备200的SessionID。类似地,电子设备200的NewTalk功能模块也可以通过网络设备300的寻址模块实现AT认证和NAT穿越的SessionID的交换,电子设备200可以获取到电子设备100的SessionID。SessionID可以用于建立链路,例如NAT穿越链路或NAT中继链路。
(NAT)穿越模块用于实现NAT穿越链路的建立和信令传输,例如,电子设备100的辅助链路模块和电子设备200的辅助链路模块可以通过网络设备300的NAT穿越模块建立P2P穿越链路(辅助链路)和基于该链路进行信令传输。
(NAT)中继模块用于实现NAT中继链路的建立和信令传输,例如,电子设备100的辅助链路模块和电子设备200的辅助链路模块可以通过网络设备300的NAT中继模块建立中继链路(辅助链路)和基于该链路进行信令传输。
如图2E所示,电子设备100和电子设备200之间的通信链路可以包括以下至少一个:
链路1：NewTalk链路，其中，NewTalk链路可以包括IMS通信链路和CS通信链路，IMS通信链路可以但不限于是QoS类标识符（QoS class identifier，QCI）1/QCI2的多媒体通路，或者数据通路（Data channel）。在一些示例中，NewTalk链路可以通过电子设备100的蜂窝通信模块和电子设备200的蜂窝通信模块建立，例如，电子设备100的蜂窝通信模块和基站1连接，基站1和基站2连接，基站2和电子设备200的蜂窝通信模块连接，NewTalk链路为电子设备100的蜂窝通信模块和电子设备200的蜂窝通信模块之间的通信链路。在一些示例中，NewTalk链路用于实现运营商通话（例如通过上述蜂窝通信实现）和/或OTT通话。
链路2:Wi-Fi链路,其中,Wi-Fi链路可以包括单播链路、组播链路和/或广播链路。在一些示例中,Wi-Fi链路可以通过电子设备100的Wi-Fi通信模块和电子设备200的Wi-Fi通信模块建立。在一些示例中,Wi-Fi链路用于实现Wi-Fi通信。
链路3:BT链路,其中,BT链路可以包括单播链路、组播链路和/或广播链路。在一些示例中,BT链路可以通过电子设备100的BT通信模块和电子设备200的BT通信模块建立。在一些示例中,BT链路用于实现BT通信。
链路4:D2D链路。在一些示例中,D2D链路可以通过电子设备100的蜂窝通信模块和电子设备200的蜂窝通信模块建立。在另一些示例中,D2D链路可以通过电子设备100的Wi-Fi通信模块和电子设备200的Wi-Fi通信模块建立。在另一些示例中,D2D链路可以通过电子设备100的无线通信系统中的D2D通信模块(未在图2E示出),以及电子设备200的无线通信系统中的D2D通信模块(未在图2E示出)建立。在一些示例中,D2D链路用于实现D2D通信。
链路5:卫星链路。在一些示例中,卫星链路可以通过电子设备100的卫星通信模块和电子设备200的卫星通信模块建立。在一些示例中,卫星链路用于实现卫星通信。
链路6:辅助链路,辅助链路可以是NAT穿越(P2P直传)和/或NAT中继。在一些示例中,辅助链路在通话态时建立,在另一些示例中,辅助链路在非通话态时建立。辅助链路的物理通道可以但不限于是NewTalk链路、Wi-Fi链路、BT链路、D2D链路、卫星链路等通信链路,具体建立方式可参见上述链路1-链路5的说明,不再赘述。在一些示例中,辅助链路用于实现运营商通话和/或OTT通话。
在一些实施例中,电子设备100和电子设备200可以根据传输场景的需求,选择建立上述链路1-链路5中的至少一种链路(任意一种链路或者多种链路的组合),例如,电子设备100和电子设备200距离较近时可以建立链路3和链路4,建立多条链路可以避免一条链路异常时无法通信或通信质量较差的情况,提升通信的稳定性。
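上述“按传输场景选择至少一种链路”的策略可以用如下假设性草图示意（距离阈值与链路组合均为举例）：

    import java.util.ArrayList;
    import java.util.List;

    // 示意：根据设备间距离、是否处于通话态等信息选择要建立的链路组合
    public class LinkSelector {
        static List<String> select(double distanceMeters, boolean inCall) {
            List<String> links = new ArrayList<>();
            if (distanceMeters < 10) {       // 假设10米内优先近场链路
                links.add("BT");
                links.add("D2D");
            } else if (inCall) {
                links.add("NewTalk");
            } else {
                links.add("Wi-Fi");
            }
            if (links.size() < 2) links.add("辅助链路"); // 建立多条链路以提升稳定性
            return links;
        }

        public static void main(String[] args) {
            System.out.println(select(5.0, false));  // [BT, D2D]
            System.out.println(select(500.0, true)); // [NewTalk, 辅助链路]
        }
    }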
基于图2E所示的电子设备的软件系统,示例性说明不同通信链路的上行/下行数据流,以下示例以电子设备100为分享设备,电子设备200为被分享设备为例进行说明。
示例一,NewTalk链路的上行数据流(在电子设备100的软件系统中的数据流向):抓取模块->分享模块(抓流模块->编解码模块(例如用于编码)->数据处理模块(例如用于封包)->传输模块(例如用于分流))->NewTalk功能模块->NewTalk服务模块->传输协议栈->蜂窝通信网卡->ADS->蜂窝通信模块->空口。NewTalk链路的下行数据流(在电子设备200的软件系统中的数据流向):空口->蜂窝通信模块->ADS->蜂窝通信网卡->传输协议栈->NewTalk服务模块->NewTalk功能模块->分享模块(传输模块(例如用于聚合)->数据处理模块(例如用于解包)->编解码模块(例如用于解码)->播放模块)。
示例二,Wi-Fi链路的上行数据流(在电子设备100的软件系统中的数据流向):抓取模块->分享模块(抓流模块->编解码模块(例如用于编码)->数据处理模块(例如用于封包)->传输模块(例如用于分流))->Wi-Fi功能模块->Wi-Fi协议栈->传输协议栈->Wi-Fi网卡->Wi-Fi驱动->Wi-Fi通信模块->空口。Wi-Fi链路的下行数据流(在电子设备200的软件系统中的数据流向):空口->Wi-Fi通信模块->Wi-Fi驱动->Wi-Fi网卡->传输协议栈->Wi-Fi协议栈->Wi-Fi功能模块->分享模块(传输模块(例如用于聚合)->数据处理模块(例如用于解包)->编解码模块(例如用于解码)->播放模块)。
示例三,BT链路的上行数据流(在电子设备100的软件系统中的数据流向):抓取模块->分享模块(抓流模块->编解码模块(例如用于编码)->数据处理模块(例如用于封包)->传输模块(例如用于分流))->BT功能模块->BT协议栈->BT驱动->BT通信模块->空口。BT链路的下行数据流(在电子设备200的软件系统中的数据流向):空口->BT通信模块->BT驱动->BT协议栈->BT功能模块->分享模块(传输模块(例如用于聚合)->数据处理模块(例如用于解包)->编解码模块(例如用于解码)->播放模块)。
示例四，D2D链路的上行数据流（在电子设备100的软件系统中的数据流向）：抓取模块->分享模块（抓流模块->编解码模块（例如用于编码）->数据处理模块（例如用于封包）->传输模块（例如用于分流））->D2D功能模块->D2D协议栈->D2D驱动->蜂窝通信模块/Wi-Fi通信模块->空口。D2D链路的下行数据流（在电子设备200的软件系统中的数据流向）：空口->蜂窝通信模块/Wi-Fi通信模块->D2D驱动->D2D协议栈->D2D功能模块->分享模块（传输模块（例如用于聚合）->数据处理模块（例如用于解包）->编解码模块（例如用于解码）->播放模块）。不限于此，在另一些示例中，D2D链路的上行数据流中的D2D驱动也可以替换为：传输协议栈->蜂窝通信网卡->ADS，此时蜂窝通信模块/Wi-Fi通信模块具体为蜂窝通信模块。D2D链路的下行数据流中的D2D驱动也可以替换为：ADS->蜂窝通信网卡->传输协议栈，此时蜂窝通信模块/Wi-Fi通信模块具体为蜂窝通信模块。在另一些示例中，D2D链路的上行数据流中的D2D驱动也可以替换为：传输协议栈->Wi-Fi网卡->Wi-Fi驱动，此时蜂窝通信模块/Wi-Fi通信模块具体为Wi-Fi通信模块。D2D链路的下行数据流中的D2D驱动也可以替换为：Wi-Fi驱动->Wi-Fi网卡->传输协议栈，此时蜂窝通信模块/Wi-Fi通信模块具体为Wi-Fi通信模块。在另一些示例中，D2D链路的上行/下行数据流中的蜂窝通信模块/Wi-Fi通信模块可以替换为D2D通信模块（未在图2E示出），D2D通信模块可以包括D2D通信的硬件模块，例如固件和芯片。
示例五,卫星链路的上行数据流(在电子设备100的软件系统中的数据流向):抓取模块->分享模块(抓流模块->编解码模块(例如用于编码)->数据处理模块(例如用于封包)->传输模块(例如用于分流))->卫星功能模块->卫星服务模块->卫星驱动->卫星通信模块->空口。卫星链路的下行数据流(在电子设备200的软件系统中的数据流向):空口->卫星通信模块->卫星驱动->卫星服务模块->卫星功能模块->分享模块(传输模块(例如用于聚合)->数据处理模块(例如用于解包)->编解码模块(例如用于解码)->播放模块)。
示例六,辅助链路的上行数据流(在电子设备100的软件系统中的数据流向):抓取模块->分享模块(抓流模块->编解码模块(例如用于编码)->数据处理模块(例如用于封包)->传输模块(例如用于分流))->NewTalk功能模块->NewTalk服务模块->辅助链路模块->NewTalk/Wi-Fi/BT/D2D/卫星传输模块->空口。辅助链路的下行数据流(在电子设备200的软件系统中的数据流向):空口->NewTalk/Wi-Fi/BT/D2D/卫星传输模块->辅助链路模块->NewTalk服务模块->NewTalk功能模块->分享模块(传输模块(例如用于聚合)->数据处理模块(例如用于解包)->编解码模块(例如用于解码)->播放模块)。其中:
在一种实施方式中,辅助链路的物理通道为NewTalk链路。辅助链路的上行数据流中的NewTalk传输模块为:传输协议栈->蜂窝通信网卡->ADS->蜂窝通信模块,辅助链路的下行数据流中的NewTalk传输模块为:蜂窝通信模块->ADS->蜂窝通信网卡->传输协议栈。
在另一种实施方式中,辅助链路的物理通道为Wi-Fi链路。辅助链路的上行数据流中的Wi-Fi传输模块为:传输协议栈->Wi-Fi网卡->Wi-Fi驱动->Wi-Fi通信模块,辅助链路的下行数据流中的Wi-Fi传输模块为:Wi-Fi通信模块->Wi-Fi驱动->Wi-Fi网卡->传输协议栈。
在另一种实施方式中,辅助链路的物理通道为BT链路。辅助链路的上行数据流中的BT传输模块为:BT驱动->BT通信模块,辅助链路的下行数据流中的BT传输模块为:BT通信模块->BT驱动。
在另一种实施方式中,辅助链路的物理通道为D2D链路。辅助链路的上行数据流中的D2D传输模块为:D2D驱动->蜂窝通信模块/Wi-Fi通信模块/D2D通信模块,辅助链路的下行数据流中的D2D传输模块为:蜂窝通信模块/Wi-Fi通信模块/D2D通信模块->D2D驱动,D2D驱动还可以替换为上述示例四所述的其他模块,具体可参见上述示例四的说明。
在另一种实施方式中,辅助链路的物理通道为卫星链路。辅助链路的上行数据流中的卫星传输模块为:卫星驱动->卫星通信模块,辅助链路的下行数据流中的卫星传输模块为:卫星通信模块->卫星驱动。
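示例一至示例六的上行方向可以抽象为同一条“抓流→编码→封包→分流”的发送流水线，下面的Java草图仅示意各阶段的串接方式，各阶段的具体实现（压缩比、包头长度等）均为假设：

    // 示意：上行方向“抓流→编码→封包→分流”的流水线串接
    public class UplinkPipeline {
        interface Stage { byte[] process(byte[] in); }

        static final Stage CAPTURE = in -> in;                      // 抓流：取得原始多媒体数据
        static final Stage ENCODE = in -> new byte[in.length / 2];  // 编码：假设压缩为一半大小
        static final Stage PACKETIZE = in -> {                      // 封包：假设附加8字节包头
            byte[] out = new byte[in.length + 8];
            System.arraycopy(in, 0, out, 8, in.length);
            return out;
        };

        static void send(byte[] media, Stage... stages) {
            byte[] data = media;
            for (Stage s : stages) data = s.process(data);
            System.out.println("分流：交给所选链路发送 " + data.length + " 字节");
        }

        public static void main(String[] args) {
            send(new byte[4096], CAPTURE, ENCODE, PACKETIZE);
        }
    }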
下面示例性介绍本申请实施例涉及的应用场景以及该场景下的用户界面(user interface,UI)示例。以下示例以电子设备100为分享设备为例进行说明。
图3示例性示出一种通话界面的示意图。
如图3所示，电子设备100（用户A，电话号码1）和电子设备200（用户B，电话号码2）之间可以进行运营商通话/OTT通话等NewTalk。如图3的（A）所示，电子设备100可以显示通话应用的用户界面310（可简称为通话界面310），通话界面310包括通话信息311和悬浮窗312，通话信息311包括通话对方的信息（联系人名称“用户B”和通讯号码“电话号码2”），以及通话时长“1秒”。悬浮窗312包括多个选项，例如切换通话模式的选项312A、发送位置信息的选项312B、发送文件的选项312C和分享选项312D。如图3的（B）所示，电子设备200可以显示通话应用的用户界面320，用户界面320和通话界面310类似，也包括通话信息321和悬浮窗322，通话信息321包括通话对方的信息（联系人名称“用户A”和通讯号码“电话号码1”）以及通话时长“1秒”，悬浮窗322和悬浮窗312一致，悬浮窗322也包括分享选项322A。
在一种实施方式中,电子设备100在通话界面310显示悬浮窗312预设时长(图3以10秒为例进行示意)后,可以取消显示悬浮窗312的详细信息,而是显示悬浮窗312的图标,可以称为是悬浮窗312在通话界面310停留预设时长后收起,例如可以靠屏幕左、右、上或下侧边缘收起,具体示例可参见图3的(C)所示的用户界面330。电子设备100显示的用户界面330中,悬浮窗312在屏幕左侧边缘以图标形式显示。在一种实施方式中,电子设备100可以响应于针对图3的(C)所示的用户界面330中的悬浮窗312的触摸操作(例如点击操作),显示悬浮窗312的详细信息,例如显示图3的(A)所示的通话界面310。
在一种实施方式中,在图3的(A)和图3的(B)之后,通话态的电子设备100可以显示其他应用的用户界面,该应用的多媒体数据流可以实时分享给通话对方、附近设备,具体示例可参见图4A(图4A以将短视频应用的多媒体数据流实时分享给通话对方:电子设备200为例进行示意)。
如图4A所示,电子设备100可以显示短视频应用的用户界面410,用户界面410可以包括位于顶部的通话控件411、短视频的播放窗口412和悬浮窗312。通话控件411可以表征当前电子设备100处于通话态,且通话时长为33秒。播放窗口412用于显示播放的短视频,例如当前正在播放“用户1”发布的名称为“主题1”的短视频1。在一些示例中,悬浮窗312中的分享选项312D用于触发将前台应用(图4A以短视频应用为例进行示意)的多媒体数据流实时分享给通话对方(电子设备200/用户B)。
在一种实施方式中,电子设备100可以响应于针对图4A所示的用户界面410中的分享选项312D的触摸操作(例如点击操作),向电子设备200发送分享请求。电子设备200接受该分享请求后,电子设备100可以通过蜂窝通信方式将和播放窗口412相关的音频流(例如短视频1的音频和/或麦克风采集的音频)和/或视频流(例如短视频1的图像和/或摄像头采集的图像)发送至电子设备200。电子设备100可以显示图4B所示的用户界面420。相比图4A所示的用户界面410,用户界面420不包括悬浮窗312,并且,用户界面420中的播放窗口412为选中状态,可以表征当前正在分享播放窗口412相关的音频流和/或视频流(即为分享内容)。用户界面420还包括分享控制选项421,分享控制选项421用于触发显示分享菜单,分享菜单例如但不限于包括暂停/退出分享、变更分享内容、变更被分享设备等功能选项。
在一些示例中,电子设备100向电子设备200发送分享请求的通信方式可以是蜂窝通信方式,在另一些示例中,也可以是近场通信方式等其他通信方式。
在一种实施方式中,电子设备100可以响应于针对图4B所示的用户界面420中的分享控制选项421的触摸操作(例如点击操作),显示分享菜单,例如显示图4C所示的用户界面430。相比图4B所示的用户界面420,用户界面430还包括分享菜单431,分享菜单431可以包括多个选项,例如选项431A、选项431B、选项431C、选项431D、选项431E和选项431F。其中,选项431A包括字符“仅当前应用(画面+音频)”,用于设置分享内容为前台应用(图4B以短视频应用为例进行示意)的图像和音频(例如播放窗口412播放的短视频1的图像和音频)。选项431B包括字符“仅当前应用(音频)”,用于设置分享内容为前台应用的音频(例如播放窗口412播放的短视频1的音频)。选项431C包括字符“仅当前应用(画面)”,用于设置分享内容为前台应用的图像(例如播放窗口412播放的短视频1的图像)。选项431D包括字符“整个屏幕”,用于设置分享内容为电子设备100的屏幕的显示内容(例如用户界面430相关的图像/音频)。选项431E包括字符“暂停分享”,用于取消/暂停/停止实时分享。选项431F包括字符“更多”,用于触发显示更多的功能选项,例如是否分享麦克风采集的音频,是否分享摄像头采集的图像,是否允许保存,是否允许转发等。
在一种实施方式中,在图4A之后,电子设备200接收到电子设备100发送的分享请求时,可以显示提示信息,例如显示图5A所示的用户界面510。用户界面510和图3的(B)所示的用户界面320类似,区别在于,用户界面510不包括悬浮窗322,但包括提示信息511。提示信息511包括分享内容(以图4A所示的用户界面410中的播放窗口412播放的短视频1的音频流/视频流为例)所属的短视频应用的图标511A,字符“用户A邀请您一起看”,以及接受控件511B。
在一种实施方式中，电子设备200可以响应于针对图5A所示的用户界面510中的接受控件511B的触摸操作（例如点击操作），通过蜂窝通信方式接收电子设备100发送的分享内容，并显示该分享内容，例如显示图5B所示的用户界面520。用户界面520可以包括位于顶部的通话控件521、分享内容的播放窗口522、分享控制选项523和提示框524。通话控件521可以表征当前电子设备200处于通话态，且通话时长为35秒。提示框524包括字符“正在观看用户A分享的内容”。播放窗口522用于显示分享内容，例如图4A所示的用户界面410中的播放窗口412显示的图像。分享控制选项523用于触发显示分享菜单，分享菜单例如但不限于包括暂停/退出播放分享内容的选项。
在一种实施方式中,电子设备200可以响应于针对图5B所示的用户界面520中的分享控制选项523的触摸操作(例如点击操作),显示图5C所示的用户界面530。相比图5B所示的用户界面520,用户界面530还包括分享菜单531,分享菜单531可以包括多个选项,例如选项531A和选项531B。其中,选项531A包括字符“退出观看”,用于暂停/退出分享内容的播放界面。选项531B包括字符“更多”,用于触发显示更多的功能选项,例如用于触发向其他用户实时分享音频流/视频流的选项。电子设备200可以响应于针对选项531A的触摸操作(例如点击操作),退出观看当前播放的分享内容,例如显示图3的(B)所示的用户界面320。在一些示例中,电子设备200再次接收到电子设备100发送的分享请求时,可以响应于用户操作,接受该分享请求,再次播放分享内容,例如显示图5B所示的用户界面520。
在一种实施方式中,电子设备200可以响应于针对图5B所示的用户界面520中的通话控件521的触摸操作(例如点击操作),返回显示通话界面,例如显示图3的(B)所示的用户界面320。在上述实施方式中,电子设备200可以但不限于按照以下三种情况工作:
情况1:电子设备200接收到针对图5B所示的用户界面520中的通话控件521的触摸操作后,不向电子设备100发送通知消息,因此,电子设备100会继续向电子设备200发送分享内容。在一些示例中,电子设备200可以基于接收到的分享内容,在后台运行分享内容的播放界面。
情况2:电子设备200接收到针对图5B所示的用户界面520中的通话控件521的触摸操作后,向电子设备100发送通知消息,电子设备100接收到该通知消息后,不向电子设备200发送分享内容。
情况3:电子设备200接收到针对图5B所示的用户界面520中的通话控件521的触摸操作后,向电子设备100发送通知消息,电子设备100接收到该通知消息(例如与分辨率和/或帧率相关)后,降低分享内容的传输带宽(例如通过降低分享内容的分辨率、帧率、码率等实现),从而节省设备功耗和传输资源。在一些示例中,电子设备200可以基于接收到的降低传输带宽后的分享内容,在后台运行分享内容的播放界面。
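情况3中“根据通知消息降低/恢复传输带宽”的调节逻辑可以示意如下（各档位的分辨率、帧率、码率均为假设值）：

    // 示意：对端切后台时降低分享内容的编码参数，对端恢复播放时再调回
    public class BandwidthController {
        int width = 1920, height = 1080, fps = 60, bitrateKbps = 8000;

        void onPeerBackground() { // 收到“对端切到后台”的通知消息
            width = 640; height = 360; fps = 15; bitrateKbps = 800;
        }

        void onPeerForeground() { // 收到“对端重新播放”的通知消息
            width = 1920; height = 1080; fps = 60; bitrateKbps = 8000;
        }

        public static void main(String[] args) {
            BandwidthController c = new BandwidthController();
            c.onPeerBackground();
            System.out.println(c.width + "x" + c.height + "@" + c.fps + "fps, " + c.bitrateKbps + "kbps");
        }
    }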
不限于上述实施方式,在另一种实施方式中,电子设备200可以响应于针对图5B所示的用户界面520中的通话控件521的触摸操作(例如点击操作),分屏显示通话界面(例如图3的(B)所示的用户界面320)和分享内容的播放界面(例如图5B所示的用户界面520),不限于此,在另一种实施方式中,电子设备200可以响应于针对图5B所示的用户界面520中的通话控件521的触摸操作(例如点击操作),显示通话界面,并且在通话界面以悬浮小窗的形式显示分享内容的播放界面,本申请对具体显示方式不作限定。在上述两种实施方式中,电子设备200可以但不限于按照上述三种情况工作,在一些示例中,在上述情况1下,电子设备200接收到分享内容后,需要先对分享内容进行处理(例如降低分辨率、降低帧率等),再以分屏形式或悬浮小窗的形式显示处理后的分享内容。
在一种实施方式中,电子设备200返回显示通话界面(假设为图3的(B)所示的用户界面320)后,电子设备200可以响应于针对用户界面320所示的悬浮窗322中的分享选项322A的触摸操作(例如点击操作),重新播放分享内容,例如显示图5B所示的用户界面520。不限于此,在另一种实施方式中,电子设备200可以响应于针对上述通话界面的触摸操作(例如从下往上滑动),显示多任务窗口/多任务列表的用户界面,例如显示图5D所示的用户界面540。用户界面540用于显示窗口列表,窗口列表包括电子设备200上运行的至少一个窗口,例如短信息应用的窗口541、实时共享的窗口542和通话应用的窗口543。实时共享的窗口542上显示有实时共享功能的图标和字符“一起看”542A,实时共享的窗口542用于指示分享内容的播放窗口。电子设备200可以响应于针对窗口542的触摸操作(例如点击操作),重新播放分享内容。可以理解为是,上述电子设备200返回显示通话界面时,电子设备200在后台运行分享内容的播放界面,上述电子设备200重新播放分享内容可以是将分享内容切换为在前台运行分享内容的播放界面。
在一些示例中,在上述情况1下,电子设备200可以直接基于接收到的分享内容,重新播放分享内容。在另一些示例中,在上述情况2下,电子设备200可以响应于上述针对用户界面320所示的悬浮窗322中的分享选项322A的触摸操作,或者上述针对窗口542的触摸操作,向电子设备100发送通知消息,电子设备100接收到该通知消息后,向电子设备200发送分享内容以用于电子设备200重新播放分享内容。在另一些示例中,在上述情况3下,电子设备200可以响应于上述针对用户界面320所示的悬浮窗322中的分享选项322A的触摸操作,或者上述针对窗口542的触摸操作,向电子设备100发送通知消息,电子设备100接收到该通知消息后,增大分享内容的传输带宽(例如通过增大分享内容的分辨率、帧率、码率等实现),电子设备200可以基于接收到的增大传输带宽后的分享内容,重新播放分享内容。
在一种实施方式中，电子设备200分屏显示通话界面和分享内容的播放界面时，可以响应于针对分享内容的播放界面的用户操作（例如拖动分屏界面中通话界面和分享内容的播放界面之间的拖动条），全屏显示分享内容的播放界面，例如显示图5B所示的用户界面520，具体示例和上述电子设备200返回显示通话界面后的示例类似，不再赘述。
在一种实施方式中,电子设备200在通话界面以悬浮小窗的形式显示分享内容的播放界面时,可以响应于针对悬浮小窗的用户操作,全屏显示分享内容的播放界面,例如显示图5B所示的用户界面520,具体示例和上述电子设备200返回显示通话界面后的示例类似,不再赘述。
不限于图4A-图4B所示的实施方式(通过悬浮窗触发实时分享功能),在另一种实施方式中,还可以通过滑动操作触发实时分享功能,例如,该滑动操作为上下滑动、左右滑动或以特定轨迹滑动等,具体示例可参见图6A所示的用户界面610。
如图6A所示,用户界面610和图4A所示的用户界面410类似,区别在于,用户界面610中的悬浮窗312为收起状态,例如在屏幕左侧边缘以图标形式显示。电子设备100可以响应于针对用户界面610的滑动操作(图6A以该滑动操作为指关节按照“W”的特定轨迹滑动为例进行说明),显示分享内容和分享对象的选择界面,例如显示图6B所示的用户界面620。
如图6B所示,用户界面620包括可选择的分享内容的列表621和可选择的分享对象的列表622。其中,列表621可以包括选项621A、选项621B和选项621C。选项621A下显示有字符“共享短视频应用”,选项621A用于指示前台应用(图6B以短视频应用为例进行示意)的窗口。选项621B下显示有字符“共享屏幕”,选项621B用于指示电子设备100的屏幕的显示内容。选项621C下显示有字符“共享视频应用”,选项621C用于指示后台应用(图6B以视频应用为例进行示意)的窗口。不限于上述示例的情况,在另一些示例中,电子设备100的后台应用可以更少或者更多,例如,电子设备100未运行视频应用,则列表621不包括选项621C,或者,电子设备100还运行了其他后台应用(例如短信息应用),则列表621还可以包括指示短信息应用的窗口的选项。在一些示例中,电子设备100可以响应于针对列表621的触摸操作(例如左右滑动),显示列表621包括的其他选项。
在一种实施方式中，电子设备100可以响应于针对列表621中的任意一个选项的触摸操作（例如点击操作），选择该选项相关的音频流/视频流为分享内容，或者取消该选择。例如，电子设备100可以响应于针对选项621A的触摸操作（例如点击操作），选择短视频应用的音频流/视频流为分享内容，此时选项621A可以为图6B所示的选中状态，选项621B和选项621C可以为图6B所示的未选中状态。如图6B所示，列表621上可以显示有提示信息623，提示信息623可以指示选择的分享内容的数量，例如当前为“已选择1项”，可以表征当前已选择了1个分享内容（即上述短视频应用的音频流/视频流）。例如，电子设备100可以响应于针对选项621B的触摸操作（例如点击操作），选择电子设备100的屏幕的显示内容（可选地，还可以包括电子设备100的扬声器的播放内容）为分享内容，在这种情况下，电子设备100显示图6A所示的用户界面610时实时分享用户界面610相关的音频流/视频流，电子设备100响应于用户操作将显示界面切换为图3的（A）所示的通话界面310时，实时分享通话界面310相关的音频流/视频流。
不限于上述示例的情况,在另一些示例中,用户可以基于列表621选择多个分享内容,电子设备100可以将用户选择的多个分享内容发送给被分享设备。在一种情况下,被分享设备可以分屏显示上述多个分享内容,界面示例和图21E类似。不限于此,在另一种情况下,被分享设备也可以响应于用户操作确定显示的分享内容,例如,被分享设备可以默认显示上述多个分享内容中的一个分享内容,接收到用于切换分享内容的用户操作时,显示上述多个分享内容中的其他分享内容。在另一种情况下,被分享设备可以通过连接的设备一起显示上述多个分享内容,例如,被分享设备可以显示一个分享内容,和被分享设备连接的设备显示另一个分享内容。本申请对被分享设备显示多个分享内容的方式不作限定。在一些示例中,电子设备100可以响应于用户操作,向被分享设备发送上述多个分享内容中的N个分享内容的音频数据,不向被分享设备发送其他分享内容的音频数据,N为正整数。在另一些示例中,被分享设备接收到电子设备100发送的多个分享内容的音频数据后,可以响应于用户操作,播放其中M个分享内容的音频数据,M为正整数,从而避免多个音频数据一起播放影响用户体验。
如图6B所示，列表622包括指示通话对方（即用户B/电子设备200）的选项622A和指示附近设备的多个选项，选项622A包括字符“电话号码2（通话中）”，其中电话号码2为通话对方的通讯号码。指示附近设备的多个选项例如包括选项622B、选项622C、选项622D、选项622E、选项622F。选项622B包括字符“用户C的手机”，用于指示设备类型为“手机”、相关的用户名称为“用户C”的附近设备。选项622C包括字符“我的笔记本”，用于指示设备类型为“笔记本”，相关的用户名称为使用电子设备100的用户A的附近设备。其他选项类似，选项622D包括字符“用户D的平板电脑”，选项622E包括字符“用户C的耳机”，选项622F包括字符“用户E的音箱”。列表622还包括选项622G，选项622G用于触发显示更多的功能选项，例如查看更多的附近设备，选择列表622示出的全部选项（即将这些选项指示的附近设备设置为分享对象）等。不限于上述示例的情况，在另一些示例中，附近设备可以更多或更少，相应地，列表622包括的选项可以更多或更少。
在一种实施方式中,电子设备100可以响应于针对列表622中的任意一个选项的触摸操作(例如点击操作),选择该选项指示的设备为分享对象,或者取消该选择。
在一些示例中,电子设备100可以响应于针对列表622中的选项622A的触摸操作(例如点击操作),选择选项622A指示的通话对方(即电子设备200)为分享对象,在一些示例中,电子设备100接收到针对选项622A的触摸操作(例如点击操作)之后,选项622A可以为选中状态,具体示例可参见图6C所示的用户界面630,用户界面630中的选项622A包括的字符为“用户B(电话号码2)正在观看”。
在一些示例中,电子设备100可以响应于针对选项622A的触摸操作(例如点击操作),向选项622A指示的通话对方(即电子设备200)发送分享请求,具体说明和图4A-图4C、图5A-图5D所示的实施方式类似,其中,在一种情况下,电子设备100向电子设备200发送分享请求后,可以继续显示分享内容和分享对象的选择界面,例如显示图6C所示的用户界面630,电子设备100可以响应于针对用户界面630中的收起选项631的触摸操作(例如点击操作或者上下滑动),返回显示上一级界面,例如图6A所示的用户界面610。不限于此,在另一种情况下,用户基于列表621选择的分享内容为后台应用(假设为视频应用)的音频流/视频流,电子设备100可以将视频应用切换至前台运行,并将视频应用的音频流/视频流分享给电子设备200。电子设备100可以响应于上述针对用户界面630中的收起选项631的触摸操作,显示视频应用的用户界面。
不限于上述示例的情况,在另一些示例中,用户可以基于列表622选择多个分享对象,电子设备100可以将分享内容发送给用户选择的多个分享对象。例如,电子设备100可以依次接收针对列表622中的选项622A、选项622B、选项622C、选项622D、选项622E和选项622F的触摸操作(例如点击操作),此时,电子设备100可以显示图6D所示的用户界面640,在用户界面640中,选项622A、选项622B、选项622C、选项622D、选项622E和选项622F均为选中状态,可以表征用户已选择选项622A、选项622B、选项622C、选项622D、选项622E和选项622F指示的设备为分享对象。选项622A包括的字符为“用户B(电话号码2)正在观看”,选项622B包括的字符为“用户C(手机)正在观看”,选项622C包括的字符为“我的笔记本正在播放”,选项622D包括的字符为“用户D(平板电脑)正在观看”,选项622E包括的字符为“用户C(耳机)正在收听”,选项622F包括的字符为“用户E(音箱)正在收听”。
在另一种实施方式中,还可以通过多任务列表/多任务窗口的用户界面触发实时分享功能,例如,电子设备100可以响应于针对图4A所示的用户界面410的触摸操作(例如从下往上滑动),显示多任务列表/多任务窗口的用户界面,具体示例可参见图7A所示的用户界面710。
如图7A所示,用户界面710用于显示窗口列表,窗口列表包括电子设备100上运行的至少一个窗口,例如通话应用的窗口711、短视频应用的窗口712和视频应用的窗口713。任意一个窗口上可以显示有应用程序的图标和名称,以及用于触发实时分享该应用的音频流/视频流的分享控件,例如,短视频应用的窗口712上显示有短视频应用的图标和名称“短视频”712A,以及分享控件712B。在一些示例中,电子设备100可以响应于针对分享控件712B的触摸操作(例如点击操作),显示可选择的分享对象的列表,例如显示图7B所示的用户界面720。
如图7B所示，用户界面720和图7A所示的用户界面710类似，区别在于，用户界面720还包括可选择的分享对象的列表721，并且，短视频应用的窗口712为选中状态，分享控件712B为选中状态。列表721和图6B所示的用户界面620中的列表622类似，包括指示通话对方（即用户B/电子设备200）的选项721A和指示附近设备的多个选项，指示附近设备的多个选项例如包括选项721B（包括字符“用户C的手机”）、选项721C（包括字符“我的笔记本”）和选项721D（包括字符“用户D的平板电脑”）。在一种实施方式中，电子设备100可以响应于针对列表721中的任意一个选项的触摸操作（例如点击操作），选择该选项指示的设备为分享对象，或者取消该选择，具体示例和上述电子设备100响应于针对图6B所示的用户界面620包括的列表622中的任意一个选项的触摸操作的示例类似。不限于此，在另一些示例中，用户可以基于列表721选择多个分享对象，电子设备100可以将分享内容发送给用户选择的多个分享对象。例如，电子设备100可以显示图7C所示的用户界面730，在用户界面730中，列表721中的选项721A、选项721B、选项721C和选项721D均为选中状态，可以表征：用户已选择选项721A、选项721B、选项721C和选项721D指示的设备为分享对象。选项721A包括的字符为“用户B（电话号码2）正在观看”，选项721B包括的字符为“用户C（手机）正在观看”，选项721C包括的字符为“我的笔记本正在播放”，选项721D包括的字符为“用户D（平板电脑）正在观看”。
在一些示例中,电子设备100选择分享对象后,可以接收针对图7B所示的用户界面720或图7C所示的用户界面730中的短视频应用的窗口712的触摸操作(例如点击操作),显示分享内容的播放界面,例如显示图4B所示的用户界面420。
不限于上述示例的情况,在另一些示例中,被分享设备显示的多任务列表/多任务窗口还包括实时共享内容的显示窗口,该显示窗口上也可以显示有分享控件,该分享控件用于触发将上述实时共享内容再分享给其他设备。例如,实时共享内容的显示窗口可以为图5D所示的用户界面540中的窗口542,窗口542上可以显示图7A所示的用户界面710中的分享控件712B。
在另一种实施方式中,还可以通过通知界面触发实时分享功能,例如,电子设备100可以响应于针对图4A所示的用户界面410的触摸操作(例如从上往下滑动),显示通知界面,具体示例可参见图8A所示的用户界面810。
如图8A所示，用户界面810包括后台应用（图8A以视频应用为例进行示意）的通知栏811、Wi-Fi功能的控件812、蓝牙功能的控件813和菜单814。控件812可以用于开启或关闭电子设备100的Wi-Fi功能，还可以用于选择连接的Wi-Fi信号源（图8A以已连接名称为“信号源1”的Wi-Fi信号源为例进行示意）。控件813可以用于开启或关闭电子设备100的蓝牙功能，还可以用于选择电子设备100通过蓝牙连接的设备（图8A以已连接名称为“耳机1”的设备为例进行示意）。菜单814可以包括多个功能的控件，例如手电筒的控件、飞行模式的控件、移动数据的控件814A、自动旋转的控件、即时分享的控件814B、定位功能的控件、截屏功能的控件、静音功能的控件、屏幕录制的控件和NFC的控件等。其中，控件814A用于开启或关闭电子设备100的移动数据（也可称为是开启或关闭蜂窝通信功能）。控件814B下显示有字符“即时分享”814C和控件814D，控件814B可以用于开启或关闭电子设备100的即时分享功能，控件814D可以触发显示即时分享功能的更多功能信息，例如选择即时分享的方式。用户界面810中的控件812、控件813和控件814A均为开启状态，并且，用户界面810中位于顶部的状态信息815包括“5G”、Wi-Fi和蓝牙的标识，可以表征电子设备100当前已开启移动数据、Wi-Fi功能和蓝牙功能。在一些示例中，电子设备100可以响应于针对控件814B的触摸操作（例如点击操作），显示分享内容和分享对象的选择界面，例如显示图6B所示的用户界面620。
以上示例可以通过悬浮窗中的按钮、滑动操作、多任务列表/多任务窗口、通知界面中的按钮触发实现通话中一起看、一起听等实时共享功能,使用方便灵活,用户体验感好。
不限于上述实施方式,在另一种实施方式中,进行通信的多个设备还可以通过摄像头采集用户的人脸图像,并将采集的图像共享给其他用户,其中,该图像可以是设备当前采集的,也可以是设备之前采集的(例如进行通信之前采集的),本申请对此不作限定。对于其中一个用户而言,使用的电子设备可以显示至少一个窗口,每个窗口可以显示一个用户的图像,例如图15C所示的用户界面1540中的控件1541。在一些示例中,分享用户可以选择电子设备显示的至少一个窗口,从而选择这至少一个窗口对应的设备/用户为分享对象,例如,上述可选择的分享对象的列表可以包括显示有用户的图像的至少一个窗口。在另一种实施方式中,设备1发现其他任意一个设备(假设为设备2)时,设备2可以响应设备1,并在响应设备1时将使用设备2的用户的头像(例如联系人中的头像、即时分享中的头像或聊天应用中的图像等)发送给设备1,设备1显示的分享对象的列表可以包括该头像,该头像可以用于触发向设备2进行实时共享。本申请对分享对象的显示方式不作限定。类似地,上述可选择的分享内容的列表也可以包括图标,本申请对分享内容的显示方式不作限定。
不限于上述实施方式(用户基于电子设备100显示的分享对象的列表选择分享对象),在另一种实施方式中,用户也可以通过电子设备100的扫一扫功能自定义添加分享对象。在一些示例中,电子设备100可以响应于针对图6B所示的用户界面620中的选项622G的触摸操作,显示选择设备的选项,电子设备100可以响应于针对该选项的触摸操作,通过摄像头拍摄附近的电子设备和/或用户,并从拍摄的图像中选择电子设备和/或用户作为分享对象进行实时分享,具体示例可参见图8B和图8C。
如图8B所示，电子设备100可以显示用户界面820，用户界面820可以包括通过扫一扫功能拍摄到的图像821，图像821可以包括电子设备100的使用者选中的用户821A和用户821B，用户界面820还可以包括电子设备100根据选中用户识别到的具体设备：用户821A对应的设备822（包括字符“用户M的手机”）和用户821B对应的设备823（包括字符“用户N的手机”），在一些示例中，电子设备100可以响应于针对用户界面820包括的任意一个设备/用户的触摸操作，向该设备/该用户对应的设备实时分享。用户界面820还包括扫一扫的控件824，控件824可以触发重新通过摄像头拍摄图像。
在一种实施方式中,电子设备100根据拍摄图像中的选中用户识别对应的设备之前,用户(例如上述用户821A)需要将人体特征信息(例如人脸)录入到使用的电子设备(例如上述设备822)上,或者,电子设备实时/周期性(例如每天2次)/不定期(例如用户每次使用相机时)采集和提取使用者的人体特征信息(例如人脸)。在一种实施方式中,电子设备100的使用者选择通过扫一扫功能拍摄到的图像中的至少一个用户后,电子设备100可以识别得到这至少一个用户的特征信息,例如但不限于包括:性别、头发长度、预测年龄、肤色、是否佩戴眼镜、服装类型、服装颜色、人脸数据等。电子设备100可以广播(例如通过Wi-Fi或BT)识别到的特征信息的原始数据或者关键数据,其他设备接收到广播消息后,可以将存储的人体特征信息和广播消息中的数据进行匹配,若匹配成功则向广播发送方(即电子设备100)发送响应消息。电子设备100可以根据响应消息显示选中用户对应的设备(例如上述设备822和设备823),以供使用者选择分享对象。可以理解地,仅广播关键数据可以减小数据传输量,更加高效地识别选中用户对应的设备。不限于上述实施方式,在另一种实施方式中,电子设备100也可以通过第三方设备(例如附近的电子设备、服务器等网络设备)识别选中用户对应的设备。例如,电子设备100可以将选中用户的特征信息和/或电子设备100的位置信息(例如但不限于包括定位信息、蜂窝小区的信息、Wi-Fi ID等)发送给第三方设备,第三方设备可以根据接收到的信息进行匹配查询,将查询到的和选中用户匹配的设备信息返回给电子设备100。
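上述“广播关键特征数据→设备本地比对→应答”的识别流程可以用如下假设性草图示意（特征字段与匹配规则均为简化举例）：

    import java.util.Map;

    // 示意：广播选中用户的关键特征，收到广播的设备与本地录入的特征逐项比对
    public class FeatureMatcher {
        static boolean match(Map<String, String> broadcast, Map<String, String> local) {
            for (Map.Entry<String, String> e : broadcast.entrySet()) {
                if (!e.getValue().equals(local.get(e.getKey()))) return false;
            }
            return true; // 全部命中则向广播发送方回复响应消息
        }

        public static void main(String[] args) {
            Map<String, String> b = Map.of("glasses", "yes", "coatColor", "blue");
            Map<String, String> l = Map.of("glasses", "yes", "coatColor", "blue", "hair", "short");
            System.out.println(match(b, l)); // true：该设备应答，成为候选分享对象
        }
    }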
在一些示例中,电子设备100可以响应于针对图8B所示的用户界面820中的任意一个用户的触摸操作,取消选择该用户。在一些示例中,电子设备100可以响应于针对图8B所示的用户界面820中的任意一个设备的触摸操作,删除该设备。例如,取消选择用户821A或者删除设备822时,电子设备100取消显示图像821中的用户821A外部的圆圈,同时取消显示用户界面820中的设备822。
如图8C所示,电子设备100可以显示用户界面830,用户界面830可以包括通过扫一扫功能拍摄到的图像831,图像831可以包括电子设备100的使用者选中的设备831A和设备831B,用户界面830还可以包括根据选中设备识别到的具体设备:设备831A对应的设备832(包括字符“用户S的笔记本”)和设备831B对应的设备833(包括字符“用户T的眼镜”)。在一些示例中,电子设备100可以响应于针对用户界面830包括的任意一个设备的触摸操作,向该设备实时分享。用户界面830还包括扫一扫的控件834,控件834可以触发重新通过摄像头拍摄图像。
在一种实施方式中,电子设备100根据拍摄的图像中的选中设备识别对应的具体设备时,可以识别以下至少一项:图像中的选中设备的类型(例如为笔记本或者手机),图像中的选中设备的设备制造商/品牌(例如通过图像中的选中设备的商标(logo)识别),设备的外观特征(例如颜色)。电子设备100可以将识别到的特征以广播方式或者通过第三方设备进行匹配查询,以获取到并显示选中设备对应的具体设备(例如上述设备832和设备833),以供使用者选择分享对象。以广播方式进行匹配查询、通过第三方设备进行匹配查询的说明可参见上述电子设备100根据拍摄图像中的选中用户识别对应的设备中,广播识别选中用户对应的设备、通过第三方设备识别选中用户对应的设备的说明。
在一些示例中,电子设备100也可以响应于针对图8C所示的用户界面830中的任意一个设备的触摸操作,取消选择该设备/删除该设备,具体说明和图8B的说明类似,不再赘述。
不限于上述示例的情况,在另一些示例中,用户A也可以从和电子设备100通信的其他电子设备(假设为电子设备200)拍摄的图像中选择电子设备和/或用户作为分享对象进行实时分享,这样即使用户A和上述选择的分享对象的距离较远,也可以通过电子设备200拍摄的图像自定义添加上述选择的分享对象。例如,用户A使用电子设备100和使用电子设备200的用户B进行NewTalk时,用户B可以操作电子设备200开启摄像头并拍摄附近的电子设备和/或用户,拍摄的图像可以分享给电子设备100显示(例如通过图15C所示的用户界面1540中的控件1541显示电子设备200拍摄的图像)。假设用户A从该图像中选择了用户C和用户D作为分享对象,电子设备100可以将分享数据发送给电子设备200,电子设备200再将分享数据转发给用户C使用的电子设备和用户D使用的电子设备。
不限于上述实施方式,在另一种实施方式中,用户A也可以通过电子设备100的碰一碰功能(例如通过NFC实现)获取到附近电子设备和/或用户的信息,并基于获取到的信息自定义添加至少一个设备为分享对象以进行实时分享,本申请对自定义添加分享对象的方式不作限定。
上述针对图4A所示的用户界面410包括的悬浮窗312中的分享选项312D的触摸操作，上述针对图6A所示的用户界面610的滑动操作（图6A示例的该滑动操作为指关节按照“W”的特定轨迹滑动），上述针对图7A所示的用户界面710中的分享控件712B的触摸操作，以及上述针对图8A所示的用户界面810中的控件814B的触摸操作，可以统称为用于触发实时分享功能/实时共享功能的用户操作。不限于此，用于触发实时分享功能的用户操作还可以有其他形式，例如针对图3的（A）所示的通话界面310中的分享选项312D的触摸操作（例如点击操作），语音输入，手势等，本申请对此不作限定。
不限于上述实施方式,在另一种实施方式中,电子设备100也可以在非通话态下接收用于触发实时分享功能的用户操作,该实时分享功能可以通过近场通信技术实现。
在一些示例中，电子设备100可以响应于该用户操作，显示分享内容和分享对象的选择界面。例如，电子设备100可以显示图9A所示的用户界面910，用户界面910和图6B所示的用户界面620类似，区别在于，用户界面910中位于顶部的状态栏不包括通话图标，表征电子设备100当前处于非通话态。并且，用户界面910中的可选择的分享对象的列表911不包括指示通话对方的选项，仅包括指示附近设备的多个选项。不限于此，在另一些示例中，分享内容和分享对象的选择界面也可以为图9B所示的用户界面920，用户界面920和图7B所示的用户界面720类似，区别在于，用户界面920中位于顶部的状态栏不包括通话图标，表征电子设备100当前处于非通话态，并且，用户界面920中的可选择的分享对象的列表921不包括指示通话对方的选项，仅包括指示附近设备的多个选项。
在一些示例中，电子设备100可以接收针对上述指示附近设备的多个选项中的任意一个选项（以选项622B为例进行示意）的触摸操作，向选项622B指示的电子设备400（即“用户C”的“手机”）发送分享请求。电子设备400接收到电子设备100发送的分享请求时，可以显示提示信息，例如显示图9C所示的用户界面930。用户界面930可以为电子设备400的桌面，用户界面930中位于顶部的状态信息931包括“5G”和蓝牙的标识，可以表征电子设备400当前已开启移动数据和蓝牙功能。用户界面930还包括提示信息932，提示信息932包括分享内容（以图4A所示的用户界面410中的播放窗口412播放的短视频1的音频流/视频流为例）所属的短视频应用的图标932A，字符“用户A邀请您一起看”，以及接受控件932B。在一些示例中，电子设备400可以响应于针对接受控件932B的触摸操作（例如点击操作），通过近场通信技术（例如蓝牙）接收电子设备100发送的分享内容，并显示该分享内容，例如显示图5B所示的用户界面520中的播放窗口522。
在一些示例中,电子设备100向电子设备400发送分享请求的通信方式可以是蓝牙,在另一些示例中,也可以是Wi-Fi或者蜂窝通信方式等其他通信方式,也就是说,电子设备100向电子设备400发送分享请求的通信方式,和电子设备100向电子设备400发送分享内容的通信方式,可以相同,也可以不同。
在一些示例中,电子设备100实时分享任意一个内容时,向不同被分享设备发送的该分享内容的多媒体数据流可以相同,也可以不同。例如,电子设备100实时分享短视频应用的多媒体数据流时,电子设备100可以向通过蓝牙连接的至少一个被分享设备发送短视频应用的音频流,向通过Wi-Fi连接的至少一个被分享设备发送短视频应用的音频流和视频流。
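这种“按链路类型向不同被分享设备发送不同媒体流”的分发逻辑可以示意如下（链路与媒体类型的对应关系仅为延续上述举例的假设）：

    // 示意：蓝牙链路只发音频流，Wi-Fi链路发音频流和视频流
    public class StreamRouter {
        enum LinkKind { BT, WIFI }

        static String streamsFor(LinkKind link) {
            switch (link) {
                case WIFI: return "audio+video"; // 带宽充足：音视频都发
                case BT:
                default:   return "audio";       // 带宽有限：仅发音频
            }
        }

        public static void main(String[] args) {
            System.out.println(streamsFor(LinkKind.BT));
            System.out.println(streamsFor(LinkKind.WIFI));
        }
    }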
不限于上述示例的情况,电子设备100也可以在通话态下通过近场通信技术实现实时分享功能,本申请对此不作限定。
以上示例可以实现附近蓝牙设备等近场通信场景中的一起看、一起听等实时共享功能,应用场景更加广泛,用户体验感更好。
不限于上述示例的情况,在另一些示例中,还可以在卫星、D2D、V2X等通信场景下实现一起看、一起听等实时共享功能,本申请对实现实时共享功能的通信方式不作限定。
可以理解地,分享设备接收到用于触发实时分享功能的用户操作后,可以但不限于按照以下任意一种方式确定分享对象和分享内容:
方式1,预设分享对象和预设分享内容,例如图4A-图4B所示的示例中,电子设备100直接将通话对方(即电子设备200)设置为分享对象,以及将前台应用(即短视频应用)的音频流/视频流设置为分享内容。
方式2:预设分享对象和根据接收到的用户操作确定分享内容,例如,电子设备100响应于针对图4A所示的用户界面410中的分享选项312D的触摸操作(例如点击操作),显示分享内容的选择界面,例如显示图6B所示的用户界面620中的可选择的分享内容的列表621,电子设备100可以根据用户基于分享内容的选择界面输入的操作确定分享内容,并且,电子设备100可以直接将通话对方(即电子设备200)设置为分享对象。
方式3：预设分享内容和根据接收到的用户操作确定分享对象，例如图7A-图7C所示的示例中，电子设备100可以直接将用于触发实时分享功能的用户操作针对的短视频应用的音频流/视频流设置为分享内容，并且，电子设备100可以根据用户基于分享对象的选择界面（即图7B所示的用户界面720）输入的操作确定分享对象。
方式4:根据接收到的用户操作确定分享内容和分享对象,例如图6A-图6D所示的示例中,电子设备100可以根据用户基于分享内容和分享对象的选择界面(即图6B所示的用户界面620)输入的操作确定分享内容和分享对象。
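方式1至方式4可以归纳为“分享对象与分享内容各自既可以预设、也可以由用户操作确定”的四种组合，示意如下（枚举名与输出文案均为假设）：

    // 示意：方式1~方式4对应“预设/用户选择”在分享对象与分享内容两个维度上的组合
    public class ShareResolver {
        enum Source { PRESET, USER_CHOICE }

        static String resolve(Source target, Source content) {
            return "分享对象：" + (target == Source.PRESET ? "预设（如通话对方）" : "用户选择")
                 + "，分享内容：" + (content == Source.PRESET ? "预设（如前台应用）" : "用户选择");
        }

        public static void main(String[] args) {
            System.out.println(resolve(Source.PRESET, Source.PRESET));           // 方式1
            System.out.println(resolve(Source.PRESET, Source.USER_CHOICE));      // 方式2
            System.out.println(resolve(Source.USER_CHOICE, Source.PRESET));      // 方式3
            System.out.println(resolve(Source.USER_CHOICE, Source.USER_CHOICE)); // 方式4
        }
    }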
本申请中,用于触发实时分享功能的悬浮窗的显示时机可以但不限于为以下几种情况。其中,这里的悬浮窗(也可理解为是显示形态)可以是悬浮窗的详细信息,例如图3的(A)所示的悬浮窗312,也可以是悬浮窗的图标,例如图3的(C)所示的悬浮窗312。电子设备100显示悬浮窗时可以切换显示形态,具体示例可参见图3的(A)和图3的(C)的说明。
情况1:电子设备100可以在通话态下显示悬浮窗。在一些示例中,该通话态可以为运营商通话的通话态,界面示例可参见图3。在另一些示例中,该通话态可以为OTT通话的通话态,界面示例和图3类似(例如,此时显示社交应用的语音/视频通话的界面)。
情况2:电子设备100可以在显示会话界面时显示悬浮窗。在一些示例中,该会话界面可以为运营商会话的界面(即短信息的会话界面)。在另一些示例中,该会话界面可以为OTT会话(如即时通信应用的会话界面,该会话的对象可以有一个或多个)的界面。
情况3:电子设备100可以在显示某个通话对象的信息时(此时不为通话态或会话态)显示悬浮窗,也可理解为是用户在浏览某个通话对象时为用户提供悬浮窗。在一些示例中,电子设备100可以在显示某个联系人的详细信息时显示悬浮窗,该联系人可以是预设应用中的联系人,该预设应用可以用于实现运营商通话和/或运营商会话,也可以用于实现OTT通话和/或OTT会话。在另一些示例中,电子设备100可以在显示某个通信标识的信息时显示悬浮窗,通信标识可以用于标识通话对象,不同通话对象的通信标识不同。这里电子设备100显示的通信标识对应的通话对象可以为电子设备100未记录/存储的通话对象,也可以为电子设备100已记录/存储的通话对象(即联系人)。通信标识例如为运营商通话的通信标识(如电话号码)、OTT通话的通信标识(如网络聊天应用的个人号码或个人名称)。例如,用户在电子设备100的拨号界面中输入某个电话号码时,电子设备100可以显示悬浮窗。又例如,用户在网络聊天应用的搜索栏中输入其他用户的个人号码或者个人名称时,电子设备100可以显示悬浮窗。
情况4:电子设备100可以在显示预设应用的预设界面时(此时不为通话态或会话态,此时不是仅显示一个通话对象的信息)显示悬浮窗。该预设应用可以用于实现运营商通话和/或运营商会话,也可以用于实现OTT通话和/或OTT会话。在一些示例中,该预设界面包括短信息的会话列表。在另一些示例中,该预设界面包括通话记录/聊天记录。在另一些示例中,该预设界面包括联系人的列表(例如下图11D所示的用户界面1140)。在另一些示例中,该预设界面包括OTT会话(如即时通信会话)的列表。
情况5:电子设备100可以在显示特定界面时(此时不为通话态或会话态,此时不是仅显示一个通话对象的信息)显示悬浮窗,例如,该特定界面为桌面。
其中,上述情况1、情况2和情况3可以理解为是存在具体的通话对象时显示悬浮窗,这里的通话对象可以是正在进行通话/会话的对象,也可以是意图进行通话/会话的对象(例如情况3)。在一种实施方式中,电子设备100可以先和该通话对象建链,建链成功后再显示悬浮窗,在另一种实施方式中,电子设备100可以先显示悬浮窗,接收到作用于该悬浮窗的用户操作(用于触发实时分享功能)时,再和分享对象(可以是上述通话对象,也可以不是上述通话对象)建链。在一种实施方式中,电子设备100和分享对象进行实时分享时/电子设备100显示悬浮窗时,可以拉起NewTalk(可以使用主链路和/或辅助链路进行通话),也可以不拉起NewTalk。
上述情况4和情况5可以理解为是不存在具体的通话对象时显示悬浮窗。在一种实施方式中,电子设备100可以先显示悬浮窗,接收到作用于该悬浮窗的用户操作(用于触发实时分享功能)时,再和分享对象建链。可选地,该分享对象可以是用户选择的,例如,电子设备100显示应用A的界面时显示悬浮窗,响应于作用于该悬浮窗的用户操作,当应用A不存在联系人,可以显示系统应用(如通话应用/短信息应用)的联系人,当应用A存在联系人,可以显示应用A的联系人,显示的联系人用于用户从中选择出分享对象。在一种实施方式中,电子设备100和分享对象进行实时分享时/电子设备100显示悬浮窗时,可以拉起NewTalk(可以使用主链路和/或辅助链路进行通话),也可以不拉起NewTalk。
在一种实施方式中,电子设备100作为分享设备进行实时分享时,可以管理被分享设备,在一种实施方式中,电子设备100作为分享设备进行实时分享时,可以改变分享内容,具体示例如下所述:
电子设备100显示分享内容的播放界面（例如图4B所示的用户界面420）时，可以响应于针对用户界面420中的分享控制选项421的触摸操作（例如点击操作），显示分享菜单，例如显示图10A所示的用户界面1010。相比图4B所示的用户界面420，用户界面1010还包括分享菜单1011，分享菜单1011可以包括多个选项，例如选项1011A、选项1011B和选项1011C。其中，选项1011A包括字符“变更分享内容/分享对象”。选项1011B包括字符“暂停分享”，用于取消/暂停/停止实时分享。选项1011C包括字符“更多”，用于触发显示更多的功能选项。电子设备100可以响应于针对选项1011A的触摸操作（例如点击操作），显示分享内容和/或分享对象的管理界面，例如显示图6D所示的用户界面640。
在一些示例中,电子设备100可以响应于针对图6D所示的用户界面640中的列表621包括的选项621A(选中状态)的触摸操作(例如点击操作),取消分享选项621A指示的短视频应用的音频流/视频流。电子设备100可以响应于针对列表621中的选项621C的触摸操作(例如点击操作),选择选项621C指示的视频应用的音频流/视频流为分享内容,电子设备100可以向已选择的分享对象(即用户界面640中的列表622包括的选中状态的选项指示的设备)分享上述已选择的分享内容。可以理解为是,电子设备100响应于用户操作,将分享内容从短视频应用的音频流/视频流变更为视频应用的音频流/视频流。在一些示例中,电子设备100可以响应于针对列表622中的任意一个选项(选中状态)的触摸操作(例如点击操作),取消向该选项指示的设备发送分享内容,例如,该选项为列表622中的选项622A,则电子设备100不向选项622A指示的通话对方(即电子设备200)发送分享内容。可以理解为是,电子设备100响应于用户操作,删除已有的被分享设备。例如,经过上述过程后,电子设备100可以显示图10B所示的用户界面1020,用户界面1020和图6B所示的用户界面620类似,区别在于,在用户界面1020中,列表621中的选项621A为未选中状态,选项621C为选中状态,列表622中的选项622A为未选中状态。
以上示例实现了实时共享过程中的成员管理、内容管理,可以满足用户的个性化需求,提升用户体验感。
不限于上述实施方式,在另一种实施方式中,电子设备100可以和多个电子设备进行运营商通话/OTT通话等NewTalk,电子设备100可以向这多个电子设备中的至少一个电子设备实时分享音频流/视频流。在一些示例中,电子设备100可以响应于用于触发实时分享功能的用户操作,向多个通话对方(即上述多个电子设备)分享前台应用的音频流/视频流。在另一些示例中,电子设备100可以响应于用于触发实时分享功能的用户操作,在分享对象的选择界面上显示上述多个电子设备的信息,以用于用户选择是否向其中至少一个设备实时分享音频流/视频流。例如,电子设备100可以显示图11A所示的用户界面1110,用户界面1110和图6B所示的用户界面620类似,区别在于,在用户界面1110中,可选择的分享对象的列表1111还包括选项1111A,选项1111A包括字符“电话号码3(通话中)”,用于指示通讯号码为“电话号码3”的通话对方。列表622中的选项622A和选项1111A可以表征电子设备100当前和通讯号码为“电话号码2”的设备、通讯号码为“电话号码3”的设备进行运营商通话/OTT通话等NewTalk。电子设备100可以响应于针对选项622A和/或选项1111A的触摸操作(例如点击操作),向通讯号码为“电话号码2”的设备和/或通讯号码为“电话号码3”的设备实时分享音频流/视频流。
以上示例中,不仅可以实现单播类型的实时共享功能(一个被分享设备),而且可以实现广播或组播类型的实时分享功能(多个被分享设备),可以根据具体场景自适应调整,满足用户在不同场景下的不同需求,提升用户体验感。
不限于上述实施方式，在另一种实施方式中，电子设备100也可以在非通话态下接收用于触发实时分享功能的用户操作。在一些示例中，电子设备100可以响应于该用户操作，在分享对象的选择界面上显示最近通信的至少一个设备，以用于用户选择是否向这至少一个设备实时分享音频流/视频流。可选地，这至少一个设备可以是预设的时间范围（例如1小时、1天或1周）内和电子设备100通信的设备，可选地，这至少一个设备的数量可以是电子设备100预设的，例如小于或等于3，可选地，这至少一个设备可以是通过预设应用和电子设备100通信的设备，例如，预设应用为实现运营商通话、OTT通话和/或网络聊天的应用。本申请对上述最近通信的至少一个设备的具体类型不作限定。示例性地，电子设备100可以显示图11B所示的用户界面1120，用户界面1120和图6B所示的用户界面620类似，区别在于，在用户界面1120中，可选择的分享对象的列表1121不包括用户界面620中的选项622A，列表1121还包括选项1121A。选项1121A包括字符“电话号码2（最近联系人）”，用于指示通讯号码为“电话号码2”、电子设备100最近进行运营商通话/OTT通话等NewTalk的用户/设备。电子设备100可以响应于针对选项1121A的触摸操作（例如点击操作），向通讯号码为“电话号码2”的设备发送NewTalk的呼叫请求，通讯号码为“电话号码2”的设备接受呼叫请求后，电子设备100可以和该设备进行NewTalk，电子设备100可以基于该NewTalk向该设备实时分享音频流/视频流。
在另一些示例中，电子设备100可以响应于用于触发实时分享功能的用户操作，在分享对象的选择界面上显示联系人的图标，以用于用户选择是否向电子设备100存储的至少一个联系人实时分享音频流/视频流，可选地，这至少一个联系人可以是预设应用中的联系人，例如，预设应用为实现运营商通话、OTT通话和/或网络聊天的应用，本申请对联系人的具体类型不作限定。示例性地，电子设备100可以显示图11C所示的用户界面1130，用户界面1130和图6B所示的用户界面620类似，区别在于，在用户界面1130中，可选择的分享对象的列表1131不包括用户界面620中的选项622A，列表1131还包括选项1131A，选项1131A包括字符“联系人”。电子设备100可以响应于针对选项1131A的触摸操作（例如点击操作），显示电子设备100存储的至少一个联系人的信息，例如显示图11D所示的用户界面1140。用户界面1140可以包括标题1141（“联系人”）、搜索框1142、联系人列表1143和确定控件1144。联系人列表1143可以包括多个联系人的信息，例如名称为“亲友1”的联系人的信息1143A，信息1143A右侧还显示有选择控件1143B，选择控件1143B用于选择信息1143A指示的联系人“亲友1”或者取消该选择，其他联系人的信息类似，不再赘述。电子设备100可以响应于针对确定控件1144的触摸操作（例如点击操作），向联系人列表1143中已选择的联系人（例如信息1143A指示的联系人“亲友1”）对应的设备发送NewTalk的呼叫请求，该设备接受呼叫请求后，电子设备100可以和该设备进行NewTalk，电子设备100可以基于该NewTalk向该设备实时分享音频流/视频流。
不限于上述示例的情况,在另一些示例中,电子设备100可以通过存储的联系人(例如上述最近联系人、联系人列表中的联系人)的标识信息(例如上述电话号码、上述网络聊天的账号)获取到该联系人对应的设备的通信ID,例如通过网络设备300进行寻址。电子设备100和该联系人对应的设备寻址完成后,可以基于获取到的对方的通信ID建立连接,电子设备100可以基于建立的连接向该联系人对应的设备实时分享音频流/视频流,其中,上述建立的连接例如但不限于为蓝牙连接、Wi-Fi连接或NewTalk连接等。
不限于上述实施方式,在另一种实施方式中,电子设备100可以响应于用户操作确定实时分享的方式,例如,选择一起看、一起听、一起编辑或者一起玩等实时分享方式。以下示例以可被选择的实时分享方式包括一起看和一起听为例进行示意。
在一些示例中,电子设备100可以响应于用于触发实时分享功能的用户操作(例如针对图8A所示的用户界面810中的控件814D的触摸操作),显示实时分享方式的选择界面,例如显示图12A所示的用户界面1210,用户界面1210包括提示框1211,提示框1211包括一起看的选项1211A和一起听的选项1211B。在一些示例中,用户选择的实时分享方式不同时,电子设备100显示的分享内容和/或分享对象的选择界面也可以不同。例如,电子设备100可以响应于针对用户界面1210中的一起看的选项1211A的触摸操作(例如点击操作),显示图12B所示的用户界面1220。用户界面1220和图6B所示的用户界面620类似,用户界面1220中的可选择的分享内容的列表1221包括多个指示可被观看的分享内容的选项,例如共享短视频应用的图像的选项621A、共享电子设备100的屏幕的显示内容的选项621B、共享视频应用的图像的选项621C。用户界面1220中的可选择的分享对象的列表1222包括多个指示可显示图像的设备的选项,例如指示通讯号码为“电话号码2”的电子设备200(例如手机)的选项622A、指示“用户C”的“手机”的选项622B、指示“用户A”的“笔记本”的选项622C和指示“用户D”的“平板电脑”的选项622D。电子设备100可以响应于针对用户界面1210中的一起听的选项1211B的触摸操作(例如点击操作),显示图12C所示的用户界面1230。用户界面1230和图6B所示的用户界面620类似,用户界面1230中的可选择的分享内容的列表1231包括多个指示可被收听的分享内容的选项,例如共享短视频应用的音频的选项621A、共享视频应用的音频的选项621C、共享音乐应用的音频的选项1231A。用户界面1230中的可选择的分享对象的列表1232包括多个指示可播放音频的设备的选项,例如指示通讯号码为“电话号码2”的电子设备200(例如手机)的选项622A、指示“用户C”的“耳机”的选项622E和指示“用户E”的“音箱”的选项622F。不限于上述示例的情况,在另一些示例中,用户界面1230中的可选择的分享对象的列表1232还包括选项622B、选项622C、选项622D,本申请对此不作限定。
在另一些示例中,电子设备100可以根据用于触发实时分享功能的用户操作确定实时共享的方式,也就是说,不同的用于触发实时分享功能的用户操作对应不同的实时共享方式。例如,电子设备100可以响应于针对图6A所示的用户界面610的第一滑动操作(例如图6A所示的指关节按照“W”的特定轨迹滑动),显示上图12B所示的用户界面1220。电子设备100可以响应于针对图6A所示的用户界面610的第二滑动操作(例如图12D所示的用户界面1240中,指关节按照“L”的特定轨迹滑动),显示上图12C所示的用户界面1230。
可以理解地，分享设备进行实时分享时，不仅可以实时分享已运行的应用（例如前台应用和/或后台应用）的音频流/视频流，而且可以实时分享未运行的应用的音频流/视频流。例如，图12C所示的用户界面1230中，可选择的分享内容的列表1231包括共享短视频应用（前台应用）的音频的选项621A、共享视频应用（后台应用）的音频的选项621C和共享音乐应用（未运行的应用）的音频的选项1231A。电子设备100可以响应于针对选项1231A的触摸操作（例如点击操作），启动音乐应用，并向已选择的分享对象实时分享音乐应用的音频流/视频流。
不限于上述实施方式,在另一种实施方式中,电子设备100可以响应于用户操作确定可选择的分享对象的类型。
在一些示例中,电子设备100可以响应于用于触发实时分享功能的用户操作,显示用于选择分享对象的类型的用户界面,然后显示和选择的类型一致的分享对象的选择界面。例如,电子设备100可以先显示图13所示的用户界面1310,用户界面1310包括提示框1311,提示框1311包括选项1311A(包括字符“分享给联系人”)、选项1311B(包括字符“分享给Wi-Fi设备”)和选项1311C(包括字符“分享给蓝牙设备”)。电子设备100响应于针对选项1311A的触摸操作(例如点击操作)显示的可选择的分享对象为:和电子设备100通过运营商通话/OTT通话等NewTalk进行通信的设备,例如图6B所示的用户界面620中的选项622A指示的设备。电子设备100响应于针对选项1311B的触摸操作(例如点击操作)显示的可选择的分享对象为:和电子设备100通过Wi-Fi进行通信的设备,例如图6B所示的用户界面620中的选项622C、选项622D指示的设备。电子设备100响应于针对选项1311C的触摸操作(例如点击操作)显示的可选择的分享对象为:和电子设备100通过蓝牙进行通信的设备,例如图6B所示的用户界面620中的选项622B、选项622E和选项622F指示的设备。
不限于上述实施方式,在另一种实施方式中,被分享设备接收到分享请求后,可以通过扬声器等音频模块播放该分享请求对应的提示信息,本申请对电子设备输出提示信息的方式不作限定。
在一些示例中,上述被分享设备为耳机。如图14A的(1)所示,电子设备100可以显示用户界面1410,用户界面1410和图12C所示的用户界面1230类似,区别在于,用户界面1410中的选项622E为选中状态,可以表征选项622E指示的电子设备500(即“用户C”的“耳机”)为选择的分享对象。用户界面1410中的选项621A也为选中状态,可以表征选项621A指示的短视频应用的音频为选择的分享内容。电子设备100可以向电子设备500发送分享请求,电子设备500接收到该分享请求后可以播放提示音,例如图14A的(2)所示的“嘟嘟嘟”。电子设备500可以响应于用户操作(例如点击操作)接受该分享请求,接受该分享请求后,电子设备500可以接收电子设备100发送的分享内容,并播放该分享内容,即上述短视频应用的音频,具体示例可参见图14A的(3)。
在另一些示例中,上述被分享设备为音箱。如图14B的(1)所示,电子设备100可以显示用户界面1420,用户界面1420和图12C所示的用户界面1230类似,区别在于,用户界面1420中的选项622F为选中状态,可以表征选项622F指示的电子设备600(即“用户E”的“音箱”)为选择的分享对象。用户界面1420中的选项621A也为选中状态,可以表征选项621A指示的短视频应用的音频为选择的分享内容。电子设备100可以向电子设备600发送分享请求,电子设备600接收到该分享请求后可以播放提示音,例如图14B的(2)所示的“用户A邀请您收听音频”。电子设备600可以响应于用户操作(例如针对电子设备600的播放按键的点击操作)接受该分享请求,接受该分享请求后,电子设备600可以接收电子设备100发送的分享内容,并播放该分享内容,即上述短视频应用的音频,具体示例可参见图14B的(3)。
不限于上述实施方式，在另一种实施方式中，被分享设备接收到分享请求后，也可以不输出提示信息，而是直接接受该分享请求。在一些示例中，如图14C的（1）所示，电子设备100可以显示用户界面1430，用户界面1430和图12B所示的用户界面1220类似，区别在于，用户界面1430中的选项622C为选中状态，可以表征选项622C指示的电子设备700（即“我”的“笔记本”）为选择的分享对象，其中，电子设备700的登录帐号和电子设备100的登录帐号相同（即名称为“用户A”）。用户界面1430中的选项621A也为选中状态，可以表征选项621A指示的短视频应用的图像为选择的分享内容。电子设备100可以向电子设备700发送分享请求，电子设备700接收到该分享请求后可以直接接受该分享请求，接收并显示电子设备100发送的分享内容，具体示例可参见图14C的（2），电子设备700可以显示用户界面1440，用户界面1440用于显示上述短视频应用的图像。
不限于上述实施方式，在另一种实施方式中，分享设备可以向和被分享设备连接的其他设备发送针对该被分享设备的分享请求，该分享请求用于请求向该被分享设备实时分享音频流/视频流，上述其他设备接收到该分享请求后，可以输出提示信息，用户可以通过上述其他设备接受或拒绝上述针对该被分享设备的分享请求，本申请对分享设备发送分享请求的方式不作限定。在一些示例中，如图14D的（1）所示，电子设备100可以显示用户界面1450，用户界面1450和图6B所示的用户界面620类似，区别在于，用户界面1450中的选项622E为选中状态，可以表征选项622E指示的电子设备500（即“用户C”的“耳机”）为选择的分享对象。用户界面1450中的选项621A也为选中状态，可以表征选项621A指示的短视频应用的音频为选择的分享内容。假设电子设备500和用户界面1450中的选项622B指示的电子设备400（即“用户C”的“手机”）已连接，电子设备100可以向电子设备400发送针对电子设备500的分享请求，电子设备400接收到该分享请求后，可以显示提示信息，例如显示图14D的（2）所示的用户界面1460。用户界面1460可以为电子设备400的桌面，可以包括提示信息1461，提示信息1461包括提示语1461A（包括字符“用户A邀请您通过耳机一起听”，其中，“耳机”即为电子设备500）、确定控件1461B（用于接受上述针对电子设备500的分享请求）和取消控件1461C（用于拒绝上述针对电子设备500的分享请求）。电子设备400可以响应于针对确定控件1461B的触摸操作（例如点击操作），接受上述针对电子设备500的分享请求。接受该分享请求后，电子设备500可以接收并播放电子设备100发送的分享内容，即上述短视频应用的音频，具体示例可参见图14D的（3）。
不限于上述示例的情况,在另一些示例中,分享设备可以通过和被分享设备连接的其他设备向被分享设备发送针对该被分享设备的分享内容,可以理解为是通过“第三方设备”(即上述其它设备)转发数据。例如,图14D所示的示例中,电子设备100作为分享设备向电子设备500实时分享短视频应用的音频时,可以将短视频应用的音频发送给和电子设备500连接的电子设备400,电子设备400可以将接收到的短视频应用的音频转发给电子设备500,由电子设备500播放。
不限于图4C所示的实施方式,在另一种实施方式,还可以通过分享菜单中的更多选项设置分享内容的类型(例如音频、图像、或者音频和图像),本申请对分享内容的类型的设置方式不作限定。
在一些示例中,电子设备100可以响应于针对图4B所示的用户界面420中的分享控制选项421的触摸操作(例如点击操作),显示分享菜单,例如显示图15A的(1)所示的用户界面1510。用户界面1510中的分享菜单1511可以包括多个选项,例如,用于分享当前应用(图15A以短视频应用为例进行示意)的音频流/视频流的选项1511A、用于分享电子设备100的屏幕的显示内容的选项1511B、用于取消/暂停/停止实时分享的选项1511C和用于触发更多功能选项的选项1511D。电子设备100可以响应于针对选项1511D的触摸操作(例如点击操作),显示图15A的(2)所示的用户界面1520。用户界面1520可以包括设置窗口1521,设置窗口1521可以包括设置名称1521A(包括字符“音视频设置”)和多个设置选项,多个设置选项例如包括选项1521B、选项1521C和选项1521D。其中,选项1521B包括字符“音频+画面”,用于设置分享内容的类型为图像和音频。选项1521C包括字符“音频”,用于设置分享内容的类型为音频。选项1521D包括字符“画面”,用于设置分享内容的类型为图像。用户界面1520还包括重置控件1522和保存控件1523,重置控件1522用于将设置窗口1521中的预设选项(例如选项1521B)设置为选中状态,保存控件1523用于保存设置窗口1521的当前内容,例如,用户界面1520所示的设置窗口1521中的选项1521B为选中状态,电子设备100可以响应于针对保存控件1523的触摸操作(例如点击操作),将分享内容设置为选项1521B指示的图像和音频。
在一种实施方式中,电子设备100可以设置在实时分享系统和/或应用的音频流/视频流时,是否同时分享麦克风采集的音频和/或摄像头采集的图像。
在一些示例中,电子设备100可以响应于针对图15A的(1)所示的用户界面1510中的选项1511D的触摸操作(例如点击操作),显示图15B所示的用户界面1530。用户界面1530可以包括设置窗口1531、重置控件1532和保存控件1533,设置窗口1531可以包括设置名称1531A(包括字符“混音画设置”)和多个设置选项,多个设置选项例如包括选项1531B、选项1531C、选项1531D和选项1531E。其中,选项1531B包括字符“无混合”,用于设置仅实时分享系统和/或应用的音频流/视频流,不实时分享麦克风采集的音频和摄像头采集的图像。选项1531C包括字符“叠加MIC”,用于设置实时分享系统和/或应用的音频流时也实时分享麦克风采集的音频,实时分享系统和/或应用的视频流时不实时分享摄像头采集的图像。选项1531D包括字符“叠加Camera”,用于设置实时分享系统和/或应用的音频流时不实时分享麦克风采集的音频,实时分享系统和/或应用的视频流时也实时分享摄像头采集的图像。选项1531E包括字符“叠加MIC和Camera”,用于设置实时分享系统和/或应用的音频流时也实时分享麦克风采集的音频,实时分享系统和/或应用的视频流时也实时分享摄像头采集的图像。重置控件1532用于将设置窗口1531中的预设选项(例如选项1531B)设置为选中状态,保存控件1533用于保存设置窗口1531的当前内容。
例如,选项1531C或者选项1531E为选中状态时,分享设备可以向被分享设备发送分享内容和分享设备的麦克风采集的音频,被分享设备可以同时播放分享内容和分享设备的麦克风采集的音频。
例如，选项1531D或者选项1531E为选中状态时，分享设备可以向被分享设备发送分享内容和分享设备的摄像头采集的图像，被分享设备可以同时显示分享内容和分享设备的摄像头采集的图像。示例性地，电子设备200（被分享设备）接收到电子设备100（分享设备）发送的短视频应用的视频流（分享内容）和摄像头采集的图像后，可以显示图15C所示的用户界面1540。用户界面1540和图5B所示的用户界面520类似，区别在于，用户界面1540还包括控件1541，控件1541用于显示电子设备100的摄像头采集的人脸图像。
不限于上述示例的情况,在另一些示例中,还可以通过分享设备的系统设置功能或应用设置功能设置是否实时分享麦克风采集的音频和/或摄像头采集的图像,本申请对此不作限定。
不限于上述实施方式,在另一种实施方式中,分享设备还可以预先设置进行实时分享时默认分享或者不分享麦克风采集的音频和/或摄像头采集的图像,例如,分享设备接收到用于触发实时分享的用户操作时,先显示图15B所示的用户界面1530,本申请对此不作限定。
以上示例中,可以同时分享麦克风采集的音频和系统级/应用级/背景的音频(也可称为混音),和/或,同时分享摄像头采集的图像和系统级/应用级的图像,让分享用户可以“边看边讲解”,被分享用户可以“边看边听讲解”,分享用户和被分享用户还可以对话,满足用户的个性化需求,体验感更好。
在一种实施方式中,电子设备100可以设置被分享设备基于分享内容的相关权限。可选地,该相关权限包括保存权限,例如包括录屏/截屏的权限,和/或保存分享内容的文件的权限,可选地,该相关权限包括二次传播权限,例如包括即时传播权限和/或延后传播权限,其中,即时传播权限是被分享设备在播放分享设备实时分享的内容时是否可以将该实时分享的内容转发给其它设备的权限,延后传播权限是被分享设备保存分享设备发送的分享内容后是否可以将保存的分享内容转发给其它设备的权限。
在一些示例中,电子设备100可以响应于针对图15A的(1)所示的用户界面1510中的选项1511D的触摸操作(例如点击操作),显示图15D所示的用户界面1550。用户界面1550可以包括设置窗口1551、重置控件1552和保存控件1553,设置窗口1551可以包括设置名称1551A(包括字符“权限设置”)和多个设置选项,多个设置选项例如包括选项1551B、选项1551C和选项1551D。其中,选项1551B包括字符“阅后即焚(不可保存、不可转发)”,用于设置不授予被分享设备保存权限和二次传播权限。选项1551C包括字符“可保存、可截屏”,用于设置授予被分享设备保存权限,但不授予二次传播权限。选项1551D包括字符“可转发”,用于设置授予被分享设备二次传播权限,但不授予保存权限。重置控件1552用于将设置窗口1551中的预设选项(例如选项1551B)设置为选中状态,保存控件1553用于保存设置窗口1551的当前内容。不限于图15D示例的情况,在另一些示例中,设置窗口1551中的权限设置也可以更精细,例如但不限于包括以下至少一项设置选项:用于设置不授予被分享设备保存权限和二次传播权限(这种情况下即为即时传播权限)的选项1,用于设置不授予被分享设备保存权限、但授予二次传播权限(这种情况下即为即时传播权限)的选项2,用于设置授予被分享设备保存权限、但不授予二次传播权限(包括即时传播权限和延后传播权限)的选项3,用于设置授予被分享设备保存权限和即时传播权限、但不授予延后传播权限的选项4,用于设置授予被分享设备保存权限和延后传播权限、但不授予即时传播权限的选项5,用于设置授予被分享设备保存权限和二次传播权限(包括即时传播权限和延后传播权限)的选项6等。本申请对具体设置内容不作限定。
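上述保存权限与二次传播权限（即时/延后）可以用权限位的组合来示意（位定义为假设，仅用于说明各设置选项之间的组合关系）：

    // 示意：用位标志组合表示被分享设备基于分享内容的相关权限
    public class SharePermission {
        static final int SAVE           = 1;      // 保存权限（录屏/截屏/保存文件）
        static final int RELAY_LIVE     = 1 << 1; // 即时传播权限
        static final int RELAY_DEFERRED = 1 << 2; // 延后传播权限

        static boolean allowed(int granted, int wanted) { return (granted & wanted) == wanted; }

        public static void main(String[] args) {
            int burnAfterReading = 0;  // 阅后即焚：不授予任何权限
            int saveOnly = SAVE;       // 可保存、可截屏，但不可转发
            System.out.println(allowed(burnAfterReading, SAVE));                // false
            System.out.println(allowed(saveOnly, SAVE));                        // true
            System.out.println(allowed(saveOnly, RELAY_LIVE));                  // false
            System.out.println(allowed(SAVE | RELAY_DEFERRED, RELAY_DEFERRED)); // true
        }
    }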
接下来示例性说明上述基于分享内容的相关权限的应用场景以及该场景下的UI示例。
在一种实施方式中,具备即时传播权限的被分享设备播放分享设备实时分享的内容1时,可以响应于用于触发实时分享功能的用户操作,向其它设备实时分享上述内容1,具体说明可参见上述电子设备100作为分享设备向其它被分享设备实时分享的实施方式,不再赘述。
在另一种实施方式中,不具备即时传播权限的被分享设备播放分享设备实时分享的内容1时,可以响应于用于触发实时分享功能的用户操作,向分享设备请求获取内容1的即时传播权限。在一些示例中,电子设备200可以显示图5B所示的用户界面520,用户界面520用于播放电子设备100实时分享的短视频应用的音频流/视频流。电子设备200可以响应于针对用户界面520的滑动操作(例如指关节按照“W”的特定轨迹滑动),显示图16A所示的用户界面1610。用户界面1610包括提示框1611,提示框1611包括提示信息1611A(包括字符“没有权限给其他人一起看/听,是否请求对方授权”)、请求控件1611B和取消控件1611C。
在一些示例中,电子设备200可以响应于针对取消控件1611C的触摸操作(例如点击操作),取消向其它设备实时分享短视频应用的音频流/视频流,例如返回显示图5B所示的用户界面520。
在一些示例中,电子设备200可以响应于针对请求控件1611B的触摸操作(例如点击操作),向分享设备发送请求消息,以请求获取当前播放的分享内容(简称当前分享内容,即短视频应用的音频流/视频流) 的即时传播权限,此时的电子设备100和电子设备200例如可参见图16B。如图16B的(1)所示,电子设备200可以显示用户界面1620,用户界面1620包括提示信息1621(包括字符“等待授权中”)。如图16B的(2)所示,电子设备100可以显示用户界面1630,用户界面1630可以包括提示框1631,提示框1631包括提示信息1631A(包括字符“是否授权用户B,允许给其它人一起看/听”)、同意控件1631B和拒绝控件1631C。同意控件1631B用于授予电子设备200当前分享内容的即时传播权限,拒绝控件1631C用于拒绝授予电子设备200当前分享内容的即时传播权限。不限于上述示例的情况,在另一些示例中,同意控件1631B也可以用于授予电子设备200任意分享内容的即时传播权限,同意控件1631B也可以用于授予电子设备200当前分享内容的即时传播权限和延后传播权限。
在一些示例中,电子设备100接收到电子设备200发送的用于请求获取当前分享内容的即时传播权限的请求消息后,可以响应于用户操作,向电子设备200发送响应消息。在一种情况下,电子设备100响应于针对图16B的(2)所示的用户界面1630中的同意控件1631B的触摸操作(例如点击操作),向电子设备200发送指示接受请求的响应消息,电子设备200接收到该响应消息后,可以向其它设备实时分享当前分享内容(即短视频应用的音频流/视频流),电子设备200例如输出指示授权成功的提示信息。电子设备200作为分享设备向其他设备实时分享音频流/视频流的说明和上述电子设备100作为分享设备实时分享音频流/视频流的说明类似,例如,电子设备200接收到上述响应消息后,可以显示分享对象和/或分享内容的选择界面。在另一种情况下,电子设备100响应于针对图16B的(2)所示的用户界面1630中的拒绝控件1631C的触摸操作(例如点击操作),向电子设备200发送指示拒绝请求的响应消息,电子设备200接收到该响应消息后,可以取消向其它设备实时分享当前分享内容(即短视频应用的音频流/视频流),电子设备200例如输出指示授权失败的提示信息。
不限于上述示例的情况,在另一些示例中,电子设备100接收到电子设备200发送的用于请求获取当前分享内容的即时传播权限的请求消息后,也可以不输出提示信息,根据预设规则直接拒绝或接收该请求消息,其中,该预设规则可以是电子设备100预设的,也可以是响应于用户操作确定的,本申请对此不作限定。
在另一种实施方式中,不具备即时传播权限的被分享设备播放分享设备实时分享的内容1时,可以响应于用于触发实时分享功能的用户操作,显示提示信息,该提示信息指示被分享设备不具备即时传播权限,例如包括字符“没有权限给其他人一起看/听”。不限于此,也可以不响应触发实时分享功能的用户操作,本申请对此不作限定。
在一种实施方式中,被分享设备显示的分享内容的播放界面可以包括保存控件,保存控件用于将该分享内容保存到被分享设备中。在一些示例中,电子设备200可以响应于针对图5B所示的用户界面520中的分享控制选项523的触摸操作(例如点击操作),显示图16C所示的用户界面1640。用户界面1640包括分享菜单1641,分享菜单1641可以包括多个选项,例如用于暂停/退出分享内容的播放界面的选项1641A、用于保存分享内容的选项1641B和用于触发更多功能选项的选项1641C。在一种情况下,具备保存权限的电子设备200可以响应于针对选项1641B的触摸操作(例如点击操作),保存电子设备100已发送的分享内容(例如当前播放的短视频应用的音频流/视频流),此时例如显示提示信息(指示保存成功)。在另一种情况下,不具备保存权限的电子设备200可以响应于针对选项1641B的触摸操作(例如点击操作),显示提示信息(指示电子设备200不具备保存权限),或者向电子设备100请求获取当前分享内容的保存权限,具体示例和图16A和图16B类似,不再赘述。不限于此,也可以不响应针对选项1641B的触摸操作,本申请对此不作限定。
不限于上述示例的情况,在另一些示例中,还可以通过其它操作触发保存分享内容,例如,语音输入,特定的滑动操作等,本申请对此不作限定。
不限于上述示例的情况,在另一些示例中,电子设备200也可以选择保存已播放的分享内容(可以是电子设备100发送的分享内容的全部或部分),本申请对具体保存的分享内容不作限定。
在一种实施方式中，被分享设备保存分享内容后，可以触发向其它设备分享保存的分享内容。在一些示例中，电子设备200可以显示分享内容的文件的分享界面，例如图16D所示的用户界面1650，用户界面1650包括文件信息1651，文件信息1651包括字符“用户A分享的内容1”，用于指示电子设备100实时分享的内容的文件1。文件信息1651的左侧还显示有选择控件1652，用于选择文件信息1651指示的文件或者取消该选择，当选择控件1652为选中状态时，用户界面1650中的提示信息1653可以包括字符“已选择1项”。用户界面1650还包括取消控件1654和已选择的文件的分享方式的选择框1655，取消控件1654用于取消向其他设备发送上述已选择的文件1。选择框1655可以包括指示不同分享方式的多个选项，例如包括字符“即时分享”的选项1655A（指示基于即时分享的分享方式），包括字符“最近联系人（电话号码4）”的选项1655B（指示基于运营商通话/OTT通话等NewTalk的分享方式，其中分享对象为通讯号码为电话号码4的设备），包括字符“WLAN直连”的选项1655C（指示基于WLAN的分享方式），包括字符“蓝牙”的选项1655D（指示基于蓝牙的分享方式），包括字符“发送给朋友”的选项1655E（指示基于聊天应用的分享方式），包括字符“电子邮件”的选项1655F（指示基于邮箱/电子邮件的分享方式）。在一种情况下，具备延后传播权限的电子设备200可以响应于针对上述多个选项中的任意一个选项的触摸操作（例如点击操作），通过该选项指示的分享方式向其他设备发送上述已选择的文件1。在另一种情况下，不具备延后传播权限的电子设备200可以响应于针对上述多个选项中的任意一个选项的触摸操作（例如点击操作），显示提示信息（指示电子设备200不具备延后传播权限）。
不限于上述示例的情况,在另一些示例中,不具备延后传播权限的电子设备200也可以向电子设备100请求获取上述已选择的文件1的延后传播权限,具体示例和图16A和图16B类似,不再赘述。在另一些示例中,不具备延后传播权限的电子设备200也可以不响应针对上述多个选项中的任意一个选项的触摸操作。在另一些示例中,不具备延后传播权限的电子设备200保存的分享内容的文件可以是经过加密的,并且用于解密该文件的密钥为动态密钥。电子设备200每次打开该文件都需要向电子设备100请求获取动态密钥,该动态密钥是有时效性的(例如1分钟内有效或者前3次有效等),电子设备200根据获取到的动态密钥解密该文件后才能进行播放。即使电子设备200成功向其他设备(以电子设备400为例说明)发送分享内容的文件,但由于电子设备400无法获取到动态密钥,因此也无法解密和播放该文件,从而达到保护分享用户的隐私安全的效果。在另一些示例中,不具备延后传播权限的电子设备200保存的分享内容的文件可以是经过加密的,并且用于解密该文件的密钥是将电子设备200的设备ID作为因子之一得到的,因此,电子设备200才能使用密钥解密该文件,其他设备即使获取到了密钥和文件,也无法使用密钥解密该文件,进一步保证了分享内容的安全性。其中,设备ID例如但不限于为介质访问控制地址(media access control addres,MAC)、序列号(serial number,SN)或国际移动设备识别码(international mobile equipment identity,IMEI)等。本申请对如何禁止不具备延后传播权限的电子设备向其他设备发送分享内容的文件不作限定。
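上述“将设备ID作为密钥因子之一”的绑定思路可以示意如下（此处用SHA-256做简化的密钥派生仅为举例，实际实现宜采用标准的密钥派生算法，动态密钥的时效校验等逻辑也未展开）：

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.Arrays;

    // 示意：以本机设备ID为因子派生文件密钥，其他设备无法得到相同密钥
    public class DeviceBoundKey {
        static byte[] deriveKey(String deviceId, byte[] dynamicKey) throws Exception {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(deviceId.getBytes(StandardCharsets.UTF_8)); // 设备ID因子（如SN）
            md.update(dynamicKey);                                // 分享设备下发的动态密钥
            return md.digest();
        }

        public static void main(String[] args) throws Exception {
            byte[] dynamic = "valid-for-1-minute".getBytes(StandardCharsets.UTF_8);
            byte[] k1 = deriveKey("SN-DEVICE-200", dynamic);
            byte[] k2 = deriveKey("SN-DEVICE-400", dynamic);
            System.out.println(Arrays.equals(k1, k2)); // false：换一台设备无法解密
        }
    }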
在另一些示例中，电子设备200播放分享内容的文件时，例如显示图16E所示的用户界面1660，用户界面1660可以包括标题1661和播放框1662，标题1661包括播放框1662中播放的文件的名称“用户A分享的内容1”，播放框1662可以包括播放/暂停控件1662A和进度条控件1662B。在一种情况下，具备延后传播权限的电子设备200可以响应于用于触发实时分享的用户操作（图16E以该用户操作为指关节按照“W”的特定轨迹在显示屏上滑动为例进行示意），向其他设备实时分享当前播放的音频流/视频流。电子设备200作为分享设备向其他设备实时分享音频流/视频流的说明和上述电子设备100作为分享设备实时分享音频流/视频流的说明类似，例如，电子设备200响应于用于触发实时分享的用户操作，显示分享对象的选择界面。在另一种情况下，不具备延后传播权限的电子设备200可以响应于用于触发实时分享的用户操作（图16E以该用户操作为指关节按照“W”的特定轨迹在显示屏上滑动为例进行示意），显示提示信息（指示电子设备200不具备延后传播权限）。不限于上述示例的情况，在另一些示例中，不具备延后传播权限的电子设备200也可以向电子设备100请求获取当前播放的文件的延后传播权限，具体示例和图16A和图16B类似，不再赘述。在另一些示例中，不具备延后传播权限的电子设备200也可以不响应用于触发实时分享的用户操作，本申请对此不作限定。
不限于上述示例的情况,在另一些示例中,电子设备100也可以自动识别分享数据是否符合预设条件,当分享数据符合预设条件时,不授予电子设备200基于该分享数据的保存权限和/或二次传播权限。
在一些示例中,上述预设条件为分享数据为预设应用的应用数据,例如,电子设备100可以预置预设应用的信息(可以理解为是黑名单),该黑名单可以包括以下至少一项应用信息:应用类型、应用程序的名称、包名、应用标识等。预设条件为分享数据为预设应用的应用数据,可以包括:分享数据对应的应用信息和黑名单中的应用信息一致。其中,预设应用可以包括响应于用户操作确定的应用程序,也可以包括自动识别的应用程序,例如,电子设备100可以识别应用程序的类型,将银行、支付等类型的应用设置为预设应用。
在一些示例中,上述预设条件为分享数据包括预设内容,预设内容可以包括响应于用户操作确定的内容,也可以包括自动识别的内容。预设内容例如但不限于为文本类型、图片类型或视频类型。预设内容例如但不限于为用户名、密码、账户名、登录名、身份证号码、银行卡号、账户余额等。
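上述两类预设条件的判断可以示意如下（黑名单中的包名、敏感内容的匹配规则均为举例）：

    import java.util.Set;
    import java.util.regex.Pattern;

    // 示意：分享前检查应用是否命中黑名单、内容是否包含预设的敏感信息
    public class ShareGuard {
        static final Set<String> BLACKLIST = Set.of("com.example.bank", "com.example.pay"); // 假设的包名
        static final Pattern SENSITIVE = Pattern.compile("(密码|银行卡号|身份证|账户余额)");

        static boolean mayGrantFullPermission(String packageName, String visibleText) {
            if (BLACKLIST.contains(packageName)) return false; // 分享数据为预设应用的应用数据
            return !SENSITIVE.matcher(visibleText).find();     // 分享数据包括预设内容
        }

        public static void main(String[] args) {
            System.out.println(mayGrantFullPermission("com.example.video", "主题1"));   // true
            System.out.println(mayGrantFullPermission("com.example.bank", "账户余额")); // false
        }
    }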
以上示例可以实现被分享设备基于分享内容的权限管理,有效保证分享用户的隐私安全。
不限于上述实施方式（实时分享整个应用或者整个显示屏的显示内容），在另一种实施方式中，电子设备100可以响应于用户操作确定显示屏上的至少一个区域，上述确定的区域相关的音频流/视频流用于进行实时分享。
在一些示例中,电子设备100可以响应于针对图4B所示的用户界面420中的分享控制选项421的触摸操作(例如点击操作),显示分享菜单,例如显示图17A所示的用户界面1710。用户界面1710中的分享菜单1711可以包括多个选项,例如,用于分享当前应用(图17A以短视频应用为例进行示意)的音频流/视频流的选项1711A、选项1711B、用于分享电子设备100的屏幕的显示内容的选项1711C、用于取消/暂停/停止实时分享的选项1711D和用于触发更多功能选项的选项1711E。选项1711B包括字符“选择区域(栅格)”,用于通过栅格方式选择实时分享的区域。
在一些示例中,电子设备100可以响应于针对图17A所示的用户界面1710中的选项1711B的触摸操作(例如点击操作),显示图17B所示的用户界面1720。用户界面1720和图4B所示的用户界面420类似,区别在于,用户界面1720中的短视频的播放窗口1721被多条分割线划分为多个区域(区域也可称为栅格),图17B以播放窗口1721被三条分割线(纵向的分割线1721A、横向的分割线1721B和分割线1721C)划分为2×3=6个栅格为例进行示意。播放窗口1721中的任意一个栅格可以被选中,选中的栅格可以作为实时分享的区域。
在一些示例中,电子设备100可以响应于用户操作,在分享内容的播放窗口中移动分割线。例如,电子设备100可以响应于针对图17B所示的用户界面1720包括的播放窗口1721中的分割线1721A的触摸操作,将纵向的分割线1721A向左或向右移动,具体示例可参见图17C所示的用户界面1730,图17C以该触摸操作为向左滑动为例进行示意,用户界面1730示出了移动前的分割线1721A和移动后的分割线1721A。不限于此,还可以将播放窗口1721中的纵向的分割线向上或向下移动。
在一些示例中,电子设备100可以响应于用户操作,在分享内容的播放窗口中新增分割线。例如,电子设备100可以响应于针对图17C所示的用户界面1730包括的播放窗口1721的左侧边缘或者右侧边缘的触摸操作,新增纵向的分割线并将该分割线向右或向左移动,具体示例可参见图17D所示的用户界面1740,图17D以该触摸操作为从屏幕右侧边缘向屏幕中间滑动(向左滑动)为例进行示意,用户界面1740示出了新增的纵向的分割线1721D。不限于此,还可以响应于针对播放窗口1721的上侧边缘或者下侧边缘的触摸操作,新增横向的分割线。
在一些示例中,电子设备100可以响应于用户操作,在分享内容的播放窗口中删除分割线。例如,电子设备100可以响应于针对图17B所示的用户界面1720包括的播放窗口1721中的分割线1721A的触摸操作(例如左右滑动),将纵向的分割线1721A移动至屏幕左侧或者右侧边缘,此时播放窗口1721可以不显示分割线1721A,可以理解为是删除分割线1721A。不限于此,还可以将播放窗口中的横向的分割线移动至屏幕上侧或者下侧边缘,以此删除该分割线。
在一些示例中,电子设备100可以响应于用户操作,在分享内容的播放窗口中选择任意一个栅格(作为实时分享的区域)。例如,电子设备100可以响应于针对图17D所示的用户界面1740包括的播放窗口1721中位于中间的栅格的触摸操作(例如单击操作、双击操作或者长按操作),选中该栅格。此时,电子设备100可以显示图17E所示的用户界面1750,用户界面1750所示的播放窗口1721中位于中间的栅格1721E为选中状态。用户界面1750还包括完成控件1751,完成控件1751用于保存当前选中的栅格(即上述栅格1721E)为实时分享的区域。
可以理解地,在分享内容的播放窗口中移动、删除和新增分割线后,该播放窗口包括的栅格的大小和/个数会发生变化,例如,图17B所示的用户界面1720中的6个栅格的大小(经过图17C和图17D所示的移动分割线和新增分割线之前),和图17E所示的用户界面1750中的6个栅格的大小不同(经过图17C和图17D所示的移动分割线和新增分割线之后)。
在一些示例中，电子设备100可以响应于用户操作，在分享内容的播放窗口中选择多个栅格（作为实时分享的区域）。例如，电子设备100可以依次接收针对图17D所示的用户界面1740包括的播放窗口1721中位于底部的三个栅格的触摸操作（例如单击操作、双击操作或者长按操作），响应于这些触摸操作，选中这三个栅格，具体示例和图17E类似。不限于此，电子设备100也可以先接收针对这三个栅格中任意一个栅格的触摸操作（例如单击操作、双击操作或者长按操作），选中该栅格，例如，电子设备100可以显示图17F所示的用户界面1760，用户界面1760中的栅格1721F为选中状态。如图17F所示，用户执行上述触摸操作后可以保持手指触摸电子设备100的显示屏，并向左滑动至用户界面1760中和栅格1721F相邻的栅格1721G。电子设备100可以响应于上述用户操作，选中栅格1721G，此时可以显示图17G所示的用户界面1770，用户界面1770中的栅格1771为选中状态，栅格1771是合并栅格1721F和栅格1721G得到的。如图17G所示，用户可以继续保持手指触摸电子设备100的显示屏，并向左滑动至用户界面1760中和栅格1721G相邻的栅格1721H。电子设备100可以响应于上述用户操作，选中栅格1721H，此时可以显示图17H所示的用户界面1780，用户界面1780中的栅格1781为选中状态，栅格1781是合并栅格1721F、栅格1721G和栅格1721H得到的。
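上述栅格划分与合并可以抽象为“由分割线坐标生成网格、对选中的相邻栅格求并集矩形”的模型，示意如下（屏幕尺寸、分割线坐标均为假设值）：

    import java.awt.Rectangle;
    import java.util.ArrayList;
    import java.util.List;

    // 示意：纵/横分割线把播放窗口划分为栅格，选中的相邻栅格合并为分享区域
    public class GridSelector {
        static List<Rectangle> buildGrid(int w, int h, int[] xLines, int[] yLines) {
            List<Integer> xs = new ArrayList<>(List.of(0));
            for (int x : xLines) xs.add(x);
            xs.add(w);
            List<Integer> ys = new ArrayList<>(List.of(0));
            for (int y : yLines) ys.add(y);
            ys.add(h);
            List<Rectangle> cells = new ArrayList<>();
            for (int i = 0; i + 1 < ys.size(); i++)
                for (int j = 0; j + 1 < xs.size(); j++)
                    cells.add(new Rectangle(xs.get(j), ys.get(i),
                            xs.get(j + 1) - xs.get(j), ys.get(i + 1) - ys.get(i)));
            return cells; // 按行优先顺序返回各栅格
        }

        public static void main(String[] args) {
            // 一条纵向分割线、两条横向分割线：2×3=6个栅格
            List<Rectangle> cells = buildGrid(1080, 2340, new int[]{540}, new int[]{780, 1560});
            Rectangle merged = cells.get(2).union(cells.get(3)); // 合并中间一行相邻的两个栅格
            System.out.println(cells.size() + " cells, merged=" + merged);
        }
    }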
在一些示例中,电子设备100确定实时分享的区域后,可以将和该区域相关的视频流/音频流实时分享给其他设备。例如,电子设备100可以响应于针对图17H所示的用户界面1780中的完成控件1751的触摸操作(例如点击操作),将已选中的栅格1721E和栅格1781设置为实时分享的区域,并将相关的视频流/音频流实时分享给电子设备200。例如,电子设备200可以显示图17I所示的用户界面1790,用户界面1790和图5B所示的用户界面520类似,区别在于,用户界面1790所示的分享内容的播放窗口522中,仅显示上述已选中的栅格1721E和栅格1781中的内容1791,不显示其他区域中的内容。
不限于上图17B-图17H所示的选择实时分享的区域的方式,在另一些示例中,电子设备100可以响应于针对图17A所示的用户界面1710中的选项1711B的触摸操作(例如点击操作),显示图18A所示的用户界面1810。用户界面1810和图4B所示的用户界面420类似,区别在于,用户界面1810中的短视频的播放窗口412中还显示有选中框1811,可选地,选中框1811默认包括播放窗口412中的全部显示内容。选中框1811所在的区域可以作为实时分享的区域。
在一些示例中,电子设备100可以调整分享内容的播放窗口中的选中框的大小和/或位置。例如,电子设备100可以接收针对图18A所示的用户界面1810中的选中框1811的右下角的触摸操作(例如上下滑动、左右滑动、斜向上或斜向下滑动),图18B以该触摸操作为从右下角向左上角滑动为例进行示意,电子设备100可以响应于该触摸操作缩小选中框1811,图18B所示的用户界面1820示出了调整前的选中框1811和调整后的选中框1811。电子设备100可以继续接收针对图18B所示的用户界面1820中的选中框1811的左上角的触摸操作(例如上下滑动、左右滑动、斜向上或斜向下滑动),图18C以该触摸操作为从左上角向右下角滑动为例进行示意,电子设备100可以响应于该触摸操作缩小选中框1811,图18C所示的用户界面1830示出了调整前的选中框1811和调整后的选中框1811。用户界面1830还包括完成控件1831,完成控件1831用于保存当前的选中框1811(即调整后的选中框1811)所在的区域为实时分享的区域。
在一些示例中,电子设备100确定选中框所在的区域为实时分享的区域后,可以将和该区域相关的视频流/音频流实时分享给其他设备。例如,电子设备100可以响应于针对图18C所示的用户界面1830中的完成控件1831的触摸操作(例如点击操作),将选中框1811所在的区域设置为实时分享的区域,并将相关的视频流/音频流实时分享给电子设备200。例如,电子设备200可以显示图18D所示的用户界面1840,用户界面1840和图5B所示的用户界面520类似,区别在于,用户界面1840所示的分享内容的播放窗口522中,仅显示选中框1811中的内容1841,不显示其他区域中的内容。
不限于上述示例的选择实时分享的区域的方式,在另一些示例中,电子设备100可以响应于针对图4B所示的用户界面420中的分享控制选项421的触摸操作(例如点击操作),显示图19A所示的用户界面1910。用户界面1910和图17A所示的用户界面1710类似,区别在于,用户界面1910中的分享菜单1911不包括选项1711B,而是包括选项1911A,选项1911A包括字符“选择区域(手绘)”,用于通过手绘方式选择实时分享的区域。
在一些示例中，电子设备100可以响应于针对图19A所示的用户界面1910中的选项1911A的触摸操作（例如点击操作），显示图19B所示的用户界面1920，用户界面1920和图4B所示的用户界面420类似。电子设备100可以响应于针对用户界面1920中的短视频的播放窗口412的触摸操作，从播放窗口412中选择出和该触摸操作相关的区域，图19B以该触摸操作为按照顺时针方向滑动为例进行示意，和该触摸操作相关的区域为用户界面1920中的区域1921。用户界面1920还包括返回控件1922和完成控件1923，返回控件1922用于取消最近一次的操作结果，例如取消选择上述区域1921。完成控件1923用于保存当前选择的区域（例如区域1921）为实时分享的区域。不限于此，用户还可以选择多个区域，例如，图19B之后，电子设备100可以响应于针对短视频的播放窗口412的触摸操作，再从播放窗口412中选择出和该触摸操作相关的区域，图19C以该触摸操作为按照顺时针方向滑动为例进行示意，和该触摸操作相关的区域为图19C所示的用户界面1930中的区域1931。
在一些示例中，电子设备100确定用户手绘选择的区域为实时分享的区域后，可以将和该区域相关的视频流/音频流实时分享给其他设备。例如，电子设备100可以响应于针对图19C所示的用户界面1930中的完成控件1923的触摸操作（例如点击操作），将用户手绘选择的区域1921和区域1931设置为实时分享的区域，并将相关的视频流/音频流实时分享给电子设备200。例如，电子设备200可以显示图19D所示的用户界面1940，用户界面1940和图5B所示的用户界面520类似，区别在于，用户界面1940所示的分享内容的播放窗口522中，仅显示上述区域1921中的内容1941和上述区域1931中的内容1942，不显示其他区域中的内容。
不限于上图19B和图19C示例的选择实时分享的区域的方式，在另一些示例中，电子设备100可以响应于针对图19A所示的用户界面1910中的选项1911A的触摸操作（例如点击操作），显示图19E所示的用户界面1950。用户界面1950和图4B所示的用户界面420类似，包括短视频的播放窗口412。电子设备100可以依次接收针对用户界面1950中的位置1951和位置1952的触摸操作（例如点击操作），响应于这些触摸操作，在用户界面1950上显示端点为位置1951和位置1952的点的边界线1953（实线形式）。然后，电子设备100可以接收针对用户界面1950中的位置1954的触摸操作，用户手指保持触摸位置1954时，可以在用户界面1950上显示端点为位置1952和位置1954的点的边界线1955（虚线形式，表征可调整）。用户可以保持手指触摸显示屏并从位置1954移动至位置1956，电子设备100可以响应于该用户操作，在用户界面1950上取消显示边界线1955，并且显示端点为位置1952和位置1956的点的边界线1957（实线形式），可以理解为是，将边界线1955调整为边界线1957。然后，电子设备100可以依次接收针对图19F所示的用户界面1960中的位置1961、位置1962和位置1951的触摸操作（例如点击操作），响应于这些触摸操作，在用户界面1960上显示：端点为位置1956和位置1961的点的边界线1963（实线形式）、端点为位置1961和位置1962的点的边界线1964（实线形式）、端点为位置1962和位置1951的点的边界线1965（实线形式）。上述边界线1953、1957、1963、1964、1965可以构成用户界面1960中的（用户选择的）区域1966。用户界面1960还包括返回控件1967和完成控件1968，返回控件1967用于取消最近一次的操作结果，例如取消显示上述边界线1965。电子设备100可以响应于针对完成控件1968的触摸操作（例如点击操作），将用户手绘选择的区域1966设置为实时分享的区域，并将相关的视频流/音频流实时分享给电子设备200。例如，电子设备200可以显示图19G所示的用户界面1970，用户界面1970和图5B所示的用户界面520类似，区别在于，用户界面1970所示的分享内容的播放窗口522中，仅显示上述区域1966中的内容1971，不显示其他区域中的内容。
不限于上述示例的情况,在另一些示例中,电子设备100可以不实时分享预设应用的任意应用数据(例如被分享设备上用于显示分享内容的界面为黑色),例如,电子设备100可以预置预设应用的信息(可以理解为是黑名单),该黑名单可以包括以下至少一项应用信息:应用程序的名称、包名、应用标识等,当电子设备100识别到分享数据对应的应用信息和黑名单中的应用信息一致时,可以不实时分享该分享数据(例如但不限于为:电子设备100输出指示无法实时分享的提示信息,或者被分享设备上用于显示分享数据的窗口为黑色)。其中,预设应用可以包括响应于用户操作确定的应用程序,也可以包括自动识别的应用程序,例如,电子设备100可以识别应用程序的类型,将银行、支付等类型的应用设置为预设应用。在另一些示例中,电子设备100可以不实时分享应用程序的某个界面(例如被分享设备显示分享内容时,若播放至该界面相关的视频流则显示界面为黑色,若播放其他的视频流则显示的界面正常),例如,当电子设备100识别到待分享的用户界面包括预设内容时,不实时分享该界面。在另一些示例中,电子设备100也可以不实时分享用户界面中的某个区域(具体示例和图19D和图19G类似),例如,当电子设备100识别到待分享的用户界面包括预设内容时,不实时分享该界面中显示有该预设内容的区域。其中,预设内容可以包括响应于用户操作确定的内容,也可以包括自动识别的内容。预设内容例如但不限于为文本类型、图片类型或视频类型。预设内容例如但不限于为用户名、密码、账户名、登录名、身份证号码、银行卡号、账户余额等。
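为便于理解上述根据预设应用（黑名单）和预设内容决定是否分享、以及遮挡相应区域的逻辑，下面给出一个示意性的代码草图（仅为说明思路的假设性实现，其中的APP_BLACKLIST、PRESET_KEYWORDS、FrameInfo等名称与字段均为示例，并非本申请的实际实现）：

```python
# 示意性草图（假设性实现，非本申请实际代码）：根据预设应用黑名单和预设内容
# 判断分享数据是否可以实时分享，以及需要遮挡的区域。
from dataclasses import dataclass, field
from typing import List, Tuple

# 预设应用黑名单：可包含应用名称、包名、应用标识等信息（示例取值）
APP_BLACKLIST = {"com.example.bank", "com.example.pay"}

# 预设内容关键字：例如密码、银行卡号等（示例取值）
PRESET_KEYWORDS = ("密码", "银行卡号", "账户余额", "身份证号")

@dataclass
class FrameInfo:
    package_name: str                      # 分享数据对应的应用包名
    # texts: 识别到的文本及其在界面中的矩形区域 (x, y, w, h)
    texts: List[Tuple[str, Tuple[int, int, int, int]]] = field(default_factory=list)

def check_share(frame: FrameInfo):
    """返回 (是否允许分享, 需要遮挡的区域列表)。"""
    # 应用信息与黑名单一致时，整体不分享
    if frame.package_name in APP_BLACKLIST:
        return False, []
    # 界面包含预设内容时，仅遮挡显示该预设内容的区域
    masked = [rect for text, rect in frame.texts
              if any(k in text for k in PRESET_KEYWORDS)]
    return True, masked

if __name__ == "__main__":
    frame = FrameInfo("com.example.video",
                      texts=[("用户昵称: 张三", (0, 0, 200, 40)),
                             ("银行卡号: 6222****", (0, 50, 200, 40))])
    ok, masked = check_share(frame)
    print(ok, masked)   # True [(0, 50, 200, 40)]
```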
不限于上述实施方式(实时分享整个应用、整个显示屏或者用户选择的区域的显示内容),在另一种实施方式中,电子设备100可以响应于用户操作,确定用户界面中的至少一个图层,上述确定的图层相关的音频流/视频流用于进行实时分享。
在一些示例中,电子设备100可以响应于针对图4B所示的用户界面420中的分享控制选项421的触摸操作(例如点击操作),显示分享菜单,例如显示图20A所示的用户界面2010。用户界面2010和图17A所示的用户界面1710类似,区别在于,用户界面2010中的分享菜单2011不包括选项1711B,而是包括选项2011A,选项2011A包括字符“选择图层”,用于触发选择用于进行实时分享的图层。
在一些示例中,电子设备100可以响应于针对图20A所示的用户界面2010中的选项2011A的触摸操作(例如点击操作),显示图层选择界面,例如显示图20B所示的用户界面2020。用户界面2020可以包括图层示意界面2021,图层示意界面2021可以包括图层2021A、图层2021B和图层2021C,这些图层可以是对图4B所示的用户界面420中的短视频的播放窗口412(假设还显示有收起状态的悬浮窗312)进行 图层划分得到的。其中,图层2021A可以包括短视频应用的内容和悬浮窗312,图层2021B可以包括短视频应用的内容(可以理解为是播放窗口412播放的短视频1的具体内容),图层2021C可以包括短视频应用的内容(可以理解为是短视频应用的相关控件)。电子设备100可以响应于针对图层示意界面2021中的任意一个图层的触摸操作(例如点击操作、双击操作或长按操作),选中该图层。电子设备100可以将和选中的图层相关的音频流/视频流分享给其他设备,例如将图层2021B相关的短视频1的音频流/视频流分享给电子设备200,电子设备200此时例如显示图19G所示的用户界面1970,用户界面1970中的播放窗口522仅显示图层2021B中的内容1971,不显示其他图层的内容。
不限于上述示例的情况,在另一些示例中,电子设备100显示的图层选择界面也可以为图20C所示的用户界面2030。用户界面2030可以包括图层示意界面2031,图层示意界面2031可以包括图层2031A和图层2031B,这些图层可以是对视频应用的播放界面(该播放界面上显示有短信息应用的悬浮窗口)进行图层划分得到的。其中,图层2031A可以包括视频应用中的视频内容2031C和短信息应用的内容2031D,图层2031B可以包括视频应用中的播放控件、进度条等内容。电子设备100可以响应于针对图层示意界面2031中的任意一个图层的触摸操作(例如点击操作、双击操作或长按操作),选中该图层。电子设备100可以将和选中的图层相关的音频流/视频流分享给其他设备。
在另一些示例中,电子设备100显示的图层选择界面也可以为图20D所示的用户界面2040。用户界面2040可以包括图层示意界面2041,图层示意界面2041可以包括图层2041A、图层2041B和图层2041C,这些图层可以是对视频应用和短信息应用的分屏界面进行图层划分得到的。其中,图层2041A可以包括视频应用的内容和短信息应用的内容(可以理解为包括整个分屏界面的内容,也可理解为是包括图层2041B和图层2041C中的内容,),图层2041B可以包括视频应用中的视频内容,图层2041C可以包括短信息应用中的短信息。电子设备100可以响应于针对图层示意界面2041中的任意一个图层的触摸操作(例如点击操作、双击操作或长按操作),选中该图层。电子设备100可以将和选中的图层相关的音频流/视频流分享给其他设备。
不限于上述示例的情况,在另一些示例中,可以划分得到更多或更少的图层,例如,图20C所示的用户界面2030中的图层示意界面2031还包括图层2031D,图层2031D包括短信息应用的内容。本申请对图层划分方式不作限定。
不限于上述示例的情况,在另一些示例中,一个图层还可以包括更多或更少的应用的内容,例如,一个图层仅包括电子设备的系统内容(不包括任意应用的内容),或者,一个图层包括两个或两个以上的应用的内容,本申请对图层包括的内容不作限定。
以上示例中,可以按任意应用、按任意区域(规则或不规则)、全屏幕等多种方式来进行实时分享,前台应用、后台应用和未运行的应用均可以进行实时分享,也就是说,分享内容无限制,使用场景更加广泛,有效满足用户的需求,提升用户体验感。
不限于上述实施方式,在另一种实施方式中,电子设备100作为分享设备向其他设备实时分享第一内容时,上述其他设备中的任意一个设备也可以作为分享设备向电子设备100等设备分享第二内容,也就是说可以实现双向分享,上述其他设备作为分享设备实时分享的说明和电子设备100作为分享设备实时分享的说明类似,下面示例性示出一些场景,但还可以有其他和电子设备100作为分享设备实时分享的场景类似的场景,不应构成限定。
在一些示例中,在图5B之后,电子设备200可以将电子设备100实时分享的内容的播放界面(例如图5B所示的用户界面520中的播放窗口522)切换至后台显示(也可称为是将实时分享的内容对应的应用程序切换至后台运行),在前台显示其他应用的用户界面(也可称为是在前台运行其他应用),例如图21A所示的短视频应用的用户界面2110,用户界面2110可以包括位于顶部的通话控件2111和短视频的播放窗口2112,通话控件2111可以表征当前电子设备200处于通话态且通话时长为36秒。播放窗口2112用于显示播放的短视频,例如当前正在播放“用户2”发布的名称为“主题2”的短视频2。电子设备200可以响应于用于触发实时分享的用户操作,例如图21A所示的指关节按照“W”的特定轨迹滑动的用户操作,显示分享对象和分享内容的选择界面,例如图21B所示的用户界面2120。用户界面2120可以包括可选择的分享内容的列表2121和可选择的分享对象的列表2122,列表2121可以包括用于共享前台应用(图21B以短视频应用为例进行示意)的显示内容的选项2121A、用于共享电子设备200的屏幕的显示内容的选项2121B、用于共享后台应用(图21B以浏览器应用为例进行示意)的显示内容的选项2121C。列表2122可以包括指示通话对方(即通讯号码为“电话号码1”的电子设备100)的选项2122A和多个指示附近设备的选项。
不限于图21B所示的可选择的分享对象的列表,在另一些示例中,可选择的分享内容的列表可以包括用于共享电子设备100实时分享的内容的选项,可选地,具备即时传播权限的电子设备200可以显示用于共享电子设备100实时分享的内容的选项,不具备即时传播权限的电子设备200可以不显示用于共享电子设备100实时分享的内容的选项。例如,电子设备200显示的分享对象和分享内容的选择界面为图21C所示的用户界面2130,用户界面2130和图21B所示的用户界面2120类似,区别在于,用户界面2130中的可选择的分享内容的列表2121还包括选项2121D,选项2121D下显示有字符:共享“一起看”,用于共享电子设备100实时分享的内容,例如图5B所示的用户界面520中的播放窗口522播放的短视频1的音频流/视频流。
在一些示例中，电子设备200可以响应于针对图21B所示的用户界面2120或者图21C所示的用户界面2130中的选项2122A的触摸操作（例如点击操作），向选项2122A指示的电子设备100实时分享选中状态的选项2121A指示的短视频应用的音频流/视频流（具体为短视频2的音频流/视频流）。也即是说，电子设备100向电子设备200实时分享短视频1的音频流/视频流时，电子设备200也可以向电子设备100实时分享短视频2的音频流/视频流。在一种情况下，电子设备200可以在前台播放电子设备100实时分享的短视频1的音频流/视频流，例如显示图5B所示的用户界面520。同时，电子设备100也可以在前台播放电子设备200实时分享的短视频2的音频流/视频流，例如显示图21D所示的用户界面2140，用户界面2140可以包括提示框2141和播放窗口2142，提示框2141包括字符“正在观看用户B分享的内容”，播放窗口2142用于显示分享内容（例如图21A所示的用户界面2110中的播放窗口2112显示的图像）。不限于上述情况，在另一种情况下，电子设备100和电子设备200可以均在前台播放电子设备200实时分享的短视频2的音频流/视频流，例如，电子设备100显示图21D所示的用户界面2140，电子设备200显示图21A所示的用户界面2110。或者，电子设备100和电子设备200可以均在前台播放电子设备100实时分享的短视频1的音频流/视频流，例如，电子设备100显示图4B所示的用户界面420，电子设备200显示图5B所示的用户界面520。不限于上述情况，在另一种情况下，电子设备100或电子设备200可以分屏显示电子设备100实时分享的内容和电子设备200实时分享的内容，例如，电子设备200显示图21E所示的用户界面2150，用户界面2150可以包括分屏显示的短视频的播放窗口2151和电子设备100实时分享的内容的播放窗口2152，播放窗口2151和播放窗口2152之间可以显示有控件2153，控件2153用于调整播放窗口2151和播放窗口2152的显示区域的大小。播放窗口2151用于显示图21A所示的用户界面2110中的播放窗口2112显示的图像，播放窗口2151中显示有控件2151A，控件2151A的说明可参见图4B所示的用户界面420中的分享控制选项421的说明。播放窗口2152用于显示图5B所示的用户界面520中的播放窗口522显示的图像，播放窗口2152中显示有控件2152A，控件2152A的说明可参见用户界面520中的分享控制选项523的说明。电子设备100显示的界面和图21E所示的用户界面2150类似，不再赘述。本申请对双向分享的具体显示方式不作限定。
不限于上述示例,在另一些示例中,电子设备100和电子设备200可以同时作为分享设备向电子设备400实时分享音频流和/或视频流,电子设备400可以按照上述示例的任意一种情况或其他情况显示电子设备100实时分享的内容和/或电子设备200实时分享的内容,例如,电子设备400可以显示图21E所示的用户界面2150,此时,用户界面2150中的播放窗口2151也包括提示信息“正在观看用户B分享的内容”。
不限于上述示例,在另一些示例中,还可以有更多设备可以同时作为分享设备进行实时分享,其中任意两个设备的说明和上述电子设备100和电子设备200进行双向分享的说明类似,不再赘述。
不限于上述示例,在另一些示例中,电子设备200作为分享设备向电子设备100和/或其他设备实时分享电子设备100实时分享的内容时,可以根据是否具备即时传播权限执行不同的操作,例如,具备即时传播权限的电子设备200可以响应于针对图21C所示的用户界面2130中的选项2121D的触摸操作(例如点击操作),实时分享选项2121D指示的内容。或者,不具备即时传播权限的电子设备200可以响应于针对图21C所示的用户界面2130中的选项2121D的触摸操作(例如点击操作),向电子设备100请求获取选项2121D指示的内容的即时传播权限,具体可参见图16A-图16C的示例,不再赘述。不限于此,不具备即时传播权限的电子设备200也可以直接显示指示不具备即时传播权限的提示信息,或者直接不响应上述触摸操作。
在一种实施方式中,电子设备100可以默认开启以上实施方式中的实时分享功能。在另一种实施方式中,电子设备100可以响应于用户操作,开启以上实施方式中的实时分享功能,下面示例性示出一些用于设置实时分享功能的用户界面。
在一些示例中,如图22A所示,电子设备100可以显示用户界面2210,用户界面2210包括设置名称 2211(包括字符“一起看/听”),可以表征用户界面2210为实时分享功能的设置界面。例如,电子设备100可以响应于针对设置菜单中的“更多连接”选项下的“一起看/听”选项的触摸操作(如点击操作),显示用户界面2210。用户界面2210可以包括功能名称2212(包括字符“一起看,一起听”),功能名称2212右侧还显示有对应的开关控件2212A,开关控件2212A用于开启或关闭功能名称2212指示的实时分享功能,开关控件2212A可以理解为是实时分享功能的总开关。用户界面2210还包括分享对象的设置菜单:标题2213(包括字符“分享菜单”)下显示的多个设置选项,例如设置选项2214(包括字符“允许和电话对方一起看/听”)和设置选项2215(包括字符“允许和附近设备一起看/听”)。设置选项2214右侧还显示有对应的开关控件2214A,开关控件2214A用于开启或关闭设置选项2214指示的通过运营商通话/OTT通话等NewTalk实现实时分享的功能。设置选项2215右侧还显示有对应的开关控件2215A,开关控件2215A用于开启或关闭设置选项2215指示的通过近场通信技术实现实时分享的功能。不限于此,还可以包括指示通过卫星方式实现实时分享的功能的选项,指示和车载设备实现实时分享的功能的选项等,或者,仅包括设置选项2214或设置选项2215,本申请对此不作限定。
在一些示例中,如图22B所示,电子设备100可以显示用户界面2220,用户界面2220包括设置名称2221(包括字符“新通话”),可以表征用户界面2220为NewTalk功能的设置界面。例如,电子设备100可以响应于针对“电话”选项下的设置菜单中的“新通话(NewTalk)”选项的触摸操作(如点击操作),显示用户界面2220。用户界面2220可以包括功能名称2222(包括字符“新通话”),功能名称2222右侧还显示有对应的开关控件2222A,开关控件2222A用于开启或关闭功能名称2222指示的NewTalk功能,开关控件2222A可以理解为是NewTalk功能的总开关。用户界面2220还包括NewTalk功能中的多个子功能的信息,例如子功能2223、子功能2224和子功能2225。其中,子功能2223包括功能名称:“智能增加通话质量”,子功能2223下还显示有对应的功能说明2223A(包括字符“允许使用蜂窝移动数据,确保流畅的通话体验”)。子功能2224包括功能名称:“通话中信息分享”,子功能2224下还显示有对应的功能说明2224A(包括字符“允许通话中接收信息,如图片、位置、文件、链接等”),子功能2224例如通过图3的(A)所示的通话界面310中的悬浮窗312包括的选项312B和选项312C实现。子功能2225包括功能名称:“一起看/听”,子功能2225下还显示有对应的功能说明2225A(包括字符“允许发起或者接受通话的双方一起看、一起听等”),用于指示通过运营商通话/OTT通话等NewTalk实现实时分享的功能,例如通过图3的(A)所示的通话界面310中的悬浮窗312包括的选项312D实现。
不限于图22B所示的示例,在另一些示例中,NewTalk功能中的任意一个子功能可以独立开启或关闭,例如图22C所示,电子设备100可以显示用户界面2230,用户界面2230包括设置名称2231(包括字符“新通话”),例如,电子设备100可以响应于针对“电话”选项下的设置菜单中的“新通话(NewTalk)”选项的触摸操作(如点击操作),显示用户界面2230。用户界面2230可以包括NewTalk功能中的多个子功能的选项,例如子功能2232、子功能2233和子功能2234,其中,子功能2232包括功能名称:“通话质量增强”,下侧还显示有对应的说明2232A(包括字符“开启后,通话允许使用蜂窝移动数据,确保流畅的通话体验,实际消耗的流量以运营统计为准”),右侧还显示有对应的开关控件2232B,开关控件2232B用于开启或关闭子功能2232。子功能2233包括功能名称:“允许通话中接收信息”,下侧还显示有对应的说明2233A(包括字符“开启后将允许通话中接收信息,如图片、位置、文件、链接等”),右侧还显示有对应的开关控件2233B,开关控件2233B用于开启或关闭子功能2233。子功能2234包括功能名称:“允许和电话对方一起看/听”,下侧还显示有对应的说明2234A(包括字符“开启后将允许发起或者接受通话的双方一起看、一起听等”),右侧还显示有对应的开关控件2234B,开关控件2234B用于开启或关闭子功能2234,即通过运营商通话/OTT通话等NewTalk实现实时分享的功能。
在一些示例中,如图22D所示,电子设备100可以显示用户界面2240,用户界面2240包括设置名称2241(包括字符“华为分享”),设置名称2241下还显示有对应的说明2241A(包括字符“无需流量,与附近设备极速分享图片、视频、应用、文件等”),可以表征用户界面2240为华为分享(本申请也可称为即时分享)功能的设置界面。例如,电子设备100可以响应于针对设置菜单中的“更多连接”选项下的“华为分享”选项的触摸操作(如点击操作),显示用户界面2240。用户界面2240可以包括功能名称2242(包括字符“华为分享”)和功能名称2243(包括字符“允许获取华为账号权限”),其中,功能名称2242下侧还显示有对应的功能说明2242A(包括字符“本服务使用蓝牙、WLAN进行数据和多媒体流传输,使用NFC进行设备触碰,调用存储权限读取或保存分享的文件和一起看一起听,即使关闭蓝牙、WLAN、NFC,华为分享也可继续使用蓝牙、WLAN、NFC功能的能力。打开开关,即表示您同意上述内容”),可以表征通过蓝牙、WLAN、NFC等近场通信技术分享文件以及实时分享音频流/视频流的华为分享功能。功能名 称2242右侧还显示有对应的开关控件2242B,开关控件2242B用于开启或关闭功能名称2242指示的华为分享功能,开关控件2242B可以理解为是华为分享功能的总开关。功能名称2242指示的华为分享功能例如通过图8A所示的用户界面810中的控件814B/控件814D实现。功能名称2243下侧还显示有对应的功能说明2243A(包括字符“允许获取本机华为账号昵称和头像,并在发送方设备中缓存,以便发送方更容易识别您”),右侧还显示有对应的开关控件2243B,开关控件2243B用于开启或关闭功能名称2243指示的功能,例如,图6B所示的用户界面620中的可选择的分享对象的列表622示出的用户名称可以是通过功能名称2243指示的功能获取到的。
不限于图22D所示的示例,在另一些示例中,通过华为分享实现实时分享的功能可以独立于华为分享功能开启或关闭,例如图22E所示,电子设备100可以显示的用户界面2250,用户界面2250和图22D所示的用户界面2240类似,区别在于,用户界面2250中的功能名称2242下显示的功能说明2251不同,功能说明2251包括字符“本服务使用蓝牙、WLAN进行数据和多媒体流传输,使用NFC进行设备触碰,调用存储权限读取或保存分享的文件,即使关闭蓝牙、WLAN、NFC,华为分享也可继续使用蓝牙、WLAN、NFC功能的能力。打开开关,即表示您同意上述内容”,可以表征通过蓝牙、WLAN、NFC等近场通信技术分享文件的华为分享功能。并且,用户界面2250还包括功能名称2252(包括字符“允许和附近设备一起看、一起听”),功能名称2252下侧还显示有对应的功能说明2252A(包括字符“允许附近设备通过华为分享一起看、一起听”),可以表征通过华为分享实现实时分享的功能,例如通过蓝牙、WLAN、NFC等近场通信技术实现实时分享的功能。功能名称2252右侧还显示有对应的开关控件2252B,开关控件2252B用于开启或关闭功能名称2252指示的功能。功能名称2252指示的功能例如通过图8A所示的用户界面810中的控件814B/控件814D实现。
不限于上述实施方式所示的实时共享场景(“一起看”和/或“一起听”),在另一种实施方式中,实时共享场景还可以包括“一起玩”。接下来以进行运营商通话/OTT通话等NewTalk的电子设备100和电子设备200为例说明“一起玩”的实时共享场景。
在一些示例中，电子设备100可以响应于用于触发实时分享的用户操作，例如，针对图3的(A)所示的通话界面310中的分享选项312D的触摸操作，向通话对方：电子设备200发送“一起玩”的请求，电子设备200接受该请求后，电子设备100和电子设备200可以同时显示游戏界面，例如图23A所示，电子设备100可以显示图23A的(1)所示的用户界面2310，电子设备200可以显示图23A的(2)所示的用户界面2320。用户界面2310可以包括位于顶部的通话图标2311和游戏窗口2312，通话图标2311可以表征当前电子设备100处于通话态，且通话时长为33秒，游戏窗口2312用于显示“一起玩”的游戏内容。其中，游戏窗口2312可以包括游戏名称2312A（包括字符“一起算题中”）和题目信息2312B（包括字符“15+23=”），题目信息2312B右侧还显示有用于接收用户输入内容的输入框2312C，题目信息2312B下侧还显示有键盘2312D，键盘2312D用于用户在输入框2312C中输入相应的字符。键盘2312D可以包括确定控件2312E，确定控件2312E用于将输入框2312C中的内容作为题目信息2312B对应的答案提交至审核设备，以使审核设备审核答案是否正确。游戏窗口2312还可以包括控制选项2313和切换选项2314，控制选项2313用于触发显示控制菜单，控制菜单例如但不限于包括暂停/退出“一起玩”的选项，切换选项2314用于切换题目信息2312B包括的内容。用户界面2320和用户界面2310类似，区别在于，用户界面2320所示的游戏窗口2321中的题目信息2321A和用户界面2310中的题目信息2312B不同。不限于上述示例的情况，在另一些示例中，电子设备100和电子设备200显示的题目信息也可以相同，例如，用户界面2320中的题目信息为用户界面2310中的题目信息2312B。
在一些示例中,在图23A之后,电子设备100可以接收用户在图23A的(1)所示的用户界面2310中的输入框2312C输入的字符“38”,并接收针对用户界面2310中的确定控件2312E的触摸操作(例如点击操作)。响应于该触摸操作,电子设备100可以将输入框2312C中的内容(即字符“38”)发送给服务器。服务器确定该内容即为用户界面2310中的题目信息2312B对应的答案(也可称为是确定答案正确)时,可以指示电子设备100显示提示信息,该提示信息表征当前游戏胜利,以及指示电子设备200显示提示信息,该提示信息表征当前游戏失败,例如图23B所示,电子设备100可以显示图23B的(1)所示的用户界面2330,电子设备200可以显示图23B的(2)所示的用户界面2340。用户界面2330和图23A的(1)所示的用户界面2310类似,区别在于,用户界面2330所示的游戏窗口2312中的输入框2312C显示有用户输入的字符“38”,并且,游戏窗口2312中还显示有提示信息2331,提示信息2331包括字符“本人赢”,用于指示当前游戏胜利。用户界面2340和图23A的(2)所示的用户界面2320类似,区别在于,用户界面2340所示的游戏窗口2321中还显示有提示信息2341,提示信息2341包括字符“对方赢”,用于指示当 前游戏失败。
在一些示例中,在图23B之后,电子设备100可以响应于针对图23B的(1)所示的用户界面2330中的切换选项2314的触摸操作(例如点击操作),向服务器请求获取新一轮游戏的游戏内容。服务器接收到该请求后,可以向电子设备100和电子设备200发送游戏内容,例如游戏窗口中的题目信息。在另一些示例中,电子设备100也可以在显示图23B的(1)所示的用户界面2330中的提示信息2331后的预设时长(例如10秒),向服务器请求获取新一轮游戏的游戏内容。不限于上述示例的情况,在另一些示例中,也可以是电子设备200向服务器请求获取新一轮游戏的游戏内容,本申请对此不作限定。
不限于上述示例的情况(审核设备为服务器),在另一些示例中,审核设备也可以是电子设备100、电子设备200或其他网络设备。在一种情况下,审核设备是电子设备100。电子设备100可以自行判断图23B的(1)所示的用户界面2330中的输入框2312C中的内容(即字符“38”)是否为用户界面2330中的题目信息2312B对应的答案,当判断结果为是时,可以显示表征当前游戏胜利的提示信息,并且指示电子设备200显示表征当前游戏失败的提示信息。在另一种情况下,审核设备是电子设备200。电子设备100可以向电子设备200发送图23B的(1)所示的用户界面2330中的输入框2312C中的内容(即字符“38”),由电子设备200判断该内容是否为用户界面2330中的题目信息2312B对应的答案,当判断结果为是时,电子设备200可以显示表征当前游戏失败的提示信息,并指示电子设备100显示表征当前游戏胜利的提示信息。
以上示例中,提供“一起玩”的游戏内容的设备为服务器,可以理解为是,服务器为主设备/分享设备,电子设备100和电子设备200为从设备/被分享设备。不限于此,在另一些示例中,提供“一起玩”的游戏内容的设备也可以是电子设备100、电子设备200或其他网络设备。接下来以提供“一起玩”的游戏内容的设备是电子设备100为例说明,电子设备100向电子设备200实时分享游戏内容可以但不限于包括以下三种分享方式:
方式一:不传输音频流/视频流等被分享设备可直接输出的多媒体数据流,而是仅传输游戏数据,游戏数据例如为图23A和图23B所示的题目信息2321A,不限于此,还可以包括得分等游戏状态的数据,本申请对此不作限定。
方式二：传输音频流/视频流等被分享设备可直接输出的多媒体数据流，但不携带遮挡的画布。例如，电子设备100开始实时分享之后，可以在图23A和图23B所示的题目信息2312B上显示新的图层（可称为画布），画布用于遮挡题目信息2312B，让用户无法看到题目信息2312B。电子设备100在预设的传输时长后取消显示画布，预设的传输时长可以为电子设备100向电子设备200发送多媒体数据流的时延和电子设备200播放多媒体数据流之前的处理时延（例如解码和渲染的时延），预设的传输时长例如是电子设备100在预设的测量时长内（例如最近一次的传输过程内）的测量值、平均值或者估算值。
方式三：传输音频流/视频流等被分享设备可直接输出的多媒体数据流，也携带遮挡的画布。例如，电子设备100和电子设备200可以在预设的游戏开始时间之前，在图23A和图23B所示的题目信息2312B和题目信息2321A上显示画布，在预设的游戏开始时间取消显示画布。
可以理解地,采用方式一进行实时共享时,设备的数据传输量较小,对流量、带宽等网络环境要求较低,可以很好地适用于流量较小或网络质量较差的场景,减小设备的数据传输量和减少设备功耗。
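针对上述方式二中“预设的传输时长”的确定与画布的取消显示时机，下面给出一个示意性的代码草图（假设性实现，取最近若干次测量值的平均值作为估算方式，函数名与参数均为示例）：

```python
# 示意性草图（假设性实现）：估算"预设的传输时长"（发送时延加对端解码、
# 渲染等处理时延），分享设备先显示画布遮挡题目信息，时延后取消遮挡。
import time
import threading

def estimate_canvas_delay(send_delays_ms, decode_render_delays_ms):
    """根据最近测量的发送时延和对端处理时延估算画布遮挡时长（毫秒）。"""
    avg_send = sum(send_delays_ms) / len(send_delays_ms)
    avg_proc = sum(decode_render_delays_ms) / len(decode_render_delays_ms)
    return avg_send + avg_proc

def show_question_with_canvas(show_canvas, hide_canvas, delay_ms):
    show_canvas()                                             # 先用画布遮挡题目信息
    threading.Timer(delay_ms / 1000.0, hide_canvas).start()  # 预设时长后取消遮挡

if __name__ == "__main__":
    delay = estimate_canvas_delay([40, 55, 48], [20, 22, 25])  # 示例测量值
    show_question_with_canvas(lambda: print("显示画布"),
                              lambda: print("取消画布"), delay)
    time.sleep(0.2)
```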
不限于上述示例的情况,在另一些示例中,提供“一起玩”的游戏内容的设备可以有多个,例如电子设备100可以为电子设备200提供游戏数据,电子设备200可以为电子设备100提供游戏数据,本申请对此不作限定。
在一些示例中,从设备/被分享设备可以浏览、操作游戏,但可以不运行游戏,而是运行“一起玩”的播放窗口。不限于此,在另一些示例中,从设备/被分享设备也可以根据接收到的游戏内容运行游戏,例如接收到一个完整的游戏应用,本申请对此不作限定。
不限于上述示例的游戏内容,在另一些示例中,也可以是“一起玩”其他游戏,例如图23C所示,电子设备100可以显示图23C的(1)所示的用户界面2350,电子设备200可以显示图23C的(2)所示的用户界面2360。用户界面2350中的游戏窗口2351可以包括用于显示电子设备100的游戏内容的窗口2351A、用于显示游戏对方(即电子设备200)的游戏内容的窗口2351B、游戏得分2351C、电子设备100的道具信息2351D。类似地,用户界面2360中的游戏窗口2361可以包括用于显示电子设备200的游戏内容的窗口2361A、用于显示游戏对方(即电子设备100)的游戏内容的窗口2361B、游戏得分2361C、电子设备200的道具信息2361D。可以理解地,电子设备100或电子设备200可以将本设备上更新后的游戏内容发送至通话对方,以用于通话对方更新显示的用户界面。例如,电子设备200可以将用户界面2360中的窗 口2361A的内容发送给电子设备100,以用于电子设备100在用户界面2350中更新窗口2351B的显示内容,其中,电子设备200可以直接向电子设备100发送游戏内容,也可以先向服务器发送游戏内容再由服务器转发给电子设备200(可称为是间接发送)。类似地,电子设备100可以直接或间接地将用户界面2350中的窗口2351A的内容发送给电子设备200,以用于电子设备200在用户界面2360中更新窗口2361B的显示内容。不限于此,例如,当电子设备100的游戏分数发生变化时,可以直接或间接地向电子设备200发送最新的游戏分数,以用于电子设备200在用户界面2360中更新游戏得分2361C的显示内容。类似地,当电子设备200的游戏分数发生变化时,可以直接或间接地向电子设备100发送最新的游戏分数,以用于电子设备100在用户界面2350中更新游戏得分2351C的显示内容。
不限于上述实施方式所示的实时共享场景(“一起看”和/或“一起听”),在另一种实施方式中,实时共享场景还可以包括“一起编辑”。接下来以进行运营商通话/OTT通话等NewTalk的电子设备100和电子设备200为例说明“一起编辑”的实时共享场景。
在一些示例中,电子设备100(用户A)可以作为分享设备向通话对方:电子设备200(用户B)实时分享文档(例如word格式)的内容,电子设备100和电子设备200可以同时显示该文档的具体内容。例如图24A所示,电子设备100可以显示图24A的(1)所示的用户界面2410,电子设备200可以显示图24A的(2)所示的用户界面2420。用户界面2410可以包括文档1的编辑窗口2411,编辑窗口2411可以包括文档1的具体内容和编辑功能列表2411A,编辑功能列表2411A例如包括保存文档的控件、撤销最近一次输入的控件、恢复最近一次撤销的输入的控件、退出编辑的控件等。用户界面2420和用户界面2410类似,也包括文档1的编辑窗口2421。如图24A的(1)所示,电子设备100可以响应于针对编辑窗口2411中的文本2411B(“文本1”)的触摸操作,在文本2411B右侧显示光标2411C以及在文本2411B所在的区域显示编辑标识2411D,光标2411C和编辑标识2411D用于指示用户A当前使用电子设备100编辑文本2411B。同时,如图24A的(2)所示,电子设备200可以在编辑窗口2421中的文本2411B所在的区域显示编辑标识2421A和提示信息2421B(包括字符“用户A同步编辑中”),用于指示通话对方(用户A)当前正在编辑文本2411B。类似地,如图24A的(2)所示,用户B使用电子设备200编辑窗口2421中的文本2421C(“文本3”)时,编辑窗口2421中的文本2421C右侧可以显示光标2421D,文本2421C所在的区域可以显示编辑标识2421E。同时,如图24A的(1)所示,编辑窗口2411中的文本2421C所在的区域可以显示编辑标识2411E和提示信息2411F(包括字符“用户B同步编辑中”)。
可以理解地,电子设备100或电子设备200可以将本设备上更新后的文档内容发送至通话对方,以用于通话对方更新显示的文档内容,例如,用户A在图24A的(1)所示的用户界面2410中将文本2411B从“文本1”更改为“文本1包括”,则图24A的(2)所示的用户界面2420中的文本2411B也会更新为“文本1包括”。
不限于上述示例的情况,在另一些示例中,文档也可以是表格(excel)格式的。例如图24B所示,电子设备100可以显示用户界面2440,用户界面2440可以包括表格1的编辑窗口2441,编辑窗口2441可以包括表格1的具体内容和编辑功能列表。编辑窗口2441中的内容2441A右侧显示有光标2441B,并且内容2441A所在的区域显示有编辑标识2441C,用于指示用户A当前使用电子设备100编辑内容2441A。编辑窗口2441中的内容2441D所在的区域显示有编辑标识2441E和提示信息2441F(包括字符“用户B同步编辑中”),用于指示通话对方(用户B)当前正在编辑内容2441D。电子设备200显示的界面和用户界面2440类似,具体说明和图24B的说明类似,不再赘述。
不限于上述示例的情况,在另一些示例中,文档也可以是PPT格式的。例如图24C所示,电子设备100可以显示用户界面2430,用户界面2430可以包括PPT1的编辑窗口2431,编辑窗口2431可以包括幻灯片内容的显示窗口2432和PPT1包括的幻灯片内容的列表2433,列表2433中的选项2433A为选中状态,可以表征显示窗口2432用于显示选项2433A指示的幻灯片内容。显示窗口2432中的内容2432A右侧显示有光标2432B,并且内容2432A所在的区域显示有编辑标识2432C,用于指示用户A当前使用电子设备100编辑内容2432A。显示窗口2432中的内容2432D所在的区域显示有编辑标识2432E和提示信息2432F(包括字符“用户B同步编辑中”),用于指示通话对方(用户B)当前正在编辑内容2432D。电子设备200显示的界面和用户界面2430类似,具体说明和图24C的说明类似,不再赘述。
以上示例中,电子设备100用于提供“一起编辑”的文档,可以理解为是,电子设备100为主设备/分享设备,电子设备200为从设备/被分享设备。在一些示例中,从设备/被分享设备可以浏览、编辑文档,但可以不运行文档,而是运行“一起编辑”的播放窗口。不限于此,在另一些示例中,从设备/被分享设备也可以根据接收到的文档内容运行文档,例如接收到一个完整的文档,本申请对此不作限定。
不限于上述示例的情况,在另一些示例中,电子设备100也可以向电子设备200实时分享画图、白板、注释等,例如,用户A可以在电子设备100显示的画图的窗口/白板上输入内容1,电子设备200显示的画画的窗口/白板上可以显示用户A输入的内容1,不限于此,也可以删除或修改内容,本申请对编辑的具体方式不作限定。例如,用户A可以在电子设备100显示的视频流上添加注释,电子设备100可以将该视频流和注释内容一起作为分享数据发送给电子设备200显示,方便分享用户和被分享用户之间的沟通。本申请对共享的内容不作限定。
不限于上述示例的情况（将更新后的游戏、文档、图画等内容发送给实时共享的其他设备，以用于其他设备更新输出的内容），在另一些示例中，还可以将用户操作事件（例如触摸操作事件）和相关的信息（例如触摸操作的发生时间）发送给实时共享的其他设备，以使其他设备将该用户操作事件作为本设备的输入事件，其他设备可以响应该用户操作事件，可以理解为是“远程控制”的实时共享场景。
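“远程控制”场景下用户操作事件的序列化、发送与注入可以参考下面的示意性代码草图（假设性实现，消息字段、send与inject接口均为示例）：

```python
# 示意性草图（假设性实现）：将用户操作事件及其发生时间序列化后发送给
# 实时共享的其他设备，其他设备将其作为本设备的输入事件注入并响应。
import json
import time

def make_touch_event(x, y, action):
    """构造一个触摸操作事件，携带发生时间等相关信息。"""
    return {"type": "touch", "action": action,   # 例如 down/move/up
            "x": x, "y": y, "timestamp_ms": int(time.time() * 1000)}

def send_event(event, send):           # send 为底层传输接口（示例参数）
    send(json.dumps(event).encode("utf-8"))

def on_remote_event(payload, inject):  # inject 为本设备的事件注入接口（示例参数）
    event = json.loads(payload.decode("utf-8"))
    inject(event)                      # 作为本设备输入事件响应，实现"远程控制"

if __name__ == "__main__":
    buf = []
    send_event(make_touch_event(120, 300, "down"), buf.append)
    on_remote_event(buf[0], lambda e: print("注入事件:", e))
```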
以上示例中,不仅可以实现一起看、一起听的实时共享,还可以实现一起玩(游戏)、一起编辑(文档)、远程控制的实时共享,大大拓宽了使用场景,满足用户的多样化需求,用户体验感更好。
不限于上述示例的电子设备,在另一些示例中,分享设备和被分享设备也可以为配置有可折叠的显示屏(可以称为折叠屏)的电子设备(可简称为可折叠电子设备),例如,图20D所示的用户界面2040中的图层2041B和图层2041C可以分别在可折叠电子设备的两个显示屏上显示,例如,图21E所示的用户界面2150中的播放窗口2151和播放窗口2152可以分别在可折叠电子设备的两个显示屏上显示。
基于以上实施例介绍本申请涉及的共享方法。该方法可以应用于图1A所示的共享系统10。该方法可以应用于图1B所示的共享系统10。该方法可以应用于图1C所示的共享系统10。该方法可以应用于图2E所示的共享系统10。
请参见图25,图25是本申请实施例提供的一种共享方法的流程示意图。
分享设备可以但不限于执行如下步骤:
S11:分享设备显示分享入口。
在一种实施方式中,分享设备可以响应于针对分享入口的用户操作,进行实时分享过程,具体可参见S12-S17的说明,该用户操作可以理解为是用于触发实时分享功能/实时分享过程的用户操作。
接下来示出一些分享入口以及用于触发实时分享功能的用户操作的示例。
在一些示例中,分享入口为图3的(A)所示的通话界面310或者图4A所示的用户界面410中的悬浮窗312包括的分享选项312D,用于触发实时分享功能的用户操作例如为针对分享选项312D的触摸操作(例如点击操作)。
在一些示例中,分享入口为图6A所示的短视频应用的用户界面610,用于触发实时分享功能的用户操作例如为针对用户界面610的触摸操作,该触摸操作例如为单指滑动、多指滑动或指关节滑动(例如图6A所示的指关节按照“W”的特定轨迹滑动)等滑动操作。
在一些示例中,分享入口为图7A所示的多任务列表/多任务窗口的用户界面710中的分享控件712B,用于触发实时分享功能的用户操作例如为针对分享控件712B的触摸操作(例如点击操作)。
在一些示例中,分享入口为图8A所示的用户界面810中的即时分享的控件814B或者控件814D,用于触发实时分享功能的用户操作例如为针对控件814B或者控件814D的触摸操作(例如点击操作)。
S12:分享设备选择目标分享内容。
在一种实施方式中,分享设备可以按照预设规则确定目标分享内容,可选地,分享设备可以根据分享入口确定目标分享内容为:和分享入口相关的应用程序的多媒体数据流。
在一些示例中,分享设备接收到针对图4A所示的用户界面410中的悬浮窗312包括的分享选项312D的触摸操作时,由于用户界面410是短视频应用的用户界面,因此,分享设备可以确定目标分享内容为短视频应用的多媒体数据流。
在一些示例中,分享设备接收到针对图6A所示的用户界面610的触摸操作时,由于用户界面610是短视频应用的用户界面,因此,分享设备可以确定目标分享内容为短视频应用的多媒体数据流。
在一些示例中,分享设备接收到针对图7A所示的用户界面710中的分享控件712B时,由于分享控件712B是用户界面710中和短视频应用的窗口712相关的控件,因此,分享设备可以确定目标分享内容为短视频应用的多媒体数据流。
在另一种实施方式中，分享设备可以响应于用户操作确定目标分享内容，可选地，分享设备接收用于触发实时分享功能的用户操作后，可以显示分享内容的选择界面，分享设备可以响应于针对该选择界面中的任意一个分享内容的用户操作，确定该分享内容为目标分享内容。
在一些示例中,图6B所示的用户界面620为分享内容的选择界面,用户界面620中的列表621示出了多个可选择的分享内容的选项,这多个分享内容可以分别为前台应用(如短视频应用)的多媒体数据流、电子设备100(分享设备)的屏幕的显示内容、后台应用(如视频应用)的多媒体数据流。
在一些示例中,图12C所示的用户界面1230为分享内容的选择界面,用户界面1230中的列表1231示出了多个可选择的分享内容的选项,这多个分享内容可以分别为前台应用(如短视频应用)的多媒体数据流、后台应用(如视频应用)的多媒体数据流、电子设备100(分享设备)上未运行的应用(如音乐应用)的多媒体数据流。
S13:分享设备选择目标分享对象(即被分享设备)。
在一种实施方式中,分享设备选择目标分享对象之前,可以先发现可选择/可实时分享的设备/对象,再从发现的设备/对象中选择出目标分享对象。其中,分享设备例如但不限于通过蜂窝通信技术、近场通信技术、卫星通信技术、D2D等通信技术发现可选择/可实时分享的设备/对象。
在一种实施方式中,分享设备可以按照预设规则确定目标分享对象,可选地,分享设备可以根据分享入口确定目标分享对象为:和分享入口相关的设备。
在一些示例中,电子设备100(分享设备)接收到针对图4A所示的用户界面410中的悬浮窗312包括的分享选项312D的触摸操作时,由于悬浮窗312是和NewTalk相关的控件(具体可参见图3的(A)的说明),同时电子设备100当前和电子设备200进行NewTalk,因此,电子设备100可以确定目标分享对象为通话对方:电子设备200。
在另一种实施方式中,分享设备可以响应于用户操作确定目标分享对象,可选地,分享设备接收用于触发实时分享功能的用户操作后,可以显示分享对象的选择界面,该选择界面可以包括发现的可选择/可实时分享的设备/对象。分享设备可以响应于针对该选择界面中的任意一个分享对象的用户操作,确定该分享对象为目标分享对象。
在一些示例中,图6B所示的用户界面620为分享对象的选择界面,用户界面620中的列表622示出了多个可选择的分享对象的选项,这多个分享对象可以包括一个通话对方和至少一个附近设备。
在一些示例中,图11A所示的用户界面1110为分享对象的选择界面,用户界面1110中的列表1111示出了多个可选择的分享对象的选项,这多个分享对象可以包括多个通话对方和至少一个附近设备。
在一些示例中,图11B所示的用户界面1120为分享内容的选择界面,用户界面1120中的列表1121示出了多个可选择的分享对象的选项,这多个分享对象可以包括至少一个最近联系人和至少一个附近设备。
在一些示例中,图11C所示的用户界面1130为分享内容的选择界面,用户界面1130中的列表1131示出了多个可选择的分享对象的选项,这多个分享对象可以包括联系人和至少一个附近设备,联系人的具体示例可参见图11D。
其中,S12和S13的顺序不作限定,例如可以是同时执行的。
在一种实施方式中,分享设备接收用于触发实时分享功能的用户操作后,可以先显示实时分享方式的选择界面。分享设备可以响应于针对该选择界面中的任意一个实时分享方式的用户操作,显示分享内容和/或分享对象的选择界面(其中显示的分享内容和/或分享对象与该实时分享方式相关)。在一些示例中,图12A所示的用户界面1210为实时分享方式的选择界面。分享设备可以响应于针对用户界面1210中的一起看的选项1211A的用户操作,显示图12B所示的用户界面1220,用户界面1220中的列表1221示出了多个可被观看的分享内容的选项,用户界面1220中的列表1222示出了多个可显示图像的设备的选项。分享设备可以响应于针对用户界面1210中的一起听的选项1211B的用户操作,显示图12C所示的用户界面1230,用户界面1230中的列表1231示出了多个可被收听的分享内容的选项,用户界面1230中的列表1232示出了多个可播放音频的设备的选项。
在另一种实施方式中,分享设备可以根据接收到的用于触发实时分享功能的用户操作确定实时分享方式,然后显示分享内容和/或分享对象的选择界面(其中显示的分享内容和/或分享对象与该实时分享方式相关)。在一些示例中,用于触发实时分享功能的用户操作为针对图6A所示的用户界面610的第一滑动操作(例如图6A所示的指关节按照“W”的特定轨迹滑动)时,分享内容的选择界面为图12B所示的用户界面1220。用于触发实时分享功能的用户操作为针对用户界面610的第二滑动操作(例如图12D所示的指关节按照“L”的特定轨迹滑动)时,分享内容的选择界面为图12C所示的用户界面1230。
S14:分享设备选择目标通信链路。
在一些示例中，目标通信链路可以但不限于包括图2E所示的链路1-链路6、V2X链路中的一个或多个链路。在一种实施方式中，分享设备可以按照预设规则确定目标通信链路。
在一些示例中,分享设备可以根据目标分享对象确定目标通信链路。例如,电子设备100(分享设备)确定的目标分享对象为通话对方:电子设备200时,目标通信链路可以是和电子设备100和电子设备200之间已建立的通话链路相关的链路,例如NewTalk链路或辅助链路。
在另一些示例中,分享设备可以根据分享入口确定目标通信链路为:和分享入口相关的设备。例如,电子设备100(分享设备)接收到针对图8A所示的用户界面810中的即时分享的控件814D的触摸操作时,目标通信链路可以为和即时分享功能相关的链路,例如Wi-Fi链路或BT链路。
在另一种实施方式中,分享设备可以响应于用户操作确定目标通信链路,在一些示例中,分享设备接收用于触发实时分享功能的用户操作后,可以显示图13所示的用户界面1310,用户界面1310可以包括分享给联系人的选项1311A、分享给Wi-Fi设备的选项1311B和分享给蓝牙设备的选项1311C。选项1311A对应的目标通信链路例如为NewTalk链路或辅助链路,选项1311B对应的目标通信链路例如为Wi-Fi链路,选项1311C对应的目标通信链路例如为蓝牙链路。
其中,S14和S11-S13中任一项的顺序不作限定,例如,S13和S14可以是同时执行的。
S15:分享设备和被分享设备建立目标通信链路。
其中,S15和S11-S13中任一项的顺序不作限定,例如,S15在S11之前已经执行。
在一些示例中,目标通信链路为远场形式的Wi-Fi链路,例如,处于不同局域网的分享设备和被分享设备可以建立远场形式的Wi-Fi链路。在另一些示例中,目标通信链路为近场形式的Wi-Fi链路,例如,连接了同一个Wi-Fi信号源的分享设备和被分享设备(此时处于同一个局域网)可以建立近场形式的Wi-Fi链路。
S16:分享设备抓取分享数据。
在一种实施方式中,分享设备可以抓取和目标分享内容相关的分享数据。在一些示例中,目标分享内容为应用1的多媒体数据流时,分享设备可以抓取应用1的图层等内容,以生成应用1的图像和/或音频等多媒体数据流(分享数据)。在一些示例中,目标分享内容为分享设备的屏幕的显示内容和/或相关的音频数据时,分享设备可以抓取分享设备显示的图层等内容,以生成系统的图像和/或音频等多媒体数据流(分享数据)。
不限于上述示例的情况，在另一些示例中，目标分享内容也可以不是分享设备在前台或后台输出的数据，而是分享设备未输出的数据，例如，分享设备可以通过3G/4G/5G/6G广播信道接收基站发送的频道的广播数据，不输出该广播数据，但将该广播数据作为分享数据以用于进行实时分享。
在另一些示例中,分享设备也可以不是抓取本设备的应用级和/或系统级的多媒体数据作为分享数据,而是生成和目标分享内容相关的分享数据发送给被分享设备。例如,假设目标分享内容的类型为游戏,则分享设备可以生成游戏类型的分享数据发送给被分享设备。
在另一些示例中，分享设备也可以抓取接收到的用户操作事件和相关信息（例如发生时间）。例如，分享设备可以通过系统提供的接口（例如，该接口用于提供给应用程序集成和调用）抓取用户操作事件和相关信息，该接口例如但不限于包括以下至少一项：发现（Discovery）接口（例如用于发现成员（Member））、链路管理（Link Manager，LinkMgr）接口、传输（Transmit）接口（例如用于发送（send）和/或接收（receive，recv））。
也就是说,本申请对分享数据的具体内容不作限定。
在一种实施方式中,分享设备抓取分享数据后,可以对分享数据进行编码、封包和分流等处理,处理后的分享数据可以用于发送给被分享设备,即用于执行S17。
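上述编码、封包和分流处理可以参考下面的示意性代码草图（假设性实现，其中MTU取值、轮询分流策略均为示例，实际编码可为H.265、Opus等）：

```python
# 示意性草图（假设性实现）：分享设备抓取分享数据后进行编码、封包和分流的
# 简化流程。
from typing import List

MTU = 1300  # 每个数据包的最大载荷（示例值）

def encode(raw: bytes) -> bytes:
    return raw  # 实际可为 H.265/Opus 等编码，此处从略

def packetize(stream: bytes, seq_start=0) -> List[dict]:
    """将编码后的数据流切分封装为带序号的数据包。"""
    return [{"seq": seq_start + i, "payload": stream[off:off + MTU]}
            for i, off in enumerate(range(0, len(stream), MTU))]

def split_to_links(packets, links):
    """分流：按简单轮询把数据包分配到多条通信链路（示例策略）。"""
    plan = {link: [] for link in links}
    for i, pkt in enumerate(packets):
        plan[links[i % len(links)]].append(pkt)
    return plan

if __name__ == "__main__":
    data = encode(b"\x00" * 4000)
    plan = split_to_links(packetize(data), ["NewTalk链路", "辅助链路"])
    print({k: [p["seq"] for p in v] for k, v in plan.items()})
```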
S17:分享设备向被分享设备发送分享数据。
在一种实施方式中,分享设备可以通过目标通信链路向被分享设备发送分享数据。可以理解地,由于分享设备和被分享设备之间是进行实时分享,分享数据实际为数据流,因此,分享设备可以在实时分享期间持续向被分享设备发送分享数据流(例如音频流/视频流)。
不限于上述示例的情况,在另一些示例中,分享设备还可以抓取分享设备的屏幕上的任意区域相关的多媒体数据,并发送给被分享设备。可选地,分享设备可以响应于用户操作,确定待分享的区域,具体示例可参见图17A-图17I、图18A-图18D、图19A-图19G。在另一些示例中,分享设备还可以抓取分享设备的屏幕上的任意图层相关的多媒体数据,并发送给被分享设备。可选地,分享设备可以响应于用户操作,确定待分享的图层,具体示例可参见图20A-图20D。
被分享设备可以但不限于执行如下步骤:
S21:被分享设备接收分享请求。
在一种实施方式中,被分享设备可以持续监听是否接收到分享请求。
在一种实施方式中,被分享设备接收到分享请求后,可以按照预设规则接受该分享请求,例如,分享设备为正在通信、通信过或者已发现的设备时,被分享设备可以默认接受该分享请求。在另一种实施方式中,被分享设备也可以响应于用户操作,接受该分享请求,例如,电子设备200(被分享设备)接收到电子设备100(分享设备)发送的分享请求后,可以显示图5A所示的用户界面510中的提示信息511,电子设备200可以响应于针对提示信息511中的接受控件511B的触摸操作(例如点击操作),接受该分享请求。被分享设备接受分享设备发送的分享请求后,可以和分享设备建立目标通信链路。
其中,S21和上述S11-S16中任一项的顺序不作限定,例如,电子设备100(分享设备)接收到针对图6B所示的用户界面620中的选项622A的触摸操作(例如点击操作)之后,可以确定目标分享对象为选项622A指示的电子设备200(被分享设备)(即执行S13),并向电子设备200发送分享请求,电子设备200可以接收该分享请求(即执行S21)。
S22:被分享设备和分享设备建立目标通信链路。
在一种实施方式中,S22和上述S15同时执行。
其中,S22和S21的顺序不作限定。
S23:被分享设备接收分享设备发送的分享数据。
在一种实施方式中,上述S17之后,被分享设备可以执行S23。
在一种实施方式中,被分享设备接收到分享数据后,可以对分享数据进行聚合、解包和解码等处理,处理后的分享数据可以用于输出给用户,即用于执行S24。
S24:被分享设备输出分享数据。
在一些示例中,被分享设备可以通过显示屏显示分享数据中的图像和/或通过扬声器播放分享数据中的音频,具体示例可参见图5B、图14A、图14B和图14C,本申请对被分享设备输出分享数据的方式不作限定。
不限于上述示例的情况,在另一些示例中,分享设备也可以向和被分享设备连接的其他设备发送针对该被分享设备的分享请求,上述其他设备接收到该分享请求后,可以输出提示信息,用户可以通过上述其他设备接受或拒绝针对被分享设备的分享请求,具体示例可参见图14D。
不限于上述示例的情况,在另一些示例中,分享设备和被分享设备可以不直接建立通信链路,而是通过第三方设备“中转”建立通信链路,并通过第三方设备“中转”传输分享数据,具体示例可参见图14D。
在一种实施方式中,S24之后,进行实时共享的任意一个设备显示分享数据时可以接收用户操作,响应于该用户操作对分享数据进行处理,例如将某一内容设置为编辑状态、更新内容等。该设备可以将处理信息(例如编辑的位置、更新后的内容、和更新后的内容相关的信息)发送给进行实时共享的其他设备,以用于其他设备更新本设备显示的分享数据。
在一些示例中,在图23B所示的“一起玩”的实时共享场景中,分享数据为游戏内容。用户A使用电子设备100在用户界面2330中的输入框2312C输入字符“38”后,电子设备100确定输入框2312C中的内容(即字符“38”)为用户界面2330中的题目信息2312B对应的答案后,可以显示提示信息2331,并且可以向电子设备200发送指示当前游戏失败的信息(可以理解为是和更新后的内容相关的信息),电子设备200接收到该信息后可以显示用户界面2340中的提示信息2341。
在一些示例中,在图23C所示的“一起玩”的实时共享场景中,分享数据为游戏内容。电子设备100可以响应于用户操作更新用户界面2350中的窗口2351A和游戏得分2351C,并将更新后的内容发送给电子设备200,电子设备200可以根据更新后的窗口2351A显示用户界面2360中的窗口2361B,根据更新后的游戏得分2351C显示用户界面2360中的游戏得分2361C。
在一些示例中,在图24A所示的“一起编辑”的实时共享场景中,分享数据为word格式的文档1。用户A使用电子设备100编辑用户界面2410中的文本2411B(“文本1”)时,电子设备100可以向电子设备200发送当前正在编辑的位置:文本2411B(由于当前未修改文本2411B包括的字符,因此可以不发送更新后的内容),因此,电子设备200可以在用户界面2420中的文本2411B所在的区域显示编辑标识2421A和提示信息2421B。不限于此,分享数据也可以为其他格式的文档,具体示例可参见图24B和图24C。
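上述将处理信息（例如编辑的位置、更新后的内容）发送给对端并更新显示的逻辑，可以参考下面的示意性代码草图（假设性实现，消息字段均为示例）：

```python
# 示意性草图（假设性实现）："一起编辑"时向对端发送处理信息
# （编辑位置、更新后的内容等）的消息格式与应用逻辑。
def make_edit_message(user, position, new_content=None):
    """new_content 为 None 时仅通告编辑位置（对端显示编辑标识和提示信息）。"""
    msg = {"user": user, "position": position}
    if new_content is not None:
        msg["content"] = new_content
    return msg

def apply_edit_message(doc: dict, msg: dict):
    pos = msg["position"]
    if "content" in msg:
        doc[pos] = msg["content"]            # 更新本设备显示的分享数据
    return f'{msg["user"]}同步编辑中: {pos}'  # 对应界面上的提示信息

if __name__ == "__main__":
    doc = {"文本1": "文本1"}
    hint = apply_edit_message(doc, make_edit_message("用户A", "文本1"))
    print(hint)                               # 用户A同步编辑中: 文本1
    apply_edit_message(doc, make_edit_message("用户A", "文本1", "文本1包括"))
    print(doc)                                # {'文本1': '文本1包括'}
```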
不限于上述示例的情况，在另一些示例中，也可以不是由分享设备提供分享数据，而是由服务器等网络设备提供分享数据，此时分享设备可以理解为是发起实时共享的设备，但不是提供分享数据的设备。例如，分享设备可以向网络设备发送共享请求，网络设备基于共享请求向被分享设备发送分享数据，其中，该网络设备例如为分享数据对应的应用的应用服务器。在一些示例中，网络设备还可以向分享设备发送分享数据，网络设备向分享设备发送的分享数据和向被分享设备发送的分享数据可以相同，也可以不同，例如，图23A-图23B所示的“一起玩”的实时共享场景下，服务器可以分别向电子设备100和电子设备200发送不同的题目信息，电子设备100显示的用户界面2310所示的游戏窗口2312和电子设备200显示的用户界面2320所示的游戏窗口2321不同（其中的题目信息不同）。服务器还可以作为审核设备验证电子设备100或电子设备200发送的答案是否正确。本申请对提供分享数据的设备不作限定。
在一种实施方式中,分享设备可以管理被分享设备,例如取消向某一设备实时分享(也可称为是删除该设备),具体示例可参见图10A-图10B。
在一种实施方式中,分享设备可以改变分享内容,具体示例可参见图10A-图10B。
在一种实施方式中,分享设备可以设置被分享设备基于分享内容的相关权限,例如但不限于保存的权限和转发的权限,具体示例可参见图15A-图15D、图16A-图16E。
在一种实施方式中,分享设备向被分享设备实时分享第一内容时,被分享设备也可以向分享设备实时分享第二内容,也就是说可以实现双向分享,被分享设备向分享设备实时分享的说明和上述分享设备向被分享设备实时分享的说明类似,不再赘述,具体示例可参见图21A-图21E。
在一种实施方式中,电子设备可以默认开启以上实施方式中的实时分享功能。在另一种实施方式中,电子设备可以响应于用户操作,开启以上实施方式中的实时分享功能,具体示例可参见图22A-图22E。
图25以分享设备和一个被分享设备进行实时共享为例进行示意,在另一些示例中,分享设备可以和多个被分享设备进行实时共享,分享设备和这多个被分享设备中的任意一个被分享设备进行实时共享的说明可参见图25的说明。
本申请提供的共享方法的应用示例可参见图3、图4A-图4C、图5A-图5D、图6A-图6D、图7A-图7C、图8A-图8C、图9A-图9C、图10A-图10B、图11A-图11D、图12A-图12D、图13、图14A-图14D、图15A-图15D、图16A-图16E、图17A-图17I、图18A-图18D、图19A-图19G、图20A-图20D、图21A-图21E、图22A-图22E、图23A-图23C、图24A-图24C所示的实施方式。
本申请可以通过一次针对分享入口的用户操作,让分享设备和一个或多个通话对方、附近设备等被分享设备,实现一起看、一起听、一起玩和一起编辑等实时共享、协同的功能,提供了更加简洁和方便的用户体验操作序列,解决了运营商通话和近场通信的场景下无法实时共享的问题,无需安装聊天应用或会议应用、待共享的应用,也无需适配待共享的应用,大大拓宽了应用场景,让用户可以做到快捷地分享任意应用、任意区域、任意图层的多媒体数据流,有效满足用户的需求,提升用户体验。并且,实时分享可以减少二次传播的可能性,提升对用户的隐私安全的保护。
在一种实施方式中,分享设备可以将摄像头采集的第一图像/视频和实时分享的第二图像/视频(可以是应用级和/或系统级的图像/视频)发送给被分享设备一起显示/播放,让被分享用户可以同时观看实时分享的内容和对方所处的实际场景,满足用户的个性化需求。
在一种实施方式中,分享设备可以将麦克风采集的第一音频和实时分享的第二音频(可以是应用级/系统级/背景的音频)发送给被分享设备一起播放,即实现混音播放,让被分享用户可以同时收听实时分享的音频和对方的声音,满足用户的个性化需求。第一音频和第二音频的传输方式可以但不限于包括以下三种:
方式一:如图26A所示,在分享设备侧,分享设备通过麦克风采集第一音频后,可以对采集的第一音频进行3A处理并得到处理后的第一音频,其中,3A处理可以包括回波抵消(acoustic echo cancellation,AEC)、背景噪声抑制(adaptive noise suppression,ANS)和自动增益控制(automatic gain control,AGC)。分享设备还可以获取分享的第二音频(例如抓取生成第二音频)。本申请对分享设备获取处理后的第一音频和获取第二音频的顺序不作限定。分享设备可以将处理后的第一音频和获取的第二音频混合,并对混合后的音频进行统一编码(可简称为混合编码),以得到第三音频。分享设备可以将第三音频发送至被分享设备。在被分享设备侧,被分享设备可以不分离第三音频,直接对第三音频进行解码和播放。
方式二:如图26B所示,在分享设备侧的处理方式和方式一一致,区别在于被分享设备侧,被分享设备可以对第三音频进行分离和解码以得到第一音频和第二音频,被分享设备可以对第一音频进行3A处理。被分享设备可以同时播放3A处理后的第一音频和第二音频。
方式三:如图26C所示,在分享设备侧的处理方式和方式一类似,区别在于,分享设备不会将处理后的第一音频和获取的第二音频进行混合编码,而是分别编码,并且,分别编码后的第一音频和第二音频可 以通过不同的链路传输至被分享设备。在被分享设备侧,被分享设备可以分别对接收到的第一音频和第二音频进行解码,被分享设备可以对解码后的第一音频进行3A处理,被分享设备可以同时播放3A处理后的第一音频和解码后的第二音频。
不限于上述方式三示例的情况,在另一些示例中,分别编码后的第一音频和第二音频也可以通过同一个链路传输至被分享设备。
在一种实施方式中,被分享设备可以对接收到的第一音频和第二音频(例如是对第一音频和第二音频进行混合编码得到的第三音频)进行统一降噪,在另一种实施方式中,被分享设备也可以仅对接收到的第一音频进行降噪,不对第二音频进行降噪,本申请对降噪的具体方式不作限定。
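针对上述方式一中分享设备侧“3A处理+混合+统一编码”的流程，下面给出一个示意性的代码草图（假设性实现，其中以简化的自动增益控制代替完整的3A处理，采样格式、阈值等均为示例）：

```python
# 示意性草图（假设性实现）：方式一在分享设备侧的处理，采样数据以
# 16bit PCM 的整数列表表示。
def agc(frame, target_peak=20000):
    """简化的自动增益控制（AGC），仅作示意，非完整的3A处理。"""
    peak = max((abs(s) for s in frame), default=0)
    if peak == 0:
        return frame
    gain = min(target_peak / peak, 4.0)
    return [int(s * gain) for s in frame]

def mix(frame_a, frame_b):
    """逐样本混合两路音频并做饱和截断。"""
    return [max(-32768, min(32767, a + b)) for a, b in zip(frame_a, frame_b)]

def encode_mixed(frame):
    return bytes()  # 实际可为 Opus 等统一编码（混合编码），此处从略

if __name__ == "__main__":
    mic = agc([1000, -2000, 1500])       # 第一音频：麦克风采集并处理
    shared = [500, 500, -500]            # 第二音频：应用级/系统级音频
    third = mix(mic, shared)             # 混合得到第三音频
    payload = encode_mixed(third)        # 统一编码后发送给被分享设备
    print(third, len(payload))
```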
接下来示例性说明通过NewTalk(例如运营商通话或OTT通话等通话)进行实时共享的实现方式。
请参见图27,图27示例性示出又一种共享系统10的架构示意图。在一些示例中,图27所示的共享系统10可以应用于通过NewTalk进行实时共享的场景。以下示例以实时共享音频为例进行说明。
图27所示的部分模块的功能和可能实现可参见前述实施例中的电子设备的软件架构的描述,例如图2E所示的共享系统10的说明。
如图27所示,共享系统10中的电子设备100和电子设备200之间可以通过NewTalk进行一起看、一起听、一起玩和一起编辑等实时共享。不限于上述示例的单播场景,在另一些示例中,还可以有更多设备进行实时共享,即可以应用于组播或广播场景,本申请对此不作限定。
以电子设备100为例说明电子设备的软件系统架构,电子设备200的说明类似。
在一种实施方式中,如图27所示,电子设备100的应用系统可以分为三层,从上至下分别为应用程序框架层、硬件抽象层和内核层。应用程序框架层包括分享模块、NewTalk功能模块、通信管理模块、音频框架模块和多路径传输管理模块。硬件抽象层包括无线接口层(radio interface layer,RIL)、音频抽象模块、通信地图和辅助链路模块。内核层包括移动接口模块和音频核心模块。其中:
通信管理模块用于管理NewTalk的接听、挂断等功能。不限于此,在一些示例中,通信管理模块还可以用于管理短信息、网络通话的相关功能,本申请对此不作限定。在一些示例中,NewTalk功能模块可以通过通信管理模块和RIL进行交互,以实现电子设备100和电子设备200之间的NewTalk。RIL是用于和无线通信系统进行连接/交互的接口层。通信管理模块可以和RIL进行交互,例如,通信管理模块可以通过内核层中的NewTalk服务模块和RIL进行交互。在一些示例中,RIL可以通过移动接口模块和电子设备100的无线通信系统中的蜂窝通信系统进行交互。移动接口模块例如包括移动调制解调器(mobile station modem,MSM)接口(interface),以及用于管理注意命令(attention command,AT)的模块,其中,注意命令(AT)指令集可以是终端设备(terminal equipment,TE)或者数据终端设备(data terminal equipment,DTE)向终端适配器(terminal adapter,TA)或数据电路终端设备(data circuit terminal equipment,DCE)发送的,TE或DTE可以通过发送注意命令(AT)来控制移动台(mobile station,MS)的功能,从而和网络业务进行交互。
音频框架模块、音频抽象模块和音频核心模块分别在应用程序框架层、硬件抽象层和内核层负责管理音频功能。在一些示例中,音频框架模块可以通过音频抽象模块和音频核心模块进行交互,音频核心模块可以和无线通信系统中的数字信号处理模块进行交互,以实现音频的处理过程。其中,音频框架模块也可称为是音频框架(Audio Framework),音频抽象模块也可称为是音频硬件层(Audio Hardware Layer,Audio HAL)。音频核心模块可以是高级声音架构(advanced Linux sound architecture,ALSA)和/或片上ALSA系统(ALSA system on chip,ASoC)的核心(CORE)层,其中,ALSA可以提供音频和音乐设备数字化接口(musical instrument digital interface,MIDI)的支持。ASoC可以是建立在ALSA之上,为了更好支持嵌入式系统和应用于移动设备的音频编解码(codec)的软件系统,ASoC可以依赖于标准ALSA驱动的框架。ALSA CORE可以向上提供逻辑设备系统调用,向下驱动硬件设备,逻辑设备例如但不限于包括PCM设备、控制(control,CTL)设备、MIDI设备和定时器(Timer)设备等,硬件设备例如但不限于包括机械(Machine)设备、I2S设备、直接存储器访问(direct memory access,DMA)设备和编解码(codec)设备等。无线通信系统中的数字信号处理模块例如为音频数字信号处理系统(audio digital signal processing,ADSP)(例如用于进行音频解码),数字信号处理模块例如包括PCM模块。
多路径传输管理模块可以负责通过多条不同的路径建立连接和传输数据(例如称为是四网+),以及用于负责基于多条路径高效地传输数据(例如称为是华为公有云网络平面(Huawei Open Network,HON),其中,HON可以融入云服务未来极简网络,整合端、管、云协同优势,构建最优网络通信体验)。
通信地图可以包括通用的通信地图,可选地以及个性化的通信地图。通信地图可以用于进行预测建链,例如但不限于包括预测是否建立通信链路、建立通信链路的时间、建立的通信链路的类型、建立通信链路的位置等。
在一些示例中,电子设备100作为分享设备向通话对方(即电子设备200)实时分享系统级/应用级/背景的音频流时,NewTalk功能模块可以通过音频框架模块、音频抽象模块、音频核心模块和数字信号处理模块处理实时分享的音频流。在一些示例中,处理后的实时分享的音频流可以通过数字信号处理模块发送至蜂窝通信模块,蜂窝通信模块可以同时传输NewTalk的通话数据流和实时分享的音频流至电子设备200。不限于此,在另一些示例中,实时分享的音频流也可以通过无线通信模块中的蓝牙通信模块、卫星通信模块或Wi-Fi通信模块等其他通信模块传输至电子设备200。
在一些示例中,电子设备100作为分享设备向通话对方(即电子设备200)实时分享系统级/应用级/背景的音频流时,NewTalk功能模块可以和辅助链路模块进行交互,以和电子设备200建立辅助链路,辅助链路可以用于传输实时分享的音频流。
网络设备300可以包括认证模块。认证模块用于提供身份信息,该身份信息可以为用户级别的身份信息(例如访问令牌(AT))或者设备级别的身份信息(例如华为证书)。在一些示例中,电子设备100的NewTalk功能模块可以通过网络设备300的认证模块获取电子设备100的身份信息,在一些示例中,网络设备300的认证模块可以为登录有华为账号的电子设备100提供对应的身份信息。不限于此,在一些示例中,认证模块还用于唤醒处于空闲态或休眠态的电子设备。
在一些示例中,电子设备100的NewTalk功能模块可以通过网络设备300的寻址模块实现身份信息(例如上述访问令牌(AT)或华为证书)的认证,认证通过后网络设备300可以生成电子设备100的P2P-TOKEN,P2P-TOKEN可以用于NAT穿越或NAT中继。不限于此,在一些示例中,网络设备300的寻址模块还可以用于通话双方交换各自的SessionID。在一些示例中,网络设备300的寻址模块还可以用于和推送(PUSH)服务器对接,通过PUSH服务器唤醒处于空闲态或休眠态的电子设备。
在一些示例中,唤醒后的电子设备可以和网络设备300连接,并通过网络设备300的认证模块和寻址模块实现身份信息的认证和寻址。
在一种实施方式中,电子设备100和电子设备200进行NewTalk时,可以通过图27所示的NewTalk链路传输NewTalk的数据流(也可称为通话数据流),NewTalk链路可以称为是主链路。NewTalk链路的说明可参见图2E中的链路1的说明。
在一种实施方式中,电子设备100和电子设备200通过NewTalk进行实时共享时,用于传输实时分享的多媒体数据流的通信链路可以是NewTalk链路(主链路),不限于此,在另一种实施方式中,也可以是NewTalk的数据通路(Data channel),在另一种实施方式中,也可以是辅助链路,在一些示例中,辅助链路可以是NAT穿越链路,或者服务器中转链路(例如NAT中继链路),辅助链路的说明可参见图2E中的链路6的说明。
接下来示例性说明通过NewTalk进行实时共享时的发现、建链和传输等过程。
发现:分享设备/分享发起方发现一个或多个候选的被分享设备/分享接收方的行为。
在一种实施方式中,发现可以便于分享设备/分享发起方向上述一个或多个候选的被分享设备/分享接收方中的指定设备发起实时分享过程。在一些示例中,在通过NewTalk进行实时共享的场景下,由于NewTalk建立时已经明确了通话双方(对应两方通话场景)或者多方(对应多方通话场景),因此发现过程在NewTalk建立时已完成。在一些示例中,在两方通话场景下,通话中的任意一方作为分享设备/分享发起方发起实时分享时,通话的另一方即为被分享设备/分享接收方。在另一些示例中,在多方通话场景下,通话中的任意一方作为分享设备/分享发起方发起实时分享时,通话中的其他多方可以为候选的被分享设备/分享接收方。
建链:建立用于传输实时分享的多媒体数据流的通信链路。
在一种实施方式中,基于功耗、资费等多重因素的考虑,建链可以包括但不限于以下三种情况:始终建链、预测建链和按需建链,其中,始终建链是NewTalk开始时就建立有通信链路。预测建链是按照预测内容建立通信链路,例如,按照预测的在时刻A且到达区域A时建立通信链路,预测内容例如是根据通信地图得到的。按需建链是在有数据传输需求时建立通信链路。
在一种实施方式中,建立的用于传输实时分享的多媒体数据流的通信链路可以包括一条或多条通信链路,例如,可以一直维系一条低功耗的通信链路,并按需拉起一条高速稳定的通信链路。
在一种实施方式中,建链的时间可以但不限于为以下任意一种情况:
情况1,在NewTalk开始之后,实时分享之前的任意一个时间点发起建链。例如,图6A-图6C所示的实施方式中,电子设备100和电子设备200进行运营商通话/OTT通话之后,电子设备100响应针对图6B所示的用户界面620中的选项622A的触摸操作之前,电子设备100(分享设备)可以向电子设备200(被分享设备)发起建链,其中,电子设备100可以响应于上述触摸操作,向选项622A指示的电子设备200发起实时分享。
情况2,分享设备选择目标分享对象后发起建链。例如,图6A-图6C所示的实施方式中,电子设备100(分享设备)响应针对图6B所示的用户界面620中的选项622A的触摸操作,确定选项622A指示的电子设备200为目标分享对象之后,电子设备100可以向电子设备200发起建链,并基于建立的链路进行实时分享。
情况3,分享设备选择目标分享对象和选择目标分享内容后发起建链。例如,图6A-图6C所示的实施方式中,电子设备100(分享设备)响应针对图6B所示的用户界面620中的选项622A的触摸操作,确定选项622A指示的电子设备200为目标分享对象,以及确定用户界面620中的选中状态的选项621A指示的短视频应用的音频流/视频流为目标分享内容后,电子设备100可以向电子设备200发起建链,并基于建立的链路进行实时分享。
情况4,建立NewTalk链路时一起建立用于传输实时分享的多媒体数据流的通信链路。例如,用于传输实时分享的多媒体数据流的通信链路包括NewTalk链路。
情况5,建立NewTalk链路之前建立用于传输实时分享的多媒体数据流的通信链路。
情况6,由于通话补包、文件共享、链接共享等通信场景已建立有通信链路,因此可以直接使用已建立的通信链路作为用于传输实时分享的多媒体数据流的通信链路,建链的时间为上述已建立的通信链路的建立时间。
情况7,根据通信地图等信息进行预测建链,建链的时间根据预测结果确定。
在一种实施方式中,建链的方式可以但不限于为以下任意一种:
方式一,复用NewTalk链路(主链路)。在一些示例中,通话数据流和实时分享的多媒体数据流可以共用NewTalk链路(主链路)进行传输。在一些示例中,可以先通过NewTalk链路(主链路)传输通话数据流,然后再传输实时分享的多媒体数据流。在一些示例中,通话数据流和实时分享的多媒体数据流的头部字段可以不同。在一些示例中,NewTalk为基于IMS协议的通话(可称为IMS通话),因此可以在原有的实时传输协议(real-time transport protocol,RTP)报文中扩展增加,例如通话数据流和实时分享的多媒体数据流的RTP头部不同。在一些示例中,方式一下核心网处于透传模式,不会对实时分享的多媒体数据流的报文进行过滤和转码。
方式二,使用NewTalk的Data channel。其中,Data channel是基于IMS专有承载上,相对于通话的信令QCI5、多媒体通路QCI1/QCI2以外的数据传输通道。
方式三,建立辅助链路。在一些示例中,通话数据流可以使用NewTalk链路(主链路)进行传输,实时分享的多媒体可以使用辅助链路进行传输。
在一种实施方式中,通过NewTalk链路(主链路)中传输的报文进行建链协商,以建立辅助链路。在一些示例中,分享设备可以在主链路传输的实时传输控制协议(real-time control protocol,RTCP)报文中携带用于建立辅助链路的信息,以此在通话过程中向通话对方请求建立辅助链路。在一些示例中,分享设备可以在RTCP报文包括的源描述项(source description items,SDES)字段中携带用于建立辅助链路的信息,SDES字段例如用于对发送RTCP报文的源进行描述。在一些示例中,分享设备可以将用于NAT穿透的通信ID(如SessionID)、地址信息(如IP地址)等信息按照文本化编码的方式存储在SDES字段中,该SDES字段例如为终端标志(canonical name,CNAME)。在一些示例中,进行协商后,分享设备可以调用NAT的接口进行穿越或中继,以建立辅助链路。
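将通信ID和地址信息按文本化编码的方式存入SDES字段（如CNAME）的过程，可以参考下面的示意性代码草图（假设性实现，字段布局为简化示例，并非完整的RTCP报文结构）：

```python
# 示意性草图（假设性实现）：将用于NAT穿透的通信ID（如SessionID）和地址
# 信息按文本化编码存入RTCP SDES的CNAME项。
import struct

SDES_CNAME = 1  # SDES项类型：CNAME

def build_sdes_cname(session_id: str, ip: str) -> bytes:
    text = f"sid={session_id};ip={ip}".encode("ascii")  # 文本化编码
    assert len(text) < 256
    return struct.pack("!BB", SDES_CNAME, len(text)) + text

def parse_sdes_cname(data: bytes) -> dict:
    item_type, length = struct.unpack("!BB", data[:2])
    assert item_type == SDES_CNAME
    text = data[2:2 + length].decode("ascii")
    return dict(kv.split("=", 1) for kv in text.split(";"))

if __name__ == "__main__":
    item = build_sdes_cname("session-1234", "203.0.113.5")
    print(parse_sdes_cname(item))  # {'sid': 'session-1234', 'ip': '203.0.113.5'}
```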
不限于上述示例的情况,在另一些示例中,分享设备还可以通过会话初始协议(session initialization protocol,SIP)的消息进行建链协商以建立辅助链路,例如,在建立NewTalk时,分享设备可以在邀请(INVITE)消息中携带通信ID(如SessionID)等信息,以和被分享设备交换各自的通信ID(用于后续建立辅助链路)。或者,在通话过程中,分享设备可以在重邀请(reINVITE)消息或更新(UPDATE)消息中携带通信ID(如SessionID)等信息,以和被分享设备交换各自的通信ID(用于后续建立辅助链路)。本申请对此不作限定。
在另一种实施方式中，也可以不借助NewTalk链路（主链路）建立辅助链路，而是通过网络设备300进行寻址以建立辅助链路。在一些示例中，通话中的任意一方可以在网络设备300上进行参数绑定，可选地，具体为将电话号码、OTT ID等标识信息和通信ID（如SessionID）绑定/设置为关联。在一些示例中，任意一个设备在网络设备300上进行参数绑定后，其他设备可以通过网络设备300，根据该设备的电话号码、OTT ID等标识信息寻址到该设备的通信ID。
请参见图28,图28示例性示出一种辅助链路的建立过程的流程示意图。图28以进行NewTalk的电子设备100和电子设备200通过网络设备300进行寻址以建立辅助链路为例进行说明。该建立过程可以但不限于包括以下步骤:
1.电子设备100将电子设备100的第一标识信息和电子设备100的第一通信ID绑定,注册和/或登录到网络设备300上(可称为是绑定操作)。
在一些示例中,第一标识信息为电话号码或OTT ID等通讯号码。第一通信ID为SessionID。
在一些示例中,绑定操作之前,网络设备300可以对电子设备100进行身份认证,例如验证电子设备100的访问令牌(AT)或华为证书是否符合要求。当电子设备100的身份认证通过时,网络设备300可以生成电子设备100的P2P-TOKEN,P2P-TOKEN例如携带了密钥标识(key id),并且使用了私钥签名。
在一些示例中,电子设备100的身份认证通过后,才可以执行绑定操作。
在一些示例中,绑定操作可以包括:电子设备100向网络设备300发送电子设备100的第一标识信息,网络设备300可以返回电子设备100的第一通信ID。其中,第一标识信息可以包括电子设备100的一个或多个标识信息,例如,第一标识信息包括电话号码1和电话号码2,则电子设备100可以向网络设备300发送经过哈希处理的电话号码1和电话号码2,可以表征为:HASH(电话号码1)+HASH(电话号码2)。
在一些示例中,当电子设备100的标识信息和/或通信ID发生变化,则电子设备100可以执行刷新(Refresh)操作,刷新操作和绑定操作类似,区别在于,绑定的标识信息和通信ID为变化后的标识信息和通信ID。
在一些示例中,电子设备100执行绑定操作后,网络设备300可以存储有电子设备100的第一标识信息,以及和第一标识信息关联的第一通信ID,也可称为是建立有绑定关系。
2.电子设备200将电子设备200的第二标识信息和电子设备200的第二通信ID绑定,注册和/或登录到网络设备300上。
图28的2和图28的1类似,具体可参见图28的1的说明,不再赘述。
3.电子设备100根据电子设备200的第二标识信息向网络设备300获取电子设备200的第二通信ID(可称为是寻址操作)。
在一些示例中,电子设备100已知电子设备200的第二标识信息中的至少一个标识信息的情况下,电子设备100可以向网络设备300发送查询请求,该查询请求用于查询电子设备200的通信ID,该查询请求可以携带电子设备100已知的电子设备200的至少一个标识信息。网络设备300接收到该查询请求后,可以获取和这至少一个标识信息关联的第一通信ID返回给电子设备100。
在一些示例中,图28的3之后,电子设备100可以通过提供的会话结束接口向网络设备300主动释放,解除图28的1实现的绑定关系。在另一些示例中,图28的3之后,网络设备300可以在预设时长(例如10分钟)后自动释放,解除图28的1实现的绑定关系,可称为是超时自动清理绑定关系。
4.电子设备200根据电子设备100的第一标识信息向网络设备300获取电子设备100的第一通信ID。
图28的4和图28的3类似,具体可参见图28的3的说明,不再赘述。
5.电子设备100和电子设备200根据第一通信ID和第二通信ID建立辅助链路。
在一些示例中,电子设备100可以通过电子设备200的第二通信ID和电子设备200完成建链协商,例如但不限于为IP直连、通过NAT穿越或者通过服务器中继(例如NAT中继),从而建立辅助链路。
其中,图28的1和2的顺序不作限定,图28的3和4的顺序不作限定。
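网络设备300侧的绑定、寻址与超时自动清理逻辑可以参考下面的示意性代码草图（假设性实现，哈希方式、超时时长等均为示例）：

```python
# 示意性草图（假设性实现）：将经哈希处理的标识信息（如电话号码）与
# 通信ID（SessionID）建立绑定关系，并支持寻址查询和超时自动清理。
import hashlib
import time

class AddressingService:
    def __init__(self, ttl_seconds=600):      # 例如10分钟后自动释放绑定关系
        self.ttl = ttl_seconds
        self.bindings = {}                    # HASH(标识信息) -> (通信ID, 绑定时间)

    @staticmethod
    def _hash(identity: str) -> str:
        return hashlib.sha256(identity.encode("utf-8")).hexdigest()

    def bind(self, identities, session_id):   # 绑定操作/刷新操作
        for ident in identities:
            self.bindings[self._hash(ident)] = (session_id, time.time())

    def query(self, identity):                # 寻址操作
        entry = self.bindings.get(self._hash(identity))
        if entry is None or time.time() - entry[1] > self.ttl:
            return None                       # 未绑定或已超时清理
        return entry[0]

if __name__ == "__main__":
    svc = AddressingService()
    svc.bind(["电话号码1", "电话号码2"], "SessionID-A")
    print(svc.query("电话号码1"))             # SessionID-A
```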
不限于图28示例的情况,在另一些示例中,在上述3之前,电子设备200也可以不执行2(即不执行绑定操作)。在这种情况下,电子设备100执行寻址操作时,网络设备300无法匹配到和电子设备200的第二标识信息关联的通信ID。在一些示例中,网络设备300可以通过对接的PUSH服务器唤醒电子设备200,例如,唤醒后的电子设备200可以和网络设备300连接,并通过网络设备300(例如其中的认证模块和寻址模块)进行身份认证和寻址(例如为寻址到电子设备100),在另一些示例中,网络设备300可以不唤醒电子设备200,因此可以向电子设备100返回寻址失败的指示信息(例如包括原因“未唤醒”),在另一些示例中,网络设备300无法成功唤醒电子设备200,因此可以向电子设备100返回寻址失败的指示信息(例如包括原因“唤醒失败”)。不限于此,在另一些示例中,在上述4之前,电子设备100也可以不执行绑定操作,具体说明和上述类似,不再赘述。
在另一种实施方式中,也可以不借助NewTalk链路(主链路)建立辅助链路,而是通过周边设备建立辅助链路,其中,该周边设备可以但不限于是通过近场通信方式和分享设备通信的设备,通过蜂窝通信方式、卫星等远场通信方式和分享设备通信的设备,分享设备已知的设备(如存储有该设备的信息)或者分享设备未知的设备(如未存储有该设备的任何信息)。在一些示例中,电子设备100是不具备寻址能力的设备,电子设备100可以通过周边设备建立和电子设备200之间的辅助链路,例如,电子设备100为智能手表(如modem下电)、平板电脑(如没有SIM卡接口)、智能音箱或耳机等设备,电子设备100可以通过和电子设备100连接的智能手机建立和电子设备200之间的辅助链路。
在一种实施方式中,上述方式三下建立的辅助链路是分享设备和被分享设备之间的辅助链路,在另一种实施方式中,上述方式三下建立的辅助链路包括分享设备和中继设备之间的辅助链路1,以及中继设备和被分享设备之间的辅助链路2。在一些示例中,电子设备100是不具备直接建立辅助链路的能力的设备,电子设备100可以通过中继设备和电子设备200进行通信,例如,电子设备100为智能手表(如modem下电)、平板电脑(如没有SIM卡接口)、智能音箱或耳机等设备,电子设备100可以通过和电子设备100连接的智能手机建立和电子设备200之间的辅助链路。
需要说明的是,处于通话态的分享设备可以使用上述方式一、方式二或方式三建链,处于空闲态的分享设备可以使用上述方式二或方式三建链。
接下来示例性介绍通信地图和如何根据通信地图进行预测建链。
在一种实施方式中，通信地图可以包括通用的通信地图，通用的通信地图可以包括多个电子设备的众包数据，例如但不限于包括以下至少一种数据：服务集标识（service set identifier，SSID）（也可称为是Wi-Fi ID）、蜂窝小区ID（CELLID）、信号强度参数（例如参考信号接收功率（reference signal receiving power，RSRP））、信号质量参数（例如参考信号接收质量（reference signal receiving quality，RSRQ））、通话QoE参数（例如丢包率、时延、断续次数等）、链路传输质量参数（例如丢包率、时延、抖动等）、时间周期、GNSS定位的经纬度信息、GNSS定位的绝对位置信息、GNSS定位的室内的相对位置信息、通话对象的信息（例如电话号码等）。
在一些示例中，根据通用的通信地图进行预测建链可以是：根据多个电子设备的众包数据在云端（例如服务器）进行数据分析，以得到通信链路在空间和时间的特征，得到的特征可以用于确定以下至少一项：建链的时间、建链的地点和建立的链路的类型等，建立的链路可以包括物理链路和/或逻辑链路，其中，通过不同通信方式建立的物理链路不同，通过同一种通信方式建立的多个逻辑链路可以不同，例如，通过电子设备的不同端口建立的同一种通信方式的逻辑链路可以不同，例如，通过蜂窝通信方式或Wi-Fi通信方式建立的中继链路和穿越链路可以为不同的逻辑链路。可以理解为是，电子设备可以通过云端判断在某个时间段、某个地点建立某种链路是否稳定，其中，稳定时通信质量较好，不稳定时通信质量较差，通信质量例如但不限于通过丢包率、时延、抖动、带宽等来确定。也可以理解为是，电子设备可以使用其他电子设备的通信情况来指导本设备的建链行为，例如，在时间段1内，处于地点1的其他电子设备建立的蜂窝通信链路的通信质量较差，因此本设备可以不在时间段1内、处于地点1时建立蜂窝通信链路，以此保证通话质量，无需本设备逐个时间逐个地点逐个链路学习，有效节省功耗。
在一种实施方式中，通信地图可以包括个性化的通信地图，个性化的通信地图可以是通过学习使用习惯、操作行为等个人信息，预测后续可能执行的用户操作得到的。在一些示例中，个性化的通信地图可以包括上述多个电子设备的众包数据。在一些示例中，个性化的通信地图可以包括私人数据，例如但不限于包括以下至少一种数据：每个通话对象的亲昵程度（例如通过通话时长、通话时段、通讯录标注的备注/关系、通话时的心情愉悦程度等表征）、和每个通话对象进行实时分享的信息（例如时间、地点、频度等）、通话过程中同时观看/收听视频/音频的情况、通话过程中传输链接和文件等数据的情况、通话过程中用户的操作习惯和行为序列（例如常用按键、触摸方式、触摸位置等）、历史通话的预测建链的准确度等。
在一些示例中,电子设备或云端(例如服务器)可以根据个性化的通信地图标记高频的实时分享对象(可简称为高频对象),例如,将预设周期内(例如一周内)进行实时分享的次数最多的N个对象(N为正整数)标记为高频对象,和/或,将按照实时分享的时间从晚到早排列的前M个对象标记为高频对象。上述标记的高频对象可以用于进行预测建链。
在一些示例中,电子设备或云端(例如服务器)可以根据个性化的通信地图标记亲昵对象,例如,将通讯录中备注/关系为家人、领导、朋友、同事等类型的联系人标记为亲昵对象,和/或,将通话记录中通话较为频繁(例如通话次数较多和/或通话时间较近)的联系人标记为亲昵对象。上述标记的亲昵对象以及和亲昵对象进行实时分享的信息(例如时间、地点、频度等)可以用于进行预测建链。
在一些示例中,电子设备或云端(例如服务器)可以根据个性化的通信地图预测操作行为,例如,根据通话过程中同时观看/收听视频/音频的情况、通话过程中传输链接和文件等数据的情况、通话过程中用户的操作习惯(例如常用按键、触摸方式、触摸位置等)预测用户的操作行为。预测的操作行为可以用于进行预测建链。
在一种实施方式中,预测建链可以用于实现以下至少一个功能:
第一,最优链路的选择。例如,当可以建立多个链路时,可以选择出这多个链路中最优/较优/相对最优的至少一个链路,建立这至少一个链路,不建立这多个链路中的其他链路。
第二,确定建立链路的最优时间。例如,当前时段内可建立的链路都较差时,可以选择之后的某个最优/较优/相对最优的时间点建立链路。
第三,按照预测的用户意图来建立链路。例如,通过个性化的通信地图学习用户的操作行为,从而预测用户后续的操作行为,当预测到用户下一步操作的意图是进行实时共享时,建立链路。
在一种实施方式中,上述示例的通信地图可以是以网格的形式来划片区分,例如图29所示的规格为20米×20米的网格。在一些示例中,通用的通信地图经过数据清洗和分析后,可以得到每个网格中的各种链路的拥塞状态和建链的最佳时期等。在一些示例中,个性化的通信地图在电子设备的本地或者云端过滤掉用户的隐私信息后,可以根据模型预测用户是否有建链的需求,可选地以及需要建链的时间。不限于上述示例的情况,在另一些示例中,也可以是不规则的形状,在另一些示例中,也可以是携带有海拔等三维信息的形式,本申请对通信地图的具体形式不作限定。
图30示例性示出一种预测建链的流程示意图。如图30所示,分享设备可以在发起实时分享之前,记录操作行为和序列,例如,记录通话过程中的操作行为和序列。分享设备可以根据记录的操作行为和序列,以及通信地图进行预测建链。分享设备可以根据预测建链的结果和实际是否发起实时分享的结果判断预测建链是否正确。在一些示例中,当预测是否建链的结果为是,以及实际是否发起实时分享的结果为是时,预测正确。在另一些示例中,当预测是否建链的结果为是,以及实际是否发起实时分享的结果为否时,预测不正确。在另一些示例中,当预测是否建链的结果为否,以及实际是否发起实时分享的结果为是时,预测不正确。在另一些示例中,当预测是否建链的结果为否,以及实际是否发起实时分享的结果为否时,预测正确。在一种实施方式中,分享设备可以记录实际是否发起实时分享的操作行为和序列,以用于后续进行预测建链。在一种实施方式中,分享设备可以记录预测建链是否正确的结果,并将该结果反馈至预测建链的系统,以用于后续进行预测建链。
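结合网格化的通信地图与记录的操作行为序列进行预测建链、并反馈预测是否正确的流程，可以参考下面的示意性代码草图（假设性实现，网格规格、丢包率阈值、操作序列等均为示例）：

```python
# 示意性草图（假设性实现）：结合网格化通信地图与操作行为序列进行预测建链，
# 并根据实际是否发起实时分享反馈预测结果。
GRID = 20  # 通信地图网格规格：20米×20米

def grid_key(x_m, y_m):
    return (int(x_m // GRID), int(y_m // GRID))

# 通用通信地图：每个网格中各种链路的质量（例如丢包率，越低越稳定，示例数据）
comm_map = {grid_key(35, 88): {"蜂窝": 0.02, "Wi-Fi": 0.30}}

def predict_setup(pos, recent_ops,
                  share_hint_ops=("打开悬浮窗", "选择分享对象")):
    """预测是否建链以及建立哪种链路。"""
    intent = any(op in share_hint_ops for op in recent_ops)  # 预测用户意图
    links = comm_map.get(grid_key(*pos), {})
    good = [(loss, name) for name, loss in links.items() if loss < 0.05]
    if intent and good:
        return min(good)[1]      # 最优链路的选择
    return None

def feedback(predicted, actually_shared):
    """记录预测建链是否正确，反馈至预测系统用于后续预测。"""
    return (predicted is not None) == actually_shared

if __name__ == "__main__":
    link = predict_setup((35, 88), ["打开悬浮窗"])
    print(link, feedback(link, actually_shared=True))  # 蜂窝 True
```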
不限于上述示例的情况,在另一些示例中,预测建链可以由云端服务器或其他网络设备执行,减少电子设备的处理压力,节省功耗。
以上示例以分享设备发起建链为例进行说明,在另一些示例中,也可以是被分享设备发起建链,本申请对此不作限定。
传输是在分享设备和至少一个被分享设备之间传输实时分享的数据流。
在一种实施方式中,分享设备和被分享设备之间可以直接点对点的传输数据流,例如图31所示,分享设备可以直接向被分享设备1发送数据流1,分享设备可以直接向被分享设备2发送数据流2。在另一种实施方式中,分享设备和被分享设备之间可以通过中继设备(例如服务器等网络设备)传输数据流,例如图31所示,分享设备可以通过中继设备向被分享设备1发送数据流3,也就是说,数据流3可以经过中继设备中转。数据流3可以经过分享设备和中继设备之间的链路,以及中继设备和被分享设备之间的链路。分享设备可以通过中继设备向被分享设备2发送数据流4,具体说明和上述说明类似,不再赘述。
在一种实施方式中，数据流是分层传输的，例如图32A所示的音频流/视频流的传输架构，从上往下可以依次为：数据层（例如音频数据/视频数据），编码层（例如使用H.265、H.264等音频/视频编码标准，例如使用Opus等声音编码格式），传输层（例如使用RTP、实时流传输协议（real time streaming protocol，RTSP）），网络层（例如使用TCP/IP协议，或者使用用户数据报协议（user datagram protocol，UDP）），物理层（例如使用蜂窝通信/Wi-Fi/BT/D2D/卫星等物理链路的协议）。
在一种实施方式中，基于图32A所示的音频流/视频流的传输架构，传输的数据包的格式例如为图32B所示，该数据包可以包括网络协议头部（例如IP Head）、传输协议头部（例如RTP Head）、编码信息头部（例如H.265 Head/Opus Head）和原始数据（RAW Data）等字段。
在一种实施方式中,当存在多条用于传输实时分享的数据流的通信链路时,分享设备可以按照预设的传输规则对实时分享的数据流进行分流(例如通过图27所示的多路径传输管理模块中的四网+实现),该传输规则可以但不限于包括以下至少一种:
规则1:音频流和视频流分离传输。音频流和视频流分别编码/独立编码,音频流通过链路A传输,视频流通过链路B传输,例如,链路A为低时延和/或低抖动的较为稳定的通信链路,链路B为大带宽和/或低资费或无资费的通信链路。
规则2:音频流和视频流分离传输。应用级/系统级/背景的音频流和通话数据流进行混音编码(具体可参见图26A的分享设备侧的说明),混音编码后的音频流通过链路A传输,视频流通过链路B传输。
规则3:音频流和视频流分离传输。基础的音频流1和基础的视频流1通过链路A传输,丰富的音频流2和丰富的视频流2通过链路B传输。
规则4:音频流和基础的视频流1通过链路A传输,丰富的视频流2通过链路B传输。
其中，数据流为基础的数据流还是丰富的数据流与编码（如分层编码）相关，编码程度较高的数据流可以为丰富的数据流，编码程度较低的数据流可以为基础的数据流，例如，对于同一张图像而言，该图像的缩略图为基础的数据，该图像的原图为丰富的数据。
规则5：音频流和视频流一起传输。相同时间戳的音频流和视频流一起编码，在一些示例中，可以在同一条链路上传输，在另一些示例中，也可以根据链路质量的变化动态迁移到其他链路上，以确保传输的最优效果。
规则6：音频流和/或视频流冗余补包。在一些示例中，补包可以在相同的链路上传输，例如每次携带相邻两帧的编码数据，在另一些示例中，补包可以通过其他至少一条链路传输。在一些示例中，音频流和/或视频流可以部分冗余补包，在另一些示例中，也可以全部冗余补包。
在一种实施方式中,为了确保接收端(被分享设备)接收一组数据包的时间最短,分享设备可以在发送一组数据包时,根据多条通信链路的传输时延和抖动情况进行分流传输(例如通过图27所示的多路径传输管理模块中的HON实现),接收端(被分享设备)可以在接收到分享设备发送的这组数据包的全部后进行组包,具体示例可参见下图33。其中,不同链路之间存在差别,因此不同链路的传输时延和抖动情况可以不同,可选地,物理链路之间存在差别,例如蓝牙链路和Wi-Fi链路的传输时延和抖动情况不同,可选地,逻辑链路之间存在差别,例如,使用不同端口建立的多条Wi-Fi链路的传输时延和抖动情况不同,例如,直接建立的Wi-Fi链路和通过中继设备建立的Wi-Fi链路的传输时延和抖动情况不同。
图33示例性示出一种分流传输的示意图。
如图33所示，分享设备可以包括端口1、端口2和端口3，被分享设备可以包括端口4和端口5，其中，端口1和端口4之间建立有链路1，端口2和端口4之间建立有链路2，端口2和端口5之间建立有链路3，端口3和端口5之间建立有链路4。假设这四条链路按照时延从低到高依次为：链路1、链路2、链路3和链路4，则分享设备在发送数据包1、数据包2和数据包3这一组数据包时，可以依次执行：通过链路4发送数据包3，通过链路3传输数据包2，通过链路2传输数据包2，通过链路1传输数据包1，其中，链路2传输的数据包2可以为补包。因此，被分享设备可以在相近的时刻接收到上述数据包1、数据包2和数据包3，并进行组包，避免部分数据包到达被分享设备的时刻远远晚于其他数据包到达被分享设备的时刻，减少被分享设备接收一组数据包的时间，传输效率更高。
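上述根据多条链路的传输时延安排发送时刻、使一组数据包到达时刻尽量接近的思路，可以参考下面的示意性代码草图（假设性实现，时延数值为示例，且忽略了抖动和发送间隔）：

```python
# 示意性草图（假设性实现）：按各链路的传输时延安排一组数据包的发送时刻，
# 使其到达接收端的时刻尽量接近，便于接收端尽快组包。
def schedule(packets, links):
    """links: {链路名: 时延ms}。时延越高的链路越早发送，
    发送时刻偏移 = 最大时延 - 该链路时延。"""
    assert len(packets) <= len(links)
    ordered = sorted(links.items(), key=lambda kv: -kv[1])  # 按时延从高到低
    max_delay = ordered[0][1]
    plan = []
    for (link, delay), pkt in zip(ordered, packets):
        send_at = max_delay - delay              # 发送偏移（ms）
        plan.append((send_at, link, pkt, send_at + delay))
    return sorted(plan)                          # 按发送时刻排序

if __name__ == "__main__":
    links = {"链路1": 20, "链路2": 40, "链路3": 60, "链路4": 80}
    for send_at, link, pkt, arrive in schedule(
            ["数据包3", "数据包2", "数据包2(补包)", "数据包1"], links):
        print(f"t={send_at}ms 经{link}发送{pkt}，约t={arrive}ms到达")
    # 四个数据包均约在 t=80ms 到达，接收端可在相近时刻组包
```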
接下来示例性说明通过Wi-Fi进行实时共享的实现方式。
请参见图34,图34示例性示出又一种共享系统10的架构示意图。在一些示例中,图34所示的共享系统10可以应用于通过Wi-Fi进行实时共享的场景。
图34所示的部分模块的功能和可能实现可参见前述实施例中的电子设备的软件架构的描述,例如图2E所示的共享系统10的说明。
如图34所示,共享系统10中的电子设备100和电子设备200之间可以通过Wi-Fi进行一起看、一起听、一起玩和一起编辑等实时共享。
以下示例中,以电子设备100为发送实时分享的数据流的分享设备,电子设备200为接收实时分享的数据流的被分享设备为例进行说明,并且,以下示例以组播场景为例进行说明。在组播场景下,电子设备100也可称为是组播发送方(source),电子设备200也可称为是组播接收方(sink),其中,电子设备200为多个组播接收方中的任意一个,不限于此,也可以应用于单播或广播场景,本申请对此不作限定。
在一种实施方式中,如图34所示,电子设备100的软件系统可以分为四层,从上至下分别为应用程序框架层、内核层、固件层和硬件层。应用程序框架层包括分享模块、发现模块和抓取模块。内核层包括传输协议栈、编码模块、组播管理协议、组播控制算法和组播密钥管理。固件层包括组播帧发送和组播帧加密。硬件层包括Wi-Fi的基带和射频。其中:
抓取模块可以用于抓取分享数据,例如抓取应用级/系统级/背景的音频和/或图像,并对抓取的音频和/或图像进行编码,以生成音频/视频的源数据。
编码模块可以用于在发送音频/视频的数据报文之前对该数据报文进行编码(例如喷泉编码),从而提高传输的可靠性,降低空口信道丢包的概率。
组播管理协议可以用于管理Wi-Fi组播组的成员,例如成员的加入和离开。
组播控制算法可以用于动态控制组播报文的聚合调度、调制编码策略（modulation and coding scheme，MCS）的信号调制等级等。
组播密钥管理可以用于管理组播密钥,例如组播密钥的动态生成和分发。
组播帧发送可以用于将音频/视频数据封装为Wi-Fi的组播数据帧(可简称为Wi-Fi组播帧),并通过空口(向电子设备200等Wi-Fi组播组的成员)发送Wi-Fi组播帧。
组播帧加密可以用于基于组播密钥对Wi-Fi组播帧进行加密,可选地,组播帧发送通过空口发送的Wi-Fi组播帧具体为加密后的Wi-Fi组播帧。
Wi-Fi的基带和射频用于发送/接收Wi-Fi组播帧。
在一种实施方式中，如图34所示，电子设备200的软件系统可以分为四层，从上至下分别为应用程序框架层、内核层、固件层和硬件层。应用程序框架层包括分享模块、发现模块和播放模块。内核层包括传输协议栈、解码模块、组播管理协议和组播密钥管理。固件层包括组播帧过滤和组播帧解密。硬件层包括Wi-Fi的基带和射频。具体说明和上述电子设备100的软件系统的说明类似，接下来主要说明电子设备200中和电子设备100的模块不同的模块。
播放模块可以用于对音频/视频数据进行解码,并输出解码后的音频/视频数据。
解码模块可以用于对接收到的音频/视频的数据报文进行解码(例如喷泉解码),以恢复丢失的数据报文。
组播帧过滤可以用于在空口接收到Wi-Fi组播帧后,基于电子设备200已加入的组播组的地址信息进行过滤,将不属于该组播组的Wi-Fi组播帧丢弃,保留属于该组播组的Wi-Fi组播帧。
组播帧解密可以用于在空口接收到Wi-Fi组播帧后,基于组播密钥对接收到的Wi-Fi组播帧进行解密。
接下来示例性说明通过Wi-Fi进行实时共享时的发现、连接、传输和离开等过程。
在一种实施方式中,分享设备(组播发送方)可以作为Wi-Fi组播组的源设备,通过广播消息搜索附近设备以完成设备的发现。在一种实施方式中,分享设备(组播发送方)完成设备的发现之后,可以向被分享设备(组播接收方)发送实时分享的请求,被分享设备接受该请求后可以和分享设备完成组播地址、组播密钥等信息的协商,从而完成连接。
请参见图35,图35示例性示出一种设备发现和连接的流程示意图。
如图35所示,设备的发现可以但不限于包括以下步骤:
1.电子设备100(分享设备/组播发送方)向电子设备200(被分享设备/组播接收方)发送广播消息,以搜索附近设备。其中,广播消息例如但不限于为Wi-Fi广播消息或者蓝牙广播消息。
2.电子设备200响应于接收到的广播消息,向电子设备100发送电子设备200的通讯信息。其中,通讯信息例如但不限于包括电子设备200的ID、MAC地址等信息。
3.电子设备100和电子设备200之间完成设备的认证。
在一些示例中,电子设备100的发现模块和电子设备200的发现模块可以完成设备的发现,例如图35的1-3。
如图35所示,设备连接可以但不限于包括以下步骤:
4.电子设备100向电子设备200发送实时分享的请求。例如,电子设备100响应于以上实施例所述的用于触发实时分享功能的用户操作,向电子设备200发送实时分享的请求。
5.电子设备200接受电子设备100发送的实时分享的请求。例如,电子设备200响应于用户操作,接受该实时分享的请求。
6.电子设备100和电子设备200传输组播地址和协商组播密钥。
在一些示例中,电子设备100的组播管理协议和电子设备200的组播管理协议可以完成连接,例如图35的4-6。
在一种实施方式中,设备的发现和连接之后,分享设备(组播发送方)和多个被分享设备(组播接收方)之间可以传输实时分享的数据流。
请参见图36，图36示例性示出一种通过Wi-Fi传输实时分享的数据流的示意图。该传输过程可以但不限于包括如下步骤：
1.电子设备100抓取并生成实时分享的数据流(简称分享数据流)。
在一些示例中,电子设备100(例如包括的抓取模块)抓取应用图层/系统图层/背景图层的图像和/或音频,并对抓取的图像和/或音频进行编码,以生成音频/视频的源数据(即上述分享数据流)。
2.电子设备100对分享数据流进行切片,并封装为组播数据帧。
在一些示例中,电子设备100(例如包括的传输协议栈)对音频/视频的源数据进行切片,并封装为组播数据帧。
3.电子设备100对组播数据帧进行编码。
在一些示例中,电子设备100(例如包括的编码模块)对组播数据帧进行喷泉编码,增加冗余信息。
4.电子设备100对组播数据帧进行加密。
在一些示例中,电子设备100(例如包括的组播帧加密)基于协商的组播密钥对组播数据帧进行加密。
5.电子设备100向电子设备200发送组播数据帧。
在一些示例中,电子设备100(如包括的组播帧发送)基于Wi-Fi数据组播协议,在空口向电子设备200等组播组成员发送组播帧。
6.电子设备200过滤接收到的组播数据帧。
在一些示例中,电子设备200在空口接收到组播数据帧后,电子设备200(例如包括的组播帧过滤)可以将不属于电子设备200所在的组播组的组播数据帧丢弃,保留属于该组播组的组播数据帧。
7.电子设备200对组播数据帧进行解密。
在一些示例中,电子设备200(例如包括的组播帧解密)基于协商的组播密钥对组播数据帧进行解密。
8.电子设备200对组播数据帧进行解码。
在一些示例中,电子设备200(例如包括的解码模块)对组播数据帧进行喷泉解码,基于冗余信息恢复丢失的数据帧。
9.电子设备200对组播数据帧进行解封装和重组,以得到分享数据流。
在一些示例中,电子设备200(例如包括的传输协议栈)将组播数据帧进行解封装和重组,以恢复为音频流/视频流(即分享数据流)。
10.电子设备200播放分享数据流。
在一些示例中,电子设备200(例如包括的播放模块)对分享数据流进行解码,并在前台显示和/或播放解码后的视频流/音频流。
在一些示例中,用于实时分享的音频流/视频流的传输流向可以为:电子设备100的源应用/源系统(用于产生实时分享的音频流/视频流)->电子设备100的抓取模块->电子设备100的传输协议栈->电子设备100的编码模块->电子设备100的组播帧加密->电子设备100的组播帧发送->电子设备200的组播帧过滤->电子设备200的组播帧解密->电子设备200的解码模块->电子设备200的传输协议栈->电子设备200的播放模块->电子设备200的目标应用/目标系统(用于输出实时分享的音频流/视频流)。
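图36所示的发送侧切片封装、冗余编码、加密与接收侧过滤、解密、重组流程，可以参考下面的简化模拟（假设性实现：以重复包代替喷泉编码的冗余信息，以XOR代替基于组播密钥的加密，均为示例）：

```python
# 示意性草图（假设性实现）：Wi-Fi组播传输流程的简化模拟。
KEY = 0x5A          # 代替协商的组播密钥（示例）
GROUP = "组播组1"

def xor_crypt(data: bytes) -> bytes:
    return bytes(b ^ KEY for b in data)

def sender(stream: bytes, mtu=4):
    """发送侧：切片封装、冗余（重复发送）、加密，生成组播数据帧。"""
    frames = []
    for seq, off in enumerate(range(0, len(stream), mtu)):
        payload = stream[off:off + mtu]
        for _ in range(2):  # 冗余：每个切片发送两次，降低空口丢包影响
            frames.append({"group": GROUP, "seq": seq,
                           "data": xor_crypt(payload)})
    return frames

def receiver(frames, my_group=GROUP):
    """接收侧：过滤、解密、去冗余、解封装和重组。"""
    slices = {}
    for f in frames:
        if f["group"] != my_group:      # 组播帧过滤：丢弃不属于本组播组的帧
            continue
        slices.setdefault(f["seq"], xor_crypt(f["data"]))  # 解密并去重
    return b"".join(slices[s] for s in sorted(slices))     # 重组为分享数据流

if __name__ == "__main__":
    frames = sender("分享数据流".encode("utf-8"))
    print(receiver(frames).decode("utf-8"))  # 分享数据流
```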
在一种实施方式中,分享设备(组播发送方)和多个被分享设备(组播接收方)中的任意一个设备可以退出当前的实时共享,可以理解为是该设备可以离开当前的组播组。
在一些示例中,任意一个组播接收方(例如电子设备200)接收到退出当前的实时共享的指令时,可以通知组播发送方(电子设备100),组播发送方可以在组播组中删除该成员(即上述组播接收方),具体示例可参见图37。
图37示例性示出一种组播接收方离开的流程示意图。该离开过程可以但不限于包括以下步骤:
1.电子设备200(组播接收方)接收退出实时共享的指令。例如,电子设备200接收针对图5C所示的用户界面530中的“退出观看”的选项531A的触摸操作(例如点击操作)。
2.电子设备200向电子设备100(组播发送方)发送离开的通知消息,例如,该通知消息为组播信令帧。
3.电子设备100删除组播组成员:电子设备200。
4.电子设备100向电子设备200发送确认离开的响应消息,例如,该响应消息为组播信令帧。
在一些示例中,组播发送方接收到退出当前的实时共享的指令时,可以通知其他组播组成员(多个组播接收方)离开当前组播组,并删除当前组播组,具体示例可参见图38。
图38示例性示出一种组播发送方离开的流程示意图。该离开过程可以但不限于包括以下步骤:
1.组播发送方接收退出实时共享的指令。例如，电子设备100（组播发送方）接收针对图4C所示的用户界面430中的“暂停分享”的选项431E的触摸操作（例如点击操作）。
2.组播发送方通知所有组播组成员退出。具体地,组播发送方分别向多个组播接收方(组播接收方1、…、组播接收方N,N为大于1的正整数)发送退出组播组的通知消息,例如,该通知消息为组播信令帧。
3.多个组播接收方(组播接收方1、…、组播接收方N)向组播发送方发送确认退出的响应消息,例如,该响应消息为组播信令帧。
在一些示例中,电子设备100的组播管理协议和电子设备200的组播管理协议可以完成组播组成员的维护,例如实现图37所示的组播接收方离开和/或图38所示的组播发送方离开。
在一种实施方式中,组播信令帧的格式例如为图39所示,组播信令帧可以包括:目的地址(destination address,DestAddr)、源地址(source address,SrcAddr)、类型/长度(Type/Length)、实际目标地址(Actual DestAddr)、实际源地址(Actual SrcAddr)、控制编号(Control ID)、发送窗口下沿(transport lower,TX LE)、有效载荷(Payload)等字段(fields),其中:
目标地址(6字节(octets))属于组播地址,是组播信令帧在以太网头和MAC头对应的接收地址。
源地址(6字节)属于组播地址,是组播信令帧在以太网头和MAC头对应的发送地址。
可以理解地,通过属于组播地址的目标地址和源地址可以避免攻击者按照实际源地址和/或实际目标地址获取组播密钥的情况,提升数据传输的安全性。
类型/长度（2字节）可以包括组播类型（Multicast Type）和子类型（Subtype），其中，组播类型（10比特（bit））用于表征组播帧的类型，例如，组播信令帧中的类型字段为0x1FF。子类型（6bit）用于表征组播帧的子类型。
实际目标地址(6字节)是实际接收组播信令帧的组播MAC地址,在一些示例中,组播MAC地址可以是组播地址段,实际接收组播信令帧的多个设备的MAC地址可以在该组播地址段内。
实际源地址(6字节)是实际发送组播信令帧的设备的MAC地址。
控制编号(1字节)是控制信令帧的编码,可以用于重传。
发送窗口下沿(1字节)用于指示接收端对接收窗口进行移位。
有效载荷是具体的控制信令信息,有效载荷的大小在不同情况下可以不同,即为变量(Variable)。
在一些示例中,组播信令帧可以是WLAN的原始组播帧。不限于此,在另一些示例中,组播信令帧也可以是通过华为万磁链接(Huawei Magneto Link,HML)确定的,这样可以满足更低功耗、WLAN并发场景。
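按照图39所示的字段布局封装一个组播信令帧的过程，可以参考下面的示意性代码草图（假设性实现，其中的地址取值、有效载荷内容均为示例）：

```python
# 示意性草图（假设性实现）：按图39所示字段布局封装组播信令帧。
import struct

def mac(s):  # "aa:bb:cc:dd:ee:ff" -> 6字节
    return bytes(int(x, 16) for x in s.split(":"))

def build_signaling_frame(actual_dst, actual_src, ctrl_id, tx_le, payload: bytes):
    type_len = 0x1FF << 6          # 组播类型(10bit)=0x1FF，子类型(6bit)=0
    return b"".join([
        mac("01:00:5e:00:00:01"),  # 目的地址（属于组播地址，示例取值）
        mac("01:00:5e:00:00:02"),  # 源地址（属于组播地址，示例取值）
        struct.pack("!H", type_len),         # 类型/长度（2字节）
        mac(actual_dst),                     # 实际目标地址（6字节）
        mac(actual_src),                     # 实际源地址（6字节）
        struct.pack("!BB", ctrl_id, tx_le),  # 控制编号、发送窗口下沿
        payload,                             # 有效载荷（控制信令信息，长度可变）
    ])

if __name__ == "__main__":
    frame = build_signaling_frame("01:00:5e:7f:00:01", "02:11:22:33:44:55",
                                  ctrl_id=1, tx_le=0, payload=b"LEAVE")
    print(len(frame), frame.hex())
```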
接下来示例性说明通过蓝牙进行实时共享的实现方式。以下示例以实时共享音频为例进行说明。
请参见图40,图40示例性示出又一种共享系统10的架构示意图。在一些示例中,图40所示的共享系统10可以应用于通过蓝牙进行实时共享的场景。
图40所示的部分模块的功能和可能实现可参见前述实施例中的电子设备的软件架构的描述,例如图2E所示的共享系统10的说明。
如图40所示,共享系统10中的电子设备100和电子设备200之间可以通过蓝牙进行一起看、一起听、一起玩和一起编辑等实时共享。以下示例中,以电子设备100为发送实时分享的数据流的分享设备(也可称为源设备),电子设备200为接收实时分享的数据流的被分享设备(也可称为接收设备)为例进行说明。可以理解地,通过蓝牙进行实时共享可以应用于单播、组播或广播场景,在组播或广播场景下,电子设备200可以为多个接收设备中的任意一个。
在一种实施方式中,在组播或广播场景下,源设备可以同时向多个接收设备发送音频数据,多个接收设备接收到音频数据后同时播放音频数据。
在一种实施方式中，如图40所示，电子设备100的软件系统可以分为三层，从上至下分别为应用程序框架层、原生层（Native）和内核层。应用程序框架层包括源应用/源系统、音频框架模块、分享模块和分享服务，其中，分享服务包括设备管理和密钥管理。原生层包括音频抽象模块和蓝牙堆栈，其中，蓝牙堆栈包括编码模块、蓝牙协议栈、传输标准和时间戳同步。内核层包括蓝牙芯片/驱动（例如为Hi110x），蓝牙芯片/驱动包括蓝牙低功耗控制器（BLE Controller）。其中：
源应用/源系统用于产生实时分享的音频流,例如为音乐应用、视频应用或游戏应用。
音频框架模块(Audio Framework)和音频抽象模块(Audio HAL)分别在应用程序框架层和原生层负责管理音频功能。在一些示例中,源应用/源系统产生的音频数据可以发送至音频框架模块,经音频框架模块处理后发送至音频抽象模块,音频抽象模块可以将处理后的音频数据发送至蓝牙堆栈处理。
设备管理可以用于管理通过蓝牙进行实时共享的设备,例如设备的加入和离开。
密钥管理可以用于管理蓝牙密钥,例如蓝牙密钥的生成,该蓝牙密钥例如为广播同步流(broadcast isochronous streams,BIS)协议的密钥。
编码模块可以用于对音频抽象模块发送的音频数据进行编码，例如LC3编码。
蓝牙协议栈例如为BIS协议栈。
传输标准可以为用于传输音频单播/组播/广播的配置参数的标准,例如但不限于包括广播音频扫描服务(broadcast audio scan service,BASS)、基础音频配置(basic audio profile,BAP)、通用属性配置文件(generic attribute profile,GATT)。
时间戳同步可以用于和其他接收设备同步时间,以方便后续多个接收设备接收到音频数据后同时播放音频数据。
蓝牙芯片/驱动可以用于将蓝牙堆栈处理后的音频数据发送至接收设备。
在一种实施方式中,如图40所示,电子设备200的软件系统可以包括蓝牙模块和音频模块,其中,蓝牙模块包括时间戳同步、密钥管理、广播模块(例如用于实现BIS广播)、传输标准和蓝牙低功耗控制器。音频模块包括音频队列、解码模块、音频同步和编解码器,具体说明和上述电子设备100的软件系统的说明类似,接下来主要说明电子设备200中和电子设备100的模块不同的模块。
蓝牙模块可以用于接收并处理源设备发送的音频数据,并将处理后的音频数据发送至音频模块。
音频队列可以用于缓存蓝牙模块处理后的音频数据。当音频队列缓存的音频数据的大小大于或等于预设阈值时,音频模块可以处理音频队列中的音频数据。
解码模块可以用于对音频队列中的数据进行解码，例如LC3解码。
音频同步可以用于和其他接收设备约定播放音频数据的时刻,以方便后续多个接收设备接收到音频数据后在上述约定的时刻同时播放音频数据。
编解码器可以用于对音频/视频数据进行解码,并得到原始的音频数据。接收设备可以在和其他设备约定的时刻播放原始的音频数据,即多个接收设备可以同时播放原始的音频数据。
在一种实施方式中，用于实时分享的音频流在图40所示的软件系统中的传输流向例如图41所示。首先，源设备（电子设备100）选择音源（例如源应用）和接收设备（假设包括电子设备200）后，电子设备100可以和电子设备200建立蓝牙连接并传输蓝牙密钥。上述音源的音频数据可以从源应用依次传输至音频框架模块、音频抽象模块、编码模块。编码模块可以对上述音源的PCM原始音频数据进行编码（例如LC3编码），编码后的音频数据可以从编码模块传输至蓝牙协议栈，蓝牙协议栈可以根据上述传输的蓝牙密钥对编码后的音频数据进行加密，加密后的音频数据可以传输至蓝牙芯片，电子设备100可以通过蓝牙芯片将音频数据发送至电子设备200。电子设备200可以通过蓝牙芯片接收音频数据，并将该音频数据传输至蓝牙协议栈，蓝牙协议栈可以基于上述传输的蓝牙密钥对该音频数据进行解密。解密后的音频数据可以依次传输至解码模块（例如使用LC3解码）、编解码器进行解码，解码后得到的原始音频数据可以用于进行播放（例如通过扬声器播放）。
在一种实施方式中,一起听等实时共享音频流的场景下,源设备可以通过蓝牙广播实现音频数据的分发(可简称为数据分发)。在一些示例中,数据分发可以采用BLE Audio BIS技术。在一些示例中,数据分发的原理是在固定的传输间隔(例如BIS的传输间隔)上,以顺序或者交错的方式重复发送音频数据,以提高广播接收的成功率,接收设备可以无需和源设备建立连接就能单向接收和播放源设备发送的音频数据,具体示例可参见下图42。
图42示例性示出一种音频数据的发送过程的示意图。图42以传输间隔(interval)为20毫秒(ms),每个数据包重复发送3次为例进行说明。
如图42所示,在传输周期1(时长为传输间隔:20ms)的起始时刻,源设备开始发送数据包,例如每隔150微秒发送一次,依次发送了:数据包1的左(left)声道部分(可简称为数据包1-L)、数据包1的右(right)声道部分(可简称为数据包1-R)、数据包1-L、数据包1-R、数据包2-L、数据包2-R,也就是说,源设备在6.9ms内传输了2次数据包1和1次数据包2。在传输周期2(时长为传输间隔:20ms)的起始时刻(即传输周期1的终止时刻),源设备再次发送数据包,例如每隔150微秒发送一次,依次发送了:数据包2-L、数据包2-R、数据包2-L、数据包2-R、数据包3-L、数据包3-R,也就是说,源设备在6.9ms内传输了2次数据包2和1次数据包3。
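图42所示的按固定传输间隔重复发送音频数据的时刻表，可以用下面的示意性代码草图生成（假设性实现，重复次数、间隔等参数取自图42的示例）：

```python
# 示意性草图（假设性实现）：按图42的参数生成BIS广播的发送时刻表，
# 传输间隔20ms，每次发送间隔150微秒，每个数据包重复发送。
def bis_schedule(period_index, first_pkt, interval_ms=20, gap_us=150):
    """返回一个传输周期内的(时刻us, 内容)列表：
    当前数据包（左右声道）重复2次 + 下一个数据包1次。"""
    base_us = period_index * interval_ms * 1000
    sends = [f"数据包{first_pkt}-L", f"数据包{first_pkt}-R"] * 2 + \
            [f"数据包{first_pkt + 1}-L", f"数据包{first_pkt + 1}-R"]
    return [(base_us + i * gap_us, s) for i, s in enumerate(sends)]

if __name__ == "__main__":
    for t_us, pkt in bis_schedule(period_index=0, first_pkt=1):
        print(f"t={t_us}us 发送 {pkt}")
    # 传输周期2从 t=20000us 开始，重复发送数据包2并携带数据包3
```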
在一种实施方式中，源设备可以选择多个接收设备进行一起听等实时共享音频流，例如上图6D所示的实施方式，源设备可以向上述选择的多个接收设备传输用于加密音频流的广播密码。在一些示例中，源设备可以通过BLE和上述选择的多个接收设备进行连接，并发起安全管理协议（security manager protocol，SMP）配对，以进行链路加密，然后，源设备可以通过加密后的链路向上述选择的多个接收设备传输广播密码（Broadcast Code）。源设备可以依次向上述选择的多个接收设备传输广播密码，当前接收设备完成广播密码的接收后源设备可以和该接收设备断开连接，以向下一个接收设备传输广播密码，具体示例可参见下图43。
图43示例性示出一种密码传输过程的流程示意图。图43以源设备选择了N个设备(设备1、…、设备N,N为大于1的正整数)中的两个设备进行实时共享为例进行说明,该过程可以但不限于包括如下步骤:
1.源设备接收到实时共享的指令后高占空比扫描(设备)。
2.N个设备(设备1、…、设备N)持续向源设备发送蓝牙广播消息(例如BLE广播)。
其中,图43的1和2的顺序不作限定,例如可以是同时执行的。
3.源设备选择设备1和设备N。例如,源设备响应于用户操作,选择设备1和设备N。
4.源设备向设备1请求建立蓝牙连接。在一些示例中,设备1接受源设备的请求后,源设备和设备1建立蓝牙连接(例如BLE连接)。
5.源设备基于SMP和设备1进行配对，并加密源设备和设备1之间的蓝牙链路。
6.源设备通过加密的蓝牙链路向设备1发送广播密码。
7.源设备和设备1之间断开蓝牙连接。
上述4-7之后,源设备可以向下一个接收设备(设备N)传输广播密码,即执行图43中的8-11,8-11和说明和上述4-7类似,不再赘述。
在一种实施方式中,一起听等实时共享音频流的场景下,可以通过蓝牙广播(例如BIS广播)实现多个接收设备的同步机制,即多个接收设备同时播放源设备发送的音频数据。在一些示例中,多个接收设备可以通过源设备的广播公告服务(Broadcast Audio Announcement Service,BAP)获取第一参数(例如延迟参数(Presentation_Delay)),并在接收到源设备发送的音频数据后延时第一时长播放该音频数据,其中,第一时长可以是根据第一参数确定的,例如第一时长即为第一参数,具体示例可参见下图44。
图44示例性示出一种多设备同步过程的流程示意图。图44以多个接收设备为设备1和设备2为例进行说明,该过程可以但不限于包括如下步骤:
1.源设备基于BAP向设备1和设备2发送Presentation_Delay。
2.源设备向设备1和设备2发送音频报文1(也可称为是广播音频报文1)。
3.设备1接收到音频报文1后延时Presentation_Delay播放音频报文1。
4.设备2接收到音频报文1后延时Presentation_Delay播放音频报文1。
其中,图44的3和4可以是同时执行的。
可以理解地,基于蓝牙广播机制,多个接收设备可以同时接收到源设备发送的某个音频数据报文,多个接收设备可以在接收到该音频数据报文后延时第一时长再播放,以达到多个接收设备同时播放该音频数据报文的目的,用户体验感更好。
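多个接收设备基于Presentation_Delay延时播放以实现同步的逻辑，可以参考下面的示意性代码草图（假设性实现，延迟参数取值为示例）：

```python
# 示意性草图（假设性实现）：多个接收设备接收到同一音频报文后，
# 各自延时Presentation_Delay再播放，从而实现同时播放。
import threading
import time

PRESENTATION_DELAY_MS = 40  # 通过BAP获取的延迟参数（示例值）

def on_audio_packet(device, packet, play):
    """接收设备收到广播音频报文后延时第一时长播放。"""
    threading.Timer(PRESENTATION_DELAY_MS / 1000.0,
                    play, args=(device, packet)).start()

if __name__ == "__main__":
    play = lambda dev, pkt: print(f"{dev} 播放 {pkt} @ {time.monotonic():.3f}")
    for dev in ("设备1", "设备2"):
        on_audio_packet(dev, "音频报文1", play)  # 近似同时收到
    time.sleep(0.1)                              # 两个设备近似同时开始播放
```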
以上实施例以一级分享的场景(即一个分享设备向至少一个被分享设备分享第一多媒体数据流)为例进行说明,在另一些实施例中,还可以应用于多级分享的场景,例如应用于二级分享的场景:上述至少一个被分享设备中的任意一个设备可以再作为分享设备向至少一个设备分享第二多媒体数据流,其中,第二多媒体数据流和第一多媒体数据流可以相同或不同。多级分享的场景下的任意一级分享可参见以上实施例所示的一级分享的说明。
接下来示例性示出一些多级分享的场景。
图45A示例性示出一种二级分享场景的示意图。图45A以第一级分享通过Wi-Fi实现,第二级分享通过蓝牙实现为例进行示意。
如图45A所示,电子设备401(第一级设备)可以通过Wi-Fi广播向电子设备402、电子设备403等第二级设备实时分享,电子设备402可以通过蓝牙单播向电子设备404(第三级设备)实时分享音频流/视频流,电子设备403可以通过蓝牙广播向电子设备405、电子设备406等第三级设备实时分享。
图45B示例性示出又一种二级分享场景的示意图。图45B以第一级分享和第二级分享均通过Wi-Fi实现为例进行示意。
如图45B所示,电子设备411(第一级设备)可以通过Wi-Fi广播向电子设备412、电子设备413等 第二级设备实时分享,电子设备413可以通过Wi-Fi广播向电子设备414、电子设备415等第三级设备实时分享,可以理解为是Wi-Fi级联中继的场景。
图45C示例性示出又一种二级分享场景的示意图。图45C以第一级分享和第二级分享均通过蓝牙实现为例进行示意。
如图45C所示,电子设备421(第一级设备)可以通过蓝牙广播向电子设备422、电子设备423等第二级设备实时分享,电子设备422可以通过蓝牙单播向电子设备424(第三级设备)实时分享,电子设备423可以通过蓝牙广播向电子设备425、电子设备426等第三级设备实时分享。
图45D示例性示出一种三级分享场景的示意图。图45D以第一级分享通过NewTalk、卫星等远场通信方式实现,第二级分享和第三级分享通过WiFi、D2D、BT等近场通信方式实现为例进行示意。
如图45D所示,电子设备431(第一级设备)可以通过NewTalk链路或辅助链路单播向电子设备432(第二级设备)实时分享。电子设备432可以通过D2D单播向电子设备433(第三级设备)实时分享,同时也可以通过蓝牙广播向电子设备434等第三级设备实时分享。电子设备434(第三级设备)可以通过Wi-Fi广播向电子设备435、电子设备436等第四级设备实时分享。
不限于图45D示例的远场分享(第一级分享)+近场分享(第二级分享)+近场分享(第三级分享)的三级分享场景,在另一些示例中,也可以是远场分享+远场分享+近场分享,在另一些示例中,也可以是远场分享+近场分享+远场分享,在另一些示例中,也可以是近场分享+远场分享+近场分享,在另一些示例中,也可以是近场分享+近场分享+远场分享,在另一些示例中,也可以是近场分享+远场分享+远场分享,本申请对此不作限定。
不限于上述示例的多级分享的场景,在另一些示例中,图45A中,电子设备401(第一级设备)可以通过Wi-Fi单播向电子设备402(第二级设备)实时分享。本申请中的多级分享的场景中,任意一级分享可以是单播、组播或广播。
在一种实施方式中,进行实时共享的电子设备可以根据网络环境调整编码/解码实时分享的多媒体数据的码率,例如,当网络带宽较大时,码率可以较大,当网络带宽较小时,码率可以较小,即支持音频/视频的动态码率,可以自适应网络。不限于此,在另一种实施方式中,还可以根据电子设备的电量/功耗、对输出效果的要求等来调整编码/解码实时分享的多媒体数据的码率,本申请对此不作限定。这样可以在各种场景下平衡用户体验和设备功耗,提升设备可用性。
在一种实施方式中,分享设备可以实时共享3G/4G/5G/6G广播数据,例如,通过图2C-图2E所示的3G/4G/5G/6G广播模块实现。接下来以5G/NR通信系统中的组播广播业务(multicast and broadcast service,MBS)为例进行说明,其他情况类似。
在NR MBS中,NR广播/组播(组播也可称为多播)技术可以通过共享无线和传输资源,以点对多点的方式传输用户业务,让一份业务流覆盖尽可能多的用户,从而在有效提升网络资源利用率的同时,也提升用户的业务体验,减少资源拥堵造成的业务体验差的问题。
请参见图46A,图46A是本申请提供的一种NR通信系统的架构示意图。图46A示例性示出了NR广播/组播的通信场景和NR单播的通信场景。
图46A所示的NR通信系统可以包括广播平台4611、核心网设备4612、核心网设备4613、基站4614、基站4615、多个用户设备(UE)4616以及多个UE4617。其中,广播平台4611、核心网设备4612、基站4614和多个UE4616可以实现单播通信。广播平台4611、核心网设备4613、基站4615和多个UE4617可以实现广播/组播通信。广播平台4611可以是网络设备,例如为提供5G广播数据和相关服务的业务服务器,广播平台4611也可称为是广播业务服务器。任意一个UE的说明可参见以上实施例所示的电子设备的说明。
如图46A所示,广播平台4611可以通过核心网设备4612、基站4614和多个UE4616中的任意一个UE(以UE4616A为例进行说明)进行单播通信。在下行方向,广播平台4611可以通过核心网设备4612向基站4614发送数据,基站4614接收到该数据后再向UE4616A发送该数据,即以点对点的方式进行下行传输。广播平台4611可以和多个UE4616分别进行单播通信,此时,多个UE4616可以分别使用不同的承载,例如,3个UE4616使用3个承载。在一些示例中,UE4616A也可以以点对点的方式和基站4614、核心网设备4612或广播平台4611进行上行传输,具体说明类似,不再赘述。在一些示例中,进行单播通信时,广播平台4611、核心网设备4612、基站4614中至少一个设备可以感知UE4616A。
如图46A所示，广播平台4611可以通过核心网设备4613、基站4615和多个UE4617进行广播通信或者组播通信。在下行方向，广播平台4611可以通过核心网设备4613向基站4615发送数据，基站4615接收到该数据后可以向多个UE4617发送数据，即以点对多点的方式进行下行传输。此时，多个UE4617可以使用相同的承载，例如，3个UE4617共用1个承载。在一些示例中，进行广播通信时，可以不进行上行传输，在一些示例中，进行广播通信时，广播平台4611、核心网设备4613和基站4615可以不感知多个UE4617。在一些示例中，进行组播通信时，可以进行上行传输，例如，UE4617A可以以点对点的方式和基站4615、核心网设备4613或广播平台4611进行上行传输。在一些示例中，进行组播通信时，广播平台4611、核心网设备4613和基站4615中至少一个设备可以感知多个UE4617。
在一种实施方式中,和广播平台4611进行广播/组播通信的多个UE4617中的任意一个UE可以向其他至少一个UE实时分享接收到的广播/组播数据,例如图46A所示,UE4617A可以将基站4615发送的广播/组播数据分享给UE4618,可以理解地,UE4617A具备接收3G/4G/5G/6G广播数据的能力(例如UE4617A包括modem),而UE4618可以具备接收3G/4G/5G/6G广播数据的能力,也可以不具备接收3G/4G/5G/6G广播数据的能力,UE4618例如但不限于为以下任意一种情况:
情况1:UE4618具备接收3G/4G/5G/6G广播数据的能力,但UE4618所在的基站不具备接收3G/4G/5G/6G广播数据的能力。
情况2:UE4618具备接收3G/4G/5G/6G广播数据的能力,但UE4618所在的基站和UE4617A所在的基站不同(例如属于不同的运营商)。
情况3:UE4618具备接收3G/4G/5G/6G广播数据的能力,但UE4618所在的基站和UE4617A所在的基站(例如属于同一个运营商)距离较远。
情况4:UE4618不具备接收3G/4G/5G/6G广播数据的能力。
以上4种情况下,UE4618无法接收和播放UE4617A能接收到的频道数据,但UE4617A可以将接收到的频道数据实时分享给UE4618,让UE4618的使用者可以观看/收听到UE4617A能接收到的频道,不受环境和设备的限制,使用场景更加广泛,用户体验感更好。
需要说明的是,图46A所示的广播平台、核心网设备、基站和UE的形态和数量仅用于示例,本申请实施例对此不作限定。
接下来示例性介绍广播/组播通信场景。
请参见图46B,图46B是本申请提供的又一种NR通信系统的架构示意图。
图46B所示的NR通信系统可以包括广播业务服务器(例如为图46A所示的广播平台4611)、核心网、无线接入网(RAN)和n个UE(UE1、UE2、…、UEn,n为大于1的正整数)。其中,核心网可以包括至少一个核心网设备,例如包括图46A所示的核心网设备4612和核心网设备4613。RAN可以包括至少一个接入网设备,例如包括图46A所示的基站4614和基站4615。任意一个UE的说明可参见以上实施例所示的电子设备的说明。
如图46B所示,(1)广播业务服务器可以通知核心网广播(broadcast)启动,(2)核心网可以通知RAN广播启动,例如但不限于发送业务ID、小区列表(cell list)等信息,(3)RAN可以向核心网返回广播响应,(4)RAN可以通过多播控制信道(multicast control channel,MCCH)实现n个UE的广播频道配置。其中,配置的频道可以为一个或多个,不同频道对应的广播数据可以不同,UE可以接收已配置频道对应的广播数据。
如图46B所示，假设已配置了频道1和频道2。(5)广播业务服务器可以通过核心网向RAN发送频道1对应的广播数据（可简称为频道1数据），RAN可以通过多播业务信道（multicast traffic channel，MTCH）向n个UE发送频道1数据。(6)广播业务服务器也可以通过核心网向RAN发送频道2数据，RAN可以通过MTCH向n个UE发送频道2数据。其中，频道1数据的广播过程和频道2数据的广播过程的顺序不作限定。在一些示例中，RAN向n个UE发送频道1数据时使用的MTCH和发送频道2数据时使用的MTCH可以不同。
在一种实施方式中,对于上述n个UE中的任意一个UE而言,可以响应于用户操作确定接收的频道数据,例如,用户可以选择让UE接收频道1数据但不接收频道2数据。
在一种实施方式中,上述n个UE中的任意一个UE可以将接收到的频道数据实时分享给其他至少一个UE。
请参见图46C,图46C是本申请提供的又一种NR通信系统的架构示意图。
图46C所示的NR通信系统可以包括图46A中的UE4617A和UE4618，UE4617A可以向UE4618实时分享3G/4G/5G/6G广播数据，为了方便说明，以3G/4G/5G/6G广播数据为频道1数据为例进行说明。
如图46C所示,UE4617A可以包括应用处理器(AP)、调制解调处理器(modem)和无线通信模块,图46C以无线通信模块包括蜂窝通信模块、Wi-Fi通信模块、蓝牙通信模块、卫星通信模块为例进行说明,在具体实现中,无线通信模块可以包括更多或更少的通信模块。其中:
应用处理器可以包括广播/组播应用程序(APP)、分享传输模块、传输协议栈、广播/组播网卡(MBS network,MBSNET)、A核数据服务(ADS)、显示驱动、播放驱动和抓取模块。广播/组播APP可以是用于实现MBS的APP(例如通话),可以包括用于提供用户体验(UI/UX)显示的模块、用于提供服务逻辑的模块、传输模块和编解码模块,其中,传输模块可以用于从传输协议栈接收3G/4G/5G/6G广播数据并发送至编解码模块,编解码模块可以用于对接收到3G/4G/5G/6G广播数据进行编码或者解码,解码后的数据可以在广播/组播APP上播放。分享传输模块可以用于实现向其他UE实时分享多媒体数据流。传输协议栈例如为TCP/IP协议栈。显示驱动可以用于调用显示屏等显示模块实现显示功能。播放驱动可以用于调用扬声器等音频模块实现音频播放功能。抓取模块可以用于抓取解码后、可直接播放的多媒体数据流,例如抓取正在播放的多媒体数据流。
调制解调处理器可以包括NR协议栈、C核数据服务(C-core data service,CDS)和广播/组播业务(MBS),其中,NR协议栈可以包括媒体接入控制(medium access control,MAC)层、无线链路控制(radio link control,RLC)层、分组数据汇聚协议(packet data convergence protocol,PDCP)层。
如图46C所示,UE4617A可以通过蜂窝通信模块中的3G/4G/5G/6G广播模块接收频道1数据,然后,3G/4G/5G/6G广播模块可以将频道1数据发送给调制解调处理器。在调制解调处理器中,频道1数据可以依次传输至MAC层、RLC层、PDCP层,PDCP层再将频道1数据发送给CDS,由CDS将频道1数据发送至应用处理器。在应用处理器中,频道1数据可以依次传输至ADS、MBSNET、传输协议栈。
在一种实施方式中,UE4617A可以从应用处理器中获取频道1数据,并将频道1数据实时共享给UE4618,其中,UE4617A从应用处理器中获取频道1数据的方式可以但不限于包括以下三种:
获取方式1:UE4617A可以从广播/组播APP中获取解码后的频道1数据。在一些示例中,频道1数据传输至传输协议栈后,传输协议栈可以将频道1数据发送至广播/组播APP中的传输模块,传输模块再将频道1数据发送至编解码模块进行解码,解码后的频道1数据可以发送至分享传输模块,由分享传输模块实时共享给UE4618。
获取方式2:UE4617A可以直接从传输协议栈获取解码前的频道1数据。在一些示例中,频道1数据传输至传输协议栈后,传输协议栈可以将频道1数据发送至分享传输模块,由分享传输模块实时共享给UE4618。
获取方式3:UE4617A可以通过抓取模块抓取解码后的、显示和/或播放的频道1数据。在一些示例中,频道1数据传输至传输协议栈后,传输协议栈可以将频道1数据发送至广播/组播APP中的传输模块,传输模块再将频道1数据发送至编解码模块进行解码,解码后的频道1数据可以发送至显示驱动和/或播放驱动进行输出(显示和/或播放)。抓取模块抓取输出的多媒体数据流并传输至分享传输模块,由分享传输模块实时共享给UE4618。
在一种实施方式中，在上述获取方式2下，调制解调处理器可以将发送至分享传输模块的频道1数据在CDS进行路由，以通过对应的通信方式将频道1数据发送给UE4618。在一些示例中，UE4617A和UE4618通过蜂窝通信方式通信，此时可以使用图46C所示的传输方式1将频道1数据发送给UE4618，例如，在传输方式1下，CDS可以对频道1数据进行IP组包和IP头的替换，处理后的频道1数据再依次通过PDCP层、RLC层和MAC层，最后由蜂窝通信模块将频道1数据发送至UE4618。在另一些示例中，UE4617A和UE4618通过Wi-Fi方式通信，此时可以使用图46C所示的传输方式2将频道1数据发送给UE4618，例如，在传输方式2下，CDS可以将频道1数据发送给Wi-Fi通信模块（例如包括Wi-Fi芯片），Wi-Fi通信模块可以对频道1数据进行IP头的替换和IP组包，并将处理后的频道1数据发送至UE4618。或者，CDS可以对频道1数据进行IP组包和IP头的替换，再将处理后的频道1数据发送至Wi-Fi通信模块，由Wi-Fi通信模块发送给UE4618。不限于上述示例的情况，在另一些示例中，UE4617A和UE4618通过蓝牙通信方式通信时，可以使用图46C所示的传输方式3将频道1数据发送给UE4618，在另一些示例中，UE4617A和UE4618通过卫星通信方式通信时，可以使用图46C所示的传输方式4将频道1数据发送给UE4618，传输方式3和传输方式4的说明和传输方式2类似，不再赘述。UE4617A和UE4618还可以通过其他通信方式通信，可以使用对应的传输方式传输频道1数据，本申请对此不作限定。
可以理解地,在上述获取方式2下,分享设备的应用处理器可以无需被唤醒(例如无需对3G/4G/5G/6G 广播数据进行解码、播放3G/4G/5G/6G广播数据等),直接通过调制解调处理器将接收到的3G/4G/5G/6G广播数据发送给被分享设备,即可以提供低功耗传输模式,减少设备功耗,提升设备可用性。并且,无需分享设备在前台运行广播/组播APP(用于播放3G/4G/5G/6G广播数据),也无需分享设备具备解码和播放3G/4G/5G/6G广播数据的能力,拓宽应用场景,用户体验更好。
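获取方式2下CDS根据通信方式进行路由和IP头替换的简化逻辑，可以参考下面的示意性代码草图（假设性实现，地址与传输方式映射均为示例）：

```python
# 示意性草图（假设性实现）：获取方式2下，CDS对频道数据进行路由与
# IP头替换的简化逻辑。
def replace_ip_header(packet: dict, new_dst: str) -> dict:
    pkt = dict(packet)
    pkt["dst_ip"] = new_dst          # IP头的替换：改写目的地址后重新组包
    return pkt

def cds_route(packet: dict, transport: str):
    """按通信方式选择传输路径（对应图46C的传输方式1/2/3/4）。"""
    peers = {"蜂窝": "10.0.0.2", "Wi-Fi": "192.168.1.20",
             "蓝牙": "bt-peer", "卫星": "sat-peer"}   # 示例地址
    routed = replace_ip_header(packet, peers[transport])
    return transport, routed         # 交由对应通信模块发送给被分享设备

if __name__ == "__main__":
    pkt = {"channel": "频道1", "dst_ip": "234.0.0.1", "payload": b"..."}
    print(cds_route(pkt, "Wi-Fi"))
```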
接下来示例性示出一些实时共享的场景,但不应构成限定。
场景一:朋友间进行运营商通话时,一方看到一个有趣的视频(如电影、电视、短视频)或音频(如音乐),想要和通话对方一起看该视频或一起听该音频,可以发起实时共享,这样通话双方可以边看/边听边讨论,具体示例可参见上述一起看的实时共享场景的说明。解决了目前运营商通话中无法实时分享音频流/视频流的问题,提升用户体验。
场景二:下属和领导间进行运营商通话时,下属需要向领导汇报/分享文档材料(例如word格式、excel格式或PPT格式等),对文档材料的内容进行逐行/逐页的说明,并基于领导的意见进行修改,领导希望同步看到修改的结果,但下属又不希望领导看到下属使用的手机等电子设备上的其他应用的画面/音频,因此,下属可以仅实时共享文档材料的应用,领导不仅可以同步看到修改结果,还可以执行对文档材料进行修改,具体示例可参见上述一起编辑的实时共享场景的说明,使用更加灵活。
场景三:消费者打电话向客服咨询购买物品的使用方法时,客服可以发起实时共享,将指导的视频、图片或文档分享给消费者,减少电话沟通时花费的时间和精力。
场景四:子女上网课时,上班的父母希望能在公司午休时间或者上下班途中等时间看到子女上网课的样子,远程监督一下子女,以及看到网课的画面,可以针对子女有疑问的部分进行讲解和点评,远程指导一下子女,则子女可以通过电话实时共享网课应用的音频流/视频流,同时分享麦克风采集的声音和摄像头采集的人脸图像,具体示例可参见图15A和图15B。
场景五:家里的老人不会使用某些东西,例如使用手机时无法上网,需要外地的子女通过电话远程指导,外地的子女可以主动发起实时共享的请求,家里的老人接受请求后就会自动共享老人手机屏幕的画面,更加方便快捷地指导老人。
场景六:在家庭聚餐时用户可以通过Wi-Fi等近场通信方式向附近的多个用户共享应用的实时画面,可以理解为是近场一对任意(1 to any)的实时共享场景。
场景七:在朋友聚会时用户可以通过近场通信方式向附近的多个用户共享游戏应用的实时画面,和/或一起玩同一个游戏(具体示例可参见图23A-图23C),可以理解为是近场一对任意(1 to any)的实时共享场景。
场景八:在同一个会议室进行商务会议时,用户可以通过近场通信方式向附近的多个用户共享文档材料的内容,甚至可以一起编辑文档材料,可以理解为是近场一对任意(1 to any)的实时共享场景。
场景九:在近场一对任意(1 to any)的实时共享场景下,以场景六为例进行说明,分享用户出于自身的隐私安全考虑,希望将视频的部分或全部内容共享给其他人,但不希望将视频的源文件发送给其他人,因此,分享用户可以在自己的设备上播放视频,当播放到想要共享的内容时通过近场通信方式进行实时共享,当播放到不想要共享的内容时停止实时共享。而且分享用户可以不允许被分享用户保存和转发实时共享的视频,具体示例可参见图15D、图16A-图16E。
场景十:可以一对一实时共享音频,也可以一对多将一个音频共享到多个耳机上播放,从而实现一起听,而无需发送音频文件。并且分享用户可以不允许被分享用户保存和转发实时共享的音乐,保障音乐的版权。
场景十一:户外派对(Party)或户外广场舞,可以通过近场广播方式将一个音频共享到多个音箱设备上播放,避免一个大音箱大音量播放导致的扰民,避免冷场。
场景十二:领导向下属电话,想将文档材料实时共享给该下属和该下属附近的同事,当又不方便直接发送文档材料,因此,领导可以实时共享文档材料的应用给该下属,该下属再实时共享给附近的同事,让附近的同事可以通过自己的设备观看,无需多人聚集在该下属周边共用该下属的设备(例如手机等较小的移动终端)观看,具体示例可参见图45C,这样可以大大提升用户的观看体验。
Any "displaying" in this application may be replaced with another output method, for example, playing through a speaker. Similarly, any "playing" in this application may be replaced with another output method, for example, displaying on a display screen.
Output in this application includes not only output performed through an output module such as the display screen of the device itself, but also output performed through an output module such as the display screen of another device connected to the device.
The microphone in this application may be replaced with another module capable of capturing audio/voice/sound.
The camera in this application may be replaced with another module capable of shooting/capturing images.
The methods provided in the embodiments of this application may be implemented completely or partially by software, hardware, firmware, or any combination thereof. When implemented by software, they may be implemented completely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are produced completely or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, user equipment, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transferred from one computer-readable storage medium to another; for example, the computer instructions may be transferred from one website, computer, server, or data center to another website, computer, server, or data center over a wired connection (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless connection (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a digital video disc (DVD)), or a semiconductor medium (for example, a solid state disk (SSD)).

The above embodiments are merely intended to describe the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of the technical features thereof may be replaced with equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (30)

  1. A sharing method, applied to a first device, wherein the method comprises:
    displaying a first interface, wherein the first interface is used to indicate that a carrier call with a second device is currently in progress;
    displaying a second interface of a first application while conducting the carrier call with the second device;
    receiving a first user operation acting on the second interface;
    sending first data to the second device, wherein the first data is used by the second device to output multimedia data related to the second interface.
  2. The method according to claim 1, wherein the first interface and the second interface comprise a first floating window, and the first user operation is a user operation acting on a sharing control in the first floating window; or,
    the first user operation is a user operation of sliding along a first trajectory.
  3. The method according to claim 1 or 2, wherein before the sending first data to the second device, the method further comprises:
    capturing, while displaying the second interface, multimedia data related to the second interface, wherein the first data comprises the multimedia data related to the second interface.
  4. The method according to any one of claims 1-3, wherein the sending first data to the second device comprises:
    sending call data of the carrier call and the first data to the second device over a main link of the carrier call.
  5. The method according to any one of claims 1-3, wherein the sending first data to the second device comprises:
    sending call data of the carrier call to the second device over a main link of the carrier call, and sending the first data to the second device over a data channel of the carrier call.
  6. The method according to any one of claims 1-3, wherein the sending first data to the second device comprises:
    sending call data of the carrier call to the second device over a main link of the carrier call, and sending the first data to the second device over an auxiliary link.
  7. The method according to claim 6, wherein before the sending first data to the second device, the method further comprises:
    sending a first request message to a network device, wherein the first request message comprises identification information of the second device;
    receiving a session identifier of the second device, sent by the network device based on the first request message;
    establishing the auxiliary link with the second device according to the session identifier of the second device.
  8. The method according to any one of claims 1-7, wherein the sending first data to the second device comprises:
    displaying a third interface, wherein the third interface comprises information of a plurality of devices;
    receiving a second user operation acting on the second device among the plurality of devices;
    sending the first data to the second device.
  9. The method according to claim 8, wherein the plurality of devices comprise at least one of the following: a discovered device, a connected device, a device with which a carrier call was most recently conducted, a device whose identification information is stored, and a device recognized from a captured image.
  10. The method according to any one of claims 1-9, wherein the sending first data to the second device comprises:
    displaying a fourth interface, wherein the fourth interface comprises information of a plurality of windows;
    receiving a third user operation acting on a first window among the plurality of windows, wherein the first window comprises content of the second interface;
    sending the first data to the second device.
  11. The method according to claim 10, wherein the plurality of windows comprise at least one of the following: a window of a foreground application, a window of a background application, and a window of an application installed on the first device but not running.
  12. The method according to any one of claims 1-7, wherein the sending first data to the second device comprises:
    displaying a fifth interface, wherein the fifth interface comprises a plurality of sharing methods;
    receiving a fourth user operation acting on a first method among the plurality of sharing methods;
    displaying a sixth interface, wherein the sixth interface comprises information of a plurality of windows and a plurality of devices, the plurality of windows and the plurality of devices being determined according to the first method;
    receiving a fifth user operation acting on a second window among the plurality of windows, and receiving a sixth user operation acting on the second device among the plurality of devices, wherein the second window comprises content of the second interface;
    sending the first data to the second device according to the fifth user operation and the sixth user operation.
  13. The method according to any one of claims 1-12, wherein before the sending first data to the second device, the method further comprises:
    receiving a seventh user operation;
    determining, in response to the seventh user operation, that a type of the data to be shared is a first type; wherein
    when the first type is audio, the first data comprises audio data related to the second interface;
    when the first type is image, the first data comprises video data related to the second interface; and
    when the first type is audio and image, the first data comprises audio data and video data related to the second interface.
  14. The method according to any one of claims 1-12, wherein the first data comprises video data related to the second interface, and the method further comprises:
    receiving an eighth user operation acting on the second interface and sliding along a second trajectory;
    sending second data to the second device, wherein the second data comprises audio data related to the second interface.
  15. The method according to any one of claims 1-14, wherein before the sending first data to the second device, the method further comprises:
    receiving a ninth user operation for selecting a first region in the second interface, wherein the first data comprises multimedia data related to the first region.
  16. The method according to any one of claims 1-14, wherein before the sending first data to the second device, the method further comprises:
    receiving a tenth user operation for selecting a first layer in the second interface, wherein the first data comprises multimedia data related to the first layer.
  17. The method according to any one of claims 1-14, wherein the sending first data to the second device comprises:
    sending the first data to the second device when the first application is not a preset application, wherein a security level of the preset application is higher than a first level.
  18. The method according to any one of claims 1-14, wherein the sending first data to the second device comprises:
    recognizing that a security level of data related to a second region in the second interface is higher than a second level;
    sending the first data to the second device, wherein the first data does not comprise the data related to the second region.
  19. The method according to any one of claims 1-18, wherein the displaying a second interface of a first application comprises:
    receiving broadcast data of a first channel sent by a network device;
    displaying the second interface according to the broadcast data of the first channel.
  20. The method according to any one of claims 1-18, wherein the method further comprises:
    receiving broadcast data of a second channel sent by the network device, wherein a user interface displayed by the first device is unrelated to the broadcast data of the second channel;
    receiving an eleventh user operation;
    sending the broadcast data of the second channel to a third device, wherein the broadcast data of the second channel is used by the third device to output audio and/or video of the second channel.
  21. The method according to any one of claims 1-20, wherein the sending first data to the second device comprises:
    sending the first data and third data to the second device, wherein the third data comprises audio data captured by the first device through a microphone and/or image data captured by the first device through a camera.
  22. The method according to any one of claims 1-21, wherein the method further comprises:
    receiving a twelfth user operation;
    determining, in response to the twelfth user operation, not to grant the second device permission to save the first data or permission to forward the first data;
    receiving a second request message sent by the second device, wherein the second request message is used to request to save and/or forward the first data;
    displaying first prompt information according to the second request message.
  23. The method according to any one of claims 1-22, wherein the method further comprises:
    receiving a third request message sent by the second device, wherein the third request message is used to request to share multimedia data with the first device in real time;
    displaying second prompt information according to the third request message;
    receiving a thirteenth user operation, wherein the thirteenth user operation is used to accept the request indicated by the third request message;
    receiving fourth data sent by the second device;
    outputting the fourth data.
  24. The method according to claim 23, wherein the outputting the fourth data comprises: displaying a seventh interface according to the fourth data, wherein when the first device displays the seventh interface, the second device displays content of the second interface; or,
    the outputting the fourth data comprises: displaying the second interface and an eighth interface in split screen, wherein the eighth interface is determined according to the fourth data.
  25. The method according to claim 23 or 24, wherein after the receiving fourth data sent by the second device, the method further comprises:
    receiving a fourteenth user operation;
    sending the fourth data to a fourth device, so that the fourth device outputs the fourth data.
  26. The method according to any one of claims 1-25, wherein the sending first data to the second device comprises:
    sending the first data to the second device over a first link and a second link, wherein the first link is a cellular communication link or an auxiliary link, the second link comprises at least one of the following: a Bluetooth link, a wireless fidelity (Wi-Fi) link, a V2X link, a satellite link, a point-to-point device-to-device (D2D) link, a cellular communication link, and an auxiliary link, and the first link is different from the second link.
  27. The method according to any one of claims 1-26, wherein the method further comprises:
    displaying a ninth interface, wherein the ninth interface comprises information of a plurality of user interfaces running on the first device;
    receiving a fifteenth user operation acting on a first control in the ninth interface, wherein the first control is related to a tenth interface among the plurality of user interfaces;
    sending fifth data to a fifth device, wherein the fifth data is used by the fifth device to output multimedia data related to the tenth interface.
  28. The method according to any one of claims 1-27, wherein the method further comprises:
    displaying an eleventh interface, wherein the eleventh interface comprises information of a plurality of functions of a control center;
    receiving a sixteenth user operation acting on a second control in the eleventh interface, wherein the second control is related to a sharing function among the plurality of functions;
    sending sixth data to a sixth device, wherein the sixth data is used by the sixth device to output multimedia data of a foreground application of the first device.
  29. An electronic device, comprising a transceiver, a processor, and a memory, wherein the memory is configured to store a computer program, and the processor invokes the computer program to perform the method according to any one of claims 1-28.
  30. A computer storage medium, wherein the computer storage medium stores a computer program, and when the computer program is executed by a processor, the method according to any one of claims 1-28 is implemented.
PCT/CN2023/108156 2022-07-22 2023-07-19 Sharing method, electronic device and system WO2024017296A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210867898.7A 2022-07-22 2022-07-22 Sharing method, electronic device and system
CN202210867898.7 2022-07-22

Publications (1)

Publication Number Publication Date
WO2024017296A1 (zh)

Family

ID=89617128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/108156 WO2024017296A1 (zh) Sharing method, electronic device and system

Country Status (2)

Country Link
CN (1) CN117478686A (zh)
WO (1) WO2024017296A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012134150A2 * 2011-03-29 2012-10-04 TI Square Co., Ltd. Method and system for providing a multimedia content sharing service during a communication service
CN106716954A * 2014-09-10 2017-05-24 Microsoft Technology Licensing, LLC Real-time sharing during a phone call
CN108781271A * 2016-02-02 2018-11-09 Samsung Electronics Co., Ltd. Method and apparatus for providing image service
CN113452945A * 2020-03-27 2021-09-28 Huawei Technologies Co., Ltd. Method and apparatus for sharing an application interface, electronic device, and readable storage medium
CN113489937A * 2021-07-02 2021-10-08 Beijing Zitiao Network Technology Co., Ltd. Video sharing method, apparatus, device, and medium

Also Published As

Publication number Publication date
CN117478686A (zh) 2024-01-30

Similar Documents

Publication Publication Date Title
JP6324625B2 Live interactive system, information transmission method, information reception method, and apparatus
CN111316598B Multi-screen interaction method and device
US11025686B2 Network call method and apparatus, terminal, and server
JP6339226B2 Streaming control for real-time transport protocol
US20150019694A1 Method for Screen Sharing, Related Device, and Communications System
KR20190086517A Method for controlling video stream transmission in a live broadcast room, and corresponding server and mobile terminal
US20120173622A1 Social screen casting
US11936921B2 Method for managing network live streaming data and related apparatus, and device and storage medium
WO2016150270A1 Group session message processing method and apparatus
AU2017254981A1 Reduced latency server-mediated audio-video communication
WO2018205786A1 Method, apparatus, and device for sharing screen-recorded video
WO2022121775A1 Screen projection method and device
WO2015117513A1 Video conference control method and system
WO2022100304A1 Method and apparatus for cross-device transfer of application content, and electronic device
US10778742B2 System and method for sharing multimedia content with synched playback controls
WO2021155702A1 Communication processing method and apparatus, terminal, server, and storage medium
WO2019100259A1 Data transmission method and apparatus, and unmanned aerial vehicle
US11374992B2 Seamless social multimedia
JP7181990B2 Data transmission method and electronic device
WO2023030099A1 Cross-device interaction method and apparatus, screen projection system, and terminal
CN111245854B Media transmission method, media control method, and apparatus
CN114449090B Data sharing method, apparatus, and system, electronic device, storage medium, and chip
WO2024017296A1 (zh) Sharing method, electronic device and system
WO2022267640A1 Video sharing method, electronic device, and storage medium
WO2022222691A1 Call processing method and related device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23842358

Country of ref document: EP

Kind code of ref document: A1