WO2024104122A1 - Procédé de partage, dispositif électronique et support de stockage informatique - Google Patents

Sharing method, electronic device and computer storage medium

Info

Publication number
WO2024104122A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
data
link
sharing
message
Prior art date
Application number
PCT/CN2023/127957
Other languages
English (en)
Chinese (zh)
Inventor
贾银元
张利
许浩维
李坚
张金明
孙方林
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2024104122A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • the present application relates to the field of computer technology, and in particular to a sharing method, an electronic device and a computer storage medium.
  • the function of real-time sharing of multimedia data streams such as audio and video streams is very limited.
  • the range of communication methods used to achieve real-time sharing is relatively narrow.
  • users often can only describe the content verbally during a call, so the other party cannot directly see or hear the relevant content.
  • the environment requirements for achieving real-time sharing are relatively demanding. For example, when sharing the screen through a conference application, both parties to the communication must have the conference application installed.
  • the present application discloses a sharing method, an electronic device and a computer storage medium, which can provide users with a simple and convenient real-time sharing function in an operator call scenario and reduce limitations.
  • the present application provides a sharing method, applied to a first device, the method comprising: displaying a first interface, the first interface being used to indicate that a current operator call is being conducted with a second device; receiving a first user operation; sending a first message to a third device, the first message being used to instruct the third device to share first data with the second device, the first data comprising multimedia data related to a second interface displayed by the third device, the first data being used for the second device to output multimedia data related to the second interface.
  • the third device is a device discovered by the first device, a device connected to the first device, a device for which identification information is stored by the first device, or a device recognized by the first device based on a captured image.
  • when the first device and the second device conduct an operator call, the first device can, based on the received first user operation, instruct a known third device to share multimedia data with the second device so that the second device outputs the multimedia data.
  • the second device and the third device are not required to conduct an operator call (for example, the operator call between the first device and the second device is not required to be migrated to the third device), and the second device and the third device do not need to install specific applications.
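  • Purely as an illustrative sketch of the first-aspect flow described above, and not as the application's implementation, the following Kotlin fragment models the first device reacting to the first user operation by sending a first message that names the sharing target (the second device) and the content source (the third device). All type and function names below are invented assumptions.

```kotlin
// Hypothetical sketch of the first-aspect flow; every name here is an illustrative assumption.
data class DeviceId(val value: String)

// "First message": instructs the third device to share multimedia data with the second device.
data class FirstMessage(
    val target: DeviceId,          // the second device (peer of the operator call)
    val contentSource: DeviceId,   // the third device (provides the media of the second interface)
    val requestRealTimeShare: Boolean = true,
)

class FirstDevice(private val send: (to: DeviceId, message: FirstMessage) -> Unit) {
    // Called when the first user operation (e.g., a tap on a sharing control) is received
    // while the first interface indicates an ongoing operator call with `secondDevice`.
    fun onFirstUserOperation(secondDevice: DeviceId, thirdDevice: DeviceId) {
        val msg = FirstMessage(target = secondDevice, contentSource = thirdDevice)
        send(thirdDevice, msg) // the third device then shares the first data with the second device
    }
}
```

  • Under this assumed shape, the third device only needs the identity of the second device to begin sharing; the operator call itself stays on the first device.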
  • the first interface includes a first floating window
  • the first user operation includes a user operation acting on a sharing control in the first floating window
  • the first user operation is a user operation of sliding with a first track.
  • sending the first message to a third device includes: displaying a third interface, wherein the third interface includes information of multiple devices; receiving a second user operation acting on the second device among the multiple devices; and sending the first message to the third device.
  • in this way, the shared device (i.e., the second device) to receive the shared data may be determined in response to a user operation, which makes the user's use more flexible and improves the user experience.
  • the multiple devices include at least one of the following: a device discovered by the first device, a device connected to the first device, a device with which the first device had a most recent operator call, a device whose identification information is stored by the first device, and a device recognized by the first device based on a captured image.
  • the devices discovered by the first device include: devices discovered by the first device through near field communication or far field communication.
  • the device to which the first device is connected includes: a device to which the first device is connected via a near field communication method or a far field communication method.
  • the device with which the first device last conducted an operator call may also be replaced by: the device to which the first device was last connected.
  • the identification information includes at least one of the following: an Internet Protocol IP address, a Media Access Control MAC address, a serial number SN, an International Mobile Equipment Identity IMEI, a telephone number, and a personal account of a chat application.
  • the captured images include images captured by the first device and/or images captured by a device to which the first device is connected.
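  • As a hypothetical illustration of how the candidate devices and their identification information listed above could be modeled (every name below is an assumption, not the application's API):

```kotlin
// Illustrative-only data model for candidate devices and their identification information.
enum class DiscoverySource { NEAR_FIELD, FAR_FIELD, CONNECTED, LAST_OPERATOR_CALL, STORED_ID, RECOGNIZED_FROM_IMAGE }

sealed interface Identification {
    data class Ip(val address: String) : Identification            // Internet Protocol address
    data class Mac(val address: String) : Identification           // Media Access Control address
    data class SerialNumber(val sn: String) : Identification       // SN
    data class Imei(val imei: String) : Identification             // International Mobile Equipment Identity
    data class PhoneNumber(val number: String) : Identification
    data class ChatAccount(val account: String) : Identification   // personal account of a chat application
}

data class CandidateDevice(
    val name: String,
    val source: DiscoverySource,
    val ids: List<Identification>,
)

// Example: a device discovered nearby plus the peer of the most recent operator call.
val candidates = listOf(
    CandidateDevice("Laptop", DiscoverySource.NEAR_FIELD, listOf(Identification.Mac("AA:BB:CC:DD:EE:FF"))),
    CandidateDevice("Phone", DiscoverySource.LAST_OPERATOR_CALL, listOf(Identification.PhoneNumber("+86..."))),
)
```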
  • sending a first message to a third device includes: displaying a fourth interface, wherein the fourth interface includes information of multiple windows; receiving a third user operation acting on a first window among the multiple windows, wherein the first window includes content of the second interface; and sending the first message to the third device.
  • the content for sharing (i.e., the first data) may be determined in response to a user operation, which makes the user's use more flexible and improves the user experience.
  • the multiple windows include at least one of the following: a window of a foreground application of the third device, a window of a background application of the third device, and a window of an application that is installed but not running on the third device.
  • the third device is any one of the following: a device discovered by the first device, a device connected to the first device, a device whose identification information is stored by the first device, and a device recognized by the first device based on a captured image.
  • the sharing content available for users to choose can be multimedia data of foreground applications, multimedia data of background applications, or multimedia data of installed but not running applications, thereby meeting users' needs for real-time sharing of different multimedia data and improving user experience.
  • sending a first message to a third device includes: sending a second message to the second device, the second message being used to request sending the first data to the second device; receiving a third message sent by the second device, the third message being used to accept the request indicated by the second message; and sending the first message to the third device.
  • the first device can first request the second device to share data. After the second device accepts the request, the first device instructs the third device to share data with the second device, so as to avoid the third device directly sharing data with the second device when the second device refuses to accept the request.
  • since the shared data is often large, avoiding an unnecessary transmission effectively saves transmission resources. There is no need for the third device to request the second device to share data, which reduces the transmission pressure on the third device. Even if the third device is a device with poor performance, it can still share in real time, further expanding the application scenarios.
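  • The request-before-share handshake just described can be sketched roughly as follows; the three signals map to the second, third and first messages of this implementation, and all identifiers are illustrative assumptions:

```kotlin
// Hypothetical sketch of the request/accept handshake that precedes sharing.
data class DeviceId(val value: String)

sealed interface ShareSignal
data class ShareRequest(val from: DeviceId) : ShareSignal      // "second message": first -> second device
object ShareAccepted : ShareSignal                             // "third message": second -> first device
data class StartSharing(val target: DeviceId) : ShareSignal    // "first message": first -> third device

class FirstDeviceHandshake(private val send: (DeviceId, ShareSignal) -> Unit) {
    fun requestShare(self: DeviceId, secondDevice: DeviceId) =
        send(secondDevice, ShareRequest(from = self))

    // Only after the second device accepts does the first device instruct the third device,
    // so the (typically large) shared data is never pushed to a peer that declined the request.
    fun onSignalFromSecondDevice(signal: ShareSignal, secondDevice: DeviceId, thirdDevice: DeviceId) {
        if (signal is ShareAccepted) send(thirdDevice, StartSharing(target = secondDevice))
    }
}
```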
  • the method also includes: receiving the first data sent by the third device; sending the call data of the operator call and the first data to the second device through the main link of the operator call, or sending the call data of the operator call to the second device through the main link of the operator call and sending the first data to the second device through the auxiliary link.
  • the auxiliary link is a link that multiplexes the main link of the operator call.
  • the auxiliary link is an IP Multimedia System IMS Data channel of the operator call.
  • the auxiliary link is a link other than the main link of the operator call and the IMS Data channel of the operator call.
  • the auxiliary link includes at least one of the following physical links: a cellular communication link, a wireless fidelity Wi-Fi link, a Bluetooth BT link, a device-to-device D2D link, and a satellite link.
  • the first data shared by the third device to the second device can be transmitted through the operator call link between the first device and the second device. Therefore, even if the third device does not have the ability to establish a link, real-time sharing can be performed, further broadening the application scenarios.
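  • A minimal sketch of the relay path described above, assuming invented names: the first device forwards call data on the main link of the operator call and carries the first data either multiplexed on that main link or on an auxiliary link.

```kotlin
// Illustrative relay sketch; link names and types are assumptions, not the application's API.
enum class RelayLink { OPERATOR_MAIN, AUXILIARY_IMS_DATA_CHANNEL, AUXILIARY_OTHER }

data class Packet(val link: RelayLink, val payload: ByteArray)

class FirstDeviceRelay(private val transmitToSecondDevice: (Packet) -> Unit) {
    // Call data of the operator call travels on the main link.
    fun sendCallData(callData: ByteArray) =
        transmitToSecondDevice(Packet(RelayLink.OPERATOR_MAIN, callData))

    // The first data received from the third device can be multiplexed onto the main link
    // or carried on an auxiliary link (e.g., the IMS data channel or another physical link).
    fun relayFirstData(firstData: ByteArray, auxiliaryAvailable: Boolean) {
        val link = if (auxiliaryAvailable) RelayLink.AUXILIARY_IMS_DATA_CHANNEL else RelayLink.OPERATOR_MAIN
        transmitToSecondDevice(Packet(link, firstData))
    }
}
```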
  • the method also includes: sending a fourth message to the network device, the fourth message including identification information of the second device; receiving a session identifier of the second device sent by the network device based on the fourth message; sending the session identifier of the second device to the third device, the session identifier of the second device is used to establish a first link between the third device and the second device, and the first link is used for the third device to send the first data to the second device.
  • the identification information includes a phone number, an over-the-top OTT identification ID, and a network account.
  • the third device can obtain the session identifier of the second device with the help of the first device, and thus establish a first link with the second device based on the session identifier of the second device for transmitting shared data. Even if the third device cannot communicate with the network device, real-time sharing can be performed, further broadening the application scenarios.
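  • The addressing step above (fourth message, returned session identifier, hand-off to the third device) might look roughly like this; the directory interface and all names are assumptions for illustration only:

```kotlin
// Illustrative addressing sketch: the first device resolves the second device's session identifier
// through a network device and hands it to the third device, which then establishes the first link.
data class SessionId(val value: String)

interface NetworkDirectory {
    // "Fourth message": look up a session identifier by identification info
    // (e.g., a phone number, an OTT ID, or a network account).
    fun resolveSession(identification: String): SessionId?
}

class FirstDeviceAddressing(
    private val directory: NetworkDirectory,
    private val sendToThirdDevice: (SessionId) -> Unit,
) {
    fun shareSessionOfSecondDevice(secondDeviceIdentification: String): Boolean {
        val session = directory.resolveSession(secondDeviceIdentification) ?: return false
        sendToThirdDevice(session) // the third device uses it to set up the first link to the second device
        return true
    }
}
```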
  • the method further includes: receiving a fourth user operation; sending a fifth message to the third device, wherein the fifth message is used to instruct the third device to share second data with the second device, wherein the second data includes files and/or locations of the third device.
  • the file and/or location of the third device may also be replaced by at least one of the following: a link displayed by the third device, content on the clipboard, data collected by the camera, or data collected by the microphone.
  • the third device can share a variety of data with the second device, which can meet the sharing needs of different users and different application scenarios, further broaden the application scenarios, and enhance the user experience.
  • the method also includes: disconnecting the link of the operator call with the second device; receiving a fifth user operation; sending a sixth message to the third device, wherein the sixth message is used to instruct the third device to send third data to the second device, and the third data includes at least one of the following: multimedia data related to the interface displayed by the third device, files of the third device, and the location of the third device.
  • the third device can also share data with the second device, further broadening the application scenarios and improving the user experience.
  • the method also includes: receiving a sixth user operation before the first device and the second device conduct the operator call; sending a seventh message to the third device, wherein the seventh message is used to instruct the third device to send fourth data to the second device, and the fourth data includes at least one of the following: multimedia data related to the interface displayed by the third device, files of the third device, and the location of the third device.
  • the third device may also share data with the second device, further broadening the application scenarios and improving the user experience.
  • the first data is sent by the third device to the second device through a first link between the third device and the second device, and the first link includes at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity Wi-Fi link, a vehicle-to-vehicle wireless communication V2X link, a satellite link, and a device-to-device D2D link.
  • the third device and the second device can establish a new first link to transmit the first data, instead of relying on the link between the first device and the second device. In this way, even if the link between the first device and the second device is not established, is disconnected, or has poor transmission quality, the real-time sharing process between the third device and the second device can still proceed normally, thereby enhancing the stability and fault tolerance of the sharing function and improving the user experience.
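  • One possible way, sketched with invented names, for the third device to pick a first link to the second device from the candidate link types listed above, falling back when a link cannot be established:

```kotlin
// Illustrative link-selection sketch; the candidate list mirrors the link types named above.
enum class LinkType { CELLULAR, AUXILIARY, BLUETOOTH, WIFI, V2X, SATELLITE, D2D }

class FirstLinkSelector(private val tryEstablish: (LinkType) -> Boolean) {
    // Try candidates in a preferred order; return the first link that comes up, or null if none do.
    fun establishFirstLink(
        preference: List<LinkType> = listOf(
            LinkType.WIFI, LinkType.CELLULAR, LinkType.D2D,
            LinkType.BLUETOOTH, LinkType.V2X, LinkType.SATELLITE, LinkType.AUXILIARY,
        ),
    ): LinkType? = preference.firstOrNull { tryEstablish(it) }
}

// Usage example: sharing continues over the first link even if the first-to-second device link is poor.
fun main() {
    val selector = FirstLinkSelector(tryEstablish = { it == LinkType.CELLULAR }) // pretend only cellular works
    println("first link: ${selector.establishFirstLink()}")                      // -> first link: CELLULAR
}
```

  • The preference order shown is arbitrary; the application does not prescribe one.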
  • the present application provides another sharing method, which is applied to a third device, and the method includes: displaying a first interface; receiving a first user operation, or receiving a first message sent by the first device, the first message being sent by the first device after receiving a second user operation, the first message being used to instruct the third device to share first data with the second device, the third device being any one of the following: a device discovered by the first device, a device connected to the first device, a device in which identification information is stored by the first device, and a device recognized by the first device based on a captured image; sending the first data to the second device, and/or sending the first data to the first device so that the first device sends the first data to the second device, the second device being a device conducting an operator call with the first device, the first data including multimedia data related to the first interface, and the first data being used for the second device to output multimedia data related to the first interface.
  • when the first device and the second device are conducting an operator call, the third device can share multimedia data with the second device based on the received first user operation, or the first device can, based on the received second user operation, instruct a known third device to share multimedia data with the second device, so that the second device outputs the multimedia data.
  • the second device and the third device are not required to conduct an operator call (for example, the operator call between the first device and the second device is not required to be migrated to the third device), and the second device and the third device do not need to install specific applications.
  • the first interface includes a first floating window
  • the first user operation includes a user operation on a sharing control in the first floating window
  • the second user operation includes a user operation on a sharing control in a second floating window displayed by the first device.
  • the first user operation is a user operation of sliding along a first track.
  • the second user operation is a user operation of sliding along a second track.
  • the sending of the first data to the second device, and/or the sending of the first data to the first device so that the first device sends the first data to the second device includes: displaying a second interface, the second interface including information of multiple devices; receiving a third user operation acting on the second device among the multiple devices; sending the first data to the second device, and/or the sending of the first data to the first device so that the first device sends the first data to the second device.
  • in this way, the shared device (i.e., the second device) to receive the shared data may be determined in response to a user operation, which makes the user's use more flexible and improves the user experience.
  • the multiple devices include at least one of the following: a device discovered by the first device, a device connected to the first device, a device with which the first device had a most recent operator call, a device whose identification information is stored by the first device, and a device recognized by the first device based on a captured image.
  • the devices discovered by the first device include: devices discovered by the first device through near field communication or far field communication.
  • the device to which the first device is connected includes: a device to which the first device is connected via a near field communication method or a far field communication method.
  • the device with which the first device last conducted a carrier call may also be replaced by: the device with which the first device was last connected.
  • the identification information includes at least one of the following: an Internet Protocol IP address, a Media Access Control MAC address, a serial number SN, an International Mobile Equipment Identity IMEI, a telephone number, and a personal account of a chat application.
  • the captured images include images captured by the first device and/or images captured by a device to which the first device is connected.
  • the sending of the first data to the second device, and/or the sending of the first data to the first device so that the first device sends the first data to the second device includes: displaying a third interface, the third interface including information of multiple windows; receiving a fourth user operation acting on a first window among the multiple windows, the first window including content of the first interface; sending the first data to the second device, and/or sending the first data to the first device so that the first device sends the first data to the second device.
  • the content for sharing (i.e., the first data) may be determined in response to a user operation, which makes the user's use more flexible and improves the user experience.
  • the multiple windows include at least one of the following: a window of a foreground application of the third device, a window of a background application of the third device, and a window of an application that is installed but not running on the third device.
  • the sharing content available for users to choose can be multimedia data of foreground applications, multimedia data of background applications, or multimedia data of installed but not running applications, thereby meeting users' needs for real-time sharing of different multimedia data and improving user experience.
  • sending the first data to the second device includes: sending the first data to the second device through a first link, wherein the first link includes at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity Wi-Fi link, a vehicle-to-vehicle wireless communication V2X link, a satellite link, and a device-to-device D2D link.
  • the first link includes at least one of the following: a cellular communication link, an auxiliary link, a Bluetooth link, a wireless fidelity Wi-Fi link, a vehicle-to-vehicle wireless communication V2X link, a satellite link, and a device-to-device D2D link.
  • the third device and the second device can establish a new first link to transmit the first data, instead of relying on the link between the first device and the second device. In this way, even if the link between the first device and the second device is not established, is disconnected, or has poor transmission quality, the real-time sharing process between the third device and the second device can still proceed normally, thereby enhancing the stability and fault tolerance of the sharing function and improving the user experience.
  • the method also includes: sending a second message to the network device, the second message including identification information of the second device; receiving a session identifier of the second device sent by the network device based on the second message; and establishing the first link with the second device based on the session identifier of the second device.
  • the identification information includes a phone number, an over-the-top OTT identification ID, and a network account.
  • in this way, even if the third device does not originally store the session identifier of the second device, it can obtain the session identifier of the second device through the existing identification information of the second device, thereby establishing the first link with the second device.
  • the types of identification information are diverse, which increases the probability of successfully establishing an auxiliary link and has a wider range of application scenarios.
  • the method also includes: sending a third message to the first device, wherein the third message is used to request a session identifier of the second device; receiving a session identifier of the second device sent by the first device; and establishing the first link with the second device based on the session identifier of the second device.
  • the third device can obtain the session identifier of the second device with the help of the first device, thereby establishing a first link with the second device based on the session identifier of the second device for transmitting shared data. Even if the third device cannot directly obtain the session identifier of the second device, real-time sharing can be performed, further broadening the application scenarios.
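  • The two ways the third device can obtain the second device's session identifier (directly from the network device, or with the help of the first device) could be sketched as follows; both callback names are hypothetical:

```kotlin
// Illustrative sketch of the third device resolving the second device's session identifier.
data class SessionId(val value: String)

class ThirdDeviceAddressing(
    private val askNetworkDevice: (identification: String) -> SessionId?,  // "second message" path
    private val askFirstDevice: () -> SessionId?,                          // "third message" path
) {
    // Prefer resolving directly; fall back to the first device when the network device is unreachable.
    fun obtainSessionOfSecondDevice(identification: String): SessionId? =
        askNetworkDevice(identification) ?: askFirstDevice()
}
```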
  • the method further includes: receiving a fifth user operation, or receiving a fourth message sent by the first device, the fourth message being sent by the first device after receiving a sixth user operation, the fourth message being used to instruct the third device to share the second data with the second device; sending the second data to the second device, and/or sending the second data to the first device so that the first device sends the second data to the second device, where the second data includes a file and/or a location of the third device.
  • the file and/or location of the third device may also be replaced by at least one of the following: a hyperlink displayed by the third device, content on a clipboard, data collected by a camera, or data collected by a microphone.
  • the third device can share a variety of data with the second device, which can meet the sharing needs of different users and different application scenarios, further broaden the application scenarios, and enhance the user experience.
  • the method further includes: after the operator call ends, receiving a seventh user operation; sending third data to the second device, the third data including at least one of the following: multimedia data related to the interface displayed by the third device, files of the third device, and the location of the third device.
  • the third device can also share data with the second device, further broadening the application scenarios and improving the user experience.
  • the method also includes: receiving an eighth user operation before the operator call starts; sending fourth data to the second device, the fourth data including at least one of the following: multimedia data related to the interface displayed by the third device, files of the third device, and the location of the third device.
  • the third device may also share data with the second device, further broadening the application scenarios and improving the user experience.
  • the present application provides an electronic device, including a transceiver, a processor and a memory, wherein the memory is used to store a computer program, and the processor calls the computer program to execute the sharing method in any possible implementation of any of the above aspects.
  • the present application provides a computer storage medium storing a computer program, which, when executed by a processor, implements the sharing method in any possible implementation of any of the above aspects.
  • the present application provides a computer program product, which, when executed on an electronic device, enables the electronic device to execute the sharing method in any possible implementation of any of the above aspects.
  • the present application provides an electronic device, the electronic device comprising an apparatus or module for executing the method of any implementation of the present application.
  • the above electronic device is, for example, a chip.
  • FIG. 1 is a schematic diagram of the architecture of a sharing system provided by the present application.
  • FIG. 2A is a schematic diagram of the hardware structure of an electronic device provided by the present application.
  • FIG. 2B is a schematic diagram of a software architecture of an electronic device provided by the present application.
  • FIG. 3A is a schematic diagram of the architecture of another sharing system provided by the present application.
  • FIG. 3B is a schematic diagram of the architecture of another sharing system provided by the present application.
  • FIG. 4 to FIG. 11, FIG. 12A, and FIG. 12B are schematic diagrams of some user interface embodiments provided by the present application.
  • FIG. 13 is a flow chart of a sharing method provided by the present application.
  • FIG. 14 is a flow chart of another sharing method provided by the present application.
  • FIG. 15 is a schematic diagram of a coding negotiation process provided by the present application.
  • FIG. 16 is a schematic diagram of an encoding and decoding process provided by the present application.
  • FIG. 17 is a schematic flow chart of an addressing process provided by the present application.
  • FIG. 18 is a schematic diagram of the architecture of another sharing system provided by the present application.
  • FIG. 19 is a schematic diagram of the hardware structure of another electronic device provided by the present application.
  • the terms "first" and "second" are used for descriptive purposes only and are not to be understood as suggesting or implying relative importance or implicitly indicating the number of the indicated technical features.
  • a feature defined as "first" or "second" may explicitly or implicitly include one or more of the features, and in the description of the embodiments of the present application, unless otherwise specified, "plurality" means two or more.
  • for example, two users can make a voice call or a video call (referred to as an audio/video call) through a chat application on their respective devices, wherein user 1 uses device 1 and user 2 uses device 2.
  • user 1 can migrate the audio/video call to a nearby device 3 (with the chat application installed), so that user 1 can make the audio/video call with user 2 through the chat application on device 3, and user 1 can operate device 3 to share the screen of device 3 with device 2.
  • This screen sharing has great limitations, including at least the following prerequisites: device 3 has a chat application installed, device 3 needs to be connected to the Internet/application server of the chat application (to communicate through the chat application), the audio/video call needs to be migrated to device 3, device 3 and device 2 need to maintain the audio/video call (which can be called in a call state), the user needs to operate device 3 to share the screen, and device 1 and device 3 remain connected.
  • the present application provides a sharing method, which is applied to a sharing system 10.
  • the sharing system 10 may include an electronic device 100, an electronic device 200, and an electronic device 300.
  • the electronic device 200 and the electronic device 300 can communicate with the electronic device 100.
  • the user can share the software resources and/or hardware resources (referred to as software and hardware resources) on the electronic device 300 with the electronic device 200 by operating the electronic device 100 or the electronic device 300.
  • the electronic device 100 and the electronic device 200 conduct a new call (NewTalk), and the electronic device 100 discovers and/or connects to the electronic device 300 through a near field communication method or a far field communication method.
  • the new call may be migrated to the electronic device 300, or it may not be migrated to the electronic device 300.
  • the above sharing is performed after the new call between the electronic device 100 and the electronic device 200 ends, and in other examples, the above sharing is performed when the electronic device 100 and the electronic device 200 do not have a new call, wherein the new call may include but is not limited to any of the following: operator call (also called phone call), over the top (OTT) call, device to device (D2D) call, satellite call, vehicle to X (V2X) call, etc.
  • an OTT call can be a service that bypasses the operator and provides various audio, video and other data services based on the open Internet, such as audio/video calls implemented by chat applications and audio/video conferences implemented by conference applications.
  • sharing the software and hardware resources of the electronic device 300 with the electronic device 200 does not require the electronic device 100 and the electronic device 200 to have a new call, nor does it require the new call to be migrated to the electronic device 300, which greatly broadens the application scenarios of the above sharing.
  • the electronic device 300 can be a nearby device or a far-field device of the electronic device 100.
  • the electronic device 300 is a device discovered/connected by the electronic device 100 through a distributed communication method (for example, realized through a soft bus channel).
  • the electronic device 300 does not need to install a chat application, nor does it need to be connected to the application server of the Internet/chat application, further broadening the application scenario of the above-mentioned sharing.
  • the user operation (which can be referred to as a trigger operation) for triggering the sharing of the software and hardware resources of the electronic device 300 can be input to the electronic device 100, or it can be input to the electronic device 300.
  • the software and hardware resources of the electronic device 300 can be shared with the electronic device 200 by operating the electronic device 100, further broadening the application scenario of the above-mentioned sharing.
  • the communication method for sharing the software and hardware resources of the electronic device 300 with the electronic device 200 may be, but is not limited to: a link between the electronic device 100 and the electronic device 200, and/or a link between the electronic device 300 and the electronic device 200. The implementation methods are diverse rather than being limited to the application server of the chat software, further broadening the application scenarios of the above-mentioned sharing.
  • software resources may include but are not limited to multimedia data streams, files, locations, hyperlinks, content on the clipboard, etc.
  • hardware resources may include but are not limited to cameras, microphones, etc.
  • the above files may include but are not limited to documents, picture files, video files, software (such as applications and applets, etc.) installation packages or other executable files
  • the above multimedia data streams may include, but are not limited to, multimedia data streams of desktops, screens, and software (such as applications and applets, etc.).
  • the above multimedia data streams may include image data (multi-frame images can be called video streams) and audio data (multi-frame audio can be called audio streams).
  • Sharing a camera can include sharing images captured by the camera, such as a video stream.
  • Sharing a microphone can include sharing voices captured by the microphone, such as an audio stream.
  • the data available for sharing in this application is diverse and covers a large range, which can meet different data sharing needs.
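  • As a purely illustrative data model (names assumed, not taken from the application), the shareable software and hardware resources enumerated above can be grouped under one hierarchy, which is convenient when a single sharing pipeline must carry several kinds of data:

```kotlin
// Illustrative catalogue of shareable resources; not an API of the application.
sealed interface SharedResource {
    // Software resources
    data class MultimediaStream(val videoFrames: List<ByteArray>, val audioFrames: List<ByteArray>) : SharedResource
    data class FileResource(val name: String, val bytes: ByteArray) : SharedResource
    data class Location(val latitude: Double, val longitude: Double) : SharedResource
    data class Hyperlink(val url: String) : SharedResource
    data class ClipboardContent(val text: String) : SharedResource
    // Hardware resources, shared via the data they capture
    data class CameraFeed(val videoFrames: List<ByteArray>) : SharedResource
    data class MicrophoneFeed(val audioFrames: List<ByteArray>) : SharedResource
}
```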
  • the present application can provide users with simpler, more convenient, richer and more intelligent sharing functions, greatly reducing the limitations of the sharing functions and effectively meeting the sharing needs of users in different scenarios.
  • the electronic device 300 can be called the sharing device (the corresponding user can be called the sharing user), the electronic device 200 can be called the shared device (the corresponding user can be called the shared user), and the software and hardware resources can be called shared data.
  • the sharing device is used to provide shared data, and the shared device is used to receive shared data. It can be understood that the sharing device/sharing user and the shared device/shared user are relative role concepts rather than physical concepts.
  • a device/user can have different roles in different sharing scenarios. For example, a device/user can have different roles at different times, or a device/user can have different roles relative to different devices.
  • the device for receiving the trigger operation can be called a sharing initiator.
  • the sharing initiator can be the electronic device 100 or the electronic device 300.
  • the shared device can also be called a sharing receiver.
  • the sharing initiator and the sharing receiver are also relative role concepts. The specific examples are similar to the above-mentioned examples of sharing devices and shared devices.
  • an electronic device can run at least one application, and the application that is visible and interactive to the user in the at least one application can be called a foreground application.
  • the electronic device can display the user interface of the foreground application, which can also be referred to as the electronic device running the application in the foreground.
  • the application in the at least one application that is not visible to the user and cannot be interacted with by the user can be referred to as a background application.
  • the electronic device will not display the user interface of the background application, but will still run the background application, which can also be referred to as the electronic device running the application in the background. It can be understood that foreground applications and background applications are role concepts, not physical concepts, and an application can have different roles in different scenarios.
  • nearby devices of an electronic device may include, but are not limited to, devices discovered/connected by the electronic device through near-field communication technologies such as Bluetooth, wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi)), D2D, near field communication (NFC), ultra wide band (UWB), infrared, etc.
  • Nearby devices may include devices discovered but not connected by the electronic device, and/or devices that the electronic device is already connected to. The present application does not limit the specific content of near-field communication technology.
  • the far-field device of the electronic device may include, but is not limited to, devices discovered/connected by the electronic device through far-field communication technologies such as WLAN, satellite, and cellular communication.
  • Far-field devices may include devices discovered but not connected by the electronic device, and/or devices that the electronic device has connected to. This application does not limit the specific content of the far-field communication technology.
  • D2D can be implemented, but not limited to, through cellular communication or Wi-Fi.
  • D2D can allow electronic devices (such as terminals) to communicate directly by reusing cell resources under the control of the communication system.
  • D2D communication can realize resource sharing between cell users under the control of the cell network, thereby increasing the spectrum efficiency of the cellular communication system, reducing the transmission power of electronic devices, alleviating the burden of the cellular communication network and reducing the battery power consumption of electronic devices.
  • the sharing system 10 involved in the embodiment of the present application is introduced below.
  • FIG. 1 exemplarily shows a schematic diagram of the architecture of a sharing system 10 .
  • the sharing system 10 may include an electronic device 100 , an electronic device 200 and an electronic device 300 .
  • the electronic device 200 and the electronic device 300 may be any two devices that the electronic device 100 has discovered, connected to, stored device information of, or obtained in other ways.
  • the electronic device 100 and the electronic device 200 can communicate through a first link
  • the electronic device 100 and the electronic device 300 can communicate through a second link
  • the electronic device 200 and the electronic device 300 can communicate through a third link.
  • any one of the above-mentioned first link, second link and third link can be implemented through a near field communication mode and/or a far field communication mode
  • the near field communication mode includes, but is not limited to, soft bus, Bluetooth, WLAN (such as Wi-Fi), D2D, NFC, UWB, infrared, etc.
  • the far field communication mode includes, but is not limited to, WLAN (such as Wi-Fi), satellite, cellular communication, etc.
  • the far field communication mode includes, but is not limited to, peer-to-peer (P2P) transmission through network address translation (NAT), relay transmission through a transfer device (such as NAT relay, the transfer device is a network device such as a server), and software transmission through real time communication (RTC) (for example, low-latency
  • Any one of the first link, the second link and the third link includes, for example but not limited to, at least one of the following links: a cellular communication link, a satellite link, a Wi-Fi link, a D2D link, a Bluetooth link, a soft bus channel, etc.
  • the electronic device 100 or the electronic device 300 can respond to user operations and send the software and hardware resources (shared data) of the electronic device 300 to the electronic device 200 through the first link and/or the third link, and the electronic device 200 can output corresponding content (for example, play audio/video), wherein the above-mentioned sending through the first link can also be referred to as sending through the electronic device 100 (relay), for example, the electronic device 300 can first send the shared data to the electronic device 100 through the second link, and then the electronic device 100 sends the shared data to the electronic device 200 through the first link.
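  • A rough sketch, under assumed names, of the two delivery paths described above: shared data either goes directly over the third link, or is handed to the electronic device 100 over the second link so that it can be forwarded to the electronic device 200 over the first link.

```kotlin
// Illustrative routing sketch for shared data in the sharing system 10; names are assumptions.
enum class Route { DIRECT_THIRD_LINK, RELAY_VIA_DEVICE_100 }

class SharedDataSender(
    private val sendOnThirdLink: (ByteArray) -> Boolean,   // electronic device 300 -> electronic device 200
    private val sendOnSecondLink: (ByteArray) -> Boolean,  // electronic device 300 -> electronic device 100 (relay hop)
) {
    // Prefer the direct third link; otherwise hand the data to electronic device 100,
    // which forwards it to electronic device 200 over the first link.
    fun send(sharedData: ByteArray): Route? = when {
        sendOnThirdLink(sharedData) -> Route.DIRECT_THIRD_LINK
        sendOnSecondLink(sharedData) -> Route.RELAY_VIA_DEVICE_100
        else -> null
    }
}
```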
  • Any electronic device in the sharing system 10 can be a mobile phone, a tablet computer, a handheld computer, a desktop computer, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), a smart home device such as a smart large screen or a smart speaker, a wearable device such as a smart bracelet, a smart watch, or smart glasses, an extended reality (XR) device such as an augmented reality (AR), virtual reality (VR), or mixed reality (MR) device, a vehicle-mounted device, or a smart city device.
  • FIG1 illustrates the first link as a NewTalk link.
  • the NewTalk link may include any of the following: an operator call link, an OTT call link, a D2D call link, a satellite call link, a V2X call link, etc.
  • the operator call link can be established through cellular communication.
  • the OTT link can be established through near-field communication and/or far-field communication.
  • the D2D call link can be established through cellular communication or Wi-Fi communication.
  • the satellite call link can be established through satellite communication.
  • the V2X call link can be established through near-field communication and/or far-field communication.
  • a NewTalk link may include a main link and an auxiliary link.
  • the main link of NewTalk includes a channel of a quality of service (QoS) class identifier (QCI) 5 (used to transmit call signaling) and/or a channel of QCI1/QCI2 (used to transmit audio/video).
  • the auxiliary link of NewTalk includes an Internet protocol (IP) multimedia system (IMS) data channel (IMS Data channel), which is a data transmission channel other than the main link based on an IMS proprietary bearer.
  • the auxiliary link of NewTalk may include at least one physical link, any of which may be implemented by any near-field communication method or far-field communication method.
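  • A compact, assumption-based model of the NewTalk link structure described above (the QCI values and channel roles follow the text; all type names are invented):

```kotlin
// Illustrative model of a NewTalk link: a main link with QCI-classified channels plus optional auxiliary links.
enum class MainChannel(val qci: Int, val carries: String) {
    SIGNALING(5, "call signaling"),  // QCI 5 channel of the main link
    VOICE(1, "audio"),               // QCI 1 channel of the main link
    VIDEO(2, "video"),               // QCI 2 channel of the main link
}

sealed interface AuxiliaryLink {
    object ImsDataChannel : AuxiliaryLink                       // IMS data channel on an IMS proprietary bearer
    data class PhysicalLink(val kind: String) : AuxiliaryLink   // e.g., "Wi-Fi", "Bluetooth", "D2D", "satellite"
}

data class NewTalkLink(
    val mainChannels: List<MainChannel> = MainChannel.values().toList(),
    val auxiliaryLinks: List<AuxiliaryLink> = listOf(AuxiliaryLink.ImsDataChannel),
)
```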
  • electronic device 300 is electronic device 301 in FIG. 1 (e.g., a laptop computer), the second link is a Bluetooth link, electronic device 100 and electronic device 301 can log in to the same account, and the NewTalk conducted between electronic device 100 and electronic device 200 can be migrated to electronic device 301.
  • both electronic device 100 and electronic device 301 output prompt information of the request (e.g., play a ringtone to indicate the incoming call), and the user can use electronic device 301 to accept the request and answer the NewTalk.
  • the user can share the screen, files, and other software and hardware resources of electronic device 300 with electronic device 200, and explain the relevant content while demonstrating.
  • the electronic device 300 is the electronic device 302 (e.g., a smart screen) in FIG1
  • the second link is a Wi-Fi link.
  • user 1 can share the screen of the electronic device 300 with the electronic device 200, so that user 2 can use the electronic device 200 to remotely operate the screen of the electronic device 300.
  • the electronic device 300 is the electronic device 303 in FIG. 1 (assuming it is a vehicle), and the electronic device 100 and the electronic device 300 can communicate via V2X, and V2X is implemented, for example but not limited to, via a cellular communication method and/or a near field communication method.
  • the electronic device 303 may include multiple components, such as but not limited to: a display screen of the central control (referred to as the central control screen) 303A (for example, located on the right side of the steering wheel), a display screen of the rear seat (referred to as the rear seat screen) (for example, located on the back of the driver's seat), an instrument display screen (for example, located directly in front of the driver's seat), a camera 303B, and a head-up display system (head up display, HUD) (for example, AR-HUD) (for example, projected on the front window glass) 303C.
  • the components of the electronic device 300 can be shared with the electronic device 200.
  • the display content of the central control screen 303A, the display content of the rear seat screen, the display content of the instrument display screen, the image captured by the camera 303B, and the display content of the AR-HUD can be shared with the electronic device 200.
  • the user can view the vehicle's navigation interface, the vehicle's operating status, actual road conditions and other content through the electronic device 200.
  • the electronic device 300 is the electronic device 304 in FIG. 1 (e.g., a smart watch), and the electronic device 100 and the electronic device 300 can communicate via a cellular communication network.
  • the electronic device 304 is placed at home, and when the user uses the electronic device 100 outside, the user can share the files, songs, and other contents of the electronic device 304 with the electronic device 200.
  • the electronic device 300 may also be a drone, and the audio/video collected by the drone in real time may be shared with the electronic device 200 in real time to achieve the effect of “what I see is what you see”, which is not limited in the present application.
  • any two devices communicate via cellular communication, which may include communication via a cellular communication network.
  • the two devices and the cellular communication network may constitute a cellular communication system, which may be, for example but not limited to, a global system for mobile communications (GSM), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), new radio (NR) or other future network systems.
  • the cellular communication network may include, for example but not limited to, a base station, a core network and a communication line.
  • a base station is a device deployed in a radio access network (RAN) to provide wireless communication functions.
  • the names of base stations may be different, such as but not limited to base transceiver station (BTS) in GSM or CDMA, node B (NB) in WCDMA, evolved node B (eNodeB) in LTE, next generation base station (g node B, gNB) in NR, or base stations in other future network systems.
  • the core network is a key control node in the cellular communication system, mainly responsible for signaling processing functions.
  • the core network equipment includes, but is not limited to, access and mobility management function (AMF) entity, session management function (SMF) entity, user plane function (UPF) entity, etc.
  • Communication lines include, but are not limited to, twisted pair, coaxial cable, and optical fiber.
  • the electronic device may also access the core network through other access systems.
  • the other access system may be based on Wi-Fi, such as based on Wi-Fi voice call (voice over Wi-Fi, VoWiFi) technology, or based on Wi-Fi video call (video over Wi-Fi, ViWiFi) technology.
  • the base station in the present application may also be other access network devices, such as user equipment (UE), access point (AP), transmission and receiver point (TRP), relay device, or other network devices with the functions of a base station, etc., and the present application does not limit this.
  • the forms and quantities of the electronic device 100, the electronic device 200 and the electronic device 300 shown in FIG. 1 are for illustrative purposes only. In other examples, there may be multiple electronic devices 200 and/or there may be multiple electronic devices 300, and this application does not limit this.
  • FIG. 2A exemplarily shows a schematic diagram of the hardware structure of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • Different processing units may be independent devices or integrated into one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signal to complete the control of instruction fetching and execution.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instruction or data again, it may be directly called from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 is charging the battery 142, it can also power the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and powers other modules such as the processor 110.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor, for example, to transmit and share data.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G/6G, etc., applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, and filter, amplify, and process the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some of the functional modules of the mobile communication module 150 can be set in the processor 110.
  • in some embodiments, at least some of the functional modules of the mobile communication module 150 and at least some of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (IR), D2D, V2X, etc., which are applied to the electronic device 100.
  • the wireless communication module 160 can be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signal and filters it, and sends the processed signal to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, modulate the frequency, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements display functions, such as displaying a real-time shared video stream, through a GPU, a display screen 194 , and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 (also referred to as a screen) is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement shooting functions through ISP, camera 193, video codec, GPU, display screen 194 and application processor, for example, it can take portraits for real-time sharing with other devices together with the application's video stream.
  • ISP is used to process the data fed back by camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on the noise, brightness, etc. of the image. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In one embodiment, ISP can be set in camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
  • the files are saved in the external memory card.
  • the internal memory 121 can be used to store computer executable program codes, which include instructions.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
  • the data storage area may store data created during the use of the electronic device 100 (such as audio data, a phone book, etc.), etc.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions, such as playing a real-time shared audio stream, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 can be arranged in the processor 110, or some functional modules of the audio module 170 can be arranged in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music or other real-time shared audio streams, or listen to hands-free calls through the speaker 170A.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with their mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 can be provided with at least one microphone 170C. In another embodiment, the electronic device 100 can be provided with two microphones 170C, which can not only collect sound signals but also implement a noise reduction function. In another embodiment, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions. For example, the audio collected by the microphone 170C in real time can be shared with other devices in real time together with the audio stream of the application.
  • the earphone jack 170D is used to connect a wired earphone.
  • the pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals.
  • the pressure sensor 180A can be set on the display screen 194.
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates made of conductive material.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation based on the pressure sensor 180A.
  • the electronic device 100 can also calculate the position of the touch based on the detection signal of the pressure sensor 180A.
  • the touch sensor 180K is also called a "touch control device”.
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch control screen”.
  • the touch sensor 180K can also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the pressure sensor 180A and/or the touch sensor 180K are used to detect a touch operation applied thereto or thereabout.
  • the pressure sensor 180A and/or the touch sensor 180K may transmit the detected touch operation to the application processor to determine the type of touch event.
  • a visual output related to the touch operation may be provided through the display screen 194.
  • the gyroscope sensor 180B can be used to determine the motion posture of the electronic device 100.
  • the air pressure sensor 180C is used to measure the air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect the temperature.
  • the bone conduction sensor 180M can obtain a vibration signal.
  • the button 190 includes a power button, a volume button, etc.
  • the button 190 may be a mechanical button. It may also be a touch button.
  • the electronic device 100 may receive a button input and generate a key signal input related to the user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration prompt.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status and changes in battery level, or to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
  • the layered architecture software system may be an Android system, a Harmony operating system (HarmonyOS), or other software systems.
  • Fig. 2B exemplarily shows a schematic diagram of a software architecture of an electronic device 100.
  • Fig. 2B takes the Android system of a layered architecture as an example to exemplarily illustrate the software architecture of the electronic device 100.
  • the layered architecture divides the software into several layers, each with a clear role and division of labor.
  • the layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as address book, gallery, Bluetooth, chat, call, short message, browser, music, sharing, short video and video.
  • the sharing application can provide the function of sharing the software and hardware resources of the electronic device 300 (which can communicate with the electronic device 100) with the electronic device 200 (which can also communicate with the electronic device 100). Sharing can be an independent application, or a functional component encapsulated in other applications such as call, Bluetooth, and chat; this application is not limited in this respect.
  • the application package can also be replaced by other forms of software such as applets.
  • the application framework layer provides application programming interface (API) and programming framework for the applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a sharing module, and the like.
  • the window manager is used to manage window programs.
  • the window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying images, etc.
  • the view system can be used to build applications.
  • a display interface can be composed of one or more views.
  • a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, an electronic device vibrates, an indicator light flashes, etc.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function that needs to be called by the Java language, and the other part is the Android core library.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules, such as surface manager, media library, 3D graphics processing library (such as OpenGL ES), 2D graphics engine (such as SGL), etc.
  • the surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, H.265 and other video encoding formats, MP3, AAC, AMR, SBC, LC3, aptX, LDAC, L2HC, WAV, FLAC and other audio encoding formats, JPG, PNG, BMP, GIF and other image encoding formats.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • a 2D graphics engine is a drawing engine for 2D drawings.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the following is an example of the workflow of the software and hardware of the electronic device 100 in conjunction with the scenario of answering a call.
  • when the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation into raw input events (including touch coordinates, timestamp of the touch operation and other information).
  • the raw input events are stored in the kernel layer.
  • the application framework layer obtains the raw input events from the kernel layer and identifies the control corresponding to the input event. For example, if the touch operation is a touch single-click operation and the control corresponding to the single-click operation is the answering control of the call application, the call application calls the interface of the application framework layer, and then starts the audio driver by calling the kernel layer, so as to play the voice information of the other party through the receiver 170B and/or obtain the voice information of the current user through the microphone 170C.
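  • For illustration only, the following minimal Java sketch mirrors the answering workflow described above (raw input event from the kernel layer, control identification in the framework layer, and starting of the audio path by the call application). All class and method names (RawInputEvent, ControlRegistry, CallApp) are hypothetical and are not actual Android framework APIs.

```java
// Minimal sketch of the touch-event workflow; hypothetical names, not framework APIs.
import java.util.HashMap;
import java.util.Map;

final class RawInputEvent {
    final float x, y;        // touch coordinates reported by the kernel layer
    final long timestampMs;  // timestamp of the touch operation
    RawInputEvent(float x, float y, long timestampMs) { this.x = x; this.y = y; this.timestampMs = timestampMs; }
}

interface Control { void onClick(); }

/** Framework-layer dispatcher: maps the raw event to the control under the touch point. */
final class ControlRegistry {
    private final Map<String, Control> controls = new HashMap<>();
    void register(String id, Control c) { controls.put(id, c); }
    // Hit-testing is simplified; a real view system would walk the view hierarchy.
    Control resolve(RawInputEvent e) { return controls.get("answer_call"); }
}

/** Call application: answering starts the audio path (receiver 170B out, microphone 170C in). */
final class CallApp implements Control {
    @Override public void onClick() {
        System.out.println("Answer control clicked -> start audio driver");
        System.out.println("Play remote voice via receiver 170B; capture local voice via microphone 170C");
    }
}

public class TouchWorkflowSketch {
    public static void main(String[] args) {
        ControlRegistry registry = new ControlRegistry();
        registry.register("answer_call", new CallApp());
        RawInputEvent event = new RawInputEvent(120f, 980f, System.currentTimeMillis());
        Control target = registry.resolve(event); // framework layer identifies the control
        if (target != null) target.onClick();     // call application answers, audio driver starts
    }
}
```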
  • the hardware structure/software architecture of the electronic device 200 and the electronic device 300 is similar to the hardware structure/software architecture of the electronic device 100.
  • the software system of the electronic device may include an application processor system (AP) (which may be referred to as an application system) and a wireless communication system.
  • the wireless communication system may include, but is not limited to, at least one of the following: a cellular communication system (e.g., 2G/3G/4G/5G/6G, etc.), a satellite system (e.g., Beidou, Tiantong, Starlink, etc.), a Wi-Fi system, a BT system, an NFC system, a D2D system, etc.
  • the wireless communication system may include a coprocessor (CP) and/or a digital signal processor (DSP), wherein the CP in the terminal may be a baseband chip plus a coprocessor or a multimedia accelerator, the CP may include digital components required for communication with the network, and the CP may include a processor based on a reduced instruction set computer (RISC) microprocessor (advanced RISC machines, ARM) and a DSP.
  • the CP may have an operating system and may communicate with an application processor running an operating system such as Android, iOS, or Windows through a high-speed (HS) serial connection.
  • CP can implement processing logic such as VR, AR, image processing, high fidelity (HiFi), high data rate (HDR), sensor management, etc.
  • CP can also be a cellular modem (cellular processor, CP).
  • Application system can be used to implement control logic such as rendering and presentation of user interface, input and response of user operation, business functions, and playback of multimedia data such as audio/video.
  • FIG. 3A exemplarily shows a schematic diagram of the architecture of another sharing system 10 .
  • the sharing system 10 may include an electronic device 100, an electronic device 200, and an electronic device 300; for details, refer to the description of FIG. 1.
  • the application system (AP) of the electronic device 100 may include, but is not limited to, a sharing module, a NewTalk function module, a video enhancement module, an audio enhancement module, a discovery module, a capture module, a playback module, a file system, a positioning module, a camera, and a link management module. Among them:
  • the sharing module can be used to share the software and hardware resources of the electronic device 300 with the electronic device 200.
  • the sharing module can be used to implement the user experience (UX) display of the sharing function, provide user interaction functions (such as receiving and responding to user input operations), business functions and service logic, etc.
  • the UX display includes, for example but not limited to: a display interface for executing a trigger operation (such as including a control for triggering a sharing function), a display interface for playing shared data, a display interface for selecting shared data (such as including an interface for selecting the software and hardware resources of the electronic device 300), and a display interface for selecting a shared device (i.e., the electronic device 200).
  • the NewTalk function module can be used to implement NewTalk, for example, to implement NewTalk with the electronic device 200.
  • the NewTalk function module includes at least one of the following: a module for implementing operator calls, a module for implementing OTT calls, a module for implementing D2D calls (not shown), a module for implementing satellite calls (not shown), and a module for implementing V2X calls (not shown).
  • the video enhancement module is an optional module that can be used to enhance the video playback effect.
  • the audio enhancement module is an optional module that can be used to enhance the audio playback effect.
  • the discovery module (eg, called Nearby) may be used to discover/connect to nearby devices via near field communication, and/or to discover/connect to far field devices via far field communication.
  • the capture (CaptureStream) module is an optional module that can be used to capture data to be shared.
  • the capture module can obtain the decoded multimedia data stream (which can be played directly) or the multimedia data stream before decoding (such as the generated original data) based on the interface of the application and/or system for sharing.
  • the capture module can directly capture the multimedia data stream before decoding at the system layer. For example, after the electronic device 100 receives the broadcast data sent by the base station through the 3G/4G/5G/6G broadcast module in the wireless communication system, the broadcast data may not be played, but the capture module may obtain the broadcast data for sharing.
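  • As an illustration of the two capture paths described above (obtaining an already-decoded multimedia data stream through an application/system interface, or capturing the not-yet-decoded stream such as broadcast data at the system layer without playing it), the following Java sketch uses hypothetical names (CaptureModule, MediaFrame); it is not an implementation of the embodiments.

```java
// Illustrative sketch of the two capture paths; all names are hypothetical.
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

final class MediaFrame {
    final byte[] payload;
    final boolean decoded;   // true: ready to play; false: raw data captured before decoding
    MediaFrame(byte[] payload, boolean decoded) { this.payload = payload; this.decoded = decoded; }
}

final class CaptureModule {
    private final BlockingQueue<MediaFrame> shareQueue = new LinkedBlockingQueue<>();

    /** Path 1: an application/system interface hands over frames that are already decoded. */
    void captureDecoded(byte[] decodedFrame) {
        shareQueue.offer(new MediaFrame(decodedFrame, true));
    }

    /** Path 2: broadcast data from the wireless communication system is captured before decoding
     *  and queued for sharing directly, without being played locally. */
    void captureBroadcast(byte[] encodedBroadcastData) {
        shareQueue.offer(new MediaFrame(encodedBroadcastData, false));
    }

    MediaFrame nextFrameForSharing() throws InterruptedException { return shareQueue.take(); }
}
```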
  • the PlayStream module is an optional module that can be used to output shared data.
  • the PlayStream module decodes and plays shared data in audio/video format.
  • the File System is an optional module that can be used to store and read files of at least one format.
  • the positioning module is an optional module, which can be used to obtain positioning information (such as GPS coordinates), such as obtaining real-time positioning information.
  • Camera is an optional module that can be used to capture images.
  • the link management (Link Manager) module can be used to uniformly manage communication links, for example, but not limited to, to perform operations such as establishing communication links, transmitting data based on communication links, and releasing communication links.
  • the link management module may include a NewTalk link module (used to manage at least one of the following: operator call links, OTT call links, D2D call links, satellite call links, V2X call links, etc.), a Wi-Fi link module (used to manage at least one of the following: Wi-Fi broadcast links, Wi-Fi multicast links, and Wi-Fi unicast links), a Bluetooth link module (used to manage Bluetooth broadcast links, Bluetooth multicast links, and Bluetooth unicast links), a D2D link module (used to manage D2D links), and a satellite link module (used to manage satellite links).
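  • The following Java sketch illustrates, under assumed names (LinkManager, LinkType, Link), the unified management of communication links described above: establishing a link, transmitting data over it, and releasing it. A real implementation would delegate to the NewTalk/Wi-Fi/Bluetooth/D2D/satellite link modules.

```java
// Unified link management sketch; all names are hypothetical.
import java.util.EnumMap;
import java.util.Map;

enum LinkType { NEWTALK, WIFI, BLUETOOTH, D2D, SATELLITE }

interface Link {
    void send(byte[] data);
    void close();
}

final class LinkManager {
    private final Map<LinkType, Link> activeLinks = new EnumMap<>(LinkType.class);

    /** Establish a link of the requested type to a peer; a real implementation would
     *  delegate to the corresponding link module (NewTalk link module, Wi-Fi link module, ...). */
    Link establish(LinkType type, String peerId) {
        Link link = new Link() {
            @Override public void send(byte[] data) {
                System.out.println("send " + data.length + " bytes to " + peerId + " over " + type);
            }
            @Override public void close() {
                System.out.println("release " + type + " link to " + peerId);
            }
        };
        activeLinks.put(type, link);
        return link;
    }

    void transmit(LinkType type, byte[] data) {
        Link link = activeLinks.get(type);
        if (link != null) link.send(data);
    }

    void release(LinkType type) {
        Link link = activeLinks.remove(type);
        if (link != null) link.close();
    }
}
```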
  • the wireless communication system of the electronic device 100 in the sharing system 10 may include, but is not limited to, a cellular communication module, a Wi-Fi communication module, a BT communication module, and a satellite communication module, wherein:
  • the cellular communication module may include an Internet protocol (IP) multimedia system (IMS) communication module, a circuit switched (CS) communication module and a 3G/4G/5G/6G broadcast module, wherein the IMS communication module may be, but is not limited to, implementing IMS protocol-based calls such as LTE voice call (VoLTE), LTE video call (ViLTE), NR voice call (VoNR), NR video call (ViNR), VoWiFi, ViWiFi, and evolved packet system fallback (EPS-Fallback).
  • the CS communication module may provide a CS call function and/or a CS fallback function.
  • the 3G/4G/5G/6G broadcast module may be used to monitor the broadcast channel of 3G/4G/5G/6G.
  • the electronic device 100 may be in the coverage area of at least one base station, and any one of the at least one base station may send broadcast data (e.g., multimedia data stream) to the electronic device (including the electronic device 100) in the coverage area through a broadcast channel.
  • Any base station may maintain at least one channel, and the broadcast data corresponding to different channels may be different.
  • the user may select the channel corresponding to the received broadcast data through the electronic device 100.
  • the electronic device 100 may receive the broadcast data sent by the base station through the 3G/4G/5G/6G broadcast module, the received broadcast data may be played through a system application (e.g., a call application) or a third-party application (e.g., a chat application), and the electronic device 100 may share the played content with other devices.
  • the electronic device 100 may not play the received broadcast data, but directly share the received broadcast data with other devices, or share the processed broadcast data with other devices.
  • the Wi-Fi communication module may include a hardware module for Wi-Fi communication, such as firmware and a chip.
  • the BT communication module may include a hardware module for BT communication, such as firmware and a chip.
  • the satellite communication module may include a hardware module for satellite communication, such as firmware and chips.
  • FIG. 3A illustrates an example in which the first link between the electronic device 100 and the electronic device 200 is a NewTalk link, and the second link between the electronic device 100 and the electronic device 300 is a link established by near field communication.
  • the software system of the electronic device 200 is similar to the software system of the electronic device 100, except that the sharing module in the electronic device 200 is an optional module, and the sharing module of the electronic device 200 is used to share the software and hardware resources of the electronic device 200 with the electronic device 100 or the electronic device 300.
  • a first link can be established between the NewTalk function module of the electronic device 100 and the NewTalk function module of the electronic device 200, and signaling transmission can be performed based on the first link.
  • the software system of the electronic device 300 is similar to the software system of the electronic device 100, except that the electronic device 300 may not include the NewTalk function module and the cellular communication module.
  • a second link may be established between the discovery module of the electronic device 100 and the discovery module of the electronic device 300, and signaling transmission may be performed based on the second link.
  • the NewTalk functional module may be integrated with the sharing module.
  • the NewTalk functional module includes the sharing module, and the sharing module is used to implement the sharing function based on NewTalk.
  • FIG. 3B exemplarily shows a schematic diagram of the architecture of yet another sharing system 10 .
  • the sharing system 10 may include an electronic device 100 (not shown), an electronic device 200, and an electronic device 300.
  • the functions and possible implementations of some modules of the electronic device 100, the electronic device 200, and the electronic device 300 may refer to the description of the software system of the electronic device in the aforementioned embodiments, such as the description of Figures 1 and 3A.
  • Figure 3B takes the second link between electronic device 300 and electronic device 100 as a Wi-Fi link, and the first link between electronic device 200 and electronic device 100 as a NewTalk link as an example for illustration.
  • Figure 3B takes electronic device 300 as a sharing device, and electronic device 200 as a shared device as an example for illustration.
  • the application system (AP) of the electronic device 300 can be divided into three layers, namely, from top to bottom, the application framework layer (framework, FW), the hardware abstraction layer (HAL) and the kernel layer (kernel).
  • the application framework layer can include a sharing module, an audio enhancement module, a video enhancement module, a capture module, a playback module, a discovery module, a camera, a positioning module, a file system, a NewTalk function module (optional), a Wi-Fi function module, a D2D function module (optional), a BT function module (optional), a satellite function module (optional) and a link management module.
  • the sharing module of the application framework layer may include a software sharing module, a hardware sharing module, a codec module, a member management module, a data processing module, a transmission module, a quality module and a security module.
  • the software sharing module can be used to share the software resources of the electronic device 300 with the electronic device 200
  • the hardware sharing module can be used to share the hardware resources of the electronic device 300 with the electronic device 200.
  • the codec module (Codec) can be used to implement the encoding and decoding of shared data (such as audio stream/video stream).
  • the member management (Member Manager) module is used to manage the members who share, such as but not limited to viewing, adding, and deleting shared devices/shared users.
  • the member management module can manage the members who share through identification information such as the device address (e.g., an IP address) and the user name, but is not limited thereto.
  • the data processing module can be used to implement at least one data processing strategy, such as but not limited to slicing, aggregation and redundancy.
  • the transmission module can be used to manage the function implementation of at least one transmission method for implementing the sharing function.
  • the quality module can be used to manage and/or control the quality of experience (QoE) of the user who is sharing.
  • the security module can be used to implement security functions related to the sharing function, such as but not limited to certificate authentication, encryption/decryption, and permission management of shared data.
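  • For the data processing strategies named above (slicing, aggregation, redundancy), the following Java sketch shows one simple possibility; the fixed slice size and the duplication-based redundancy are assumptions for illustration, since the embodiments do not mandate a concrete scheme.

```java
// Slicing, aggregation and redundancy sketch; parameters are illustrative assumptions.
import java.util.ArrayList;
import java.util.List;

final class DataProcessingModule {
    private final int sliceSize;
    DataProcessingModule(int sliceSize) { this.sliceSize = sliceSize; }

    /** Slicing: split one shared-data frame into fixed-size packets for transmission. */
    List<byte[]> slice(byte[] frame) {
        List<byte[]> packets = new ArrayList<>();
        for (int off = 0; off < frame.length; off += sliceSize) {
            int len = Math.min(sliceSize, frame.length - off);
            byte[] p = new byte[len];
            System.arraycopy(frame, off, p, 0, len);
            packets.add(p);
        }
        return packets;
    }

    /** Aggregation: reassemble received packets back into one frame (inverse of slicing). */
    byte[] aggregate(List<byte[]> packets) {
        int total = packets.stream().mapToInt(p -> p.length).sum();
        byte[] frame = new byte[total];
        int off = 0;
        for (byte[] p : packets) { System.arraycopy(p, 0, frame, off, p.length); off += p.length; }
        return frame;
    }

    /** Redundancy: here each packet is simply duplicated; a real system might use FEC instead. */
    List<byte[]> addRedundancy(List<byte[]> packets) {
        List<byte[]> out = new ArrayList<>();
        for (byte[] p : packets) { out.add(p); out.add(p.clone()); }
        return out;
    }
}
```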
  • the Wi-Fi function module can be used to implement Wi-Fi functions, such as but not limited to Wi-Fi functions including unicast, broadcast and multicast (also referred to as multicast), and the electronic device 300 can perform Wi-Fi communication with the electronic device 100 through the Wi-Fi function module.
  • the D2D function module can be used to implement the D2D function.
  • the BT function module can be used to implement the BT function, such as but not limited to the BT function including unicast, broadcast and multicast.
  • the satellite function module can be used to implement satellite-based communication functions.
  • the hardware abstraction layer may include a NewTalk service module (optional), a Wi-Fi protocol stack, a D2D protocol stack (optional), a BT protocol stack (optional), a satellite service module (optional), and an auxiliary link module.
  • the NewTalk service module, the Wi-Fi protocol stack, the D2D protocol stack, the BT protocol stack, and the satellite service module may be modules in the HAL that are adapted to the NewTalk function module, the Wi-Fi function module, the D2D function module, the BT function module, and the satellite function module, respectively.
  • the auxiliary link module is an optional module that may be used to manage the auxiliary link, such as but not limited to establishing, maintaining, and canceling the auxiliary link.
  • the auxiliary link may be established in a manner that includes but is not limited to at least one of the following: Bluetooth, WLAN (such as Wi-Fi), D2D, NFC, UWB, infrared, satellite, cellular communication, network address translation (NAT) traversal, NAT relay, RTC-based software, etc.
  • NAT traversal (session traversal utilities for NAT, STUN) may be understood as a P2P technology for direct communication between two points.
  • NAT relay (traversal using relays around NAT, TURN) may be understood as using network devices such as servers to forward data between the communicating parties, thereby achieving communication between two points.
  • the kernel layer may include a transmission protocol stack, a Wi-Fi network card (network interface controller, NIC), a Wi-Fi driver, a cellular communication network card, an A-core data service (ADS), a D2D driver, a Bluetooth driver, and a satellite driver.
  • the transmission protocol stack includes, for example but not limited to, a transmission control protocol (TCP) and/or an IP protocol.
  • the Wi-Fi network card and the Wi-Fi driver may be modules in the kernel layer that are adapted to the Wi-Fi functional module.
  • the full English name of the cellular communication network card may be remote (wireless wide area) network, which may be abbreviated as RMNET.
  • RMNET may be a modem or other external device as a remote network card provided by the operating system, which may form a virtual network card device in the operating system kernel.
  • RMNET and ADS may be modules in the kernel layer that are adapted to the NewTalk functional module.
  • the D2D driver may be a module in the kernel layer that is adapted to the D2D functional module.
  • the Bluetooth driver may be a module in the kernel layer that is adapted to the Bluetooth function module.
  • the Bluetooth driver may be a Bluetooth low energy (BLE) control module that is used to control the signaling of BLE.
  • the satellite driver may be a module in the kernel layer that is adapted to the satellite function module.
  • the software system of electronic device 200 is similar to the software system of electronic device 300. The difference is that electronic device 200 includes a NewTalk functional module and a module adapted to the NewTalk functional module to achieve communication with electronic device 100, and electronic device 200 may not include a Wi-Fi functional module and a module adapted to the Wi-Fi functional module.
  • the third link between the electronic device 200 and the electronic device 300 may be established and maintained by respective auxiliary link modules.
  • the third link may include but is not limited to at least one of the following physical links:
  • Link 1: NewTalk link.
  • when the NewTalk link is an operator call link, it can include an IMS communication link and a CS communication link.
  • the IMS communication link can be, but is not limited to, a multimedia path/channel of QCI1/QCI2, QCI5, or an IMS Data channel.
  • when the NewTalk link is an operator call link, it can be established through the cellular communication module of the electronic device 200 and the cellular communication module of the electronic device 300.
  • Link 2: Wi-Fi link, wherein the Wi-Fi link may include a unicast link, a multicast link, and/or a broadcast link.
  • the Wi-Fi link may be established through a Wi-Fi communication module of electronic device 200 and a Wi-Fi communication module of electronic device 300. Not limited thereto, in other examples, the Wi-Fi link may also be established through a cellular communication module of the electronic device.
  • the Wi-Fi link may be a P2P connection implemented through NAT traversal, and in other examples, the Wi-Fi link may be a relay connection implemented through a transit device (e.g., a server).
  • Link 3: BT link, wherein the BT link may include a unicast link, a multicast link and/or a broadcast link.
  • the BT link may be established through the BT communication module of the electronic device 200 and the BT communication module of the electronic device 300 .
  • Link 4: D2D link, wherein the D2D link may be established through a cellular communication module of the electronic device 200 and a cellular communication module of the electronic device 300.
  • the D2D link may be established through a Wi-Fi communication module of electronic device 200 and a Wi-Fi communication module of electronic device 300.
  • the D2D link may be established through a D2D communication module (not shown) in a wireless communication system of electronic device 200 and a D2D communication module (not shown) in a wireless communication system of electronic device 300.
  • the D2D communication module may include a hardware module for D2D communication, such as firmware and a chip.
  • Link 5: satellite link, wherein a satellite link can be established through a satellite communication module of the electronic device 200 and a satellite communication module of the electronic device 300.
  • the satellite communication module of electronic device 200 and the satellite communication module of electronic device 300 are both connected to a satellite system.
  • the satellite communication module of electronic device 200 is connected to a satellite system, which is then connected to a ground receiving station, which is connected to a cellular communication system, and electronic device 300 can be connected to the cellular communication system.
  • the above-mentioned cellular communication system can also be replaced by a Wi-Fi communication system or other communication system.
  • the present application does not limit the specific method of establishing a satellite link. Satellite systems include, for example but not limited to, Beidou, Tiantong, Starlink, etc.
  • Link 6: Cellular data service link.
  • the cellular data service link may be established through the cellular communication module of the electronic device 200 and the cellular communication module of the electronic device 300, for example, by implementing a relay connection through a server or implementing a P2P connection through NAT traversal.
  • link 2 or link 6 may be established through the Wi-Fi communication module of the electronic device 200 and the cellular communication module of the electronic device 300.
  • when the electronic device 200 and the electronic device 300 establish the third link, they can choose to establish at least one of the above-mentioned links 1 to 6 (any one link or a combination of multiple links) according to the requirements of the transmission scenario. For example, when the electronic device 200 and the electronic device 300 are close to each other, link 3 and link 4 can be established. Establishing multiple links can avoid situations in which communication fails or communication quality is poor when one link is abnormal, thereby improving the stability of communication.
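  • A simplified selection policy consistent with the above description can be sketched in Java as follows; the policy and the names (LinkSelector, ThirdLink) are assumptions for illustration only.

```java
// Link selection and fallback sketch for the third link; policy and names are illustrative.
import java.util.ArrayList;
import java.util.List;

enum ThirdLink { NEWTALK, WIFI, BLUETOOTH, D2D, SATELLITE, CELLULAR_DATA }

final class LinkSelector {
    /** Example policy: nearby devices prefer short-range links (BT + D2D),
     *  otherwise use a cellular data link, optionally with a Wi-Fi link as backup. */
    List<ThirdLink> select(boolean devicesNearby, boolean wifiAvailable) {
        List<ThirdLink> chosen = new ArrayList<>();
        if (devicesNearby) {
            chosen.add(ThirdLink.BLUETOOTH);
            chosen.add(ThirdLink.D2D);
        } else {
            chosen.add(ThirdLink.CELLULAR_DATA);
            if (wifiAvailable) chosen.add(ThirdLink.WIFI);
        }
        return chosen;   // establishing several links improves stability if one becomes abnormal
    }

    /** When the current link is abnormal, continue on another established link. */
    ThirdLink failover(List<ThirdLink> established, ThirdLink abnormal) {
        for (ThirdLink l : established) if (l != abnormal) return l;
        return abnormal; // no alternative available
    }
}
```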
  • the sharing system 10 further includes a network device 400.
  • the network device 400 may include at least one server, for example, the network device 400 is a server cluster composed of multiple servers, any one of which may be a hardware server or a cloud server, for example, a web server, a background server, an application server, a download server, etc.
  • the network device 400 may be a core network device.
  • addressing and/or data transmission may be performed through a real-time control protocol (RTCP) and/or an IMS Data Channel.
  • the network device 400 may include an addressing (wiseFunction) module, a (NAT) traversal (STUN) module, and a (NAT) relay (TURN) module.
  • the addressing module is used to perform identity authentication and addressing for establishing a link.
  • the electronic device 300 can implement access token (AT) authentication and exchange of session identity documents (SessionID) for NAT traversal through the addressing module of the network device 400, and the electronic device 300 can obtain the SessionID of the electronic device 200.
  • the electronic device 200 can also implement AT authentication and exchange of SessionID for NAT traversal through the addressing module of the network device 400, and the electronic device 200 can obtain the SessionID of the electronic device 300.
  • SessionID can be used to establish a link, such as a NAT traversal link or a NAT relay link.
  • the (NAT) traversal module is used to implement the establishment of a NAT traversal link and signaling transmission.
  • the auxiliary link module of the electronic device 200 and the auxiliary link module of the electronic device 300 can establish a P2P traversal link (belonging to the third link) through the NAT traversal module of the network device 400 and perform signaling transmission based on the link.
  • the (NAT) relay module is used to implement the establishment of a NAT relay link and signaling transmission.
  • the auxiliary link module of the electronic device 200 and the auxiliary link module of the electronic device 300 can establish a relay link (belonging to the third link) through the NAT relay module of the network device 400 and perform signaling transmission based on the link.
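  • The following Java sketch illustrates the auxiliary-link setup flow described above (AT authentication and SessionID exchange through the addressing module, then a P2P traversal link, falling back to a relay link). AddressingClient, AuxiliaryLinkModule and their methods are hypothetical placeholders, not a real library API.

```java
// Auxiliary link establishment sketch: SessionID exchange, traversal first, relay as fallback.
final class AddressingClient {
    /** AT authentication + SessionID exchange through the addressing module of network device 400. */
    String exchangeSessionId(String accessToken, String localSessionId, String peerAddress) {
        System.out.println("authenticate with AT=" + accessToken
                + ", publish " + localSessionId + " for peer " + peerAddress);
        return "peer-session-id"; // SessionID of the other device, returned by network device 400
    }
}

enum AuxLinkKind { P2P_TRAVERSAL, RELAY }

final class AuxiliaryLinkModule {
    private final AddressingClient addressing = new AddressingClient();

    AuxLinkKind establishThirdLink(String accessToken, String localSessionId, String peerAddress) {
        String peerSessionId = addressing.exchangeSessionId(accessToken, localSessionId, peerAddress);
        if (tryNatTraversal(peerSessionId)) {
            return AuxLinkKind.P2P_TRAVERSAL;   // direct P2P link through the NAT traversal module
        }
        return establishRelay(peerSessionId);   // otherwise forward data through the NAT relay module
    }

    private boolean tryNatTraversal(String peerSessionId) {
        // A real implementation would perform STUN-style hole punching here.
        return false;
    }

    private AuxLinkKind establishRelay(String peerSessionId) {
        System.out.println("relay link established via network device 400 for " + peerSessionId);
        return AuxLinkKind.RELAY;
    }
}
```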
  • Uplink data flow in the electronic device 300: capture module -> sharing module (codec module (for example, for encoding) -> data processing module (for example, for packetization) -> transmission module (for example, for diversion)) -> NewTalk function module/Wi-Fi function module/D2D function module/Bluetooth function module/satellite function module (optional) -> NewTalk service module/Wi-Fi protocol stack/D2D protocol stack/Bluetooth protocol stack/satellite service module (optional) -> auxiliary link module -> NewTalk/Wi-Fi/BT/D2D/satellite transmission module -> air interface.
  • Downlink data flow in the electronic device 200: air interface -> NewTalk/Wi-Fi/BT/D2D/satellite transmission module -> auxiliary link module -> NewTalk service module/Wi-Fi protocol stack/D2D protocol stack/Bluetooth protocol stack/satellite service module (optional) -> NewTalk function module/Wi-Fi function module/D2D function module/Bluetooth function module/satellite function module (optional) -> sharing module (transmission module (for example, for aggregation) -> data processing module (for example, for unpacking) -> codec module (for example, for decoding)) -> playback module.
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned uplink data flow can be: transmission protocol stack->cellular communication network card->ADS->cellular communication module
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned downlink data flow can be: cellular communication module->ADS->cellular communication network card->transmission protocol stack.
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned uplink data flow can be: transmission protocol stack->Wi-Fi network card->Wi-Fi driver->Wi-Fi communication module
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned downlink data flow can be: Wi-Fi communication module->Wi-Fi driver->Wi-Fi network card->transmission protocol stack.
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned uplink data flow may be: BT driver -> BT communication module, and the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the downlink data flow may be: BT communication module -> BT driver.
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned uplink data stream can be: D2D driver->cellular communication module/Wi-Fi communication module/D2D communication module (not shown), and the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned downlink data stream can be: cellular communication module/Wi-Fi communication module/D2D communication module (not shown)->D2D driver.
  • the D2D driver in the uplink data stream can also be replaced by: transmission protocol stack->cellular communication network card->ADS, and the D2D driver in the downlink data stream can also be replaced by: ADS->cellular communication network card->transmission protocol stack, at which time the cellular communication module/Wi-Fi communication module/D2D communication module in the uplink data stream and the downlink data stream is specifically a cellular communication module.
  • the D2D driver in the uplink data stream can also be replaced by: transmission protocol stack -> Wi-Fi network card -> Wi-Fi driver, and the D2D driver in the downlink data stream can also be replaced by: Wi-Fi driver -> Wi-Fi network card -> transmission protocol stack.
  • the cellular communication module/Wi-Fi communication module/D2D communication module in the uplink data stream and the downlink data stream is specifically a Wi-Fi communication module.
  • the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned uplink data flow can be: satellite driver -> satellite communication module, and the NewTalk/Wi-Fi/BT/D2D/satellite transmission module in the above-mentioned downlink data flow can be: satellite communication module -> satellite driver.
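  • The sharing-module stages of the uplink and downlink data flows listed above can be summarized in the following Java sketch (capture -> encode -> packetize -> divert to a transport; receive -> aggregate -> unpack -> decode -> play). The stage interfaces are hypothetical placeholders for the modules named in the text.

```java
// Compact sketch of the uplink pipeline in the sharing device and the downlink pipeline
// in the shared device; all stage interfaces are illustrative placeholders.
import java.util.List;

interface Encoder { byte[] encode(byte[] rawFrame); }
interface Decoder { byte[] decode(byte[] encodedFrame); }
interface Packetizer { List<byte[]> packetize(byte[] frame); byte[] depacketize(List<byte[]> packets); }
interface Transport { void send(byte[] packet); }   // NewTalk/Wi-Fi/BT/D2D/satellite transmission
interface Player { void play(byte[] decodedFrame); }

final class SharingPipeline {
    /** Uplink in the sharing device: capture module output flows down to the air interface. */
    static void uplink(byte[] capturedFrame, Encoder enc, Packetizer pkt, Transport transport) {
        byte[] encoded = enc.encode(capturedFrame);          // codec module (encoding)
        for (byte[] packet : pkt.packetize(encoded)) {       // data processing module (packetization)
            transport.send(packet);                          // transmission module (diversion to a link)
        }
    }

    /** Downlink in the shared device: packets from the air interface flow up to the playback module. */
    static void downlink(List<byte[]> receivedPackets, Packetizer pkt, Decoder dec, Player player) {
        byte[] encoded = pkt.depacketize(receivedPackets);   // transmission + data processing (aggregation/unpacking)
        player.play(dec.decode(encoded));                    // codec module (decoding) -> playback module
    }
}
```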
  • Figure 3B takes the software systems of electronic devices 200 and 300 and the third link between the two devices as examples for illustration.
  • the descriptions of the software system of electronic device 100, the first link between electronic devices 100 and 200, and the second link between electronic devices 100 and 300 are similar.
  • the difference is that when the first link is a NewTalk link, it can include a main link and an auxiliary link of NewTalk.
  • the main link of NewTalk can be established and maintained through a cellular communication module, a NewTalk functional module, and a module adapted to the NewTalk functional module in the electronic device.
  • the description of the auxiliary link is similar to that of the third link.
  • the NewTalk functional module/Wi-Fi functional module/D2D functional module/Bluetooth functional module/satellite functional module (optional) in the uplink/downlink data flow of the auxiliary link can be the NewTalk functional module
  • the NewTalk service module/Wi-Fi protocol stack/D2D protocol stack/Bluetooth protocol stack/satellite service module can be the NewTalk service module.
  • the NewTalk function module/Wi-Fi function module/D2D function module/Bluetooth function module/satellite function module can be integrated with the sharing module.
  • the Wi-Fi function module includes the sharing module, and the sharing module is used to implement the Wi-Fi-based sharing function.
  • the above-mentioned module adapted to the NewTalk functional module can also be adapted to other modules implemented by cellular communication.
  • RMNET and ADS can be adapted to the NewTalk functional module and other functional modules implemented by cellular communication in the kernel layer.
  • the above-mentioned module adapted to the Wi-Fi functional module can also be adapted to other modules implemented by Wi-Fi communication.
  • the above-mentioned module adapted to the Bluetooth functional module can also be adapted to other modules implemented by Bluetooth communication.
  • the above-mentioned module adapted to the satellite functional module can also be adapted to other modules implemented by satellite communication.
  • the following is an exemplary introduction to the application scenario involved in the embodiment of the present application and an example of a user interface (UI) in the scenario.
  • the following example is described by taking the first link between the electronic device 100 and the electronic device 200 as an operator call link as an example.
  • when the electronic device 100 and the electronic device 200 are conducting an operator call, the electronic device 100 may display a floating window of a sharing function, which may be used to trigger the sharing of the software and hardware resources of the electronic device 100 and/or of the devices (including the electronic device 300) discovered/connected by the electronic device 100.
  • the electronic device 300 may display a floating window of a sharing function or may not display a floating window of a sharing function, and specific examples may be seen in the following Figures 4 to 9.
  • Fig. 4 is a schematic diagram showing an exemplary embodiment of a user interface. Fig. 4 is illustrated by taking the electronic device 100 and the electronic device 300 displaying floating windows of a sharing function at the same time as an example.
  • an operator call can be made between the electronic device 100 (user A, phone number 1) and the electronic device 200 (user B, phone number 2), and Bluetooth communication can be made between the electronic device 100 and the electronic device 300, that is, Figure 4 takes the second link between the electronic device 100 and the electronic device 300 as a Bluetooth link as an example for explanation.
  • the electronic device 100 may display a user interface 410 of a call application, and the user interface 410 may include call information 411, a floating window 412, and a status bar 413 at the top.
  • the call information 411 may include the information of the other party (i.e., the electronic device 200) (the contact name "user B" and the communication number "phone number 2"), and the call duration "1 second".
  • the floating window 412 may be used to share the software and hardware resources of the electronic device 100 and/or the device discovered/connected by the electronic device 100, for example, to the electronic device 200.
  • the floating window 412 may include multiple options, for example: an option 412A for switching call mode, an option 412B for sharing location, an option 412C for sharing files, and an option 412D for sharing multimedia data streams.
  • the status bar 413 may include a signal strength indicator for a mobile communication signal, a signal strength indicator for a Wi-Fi signal, a Bluetooth indicator (indicating that Bluetooth is turned on), a battery status indicator, and a time indicator, etc.
  • electronic device 200 can display a user interface 420 of a call application.
  • User interface 420 may include call information 421 and a floating window 422.
  • Call information 421 may include information of the other party of the call (i.e., electronic device 100) (contact name "User A" and communication number "Telephone number 1") and call duration "1 second".
  • Floating window 422 is similar to floating window 412 in user interface 410, with the difference that floating window 422 is used to share the software and hardware resources of electronic device 200, for example, sharing with electronic device 100 and/or electronic device 300.
  • the electronic device 300 can display a user interface 430.
  • the user interface 430 is a desktop.
  • the user interface 430 may include a status bar 431 at the bottom, and the status bar 431 may include a Bluetooth indicator 431A (indicating that Bluetooth is turned on), a network indicator 431B (indicating that the Internet is not connected), and more controls 431C.
  • the more controls 431C can be used to trigger the display of other controls such as an indicator of a mobile communication signal.
  • the electronic device 100 may notify the electronic device 300 to display a floating window of a sharing function, for example but not limited to through a notification via a Bluetooth broadcast interface, or through an interface of distributed communication. While the electronic device 100 displays the floating window 412 in the user interface 410, the electronic device 300 may display a floating window 432 in the user interface 430.
  • the floating window 432 may be used to share the software and hardware resources of the electronic device 300, such as sharing with the electronic device 200.
  • the floating window 432 may include multiple options, such as: an option 432A for sharing a location, an option 432B for sharing a file, an option 432C for sharing a multimedia data stream, and more controls 432D.
  • More controls 432D may be used to trigger options for displaying more functions, such as but not limited to options including a whiteboard function (for users to input content on a whiteboard and share the whiteboard), and options for sharing functions of hardware resources (such as options for sharing a camera/microphone).
  • the electronic device 300 may further display a pointer 433, which is used to indicate the current position of the mouse to be operated on the interface displayed by the electronic device 300.
  • the mouse may be located on the electronic device 300, or may be an external device of the electronic device 300.
  • the user may control the movement, left-click, left-double-click, left-click and drag, right-click, and other operations of the pointer 433 on the interface displayed by the electronic device 300 by operating the mouse.
  • the pointer 433 is located in the area where the floating window 432 is located.
  • Fig. 5 is a schematic diagram showing another embodiment of a user interface.
  • Fig. 5 is a diagram showing an example of an electronic device 300 displaying a floating window of a sharing function after the electronic device 100 receives a trigger operation of the sharing function.
  • as shown in (A) of FIG. 5, the electronic device 100 may display the user interface 410 shown in FIG. 4.
  • the electronic device 300 may display the user interface 510, which is similar to the user interface 430 shown in FIG. 4 , except that the user interface 510 does not include the floating window 432.
  • the electronic device 100 may display a list of shared data to be selected in response to a user operation (e.g., a touch operation) on the option 412D in the floating window 412 shown in the user interface 410, and notify the electronic device 300 to display the floating window of the sharing function, as shown in (B) of FIG. 5 .
  • the electronic device 300 may display the user interface 430 shown in FIG. 4 , and the user interface 430 includes a floating window 432.
  • the electronic device 100 may display a user interface 520, which is similar to the user interface 410, except that the user interface 520 also includes a list 521, and the list 521 may include multiple options for sharing data to be selected, for example: option 521A (including the characters "current application", used to indicate the multimedia data stream of the window of the foreground application of the electronic device 100), option 521B (including the characters "local window"), option 521C (including the characters "my computer window"), option 521D (including the characters "my car window”), and option 521E (including the characters "more devices”).
• Since the foreground application of the electronic device 100 shown in FIG. 5 is a call application, option 521A indicates the multimedia data stream of the window of the call application.
  • option 521B can be used to trigger the display of multiple windows of the electronic device 100, such as the desktop, screen, window of the foreground application, window of the background application, and window of the application installed but not running on the electronic device 100.
  • the multimedia data stream of at least one of these multiple windows can be used as shared data.
  • Option 521C can be used to trigger the display of multiple windows of the computer (assuming it is the electronic device 300) discovered/connected by the electronic device 100.
  • the multimedia data stream of at least one of these multiple windows can be used as shared data.
  • Option 521D can be used to trigger the display of multiple components of the car discovered/connected by the electronic device 100, such as the display content of the display screen, the audio output by the speaker, the display content of the AR-HUD, the image collected by the camera, etc.
  • the multimedia data stream related to at least one of these multiple components can be used as shared data.
  • Option 521E can be used to trigger the display of the software and hardware resources of other devices discovered/connected by the electronic device 100, such as the software and hardware resources of smart watches, smart screens and other devices.
  • the electronic device 100 can determine the shared data according to the user operation input by the user based on the list 521.
  • the electronic device 100 displays multiple windows of the electronic device 300 in response to the touch operation on option 521C, and then the electronic device 100 determines the multimedia data stream of the at least one window as the shared data in response to the touch operation on at least one window of the multiple windows.
  • the electronic device 100 can preset the shared device, for example, directly determine the other party of the call (i.e., the electronic device 200) as the shared device.
  • the shared data determined above can be sent to the shared device for output.
  • the option 521C and/or option 521D included in the list 521 in the user interface 520 shown in FIG. 5 (B) can also be used to directly trigger sharing.
  • the electronic device 100 determines that the multimedia data stream of the screen of "my computer" (i.e., the electronic device 300) indicated by option 521C is the shared data, and the shared data can be sent to the shared device for output.
• Not limited to this, the shared data may not be the multimedia data stream of the screen of the electronic device 300; for example, it may instead be the multimedia data stream of the desktop of the electronic device 300 or the multimedia data stream of the foreground application of the electronic device 300, and this application does not limit this.
• In some embodiments, when an electronic device displays the options included in list 521 in user interface 520 shown in (B) of Figure 5, information about the device indicated by an option can be displayed. For example, the name or IP address of "My Computer" indicated by option 521C can be displayed next to option 521C in list 521, so as to help users distinguish between devices, especially when there are multiple devices of the same type, thereby further improving the user experience.
  • the electronic device 100 may also preset the shared data, for example, determine the multimedia data stream of the screen of the device that was most recently connected via the near field communication method as the shared data.
  • the electronic device 100 may also determine the shared device based on user operation.
  • the user interface 520 shown in (B) of Figure 5 may also include options for multiple devices discovered/connected by the electronic device 100.
  • the electronic device 100 may determine that the device indicated by at least one option is the shared device based on a touch operation on at least one of the multiple options.
  • the example shown in Figure 5 can meet the user needs in specific scenarios.
  • the user can share the multimedia data stream of the electronic device 300 with the electronic device 200 through the list 521 in the user interface 520 of the electronic device 100 shown in Figure 5 (B).
  • the user can also share the files and other software and hardware resources of the electronic device 300 with the electronic device 200 through the floating window 432 in the user interface 430 of the electronic device 300 shown in Figure 5 (B).
  • the sharing method is more flexible, and it also avoids the inconvenience of operation when sharing the files of the electronic device 300 on the electronic device 100, thereby improving the user experience.
  • Fig. 6 is a schematic diagram of another embodiment of a user interface.
  • Fig. 6 is an example of an electronic device 300 displaying a floating window of a sharing function after the electronic device 100 receives a user operation for transferring a carrier call to the electronic device 300.
  • a user interface 610 may be displayed.
  • the user interface 610 is similar to the user interface 410 shown in FIG. 4 , except that the control 611 (for switching the answering method) in the user interface 610 is selected, and the user interface 610 also includes a list 612 related to the control 611.
  • the list 612 may include multiple options for answering the carrier call, such as the option of answering through a component of the electronic device 100 and the option of answering through a device discovered/connected by the electronic device 100.
  • the list 612 includes option 612A (indicating that the carrier call is answered through the speaker of the electronic device 100), option 612B (indicating that the carrier call is answered through the computer discovered/connected by the electronic device 100 (assuming that it is the electronic device 300)) and option 612C (indicating that the carrier call is answered through the car discovered/connected by the electronic device 100).
  • the electronic device 300 may display the user interface 510 (excluding the floating window 432) shown in (A) of FIG. 5 .
  • the electronic device 100 may switch the mode of answering the operator call to answering the call through the electronic device 300 in response to a user operation (e.g., a touch operation) on the option 612B in the list 612 shown in the user interface 610, and notify the electronic device 300 to display the floating window of the sharing function, as shown in (B) of FIG. 6 for details.
  • the electronic device 100 may display a user interface 620, which is similar to the user interface 410 shown in FIG. 4, except that the control 611 (used to switch the answering mode) in the user interface 620 includes the characters "My Computer", which is used to indicate that the operator call with the electronic device 200 is currently answered through "My Computer” (i.e., the electronic device 300).
• The electronic device 300 may display a user interface 630, which is similar to the user interface 430 shown in FIG. 4, in that both include a floating window 432 of the sharing function, except that the user interface 630 also includes a call window 631. The call window 631 may include prompt information 631A (including the characters "Current call - Collaboration with my phone"), which may indicate that the actual caller of the operator call currently answered by the electronic device 300 is "My phone" (i.e., the electronic device 100), that is, the operator call currently answered by the electronic device 300 has been migrated from the electronic device 100.
  • the call window 631 also includes information about the other party (i.e., electronic device 200) (contact name "User B" and communication number "Phone number 2"), a hang-up control (used to hang up the above-mentioned operator call) and a mute control (used to cancel the sending of the sound collected by the microphone in the above-mentioned operator call).
  • the electronic device 300 may display a floating window of a sharing function together with a call window.
• The electronic device 300 shown in FIG. 7 may display a user interface 710.
• The user interface 710 is similar to the user interface 510 shown in (A) of FIG. 6, except that the user interface 710 further includes a call window 711.
• The call window 711 is similar to the call window 631 in the user interface 630 shown in (B) of FIG. 6, except that the call window 711 further includes multiple sharing options 711A.
  • the multiple sharing options 711A may be used to share software and hardware resources of the electronic device 300, including, for example, options for sharing multimedia data streams, options for sharing files, options for sharing locations, and controls for triggering the display of more functional options.
• The implementation shown in FIG. 6 takes the example of the electronic device 100 migrating the operator call to the electronic device 300 according to a user operation. Not limited to this, the electronic device 100 can also automatically migrate the operator call to the electronic device 300.
• For example, when the electronic device 100 and the electronic device 200 establish an operator link, the electronic device 100 is not connected to the electronic device 300; after the electronic device 100 discovers/connects to the electronic device 300, the operator call with the electronic device 200 is transferred to the electronic device 300.
• For another example, when the electronic device 100 and the electronic device 200 make an operator call, if the distance between the electronic device 300 and the electronic device 100 is less than a preset distance (for example, 1 meter), the electronic device 100 transfers the operator call with the electronic device 200 to the electronic device 300. This application is not limited to this.
  • Fig. 8 is a schematic diagram of another embodiment of a user interface.
  • Fig. 8 is an example of displaying a floating window of a sharing function after the electronic device 300 receives a user operation for transferring a carrier call of the electronic device 100 to the electronic device 300.
  • the electronic device 100 and the electronic device 300 can establish a connection through a cellular communication mode.
• The electronic device 100 can display the user interface 410 (not shown in FIG. 8) shown in FIG. 4.
  • the electronic device 300 can display the user interface 810.
  • the user interface 810 is a desktop.
• The user interface 810 may include a status bar 811 and a pointer 812 at the bottom, wherein the status bar 811 is similar to the status bar 431 in the user interface 430 shown in FIG. 4, except that the status bar 811 also includes a signal strength indicator 811A of a mobile communication signal; the description of the pointer 812 can refer to the description of the pointer 433 in the user interface 430 shown in FIG. 4.
  • the electronic device 300 may receive a user operation on the signal strength indicator 811A of the mobile communication signal in the user interface 810.
  • the user operation includes the user operating the mouse to move the pointer 812 to the area where the signal strength indicator 811A is located (assuming it is position 2), and a left-click operation.
  • the electronic device 300 may display a window 813 (including the title "Mobile Communication Sharing") in the user interface 810 in response to the user operation.
• Window 813 may include control 8131 (including the characters "Communicating and sharing with my phone"), which may indicate that electronic device 300 is currently connected to "my phone" (i.e., electronic device 100) and may implement related functions through electronic device 100, such as the functions indicated by multiple options included in window 813. These options may include, for example, option 8132 (including the characters "Connect to network", indicating that electronic device 300 may connect to the Internet through electronic device 100), option 8133 (including the characters "Answer call", indicating that new calls on electronic device 100 may be transferred to electronic device 300), and option 8134 (including the characters "New call", indicating that the function of sharing the software and hardware resources of electronic device 300 may be triggered on electronic device 300). Any one of the options may correspond to a switch control, which may be used to turn on or off the function indicated by the option; for example, option 8134 shown in user interface 810 corresponds to switch control 8134A.
• The electronic device 300 may receive a user operation on the switch control 8134A, for example, the user operation includes the user moving the pointer 812 to the area where the switch control 8134A is located by operating the mouse, and a left-click operation. The electronic device 300 may respond to the user operation and turn on the function indicated by option 8134, and a floating window of the sharing function may be displayed at this time, as shown in (B) of FIG. 8.
• The electronic device 300 may display a user interface 820. The user interface 820 is similar to the user interface 810, except that the switch control 8134A in the user interface 820 is in an on state, indicating that the function indicated by option 8134 is turned on, and the user interface 820 also includes the floating window 432 in the user interface 430 shown in FIG. 4.
  • Fig. 9 is a schematic diagram showing another embodiment of a user interface.
  • Fig. 9 is an example of an electronic device 300 displaying a floating window of a sharing function after receiving a user operation for discovering/connecting to the electronic device 100.
• When the electronic device 100 (not shown) and the electronic device 200 (not shown) are making a carrier call, the electronic device 100 is not connected to the electronic device 300. At this time, the electronic device 100 can display the user interface 410 (not shown in FIG. 9) shown in FIG. 4, and the electronic device 300 can display the user interface 910.
  • the user interface 910 is a desktop.
• The user interface 910 may include the status bar 431 and the pointer 433 in the user interface 430 shown in FIG. 4.
  • the electronic device 300 may receive a user operation on the network indicator 431B in the status bar 431 , for example, the user operation includes the user moving the pointer 433 to the area where the network indicator 431B is located (assuming it is position 3) by operating the mouse, and a left-click operation.
  • the electronic device 300 may respond to the user operation and display a window 911 in the user interface 910 .
• Window 911 may include options for multiple connected/connectable devices, for example, option 911A for "my phone" (i.e., electronic device 100), and options for "device A", "device B", "device C", etc. Any option may correspond to a control, which may be used to connect to or disconnect from the device indicated by the option; for example, option 911A shown in user interface 910 corresponds to control 911B (indicating that the electronic device 300 is currently not connected to the electronic device 100).
  • the electronic device 300 may receive a user operation on the control 911B, for example, the user operation includes the user moving the pointer 433 to the area where the control 911B is located by operating the mouse, and a left-click operation, and the electronic device 300 may respond to the user operation and request the electronic device 100 to establish a connection.
  • the electronic device 300 may display a floating window of a sharing function, as shown in FIG. 9 (B).
  • the electronic device 300 may display a user interface 920, and the user interface 920 is similar to the user interface 430 shown in FIG. 4, and both include a floating window 432 of a sharing function, except that the status bar 431 shown in the user interface 920 also includes an indicator 921 for indicating that the electronic device 300 and the electronic device 100 have established a connection.
• In some embodiments, when the electronic device 100 and the electronic device 200 are conducting a carrier call, if the distance between the electronic device 300 and the electronic device 100 is greater than or equal to a preset distance (e.g., 1 meter), the electronic device 300 does not display the floating window of the sharing function; if the distance between the electronic device 300 and the electronic device 100 is less than the preset distance, the electronic device 300 can display the floating window of the sharing function. This application does not limit the conditions for triggering the display of the floating window of the sharing function.
  • the electronic device 300 may also display a floating window of a sharing function.
• When the electronic device 300 displays the floating window of the sharing function, it may first display the floating window in the retracted state, for example, only displaying a portion of the floating window 432 in the user interface 430 shown in FIG. 4. When the pointer 433 is located in the area where the retracted floating window is located, for example, at position 1 in the user interface 430, the electronic device 300 may display the floating window in the expanded state, for example, the floating window 432 in the user interface 430.
  • the electronic device 300 may also first display the floating window in the expanded state, and after a preset time (for example, 10 seconds), the electronic device 300 may display the floating window in the retracted state, and this application does not limit this.
  • electronic device 100, electronic device 200 or electronic device 300 can adjust the display position of the sharing option in the floating window in response to a user operation on any sharing option included in the floating window of the sharing function.
  • electronic device 100 can receive a touch operation on option 412B in floating window 412 shown in user interface 410, and the touch operation includes long pressing and dragging to the location of option 412C in floating window 412.
  • Electronic device 100 can exchange the display positions of option 412B and option 412C in response to the user operation.
• After the electronic device 300 receives a user operation on the floating window of the sharing function, it can share the software and hardware resources of the electronic device 300 with other devices discovered/connected by the electronic device 100 (for example, the electronic device 200 that is conducting an operator call with the electronic device 100). For specific examples, see Figure 10 below.
  • FIG. 10 exemplarily shows a schematic diagram of yet another user interface embodiment.
  • the electronic device 100 can establish a connection with the electronic device 300.
  • the electronic device 100 can display the user interface 410 of the call application shown in FIG. 4
  • the electronic device 200 can display the user interface 420 of the call application shown in FIG. 4
  • the electronic device 300 can display the user interface 1010.
  • the user interface 1010 is similar to the user interface 430 shown in FIG. 4 , and both include a status bar 431, a floating window 432, and a pointer 433.
  • the electronic device 300 can receive a user operation for option 432C in the floating window 432.
  • the user operation includes the user moving the pointer 433 to the area where option 432C is located (assuming it is position 4) by operating the mouse, and a left-click operation.
  • the electronic device 300 can respond to the user operation and display a window 1011 in the user interface 1010.
  • Window 1011 can include a list 10111, a list 10112, and a sharing control 10113, wherein:
  • List 10111 may include multiple options for shared data to be selected, such as: option 10111A (including the character "screen” for indicating the multimedia data stream of the screen of electronic device 300), option 10111B (including the character "video” for indicating the multimedia data stream of the window of the video application (belonging to the background application) of electronic device 300), but not limited thereto, for example, also including: options for indicating other windows of the video application of electronic device 300, options for indicating the window of the foreground application of electronic device 300, options for indicating the window of other background applications of electronic device 300, and options for indicating the window of the application installed but not running on electronic device 300, etc.
  • electronic device 300 may determine that the multimedia data stream related to at least one option in list 10111 is shared data in response to a user operation (such as a touch operation) on at least one option in list 10111.
  • option 10111A in user interface 1010 is in a selected state, which may indicate that the currently selected shared data includes the multimedia data stream indicated by option 10111A.
  • List 10112 may include multiple options for shared devices to be selected, for example, option 10112A, option 10112B, option 10112C, option 10112D, and option 10112E.
  • Option 10112A includes the characters “phone number 2 (in call)", which is used to indicate that the communication number is phone number 2 and the electronic device 200 is currently conducting an operator call with the electronic device 100.
  • Option 10112B includes the characters “phone number 3 (recent contact)", which is used to indicate that the communication number is "phone number 3" and the electronic device 100 has recently conducted a new call.
  • Option 10112C includes the characters "contact", which can be used to trigger the display of information of multiple contacts. The device corresponding to at least one of the multiple contacts can be used as a shared device.
  • the multiple contacts can be contacts in a preset application.
  • the preset application is an application that implements at least one of the following: operator call, OTT call, D2D call, satellite call, V2X call, network chat, and the specific type of contact is not limited in this application.
  • Option 10112D includes the characters "User C's mobile phone” to indicate that the electronic device 100 discovers/connects to the mobile phone via a far-field communication method such as satellite and Wi-Fi.
  • Option 10112E includes the characters "My headphones” to indicate that the electronic device 100 discovers/connects to the headphones via a near-field communication method such as Bluetooth and Wi-Fi.
  • the electronic device 300 can respond to a user operation on at least one option in the list 10112 and determine that the device indicated by the at least one option is a shared device.
  • option 10112A in the user interface 1010 is in a selected state, which can indicate that the currently selected shared device includes the electronic device 200 indicated by option 10112A.
  • the sharing control 10113 may be used to trigger sharing of the selected sharing data to the selected shared device.
  • the user interface 1010 further includes controls 10114, 10115, and 10116.
  • Control 10114 includes the characters "Share Computer Sound” for turning on or off the function of sharing the audio stream in the multimedia data stream. When this function is turned on, the shared data includes the audio stream of the electronic device 300 system/application.
• Control 10115 includes the characters "Overlay MIC" for turning on or off the function of sharing the audio collected by the microphone. When this function is turned on, the shared data includes the audio stream collected by the microphone of the electronic device 300.
• Control 10116 includes the characters "overlay Camera" for turning on or off the function of sharing the image collected by the camera. When this function is turned on, the shared data includes the video stream collected by the camera of the electronic device 300.
  • the electronic device 300 may send a message 1 for requesting to share data to the selected shared device (i.e., the electronic device 200) through the electronic device 100 in response to a user operation (e.g., a touch operation) on the sharing control 10113 in the user interface 1010, i.e., the message 1 is relayed and sent through the electronic device 100. And/or, the electronic device 300 may send the message 1 to the electronic device 200 by itself in response to a user operation (e.g., a touch operation) on the sharing control 10113 in the user interface 1010. After the electronic device 200 accepts the request indicated by the message 1, the electronic device 300 may relay and send the selected shared data through the electronic device 100, and/or send the selected shared data to the electronic device 200 by itself. The electronic device 200 may output the received shared data, and a specific interface example may be referred to (B) of FIG. 10 .
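• The two delivery paths for message 1 described above (relayed through the electronic device 100, or sent directly to the electronic device 200) can be illustrated with the following minimal sketch; the class names and the send() primitive are hypothetical assumptions, not the patent's implementation.

    from dataclasses import dataclass

    @dataclass
    class Message:
        kind: str       # e.g. "share_request" (message 1) or "share_data"
        payload: bytes

    class SharingDevice:
        """Hypothetical model of a sharing device such as electronic device 300."""

        def __init__(self, name, links):
            # links maps a peer name to an object exposing send(target, message)
            self.name = name
            self.links = links

        def send_share_request(self, target, relay=None):
            """Send message 1 to `target`, either relayed through `relay` or directly."""
            msg = Message(kind="share_request", payload=b"request to send shared data")
            if relay is not None and relay in self.links:
                # Relay path: the message is forwarded by electronic device 100.
                self.links[relay].send(target, msg)
            elif target in self.links:
                # Direct path: the sharing device reaches the shared device itself.
                self.links[target].send(target, msg)
            else:
                raise RuntimeError("no usable link to the shared device")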
  • the electronic device 300 may display a user interface 1020, which may include a status bar 431 and a pointer 433 in the user interface 1010.
  • the floating window 1021 in the user interface 1020 and the check box 1022 for the entire screen may represent the multimedia data stream of the screen of the electronic device 300 that is currently being shared.
  • the floating window 1021 may include information about the operator call made by the electronic device 100 and the electronic device 200, such as a hang-up control 1021A, a call duration 1021B, and a mute control 1021C.
  • the floating window 1021 may also include an option 1021D for sharing a file, an option 1021E for exiting the current sharing, and an option 1021F for more functions.
  • option 1021F may be used to trigger the display of at least one function option, such as: a hardware resource sharing function (such as sharing a camera/microphone), a location sharing function, a whiteboard function, an annotation function, an audio and video setting function, a mixed audio and video setting function, a permission setting function, a sharing area selection function, a sharing data/shared device change function, a remote control function, etc.
  • the location sharing function is used to share the location of the electronic device 300.
  • the electronic device 300 can send the real-time location to the electronic device 200. In one case, the electronic device 300 can periodically send the location to the electronic device 200. In another case, the electronic device 300 can first send the location to the electronic device 200, and when the location changes, send the changed location to the electronic device 200. In some examples, when the electronic device 300 sends the real-time location to the electronic device 200, the electronic device 200 can also send the real-time location to the electronic device 300, which can be understood as the electronic device 200 and the electronic device 300 sharing the real-time location.
  • the electronic device 200 and the electronic device 300 can display a map interface, which includes an indicator of the location of the electronic device 200 and an indicator of the location of the electronic device 300.
  • the electronic device 300 may also send the current location to the electronic device 200 (only send the location once).
• The electronic device 200 may display detailed information of the location (such as the interface of the location on the map, the city/district/building/floor where the location is located, etc.), and/or navigate to the location (such as setting the location as the destination for navigation).
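• The two location-sharing behaviours described above (periodic sending, and sending only when the location changes) can be sketched as follows; this is an illustration only, with hypothetical get_location/send callbacks.

    import time

    def share_location_periodically(get_location, send, interval_s=5.0, rounds=3):
        """Periodically send the current location to the shared device."""
        for _ in range(rounds):
            send(get_location())
            time.sleep(interval_s)

    def share_location_on_change(get_location, send, rounds=10, poll_s=1.0):
        """Send the location once, then resend it only when it changes."""
        last = get_location()
        send(last)
        for _ in range(rounds):
            time.sleep(poll_s)
            current = get_location()
            if current != last:
                send(current)
                last = current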
  • the whiteboard function is used for the user to input content on the whiteboard and share the whiteboard.
  • the user can input character 1 on the whiteboard displayed by the electronic device 300, and the electronic device 300 can share the whiteboard with the electronic device 200, and the electronic device 200 can display the whiteboard including character 1.
• When the electronic device 200 displays the whiteboard shared by the electronic device 300, the user can also input character 2 on the whiteboard and share the updated whiteboard with the electronic device 300, and the electronic device 300 can display the whiteboard including character 1 and character 2.
  • the annotation function is used for users to input annotations on the sharing interface and share annotations.
• For example, when the window for sharing data is the screen of electronic device 300, the user can add annotation 1 on the screen of electronic device 300, electronic device 300 can share the multimedia data stream of the screen of electronic device 300 and annotation 1 to electronic device 200, and electronic device 200 can display the image including annotation 1.
• Annotation 2 can also be added on the electronic device 200 and shared to electronic device 300, and electronic device 300 can display the image including annotation 1 and annotation 2, which can be understood as realizing "two-way annotation".
  • the audio and video settings function is used to set the type of shared data, for example: only audio, only image, or both audio and image.
  • the mixed audio and video setting function is used to set whether the shared data includes the audio collected by the microphone and/or the image collected by the camera. For example, it can be set to any of the following: excluding the audio collected by the microphone and the image collected by the camera, including only the audio collected by the microphone, including only the image collected by the camera, or including the audio collected by the microphone and the image collected by the camera.
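• A minimal sketch of how the audio and video setting and the mixed audio and video setting described above could be represented as a configuration object; the field names are hypothetical and for illustration only.

    from dataclasses import dataclass

    @dataclass
    class ShareConfig:
        include_audio: bool = True      # share the audio part of the shared data
        include_image: bool = True      # share the image part of the shared data
        mix_mic_audio: bool = False     # also include audio collected by the microphone
        mix_camera_image: bool = False  # also include images collected by the camera

    # Example: share only images, mixed with the camera stream.
    config = ShareConfig(include_audio=False, include_image=True,
                         mix_mic_audio=False, mix_camera_image=True)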
  • the permission setting function is used to set relevant permissions of the shared device based on shared data, such as whether to grant the shared device saving permission and/or secondary dissemination permission.
  • Saving permission includes, for example, the permission to record/take screenshots and/or the permission to save files of shared data.
  • Secondary dissemination permission includes, for example, instant dissemination permission and/or delayed dissemination permission.
  • Instant dissemination permission is the permission for the shared device to forward the data shared by the sharing device to other devices when the shared device plays the data in real time.
  • Delayed dissemination permission is the permission for the shared device to forward the shared data to other devices after saving the shared data.
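• The saving permission and the secondary dissemination permissions (instant and delayed) described above could be modeled as in the following sketch; the names are hypothetical and only illustrate the decision logic.

    from dataclasses import dataclass

    @dataclass
    class SharePermissions:
        allow_save: bool = False                   # record/screenshot or save files
        allow_instant_dissemination: bool = False  # forward while playing in real time
        allow_delayed_dissemination: bool = False  # forward after saving the data

    def may_forward(perms: SharePermissions, after_saving: bool) -> bool:
        """Whether the shared device may forward the shared data to other devices."""
        if after_saving:
            # Delayed dissemination presupposes that saving was allowed.
            return perms.allow_save and perms.allow_delayed_dissemination
        return perms.allow_instant_dissemination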
  • the function of selecting a sharing area can be implemented, for example but not limited to, by a grid method, a check box method, a hand-drawn method, a layer method, a preset rule method, etc.
• The grid method can be to divide the shared window into multiple areas (which can also be understood as multiple grids); at least one of the multiple grids can be selected, and the electronic device 300 can share the multimedia data stream of the selected grid and not share the multimedia data stream of the unselected grids.
  • the check box mode may be to adjust the size and/or position of the preset check box (e.g., proportional enlargement/reduction) according to the user operation (e.g., sliding operation), and the electronic device 300 may share the multimedia data stream of the adjusted check box, and not share the multimedia data stream of other areas outside the check box.
  • the hand-drawn mode may be to determine the area of corresponding shape and size according to the user operation (e.g., sliding operation), and the electronic device 300 may share the multimedia data stream of the area, and not share the multimedia data stream of other areas outside the area.
  • the layer mode may be to divide the shared window into layers to obtain multiple layers, for example, the contents of different applications belong to different layers, and for example, different contents of the same application belong to different layers, and at least one layer of the multiple layers may be selected, and the electronic device 300 may share the multimedia data stream of the selected layer, and not share the multimedia data stream of other layers.
  • the preset rule mode may be that the electronic device 300 determines the data to be shared according to the preset rule.
  • the preset rule includes: when the application corresponding to data 1 in the shared data is a preset application, the data 1 is not shared.
  • the preset application may be determined in response to the user operation, or it may be automatically identified, for example, applications of the types of bank and payment are identified as preset applications.
  • the preset rules include: when data 2 in the shared data includes preset content, the data 2 is not shared.
  • the preset content can be determined in response to user operations or automatically identified, such as identifying a user name, password, account name, login name, ID number, bank card number, account balance, etc. as preset content.
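• The grid method and the preset rule method described above can be illustrated with the following sketch; the helper names are hypothetical, and the preset applications and preset content are example values only.

    import re

    def selected_grid_rects(width, height, rows, cols, selected_cells):
        """Return pixel rectangles (x, y, w, h) of the grid cells chosen by the user.

        Only the multimedia data inside these rectangles would be shared; the
        unselected cells are not shared.
        """
        cell_w, cell_h = width // cols, height // rows
        return [(c * cell_w, r * cell_h, cell_w, cell_h) for (r, c) in selected_cells]

    PRESET_APPS = {"bank", "payment"}                       # example preset applications
    PRESET_CONTENT = re.compile(r"password|account balance|bank card number", re.I)

    def should_share(app_type: str, text_content: str) -> bool:
        """Apply the preset rules: data from preset apps or containing preset content is not shared."""
        if app_type in PRESET_APPS:
            return False
        return PRESET_CONTENT.search(text_content) is None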
  • the remote control function is used to allow the shared device to operate the multimedia data stream while sharing the multimedia data stream.
• For example, when the electronic device 200 plays the multimedia data stream of the screen shared by the electronic device 300, a user operation on the playback window of the multimedia data stream can be received, and the electronic device 200 can send the event of the user operation and related information (such as the time of occurrence, the targeted control, the targeted application, etc.) to the electronic device 300, so that the electronic device 300 uses the user operation event as an input event of the device; the electronic device 300 can then respond to the user operation event, which can be understood as the user "remotely controlling" the electronic device 300 through the electronic device 200.
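• A rough sketch of the remote control flow described above, in which the shared device forwards an input event on the playback window and the sharing device injects it as a local input event; the names and transport are assumptions for illustration.

    from dataclasses import dataclass
    import time

    @dataclass
    class InputEvent:
        kind: str          # e.g. "click" or "drag"
        x: float           # normalized position within the playback window
        y: float
        timestamp: float
        target: str = ""   # optional: targeted control or application

    def forward_event(send_to_sharer, kind, x, y, target=""):
        """Runs on the shared device when the user operates the playback window."""
        send_to_sharer(InputEvent(kind, x, y, time.time(), target))

    def inject_event(event: InputEvent, local_input_queue):
        """Runs on the sharing device: treat the forwarded event as its own input."""
        local_input_queue.append(event)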
  • the electronic device 200 may display a user interface 1030, and the user interface 1030 may include a window 1031, a function control 1032, and a prompt box 1033 (including the characters "Watching the content shared by user A").
  • the window 1031 may be used to play the shared data of the electronic device 300, that is, the multimedia data stream of the screen of the electronic device 300, so the user interface 1020 displayed by the electronic device 300 is similar to the display content of the window 1031.
  • the function control 1032 may be used to trigger the display of at least one function option, for example: an option for pausing the playback of the current shared data, an option for exiting the playback of the current shared data, an option for sharing the current shared data to other devices discovered/connected by the electronic device 200, an option for saving the current shared data, etc.
  • the saved shared data may be used to share with other devices discovered/connected by the electronic device 200.
  • FIG. 10 illustrates the migration of the operator call between the electronic device 100 and the electronic device 200 to the electronic device 300 as an example.
  • the migration may not be required.
  • the user interface 1020 displayed by the electronic device 300 may not include information about the operator call such as the hang-up control 1021A, call duration 1021B, and mute control 1021C.
• In one case, the electronic device 300 may cancel the sharing, for example, when the shared data of the electronic device 300 is sent to the electronic device 200 through the electronic device 100.
• In another case, the electronic device 300 may continue to share the software and hardware resources with the electronic device 200, for example, when the shared data of the electronic device 300 is sent to the electronic device 200 by itself.
  • the electronic device 300 may still display a floating window of the sharing function for sharing data with the electronic device 200.
  • the electronic device 300 may send the multimedia data stream of the screen and the audio collected by the microphone to the electronic device 200 by itself.
  • the electronic device 300 may receive a user operation for ending sharing, such as a user operation for option 1021E in the user interface 1020 shown in (B) of FIG. 10 , and the electronic device 300 may respond to the user operation and send a message 2 to the electronic device 200 for indicating the end of the current sharing.
  • the electronic device 300 may respond to the user operation and cancel the floating window displaying the sharing function.
  • the electronic device 200 may cancel the playing of the shared data.
  • the electronic device 200 may receive a user operation for ending sharing and notify the electronic device 300 to end the current sharing, and this application does not limit this.
  • the electronic device 200 may also share the software and hardware resources of the electronic device 200 with the electronic device 300.
  • the specific description is similar to the description of the electronic device 300 as a sharing device sharing the software and hardware resources with the electronic device 200.
  • the electronic device 200 may share data with the electronic device 300 through the electronic device 100, and/or share data with the electronic device 300 by itself. It can be understood that two-way sharing can be performed between the electronic device 300 and the electronic device 200.
  • FIG10 illustrates an example of sharing data as a multimedia data stream.
  • the shared data may also include files of the electronic device 300, which may be referred to as realizing a file sharing function.
  • the electronic device 300 may display a user interface 1110.
• The user interface 1110 is similar to the user interface 1020 shown in (B) of FIG. 10, in that both include a floating window 1021; the difference is that the user interface 1110 also includes a file icon 1110A (including the file name "111") of a file.
  • the electronic device 300 may receive a user operation on the icon 1110A.
• The user operation includes: moving the icon 1110A from position 5 to the floating window 1021 by operating the mouse, for example, to the area where option 1021D is located (e.g., position 6). The electronic device 300 may respond to the user operation and send the file indicated by icon 1110A to the shared device. Not limited to this, in another embodiment, the electronic device 300 may also respond to a user operation (e.g., a touch operation) on option 1021D in the floating window 1021 and display multiple files stored in the electronic device 300. The electronic device 300 may respond to a user operation (e.g., a touch operation) on at least one file among the multiple files and determine that the at least one file is shared data.
  • the present application does not limit the specific operation method for triggering the sharing of files.
  • the file sharing function can also realize other related functions, and specific examples are as follows:
• When electronic device 300 sends a file to electronic device 200, it may first send indication information of the file instead of the file itself; the indication information may include, for example but not limited to, a thumbnail and the file name of the file. Electronic device 300 sends the file to electronic device 200 only after receiving a message sent by electronic device 200 indicating that it agrees to receive the file.
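• A minimal sketch of the indication-first exchange described above (send the thumbnail and file name first, and transfer the file body only after the shared device agrees); the message format and callbacks are assumptions for illustration.

    def offer_file(send, wait_for_reply, file_name, thumbnail, read_file):
        """Offer a file to the shared device; send the body only after acceptance."""
        send({"type": "file_offer", "name": file_name, "thumbnail": thumbnail})
        reply = wait_for_reply()          # blocks until the shared device answers
        if reply.get("type") == "file_accept":
            send({"type": "file_data", "name": file_name, "data": read_file()})
            return True
        return False                      # declined: the file body is never sent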
• When electronic device 300 sends a file to electronic device 200, electronic device 200 can also send a file to electronic device 300, which can be understood as electronic device 300 and electronic device 200 sharing files with each other.
  • the electronic device 300 and/or the electronic device 200 may display a file sharing list, which may include information about the shared files, such as but not limited to the file name, the sending status of the file (e.g., sending, sent, or failed to send, etc.), the sender, the receiver, the sending time, etc.
  • the file sharing list may be for this sharing, for example, when the electronic device 200 and the electronic device 100 are in a carrier call, the electronic device 300 and the electronic device 200 share files, and the displayed file sharing list may only include information about the files shared in this carrier call.
  • the file sharing list is for any sharing process, that is, it may include information about files shared by the electronic device 200 and the electronic device 300 at any time/in any scenario.
  • files sent by electronic device 300 to electronic device 200 can be withdrawn, for example, files that have been sent within a preset time period and/or files that are being sent can be withdrawn.
  • the withdrawal function is for this sharing, for example, when electronic device 200 and electronic device 100 are in a carrier call, electronic device 300 can withdraw files shared with electronic device 200 during the carrier call.
  • the withdrawal function is for any sharing process, for example, when electronic device 200 and electronic device 100 are in a carrier call, electronic device 300 and electronic device 200 share files, and after the carrier call ends (for example, the user hangs up or drops the call), electronic device 300 can withdraw files shared with electronic device 200 during the carrier call.
• File sharing supports breakpoint resumption after disconnection, that is, when the link used to transmit files is disconnected and then re-established, the electronic device 300 can continue to send files to the electronic device 200 over the re-established link, starting from the position in the file where the previous transmission was interrupted. For example, when the electronic device 200 and the electronic device 100 are in an operator call, the electronic device 300 sends file 1 to the electronic device 200 through the third link or the auxiliary link of the first link (which is relayed by the electronic device 100). At a certain moment, the third link or the auxiliary link of the first link is disconnected, and the first quarter of file 1 has been transmitted. After the disconnected link is re-established, the electronic device 300 can continue to send the last three quarters of file 1 to the electronic device 200 through the re-established link.
  • file sharing supports breakpoint resuming for re-sharing, that is, the untransmitted part of the file that was not fully transmitted during the current sharing can be continued to be transmitted during the next sharing.
• For example, during the current sharing, electronic device 300 sends the first quarter of file 2 to electronic device 200; during the next sharing, electronic device 300 can continue to send the last three quarters of file 2 to electronic device 200 (for example, electronic device 200 and/or electronic device 300 displays a prompt message to prompt for breakpoint resuming).
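• Both forms of breakpoint resumption described above can be realized with offset-based sending, as in the following sketch; the send_chunk transport is a hypothetical assumption and this is an illustration only.

    import os

    def send_file_resumable(path, send_chunk, acked_offset=0, chunk_size=64 * 1024):
        """Send `path` starting from `acked_offset`; return the offset reached.

        `send_chunk(offset, data)` may raise ConnectionError when the link drops;
        the returned offset is then used to resume on the re-established link or
        during the next sharing session.
        """
        size = os.path.getsize(path)
        offset = acked_offset
        with open(path, "rb") as f:
            f.seek(offset)
            while offset < size:
                data = f.read(chunk_size)
                try:
                    send_chunk(offset, data)
                except ConnectionError:
                    return offset            # resume from here later
                offset += len(data)
        return offset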
  • differential transmission is supported during file sharing, that is, for multiple files with the same file name, the differential contents can be transmitted after being differentiated, and the shared device can restore the original file (that is, the above-mentioned multiple files) based on the received differential contents.
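• One possible (illustrative, not normative) realization of differential transmission is block-level hashing: the sender transmits only the blocks whose hashes differ from the receiver's existing copy, and the receiver patches its copy to restore the original file.

    import hashlib

    BLOCK = 64 * 1024

    def block_hashes(data: bytes):
        """Hashes of fixed-size blocks, computed by the receiver over its old copy."""
        return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
                for i in range(0, len(data), BLOCK)]

    def make_delta(new: bytes, old_hashes):
        """Return (total_blocks, {index: block}) for blocks that changed or were added."""
        delta = {}
        for i in range(0, len(new), BLOCK):
            idx = i // BLOCK
            block = new[i:i + BLOCK]
            if idx >= len(old_hashes) or hashlib.sha256(block).hexdigest() != old_hashes[idx]:
                delta[idx] = block
        return (len(new) + BLOCK - 1) // BLOCK, delta

    def apply_delta(old: bytes, total_blocks, delta):
        """Rebuild the new file from the receiver's old copy plus the delta blocks."""
        blocks = [old[i * BLOCK:(i + 1) * BLOCK] for i in range(total_blocks)]
        for idx, block in delta.items():
            blocks[idx] = block
        return b"".join(blocks)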
  • concurrent transmission is supported during file sharing, that is, multiple files can be transmitted simultaneously and/or transmitted using the same link.
• When sharing a file, the file may be compressed first and then shared. After the shared device receives the compressed file, the file may be decompressed to obtain the original file.
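• A simple illustration of compress-then-share using the Python standard library (zlib); the actual compression format used by the devices is not specified in this application, so this is only an example.

    import zlib

    def compress_for_sharing(data: bytes) -> bytes:
        return zlib.compress(data, 6)

    def decompress_received(payload: bytes) -> bytes:
        return zlib.decompress(payload)

    # Example round trip.
    original = b"example file contents" * 1000
    assert decompress_received(compress_for_sharing(original)) == original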
• When sharing files, after the shared device receives a file, it can save the file to a preset directory. Not limited to this, in other examples, the shared device can also determine the saving directory based on the type of the file. This application does not limit the way the file is saved.
• In some embodiments, the selectable shared devices of the electronic device 300 may include devices discovered by, connected to, or whose device information is stored in or otherwise acquired by the electronic device 100, such as devices added by the electronic device 100 through a scan function. This application does not limit the specific acquisition method.
  • FIG10 illustrates an example in which the electronic device 300 determines shared data according to user operations.
  • the electronic device 300 may also preset shared data, for example, determining multimedia data of a foreground application window as shared data.
  • FIG10 illustrates an example in which the electronic device 300 determines a shared device based on user operations.
  • the electronic device 300 may also preset a shared device, for example, determining the electronic device 200 that is currently conducting an operator call with the electronic device 100 as a shared device.
  • the above implementation manner is illustrated by taking the example of the electronic device 200 automatically accepting the request indicated by the message 1 sent by the electronic device 300.
  • the electronic device 200 can first display a prompt message (for example, including the characters "User A invites you to watch together"), and then determine whether to accept the request indicated by the message 1 based on the received user operation.
  • the above implementation manner is illustrated by taking the electronic device 200 displaying the shared data of the electronic device 300 in full screen as an example.
  • the electronic device 200 may also display the window of the foreground application and the shared data sent by the electronic device 300 in split screen.
  • the electronic device 200 may also run a window in the background for playing the shared data sent by the electronic device 300.
  • the electronic device 200 may also display a floating window in the user interface of the foreground application, and the floating window is used to play the shared data sent by the electronic device 300.
  • the present application does not limit the manner in which the shared device outputs the shared data.
  • the user can select multiple shared data, and the electronic device 300 can share the multiple shared data selected by the user with the electronic device 200.
• In some examples, the electronic device 200 can display the multiple shared data in split screen. In other examples, the electronic device 200 can play one of the multiple shared data in the foreground and play the other shared data in the background. In still other examples, the electronic device 200 can play the multiple shared data in combination with discovered/connected devices, for example, the electronic device 200 plays part of the shared data, and a device discovered/connected by the electronic device 200 plays the other shared data.
  • This application does not limit the way in which the shared device outputs multiple shared data.
  • the above implementation manner is illustrated by taking the selected shared device as the electronic device 200 as an example.
  • the user can select multiple shared devices, and the electronic device 300 can share the shared data selected by the user to the multiple shared devices.
  • the shared data received by at least two of the multiple shared devices are the same.
  • the shared data received by at least two of the multiple shared devices are different. This application does not limit this.
  • Figure 10 illustrates an example in which the trigger operation of the sharing function (used to trigger the sharing of software and hardware resources of the electronic device 300) is received by the electronic device 300.
  • the trigger operation can also be received by the electronic device 100.
  • the electronic device 100 can display a user interface 1210, which is similar to the user interface 410 shown in FIG. 4 , except that option 412C in the user interface 1210 is selected, and the user interface 1210 also includes a list 1211 related to option 412C, and the list 1211 may include multiple options for sharing data to be selected, for example: option 1211A (including the characters "local file"), option 1211B (including the characters "my computer file”), option 1211C (including the characters "my car file”), and option 1211D (including the characters "more devices”).
  • option 1211A can be used to trigger the display of multiple files stored in the electronic device 100, and at least one of the multiple files can be used as shared data.
  • Option 1211B can be used to trigger the display of multiple files stored in the computer (i.e., the electronic device 300) discovered/connected by the electronic device 100, and at least one of the multiple files can be used as shared data.
  • Option 1211C can be used to trigger the display of multiple files stored in the car discovered/connected by the electronic device 100, and at least one of the multiple files can be used as shared data.
  • Option 1211D can be used to trigger the option of displaying files of other devices discovered/connected by the electronic device 100, such as the option of files stored in devices such as smart watches and smart screens.
  • the electronic device 100 can determine the shared data based on the user operation input by the user based on the list 1211.
  • the electronic device 100 displays multiple files stored in the electronic device 300 in response to the touch operation on option 1211B, and then the electronic device 100 determines that the multimedia data stream of at least one file among the multiple files is shared data in response to the touch operation on the at least one file, and the above-determined shared data can be sent to the shared device for output.
  • the electronic device 100 may display a user interface 1220.
• The user interface 1220 may be displayed by the electronic device 100 in response to a user operation for option 412C in the user interface 410 shown in FIG. 4.
• Alternatively, the user interface 1220 may be displayed by the electronic device 100 in response to a user operation for option 1211A in the user interface 1210 shown in FIG. 12A.
• The user interface 1220 may be used to display files (such as but not limited to pictures, documents, videos, etc.) recently operated (such as received, modified, collected, etc.) by the electronic device 100, wherein at least one file may be used as shared data.
• The user interface 1220 also includes a control 1221 for more devices, and the control 1221 may be used to trigger the display of files stored in at least one device discovered/connected by the electronic device 100, wherein at least one file may be used as shared data.
  • the above-mentioned determined shared data may be sent to the shared device for output.
• In some embodiments, the electronic device 100 may display a list of shared devices, which may include multiple options of shared devices to be selected, similar to the way the electronic device 100 displays the list of shared data. Then, the electronic device 100 may determine the shared device in response to a user operation on the list. In other examples, the electronic device 100 may also preset the shared device.
  • the electronic device 100 and/or the electronic device 300 may enable the sharing function in the above embodiment by default. In another embodiment, the electronic device 100 and/or the electronic device 300 may enable the sharing function in the above embodiment in response to a user operation.
  • the above implementation is illustrated by taking the electronic device 100 in a call state as an example.
• Not limited to this, the electronic device 100 may also not be conducting an operator call; that is, when the electronic device 100 is in a non-call state, the electronic device 300 may share software and hardware resources with the devices discovered/connected by the electronic device 100.
• The trigger operation used to trigger the sharing of the software and hardware resources of the electronic device 300 may also be implemented based on other interfaces.
  • the electronic device 100 may display a notification interface in response to a touch operation (e.g., sliding downward from the upper edge of the screen).
  • the electronic device 100 may display a selection interface for sharing data/shared devices in response to a user operation (e.g., a touch operation) on an instant sharing control in the notification interface.
  • the trigger operation may also be a specific gesture.
• For example, when the electronic device 300 displays the user interface of a video application, it may share the multimedia data stream of the window of the video application targeted by a sliding operation to the shared device in response to a specific sliding operation on the user interface.
  • the present application does not limit the specific form of the trigger operation.
• The following introduces the sharing method involved in the present application. The method can be applied to the sharing system 10 shown in Figure 1, Figure 3A, or Figure 3B.
  • FIG. 13 is a flow chart of a sharing method provided in an embodiment of the present application.
  • Fig. 13 is described by taking the sharing device as the electronic device 300 and the shared device as the electronic device 200 as an example.
  • the electronic device 300 may, but is not limited to, perform the following steps:
  • the electronic device 100 may obtain information of the electronic device 300 and the electronic device 200 , such as discovering/connecting the electronic device 300 and the electronic device 200 .
• S11 The electronic device 300 may receive a user operation, which may be used to trigger the sharing of the software and hardware resources of the electronic device 300 with the electronic device 200 (i.e., the trigger operation in the above embodiment), and the electronic device 300 may send a message of a sharing request to the electronic device 200, which is used to request the sending of shared data to the electronic device 200.
  • the electronic device 300 may obtain the information of the electronic device 200 through the electronic device 100, and send a message of a sharing request to the electronic device 200 according to the information of the electronic device 200.
• the electronic device 300 may display a sharing entrance, which may be used to trigger the function of sharing the software and hardware resources of the electronic device 300 with a device discovered, connected or otherwise obtained by the electronic device 100, and the above trigger operation may include a user operation on the sharing entrance. For example, as shown in FIG. 10, the sharing entrance is a floating window 432 in the user interface 1010, and the trigger operation may include a user operation on any one of the options in the floating window 432.
  • the electronic device 300 may display the sharing entrance in any of the following ways, but is not limited to:
  • Method 1 When the first condition is met, the electronic device 100 may send a first notification message to the electronic device 300 through the second link, and the electronic device 300 may display a sharing entrance according to the first notification message.
• the first condition is that the electronic device 100 displays a sharing entrance, which can be understood as meaning that the electronic device 100 and the electronic device 300 can display the sharing entrance at the same time.
  • the first condition is that the electronic device 100 receives a user operation for the sharing entrance, and for specific examples, see Figure 5.
  • the first condition is that the electronic device 100 receives a user operation for migrating the operator call with the electronic device 200 to the electronic device 300, and for specific examples, see Figures 6 and 7.
  • the first condition is that the distance between the electronic device 300 and the electronic device 100 is less than or equal to the preset distance, which is not limited in this application.
• Method 2 When the second condition is met, the electronic device 300 may display a sharing entrance.
  • the information of the sharing entrance may be sent by the electronic device 100 to the electronic device 300 through the second link.
  • the second condition is that the electronic device 300 receives a user operation for migrating the operator call between the electronic device 100 and the electronic device 200 to the electronic device 300.
  • the second condition is that the electronic device 300 receives a user operation for discovering/connecting the electronic device 200.
  • the second condition is that the distance between the electronic device 300 and the electronic device 100 is less than or equal to a preset distance, which is not limited in this application.
  • the trigger operation may also be received by the electronic device 100.
  • the electronic device 100 may send a second notification message to the electronic device 300, and the electronic device 300 may send a message of a sharing request to the electronic device 200 according to the second notification message.
  • the electronic device 100 may display a sharing entrance, and the description of the sharing entrance is similar to the description of the sharing entrance displayed by the above-mentioned electronic device 300.
• the trigger operation may include a user operation on any one of the options in the floating window 412. In some examples, as shown in FIG. 12A, the trigger operation may include a user operation on any one of the options in the list 1211 shown in the user interface 1210.
  • the trigger operation may include a user operation on a control 1221 in the user interface 1220.
  • the electronic device 300 may send a message of a sharing request to the electronic device 200 through the electronic device 100.
  • the electronic device 300 may send a message of a sharing request to the electronic device 100 through the second link, and then the electronic device 100 may send a message of a sharing request to the electronic device 200 through the first link.
• the electronic device 300 may directly send a message of a sharing request to the electronic device 200.
  • the electronic device 300 may send a sharing request message to the electronic device 200 via a third link.
  • the electronic device 100 when the electronic device 100 receives a trigger operation, the electronic device 100 may send a sharing request message to the electronic device 200. In some examples, if the electronic device 200 accepts the sharing request, the electronic device 100 may then notify the electronic device 300 to share software and hardware resources with the electronic device 200.
• before S11, the method further includes: the electronic device 300 determines the shared data.
  • the electronic device 300 may determine the shared data according to a first rule.
  • the first rule includes: presetting the multimedia data stream of the screen of the electronic device 300 as the shared data.
  • the electronic device 300 may determine the shared data in response to a user operation.
  • the electronic device 300 may display multiple software resource or hardware resource options in the electronic device 300.
  • the electronic device 300 may respond to a user operation on at least one of the multiple options and determine that the resource indicated by the at least one option is shared data.
  • the electronic device 300 may determine the shared data according to a trigger operation.
  • the trigger operation is for shared data.
  • the electronic device 300 may receive a third notification message sent by the electronic device 100, and determine the shared data based on the third notification message. In some examples, the electronic device 100 may determine the shared data in response to a user operation. For specific examples, see (B) of FIG. 5 , FIG. 12A , and FIG. 12B .
• before S11, the method further includes: the electronic device 300 determines a shared device.
  • the shared device may be any device discovered, connected, or acquired by the electronic device 100 in other ways, and FIG. 13 illustrates the shared device as the electronic device 200.
  • the electronic device 100 may first obtain information of at least one device, for example, discovering at least one device through a near field communication method and/or a far field communication method, and the electronic device 100/electronic device 300 may determine the shared device from the at least one device.
  • the electronic device 300 may determine the shared device according to a second rule.
  • the second rule includes: presetting a device that conducts a carrier call with the electronic device 100 as a shared device.
  • the electronic device 300 may determine the shared device in response to a user operation.
  • the electronic device 300 may display multiple device options obtained by the electronic device 100.
• the electronic device 300 may determine that the device indicated by the at least one option is a shared device in response to a user operation on at least one of the multiple options. For a specific example, see FIG. 10.
  • the electronic device 300 may determine the shared device based on a trigger operation.
  • the electronic device 300 may receive a trigger operation: a user operation on a sharing option among multiple sharing options 711A included in a call window 711 shown in a user interface 710, and determine that the shared device is the call counterpart (i.e., the electronic device 200) associated with the call window 711 based on the trigger operation.
  • the electronic device 300 may receive the fourth notification message sent by the electronic device 100, and determine the shared device based on the fourth notification message. In some examples, the electronic device 100 may determine the shared device in response to a user operation. The specific example is similar to FIG. 10 .
  • S12 The electronic device 300 determines a target communication link.
  • the target communication link can be used to send shared data to the electronic device 200, and the target communication link may include a first link and/or a third link.
  • the case where the target communication link includes the first link can be understood as the electronic device 300 sending the shared data through the electronic device 100 relay.
  • the case where the target communication link includes the third link can be understood as the electronic device 300 sending the shared data by itself.
  • the target communication link may include at least one of the following: a third link, a main link of the first link, and an auxiliary link of the first link.
  • the way of transmitting shared data includes, for example but not limited to: transmission only through the main link, transmission only through the auxiliary link, transmission only through the third link, and transmission through the auxiliary link and the third link.
  • the target communication link may include multiple physical links and/or logical links, wherein the physical links established through different communication modes are different, for example, the links established through communication modes such as Bluetooth, WLAN (such as Wi-Fi), D2D, NFC, UWB, infrared, satellite, and cellular communication are different.
  • Multiple logical links established through the same communication mode may be different, for example, the logical links of the same communication mode established through different ports of an electronic device are different, for example, a relay link and a traversal link established through a cellular communication mode or a Wi-Fi communication mode are different logical links.
  • the electronic device 300 may determine the target communication link according to a preset rule. In some examples, the electronic device 300 presets the target communication link as the third link.
  • the electronic device 300 may determine the target communication link in response to a user operation.
  • the user operation for setting the target communication link may also be received by the electronic device 100, which determines the target communication link based on the user operation and then notifies the electronic device 300.
  • the electronic device 300 may determine the target communication link according to an algorithm.
• the electronic device 300 can analyze a communication map to obtain the characteristics of the communication links in space and time, and determine the link with better communication quality as the target communication link based on the characteristics (a minimal sketch of such quality-based link selection is given after the examples below), wherein the communication map can include crowdsourced data of multiple electronic devices, such as signal strength parameters (such as reference signal received power (RSRP)) and signal quality parameters (such as reference signal received quality (RSRQ)), call QoE parameters (such as packet loss rate, delay, number of interruptions, etc.), and link transmission quality parameters (such as packet loss rate, delay, jitter, etc.).
  • the communication quality is determined by, for example, but not limited to, packet loss rate, delay, jitter, bandwidth, etc.
  • the electronic device 100 can also determine the target communication link according to the algorithm and then notify the electronic device 300.
  • the electronic device 300 may determine the target communication link according to the current scenario.
• For example, in some scenarios, the target communication link used by the electronic device 300 to share software and hardware resources with the electronic device 200 may be the third link; in other scenarios, the target communication link may include the first link.
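• The following is a minimal, non-limiting sketch of the quality-based link selection described above; the link names, metric values and scoring weights are illustrative assumptions rather than values from the embodiment, and the score simply penalizes packet loss, delay and jitter while rewarding bandwidth.
```python
from dataclasses import dataclass

@dataclass
class LinkMetrics:
    # Hypothetical per-link measurements; in practice these could come from the
    # "communication map" (crowdsourced RSRP/RSRQ, call QoE, transmission quality).
    name: str
    packet_loss: float    # fraction in [0, 1]
    delay_ms: float
    jitter_ms: float
    bandwidth_mbps: float

def link_score(m: LinkMetrics) -> float:
    """Lower is better: penalize loss, delay and jitter, reward bandwidth."""
    return 100.0 * m.packet_loss + m.delay_ms + 2.0 * m.jitter_ms - 0.5 * m.bandwidth_mbps

def choose_target_link(candidates: list[LinkMetrics]) -> LinkMetrics:
    return min(candidates, key=link_score)

links = [
    LinkMetrics("first_link_main", 0.01, 60, 8, 5),
    LinkMetrics("first_link_aux", 0.02, 45, 12, 20),
    LinkMetrics("third_link", 0.005, 30, 5, 50),
]
print(choose_target_link(links).name)  # e.g. "third_link"
```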
  • the order of S12 and S11 is not limited.
  • S13 The electronic device 300 and the electronic device 200 perform coding negotiation.
  • coding negotiation may be performed, and the coding/decoding method obtained by the coding negotiation may be used to transmit the multimedia data stream.
  • the electronic device 300 may initiate coding negotiation with the electronic device 200, for example, by sending a message for requesting coding negotiation.
  • the electronic device 200 may initiate coding negotiation with the electronic device 300.
  • the electronic device 300 and the electronic device 200 may initiate coding negotiation together.
  • the coding method obtained by the above coding negotiation may be used to send the shared data of the electronic device 300 to the electronic device 200.
  • the electronic device 300 can directly perform encoding negotiation with the electronic device 200. In another embodiment, the electronic device 300 can perform encoding negotiation with the electronic device 200 through the electronic device 100. In another embodiment, the electronic device 100 can replace the electronic device 300 and the electronic device 200 to perform encoding negotiation.
  • the implementation method of the encoding negotiation can be found in the description of Figure 15 below and will not be described in detail for the time being.
  • the order of S13 and S11 is not limited, for example, they can be executed simultaneously, and the message of the sharing request can include information for requesting coding negotiation.
  • the order of S13 and S12 is not limited.
  • S14 The electronic device 300 captures the shared data.
  • the electronic device 300 can capture the software resources of the electronic device 300 as shared data.
  • the electronic device 300 can capture the displayed image and/or the played audio (multimedia data stream output by the foreground) as shared data, and the image includes, for example, the image of the screen of the electronic device 300 or the image of the software of the electronic device 300.
  • the electronic device 300 can generate a multimedia data stream (such as a game-related multimedia data stream) as shared data.
  • the electronic device 300 can read a stored file as shared data.
  • the electronic device 300 can obtain location information as shared data through GNSS.
  • the electronic device 300 can receive broadcast data of a channel sent by a base station through a 3G/4G/5G/6G broadcast channel as shared data, and optionally, the electronic device 300 may not output the broadcast data.
  • the electronic device 300 can capture the received user operation event and related information (such as the occurrence time) as shared data.
  • the electronic device 300 can capture the content on the clipboard as shared data.
  • the electronic device 300 can obtain a hyperlink (such as a hyperlink of a currently displayed web page) as shared data.
  • the electronic device 300 may obtain an installation package or other executable file of the software as the shared data.
  • the electronic device 300 may capture hardware resources of the electronic device 300 as shared data. In some examples, the electronic device 300 may capture images through a camera as shared data. In some examples, the electronic device 300 may capture audio through a microphone as shared data.
• The order of S14 and any one of S11, S12, and S13 is not limited.
  • the shared data includes not only the data of the software and hardware resources of the electronic device 300 (resource data for short), but also the control data of the sharing function, such as but not limited to: information indicating the start of sharing, information indicating the pause of sharing, and information indicating the end of sharing.
  • the resource data and the control data can be transmitted on the same link, and in other examples, the resource data and the control data can be transmitted separately using different types of links.
• the electronic device 300 can use the encoding method obtained by the encoding negotiation to encode the shared data (such as a multimedia data stream).
  • the implementation method of encoding can be found in the description of Figure 16 and will not be described in detail for the time being.
  • the electronic device 300 can packetize the encoded data to convert the format of the data into a format of data that can be transmitted in the network. For example, a network protocol header (such as an IP protocol header) and a transmission protocol header (such as an RTP protocol header) can be added to the encoded data.
  • the packetized data can be called a data packet.
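• As an illustrative sketch of the packetization step, the following prepends a minimal 12-byte transport-protocol header in the RTP (RFC 3550) layout to the encoded data; the network-layer (IP) and UDP headers would normally be added by the operating system's socket stack when the data packet is sent, and the field values shown are assumptions.
```python
import struct

def rtp_packetize(payload: bytes, seq: int, timestamp: int, ssrc: int,
                  payload_type: int = 96, marker: bool = False) -> bytes:
    """Prepend a minimal 12-byte RTP fixed header (RFC 3550 layout) to encoded data."""
    version = 2
    byte0 = version << 6                        # no padding, no extension, CC = 0
    byte1 = (int(marker) << 7) | (payload_type & 0x7F)
    header = struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + payload

packet = rtp_packetize(b"\x00\x01encoded-frame", seq=1, timestamp=3000, ssrc=0x1234)
print(len(packet))   # 12-byte header + payload
```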
  • the electronic device 300 can shunt the packetized data (data packet), that is, distribute the data packet to at least one physical link/logical link in the target communication link for transmission.
  • the shunt can be performed by redundant supplementary packets, that is, when transmitting any data packet, part or all of the content of the data packet can be transmitted at least once (which can be called a supplementary packet of the data packet), and the content of each transmission can be the same or different (for example, including three situations: completely the same, partially the same, and completely different), and the time of each transmission can be the same or different, so as to ensure the transmission quality of real-time sharing (for example, real-time and/or stability).
  • the electronic device 300 can also send the data packet to the electronic device 200 only once.
  • the electronic device 300 can refer to at least one of the following parameters when performing traffic diversion: transmission quality, bandwidth, tariff, power consumption, business type, etc.
  • the electronic device 300 can select a link with high transmission quality (e.g., low latency, low jitter, and low packet loss rate) in the target communication link to transmit shared data.
  • the electronic device 300 can allocate different data packets to communication links with different bandwidths, such as data packets of different business types can be allocated to communication links with different bandwidths, and the bandwidth of any communication link can meet the transmission bandwidth required by the business type of the corresponding allocated data packet.
  • the electronic device 300 can select a link with lower or no tariff in the target communication link to transmit shared data.
• when the power of the electronic device 300 is low, the third link can be selected to transmit shared data, and when the power of the electronic device 300 is high, the third link and the first link can be selected to transmit shared data.
  • the electronic device 100 can allocate data packets of different business types to different communication links, such as audio, video, files, and locations are transmitted through different communication links.
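• A minimal sketch of allocating data packets of different business types to different communication links is given below; the link table, the tariff labels and the required-bandwidth figures are assumptions for illustration only.
```python
LINKS = {
    # Hypothetical properties of the links in the target communication link.
    "first_link_main": {"bandwidth_mbps": 2,  "tariff": "included"},
    "first_link_aux":  {"bandwidth_mbps": 20, "tariff": "metered"},
    "third_link":      {"bandwidth_mbps": 50, "tariff": "free"},
}

# Assumed transmission bandwidth required by each business type (Mbps).
REQUIRED_BANDWIDTH = {"audio": 0.1, "video": 8, "file": 1, "location": 0.01}

def pick_link(business_type: str) -> str:
    """Choose a link whose bandwidth meets the business type's requirement,
    preferring the one with the lower (or no) tariff."""
    need = REQUIRED_BANDWIDTH.get(business_type, 1)
    candidates = [n for n, p in LINKS.items() if p["bandwidth_mbps"] >= need]
    order = {"free": 0, "included": 1, "metered": 2}
    return min(candidates, key=lambda n: order[LINKS[n]["tariff"]])

print(pick_link("video"))   # e.g. "third_link"
print(pick_link("audio"))   # e.g. "third_link" (free and sufficient)
```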
  • Audio stream and video stream are transmitted separately.
• the audio stream and video stream are encoded separately/independently, the audio stream is transmitted through link A, and the video stream is transmitted through link B, wherein link A is, for example, a communication link with high transmission quality, and link B is, for example, a communication link with large bandwidth and/or low or no tariff.
  • Audio and video streams are transmitted separately.
  • the application-level/system-level/background audio stream and the new call data stream are mixed and encoded.
  • the encoded audio stream is transmitted via link A, and the video stream is transmitted via link B.
• whether a data stream is basic or enriched can be related to the encoding (such as layered encoding): a data stream with a higher degree of encoding can be an enriched data stream, and a data stream with a lower degree of encoding can be a basic data stream. For example, the thumbnail of an image is the basic data, and the original image is the enriched data.
  • Audio and video streams are transmitted together. Audio and video streams with the same timestamp are encoded together, for example, transmitted on the same link, or dynamically migrated to other links according to changes in link quality to ensure optimal transmission.
  • the audio stream and/or video stream is transmitted in the form of redundant supplementary packets.
  • the supplementary packets can be transmitted on the same link, for example, each time carrying the coded data of two adjacent frames, and in other examples, the supplementary packets can be transmitted through at least one other link.
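• The redundant supplementary packet mechanism described above can be sketched as follows; the send callables and the supplement ratio are placeholders, and in practice the supplementary content may be identical, partial or different from the original packet and may be sent on the same or another link.
```python
def send_with_redundancy(packet: bytes, primary_send, supplementary_send,
                         supplement_ratio: float = 1.0):
    """Send `packet` once on the primary link and re-send a copy covering
    `supplement_ratio` of its bytes as a supplementary packet."""
    primary_send(packet)
    cut = max(1, int(len(packet) * supplement_ratio))
    supplementary_send(packet[:cut])      # identical copy when ratio == 1.0

# stub usage with in-memory "links"
sent_a, sent_b = [], []
send_with_redundancy(b"frame-42", sent_a.append, sent_b.append, supplement_ratio=0.5)
print(sent_a, sent_b)   # [b'frame-42'] [b'fram']
```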
  • the electronic device 300 can send the data packet after encoding, packaging and shunting to the electronic device 200 through the target communication link.
  • the target communication link may include a first link.
• the electronic device 300 sends the shared data to the electronic device 200 through the first link, which may include: the electronic device 300 first sends the shared data to the electronic device 100 through the second link, and then the electronic device 100 sends the shared data to the electronic device 200 through the first link.
• the electronic device 300 can send shared data based on the communication protocol corresponding to the target communication link. For example, when the target communication link is a link established through a cellular communication method and/or a Wi-Fi method, the electronic device 300 can send shared data based on the transmission control protocol (TCP) or the user datagram protocol (UDP).
  • the electronic device 100 may also encode, package, and split the shared data.
  • the electronic device 100 may also send the shared data of the electronic device 300 to the electronic device 200 .
  • the electronic device 100 can encode, package, and split the shared data, and then send the processed shared data to the electronic device 200 through the first link, or send the processed shared data to the electronic device 300 through the second link, and the electronic device 300 sends it to the electronic device 200.
  • the electronic device 200 may, but is not limited to, perform the following steps:
  • S21 The electronic device 200 receives a sharing request.
  • the electronic device 200 may receive a sharing request message sent by the electronic device 100 or the electronic device 300 .
  • the electronic device 200 can determine whether to accept the sharing request according to a preset rule, for example, accepting the sharing request by default. In another embodiment, the electronic device 200 can also determine whether to accept the sharing request according to the received user operation. In some examples, the electronic device 200 can display a prompt message according to the message of the sharing request, and then accept the sharing request or reject the sharing request in response to the user operation.
  • the electronic device 200 may send a response message to the electronic device 100 or the electronic device 300 , where the response message indicates whether to accept the above-mentioned sharing request.
  • the electronic device 200 may receive the shared data from the electronic device 300, for example, by executing S22-S25.
  • S22 The electronic device 200 determines a target communication link.
  • S22 is an optional step.
  • the electronic device 200 can determine the target communication link by itself, and the specific description is similar to that of S12. In another embodiment, the electronic device 200 can receive a message sent by the electronic device 100 and/or the electronic device 300, and determine the target communication link according to the message.
  • the order of S22 and S21 is not limited.
  • the sharing request message received by the electronic device 200 includes information indicating the target communication link.
  • S22 and S21 may be executed simultaneously.
  • S23 The electronic device 200 and the electronic device 100 or the electronic device 300 perform coding negotiation.
  • S23 is similar to S13, and for details, please refer to the description of S13.
  • the decoding method obtained through the above coding negotiation can be used by the electronic device 200 to receive the shared data of the electronic device 300 .
  • the order of S23 and S21 is not limited.
  • the message of the sharing request received by the electronic device 200 may include information for requesting coding negotiation.
  • the order of S23 and S22 is not limited.
  • S24 The electronic device 200 receives the shared data.
  • the electronic device 200 may receive the shared data sent by the electronic device 100 and/or the electronic device 300 .
  • the electronic device 200 may perform aggregation processing on the received shared data.
  • the aggregation processing may include sorting and/or de-redundancy.
  • a cache queue may be created on the electronic device 200 side, and the shared data received by the electronic device 200 may be input into the cache queue, which may output the sorted and de-redundant shared data at a preset time interval.
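• A minimal sketch of such a receive-side cache queue is given below: packets are buffered by sequence number, duplicates introduced by supplementary packets are dropped, and the buffered data is released in order; the class and field names are assumptions.
```python
import heapq

class AggregationQueue:
    """Buffer received (seq, data) pairs, drop duplicates, and release in order."""

    def __init__(self):
        self._heap = []          # min-heap ordered by sequence number
        self._seen = set()       # de-redundancy across primary + supplementary packets

    def push(self, seq: int, data: bytes):
        if seq in self._seen:
            return               # duplicate (e.g. a redundant supplementary packet)
        self._seen.add(seq)
        heapq.heappush(self._heap, (seq, data))

    def pop_ready(self):
        """Called at a preset interval; yields buffered data in sequence order."""
        while self._heap:
            yield heapq.heappop(self._heap)

q = AggregationQueue()
for seq, data in [(2, b"b"), (1, b"a"), (2, b"b")]:    # out of order + duplicate
    q.push(seq, data)
print([seq for seq, _ in q.pop_ready()])                # [1, 2]
```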
  • the electronic device 200 may unpack the received shared data (eg, aggregated shared data), such as removing the header of the data packet.
  • the electronic device 200 can decode the received shared data (e.g., unpacked shared data), i.e., decode it using the decoding method obtained by the above-mentioned encoding negotiation process.
  • the implementation method of decoding can be found in the description of Figure 16 below and will not be described in detail for the time being.
  • the order of S24 and S22 is not limited. For example, when the electronic device 200 receives the shared data through the target communication link, the target communication link is determined.
  • S25 The electronic device 200 outputs the shared data.
  • the electronic device 200 may output the decoded sharing data, such as a multimedia data stream.
  • the electronic device 200 may, but is not limited to, display the shared data through a display screen and/or play the shared data through a speaker. For specific examples, see FIG. 10 .
  • the present application does not limit the manner of outputting the shared data.
  • the electronic device 100 and the electronic device 300 may establish a second link.
  • the second link is established before the first link is established, or the second link is established when the first link is established, or the second link is established after the first link is established.
  • the electronic device 100 and the electronic device 200 may establish a first link.
• when the first link is a NewTalk link, it may include a main link and an auxiliary link, wherein:
  • the establishment time of the auxiliary link can be the same as or different from the establishment time of the main link, for example, the establishment time of the auxiliary link is earlier than the establishment time of the main link.
• In some examples, the establishment time of the auxiliary link of the first link can be earlier or later than the time when the electronic device 100/electronic device 300 starts to display the sharing entrance.
• In some examples, the establishment time of the auxiliary link of the first link can be earlier or later than the time when the electronic device 100/electronic device 300 receives the trigger operation.
  • the auxiliary link may be, but is not limited to, at least one of the following:
  • Case 1 multiplexing the main link as an auxiliary link.
  • the shared data can be transmitted through the multimedia channel of QCI2 in the main link, and the NewTalk data stream is transmitted through other channels in the main link (such as the multimedia channel of QCI1).
• for example, the shared data is video data and the NewTalk data stream is voice data, such as in the scenario of screen sharing during a voice call.
• Case 2 the shared data and NewTalk data streams can share the main link, for example, the NewTalk data stream is transmitted first and then the shared data.
  • the shared data and NewTalk data streams are both video data or both voice data.
• the header fields of the shared data and the NewTalk data stream can be different, for example, the header of the real-time transport protocol (RTP) message (such as the synchronization source (SSRC) identifier) is different; a minimal sketch of distinguishing the two streams by SSRC is given after the cases below.
  • Case 3 Establishing other links as auxiliary links.
  • the data stream of NewTalk can be transmitted using the main link, and the shared data can be transmitted using the auxiliary link.
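• Building on Case 1/Case 2 above, the following sketch distinguishes the shared data stream from the NewTalk data stream on a multiplexed main link by reading the SSRC field of the RTP fixed header; the SSRC values are assumed to have been agreed during negotiation and are illustrative only.
```python
import struct

NEWTALK_SSRC = 0x1111   # assumed SSRC agreed for the NewTalk data stream
SHARED_SSRC  = 0x2222   # assumed SSRC agreed for the shared data stream

def classify(rtp_packet: bytes) -> str:
    """Read the SSRC field (bytes 8..11 of the RTP fixed header, RFC 3550) and
    route the packet to the NewTalk stream or the shared-data stream."""
    ssrc = struct.unpack_from("!I", rtp_packet, 8)[0]
    return {SHARED_SSRC: "shared_data", NEWTALK_SSRC: "newtalk"}.get(ssrc, "unknown")

# build a fake 12-byte RTP header followed by a payload for the usage example
header = struct.pack("!BBHII", 0x80, 96, 1, 3000, SHARED_SSRC)
print(classify(header + b"payload"))   # "shared_data"
```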
  • the method for establishing the auxiliary link can be, but is not limited to, at least one of the following:
• Method 1 The electronic device 100 can establish an auxiliary link with the help of the main link, that is, when establishing the auxiliary link, the electronic device 100 can negotiate the link establishment through messages transmitted in the main link.
• the electronic device 100 can carry information for establishing the auxiliary link in an RTCP or session initiation protocol (SIP) message transmitted in the main link, so as to request the electronic device 200 to establish the auxiliary link (i.e., request the link establishment negotiation) during the new call.
  • the information for establishing the auxiliary link includes, for example, the communication ID (such as SessionID) and address information (such as IP address) used for NAT penetration.
  • the electronic device 100 can call the NAT interface to traverse or relay to establish the auxiliary link.
• Method 2 The electronic device 100 can perform addressing through the network device 400 to establish an auxiliary link.
• any device can perform parameter binding on the network device 400, that is, associate identification information such as a phone number or OTT ID with a communication ID (such as a SessionID).
  • the electronic device 100 can obtain the communication ID of the electronic device 200 from the network device 400 according to the identification information such as the phone number and OTT ID of the electronic device 200, that is, addressing.
  • the communication ID obtained by addressing can be used to establish an auxiliary link.
• Method 3 The electronic device 100 can establish an auxiliary link through a peripheral device, wherein the peripheral device can be, but is not limited to, a nearby device, a far-field device, a device whose device information is stored by the electronic device 100, or another device that the electronic device 100 can obtain (the electronic device 100 may not store the information of the other device). For example, the electronic device 100 is a smart watch (such as one with the modem powered off), a tablet computer (such as one without a SIM card interface), a smart speaker or a headset, etc., which does not have the addressing capability.
• In some examples, the auxiliary link is an auxiliary link between the electronic device 100 and the electronic device 200; in other examples, the auxiliary link includes an auxiliary link 1 between the electronic device 100 and the peripheral device, and an auxiliary link 2 between the peripheral device and the electronic device 200, for example, when the electronic device 100 does not have the ability to directly establish an auxiliary link.
  • the situation in which the electronic device 100 in a call state establishes an auxiliary link may be situation one, situation two, or situation three described above. In other examples, the situation in which the electronic device 100 in a non-call state establishes an auxiliary link may be situation two or situation three described above.
• the electronic device 300 and the electronic device 200 may establish a third link.
  • the establishment time of the third link may be the same as or different from the establishment time of the first link, for example, the establishment of the third link is triggered after the auxiliary link of the first link is established. In some examples, the establishment time of the third link may be earlier or later than the time when the electronic device 100/electronic device 300 starts to display the sharing entrance. In some examples, the establishment time of the third link may be earlier or later than the time when the electronic device 100/electronic device 300 receives the triggering operation.
  • the method for establishing the third link may be, but is not limited to, at least one of the following:
• Method 1 The electronic device 300 can establish the third link with the help of the first link.
  • the electronic device 100 can send information for establishing the third link to the electronic device 200 through the main link and/or the auxiliary link of the first link, and the information for establishing the third link includes, for example, a communication ID (such as SessionID) for NAT penetration, address information (such as IP address), etc.
• Method 2 The electronic device 300 can perform addressing through the network device 400 to establish the third link.
  • the electronic device 300 can obtain the communication ID of the electronic device 200 from the network device 400 according to the identification information such as the telephone number and OTT ID of the electronic device 200, that is, addressing.
  • the communication ID obtained by addressing can be used to establish the third link.
• Method 3 The electronic device 300 can establish a third link by performing addressing with the help of the electronic device 100.
  • the electronic device 100 can bind the parameters of the electronic device 300 on the network device 400.
  • the electronic device 200 can bind the parameters of the electronic device 300 on the network device 400.
  • the electronic device 100 can obtain the communication ID of the electronic device 200 from the network device 400 according to the identification information such as the telephone number and OTT ID of the electronic device 200, that is, perform addressing. Then, the electronic device 100 can send the obtained communication ID of the electronic device 200 to the electronic device 300, so that the electronic device 300 can establish a third link with the electronic device 200 based on the communication ID.
  • the specific process can be seen in Figure 17 below.
• In some examples, when the first link is established, the third link can be established through the above-mentioned method 1, method 2 or method 3; when the first link is not established, the third link can be established through the above-mentioned method 2 or method 3 (method 1 relies on the first link).
  • S11 and S21 may also be replaced by: the electronic device 200 sends a request message to the electronic device 300, where the request message is used to request the electronic device 300 to share software and hardware resources, which may be understood as the electronic device 200 "inviting" the electronic device 300 to share.
• FIG. 13 illustrates the sharing process by taking one sharing device (the electronic device 300) and one shared device (the electronic device 200) as an example.
• In other embodiments, the sharing process may be performed between the sharing device and multiple shared devices.
• For the sharing process between the sharing device and any one of the multiple shared devices, please refer to the description of FIG. 13 .
  • the user can perform a trigger operation on the electronic device 100 or the electronic device 300 to share the software and hardware resources of the electronic device 300 with the electronic device 200, wherein the electronic device 100 and the electronic device 200 can perform NewTalk, and the above-mentioned sharing process can be performed during the NewTalk process, or after the NewTalk ends, or before the NewTalk starts, and the above-mentioned NewTalk can be migrated to the electronic device 300, or it can not be migrated to the electronic device 300.
  • the sharing function can be independent of the communication connection between the electronic device 100 and the electronic device 200, which greatly reduces the requirements and limitations of the sharing function, provides users with a simple and convenient sharing function that can be applied in a variety of scenarios, improves user experience, and also reduces the power consumption of the electronic device 100, which is more power-saving.
  • the target communication link for transmitting the software and hardware resources (i.e., sharing data) of the electronic device 300 may include multiple links, such as the main link of the first link and the third link, which greatly improves the data transmission quality, reduces the probability of sharing failure, and reduces jamming.
  • Figure 14 is a flowchart of another sharing method provided by an embodiment of the present application.
  • Figure 14 is illustrated by taking the first link as a new call link, the second link as a distributed communication link, and the target communication link as a third link as an example.
  • Figure 14 is illustrated by taking the electronic device 100 receiving a trigger operation as an example.
  • Figure 14 is illustrated by taking the new call between the electronic device 100 and the electronic device 200 being implemented through a call application as an example.
  • the method may include, but is not limited to, the following steps:
  • the cellular communication module of electronic device 100 and the cellular communication module of electronic device 200 establish a main link for a new call.
  • the electronic device 100 may send a message for requesting a NewTalk to the electronic device 200 in response to a user operation, and the electronic device 200 may accept the request indicated by the message in response to the user operation, and the electronic device 100 and the electronic device 200 may establish a main link for a new call through their respective cellular communication modules, which may be understood as the electronic device 100 requesting a new call with the electronic device 200.
  • the electronic device 200 may also request a new call with the electronic device 100.
  • the establishment of the main link of the new call can be used to trigger the establishment of the auxiliary link of the new call, that is, after the electronic device 100 and the electronic device 200 establish the main link, they can execute the auxiliary link establishment process, for example, to implement the following 2-4.
  • the cellular communication module of the electronic device 100 notifies the call application of the electronic device 100 to establish an auxiliary link for the new call.
  • the call application of electronic device 100 requests the call application of electronic device 200 to establish an auxiliary link for the new call.
  • the call application of electronic device 200 feeds back to the call application of electronic device 100 that the auxiliary link is successfully established.
  • the distributed communication module of the electronic device 100 and the distributed communication module of the electronic device 300 establish a distributed communication link.
  • the above 5 is executed before the above 1-4.
• when the electronic device 100 receives a message sent by the electronic device 200 to request a NewTalk, it can notify the electronic device 300 through the distributed communication link to output a corresponding incoming call reminder.
  • the call application of the electronic device 100 displays a floating window of the sharing function.
  • the electronic device 100 may display a user interface 410 , and the user interface 410 includes a floating window 412 of a sharing function.
• the call application of the electronic device 100 sends, through the distributed communication module of the electronic device 100 and based on the distributed communication link, a notification to the distributed communication module of the electronic device 300 to start the sharing function, and the distributed communication module of the electronic device 300 then sends the notification to the sharing application of the electronic device 300.
  • the sharing application of the electronic device 300 displays a floating window of the sharing function according to the notification sent by the distributed communication module of the electronic device 300.
  • the electronic device 300 may display a user interface 430 , and the user interface 430 includes a floating window 432 of a sharing function.
  • the above 6 and the above 7-8 can be executed together, or only one of them can be executed.
  • the order is not limited.
  • the order of the above 7-8 and any one of the above 1-5 is not limited.
• when the electronic device 100 determines to notify the electronic device 300 to start the sharing function, if the distributed communication link is not established, the above 5 can be executed first, and then the above 7-8 can be executed.
  • the call application of the electronic device 100 receives a trigger operation.
  • the electronic device 100 may determine to share the software and hardware resources of the electronic device 300 with the electronic device 200, for example, by executing the following 10-11.
  • the call application of electronic device 100 notifies the distributed communication module of electronic device 300 to share the software and hardware resources of electronic device 300 with electronic device 200 through the distributed communication module of electronic device 100 based on the distributed communication link, and the distributed communication module of electronic device 300 then sends the notification to the sharing application of electronic device 300.
  • the call application of electronic device 100 sends a sharing request to the call application of electronic device 200 through the new call link, and the call application of electronic device 200 then sends the sharing request to the playback application of electronic device 200.
  • the sharing request received by the electronic device 200 is used to request sharing of data with the electronic device 200 .
  • the playback application of the electronic device 200 may perform preparation operations, such as starting a process for receiving multimedia data streams and a process for playing multimedia data streams, for subsequent reception and playback of shared data.
  • the electronic device 200 and the electronic device 300 may establish a third link.
  • the order of the above 12 and any one of the above 1-9 is not limited.
  • the sharing application of the electronic device 300 sends the sharing data to the playing application of the electronic device 200 via the third link.
  • the playback application of the electronic device 200 plays the received shared data.
  • the shared data is a multimedia data stream of the screen of the electronic device 300 .
  • the electronic device 200 may display the user interface 1030 . Both the user interface 1020 and the user interface 1030 are used to display the screen of the electronic device 300 .
  • the above 15 is an optional step.
  • the electronic device 100 may disconnect the first link in response to a user operation for hanging up the new call, that is, the electronic device 100 may actively end the new call.
  • the electronic device 200 may actively end the new call.
  • the sharing application of the electronic device 300 sends the sharing data to the playing application of the electronic device 200 through the third link.
  • the playback application of the electronic device 200 plays the received shared data.
  • the electronic device 300 may continue to send the shared data to the electronic device 200.
  • the sharing application of the electronic device 300 and the playing application of the electronic device 200 determine to end the sharing.
  • the above 18 is an optional step.
  • the electronic device 300 may notify the electronic device 200 to end the sharing in response to a user operation, for example, the user operation is a user operation on option 1021E in the user interface 1020 shown in (B) of FIG. 10.
• the electronic device 200 may also notify the electronic device 300 to end the sharing in response to a user operation, for example, a user operation on a function control 1032 in the user interface 1030 shown in (B) of FIG. 10, which may trigger the display of an option for exiting the playback of the currently shared data.
  • the sharing application of the electronic device 300 cancels the floating window that displays the sharing function.
  • the above 19 is an optional step.
  • the electronic device 300 may cancel the floating window displaying the sharing function.
• the sharing application of the electronic device 300 notifies the distributed communication module of the electronic device 100 of the end of sharing through the distributed communication module of the electronic device 300 based on the distributed communication link, and the distributed communication module of the electronic device 100 then sends the notification to the call application of the electronic device 100.
  • the call application of the electronic device 100 cancels the floating window displaying the sharing function.
  • the electronic device 300 may notify the electronic device 100, and the electronic device 100 may cancel the floating window displaying the sharing function.
  • the above 12 may not be performed, and the above 13 may be replaced by: the sharing application of the electronic device 300 sends the sharing data to the distributed communication module of the electronic device 100 based on the distributed communication link through the distributed communication module of the electronic device 300, and the distributed communication module of the electronic device 100 then sends the sharing data to the call application of the electronic device 100; the call application of the electronic device 100 sends the sharing data to the call application of the electronic device 200 through the new call link, and the call application of the electronic device 200 then sends the sharing data to the playback application of the electronic device 200.
  • the sharing data can be transmitted by relaying through the electronic device 100, that is, transmitted through the distributed communication link and the new call link.
  • the above 13 can also be that the sharing data is transmitted through the third link, and is also relayed through the electronic device 100, which can be understood as redundant transmission.
  • redundant transmission can improve the transmission quality.
  • the electronic device 200 may also be a sharing device, and the electronic device 300 may be a shared device, and the specific description is similar to the above description.
• In one implementation, there may be multiple sharing devices; in another implementation, there may be multiple shared devices. The description of any sharing device sharing data with any shared device is similar to the above description.
• For example, the electronic device 200 is a sharing device, the electronic device 300 is a shared device, and there are other shared devices.
  • the electronic device 200 can send shared data to the electronic device 300 and other shared devices through the electronic device 100.
  • the electronic device 100 sends shared data to the electronic device 300 and other shared devices through multiple unicast links, or the electronic device 100 sends shared data to the electronic device 300 and other shared devices through a broadcast or multicast link.
  • Figure 15 is a schematic diagram of a coding negotiation process provided in an embodiment of the present application.
  • the coding and decoding methods obtained in the coding negotiation process are used to share the software and hardware resources of the electronic device 300 with the electronic device 200.
  • the implementation method of the coding negotiation process can be any of the following:
  • Mode 1 The electronic device 200 and the electronic device 300 perform coding negotiation and obtain coding mode 1 and decoding mode 1, and the electronic device 100 does not participate in this coding negotiation.
  • the electronic device 200 and the electronic device 300 relay the message used for this coding negotiation (which may be referred to as the coding negotiation message for short) through the electronic device 100, wherein the first link between the electronic device 100 and the electronic device 200, and the second link between the electronic device 100 and the electronic device 300 can be used to transmit the coding negotiation message.
• when the above-mentioned first link is a new call link, the main link and/or the auxiliary link of the new call can be used to transmit the coding negotiation message.
  • Mode 2 The electronic device 200 and the electronic device 300 perform coding negotiation and obtain coding mode 2 and decoding mode 2, and the electronic device 100 does not participate in this coding negotiation. In addition, the electronic device 200 and the electronic device 300 transmit the coding negotiation message through the third link, and the electronic device 100 does not participate in the transmission process of the coding negotiation message.
  • Mode 3 Electronic device 100 and electronic device 300 perform a coding negotiation and obtain coding mode 3 and decoding mode 3. This coding negotiation message can be transmitted through the second link. Electronic device 100 and electronic device 200 also perform a coding negotiation and obtain coding mode 4 and decoding mode 4. This coding negotiation message can be transmitted through the first link.
• Mode 4 The electronic device 100 and the electronic device 200 perform coding negotiation and obtain coding mode 5 and decoding mode 5. The coding negotiation message can be transmitted through the first link.
• the electronic device 300 can use the coding negotiation capability of the electronic device 100 to perform coding negotiation with the electronic device 200, which can be understood as the electronic device 300 using the coding negotiation capability of the electronic device 100 as its own capability for coding negotiation; in this way, the coding negotiation of mode 4 can be implemented.
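• Whichever of the above modes is used, the negotiation itself can be reduced to choosing a coding/decoding method that both ends support; the following sketch picks the first codec in the offerer's preference list that the answering device also supports (the codec names are examples, not a mandated set).
```python
from typing import Optional

def negotiate_codec(offered: list, supported: list) -> Optional[str]:
    """Return the first codec in the offerer's preference order that the peer also
    supports; None means the negotiation failed and a fallback is needed."""
    peer = set(supported)
    for codec in offered:
        if codec in peer:
            return codec
    return None

# e.g. the negotiating side (device 300, or device 100 in modes 3/4) offers a list
# and the electronic device 200 answers with what it supports
print(negotiate_codec(["H.265", "H.264", "VP8"], ["VP8", "H.264"]))   # H.264
```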
  • the following is an exemplary illustration of the encoding and decoding process of the shared data.
  • Figure 16 is a schematic diagram of an encoding and decoding process provided in an embodiment of the present application.
  • the encoding and decoding process can be implemented in any of the following ways:
• Mode 1 corresponds to the coding negotiation process of mode 1 shown in FIG. 15 .
  • the electronic device 300 may use coding method 1 to encode the original data (shared data), and then send the encoded data to the electronic device 200 through the electronic device 100.
  • the electronic device 200 may use decoding method 1 to decode the received data to obtain the original data.
  • the electronic device 100 does not participate in the encoding and decoding, but the electronic device 100 participates in the transmission process of shared data, wherein the first link between the electronic device 100 and the electronic device 200, and the second link between the electronic device 100 and the electronic device 300 can be used to transmit the encoded shared data, and in some examples, when the first link is a new call link, the main link and/or the auxiliary link of the new call can be used to transmit the encoded shared data. It can be understood that in this way, the electronic device 100 does not need to support encoding mode 1 and decoding mode 1.
  • Mode 2 The coding negotiation process corresponding to Mode 2 shown in FIG. 15.
• the electronic device 300 can encode the original data using encoding mode 2, and then send the encoded data to the electronic device 200 through the third link.
  • the electronic device 200 can decode the received data using decoding mode 2 to obtain the original data.
  • the electronic device 100 does not participate in the encoding and decoding, nor does it participate in the transmission process of the shared data. It can be understood that in this mode, the electronic device 100 does not need to support encoding mode 2 and decoding mode 2.
  • Mode 3 The coding negotiation process corresponding to Mode 3 shown in FIG. 15.
• the electronic device 300 may encode the original data using encoding mode 3, and then send the encoded data to the electronic device 100 through the second link.
  • the electronic device 100 may decode the received data using decoding mode 3 to obtain the original data.
  • the electronic device 100 may encode the original data using encoding mode 4, and send the encoded data to the electronic device 200 through the first link.
  • the electronic device 200 may decode the received data using decoding mode 4 to obtain the original data.
  • the electronic device 100 may send the data encoded using encoding mode 4 to the electronic device 300 through the second link, and then the electronic device 300 may send the data to the electronic device 200 through the third link.
  • Mode 3 can be understood as a transcoding performed once on the electronic device 100, so that even if the electronic device 300 does not support encoding mode 4, the encoding and decoding process can be implemented.
  • Mode 4 The coding negotiation process corresponding to Mode 4 shown in FIG. 15.
  • the electronic device 300 can send the original data to the electronic device 100 through the second link, the electronic device 100 can encode the original data using the coding mode 5, and send the encoded data to the electronic device 200 through the first link, and the electronic device 200 can decode the received data using the decoding mode 5 to obtain the original data.
  • the electronic device 100 can send the data encoded using the coding mode 5 to the electronic device 300 through the second link, and the electronic device 300 can then send the data to the electronic device 200 through the third link. It can be understood that in this mode, even if the electronic device 300 does not support the coding mode 5, the encoding and decoding process can be implemented.
  • Figure 17 is a flowchart of an addressing process provided by an embodiment of the present application.
  • Figure 17 takes the electronic device 300 addressing with the help of the electronic device 100 to establish a third link (i.e., the third method for establishing the third link shown in Figure 13) as an example for explanation.
  • the addressing process may include, but is not limited to, the following steps:
  • the electronic device 100 obtains the first identification information of the electronic device 100 and the first communication ID of the electronic device 300.
  • the first identification information may include one or more identification information of the electronic device 100, and the identification information may include, for example but not limited to, communication numbers such as a telephone number and/or an OTT ID.
  • the first communication ID may be a Session ID or other communication identifier used to establish a communication link.
  • the electronic device 100 may receive the first communication ID sent by the electronic device 300 .
  • the electronic device 100 requests the network device 400 to bind the first identification information and the first communication ID.
  • the network device 400 notifies the electronic device 100 that the binding is successful.
  • the electronic device 100 may send the first identification information and the first communication ID to the network device 400 to request binding of the two parameters.
  • the first identification information and the first communication ID may be associated.
  • the network device 400 may store the first identification information and the first communication ID. The above 2-3 can be understood as: the electronic device 100 binds the first identification information and the first communication ID and registers/logs in to the network device 400.
  • the network device 400 may send a notification to the electronic device 100 that the parameter binding requested by the electronic device 100 is successful.
  • the network device 400 can authenticate the electronic device 100.
  • the network device 400 can verify the access token (AT) of the electronic device 100.
  • the network device 400 can verify whether the Huawei certificate of the electronic device 100 meets the requirements (for example, whether it is credible/valid).
  • the network device 400 can verify whether the account information of the electronic device 100 meets the requirements (for example, whether it is credible/valid).
  • the network device 400 can generate a P2P-TOKEN of the electronic device 100.
  • the P2P-TOKEN carries a key identifier (key id) and is signed with a private key.
  • the electronic device 100 can request the network device 400 to bind the parameters of the electronic device.
  • the electronic device 100 may perform a refresh operation, the specific process is similar to the above 2-3, the difference is that the first identification information needs to be replaced with the changed The identification information of the electronic device 100, the first communication ID needs to be replaced with the communication ID of the changed electronic device 300.
  • the electronic device 100 notifies the electronic device 300 that the binding is successful.
  • the electronic device 100 may send a notification to the electronic device 300: the first identification information and the first communication ID are successfully bound.
  • the electronic device 200 obtains the second identification information and the second communication ID of the electronic device 200.
  • the second identification information may include one or more identification information of the electronic device 200, and the identification information may include, for example but not limited to, a communication number such as a telephone number and/or an OTT ID.
  • the second communication ID may be a SessionID or other communication identifier used to establish a communication link.
  • the electronic device 200 requests the network device 400 to bind the second identification information and the second communication ID.
  • the network device 400 notifies the electronic device 200 that the binding is successful.
  • the above 6-7 are similar to the above 2-3, and can be understood as the electronic device 200 binding the second identification information and the second communication ID and registering/logging in to the network device 400.
  • the electronic device 300 requests the electronic device 100 to obtain the communication ID of the electronic device 200 .
  • the above 8 is an optional step.
  • the electronic device 300 after the electronic device 300 determines to share software and hardware resources with the electronic device 200 through the third link (for example, receiving a trigger operation), it can first perform an addressing operation to obtain the communication ID of the electronic device 200 and establish a third link with the electronic device 200 through the communication ID. For example, the electronic device 300 can execute the above 8 to perform addressing with the help of the electronic device 100.
  • the electronic device 100 requests the network device 400 to obtain the communication ID (carrying the second identification information) of the electronic device 200 .
  • the network device 400 sends the second communication ID to the electronic device 100 .
  • the network device 400 may acquire the second communication ID associated with the second identification information, and return the second communication ID to the electronic device 100 .
  • the electronic device 100 sends the second communication ID to the electronic device 300 .
  • the electronic device 200 requests the network device 400 to obtain the communication ID bound to the first identification information.
  • the electronic device 200 may send the first identification information to the network device 400 to request to obtain the communication ID bound to the first identification information.
  • the network device 400 sends the first communication ID to the electronic device 200 .
  • the network device 400 may acquire the first communication ID associated with the first identification information, and return the first communication ID to the electronic device 200 .
  • the electronic device 100 can actively release to the network device 400 through the provided session termination interface to release the binding relationship between the first identification information and the first communication ID implemented in 2-3 above.
  • the network device 400 can also automatically release after a preset time (e.g., 10 minutes) to release the binding relationship between the first identification information and the first communication ID implemented in 2-3 above.
  • the electronic device 200 can release the binding relationship between the second identification information and the second communication ID implemented in 6-7 above.
  • the specific instructions are similar to the above instructions for releasing the binding relationship between the first identification information and the first communication ID implemented in 2-3 above, and will not be repeated.
  • the electronic device 300 and the electronic device 200 establish a third link according to the first communication ID and the second communication ID.
  • electronic device 300 may conduct a link establishment negotiation, such as but not limited to direct IP connection, NAT traversal, or NAT relay.
  • a link establishment negotiation such as but not limited to direct IP connection, NAT traversal, or NAT relay.
  • the order of the above 2-3 and the above 6-7 is not limited, and the order of the above 9-10 and the above 12-13 is not limited.
  • the above 6-7 (for implementing the binding of the second identification information and the second communication ID) may not be executed.
  • the network device 400 can wake up the electronic device 200 through the docked PUSH server.
  • the awakened electronic device 200 can be connected to the network device 400, and identity authentication and addressing are performed through the network device 400 (for example, the authentication module and the addressing module therein).
  • the network device 400 may not wake up the electronic device 200, so it can return to the electronic device 100 an indication of addressing failure (for example, including the reason "not woken up”). In other examples, the network device 400 cannot successfully wake up the electronic device 200, so it can return to the electronic device 100 an indication of addressing failure (for example, including the reason "wake-up failure”).
  • the above 2-3 used to implement the binding of the first identification information and the first communication ID
  • the specific description is similar to the above description and will not be repeated.
  • the electronic device 100 can also bind the first identification information and the third communication ID of the electronic device 100 and register/login to the network device 400. Therefore, after the network device 400 receives the first identification information sent by the electronic device 200, it can obtain the first communication ID and the third communication ID associated with the first identification information.
  • the above 13 can be replaced by: the network device 400 sends the first communication ID and the third communication ID to the electronic device 200.
  • the electronic device 200 can determine that the communication ID of the electronic device 300 is the first communication ID according to the content sent by the network device 400.
  • the first communication ID and the third communication ID sent by the network device 400 to the electronic device 200 can have a sequence, for example, the first communication ID is arranged in the first place, not limited to this, in other examples, the network device 400 can send the device type of the device of the first communication ID and the device type of the device of the second communication ID to the electronic device 200, and the specific content sent by the network device 400 is not limited. This situation can be understood as the electronic device 200 addressing and obtaining the communication ID group bound to the first identification information.
  • the electronic device 100 may also bind the first identification information and the third communication ID of the electronic device 100 and register/login to the network device 400.
  • the electronic device 200 may receive the third identification information of the electronic device 300 sent by the electronic device 100, and the third identification information is different from the first identification information, such as but not limited to the serial number (SN), international mobile equipment identity (IMEI), medium access control (MAC) address, device type information, etc. of the electronic device 300.
  • the electronic device 200 executes the above 12, it may also carry the third identification information.
  • the network device 400 After the network device 400 receives the first identification information and the third identification information sent by the electronic device 200, it may first obtain the first communication ID and the third communication ID associated with the first identification information, and then select the first communication ID corresponding to the third identification information from the first communication ID and the third communication ID. This situation can be understood as the electronic device 200 accurately addressing based on the third identification information notified by the electronic device 100.
  • the electronic device 200 and the electronic device 300 may be addressed through the electronic device 100 instead of through the network device 400.
  • the electronic device 100 may obtain the first communication ID of the electronic device 300 and the second communication ID of the electronic device 200.
  • the electronic device 100 may send the second communication ID of the electronic device 200 to the electronic device 300, and/or the electronic device 100 may send the first communication ID of the electronic device 300 to the electronic device 200.
  • the electronic device 300 may also address the network device 400 directly without the help of the electronic device 100 (i.e., the second method for establishing the third link shown in FIG. 13), and the specific implementation process is similar to FIG. 17, except that the electronic device 100 needs to be replaced by the electronic device 300.
  • the above 4, 8, and 11 are not executed.
  • the electronic device 100 can also bind the first identification information and the third communication ID of the electronic device 100 and register/login to the network device 400, and the subsequent addressing process can be consistent with the above electronic device 200 addressing and obtaining the communication ID group bound to the first identification information.
  • the subsequent addressing process can be consistent with the above electronic device 200 based on the third identification information notified by the electronic device 100.
  • the third identification information is not ...
  • the implementation process of the electronic device 100 addressing through the network device 400 to establish an auxiliary link with the electronic device 200 is similar to Figure 17 above, with the difference that the electronic device 300 needs to be replaced by the electronic device 100, and the first communication ID of the electronic device 300 needs to be replaced by the third communication ID of the electronic device 100. At this time, the above 4, 8 and 11 are not executed.
  • the electronic device 100 and the electronic device 200 may include a DC protocol stack, and the electronic device 300 may include a DC protocol stack or may not include a DC protocol stack.
  • the DC protocol stack may be used to process shared data, such as multimedia data streams of applets and web pages. For specific examples, see FIG. 18 below.
  • FIG. 18 exemplarily shows a schematic diagram of the architecture of yet another sharing system 10 .
  • the application system of the electronic device 100 may include a new call function module, a DC protocol stack and a DC proxy module, and the cellular communication module of the electronic device 100 may include an IMS communication module, a communication protocol module, a data route module and a quality identification configuration module, wherein:
  • FIG18 is an example in which the new call function module includes a sharing module.
  • the new call function module and the sharing module reference may be made to the description of the new call function module and the sharing module of the electronic device 100 in FIG3A-FIG3B.
  • the DC protocol stack may include a DC management module, a route management module, and a transport layer protocol.
  • the DC management module may be used to manage the DC protocol stack.
  • the route management module may be used to manage the routing function based on the DC implementation.
  • the transport layer protocol may include, for example but not limited to, the stream control transmission protocol (SCTP) and/or the datagram transport layer security protocol (DSTP).
  • SCTP stream control transmission protocol
  • DSTP datagram transport layer security protocol
  • the DC protocol stack may be used to process data so that the processed data can be transmitted via the DC, and/or to parse and process data received via the DC so that the processed data can be output by the electronic device 100.
  • the DC proxy module can be used to send data received by the electronic device 100 via DC to other devices such as the electronic device 300, and/or send data received by the electronic device 100 and processed by the DC protocol stack of other devices such as the electronic device 300 to the electronic device 200, so that the data will not be transmitted to the DC protocol stack of the electronic device 100 for processing.
  • IMS communication module For the description of the IMS communication module, please refer to the description of the IMS communication module in FIG. 3A-FIG 3B .
  • the communication protocol module may include a communication protocol related to the IMS (eg TCP/IP), and use the communication protocol to process data.
  • a communication protocol related to the IMS eg TCP/IP
  • the data routing module can be used to implement DC-based routing functions.
  • the quality identification configuration module can be used to configure signaling such as QCI and/or 5G service quality identifier (5G QoS Identifier, 5QI).
  • signaling such as QCI and/or 5G service quality identifier (5G QoS Identifier, 5QI).
  • the software architecture of electronic device 200 is similar to the software architecture of electronic device 100, with the difference that the sharing module of electronic device 200 is an optional module, and the new call function module of electronic device 200 may include a playback module.
  • the description of the playback module can be found in the description of the playback module of electronic device 200 in Figures 3A-3B.
  • the application system of electronic device 200 does not include a DC agent module.
  • the application system of electronic device 300 is similar to the application system of electronic device 100, except that the DC protocol stack in the application system of electronic device 300 is optional.
  • the application system of electronic device 300 may also include a playback module (optional).
  • a playback module for the description of the playback module, refer to the description of the playback module of electronic device 300 in Figures 3A-3B.
  • the application system of electronic device 300 does not include a DC agent module.
  • the first link between the electronic device 100 and the electronic device 200 may include a main link for a new call, such as a multimedia path of QCI1/QCI2 (which may be referred to as an audio and video link), and an auxiliary link (such as DC).
  • a main link for a new call such as a multimedia path of QCI1/QCI2 (which may be referred to as an audio and video link)
  • an auxiliary link such as DC.
  • the transmission process of shared data is as follows.
  • the electronic device 300 can share the software and hardware resources (i.e., shared data) of the electronic device 300 with the electronic device 200.
  • the electronic device 300 can send the shared data that has not been processed by the DC protocol stack to the electronic device 100 (i.e., transmission mode 1 shown in FIG. 18), and the electronic device 100 can process the received shared data through the DC protocol stack, and then send the processed data to the air interface after processing through the IMS communication module and the communication protocol module, and then transmit it to the electronic device 200 through the main link (e.g., multimedia path of QCI1/QCI2) and/or the auxiliary link (e.g., DC) of the new call.
  • the main link e.g., multimedia path of QCI1/QCI2
  • the auxiliary link e.g., DC
  • the shared data of the electronic device 300 can be sent to the electronic device 100 after being processed by the DC protocol stack (i.e., transmission mode 2 shown in FIG. 18), and the DC proxy module of the electronic device 100 can send the received shared data to the air interface after processing through the IMS communication module and the communication protocol module, and finally transmit it to the electronic device 200 through the main link (e.g., multimedia path of QCI1/QCI2) and/or the auxiliary link (e.g., DC) of the new call.
  • the electronic device 200 can receive the shared data sent by the electronic device 100 through the air interface.
  • the received shared data can be sent to the new call function module after being processed by the communication protocol module and the IMS communication module, and played by the playback module in the new call function module.
  • the electronic device 300 may include a call application (corresponding to a new call function module). When the electronic device 300 displays the user interface of the call application, it may also display the window of the mini-program/webpage. For example, in response to a user operation on the user interface of the call application, the user interface of the mini-program of the call assistant is displayed (for example, message information is displayed). The electronic device 300 may determine that the multimedia data stream of the window of the mini-program/webpage is shared data in response to the user operation. In some examples, the electronic device 200 may play the multimedia data stream of the mini-program/webpage shared by the electronic device 300 on the user interface of the call application.
  • a call application corresponding to a new call function module
  • the electronic device 200 can share the software and hardware resources (i.e., shared data) of the electronic device 200 with the electronic device 300.
  • the shared data of the electronic device 200 can be processed by the DC protocol stack and then sent to the air interface after being processed by the IMS communication module and the communication protocol module, and then transmitted to the electronic device 100 through the main link (e.g., the multimedia channel of QCI1/QCI2) and/or the auxiliary link (e.g., DC) of the new call.
  • the electronic device 100 can receive the shared data sent by the electronic device 200 through the air interface, and the received shared data can be sent to the DC proxy module after being processed by the communication protocol module and the IMS communication module.
  • the shared data of the electronic device 200 can be transmitted to the DC protocol stack of the electronic device 100 for processing, and the processed shared data can be sent to the electronic device 300 (i.e., the transmission mode 1 shown in Figure 18), and the electronic device 300 can play the received shared data through the playback module in the new call function module.
  • the DC proxy module can send the shared data that has not been processed by the DC protocol stack to the electronic device 300 (i.e., transmission mode 2 shown in FIG. 18 ), and the electronic device 300 can process the received shared data through the DC protocol stack, and the processed data can be transmitted to the playback module of the electronic device 300 for playback.
  • the specific example is similar to the above-mentioned example of the electronic device 300 sharing the software and hardware resources of the electronic device 300 with the electronic device 200, except that the roles of the electronic device 200 and the electronic device 300 need to be exchanged.
  • the media plane data and the control plane data in the shared data can be transmitted together or separately.
  • the media plane data can be transmitted through the main link
  • the control plane data can be transmitted through the auxiliary link.
  • the data of the media plane and the data of the control plane can both be transmitted through the main link, or both be transmitted through the auxiliary link.
  • the transport layer protocol in the DC protocol stack may not be deployed in the application system, but may be deployed in the cellular communication module.
  • the DC protocol stack may not include the transport layer protocol, and the transport layer protocol may be deployed on the cloud server side.
  • the DC protocol stack may include a DC cloud module, which may be used to send data to the cloud server so that the cloud server returns data processed by the transport layer protocol.
  • Fig. 19 exemplarily shows a schematic diagram of the hardware structure of another electronic device 300.
  • Fig. 19 takes the electronic device 300 as a car or other intelligent driving device as an example for illustration.
  • the electronic device 300 may include, but is not limited to, at least one of the following components: a smart cockpit domain controller (CDC), a telematics box (Tbox), a central control screen, a rear seat screen, a HUD, an instrument display screen, and a camera. Any two components of the electronic device 300 may be connected via a vehicle bus, such as a controller area network (CAN) bus, but not limited thereto. In another embodiment, they may also be connected via other communication methods such as USB and/or a flat panel display link (FDP-Link). In some examples, as shown in FIG.
  • the central control screen, the rear seat screen, the HUD, the instrument display screen, and the camera are connected to the CDC and Tbox via the CAN bus, the CDC is connected to the Tbox via the CAN bus and/or USB, and the central control screen is connected to the CDC via the FPD-Link.
  • the CDC may include a sharing module.
  • the description of the sharing module can be found in the description of the sharing module of the electronic device 300 in Figures 3A-3B. It can be understood that the sharing function is implemented through the CDC, such as the case where the electronic device 300 receives a trigger operation through the central control screen and/or the rear seat screen.
  • the Tbox can be used only to transmit shared data.
  • the Tbox may include modules for implementing some functions of the sharing function, such as modules for implementing the logic of the sharing function and modules for encoding, etc., such as the case where the Tbox includes a WiFi chip.
  • the Tbox may include a sharing module
  • the CDC may not include a sharing module, that is, the sharing function is implemented through the Tbox.
  • the specific description is similar to the description of implementing the sharing function through the CDC, and will not be repeated.
  • both the Tbox and the CDC may include a sharing module, which is not limited in this application.
  • the video stream collected by the camera of the electronic device 300 can be transmitted to the CDC via the CAN bus.
  • the CDC can process the video stream through the sharing module and then send it to the Tbox, which can send the video stream to the electronic device 200 via the third link, and/or, the Tbox can send the video stream to the electronic device 100 via the second link (so that the electronic device 100 forwards it to the electronic device 200 via the first link), that is, the transmission mode 1 shown in FIG. 19, which is applied, for example, to the case where the sharing function is realized through the CDC and the Tbox is only used to transmit the shared data.
  • the CDC may include a radio interface layer (RIL) and a Tbox daemon module (Tbox-Deamon), and the Tbox-Deamon may include a scalable service-oriented middleware over IP (SomeIP), which may be located above the transport layer of the seven-layer model of open system interconnection (OSI) (e.g., the session layer, the presentation layer, or the application layer), and SomeIP may send data when the receiving end has a demand, and may not send data when the receiving end has no demand.
  • Tbox may include an application system and a cellular communication module, and the application system of Tbox may include an Internet of Things (IoT) Booster and SomeIP.
  • IoT Internet of Things
  • the RIL of CDC and the IoT Booster of Tbox may be used to transmit the data stream of the control plane of the sharing function (which may be referred to as the control stream for short), such as for transmitting instructions for sharing.
  • the Tbox-Deamon of CDC and the SomeIP of Tbox may be used to transmit the data stream of the media plane of the sharing function (which may be referred to as the media stream for short), such as for transmitting sharing data.
  • the cellular communication module of Tbox may include a control plane protocol stack/user plane protocol stack of a cellular communication network (such as 5G/6G), such as a content distribution service (CDS) layer, a packet data convergence protocol (PDCP) layer, a radio link control (RLC) layer, a MAC layer and a physical (PHY) layer, which may be used to transmit sharing data via cellular communication.
  • the Tbox application system and the cellular communication module may perform control plane signaling transmission via an attention command (AT).
  • the Tbox application system may send the data stream to the cellular communication module of the Tbox, which then sends the data stream to the electronic device 100 and/or the electronic device 200 via a cellular communication method.
  • the CDC can also send the video stream processed by the sharing module by itself, that is, the transmission mode 2 shown in FIG19, for example, the case where the CDC includes a communication chip such as a Wi-Fi chip.
  • the video stream collected by the camera can be transmitted to the Tbox via the CAN bus, and the Tbox can process the video stream through the sharing module, and then send the processed video stream by itself, that is, the transmission mode 3 shown in FIG19, for example, the case where the sharing function is realized through the Tbox.
  • the electronic device 300 and the electronic device 100 when the electronic device 300 and the electronic device 100 transmit shared data via BT, it may include a control plane data transmission process and a data plane transmission process with less traffic. Not limited to this, the electronic device 300 and the electronic device 100 may also transmit shared data via Wi-Fi or cellular communication, which may include a control plane and a data plane transmission process.
  • the sharing system 10 may include more or fewer devices, and the communication method between devices in the sharing system 10 may be other situations. Any electronic device in the sharing system 10 may include more or fewer modules, may combine two or more modules, or may have different module configurations. In some examples, the wireless communication system of any electronic device in the sharing system 10 may include a D2D communication module and a V2X module.
  • the various components/modules shown in Figures 2A, 2B, 3A, 3B, 18, and 19 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
  • Scenario 1 When a subordinate and a leader are talking via a mobile phone, the subordinate needs to report/share document materials (such as word format, excel format, or PPT format, etc.) with the leader and make modifications based on the leader's opinions.
  • the subordinate can open the document material and modify it on a computer near the mobile phone, and share the computer screen and the document material on the computer.
  • the leader can not only see the modification results synchronously, but also receive the modified document material so that he can modify and share the document material later.
  • Scenario 2 A deliveryman usually has multiple mobile phones. When the food is delivered, the deliveryman can use one of the mobile phones to call the customer to inform the customer that the food has been delivered. Some customers may want to take a photo of the food. The deliveryman can use another mobile phone to take a photo of the food and share the photo with the customer.
  • Scenario 3 When a user makes a call with another user through a smartwatch, they need to share their location, but the smartwatch may not have a positioning function. In this case, the user can initiate location sharing with the other party through the mobile phone, and both parties can see each other's location.
  • Scenario 4 When the elderly at home do not know how to use the smart screen, they can call their children for help through their mobile phones. They can share the smart screen with their children, who can give verbal instructions or directly control the smart screen remotely.
  • Scenario 5 A user is on a business trip and is talking to a colleague via a mobile phone.
  • the colleague wants to view some of the user's files, but the files are stored on the home computer.
  • the user can perform a trigger operation on the mobile phone to share the files on the home computer with the colleague.
  • the output in this application includes not only being executed through output modules such as the display screen of the device itself, but also being executed through output modules such as the display screen of other devices connected to the device.
  • the microphone in this application can be replaced by other modules that can collect audio/voice/sound.
  • the camera in this application can be replaced by other modules that can shoot/collect images.
  • the above embodiments are illustrated by taking the form of a floating window to display the relevant information of the sharing function as an example. In specific implementations, it can also be displayed in other forms, and this application does not limit this.
  • the electronic device 100 when the shared data is relayed and transmitted through the electronic device 100, the electronic device 100 will send all the shared data to the shared device, and the shared device outputs the shared data.
  • the electronic device 100 after the electronic device 100 receives the shared data of service 1 of the sharing device, it can output the shared data 1 of service 1 by itself, and send the shared data 2 of service 1 to the shared device, and the shared device outputs the shared data 2 of service 1.
  • service 1 is a screen sharing service
  • shared data 1 is audio data
  • shared data 2 is image data (the image of the screen of the sharing device), which can be understood as the data of a single shared service can be separated.
  • the electronic device 100 after the electronic device 100 receives the shared data of service 2 and service 3 of the sharing device, it can output the shared data of service 2 by itself, and send the shared data of service 3 to the shared device, and the shared device outputs the shared data of service 3.
  • service 2 is a screen sharing service
  • service 3 is a file sharing service, which can be understood as the data of multiple shared services can be separated.
  • the methods provided in the embodiments of the present application may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a user device, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.) means.
  • wired e.g., coaxial cable, optical fiber, digital subscriber line (DSL)
  • wireless e.g., infrared, wireless, microwave, etc.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center that includes one or more available media integrated therein.
  • the available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DWD), or a semiconductor medium (e.g., a solid state disk (SSD)).
  • a magnetic medium e.g., a floppy disk, a hard disk, a magnetic tape
  • an optical medium e.g., a digital video disc (DWD)
  • a semiconductor medium e.g., a solid state disk (SSD)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephonic Communication Services (AREA)

Abstract

La présente demande concerne un procédé de partage, un dispositif électronique et un support de stockage informatique. Le procédé comprend les étapes au cours desquelles : un premier dispositif affiche une première interface, la première interface étant utilisée pour indiquer qu'un appel d'un opérateur avec un deuxième dispositif est en cours ; lorsqu'il affiche une seconde interface, un troisième dispositif reçoit une première opération d'utilisateur ou reçoit un premier message envoyé par le premier dispositif, le premier message étant envoyé par le premier dispositif lors de la réception d'une seconde opération d'utilisateur, le premier message étant utilisé pour ordonner au troisième dispositif de partager des données multimédias avec le deuxième dispositif, et le troisième dispositif étant un dispositif découvert par le premier dispositif, un dispositif connecté au premier dispositif, un dispositif stockant un identifiant ou un dispositif identifié en fonction d'une image capturée ; le troisième dispositif envoie des premières données au deuxième dispositif et/ou le troisième dispositif envoie les premières données au premier dispositif de telle sorte que le premier dispositif envoie les premières données au deuxième dispositif, les premières données contenant des données multimédias relatives à la seconde interface ; et le deuxième dispositif délivre en sortie les premières données. La présente demande peut réduire la limitation d'une fonction de partage.
PCT/CN2023/127957 2022-11-16 2023-10-30 Procédé de partage, dispositif électronique et support de stockage informatique WO2024104122A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211435717.X 2022-11-16
CN202211435717.XA CN118055184A (zh) 2022-11-16 2022-11-16 分享方法、电子设备及计算机存储介质

Publications (1)

Publication Number Publication Date
WO2024104122A1 true WO2024104122A1 (fr) 2024-05-23

Family

ID=91052428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/127957 WO2024104122A1 (fr) 2022-11-16 2023-10-30 Procédé de partage, dispositif électronique et support de stockage informatique

Country Status (2)

Country Link
CN (1) CN118055184A (fr)
WO (1) WO2024104122A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108933965A (zh) * 2017-05-26 2018-12-04 腾讯科技(深圳)有限公司 屏幕内容共享方法、装置及存储介质
CN109542378A (zh) * 2018-11-19 2019-03-29 上海闻泰信息技术有限公司 屏幕共享方法、装置、电子设备及可读存储介质
AU2020100721A4 (en) * 2019-05-06 2020-06-18 Apple Inc. User interfaces for sharing content with other electronic devices
CN113923528A (zh) * 2020-07-08 2022-01-11 华为技术有限公司 屏幕共享方法、终端和存储介质
CN114911400A (zh) * 2021-02-08 2022-08-16 花瓣云科技有限公司 分享图片的方法和电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108933965A (zh) * 2017-05-26 2018-12-04 腾讯科技(深圳)有限公司 屏幕内容共享方法、装置及存储介质
CN109542378A (zh) * 2018-11-19 2019-03-29 上海闻泰信息技术有限公司 屏幕共享方法、装置、电子设备及可读存储介质
AU2020100721A4 (en) * 2019-05-06 2020-06-18 Apple Inc. User interfaces for sharing content with other electronic devices
CN113923528A (zh) * 2020-07-08 2022-01-11 华为技术有限公司 屏幕共享方法、终端和存储介质
CN114911400A (zh) * 2021-02-08 2022-08-16 花瓣云科技有限公司 分享图片的方法和电子设备

Also Published As

Publication number Publication date
CN118055184A (zh) 2024-05-17

Similar Documents

Publication Publication Date Title
CN111866950B (zh) Mec中数据传输的方法和通信装置
WO2022121775A1 (fr) Procédé de projection sur écran, et dispositif
WO2020143380A1 (fr) Procédé de transmission de données et dispositif électronique
US10560532B2 (en) Quick relay session management protocol
WO2022083386A1 (fr) Procédé et système de projection d'écran et dispositif électronique
JP7181990B2 (ja) データ伝送方法及び電子デバイス
WO2022143508A1 (fr) Procédé de transmission de données dans un champ proche, dispositif et système
WO2023143300A1 (fr) Procédé et système de sélection de tranche, et appareil associé
WO2022135341A1 (fr) Procédé et système de communication, et dispositif électronique
WO2022222691A1 (fr) Procédé de traitement d'appel et dispositif associé
WO2020134868A1 (fr) Procédé d'établissement de connexion, et appareil terminal
WO2024104122A1 (fr) Procédé de partage, dispositif électronique et support de stockage informatique
WO2021218544A1 (fr) Système de fourniture de connexion sans fil, procédé et appareil électronique
WO2022205254A1 (fr) Procédé et dispositif de détermination d'un serveur de configuration de bordure
WO2024017296A1 (fr) Procédé de partage, dispositif électronique et système
EP4354917A1 (fr) Procédé de traitement de données et dispositif électronique
WO2023202533A1 (fr) Procédé de communication, dispositif électronique et système
WO2023273487A1 (fr) Procédé et appareil pour envoyer une signalisation par chemins multiples
WO2024093922A1 (fr) Procédé de commande audio, support de stockage, produit programme et dispositif électronique
WO2024007998A1 (fr) Procédé de transmission de données, et dispositif électronique et système de communication
WO2021143921A1 (fr) Procédé de commande et appareil de commande de transmission par trajets multiples
WO2023280160A1 (fr) Procédé et appareil de commutation de canal
WO2022188813A1 (fr) Procédé et système de communication bluetooth, et dispositif électronique
WO2024046347A1 (fr) Procédé et système de partage de données et appareil associé
WO2023035885A1 (fr) Procédé de communication et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23890558

Country of ref document: EP

Kind code of ref document: A1