WO2024027718A1 - A multi-window screen projection method, electronic device and system - Google Patents

A multi-window screen projection method, electronic device and system

Info

Publication number
WO2024027718A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen projection
window
application
weight coefficient
screen
Prior art date
Application number
PCT/CN2023/110589
Other languages
English (en)
French (fr)
Inventor
许豪灿
梅森
胡传丰
黄中帅
郑博文
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2024027718A1

Classifications

    • H04N21/43076: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen, of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • G06F3/1454: Digital output to display device; cooperation and interconnection of the display device with other functional units, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H04N21/440263: Processing of video elementary streams, involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/440281: Processing of video elementary streams, involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping

Definitions

  • the present application relates to the field of terminal technology, and in particular, to a multi-window screen projection method, electronic device, and system.
  • Multi-window screen projection technology projects multiple application interfaces launched on one electronic device (such as a first device) onto another electronic device (such as a second device), so as to realize viewing, mirror manipulation, and input collaboration across the first device and the second device.
  • When the first device casts a screen to the second device, the first device first encodes the screen projection data at a given code rate, then transmits the encoded screen projection data to the second device; finally, the second device decodes the received data and displays it in a window.
  • Electronic devices need to process a larger amount of screen projection data when performing multi-window screen projection. A reasonable bit rate therefore needs to be allocated to each screen projection window (that is, to each displayed application) to ensure that the screen projection data of every application can be successfully projected to the second device.
  • Embodiments of the present application provide a multi-window screen projection method, electronic device, and system that can adaptively adjust the code rate of each application during multi-window screen projection, thereby ensuring the clarity and smoothness of the display in the screen projection window corresponding to each application, alleviating problems such as stuttering and frame skipping in the screen projection windows, and improving the user experience.
  • In a first aspect, embodiments of the present application provide a multi-window screen projection method, including: a first device obtains a first screen projection parameter used to characterize the characteristics of casting the screen projection data of multiple applications to multiple screen projection windows.
  • the first device determines a first set of code rates according to the first screen projection parameter, and the first set of code rates includes the first code rate of each application in the plurality of applications.
  • the first device projects the screen projection data of the multiple applications to multiple screen projection windows of the second device at the first set of code rates, so that the screen of each application is displayed in a corresponding screen projection window of the second device.
  • the first device then obtains a second screen projection parameter, and determines a second set of code rates based on the second screen projection parameter.
  • the second set of code rates includes the second code rate of each application in the plurality of applications.
  • the first device projects the screen projection data of the multiple applications to multiple screen projection windows of the second device at the second set of code rates.
  • In this way, during multi-window screen projection, the first device can adaptively adjust the code rate of each application (that is, of each corresponding screen projection window) according to the screen projection parameters that characterize casting the screen projection data of multiple applications to multiple screen projection windows, and the second device displays the screen of each application through a screen projection window.
  • the first screen projection parameter includes at least one of the following parameters: network status information, user experience information, and window status information.
  • The network status information is used to characterize the status of the transmission channels used by the multiple applications during screen projection; the user experience information is used to characterize the user's quality experience of each application's screen projection image; and the window status information is used to characterize the real-time status of each screen projection window.
  • In this way, the characteristics of casting the screen projection data of multiple applications to multiple screen projection windows can be quantified, with at least one of the network status information, user experience information, and window status information serving as the first screen projection parameter, so that the code rate of each application can be determined based on the first screen projection parameter.
  • the method further includes: the first device determines, based on the network status information, the total code rate available for the screen projection of the multiple applications; determines the code rate allocation weight coefficient of each application based on the window status information; and determines the quality experience score corresponding to each application based on the user experience information.
  • the first device determines the first code rate of each application based on the application's code rate allocation weight coefficient and the quality experience score corresponding to the application, under the constraint that the sum of the first code rates of the multiple applications equals the total code rate.
  • the first device can determine the total code rate, code rate allocation weight coefficient and quality experience score based on network status information, window status information and user experience information respectively.
  • The code rate of each application is then determined from the total code rate, the code rate allocation weight coefficients, and the quality experience scores. In this way, the first device reasonably allocates the code rate of each application according to the network status, window status, and user experience, so that the quality of the screen displayed in each application's screen projection window meets the user's needs and the user experience is improved. A sketch of such an allocation is shown below.
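  • The publication does not state a closed-form allocation rule; one natural reading of the above constraint is a weighted proportional split. The following Python sketch is a minimal illustration under that assumption: the total code rate is divided among the applications in proportion to the product of each application's code rate allocation weight coefficient and its quality experience score (all names and values are illustrative, not taken from the patent).

    def allocate_bitrates(total_rate_bps, weights, quality_scores):
        """Split the total code rate across applications so that the sum of
        the per-application code rates equals the total code rate (the stated
        constraint), proportionally to weight * quality score (an assumed
        allocation rule)."""
        demands = [w * q for w, q in zip(weights, quality_scores)]
        total_demand = sum(demands)
        return [total_rate_bps * d / total_demand for d in demands]

    # Example: short message, video, and game applications sharing 20 Mbps.
    rates = allocate_bitrates(
        total_rate_bps=20_000_000,
        weights=[0.5, 1.0, 1.5],         # code rate allocation weight coefficients
        quality_scores=[0.6, 0.9, 0.8],  # quality experience scores
    )
    print([r / 1e6 for r in rates])      # [2.5, 7.5, 10.0] Mbps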
  • the network status information includes a network bandwidth parameter and a channel interference parameter
  • the total code rate is the product of the network bandwidth parameter and the channel interference parameter.
  • the network bandwidth parameter is used to characterize the bandwidth of the transmission channel
  • the channel interference parameter is used to characterize the interference situation of the transmission channel.
  • the network bandwidth parameter and the channel interference parameter can truly reflect the current network status. In this way, the total code rate available for the screen projection of the multiple applications can be determined more accurately based on these two parameters.
  • the network bandwidth parameters and channel interference parameters can be predicted through a time series prediction method.
  • the network bandwidth parameter and channel interference parameter predicted by the time series prediction method are closer to the real network status, which improves the accuracy of the total code rate determined for the screen projection of the multiple applications.
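  • The embodiments do not name a specific time series model. As one simple possibility (an assumption, not the patent's method), the sketch below predicts the network bandwidth parameter and the channel interference parameter by exponential smoothing of recent measurements and multiplies the two predictions to obtain the total code rate.

    def ewma_predict(samples, alpha=0.3):
        """One-step time series prediction by an exponentially weighted
        moving average over recent measurements."""
        estimate = samples[0]
        for x in samples[1:]:
            estimate = alpha * x + (1 - alpha) * estimate
        return estimate

    bandwidth_bps = ewma_predict([22e6, 25e6, 18e6, 21e6])  # measured bandwidth
    interference = ewma_predict([0.85, 0.90, 0.80, 0.88])   # 1.0 = interference-free
    total_rate_bps = bandwidth_bps * interference           # total code rate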
  • the user experience information includes a user experience score corresponding to each application, and the user experience score corresponding to the application is the user's quality experience score of the screen-cast image of the application.
  • the user experience score is used to characterize the user's subjective experience of the application's screen image quality.
  • In this way, the user's subjective experience can be quantified through the user experience score, so that the code rate of each application can be determined based on the user experience score.
  • the user experience score is obtained based on the image quality evaluation score and the corresponding preset image quality evaluation weight coefficient.
  • the user experience score is the product of the image quality evaluation score and the preset image quality evaluation weight coefficient.
  • the image quality evaluation score is the objective evaluation result of the image quality through the image quality evaluation method.
  • the image quality evaluation score includes one or more of the following scores: a peak signal-to-noise ratio (PSNR) score, a structural similarity (SSIM) score, or a mean square error (MSE) score.
  • the preset image quality evaluation weight coefficient represents the impact of different image quality evaluation methods on user experience.
  • In this way, the user experience score determined from the image quality evaluation score and the preset image quality evaluation weight coefficient can truly reflect the user's experience of the application's screen projection image quality, so that the code rate of each application can be reasonably determined based on the user experience score. A sketch of this combination follows.
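  • As a concrete reading of the above, the sketch below combines the listed objective scores with preset image quality evaluation weight coefficients into a single user experience score. The normalization ranges and weight values are illustrative assumptions.

    def user_experience_score(psnr_db, ssim, mse, weights=(0.4, 0.4, 0.2)):
        """Weighted combination of objective image quality evaluation scores;
        each score is first normalized to [0, 1] (illustrative ranges)."""
        psnr_norm = min(psnr_db / 50.0, 1.0)  # treat ~50 dB as excellent
        mse_norm = 1.0 / (1.0 + mse)          # lower MSE means higher quality
        w_psnr, w_ssim, w_mse = weights       # preset evaluation weight coefficients
        return w_psnr * psnr_norm + w_ssim * ssim + w_mse * mse_norm

    score = user_experience_score(psnr_db=38.0, ssim=0.95, mse=10.2)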
  • the window status information includes a code rate allocation weight coefficient for each application.
  • the window status information is used to characterize the real-time status of each screen projection window. Based on the real-time status of each screen projection window, the code rate allocation weight coefficient of each application can be determined, and the code rate of each application can then be determined according to these weight coefficients.
  • the code rate allocation weight coefficient is obtained according to at least one of the following coefficients: user attention weight coefficient, image complexity weight coefficient and preset weight coefficient.
  • the code rate allocation weight coefficient is the product of the user attention weight coefficient, the image complexity weight coefficient and the preset weight coefficient.
  • the user attention weight coefficient is used to represent the attention the user invests in the screen projection window; the image complexity weight coefficient is used to represent the image complexity of the application corresponding to the screen projection window; and the preset weight coefficient is used to indicate the preset weight of the screen projection window.
  • the window status information can be quantified through the user attention weight coefficient, the image complexity weight coefficient and the preset weight coefficient, so as to determine the code rate allocation weight coefficient for each application.
  • the user attention weight coefficient is obtained based on the operating frequency of the window interface in the historical time period and the corresponding preset frequency weight coefficient.
  • the user attention weight coefficient reflects the attention the user invests in the screen projection window, which can be captured by the user's operation frequency on the screen projection window. Based on the operating frequency of the window interface in the historical time period and the corresponding preset frequency weight coefficient, the user's attention to the screen projection window can be reflected more accurately.
  • the operating frequency of the window interface in the historical time period includes: a first operating frequency of the window interface in a first historical time period, and a second operating frequency of the window interface in a second historical time period.
  • the preset frequency weight coefficient includes: a first frequency preset weight coefficient corresponding to the first operating frequency, and a second frequency preset weight coefficient corresponding to the second operating frequency.
  • the user attention weight coefficient is the sum of the product of the first operating frequency and the first frequency preset weight coefficient, and the product of the second operating frequency and the second frequency preset weight coefficient.
  • In this way, the user attention weight coefficient determined from the operating frequencies of two historical time periods and the corresponding preset weight coefficients can more truly reflect the user's actual attention to the screen projection window, further improving the accuracy of the code rate allocation weight coefficient, as in the sketch below.
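  • Following the formula stated above, the sketch below computes the user attention weight coefficient as the weighted sum of the operating frequencies over two historical time periods; the period boundaries and weight values are illustrative.

    def attention_weight(freq_recent, freq_older, w_recent=0.7, w_older=0.3):
        """User attention weight coefficient: the sum of each period's
        operating frequency times its preset frequency weight coefficient."""
        return freq_recent * w_recent + freq_older * w_older

    # e.g. 12 operations/min in the most recent period, 4/min in the earlier one
    w_attention = attention_weight(freq_recent=12, freq_older=4)  # 9.6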
  • the image complexity weight coefficient is obtained based on the complexity of historical multi-frame images of the window interface and the preset complexity weight coefficient corresponding to each frame of image.
  • the image complexity weight coefficient is the product of the complexity of historical multi-frame images and the preset complexity weight coefficient corresponding to each frame image.
  • the image complexity weight coefficient is used to characterize the image complexity of the application corresponding to the screen projection window, and it can be determined from the complexity of the historical multi-frame images of the window interface and the corresponding preset complexity weight coefficients. In this way, the image complexity weight coefficient can accurately reflect the image complexity of the application corresponding to the screen projection window, as illustrated in the sketch below.
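  • A minimal sketch of this computation, assuming each frame's complexity has already been estimated (for example, from its spatial detail) and reading the combination rule as a sum of per-frame products:

    def complexity_weight(frame_complexities, frame_weights):
        """Image complexity weight coefficient from the complexities of
        historical multi-frame images and the preset complexity weight
        coefficient of each frame (sum of products, an assumed reading)."""
        return sum(c * w for c, w in zip(frame_complexities, frame_weights))

    # Recent frames weighted more heavily than older ones (illustrative).
    w_complexity = complexity_weight([0.8, 0.6, 0.7], [0.5, 0.3, 0.2])  # 0.72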
  • the preset weight coefficient is obtained based on the weight coefficient of the application started in the window and the preset window weight coefficient.
  • the preset weight coefficient is the product of the weight coefficient of the application started in the window and the preset window weight coefficient.
  • the preset weight coefficient is used to represent the user's preset weight for the screen projection window, which mainly comprises the started application's weight coefficient and the window weight coefficient. Based on these two coefficients, the preset weight coefficient can be determined, so that it accurately reflects the user's preset weight for the screen projection window.
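  • Putting the three coefficients together, the sketch below forms the preset weight coefficient as the product of the started application's weight and the window's weight, and then the code rate allocation weight coefficient as the product of all three coefficients, as stated above; all values are illustrative.

    # Preset weight coefficient: product of the weight of the application
    # started in the window and the preset window weight coefficient.
    w_preset = 1.2 * 1.0  # e.g. a game application (1.2) in a normal window (1.0)

    # Code rate allocation weight coefficient: product of the user attention,
    # image complexity, and preset weight coefficients (values reused from
    # the sketches above).
    w_alloc = 9.6 * 0.72 * w_preset  # 8.2944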
  • In a second aspect, embodiments of the present application provide an electronic device.
  • the device includes: an acquisition module, configured to acquire a first screen projection parameter, where the first screen projection parameter is used to characterize the characteristics of casting the screen projection data of multiple applications to multiple screen projection windows; and a processing module, configured to project the screen projection data of the multiple applications to the multiple screen projection windows at a first set of code rates according to the first screen projection parameter, the first set of code rates including a first code rate for each of the multiple applications. The acquisition module is further configured to obtain a second screen projection parameter, and the processing module is further configured to project the screen projection data of the multiple applications to the multiple screen projection windows at a second set of code rates according to the second screen projection parameter, the second set of code rates including a second code rate for each of the multiple applications.
  • the first screen projection parameter includes at least one of the following parameters: network status information, user experience information, and window status information, and the acquisition module is further configured to obtain the network status information, user experience information, and window status information.
  • The network status information is used to characterize the status of the transmission channels used by the multiple applications during screen projection; the user experience information is used to characterize the user's quality experience of each application's screen projection image; and the window status information is used to characterize the real-time status of each screen projection window.
  • In a third aspect, embodiments of the present application provide an electronic device.
  • the device includes: a memory and one or more processors, the memory being coupled to the processors; the memory stores computer program code, the computer program code includes computer instructions, and when the computer instructions are executed by the processors, the electronic device is caused to execute the multi-window screen projection method described in any one of the designs of the above first aspect.
  • In a fourth aspect, embodiments of the present application provide a multi-window screen projection system.
  • the system includes a first device and a second device. The first device is configured to: obtain a first screen projection parameter, the first screen projection parameter being used to characterize the characteristics of casting the screen projection data of multiple applications to multiple screen projection windows; project the screen projection data of the multiple applications to the multiple screen projection windows at a first set of code rates according to the first screen projection parameter, the first set of code rates including a first code rate for each of the multiple applications; and obtain a second screen projection parameter and, according to the second screen projection parameter, project the screen projection data of the multiple applications to the multiple screen projection windows at a second set of code rates, the second set of code rates including a second code rate for each of the multiple applications. The second device is configured to display the screen projection data of each of the multiple applications through a screen projection window.
  • In a fifth aspect, embodiments of the present application provide a computer-readable storage medium that includes computer instructions.
  • When the computer instructions are run on an electronic device, they cause the electronic device to execute the multi-window screen projection method described in any one of the designs of the above first aspect.
  • In a sixth aspect, embodiments of the present application provide a computer program product.
  • When the computer program product is run on a computer, it causes the computer to execute the multi-window screen projection method described in any one of the designs of the above first aspect.
  • It can be understood that, for the beneficial effects achievable by the electronic device of the second aspect, the electronic device of the third aspect, the multi-window screen projection system of the fourth aspect, the computer-readable storage medium of the fifth aspect, and the computer program product of the sixth aspect provided above, reference can be made to the beneficial effects of the first aspect and of any of its possible designs, which will not be repeated here.
  • Figure 1 is a schematic diagram of a single-window screen projection scenario according to an embodiment of the present application.
  • Figure 2 is a schematic diagram of a multi-window screen projection scenario according to an embodiment of the present application.
  • Figure 3 is a schematic diagram of the screen display effect of a screen projection window according to an embodiment of the present application.
  • Figure 4 is a schematic diagram of the principle of a multi-window screen projection method according to an embodiment of the present application.
  • Figure 5 is a schematic diagram of another multi-window screen projection scenario according to an embodiment of the present application.
  • Figure 6 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
  • Figure 7 is a schematic diagram of the software structure of an electronic device according to an embodiment of the present application.
  • Figure 8 is a schematic flowchart of a multi-window screen projection method according to an embodiment of the present application.
  • Figure 9(a) is a schematic diagram of an image complexity according to an embodiment of the present application.
  • Figure 9(b) is a schematic diagram of another image complexity according to an embodiment of the present application.
  • Figure 10 is a schematic diagram of the display effect of multiple screen projection windows according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • In this application, "at least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single items or plural items.
  • For example, at least one of a, b, or c can mean: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c can each be single or multiple.
  • In the embodiments of the present application, words such as "first" and "second" are used to distinguish identical or similar items with basically the same functions and effects. Those skilled in the art can understand that such words do not limit the quantity or the execution order.
  • words such as “exemplary” or “for example” are used to represent examples, illustrations or explanations. Any embodiment or design described as “exemplary” or “such as” in the embodiments of the present application is not to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as “exemplary” or “such as” is intended to present related concepts in a concrete manner that is easier to understand.
  • the user can cast video a played on the first device 10 (a smartphone) to the second device 20 (a smart screen or other large-screen device) for playback.
  • the playback interface of video a on the second device 20 is larger, so the user can obtain a better video viewing experience.
  • As described above, multi-window screen projection technology projects multiple application interfaces launched on one electronic device (such as a first device) onto another electronic device (such as a second device), so that the interfaces can be viewed on both the first device and the second device.
  • Figure 2 shows a schematic diagram of a multi-window screen projection scenario.
  • a communication connection for multi-window screen projection is established between the first device 10 and the second device 20 .
  • In a multi-window screen projection scenario, it is assumed that while the first device 10 is displaying the mobile phone desktop, it receives the user's operations to start screen projection for a short message application, a video application, and a game application on the first device 10.
  • the first device then casts the data of the short message application, the video application, and the game application (i.e., the content displayed by each application) at the corresponding code rates.
  • bitrate refers to the number of data bits transmitted per unit time.
  • the code rate can also be called the bit rate.
  • the unit of code rate can be bits per second (bps).
  • the screen projection window corresponding to each application also has its own bit rate requirements. For example, if a screen projection window displays a game application that the user is currently operating, a higher code rate is allocated to the application corresponding to that window to ensure the smoothness of the user's current operation. As another example, if the image complexity of the application displayed in a screen projection window is high, a higher code rate needs to be allocated to the application corresponding to that window to ensure the clarity of the image.
  • the code rate is usually distributed evenly to each screen projection application; for example, each application is assigned a code rate of 5 megabits per second (Mbps). Alternatively, the code rate of each application is preset based on prior knowledge; for example, the default bit rate of a game application is 10 Mbps and that of a short message application is 3 Mbps. In both of these allocation methods, the code rate of each application is a fixed value and is not dynamically adjusted according to user operations, screen projection content, network status, and other factors.
  • If the code rate of an application cannot meet the user's usage needs, it will affect the clarity and smoothness of the display in the application's corresponding screen projection window, making that display prone to stuttering and frame skipping and degrading the user experience.
  • Figure 3 shows a schematic diagram of a screen projection window display effect according to an embodiment of the present application. If the bit rate of a screen projection window cannot meet the user's needs, the display in that window may freeze. As shown in Figure 3, the transmission requirements of the short message application are relatively low; screen projection at a code rate of 5 Mbps meets the user's needs and proceeds normally. The video application and the game application, however, have relatively large amounts of data to transmit and therefore relatively high transmission requirements; if they are also cast at a bit rate of 5 Mbps, the user's needs cannot be met, and stuttering and frame skipping occur.
  • For example, the display of the video application in screen projection window b of the second device 20 stops and a "Loading" prompt is displayed; similarly, the display of the game application in its screen projection window stops and a "Loading" prompt is displayed. If stuttering occurs continuously, frame skipping may even occur, for example, the video picture suddenly jumping from the current frame to another unrelated frame. Such stuttering and frame skipping degrade the screen projection effect and the user's experience on the second device during screen projection.
  • FIG. 4 is a schematic diagram of the principle of the multi-window screen casting method according to an embodiment of the present application.
  • a first device is used to perform multi-window screen casting to a second device.
  • this method can obtain, through network status monitoring, the network status information of the transmission channel used by the first device to project to the second device; obtain, through screen projection window monitoring, the window status information of each screen projection window on the second device; and obtain user experience information through user experience analysis.
  • Then, based on the network status information, user experience information, window status information, and other parameters, the code rate of each application is adaptively adjusted through real-time scheduling and decision-making, thereby ensuring the clarity and smoothness of the display in each application's screen projection window, alleviating stuttering and frame skipping in the screen projection windows, and improving the user experience.
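  • As an illustration of this monitor-and-decide loop (a sketch built on the earlier illustrative functions such as allocate_bitrates, not the patent's implementation), the pieces can be combined as follows; the AppState fields and the set_bitrate/get_total_rate hooks are hypothetical.

    import time
    from dataclasses import dataclass

    @dataclass
    class AppState:
        weight: float  # code rate allocation weight coefficient (window status)
        score: float   # quality experience score (user experience)

    def projection_loop(apps, set_bitrate, get_total_rate):
        """Periodically re-derive the total code rate from network status and
        redistribute it across the screen projection applications."""
        while True:
            total = get_total_rate()  # e.g. predicted bandwidth * interference
            rates = allocate_bitrates(total,
                                      [a.weight for a in apps],
                                      [a.score for a in apps])
            for app, rate in zip(apps, rates):
                set_bitrate(app, rate)  # hand the new code rate to the encoder
            time.sleep(1.0)  # re-evaluate once per second (illustrative)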
  • the screencasting method provided by the embodiment of this application is applied in the application scenario of multi-window screencasting.
  • the first device performs multi-window screencasting to the second device, and each screencasting window displays a screen of an application.
  • the first device performs multi-window screencasting to multiple electronic devices such as the second device, the third device, the fourth device, and so on.
  • the embodiments of this application do not place special restrictions on the number of electronic devices in the multi-window screen projection application scenario.
  • the multi-window screen projection method provided by the embodiment of the present application can be applied to the above-mentioned first device, second device and other electronic devices.
  • Both the first device and the second device include display screens.
  • the first device and the second device may include, but are not limited to, smartphones, netbooks, tablets, smart watches, smart bracelets, phone watches, smart cameras, handheld computers, personal computers (PCs), personal digital assistants (PDAs), portable multimedia players (PMPs), augmented reality (AR)/virtual reality (VR) devices, and televisions (TVs).
  • the first device and the second device may also be electronic devices of other types or structures, which are not limited by this application.
  • the multi-window screen projection technology is mostly used between a portable device (ie, the first device) and a large-screen device (ie, the second device).
  • a portable device is a smartphone and a large screen device is a laptop.
  • the portable device is a tablet, and the large-screen device is a television.
  • the first device and the second device can be smartphones, netbooks, tablets, smart watches, smart bracelets, phone watches, smart cameras, handheld computers, PDAs, PMPs, AR/VR devices, TVs, or other electronic devices that support multi-window screen projection.
  • the first device and the second device can establish a wireless communication connection through "touch", "scan" (such as scanning a two-dimensional code or barcode), or "proximity automatic discovery" (such as via Bluetooth or wireless fidelity (Wi-Fi)).
  • the first device and the second device may follow a wireless transmission protocol and transmit information through a wireless connection transceiver.
  • the wireless transmission protocol may include but is not limited to a Bluetooth (BT) transmission protocol or a Wi-Fi transmission protocol.
  • the Wi-Fi transmission protocol may be the Wi-Fi P2P transmission protocol.
  • the wireless connection transceiver includes but is not limited to Bluetooth, Wi-Fi and other transceivers.
  • a wired communication connection may be established between the first device and the second device.
  • the first device and the second device can establish a wired communication connection through a video graphics array (VGA) interface, a digital visual interface (DVI), a high definition multimedia interface (HDMI), a data transmission line, or the like.
  • Information transmission is realized between the first device and the second device through the established wired communication connection. This application does not limit the specific connection method between the first device and the second device.
  • FIG. 6 shows a schematic diagram of the hardware structure of the electronic device 100.
  • the electronic device 100 shown in FIG. 6 may be a first device or a second device.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , Antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker, receiver, microphone, headphone interface, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194 , and subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • processor 110 may include one or more interfaces.
  • Interfaces may include integrated circuit (inter-integrated circuit, I2C) interface, integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, pulse code modulation (pulse code modulation, PCM) interface, universal asynchronous receiver and transmitter (universal asynchronous receiver/transmitter (UART) interface, mobile industry processor interface (MIPI), general-purpose input/output (GPIO) interface, subscriber identity module (SIM) interface, and /or universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to a touch sensor through an I2C interface, so that the processor 110 and the touch sensor communicate through an I2C bus interface to implement the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled to the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to implement the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationships between the modules illustrated in the embodiment of the present invention are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt interface connection methods different from those in the above embodiments, or a combination of multiple interface connection methods.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the first device can send the screen projection data of applications to the second device through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, and so on.
  • the second device can receive, through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, and so on, the screen projection data of applications sent by the first device.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • In the screen casting scenario, the display screen 194 of the first device can display the content of the screen casting application selected by the user on the first device, and, when the user clicks or touches the screen, further display some functional controls, such as connection controls and screen projection controls.
  • the display screen 194 of the second device can display the content of the screen casting application selected by the user on the first device.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function. Such as saving music, videos, etc. files in external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area.
  • the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), etc.
  • In the screen casting scenario, by running instructions stored in the internal memory 121, the processor 110 can adjust the code rate corresponding to each application in real time according to the screen projection parameters, thereby ensuring the clarity and smoothness of the content displayed in each application's screen projection window and alleviating problems such as stuttering and frame skipping in the screen projection windows during multi-window screen projection.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker, a receiver, a microphone, a headphone interface, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker, also called a "horn," is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speakers, or listen to hands-free calls.
  • The receiver, also called a "handset," is used to convert audio electrical signals into sound signals.
  • When the electronic device 100 answers a phone call or plays a voice message, the voice can be heard by bringing the receiver close to the ear.
  • The microphone, also called a "mic" or "sound transmitter," is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone by speaking close to it.
  • the electronic device 100 may be provided with at least one microphone. In other embodiments, the electronic device 100 may be provided with two microphones, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the electronic device 100 can also be equipped with three, four or more microphones to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the headphone jack is used to connect wired headphones.
  • the headphone interface can be a USB interface 130, or a 3.5mm Open Mobile Terminal Platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the audio module 170 can synchronously play audio content corresponding to the video content.
  • the audio module 170 of the first device can synchronously play the audio content corresponding to the video content. If the first device projects video content to the second device, the audio module 170 of the first device may not play the audio content corresponding to the video content.
  • the audio module 170 of the second device can synchronously play the audio content corresponding to the video content. If the second device displays the video content projected by the first device in the screen casting scene, the audio module 170 of the second device can synchronously play the audio content corresponding to the screen projected video content.
  • the first device casts the screen to the second device.
  • the processor 110 of the first device can adjust the code rate corresponding to each application in real time according to the screen projection parameters by running instructions stored in the internal memory 121, and transmit the screen projection data of each screen projection application to the second device at the corresponding code rate through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, etc.
  • the second device receives the screen projection data through antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, etc.
  • the second device displays screen projection data (for example, video content) through the display screen 194 and synchronously plays audio content corresponding to the video content through the audio module 170 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • This embodiment of the present invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 7 is a software structure block diagram of the electronic device 100 according to the embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android Runtime (Android runtime) and system libraries, as well as the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include short message applications, video applications, office applications, game applications, life applications, shopping applications or functional applications, etc. Specifically, for example, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • The data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication functions of the electronic device 100, for example, call status management (connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also present notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android Runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one is the function interfaces that the Java language needs to call, and the other is the core libraries of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine converts the Java files of the application layer and the application framework layer into binary files and executes them.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), screen projection module, etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the screen projection module is used for the electronic device 100 to project screens to other devices, or to receive screen projection data from other devices.
  • the screencasting module can realize the selection of screencasting applications, sending of screencasting data, receiving of screencasting data and display of screencasting data, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes a display driver, a camera driver, an audio driver, a sensor driver, etc. This embodiment of the present application does not impose any restrictions on this.
  • FIG. 8 is a schematic flowchart of a multi-window screen projection method according to an embodiment of the present application. As shown in FIG. 8 , the method may include the following steps S101-S105.
  • the first device obtains the first screen projection parameters.
  • the first device may be the first device in Figure 4.
  • the first device may have an application framework diagram as shown in Figure 7.
  • the application package in the first device may include multiple applications, such as short message applications, video applications, office applications, game applications, life applications, shopping applications, functional applications, etc.
  • the first device can receive the user's screen casting operation for multiple applications through its display screen/user interaction interface, and in response to the screen casting operation, invoke the multiple applications in the application package through the processor and project the multiple applications to the second device at an initial code rate group.
  • the initial code rate group may include multiple initial code rates corresponding to multiple applications, one application corresponds to one initial code rate, and the initial code rate may be a preset code rate.
  • the first screen casting parameter is used to characterize the screen casting characteristics of projecting the screen casting data of multiple applications to multiple screen casting windows (such as multiple screen casting windows of the second device); in other words, the first screen projection parameter characterizes the screen projection characteristics of projecting the screen projection data of the multiple applications to the multiple screen projection windows at the initial code rates.
  • the multiple applications described in this application include but are not limited to: short message applications, video applications, office applications, game applications, life applications, shopping applications or functional applications, etc.
  • Screencast data includes but is not limited to: image data, audio data or document data, etc.
  • each screen projection window can display a screen of an application.
  • the display interface of the second device includes three screen projection windows, and the three screen projection windows can respectively display interfaces of short message applications, video applications, and game applications. In this way, the user's multi-application usage needs can be met and the user's usage experience can be improved.
  • the projection characteristics of the projection window may include but are not limited to: network status characteristics of the projection transmission channel, user experience characteristics of the image displayed in the projection window, status characteristics of the projection window, etc.
  • the above-mentioned screencasting characteristics will affect the screencasting effect of each application. Therefore, the code rate of each application can be determined based on the above-mentioned screencasting characteristics.
  • the first screen projection parameter may include but is not limited to at least one of the following parameters: network status information, user experience information, and window status information.
  • the network status information is used to represent the status of the transmission channel used by the multiple applications when the first device projects a screen to the second device, such as reference signal received power (RSRP), etc.
  • the user experience information is used to characterize the user's quality experience with the screencast image of each application displayed in the screencast window of the second device, such as the score of the screencast image through the image quality evaluation method.
  • the window status information is used to represent the real-time status information of each screen projection window in the second device, such as the complexity of the image displayed in the screen projection window, the user's operation frequency, etc.
  • the first device can reasonably allocate the code rate of each application based on the network status information, user experience information, and window status information, to ensure the clarity and smoothness of the image displayed in the projection window corresponding to each application.
  • the first device can obtain the first screen projection parameters in the following manner, taking the case where the first screen projection parameters include network status information, user experience information, and window status information as an example.
  • the network status information is an actual representation of the network status, and its acquisition method is related to the transmission protocol used.
  • Network status information can be derived by analyzing historical transmission conditions, or measured by sending test data, or obtained through a partially privatized, customized network transmission protocol (such as a protocol manually defined by the user) that provides an application-layer-oriented interface.
  • Network status information can be obtained through this interface.
  • User experience information can be obtained based on known prior knowledge.
  • the window status information can be obtained through the screen of the first device or a virtual display running in the background of the first device, and detailed window status information can be obtained by analyzing the content of the screen data.
  • the first device determines the first set of code rates based on the first screen projection parameters.
  • the first set of code rates includes the first code rate of each application in the plurality of applications, that is, the first code rate of the screen projection window corresponding to each application.
  • the first set of code rates may be used to characterize the code rate allocation to each application (that is, the screen projection window corresponding to each application) based on the first screen projection parameter.
  • The following describes the method by which the first device determines the first set of code rates based on the first screen projection parameters.
  • the first device determines the total code rate that can be used for screencasting by multiple applications based on the network status information, determines the code rate allocation weight coefficient of each application based on the window status information, and determines the quality experience score corresponding to each application based on the user experience information.
  • the first device determines the first set of code rates based on the total code rate, the code rate allocation weight coefficient of each application, and the quality experience score corresponding to each application. Specifically, under the constraint that the sum of the first code rates corresponding to all applications equals the total code rate, the first code rate of each application is determined based on the application's code rate allocation weight coefficient and the quality experience score corresponding to the application, where K_n is the code rate allocation weight coefficient of the nth application and S_n is the quality experience score of the nth application.
  • the code rate of each application has a positive correlation with the quality experience score corresponding to the application.
  • the higher the code rate of an application, the higher the quality experience score corresponding to the application; conversely, the lower the code rate of an application, the lower the quality experience score corresponding to the application.
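  • As an illustration of the constrained allocation above, the following minimal Python sketch splits a total code rate among N applications in proportion to the product K_n · S_n. The proportional rule, the function name, and the example numbers are assumptions for illustration, not the exact formula of this embodiment.

```python
def allocate_bitrates(total_rate_bps, weights, quality_scores):
    """Split a total code rate among N applications.

    weights[n]        -> K_n, the code rate allocation weight coefficient
    quality_scores[n] -> S_n, the quality experience score
    Proportional sharing keeps the sum of per-application rates equal
    to the total rate, while giving more rate to applications with a
    larger K_n * S_n, matching the positive correlation noted above.
    """
    products = [k * s for k, s in zip(weights, quality_scores)]
    norm = sum(products)
    if norm == 0:
        # Fall back to an even split if every product is zero.
        return [total_rate_bps / len(products)] * len(products)
    return [total_rate_bps * p / norm for p in products]

# Example: 20 Mbps shared by a video, a game, and a messaging window.
rates = allocate_bitrates(
    20_000_000,
    weights=[0.5, 0.3, 0.2],          # K_n from window status information
    quality_scores=[0.9, 0.8, 0.6],   # S_n from user experience information
)
```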
  • the relationship between the first code rate of an application and the corresponding quality experience score depends on the image quality evaluation method selected for the quality experience score. The following illustrates this relationship by example, in combination with the selected image quality evaluation methods.
  • the above network status information includes network bandwidth parameters and channel interference parameters.
  • the network bandwidth parameters are used to characterize the bandwidth of the transmission channel, and the channel interference parameters are used to characterize the interference situation of the transmission channel.
  • the method for the first device to determine the total code rate that can be used for screen projection for multiple applications based on the network status information is to multiply the network bandwidth parameter and the channel interference parameter to obtain the total code rate.
  • the network bandwidth parameter is the bandwidth of the transmission channel of the projection data when the first device projects the screen to the second device.
  • Bandwidth refers to the amount of data that a transmission channel can transmit within a unit time (for example, within 1 second). The higher the network bandwidth parameter, the greater the amount of data that the transmission channel can transmit per unit time, and the higher the total bit rate that can be used for screen projection for multiple applications.
  • the channel interference parameters include: adjacent channel interference parameters and co-channel interference parameters.
  • the adjacent channel interference parameter is used to characterize the interference between adjacent or nearby transmission channels: the closer the adjacent transmission channels are, the greater the adjacent channel interference.
  • the co-channel interference parameter is used to characterize the interference caused to the transmission channel by unwanted signals of the same frequency, which is called co-channel interference (also known as same-frequency or same-channel interference).
  • the network bandwidth parameters and channel interference parameters are obtained by predicting the network bandwidth parameters at historical moments and the channel interference parameters at historical moments through a time series prediction method.
  • the first device obtains the set of network bandwidth parameters and channel interference parameters at p historical moments as {[F_0, G_0], [F_1, G_1], ..., [F_{p-1}, G_{p-1}]}, where F_n is the network bandwidth parameter at the nth moment and G_n is the channel interference parameter at the nth moment. Then, based on the network bandwidth parameters and channel interference parameters at the p historical moments, the first device uses the time series prediction method to predict the network bandwidth parameters and channel interference parameters for a future period of time.
  • the first device may use a recurrent neural network (RNN) to predict network bandwidth parameters and channel interference parameters in a future period of time.
  • RNN is a type of recursive neural network that takes sequence data as input, performs recursion in the evolution direction of the sequence, and connects all nodes (cyclic units) in a chain.
  • RNN has the characteristics of memory, parameter sharing and Turing completeness. It has certain advantages when learning time series and can be applied to the prediction of various time series. Therefore, based on the set of network bandwidth parameters and channel interference parameters at p historical moments, the first device can more accurately predict the network bandwidth parameters and channel interference parameters in a future period of time through RNN.
  • After the first device obtains the set of network bandwidth parameters and channel interference parameters at p historical moments, it can also predict the future network bandwidth parameters and channel interference parameters using the average method or the weighted average method. Specifically, the first device calculates the average of the network bandwidth parameters at the p historical moments to obtain the predicted network bandwidth parameter, and calculates the average of the channel interference parameters at the p historical moments to obtain the predicted channel interference parameter. Alternatively, the network bandwidth parameters and channel interference parameters at the p moments are weighted according to the actual network status at those p historical moments.
  • the first device calculates a weighted average of the network bandwidth parameters and channel interference parameters at p historical moments to obtain the predicted network bandwidth parameters and channel interference parameters. In this way, the predicted network bandwidth parameters and channel interference parameters can more truly reflect the network status information in the future.
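  • A minimal sketch of the weighted-average prediction described above is given below, assuming each historical sample is a [F_n, G_n] pair. The geometric decay used to weight recent samples more heavily is an illustrative choice; in practice the weights would follow the actual network status at each historical moment.

```python
def predict_network_state(history, decay=0.8):
    """Predict the next network bandwidth parameter F and channel
    interference parameter G from p historical [F_n, G_n] samples
    (oldest first). decay=1.0 reduces to the plain average method."""
    p = len(history)
    weights = [decay ** (p - 1 - i) for i in range(p)]  # newest weighs most
    total_w = sum(weights)
    f_pred = sum(w * f for w, (f, _) in zip(weights, history)) / total_w
    g_pred = sum(w * g for w, (_, g) in zip(weights, history)) / total_w
    return f_pred, g_pred

history = [[50e6, 0.90], [48e6, 0.85], [40e6, 0.70]]  # p = 3 samples
f_hat, g_hat = predict_network_state(history)
total_rate = f_hat * g_hat  # total code rate = bandwidth x interference factor
```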
  • the above-mentioned user experience information includes a user experience score corresponding to each application, and the user experience score corresponding to the application is the user's quality experience score for the screen-cast image of the application.
  • the user experience score represents the user's user experience on the quality of the screen-cast image, and reflects the relationship between the image quality evaluation method and the user experience.
  • user experience is usually difficult to quantify. Therefore, it is necessary to determine the impact of different image quality evaluation methods on user experience, and then combine the scores of different image quality evaluation methods to determine the user experience score.
  • the user experience score is obtained based on the image quality evaluation score and the corresponding preset image quality evaluation weight coefficient; the user experience score is the product of the image quality evaluation score and the preset image quality evaluation weight coefficient.
  • the image quality evaluation score scores the quality of the screen projection image according to the selected image quality evaluation method.
  • the preset image quality evaluation weight coefficient represents the impact of different image quality evaluation methods on user experience.
  • the preset image quality evaluation weight coefficients can be based on experimenters' subjective evaluation of a large amount of image data: from the statistical results of the subjective evaluations combined with the objective scores of each image quality evaluation method, the weight coefficient of each image quality evaluation method is determined as the preset image quality evaluation weight coefficient, so it can accurately reflect the impact of different image quality evaluation methods on user experience.
  • the above-mentioned image quality evaluation scores include one or more of the following scores: a peak signal-to-noise ratio (PSNR) score, a structural similarity (SSIM) score, or a mean square error (MSE) score.
  • image quality evaluation methods may include but are not limited to: the PSNR method, the SSIM method, the MSE method, etc.
  • the PSNR method is the peak signal-to-noise ratio evaluation method, which is a method used to measure the degree of image distortion or noise level.
  • PSNR is the ratio of the maximum possible power of a signal to the power of destructive noise that affects its representation accuracy, expressed in decibel units.
  • the original image undergoes compression and other operations to obtain an output image, and the output image will be different from the original image to some extent.
  • the image quality of the output image can be evaluated by the PSNR method. Specifically, the larger the PSNR score value, the smaller the output image distortion and the higher the image quality.
  • the SSIM method is the structural similarity evaluation method, a method used to measure the similarity between two images or to judge the quality of an image after compression. From the perspective of image composition, SSIM defines structural information as independent of brightness and contrast, reflecting the properties of object structure in the scene, and models distortion as a combination of three factors: brightness, contrast, and structure. The mean is used as an estimate of brightness, the standard deviation as an estimate of contrast, and the covariance as a measure of structural similarity. Specifically, the larger the SSIM score value, the higher the image quality.
  • the MSE method is the mean square error evaluation method, which is a method that can be used to measure the degree of image distortion.
  • the MSE method first calculates the mean square value of the pixel difference between the original image and the output image, and then determines the degree of distortion of the output image through the size of the mean square value. Specifically, the smaller the MSE score value, the smaller the output image distortion and the higher the image quality.
  • multiple image quality evaluation methods can be selected to jointly evaluate the quality of the screen projection image. The more image quality evaluation methods are selected, the more truly the user experience score can reflect the user's user experience with the screen projection image quality.
  • the user experience score of the nth application can be written as the weighted sum S_n = w_psnr,n · PSNR_n + w_mse,n · MSE_n + w_ssim,n · SSIM_n, where S_n is the user experience score of the nth application; PSNR_n, MSE_n, and SSIM_n are the PSNR score, MSE score, and SSIM score of the nth application; and w_psnr,n, w_mse,n, and w_ssim,n are the preset image quality evaluation weight coefficients of the PSNR score, MSE score, and SSIM score of the nth application, respectively.
  • Ori and Target are the original image and the encoded image respectively, and i and j are the horizontal and vertical coordinates of a pixel in the image, such that Target_ij = f(Ori_ij, A), where f(·) represents the process by which the encoder encodes the image at code rate A.
  • the PSNR_n score can be calculated by the following expression: PSNR_n = 10 · log10(L² / MSE_n), where L is the dynamic range of the pixel value.
  • the SSIM_n score can be calculated by the following expression (in its standard form): SSIM_n = [(2 · μ_ori · μ_target + c_1)(2 · σ_ori,target + c_2)] / [(μ_ori² + μ_target² + c_1)(σ_ori² + σ_target² + c_2)], where μ_target is the mean value of Target, μ_ori is the mean value of Ori, σ_ori² and σ_target² are the corresponding variances, σ_ori,target is the covariance of Ori and Target, c_1 = (0.01L)², c_2 = (0.03L)², and L is the dynamic range of the pixel value.
  • Taking the case where the PSNR method, SSIM method, and MSE method are jointly used to evaluate the quality of the screen projection image as an example, the relationship between the first code rate A and the corresponding quality experience score follows from the expressions above together with MSE_n = (1/(I·J)) · Σ_i Σ_j [f(Ori_ij, A) − Ori_ij]², where I and J are the image dimensions; from this relationship, the first code rate of each application is determined.
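  • To make the three scores concrete, the following Python sketch computes MSE, PSNR, and a single-window (global) SSIM for 8-bit grayscale frames, then combines them into the user experience score S_n. The global SSIM (production implementations usually use a sliding local window) and the assumption that the preset weight coefficients absorb the differing scales of the three scores are simplifications for illustration.

```python
import numpy as np

def mse(ori, target):
    """Mean square value of the pixel difference; smaller is better."""
    diff = target.astype(np.float64) - ori.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(ori, target, L=255.0):
    """Peak signal-to-noise ratio in decibels; larger is better."""
    err = mse(ori, target)
    return float("inf") if err == 0 else 10.0 * np.log10(L ** 2 / err)

def ssim_global(ori, target, L=255.0):
    """Structural similarity over the whole frame; larger is better."""
    ori = ori.astype(np.float64)
    target = target.astype(np.float64)
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mu_o, mu_t = ori.mean(), target.mean()
    var_o, var_t = ori.var(), target.var()
    cov = ((ori - mu_o) * (target - mu_t)).mean()
    return ((2 * mu_o * mu_t + c1) * (2 * cov + c2)) / (
        (mu_o ** 2 + mu_t ** 2 + c1) * (var_o + var_t + c2))

def user_experience_score(ori, target, w_psnr, w_mse, w_ssim):
    """S_n as the weighted sum of the three evaluation scores; the
    preset weights are assumed to fold in any scale normalization
    (note MSE decreases, while PSNR and SSIM increase, with quality)."""
    return (w_psnr * psnr(ori, target)
            + w_mse * mse(ori, target)
            + w_ssim * ssim_global(ori, target))
```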
  • the above-mentioned window status information includes a code rate allocation weight coefficient for each application.
  • the code rate allocation weight coefficient of an application can be used to characterize the proportion of the code rate allocated to the application.
  • the window status information is used to characterize the real-time status of each screen projection window in the second device.
  • the real-time status of the screen casting window includes: the image complexity of the application corresponding to the screen casting window, the user's operating frequency on the screen casting window (that is, the attention the user invests in the window), the preset priority (weight coefficient) of the screen casting window, etc.
  • the code rate allocation weight coefficient of the corresponding application of each screen projection window can be determined.
  • the first device obtains the user attention weight coefficient, the image complexity weight coefficient and the preset weight coefficient.
  • the code rate allocation weight coefficient is the product of the user attention weight coefficient, the image complexity weight coefficient, and the preset weight coefficient.
  • the user attention weight coefficient is the user's attention invested in the screen projection window, which can be reflected by the user's historical operation frequency of the screen projection window. Therefore, the user attention weight coefficient can be obtained based on the operating frequency of the window interface in the historical time period and the corresponding preset frequency weight coefficient.
  • the operating frequency of the window interface in the historical time period includes: a first operating frequency of the window interface in a first historical time period, and a second operating frequency of the window interface in a second historical time period, where the first historical time period is longer than the second historical time period.
  • the user attention weight coefficient is the sum of the product of the first operating frequency and the first frequency preset weight coefficient, and the product of the second operating frequency and the second frequency preset weight coefficient.
  • the first historical time period can be much longer than the second historical time period. For example, the first historical time period may be within the last 1 minute, and the second historical time period may be within the last 0.1 seconds.
  • the preset frequency weight coefficient includes: a first frequency preset weight coefficient corresponding to the first operating frequency, and a second frequency preset weight coefficient corresponding to the second operating frequency.
  • the preset frequency weight coefficients can be set by the user according to the importance of the first operating frequency and the second operating frequency. For example, if the first operating frequency in the first historical time period is high, but the first operating frequency is the frequency of operating a game application that is of low importance to the user, then for the first operating frequency the user can set a lower weight coefficient. If the second operating frequency in the second historical time period is low, but the second operating frequency is the frequency of operating an office application that is of higher importance to the user, then for the second operating frequency the user can set a higher weight coefficient. In this way, the user attention weight coefficient determined based on the preset frequency weight coefficients can more accurately reflect the user's attention to the screen projection window.
  • the user attention weight coefficient can also be determined based on the operating frequency of the window interface in more than two historical time periods and the corresponding preset weight coefficient. The more historical time periods selected, the more truly it can reflect the user's attention on the screen casting window.
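  • A minimal sketch of this two-window weighting follows, with illustrative (assumed) preset frequency weight coefficients that favour recent activity:

```python
def attention_weight(freq_long, freq_short, w_long=0.3, w_short=0.7):
    """User attention weight coefficient for one projection window.

    freq_long  : first operating frequency, measured over the longer
                 first historical time period (e.g. the last minute)
    freq_short : second operating frequency, measured over the shorter
                 second historical time period (e.g. the last 0.1 s)
    Returns the sum of the two frequency/preset-weight products."""
    return freq_long * w_long + freq_short * w_short
```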
  • the above image complexity weight coefficient can be obtained based on the complexity of the historical multi-frame images of the window interface and the preset complexity weight coefficient corresponding to each frame of image; the image complexity weight coefficient is the product of the complexity of the historical multi-frame images and the preset complexity weight coefficients corresponding to each frame of image.
  • the image complexity weight coefficient is determined by the complexity of the historical multi-frame images in the projection window and the corresponding weight coefficient.
  • the complexity of the image includes but is not limited to: the color (red green blue, RGB) complexity of the image, the graphics complexity of the image, etc.
  • the preset complexity weight coefficient can be determined by the user based on the importance of each historical image frame.
  • Figure 9(a) is a schematic diagram of image complexity according to an embodiment of the present application. As shown in Figure 9(a), if the image complexity of the first frame image is high, but the first frame image is a background image without valid text information, the first frame image is of low importance to the user. Therefore, for the first frame image, the user can set a lower complexity weight coefficient.
  • Figure 9(b) is a schematic diagram of another image complexity shown in an embodiment of the present application.
  • If the image complexity of the second frame image is low, but the second frame image includes important text information, the second frame image is more important to the user. Therefore, for the second frame image, the user can set a higher complexity weight coefficient. In this way, the image complexity weight coefficient determined based on the per-frame image complexity and the corresponding preset complexity weight coefficients can more accurately reflect the image complexity of the screen projection window.
  • the above-mentioned preset weight coefficient is obtained based on the window-started application weight coefficient and the preset window weight coefficient; the preset weight coefficient is the product of the window-started application weight coefficient and the preset window weight coefficient.
  • the application weight coefficient for window startup can be preset by the user.
  • the user can set the weight coefficient of the application launched by the window according to the importance of the application launched by the window.
  • For example, the applications that can be launched by the windows include: a short message application, a video application, and an office application. If the current user needs to watch a video through the video application and will send and receive messages through the short message application, but will not use the office application for document editing, then the order of importance of the applications from high to low is: video application, short message application, office application. Accordingly, the window-launched application weight coefficients set by the user from high to low are: video application, short message application, office application. In this way, the window-launched application weight coefficient can truly reflect which type of application the current user wants to use.
  • the preset window weight coefficient can also be preset by the user.
  • the preset window weight coefficient can be set according to the area size of the screen projection window or the sequential position of the screen projection window display.
  • Figure 10 is a schematic diagram of the display effect of multiple screen projection windows according to an embodiment of the present application. As shown in Figure 10, the second electronic device includes three projection windows: a first screen projection window, a second screen projection window, and a third screen projection window. In descending order of area, they are: the first screen projection window, the second screen projection window, and the third screen projection window.
  • the first projection window has the largest area and can be used to display images of highly important applications.
  • the third screen projection window has the smallest area and can be used to display images of less important applications. Therefore, the preset window weight coefficients set by the user from high to low are: the first screen projection window, the second screen projection window, and the third screen projection window. In this way, the importance of the current screen projection window can be truly reflected by presetting the window weight coefficient.
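  • Putting the three factors together, the sketch below computes the code rate allocation weight coefficient K_n for one window as the product of the user attention weight coefficient, an image complexity weight coefficient, and the preset weight coefficient. Combining the per-frame complexity/weight products by summation is one illustrative reading, and the function and argument names are assumptions.

```python
def allocation_weight(attention_coeff, frame_complexities, frame_weights,
                      app_weight, window_weight):
    """K_n for one projection window.

    frame_complexities : complexities of the historical frames
    frame_weights      : preset complexity weight coefficient per frame
    app_weight         : window-launched application weight coefficient
    window_weight      : preset window weight coefficient (e.g. by area)
    """
    complexity_coeff = sum(c * w for c, w in zip(frame_complexities,
                                                 frame_weights))
    preset_coeff = app_weight * window_weight
    return attention_coeff * complexity_coeff * preset_coeff
```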
  • the first device sends screen projection data of multiple applications to the second device at the first set of code rates.
  • the first device sends the screen projection data of each application to the second device at the corresponding first code rate.
  • the screen projection data at least includes: image data.
  • screencast data can also include: audio data, document data, etc.
  • before S103, the method further includes: the first device encodes and compresses the screen projection data of the multiple applications according to the first set of code rates.
  • the first device encodes and compresses the original screen projection data of each application to obtain the encoded screen projection data of each application, so that during data transmission the encoded screen projection data of each application meets the first code rate, that is, is less than or equal to the first code rate. In this way, the screen projection data of each application can be successfully transmitted to the second device while meeting the first code rate corresponding to that application.
  • Video encoding refers to the method of converting a video format file into another video format file through a specific compression technology.
  • video encoding can use standards such as H.261, H.263, H.263+, H.263++, H.264, H.265, MPEG-1, MPEG-2, or MPEG-4.
  • Video decoding is the reverse process of video encoding.
  • For the specific processes of video encoding and video decoding, please refer to the explanations and descriptions in the conventional technology; they will not be described in detail in this application.
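  • As one concrete (assumed) way to meet a per-application code rate, the sketch below re-encodes a captured stream with the ffmpeg command-line tool using H.264; any of the standards listed above could be substituted, and this embodiment does not prescribe a particular encoder or tool.

```python
import subprocess

def encode_at_bitrate(src_path, dst_path, bitrate_bps):
    """Re-encode one application's screen projection stream so that its
    bitrate stays at or below the allocated first code rate."""
    kbps = int(bitrate_bps // 1000)
    subprocess.run([
        "ffmpeg", "-y", "-i", src_path,
        "-c:v", "libx264",
        "-b:v", f"{kbps}k",          # target bitrate
        "-maxrate", f"{kbps}k",      # cap the instantaneous rate
        "-bufsize", f"{2 * kbps}k",  # rate-control buffer
        dst_path,
    ], check=True)
```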
  • the second device receives the screen projection data of multiple applications, and displays the screen projection data of the multiple applications through different screen projection windows.
  • the second device displays the received screen projection data of multiple applications through different screen projection windows respectively, so as to realize multi-window screen projection of multiple applications.
  • if the second device receives encoded screencast data, it also needs to decode the encoded screencast data to obtain the original screencast data, that is, the unencoded, uncompressed screencast data of each application, and then display the original screen projection data of the multiple applications through different screen projection windows.
  • the first device obtains the second screen projection parameters, and according to the second screen projection parameters, projects the screen projection data of multiple applications to multiple screen projection windows at the second set of code rates.
  • the second set of code rates includes the second code rate of each application in the multiple applications.
  • the first device can also obtain the second screen projection parameters, determine the second set of code rates according to the method in S102 above, and project the screen projection data of the multiple applications to the multiple screen projection windows at the second set of code rates.
  • In this way, the first device can adaptively adjust the code rate of each application (that is, of the corresponding screen projection window) according to the screen casting parameters, and send each application's data to the second device at the adjusted code rate. This ensures the clarity and smoothness of the picture displayed in the projection window corresponding to each application, mitigates stuttering and frame skipping in the projection window display during screen casting, and improves the user experience.
  • S105 is an optional step. If after executing S104, the first device completes screencasting to the second device, S105 will not be executed. If the first device has not finished casting the screen to the second device, S105 is executed to enable the first device to adaptively adjust the code rate of each application (ie, the corresponding screen casting window) according to the screen casting parameters.
  • the first device may also periodically obtain the second screen projection parameters at a certain frequency to adjust the code rate of each application in real time. Specifically, the higher the frequency at which the first device obtains the second projection parameters, the faster the code rate of each application can be adjusted and updated, which can further ensure the clarity and smoothness of the content displayed in the projection window corresponding to each application.
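  • The adaptive update of S105 can be pictured as a simple control loop. Everything below is a hypothetical skeleton in which get_params, derive_rates, send_streams, and projection_active stand in for the device-specific steps of S101-S104.

```python
import time

def projection_loop(get_params, derive_rates, send_streams,
                    projection_active, period_s=0.5):
    """Periodically re-derive per-application code rates while the
    screen projection session is active."""
    while projection_active():
        params = get_params()         # network / user experience / window info
        rates = derive_rates(params)  # a new set of per-application code rates
        send_streams(rates)           # encode and transmit each application
        time.sleep(period_s)          # a higher frequency gives faster updates
```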
  • In order to implement the above functions, each node, such as the first device and the second device, includes a corresponding hardware structure and/or software module for performing each function.
  • Combined with the algorithm steps of each example described in the embodiments disclosed herein, the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application can group functional modules of the first device, second device, etc. according to the above method examples.
  • each functional module can be grouped corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or software function modules. It should be noted that the grouping of modules in the embodiment of the present application is schematic and is only a logical function grouping. In actual implementation, there may be other grouping methods.
  • Figure 11 shows a structural diagram of an electronic device 300.
  • the electronic device 300 may be a first device, a chip in the first device, or a system on a chip.
  • the electronic device 300 may be used to perform the functions of the first device involved in the above embodiments.
  • the electronic device 300 shown in FIG. 11 includes: an acquisition module 310 and a processing module 320.
  • the acquisition module 310 is used to obtain the first screen projection parameter
  • the processing module 320 is used to project the screen projection data of multiple applications to multiple screen projection windows at a first set of code rates according to the first screen projection parameter.
  • the first set of code rates includes a first code rate for each application in the plurality of applications.
  • the acquisition module 310 is also used to obtain a second screen projection parameter; the processing module 320 is also used to project the screen projection data of multiple applications to multiple screen projection windows at a second set of code rates according to the second screen projection parameter.
  • the second set of code rates includes a second code rate for each of the multiple applications.
  • the first screen casting parameters include: network status information, user experience information and window status information.
  • the acquisition module 310 is also used to: acquire network status information, user experience information and window status information.
  • the network status information is used to characterize the status of the transmission channels used by the multiple applications when casting the screen
  • the user experience information is used to characterize the user's quality experience of the screen projection image of each application
  • the window status information is used to characterize the real-time status of each screen projection window.
  • the method performed by the above acquisition module 310 can be completed by multiple modules respectively.
  • the electronic device 300 includes: a network monitoring module, a user experience analysis report building module and a window monitoring module.
  • the network monitoring module is used to obtain network status information
  • the user experience analysis report building module is used to obtain user experience information
  • the window monitoring module is used to obtain window status information.
  • the names of the above modules are only for illustrative purposes, and each module can execute the corresponding method, which is not limited in the embodiments of the present application.
  • the above-mentioned electronic device 300 may also include components as shown in Figure 6.
  • the transceiving actions of the above-mentioned electronic device 300 may be performed by antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, and other components in Figure 6, and the specific processing actions may be performed by the processor 110 in Figure 6.
  • An embodiment of the present application also provides an electronic device, which may include one or more processors, memories, and communication interfaces.
  • the memory, communication interface and processor are coupled.
  • memory, communication interfaces, and processors may be coupled together via a bus.
  • the communication interface is used for data transmission with other devices.
  • Computer program code is stored in the memory.
  • the computer program code includes computer instructions.
  • when the computer instructions are executed by the processor, the electronic device executes the multi-window screen projection method in the embodiments of the present application.
  • the above-mentioned electronic device may also include components as shown in FIG. 6 .
  • the processor may be the processor 110 in FIG. 6
  • the memory may be the internal memory 121 in FIG. 6 or an external memory connected through the external memory interface 120 .
  • the communication interface may be the USB interface in Figure 6.
  • An embodiment of the present application further provides a multi-window screen projection system.
  • the system includes a first device and a second device. The first device is used to: obtain a first screen projection parameter; project, according to the first screen projection parameter, the screen projection data of multiple applications to multiple screen projection windows of the second device at a first set of code rates, where the first set of code rates includes the first code rate of each application in the multiple applications; obtain a second screen projection parameter; and project, according to the second screen projection parameter, the screen projection data of the multiple applications to the multiple screen projection windows of the second device at a second set of code rates, where the second set of code rates includes the second code rate of each of the multiple applications.
  • the second device is used to display the screen projection data of the multiple applications, each through one screen projection window.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • Computer program code is stored in the computer storage medium.
  • When the computer program code runs on an electronic device, the electronic device executes the relevant steps of the multi-window screen projection method in the above method embodiments.
  • An embodiment of the present application also provides a computer program product.
  • When the computer program product runs on a computer, it causes the computer to perform the relevant steps of the multi-window screen projection method in the above method embodiments.
  • The electronic devices, multi-window screen projection system, computer storage media, and computer program products provided by this application are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which will not be described again here.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division.
  • there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated.
  • the components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed to multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • the software product is stored in a storage medium and includes several instructions used to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.


Abstract

This application provides a multi-window screen projection method, electronic device, and system, relating to the field of terminal technology. The method includes: obtaining a first screen projection parameter, where the first screen projection parameter is used to characterize the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows. According to the first screen projection parameter, the first device projects the screen projection data of the multiple applications to multiple screen projection windows of the second device at a first set of code rates, where the first set of code rates includes the first code rate of each application in the multiple applications. A second screen projection parameter is obtained, and according to the second screen projection parameter, the first device projects the screen projection data of the multiple applications to the multiple screen projection windows of the second device at a second set of code rates, where the second set of code rates includes the second code rate of each application in the multiple applications. Through the solution of this application, the code rate of each application can be adaptively adjusted during multi-window screen projection, thereby ensuring the clarity and smoothness of the picture displayed in the screen projection window corresponding to each application and improving the user experience.

Description

Multi-window screen projection method, electronic device, and system
This application claims priority to the Chinese patent application filed with the State Intellectual Property Office on August 5, 2022, with application number 202210938422.8 and entitled "Multi-window screen projection method, electronic device, and system", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of terminal technology, and in particular, to a multi-window screen projection method, electronic device, and system.
Background
With the development of display technology for electronic device applications, to meet users' needs for multi-screen collaborative operation, more and more electronic devices support multi-window screen projection technology. Multi-window screen projection projects multiple application interfaces launched on one electronic device (such as a first device) to another electronic device (such as a second device), so as to realize mirrored control and input collaboration on the first device and the second device.
When the first device projects a screen to the second device, the first device first encodes the screen projection data according to the code rate, then transmits the encoded screen projection data to the second device, and finally the second device decodes the received screen projection data and displays it in a window. Compared with single-window screen projection, an electronic device needs to process a larger amount of screen projection data during multi-window screen projection. Therefore, a reasonable code rate allocation needs to be performed for each screen projection window (that is, the displayed application) to ensure that the screen projection data of each application can be successfully projected to the second device.
Summary
Embodiments of this application provide a multi-window screen projection method, electronic device, and system, which can adaptively adjust the code rate of each application during multi-window screen projection, thereby ensuring the clarity and smoothness of the picture displayed in the screen projection window corresponding to each application, mitigating problems such as stuttering and frame skipping of the picture displayed in the screen projection window during screen projection, and improving the user experience.
To achieve the above purpose, the embodiments of this application adopt the following technical solutions:
In a first aspect, an embodiment of this application provides a multi-window screen projection method, including: a first device obtains a first screen projection parameter used to characterize the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows. The first device determines a first set of code rates according to the first screen projection parameter, the first set of code rates including the first code rate of each application in the multiple applications. Then, the first device projects the screen projection data of the multiple applications to the multiple screen projection windows of the second device at the first set of code rates, so that the picture of each application is correspondingly displayed in one screen projection window of the second device. The first device then obtains a second screen projection parameter and determines a second set of code rates according to the second screen projection parameter, the second set of code rates including the second code rate of each application in the multiple applications. Then, the first device projects the screen projection data of the multiple applications to the multiple screen projection windows of the second device at the second set of code rates.
With the multi-window screen projection method provided by the embodiments of this application, the first device can adaptively adjust the code rate of each application (that is, of the corresponding screen projection window) according to the screen projection parameter that characterizes the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows, and send the picture of each application to the second device at that code rate; the second device displays the picture of each application through a separate screen projection window. In this way, the clarity and smoothness of the picture displayed in the screen projection window corresponding to each application can be guaranteed, stuttering and frame skipping of the displayed picture can be prevented, and the user experience can be improved.
With reference to the first aspect, in an optional implementation, the first screen projection parameter includes at least one of the following parameters: network status information, user experience information, and window status information. The network status information characterizes the status of the transmission channel used by the multiple applications during screen projection, the user experience information characterizes the user's quality experience of the screen projection image of each application, and the window status information characterizes the real-time status of each screen projection window. In this implementation, the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows can be quantified, with at least one of the network status information, user experience information, and window status information serving as the first screen projection parameter, so that the code rate of each application can be determined according to the first screen projection parameter.
With reference to the first aspect, in an optional implementation, the method further includes: the first device determines the total code rate available for the screen projection of the multiple applications according to the network status information; determines the code rate allocation weight coefficient of each application according to the window status information; and determines the quality experience score corresponding to each application according to the user experience information. For any application, when the constraint is satisfied, the first device determines the first code rate of the application according to the application's code rate allocation weight coefficient and the quality experience score corresponding to the application, where the constraint includes that the sum of the first code rates corresponding to the multiple applications equals the total code rate. In this implementation, the first device first determines the total code rate, the code rate allocation weight coefficients, and the quality experience scores from the network status information, window status information, and user experience information respectively, and then determines the code rate of each application from the total code rate, the code rate allocation weight coefficients, and the quality experience scores. In this way, the first device reasonably allocates the code rate of each application according to the network status, window status, and user experience, so that the quality of the picture displayed in the screen projection window corresponding to each application meets the user's needs, improving the user experience.
With reference to the first aspect, in an optional implementation, the network status information includes a network bandwidth parameter and a channel interference parameter, and the total code rate is the product of the network bandwidth parameter and the channel interference parameter. In this implementation, the network bandwidth parameter characterizes the bandwidth of the transmission channel, and the channel interference parameter characterizes the interference situation of the transmission channel. The network bandwidth parameter and the channel interference parameter can truly reflect the current network status; thus, based on them, the total code rate available for the screen projection of the multiple applications can be determined more accurately.
With reference to the first aspect, in an optional implementation, the network bandwidth parameter and the channel interference parameter can be predicted through a time series prediction method based on the network bandwidth parameters and channel interference parameters at historical moments. In this implementation, the network bandwidth parameter and channel interference parameter predicted by the time series prediction method are closer to the real network status, which can improve the accuracy of the total code rate determined from the predicted network bandwidth parameter and channel interference parameter for the screen projection of the multiple applications.
With reference to the first aspect, in an optional implementation, the user experience information includes the user experience score corresponding to each application, which is the user's quality experience score for the screen projection image of the application. In this implementation, the user experience score characterizes the user's subjective experience of the application's screen projection image quality; through the user experience score, the user's subjective experience can be quantified so that the code rate of each application can be determined based on it.
With reference to the first aspect, in an optional implementation, the user experience score is obtained from the image quality evaluation score and the corresponding preset image quality evaluation weight coefficient. Specifically, the user experience score is the product of the image quality evaluation score and the preset image quality evaluation weight coefficient. In this implementation, the image quality evaluation score is the objective evaluation result of the image quality obtained through an image quality evaluation method. Optionally, the image quality evaluation score includes one or more of the following scores: a peak signal-to-noise ratio (PSNR) score, a structural similarity (SSIM) score, or a mean square error (MSE) score. The preset image quality evaluation weight coefficient characterizes the impact of different image quality evaluation methods on user experience. In this way, the user experience score determined from the image quality evaluation score and the preset image quality evaluation weight coefficient can truly reflect the user's experience of the application's screen projection image quality, so that the code rate of each application can be reasonably determined based on the user experience score.
With reference to the first aspect, in an optional implementation, the window status information includes the code rate allocation weight coefficient of each application. In this implementation, the window status information characterizes the real-time status of each screen projection window; based on the real-time status of each screen projection window, the code rate allocation weight coefficient of each application can be determined and used to further determine the code rate of each application.
With reference to the first aspect, in an optional implementation, the code rate allocation weight coefficient is obtained from at least one of the following coefficients: a user attention weight coefficient, an image complexity weight coefficient, and a preset weight coefficient. The code rate allocation weight coefficient is the product of the user attention weight coefficient, the image complexity weight coefficient, and the preset weight coefficient. In this implementation, the user attention weight coefficient characterizes the attention the user invests in the screen projection window, the image complexity weight coefficient characterizes the image complexity of the application corresponding to the screen projection window, and the preset weight coefficient characterizes the preset weight of the screen projection window. In this way, the window status information can be quantified through these three coefficients so that the code rate allocation weight coefficient of each application can be determined.
With reference to the first aspect, in an optional implementation, the user attention weight coefficient is obtained from the operating frequency on the window interface within a historical time period and the corresponding preset frequency weight coefficient. In this implementation, the user attention weight coefficient, which represents the attention the user invests in the screen projection window, can be reflected by the user's operating frequency on the window. Based on the operating frequency on the window interface within the historical time period and the corresponding preset frequency weight coefficient, the attention the user invests in the screen projection window can be reflected more accurately.
With reference to the first aspect, in an optional implementation, the operating frequency on the window interface within the historical time period includes: a first operating frequency on the window interface within a first historical time period and a second operating frequency on the window interface within a second historical time period, where the first historical time period is longer than the second historical time period. The preset frequency weight coefficient includes: a first frequency preset weight coefficient corresponding to the first operating frequency and a second frequency preset weight coefficient corresponding to the second operating frequency. The user attention weight coefficient is the sum of the product of the first operating frequency and the first frequency preset weight coefficient and the product of the second operating frequency and the second frequency preset weight coefficient. In this implementation, the user attention weight coefficient determined from the operating frequencies of two historical time periods and the corresponding preset weight coefficients can more truly reflect the attention the user actually invests in the screen projection window, further improving the accuracy of the code rate allocation weight coefficient.
With reference to the first aspect, in an optional implementation, the image complexity weight coefficient is obtained from the complexity of historical multi-frame images of the window interface and the preset complexity weight coefficient corresponding to each frame of image. Specifically, the image complexity weight coefficient is the product of the complexity of the historical multi-frame images and the preset complexity weight coefficient corresponding to each frame of image. In this implementation, the image complexity weight coefficient characterizes the image complexity of the application corresponding to the screen projection window; it can be determined from the complexity of the historical multi-frame images of the window interface and the corresponding preset complexity weight coefficients. In this way, the image complexity weight coefficient can accurately reflect the image complexity of the application corresponding to the screen projection window.
With reference to the first aspect, in an optional implementation, the preset weight coefficient is obtained from the window-launched application weight coefficient and the preset window weight coefficient. Specifically, the preset weight coefficient is the product of the window-launched application weight coefficient and the preset window weight coefficient. In this implementation, the preset weight coefficient characterizes the weight the user presets for the screen projection window, mainly including the launched application weight coefficient and the window weight coefficient. Based on the window-launched application weight coefficient and the preset window weight coefficient, the preset weight coefficient can be determined and can accurately reflect the weight the user presets for the screen projection window.
In a second aspect, an embodiment of this application provides an electronic device, including: an acquisition module, used to obtain a first screen projection parameter, where the first screen projection parameter characterizes the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows; and a processing module, used to project the screen projection data of the multiple applications to the multiple screen projection windows at a first set of code rates according to the first screen projection parameter, where the first set of code rates includes the first code rate of each application in the multiple applications. The acquisition module is further used to obtain a second screen projection parameter, and the processing module is further used to project the screen projection data of the multiple applications to the multiple screen projection windows at a second set of code rates according to the second screen projection parameter, where the second set of code rates includes the second code rate of each application in the multiple applications.
With reference to the second aspect, in an optional implementation, the first screen projection parameter includes at least one of the following parameters: network status information, user experience information, and window status information. The acquisition module is further used to obtain the network status information, user experience information, and window status information, where the network status information characterizes the status of the transmission channel used by the multiple applications during screen projection, the user experience information characterizes the user's quality experience of the screen projection image of each application, and the window status information characterizes the real-time status of each screen projection window.
In a third aspect, an embodiment of this application provides an electronic device, including: a memory and one or more processors, the memory being coupled to the processor; where the memory stores computer program code including computer instructions, and when the computer instructions are executed by the processor, the electronic device executes the multi-window screen projection method of any one of the first aspect.
In a fourth aspect, an embodiment of this application provides a multi-window screen projection system, including a first device and a second device. The first device is used to: obtain a first screen projection parameter, where the first screen projection parameter characterizes the screen projection characteristics of projecting the screen projection data of multiple applications to multiple screen projection windows of the second device; project, according to the first screen projection parameter, the screen projection data of the multiple applications to the multiple screen projection windows at a first set of code rates, where the first set of code rates includes the first code rate of each application in the multiple applications; and obtain a second screen projection parameter and project, according to the second screen projection parameter, the screen projection data of the multiple applications to the multiple screen projection windows at a second set of code rates, where the second set of code rates includes the second code rate of each application in the multiple applications. The second device is used to display the screen projection data of the multiple applications, each through one screen projection window.
In a fifth aspect, an embodiment of this application provides a computer-readable storage medium, including computer instructions that, when run on an electronic device, cause the electronic device to execute the multi-window screen projection method of any one of the first aspect.
In a sixth aspect, an embodiment of this application provides a computer program product that, when run on a computer, causes the computer to execute the multi-window screen projection method of any one of the first aspect.
It can be understood that, for the beneficial effects achievable by the electronic device of the second aspect, the electronic device of the third aspect, the multi-window screen projection system of the fourth aspect, the computer-readable storage medium of the fifth aspect, and the computer program product of the sixth aspect provided above, reference may be made to the beneficial effects of the first aspect and any of its possible designs, which will not be repeated here.
附图说明
图1为本申请实施例示出的一种单窗口投屏场景的示意图;
图2为本申请实施例示出的一种多窗口投屏场景的示意图;
图3为本申请实施例示出的一种投屏窗口显示画面效果的示意图;
图4为本申请实施例示出的多窗口投屏方法的原理示意图;
图5为本申请实施例示出的另一种多窗口投屏场景的示意图;
图6为本申请实施例示出的电子设备的硬件结构示意图;
图7为本申请实施例示出的电子设备的软件结构示意图;
图8为本申请实施例示出的多窗口投屏方法的流程示意图;
图9(a)为本申请实施例示出的一种图像复杂度的示意图;
图9(b)为本申请实施例示出的另一种图像复杂度的示意图;
图10为本申请实施例示出的一种多个投屏窗口显示效果的示意图;
图11为本申请实施例示出的一种电子设备的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请的描述中,除非另有说明,“/”表示前后关联的对象是一种“或”的关系,例如,A/B可以表示A或B;本申请中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,其中A,B可以是单数或者复数。并且,在本申请的描述中,除非另有说明,“多个”是指两个或多于两个。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
另外,为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。同时,在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念,便于理解。
随着电子设备的发展,智能手机、笔记本电脑、平板电脑、智慧屏等电子设备智能化的程度快速提升,电子设备之间的交互也越来越频繁。多屏协同作为电子设备交互的主要方式之一,能够弥补电子设备自身的不足,充分发挥不同电子设备的优势,进而提升用户体验。例如,智能手机具有携带方便的优势,主要用于用户的日常通信和社交。但是,智能手机的屏幕较小,在观看视频或者玩游戏等应用场景下,用户难以获得较高的使用体验。而智慧屏、电视等大屏设备具有屏幕大、显示分辨率高等优势。因此,用户可以通过窗口投屏方式将智能手机上的画面投屏到其他设备(比如智慧屏或电视等),获得较高的使用体验。
请参见图1所示的单窗口投屏场景,用户可以将第一设备10(智能手机)上播放的视频a投屏到第二设备20(智慧屏等)大屏设备上进行播放,在第二设备20上视频a的播放界面更大,这样用户可以获得较高的视频观看体验。
但是，随着使用需求的不断提高，单一应用的投屏已无法满足用户的需要。例如，用户在通过视频应用观看视频的同时，还想要通过短消息应用查看和发送短消息。如果频繁地切换视频应用投屏窗口和短消息应用投屏窗口，则会严重影响用户的使用体验。因此，为满足用户对多应用操作的需求，越来越多的电子设备支持多窗口投屏技术。多窗口投屏技术是通过将一个电子设备(如第一设备)上启动的多个应用界面投屏至另一个电子设备(如第二设备)，以实现在第一设备和第二设备上的镜像操控和输入协同。
图2示出了一种多窗口投屏场景的示意图。如图2所示,假设第一设备10与第二设备20之间建立了用于多窗口投屏的通信连接。在多窗口投屏场景中,假设在第一设备10显示手机桌面的同时,接收到用户对第一设备10上短消息应用、视频应用和游戏应用进行投屏的启动操作。响应于用户对短消息应用、视频应用和游戏应用进行投屏的启动操作,第一设备将短消息应用、视频应用和游戏应用投屏的数据(即应用显示的内容)根据对应的码率进行编码,然后将编码后的投屏数据传输至第二设备,最后第二设备对接收到的投屏数据进行解码,并在投屏窗口中显示。如图2所示,在第二设备20中通过投屏窗口a、投屏窗口b、投屏窗口c分别显示短消息应用、视频应用和游戏应用的画面,以实现对多个应用通过多个投屏窗口同时显示。其中,码率(bitrate,br):是指单位时间内传输的数据位数。例如,单位时间内传输的比特(bit)数,因此码率也可以称为比特率。码率的单位可以是比特每秒(bit per second,bps)。
相对于单窗口投屏,电子设备进行多窗口投屏时需要处理的投屏数据量较大。并且,根据用户 对不同应用的使用需求及应用显示内容的复杂程度,每个应用对应的投屏窗口对于码率的需求也不同。例如,如果投屏窗口显示用户正在进行的游戏应用时,则对于该投屏窗口对应的游戏应用分配较高的码率,以保证用户当前操作的流畅性。再例如,如果投屏窗口显示应用的图像复杂度较高,则需要对该投屏窗口对应的应用分配较多的码率,以保证图像的清晰度。相对的,如果投屏窗口显示应用的图像复杂度较低,则对该投屏窗口对应的应用分配较少的码率,即可保证图像的清晰度。因此,在多窗口投屏的应用场景中,需要对每个应用(即应用对应的投屏窗口)进行合理的码率分配,以确保每个应用的投屏数据都可以成功的投屏至第二设备。
现有的多窗口投屏技术中,通常将码率平均分配给每个投屏应用(或者称为应用)。例如,每个应用都被分配为5兆(M)的码率。或者,基于先验知识为每个应用预设码率。例如,游戏应用的码率预设为10M,短消息应用的码率预设为3M。上述两种应用的码率分配方法中,每个应用的码率均为固定值,不随用户操作、投屏内容、网络状态等因素动态调整。如果应用的码率无法满足用户的使用需求,则会影响应用对应的投屏窗口显示画面的清晰度和流畅度,使投屏窗口显示画面容易出现卡顿和跳帧的情况,影响用户的使用体验。
示例性的，以多个投屏应用(比如短信应用、视频应用、游戏应用)均被分配固定的5M的码率为例，图3为本申请实施例示出的一种投屏窗口显示画面效果的示意图。如果一个投屏窗口的码率无法满足用户使用需求，则投屏窗口显示的画面可能会出现卡顿的情况。如图3所示，短信应用的传输需求比较低，此时以5M的码率进行投屏可以满足用户使用需求，实现正常投屏；而视频应用、游戏应用因其传输数据量比较大，传输需求比较高，此时若也以5M的码率进行投屏，则无法满足用户使用需求，出现卡顿和跳帧，如第二设备20中，投屏窗口b中视频应用的画面停止，投屏窗口c中游戏应用的画面停止，均显示“加载中…”的提示。如果卡顿的情况不断出现，甚至还会出现跳帧的情况，例如视频画面突然从当前的一帧跳转到了不相关的另一帧等。这种卡顿和跳帧的情况，会影响投屏的效果，也会影响投屏时用户在第二设备上的使用体验。
为了避免在多窗口投屏的应用场景中,因分配给应用的码率无法满足用户使用需求,而导致应用对应的投屏窗口显示内容出现卡顿和跳帧等问题。本申请实施例提供一种多窗口投屏方法,图4为本申请实施例示出的多窗口投屏方法的原理示意图,如图4所示,以第一设备向第二设备进行多窗口投屏为例,该方法能够通过网络状态监控,获取第一设备向第二设备投屏时所用的传输信道的网络状态信息,通过投屏窗口监控,获取第二设备中每个投屏窗口的窗口状态信息,通过用户体验分析,获取用户体验信息。然后,根据网络状态信息、用户体验信息和窗口状态信息等参数通过实时调度与决策自适应调整每个应用的码率,从而保证每个应用对应的投屏窗口显示画面的清晰度和流畅度,改善投屏过程中投屏窗口显示画面容易出现卡顿和跳帧等问题,提高用户的使用体验。
本申请实施例提供的投屏方法应用于多窗口投屏的应用场景中。例如,如图4所示,第一设备向第二设备进行多窗口投屏,每个投屏窗口显示一个应用的画面。或者,如图5所示,第一设备向第二设备、第三设备、第四设备等多个电子设备进行多窗口投屏。本申请实施例对多窗口投屏应用场景中的电子设备的数量不做特殊限制。
示例性的,在本申请实施例中,本申请实施例提供的多窗口投屏方法,可以应用于上述第一设备、第二设备等电子设备中。第一设备和第二设备均包括显示屏。第一设备和第二设备可以包括但不限于智能手机、上网本、平板电脑、智能手表、智能手环、电话手表、智能相机、掌上电脑、个人计算机(personal computer,PC)、个人数字助理(personal digital assistant,PDA)、便携式多媒体播放器(portable multimedia player,PMP)、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、电视机、投影设备或人机交互场景中的体感游戏机等。或者,第一设备和第二设备还可以是其他类型或结构的电子设备,本申请不限定。
通常,为了发挥多窗口投屏技术的最大优势,多窗口投屏技术多用于便携设备(即第一设备)与大屏设备(即第二设备)之间。例如,便携设备是智能手机,大屏设备是笔记本电脑。又如,便携设备是平板电脑,大屏设备是电视机。当然,本申请不限定多窗口投屏场景中的具体设备,如上文所述,第一设备和第二设备可以为智能手机、上网本、平板电脑、智能手表、智能手环、电话手表、智能相机、掌上电脑、PDA、PMP、AR/VR设备或电视机等任意支持多窗口投屏的电子设备。
在本申请实施例中，第一设备与第二设备之间可以通过“碰一碰”、“扫一扫”(如扫描二维码或条形码)、“靠近自动发现”(如借助蓝牙或无线保真(wireless fidelity,Wi-Fi))等方式建立无线通信连接。其中，第一设备与第二设备之间可以遵循无线传输协议，通过无线连接收发器传输信息。其中，该无线传输协议可以包含但不限于蓝牙(bluetooth,BT)传输协议或Wi-Fi传输协议等。例如，Wi-Fi传输协议可以是Wi-Fi P2P传输协议。该无线连接收发器包含但不限于蓝牙，Wi-Fi等收发器。通过无线配对，实现第一设备与第二设备之间的信息传输。其中，第一设备与第二设备之间传输的信息包括但不限于需要显示的内容数据(如标准视频流)和控制指令等。
或者,第一设备与第二设备之间可以建立有线通信连接。例如,第一设备与第二设备之间通过视频图像配接器(video graphics array,VGA)、数字视频接口(digital visual interface,DVI)、高清多媒体接口(high definition multimedia interface,HDMI)或数据传输线等建立有线通信连接。第一设备与第二设备之间通过建立的有线通信连接实现信息传输。本申请不限定第一设备与第二设备之间的具体连接方式。
图6示出了电子设备100的硬件结构示意图。示例性的,图6所示的电子设备100可以是第一设备,也可以是第二设备。
如图6所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器,受话器,麦克风,耳机接口,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线，包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中，处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器，充电器，闪光灯，摄像头193等。例如：处理器110可以通过I2C接口耦合触摸传感器，使处理器110与触摸传感器通过I2C总线接口通信，实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110 可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在本申请实施例中,当电子设备100为前述实施例中的第一设备时,第一设备可以通过天线1,天线2,移动通信模块150,无线通信模块160等向第二设备发送应用的投屏数据。
在本申请实施例中,当电子设备100为前述实施例中的第二设备时,第二设备可以通过天线1,天线2,移动通信模块150,无线通信模块160等接收第一设备发送应用的投屏数据。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)，有机发光二极管(organic light-emitting diode,OLED)，有源矩阵有机发光二极管(active-matrix organic light emitting diode,AMOLED)，柔性发光二极管(flex light-emitting diode,FLED)，Mini-LED，Micro-LED，Micro-OLED，量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中，电子设备100可以包括1个或N个显示屏194，N为大于1的正整数。
在本申请实施例中,当电子设备100为前述实施例中的第一设备时,第一设备的显示屏194可以显示用户在第一设备上选择的投屏应用的内容,并在用户点击或者触摸屏幕时,进一步显示一些功能控件,例如连接控件、投屏控件等。
在本申请实施例中,当电子设备100为前述实施例中的第二设备时,第二设备的显示屏194可以显示用户在第一设备上选择的投屏应用的内容。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码，所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令，从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中，存储程序区可存储操作系统，至少一个功能所需的应用程序(比如声音播放功能，图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据，电话本等)等。此外，内部存储器121可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件，闪存器件，通用闪存存储器(universal flash storage,UFS)等。
在本申请实施例中，当电子设备100是前述实施例中的第一设备和/或第二设备时，处理器110通过运行存储在内部存储器121的指令，可以根据投屏参数实时调整每个应用对应的码率，从而保证每个应用对应的投屏窗口显示内容的清晰度和流畅度，改善多窗口投屏过程中投屏窗口显示画面容易出现卡顿和跳帧等问题。
电子设备100可以通过音频模块170,扬声器,受话器,麦克风,耳机接口,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器收听音乐,或收听免提通话。
受话器,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器靠近人耳接听语音。
麦克风,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风发声,将声音信号输入到麦克风。电子设备100可以设置至少一个麦克风。在另一些实施例中,电子设备100可以设置两个麦克风,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口用于连接有线耳机。耳机接口可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
电子设备100在显示视频内容时,音频模块170可以同步播放视频内容对应的音频内容。
示例性的，当电子设备100为前述实施例中的第一设备时，如果第一设备显示视频内容，那么第一设备的音频模块170可以同步播放该视频内容对应的音频内容。如果第一设备将视频内容投屏到第二设备上，那么第一设备的音频模块170可以不播放该视频内容对应的音频内容。
示例性的,当电子设备100为前述实施例中的第二设备时,如果第二设备显示视频内容,那么第二设备的音频模块170可以同步播放该视频内容对应的音频内容。如果第二设备在投屏场景中显示第一设备投屏的视频内容,那么第二设备的音频模块170可以同步播放投屏的视频内容对应的音频内容。
在一些实施例中，基于图6所示的电子设备100实现本申请实施例中的多窗口投屏方法时，当电子设备100是前述实施例中的第一设备和/或第二设备时，以第一设备向第二设备进行投屏为例。第一设备的处理器110通过运行存储在内部存储器121的指令，可以根据投屏参数实时调整每个应用对应的码率，并以对应的码率将投屏应用的投屏数据通过天线1，天线2，移动通信模块150，无线通信模块160等向第二设备发送。第二设备通过天线1，天线2，移动通信模块150，无线通信模块160等接收到投屏数据。第二设备通过显示屏194显示投屏数据(例如：视频内容)，并通过音频模块170同步播放视频内容对应的音频内容。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图7是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层，每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中，将Android系统分为四层，从上至下分别为应用程序层，应用程序框架层，安卓运行时(Android runtime)和系统库，以及内核层。
应用程序层可以包括一系列应用程序包。
如图7所示,应用程序包可以包括短消息类应用、视频类应用、办公类应用、游戏应用、生活类应用、购物类应用或功能类应用等。具体的,例如,相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图7所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android Runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL),投屏模块等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
投屏模块用于电子设备100向其他设备进行投屏,或者用于接收其他设备的投屏数据。投屏模块可以实现投屏应用的选择、投屏数据的发送、投屏数据的接收以及投屏数据的显示等。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动等,本申请实施例对此不做任何限制。
以下结合上述图4所示应用场景，将以第一设备向第二设备进行多窗口投屏为例，对本申请实施例提供的多窗口投屏方法进行说明。其中下述第一设备、第二设备可以具备如图6所示的部件以及图7所示的应用程序框架层。图8为本申请实施例示出的多窗口投屏方法的流程示意图，如图8所示，该方法可以包括如下步骤S101-S105。
S101、第一设备获取第一投屏参数。
其中第一设备可以为图4中的第一设备，该第一设备可以具备如图7所示的软件结构，第一设备中的应用程序包可以包括多个应用，比如短消息类应用、视频类应用、办公类应用、游戏应用、生活类应用、购物类应用或功能类应用等。第一设备可以通过其显示屏/用户交互界面接收用户对多个应用的投屏操作，响应于该投屏操作，通过处理器调用应用程序包中的该多个应用，并将多个应用以初始码率投屏给第二设备。
需要说明的是,该初始码率组可以包括与多个应用对应的多个初始码率,一个应用对应一个初始码率,该初始码率可以是预设码率。
在本申请实施例中，第一投屏参数可以用于表征将多个应用的投屏数据投屏到多个投屏窗口(比如第二设备的多个投屏窗口)的投屏特征，或者理解为第一投屏参数用于表征将多个应用的投屏数据以初始码率投屏到多个投屏窗口的投屏特征。其中，本申请所述的多个应用包括但不限于：短消息类应用、视频类应用、办公类应用、游戏应用、生活类应用、购物类应用或功能类应用等。投屏数据包括但不限于：图像数据、音频数据或文档数据等。
在第二设备的多个投屏窗口中,每一个投屏窗口可以显示一个应用的画面。示例性的,如图4所示,第二设备的显示界面中包括3个投屏窗口,3个投屏窗口可以分别显示短消息应用、视频应用和游戏应用的界面。这样,可以满足用户的多应用使用需求,提高用户的使用体验。
示例性的,投屏窗口的投屏特征可以包括但不限于:投屏传输通道的网络状态特征、用户对投屏窗口展示图像的体验特征以及投屏窗口的状态特征等。上述投屏特征会影响每个应用的投屏效果,因此,可以根据上述投屏特征确定每个应用的码率。
为了将上述投屏特征以量化的方式表示出来，在一些实施例中，第一投屏参数可以包括但不限于下述至少一项参数：网络状态信息、用户体验信息和窗口状态信息。其中，网络状态信息用于表征多个应用在第一设备向第二设备投屏时所用的传输信道的状态，比如参考信号接收功率(reference signal receiving power,RSRP)等。用户体验信息用于表征用户对第二设备中投屏窗口展示每个应用的投屏图像的质量体验，比如通过图像质量评价方法对投屏图像的评分。窗口状态信息用于表征第二设备中每个投屏窗口的实时状态信息，比如投屏窗口展示图像的复杂度、用户的操作频率等。这样，第一设备基于网络状态信息、用户体验信息和窗口状态信息，可以合理地分配每个应用的码率，以保证每个应用对应的投屏窗口显示的图像清晰和流畅。
具体的，以第一投屏参数包括网络状态信息、用户体验信息和窗口状态信息为例，第一设备可以通过下述方式获取第一投屏参数。其中，网络状态信息是网络状态的实际情况表征，其获取方式和所采用的传输协议相关。网络状态信息可以通过历史传输情况进行分析，或者可以通过发送测试数据进行测试，又或者通过部分私有化、定制化的网络传输协议(比如用户手动定义一个协议)，提供一个面向应用层的接口，通过该接口可获取网络状态信息。用户体验信息可以根据已知的先验知识获得。窗口状态信息可以通过第一设备的屏幕或者第一设备后台运行的虚拟显示器(virtual display)中获取投屏数据(例如视频流)，通过对投屏数据的内容进行分析，获取详细的窗口状态信息。
S102、第一设备根据第一投屏参数,确定第一组码率。
在本申请实施例中,第一组码率包括多个应用中每个应用的第一码率,也即每个应用对应的投屏窗口的第一码率。第一组码率可以用于表征基于第一投屏参数对每个应用(即每个应用对应的投屏窗口)的码率分配情况。
在一些实施例中,以上述第一投屏参数包括:网络状态信息、用户体验信息和窗口状态信息为例,示例性的说明第一设备根据第一投屏参数,确定第一组码率的方法。
第一设备根据网络状态信息确定可用于多个应用投屏的总码率,根据窗口状态信息确定每个应用的码率分配权重系数,根据用户体验信息确定每个应用对应的质量体验评分。第一设备根据总码率、每个应用的码率分配权重系数和每个应用对应的质量体验评分确定第一组码率。具体的,在满足所有应用对应的第一码率的总和等于总码率的约束条件下,根据应用的码率分配权重系数以及应用对应的质量体验评分,确定每个应用的第一码率。
示例性的,在约束条件为所有应用对应的第一码率的总和等于总码率,即满足以下表达式的情况下:∑An=B;其中,An为分配给第n个应用的码率,B为总码率。使每个应用的码率分配权重系数和质量体验评分乘积的总和最大,即求解方程:
max ∑Kn×Sn
其中,Kn为第n个应用的码率分配权重系数,Sn为第n个应用的质量体验评分。
其中,每个应用的码率和该应用对应的质量体验评分具有正相关性。应用的码率越高,则该应用对应的质量体验评分越高。应用的码率越低,则该应用对应的质量体验评分越低。具体的,应用的第一码率与对应的质量体验评分的关系,与质量体验评分选取的图像质量评价方法有关,下文将结合选取的图像质量评价方法对应用的第一码率与对应的质量体验评分的关系进行示例性说明。
在一些实施例中,上述网络状态信息包括网络带宽参数和信道干扰参数,网络带宽参数用于表征传输信道的带宽,信道干扰参数用于表征传输信道的干扰情况。则第一设备根据网络状态信息确定可用于多个应用投屏的总码率的方法为:将网络带宽参数与信道干扰参数进行乘积,得到总码率。
具体的,网络带宽参数为第一设备向第二设备投屏时,投屏数据的传输信道的带宽。带宽指传输信道在单位时间内(例如1秒钟内)能传输的数据量。网络带宽参数越高,传输信道在单位时间内能传输的数据量就越大,可用于多个应用投屏的总码率也就越高。
投屏数据在传输通道中传输时,可能会受到由其他一路或多路信道中的信号所导致的干扰,从而影响到可用的总码率。因此,第一设备还需要获取信道干扰参数。示例性的,信道干扰参数包括:邻道干扰参数和同信道干扰参数,邻道干扰参数用于表征相邻或邻近传输信道之间的干扰,相邻或邻近传输信道距离越近,邻道干扰越大。同信道干扰参数用于表征相同频率的无用信号对传输信道形成的干扰,称为同信道干扰,也称为同频干扰或同道干扰。
在一些实施例中,网络带宽参数和信道干扰参数为通过时间序列预测法,对历史时刻网络带宽参数和历史时刻信道干扰参数预测得到。
具体的,首先,第一设备获取历史p个时刻的网络带宽参数和信道干扰参数的集合为{[F0,G0],[F1,G1]…[Fp-1,Gp-1]},其中,Fn为第n个时刻的网络带宽参数,Gn为第n个时刻的信道干扰参数。然后,第一设备通过时间序列预测法,基于历史p个时刻的网络带宽参数和信道干扰参数,来预测未来一段时间内的网络带宽参数和信道干扰参数。
示例性的,第一设备可利用循环神经网络(recurrent neural network,RNN)预测未来一段时间内的网络带宽参数和信道干扰参数。RNN是一类以序列(sequence)数据为输入,在序列的演进方向进行递归(recursion)且所有节点(循环单元)按链式连接的递归神经网络。RNN具有记忆性、参数共享并且图灵完备(turing completeness)的特点,在对时间序列进行学习时具有一定优势,可以应用于对各类时间序列的预测中。因此,第一设备基于历史p个时刻的网络带宽参数和信道干扰参数的集合,通过RNN可以更准确地预测未来一段时间内的网络带宽参数和信道干扰参数。
可选的,在一些实施方式中,第一设备获取历史p个时刻的网络带宽参数和信道干扰参数的集合后,还可以通过平均值法或加权平均值法,预测未来一段时间内的网络带宽参数和信道干扰参数。具体的,第一设备对历史p个时刻的网络带宽参数计算平均值,得到预测的网络带宽参数。第一设备对历史p个时刻的信道干扰参数计算平均值,得到预测的信道干扰参数。或者,根据历史p个时刻实际的网络状态,对p个时刻的网络带宽参数和信道干扰参数进行加权。例如,距离当前时刻越近的时刻对应的网络带宽参数和信道干扰参数的权重越大,距离当前时刻越远的时刻对应的网络带宽参数和信道干扰参数的权重越小。然后,第一设备对历史p个时刻的网络带宽参数和信道干扰参数计算加权平均值,得到预测的网络带宽参数和信道干扰参数。这样,可以使预测的网络带宽参数和信道干扰参数更加真实地反映出未来一段时间的网络状态信息。
在一些实施例中,上述用户体验信息包括每个应用对应的用户体验评分,应用对应的用户体验评分为用户对应用的投屏图像的质量体验评分。
具体的，用户体验评分表示用户对投屏图像的质量的用户体验，反映了图像质量评价方法与用户体验之间的关系。投屏图像的质量越高，用户体验越好。相反，投屏图像的质量越低，用户体验越差。但是，用户体验通常难以量化表征，因此，需要确定不同的图像质量评价方法对于用户体验的影响，再结合不同的图像质量评价方法的评分确定用户体验评分。
在一些实施例中,用户体验评分根据图像质量评价评分和对应的预设图像质量评价权重系数得到;用户体验评分为图像质量评价评分与预设图像质量评价权重系数的乘积。
具体的，图像质量评价评分根据选取的图像质量评价方法对投屏图像的质量进行评分。预设图像质量评价权重系数表征了不同的图像质量评价方法对于用户体验的影响，预设图像质量评价权重系数可以基于实验人员对大量图像数据进行主观评价，根据主观评价的统计结果结合各图像质量评价方法的客观评分，确定各图像质量评价方法的权重系数，即为预设图像质量评价权重系数，这样可以准确地反映出不同的图像质量评价方法对于用户体验的影响。
可选的,在一些实施例中,上述图像质量评价评分包括以下的一个或多个评分:PSNR评分,SSIM评分或MSE评分。
具体的,图像质量评价方法可以包括但不限于:PSNR法、SSIM法或MSE法等。通过上述图像质量评价方法可以分别得到PSNR评分、SSIM评分和MSE评分。
其中,PSNR法为峰值信噪比评价法,是一种用于衡量图像失真程度或是噪声水平的方法。PSNR是一个表示信号最大可能功率和影响它的表示精度的破坏性噪声功率的比值,用分贝单位来表示。通常,原始图像在经过压缩等操作得到输出图像,输出图像会在某种程度与原始图像不同。为了衡量经过处理后的图像品质,可以通过PSNR法对输出图像的图像质量进行评价。具体的,PSNR评分数值越大,则输出图像失真越小,图像质量越高。
SSIM法为结构相似度评价法,是一种用于衡量两幅图像相似度,或用来判断图像压缩后质量的方法。SSIM从图像组成的角度将结构信息定义为独立于亮度、对比度的,反映场景中物体结构的属性,并将失真建模为亮度、对比度和结构三个不同因素的组合。用均值作为亮度的估计,标准差作为对比度的估计,协方差作为结构相似程度的度量。具体的,SSIM评分数值越大,则图像质量越高。
MSE法为均方误差评价法,是一种可以用于衡量图像失真程度的方法。MSE法首先计算原始图像和输出图像像素差值的均方值,然后通过均方值的大小来确定输出图像的失真程度。具体的,MSE评分数值越小,则输出图像失真越小,图像质量越高。
在一些实施例中,可以选取多个图像质量评价方法共同对投屏图像的质量进行评价。选取的图像质量评价方法越多,用户体验评分就可以更真实地反映出用户对投屏图像质量的用户体验。
下面,以选取PSNR法、SSIM法和MSE法对投屏图像的质量进行评价为例,示例性的说明用户体验评分的具体计算方法。
具体的,分别通过PSNR法、SSIM法和MSE法对投屏图像的质量进行评价,得到PSNR评分、SSIM评分和MSE评分。然后通过先验知识得到PSNR法、SSIM法和MSE法对应的预设图像质量评价权重系数。则用户体验评分满足以下表达式:
Sn=wpsnr,n×PSNRn+wmse,n×MSEn+wssim,n×SSIMn
其中，Sn为第n个应用的用户体验评分，PSNRn为第n个应用的PSNR评分，MSEn为第n个应用的MSE评分，SSIMn为第n个应用的SSIM评分，wpsnr,n为第n个应用的PSNR评分的预设图像质量评价权重系数，wmse,n为第n个应用的MSE评分的预设图像质量评价权重系数，wssim,n为第n个应用的SSIM评分的预设图像质量评价权重系数。
具体的，MSEn评分可通过以下表达式计算得到：
MSE=∑i,j‖Targetij−Oriij‖²；
其中,Ori和Target分别为原始图像和编码后的图像,i、j为图像中像素点的横纵坐标。
Target满足以下表达式:
Targetij=f(Oriij,A);
其中,f(·)表示编码器基于码率A对图像编码的过程。
PSNRn评分可通过以下表达式计算得到：
PSNR=10×log10(L²/MSE)；
SSIMn评分可通过以下表达式计算得到：
SSIM=((2μtargetμori+c1)(2σtarget,ori+c2))/((μtarget²+μori²+c1)(σtarget²+σori²+c2))；
其中，μtarget为target的平均值，μori为ori的平均值，σtarget²为target的方差，σori²为ori的方差，σtarget,ori为target与ori的协方差，c1=(0.01L)²，c2=(0.03L)²是用来维持稳定的常数，L是像素值的动态范围。
结合上文所述的根据应用的码率分配权重系数以及应用对应的质量体验评分，确定每个应用的第一码率的方法。以选取PSNR法、SSIM法和MSE法对投屏图像的质量进行图像质量评价为例，则第一码率A与对应的质量体验评分的关系的表达式为：
Sn(A)=wpsnr,n×PSNRn(A)+wmse,n×MSEn(A)+wssim,n×SSIMn(A)；
其中，
MSE=∑i,j‖f(Oriij,A)−Oriij‖²。
可见,每个应用的第一码率和该应用对应的质量体验评分具有正相关性。即应用的第一码率越高,则该应用对应的质量体验评分越高。应用的第一码率越低,则该应用对应的质量体验评分越低。
因此,上文所述的求解方程:
max ∑Kn×Sn；
可以转换为在∑An=B的约束下求解下述方程：
max ∑Kn×Sn(An)；
即可得到每个应用的第一码率。
在一些实施例中,上述窗口状态信息包括每个应用的码率分配权重系数。一个应用的码率分配权重系数可以用于表征为该应用分配的码率的比例。
具体的,窗口状态信息用于表征第二设备中每个投屏窗口的实时状态。例如,投屏窗口的实时状态包括:投屏窗口对应应用的图像复杂度、用户对该投屏窗口的操作频率(即用户在该投屏窗口投入的注意力情况)以及该投屏窗口预设的优先级(权重系数)等。基于上述每个投屏窗口的实时状态可以确定每个投屏窗口对应应用的码率分配权重系数。
为了将上述投屏窗口的实时状态以量化的方式表示出来,以确定码率分配权重系数。在一些实施例中,第一设备获取用户注意力权重系数、图像复杂度权重系数和预设权重系数,码率分配权重系数即为用户注意力权重系数、图像复杂度权重系数和预设权重系数的乘积。
其中,用户注意力权重系数为用户在该投屏窗口投入的注意力情况,可以通过用户对该投屏窗口历史的操作频率来体现。因此,用户注意力权重系数可以根据历史时间段内对窗口界面的操作频率和对应的预设频率权重系数得到。
在一些实施例中,历史时间段内对窗口界面的操作频率包括:第一历史时间段内对窗口界面的第一操作频率,第二历史时间段内对窗口界面的第二操作频率,其中,第一历史时间段大于第二历史时间段。用户注意力权重系数为第一操作频率与第一频率预设权重系数的乘积、以及第二操作频率与第二频率预设权重系数的乘积之和。
可选的,为了更真实地反映出用户对投屏窗口投入的注意力情况,第一历史时间段可以远远大于第二历史时间段,例如,第一历史时间段为1分钟内,第二历史时间段为0.1秒内。
预设频率权重系数包括：与第一操作频率对应的第一频率预设权重系数，与第二操作频率对应的第二频率预设权重系数。预设频率权重系数可以由用户根据第一操作频率和第二操作频率的重要性进行设置。例如，如果第一历史时间段内的第一操作频率较高，但是，第一操作频率为对游戏应用进行操作的频率，对于用户来说，对游戏应用进行操作的重要性较低。因此，对于第一操作频率，用户可以设置较低的权重系数。如果第二历史时间段内的第二操作频率较低，但是，第二操作频率为对办公应用进行操作的频率，对于用户来说，对办公应用进行操作的重要性较高。因此，对于第二操作频率，用户可以设置较高的权重系数。这样，根据预设频率权重系数确定的用户注意力权重系数，可以更准确地反映出用户对投屏窗口的注意力情况。
可选的,在一些实施例中,还可以基于两个以上的历史时间段内对窗口界面的操作频率,以及对应预设权重系数确定用户注意力权重系数。选取的历史时间段越多,就越能真实地反映出用户对投屏窗口投入的注意力情况。
在一些实施例中,上述图像复杂度权重系数可根据窗口界面的历史多帧图像的复杂度,以及每帧图像对应的预设复杂度权重系数得到;图像复杂度权重系数为历史多帧图像的复杂度和每帧图像对应的预设复杂度权重系数的乘积。
具体的，图像复杂度权重系数由投屏窗口历史多帧图像的复杂度以及对应的权重系数确定。其中，图像的复杂度包括但不限于：图像的颜色(red green blue,RGB)复杂度、图像的图形复杂度等。预设复杂度权重系数可以由用户根据历史每帧图像的重要性而确定。例如，图9(a)为本申请实施例示出的一种图像复杂度的示意图，如图9(a)所示，如果第一帧图像的图像复杂度较高，但是第一帧图像为背景图片，没有有效的文字信息，对于用户来说，第一帧图像的重要性较低。因此，对于第一帧图像，用户可以设置较低的复杂度权重系数。图9(b)为本申请实施例示出的另一种图像复杂度的示意图，如图9(b)所示，如果第二帧图像的图像复杂度较低，但是第二帧图像包括了重要的文字信息，对于用户来说，第二帧图像的重要性较高。因此，对于第二帧图像，用户可以设置较高的复杂度权重系数。这样，根据历史多帧图像的复杂度和对应的预设复杂度权重系数确定的图像复杂度权重系数，可以更准确地反映出投屏窗口的图像复杂度。
在一些实施例中,上述预设权重系数根据窗口启动的应用权重系数和预设窗口权重系数得到;预设权重系数为窗口启动的应用权重系数和预设窗口权重系数的乘积。
具体的,窗口启动的应用权重系数可由用户进行预设。例如,用户可根据窗口启动的应用的重要性设置窗口启动的应用权重系数。示例性的,窗口可启动的应用包括:短消息应用、视频应用和办公应用。如果当前用户需要通过视频应用观看视频,期间会通过短消息应用收发消息,但不会使用办公应用进行文档编辑,应用的重要性从高到低依次为:视频应用、短消息应用和办公应用。因此,用户设置的窗口启动的应用权重系数从高到低依次为:视频应用、短消息应用和办公应用。这样,通过窗口启动的应用权重系数可以真实地反映出当前用户更想要使用哪一类应用。
预设窗口权重系数同样可由用户进行预设。例如,可以根据投屏窗口的面积大小,或者投屏窗口显示的先后位置设置预设窗口权重系数。示例性的,图10为本申请实施例示出的一种多个投屏窗口显示效果的示意图;如图10所示,第二电子设备包括:第一投屏窗口、第二投屏窗口和第三投屏窗口。三个投屏窗口的面积从大到小依次为:第一投屏窗口、第二投屏窗口、第三投屏窗口。第一投屏窗口的面积最大,可以用于显示重要性高的应用的图像。第三投屏窗口的面积最小,可以用于显示重要性较低的应用的图像。因此,用户设置的预设窗口权重系数从高到低依次为:第一投屏窗口、第二投屏窗口、第三投屏窗口。这样,通过预设窗口权重系数可以真实地反映出当前投屏窗口的重要性。
S103、第一设备将多个应用的投屏数据以第一组码率发送至第二设备。
在本申请实施例中,第一设备将每一个应用的投屏数据以对应的第一码率发送至第二设备。其中,投屏数据至少包括:图像数据。可选的,根据应用类别的不同,投屏数据还可以包括:音频数据、文档数据等。
可选的,在一些实施例中,在S103之前,还包括:第一设备根据第一组码率将多个应用的投屏数据进行编码压缩。
具体的,第一设备将每个应用的原始投屏数据进行编码压缩,得到每个应用的编码后投屏数据,以使每个应用的编码后投屏数据在数据传输时满足第一码率,即小于或者等于第一码率。这样,可以使每个应用的投屏数据,在满足应用对应的第一码率的情况下,成功传输至第二设备。
以投屏数据为视频数据为例，对投屏数据进行编码压缩，即为视频编码。视频编码是指通过特定的压缩技术，将某个视频格式的文件转换成另一种视频格式文件的方式。例如，视频编码可以采用H.261、H.263、H.263+、H.263++、H.264、H.265、MPEG-1、MPEG-2或MPEG-4等标准。视频解码是视频编码的逆向过程。关于不同视频编码的标准、视频编码的具体过程以及视频解码的具体过程的介绍，可以参考常规技术中的解释和说明，本申请不做赘述。
S104、第二设备接收多个应用的投屏数据,分别通过不同的投屏窗口显示多个应用的投屏数据。
在本申请实施例中,第二设备将接收到的多个应用的投屏数据,分别通过不同的投屏窗口显示出来,以实现多个应用的多窗口投屏。
可选的，在一些实施例中，如果第二设备接收到的是编码后的投屏数据，则还需要对编码后的投屏数据进行解码，得到原始投屏数据，即每个应用未编码压缩的投屏数据，再将多个应用的原始投屏数据通过不同的投屏窗口显示。
S105、第一设备获取第二投屏参数,根据第二投屏参数,以第二组码率将多个应用的投屏数据投屏到多个投屏窗口,第二组码率包括多个应用中每个应用的第二码率。
在本申请实施例中，第一设备还可以获取第二投屏参数，并根据上文S102中的方法确定第二组码率，将多个应用的投屏数据以第二组码率投屏到多个投屏窗口中。这样，第一设备可以实现根据投屏参数，自适应调整每个应用(即对应的投屏窗口)的码率，并将每个应用以调整后的码率发送至第二设备。从而保证每个应用对应的投屏窗口显示画面的清晰度和流畅度，改善投屏过程中投屏窗口显示画面容易出现卡顿和跳帧等问题，提高用户的使用体验。
上述S105为可选步骤,如果在执行完S104后,第一设备向第二设备投屏结束,则不执行S105。如果第一设备向第二设备投屏未结束,则执行S105,以实现第一设备根据投屏参数,自适应调整每个应用(即对应的投屏窗口)的码率。
在一些实施例中，在S105中，第一设备还可以以一定频率周期性地获取第二投屏参数，以实现对每个应用的码率进行实时调整。具体的，第一设备获取第二投屏参数的频率越高，每个应用的码率调整更新得越快，可以更进一步保证每个应用对应的投屏窗口显示内容的清晰度和流畅度。
上述主要从各个节点之间交互的角度对本申请实施例提供的方案进行了介绍。可以理解的是,各个节点,例如第一设备、第二设备等为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对第一设备、第二设备等进行功能模块的划分，例如，可以对应各个功能划分各个功能模块，也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现，也可以采用软件功能模块的形式实现。需要说明的是，本申请实施例中对模块的划分是示意性的，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式。
图11示出了一种电子设备300的结构图,该电子设备300可以为第一设备,或者第一设备中的芯片,或者片上系统,该电子设备300可以用于执行上述实施例中涉及的第一设备的功能。
作为一种可实现方式,图11所示的电子设备300包括:获取模块310和处理模块320。其中,获取模块310用于获取第一投屏参数,处理模块320用于根据第一投屏参数,以第一组码率将多个应用的投屏数据投屏到多个投屏窗口;所述第一组码率包括所述多个应用中每个应用的第一码率。获取模块310还用于获取第二投屏参数;处理模块320还用于根据第二投屏参数,以第二组码率将多个应用的投屏数据投屏到多个投屏窗口,第二组码率包括多个应用中每个应用的第二码率。
在一些实施例中,第一投屏参数包括:网络状态信息、用户体验信息和窗口状态信息。则获取模块310还用于:获取网络状态信息、用户体验信息和窗口状态信息。其中,网络状态信息用于表征所述多个应用在投屏时所用的传输信道的状态,用户体验信息用于表征用户对每个应用的投屏图像的质量体验,窗口状态信息用于表征每个投屏窗口的实时状态。
可选的，在一些实施方式中，上述获取模块310所执行的方法可以由多个模块分别完成。例如，电子设备300包括：网络监控模块、用户体验分析报告构建模块和窗口监控模块，网络监控模块用于获取网络状态信息，用户体验分析报告构建模块用于获取用户体验信息，窗口监控模块用于获取窗口状态信息。上述各模块的名称仅为示例性说明，只要各模块能够执行对应的方法即可，本申请实施例对此不做限定。
可以理解的是,上述电子设备300还可以包括如图6所示部件,此时,上述电子设备300中的收发动作可以由图6中天线1,天线2,移动通信模块150,无线通信模块160等部件执行,具体处理动作可以由图6中的处理器110执行。
本申请实施例还提供一种电子设备,该电子设备可以包括一个或者多个处理器、存储器和通信接口。其中,存储器、通信接口与处理器耦合。例如,存储器、通信接口与处理器可以通过总线耦合在一起。
其中,通信接口用于与其他设备进行数据传输。存储器中存储有计算机程序代码。计算机程序代码包括计算机指令,当计算机指令被处理器执行时,使得电子设备执行本申请实施例中的多窗口投屏方法。
可以理解的是，上述电子设备还可以包括如图6所示部件。例如，处理器可以为图6中的处理器110，存储器可以为图6中的内部存储器121或者通过外部存储器接口120连接的外部存储器，通信接口可以为图6中的USB接口。
本申请实施例还提供一种多窗口投屏系统，该系统包括第一设备和第二设备；其中，第一设备，用于获取第一投屏参数；根据所述第一投屏参数，以第一组码率将多个应用的投屏数据投屏到第二设备的多个投屏窗口；第一组码率包括多个应用中每个应用的第一码率；获取第二投屏参数，根据第二投屏参数，以第二组码率将所述多个应用的投屏数据投屏到第二设备的多个投屏窗口，第二组码率包括多个应用中每个应用的第二码率。第二设备，用于将多个应用的投屏数据分别通过一个投屏窗口进行显示。
本申请实施例还提供一种计算机可读存储介质,该计算机存储介质中存储有计算机程序代码,当上述处理器执行该计算机程序代码时,电子设备执行上述方法实施例中多窗口投屏方法的相关步骤。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述方法实施例中多窗口投屏方法的相关步骤。
其中,本申请提供的电子设备、多窗口投屏系统、计算机存储介质或者计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时，可以存储在一个可读取存储介质中。基于这样的理解，本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来，该软件产品存储在一个存储介质中，包括若干指令用以使得一个设备(可以是单片机，芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括：U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (21)

  1. 一种多窗口投屏方法,其特征在于,包括:
    获取第一投屏参数;所述第一投屏参数用于表征将多个应用的投屏数据投屏到所述多个投屏窗口的投屏特征;
    根据所述第一投屏参数,以第一组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口;所述第一组码率包括所述多个应用中每个应用的第一码率;
    获取第二投屏参数,根据所述第二投屏参数,以第二组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口,所述第二组码率包括所述多个应用中每个应用的第二码率。
  2. 根据权利要求1所述的方法,其特征在于,所述第一投屏参数包括下述至少一项参数:网络状态信息、用户体验信息和窗口状态信息;
    所述网络状态信息用于表征所述多个应用在投屏时所用的传输信道的状态;所述用户体验信息用于表征用户对每个应用的投屏图像的质量体验;所述窗口状态信息用于表征每个投屏窗口的实时状态。
  3. 根据权利要求2所述的方法,其特征在于,所述方法还包括:
    根据所述网络状态信息确定可用于所述多个应用投屏的总码率;
    根据所述窗口状态信息确定每个应用的码率分配权重系数;
    根据所述用户体验信息确定每个应用对应的质量体验评分;
    在满足约束条件的情况下,对于任一应用,根据所述应用的码率分配权重系数以及所述应用对应的质量体验评分,确定所述应用的第一码率,其中所述约束条件包括多个所述应用对应的第一码率的总和等于所述总码率。
  4. 根据权利要求3所述的方法,其特征在于,
    所述网络状态信息包括网络带宽参数和信道干扰参数,所述网络带宽参数用于表征所述传输信道的带宽,所述信道干扰参数用于表征所述传输信道的干扰情况;
    所述总码率为所述网络带宽参数与所述信道干扰参数的乘积。
  5. 根据权利要求4所述的方法,其特征在于,所述网络带宽参数和所述信道干扰参数为通过时间序列预测法,对历史时刻网络带宽参数和历史时刻信道干扰参数预测得到。
  6. 根据权利要求2-5任一项所述的方法,其特征在于,所述用户体验信息包括每个应用对应的用户体验评分,所述应用对应的用户体验评分为用户对所述应用的投屏图像的质量体验评分。
  7. 根据权利要求6所述的方法,其特征在于,所述用户体验评分根据图像质量评价评分和对应的预设图像质量评价权重系数得到。
  8. 根据权利要求6或7所述的方法,其特征在于,所述图像质量评价评分包括以下的一个或多个评分:峰值信噪比PSNR评分,结构相似度SSIM评分或均方误差MSE评分。
  9. 根据权利要求2-8任一项所述的方法,其特征在于,所述窗口状态信息包括每个应用的码率分配权重系数。
  10. 根据权利要求9所述的方法,其特征在于,所述码率分配权重系数根据下述至少一项系数得到:用户注意力权重系数、图像复杂度权重系数和预设权重系数。
  11. 根据权利要求10所述的方法,其特征在于,所述码率分配权重系数为所述用户注意力权重系数、所述图像复杂度权重系数和所述预设权重系数的乘积。
  12. 根据权利要求9或10所述的方法,其特征在于,所述用户注意力权重系数根据历史时间段内对所述窗口界面的操作频率和对应的预设频率权重系数得到。
  13. 根据权利要求12所述的方法,其特征在于,所述历史时间段内对所述窗口界面的操作频率包括:第一历史时间段内对所述窗口界面的第一操作频率,第二历史时间段内对所述窗口界面的第二操作频率,其中,所述第一历史时间段大于所述第二历史时间段;
    所述预设频率权重系数包括:与所述第一操作频率对应的第一频率预设权重系数,与所述第二操作频率对应的第二频率预设权重系数;
    所述用户注意力权重系数为所述第一操作频率与所述第一频率预设权重系数的乘积、以及所述第二操作频率与所述第二频率预设权重系数的乘积之和。
  14. 根据权利要求10-13任一项所述的方法,其特征在于,所述图像复杂度权重系数根据所述窗口界面的历史多帧图像的复杂度,以及每帧图像对应的预设复杂度权重系数得到。
  15. 根据权利要求10-14任一项所述的方法,其特征在于,所述预设权重系数根据窗口启动的应用权重系数和预设窗口权重系数得到。
  16. 一种电子设备,其特征在于,所述设备包括:
    获取模块,用于获取第一投屏参数;所述第一投屏参数用于表征将多个应用的投屏数据投屏到所述多个投屏窗口的投屏特征;
    处理模块,用于根据所述第一投屏参数,以第一组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口;所述第一组码率包括所述多个应用中每个应用的第一码率;
    所述获取模块,还用于获取第二投屏参数;
    所述处理模块,还用于根据所述第二投屏参数,以第二组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口,所述第二组码率包括所述多个应用中每个应用的第二码率。
  17. 根据权利要求16所述的设备,其特征在于,所述第一投屏参数包括下述至少一项参数:网络状态信息、用户体验信息和窗口状态信息,所述获取模块还用于:
    获取所述网络状态信息、所述用户体验信息和所述窗口状态信息,所述网络状态信息用于表征所述多个应用在投屏时所用的传输信道的状态,所述用户体验信息用于表征用户对每个应用的投屏图像的质量体验,所述窗口状态信息用于表征每个投屏窗口的实时状态。
  18. 一种电子设备,其特征在于,包括:存储器、一个或多个处理器;所述存储器与所述处理器耦合;其中,所述存储器中存储有计算机程序代码,所述计算机程序代码包括计算机指令,当所述计算机指令被所述处理器执行时,使得所述电子设备执行如权利要求1-15任一项所述的多窗口投屏方法。
  19. 一种多窗口投屏系统,其特征在于,所述系统包括第一设备和第二设备;其中,
    所述第一设备,用于获取第一投屏参数;所述第一投屏参数用于表征将多个应用的投屏数据投屏到所述第二设备的多个投屏窗口的投屏特征;根据所述第一投屏参数,以第一组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口;所述第一组码率包括所述多个应用中每个应用的第一码率;获取第二投屏参数,根据所述第二投屏参数,以第二组码率将所述多个应用的投屏数据投屏到所述多个投屏窗口,所述第二组码率包括所述多个应用中每个应用的第二码率;
    所述第二设备,用于将所述多个应用的投屏数据分别通过一个投屏窗口进行显示。
  20. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1-15任一项所述的多窗口投屏方法。
  21. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-15任一项所述的多窗口投屏方法。
PCT/CN2023/110589 2022-08-05 2023-08-01 一种多窗口投屏方法、电子设备及系统 WO2024027718A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210938422.8 2022-08-05
CN202210938422.8A CN117560534A (zh) 2022-08-05 2022-08-05 一种多窗口投屏方法、电子设备及系统

Publications (1)

Publication Number Publication Date
WO2024027718A1 true WO2024027718A1 (zh) 2024-02-08

Family

ID=89811629

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/110589 WO2024027718A1 (zh) 2022-08-05 2023-08-01 一种多窗口投屏方法、电子设备及系统

Country Status (2)

Country Link
CN (1) CN117560534A (zh)
WO (1) WO2024027718A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109491627A (zh) * 2018-09-30 2019-03-19 广州市保伦电子有限公司 一种多应用投屏方法、多应用投屏系统和存储介质
CN110221798A (zh) * 2019-05-29 2019-09-10 华为技术有限公司 一种投屏方法、系统及相关装置
US20200310739A1 (en) * 2017-06-20 2020-10-01 Microsoft Technology Licensing, Llc Real-time screen sharing
CN112019897A (zh) * 2020-08-27 2020-12-01 北京字节跳动网络技术有限公司 投屏方法、装置、电子设备及计算机可读介质
CN112433690A (zh) * 2020-12-08 2021-03-02 努比亚技术有限公司 数据处理方法、终端及计算机可读存储介质
CN113986177A (zh) * 2021-11-05 2022-01-28 Oppo广东移动通信有限公司 投屏方法、投屏装置、存储介质与电子设备
CN114647468A (zh) * 2022-02-28 2022-06-21 深圳创维-Rgb电子有限公司 投屏图像显示方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN117560534A (zh) 2024-02-13
