WO2022089271A1 - Wireless screen projection method, mobile device, and computer-readable storage medium - Google Patents

Wireless screen projection method, mobile device, and computer-readable storage medium

Info

Publication number
WO2022089271A1
WO2022089271A1 · PCT/CN2021/124895 · CN2021124895W
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
screen projection
application
wireless screen
projection mode
Prior art date
Application number
PCT/CN2021/124895
Other languages
English (en)
French (fr)
Inventor
Lu Feng (卢峰)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21884999.0A (published as EP4227792A4)
Priority to US18/251,119 (published as US20230385008A1)
Publication of WO2022089271A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/16Use of wireless transmission of display information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Definitions

  • the present application relates to the technical field of screen projection, and in particular, to a wireless screen projection method, a mobile device, and a computer-readable storage medium.
  • Wireless screen projection enables mobile devices such as mobile phones and tablets to deliver local or network multimedia content to PCs, smart screens, and other electronic devices with audio and video playback capabilities, and play the above-mentioned multimedia data on the electronic devices.
  • Typical wireless projection includes mirror projection (such as miracast), online projection (such as DLNA), and so on.
  • In mirror screen projection, the electronic device does not need to be connected to the Internet or a local area network, but the mobile device and the electronic device must process the projected data in real time throughout the whole process: for example, the mobile device needs to encode and send the data in real time, and the electronic device needs to receive and decode it in real time. As a result, many devices participate in the processing, the delay in delivering data is large, and the screen projection effect is easily affected by the mobile device.
  • In online screen projection, the mobile device only participates in the initial transmission of the network address and does not take part in the subsequent process. The delay in delivering data is small, the projection effect is not easily affected by the mobile device, and the projection effect is good, but the electronic device is required to access the Internet or a local area network.
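  • To make the contrast concrete, the following is a minimal, hypothetical Java sketch of the two pipelines; the ScreenCapturer, Encoder, P2pChannel, and LanChannel types are illustrative assumptions introduced here and are not part of the patent or of any real API.

```java
// Hypothetical interfaces, for illustration only.
interface ScreenCapturer { byte[] captureFrame(); }
interface Encoder { byte[] encode(byte[] rawFrame); }
interface P2pChannel { void send(byte[] packet); }
interface LanChannel { void sendUrl(String url); }

class ProjectionPipelines {
    // Mirror projection: the mobile device must capture, encode, and send every frame in real time.
    static void mirrorLoop(ScreenCapturer capturer, Encoder encoder, P2pChannel channel, int frames) {
        for (int i = 0; i < frames; i++) {
            byte[] raw = capturer.captureFrame();   // screen/audio recording on the mobile device
            byte[] packet = encoder.encode(raw);    // real-time encoding on the mobile device
            channel.send(packet);                   // real-time transmission, e.g. over Wi-Fi P2P
        }
    }

    // Online projection: the mobile device only hands over the network address once.
    static void onlineHandoff(LanChannel channel, String mediaUrl) {
        channel.sendUrl(mediaUrl);                  // the electronic device then fetches the media itself
    }
}
```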
  • the smart TV (also known as the large screen)
  • For example, when a game application uses the mobile phone and the large screen to play the game together, a projection mode suited to this scenario gives low delay and a better user experience, whereas an unsuitable one leaves the audio and video unsynchronized and the user experience poor.
  • The present application provides a wireless screen projection method, a mobile device, and a computer-readable storage medium, which can automatically identify the category of the application currently in use, automatically give a suggestion, automatically prompt the user whether to change the screen projection mode, or even change it automatically, so that the new screen projection mode best suits the screen projection of the current application, thereby improving the user experience. For example, after the user mirrors the mobile phone screen to the large screen and opens a game application, playing the game on the mobile phone and the large screen together, the scene is automatically recognized, mirror projection is judged to be the most suitable mode for the game application, and the screen projection mode remains unchanged. Afterwards, when the user switches from the game application to a video application (such as Tencent Video), the scene is again automatically recognized, online projection is judged to be the most suitable screen projection mode for video applications, and the user is automatically prompted to change, or the current mirror projection mode is even changed automatically to online projection mode.
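  • As an illustration only, the decision logic described above can be sketched as a simple category-to-mode lookup. The category names, the policy table, and the class below are assumptions paraphrasing the example (games keep mirror projection, video applications prefer online projection); they are not a normative part of the claims.

```java
import java.util.Map;

class ProjectionModeAdvisor {
    enum AppCategory { GAME, VIDEO, OTHER }
    enum ProjectionMode { MIRROR, ONLINE }

    // Hypothetical policy: which projection mode is judged most suitable for each category.
    private static final Map<AppCategory, ProjectionMode> PREFERRED = Map.of(
            AppCategory.GAME, ProjectionMode.MIRROR,   // low latency, good A/V sync
            AppCategory.VIDEO, ProjectionMode.ONLINE,  // best picture quality for network video
            AppCategory.OTHER, ProjectionMode.MIRROR); // default: keep the current mirror session

    /** Returns the mode judged most suitable for the application now running in the foreground. */
    static ProjectionMode recommend(AppCategory foregroundApp) {
        return PREFERRED.get(foregroundApp);
    }

    public static void main(String[] args) {
        ProjectionMode current = ProjectionMode.MIRROR;
        ProjectionMode suggested = recommend(AppCategory.VIDEO);
        if (suggested != current) {
            // In the described scheme this would trigger the first prompt information,
            // or an automatic switch of the wireless screen projection mode.
            System.out.println("Prompt user: switch " + current + " -> " + suggested + "?");
        }
    }
}
```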
  • The present application provides a mobile device. The mobile device runs a first application in the foreground and wirelessly projects a screen to an electronic device in a first wireless screen projection mode. The mobile device includes a processor, a memory, and a computer program, where the computer program is stored in the memory, and when the computer program is executed by the processor, the mobile device performs the following steps: after detecting that the first application belongs to the first type of application, the mobile device automatically outputs first prompt information, where the first prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • In this way, the mobile device can automatically identify the category of the application running in the foreground and automatically prompt the user whether to change the screen projection mode, or change it automatically, so that the new screen projection mode best suits the screen projection of the current application; this takes into account the requirements of different applications and the characteristics of mirror projection and online projection, improving the user experience.
  • the specific execution subject is the operating system currently running on the mobile device, or a default system-level application on the mobile device (for example, a system-level application that is started after the mobile device is powered on).
  • the third solution of the first aspect is schematically illustrated.
  • For example, after detecting that the first application belongs to the first type of application, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode, and the mobile device uses the second wireless screen projection mode to wirelessly project the screen to at least one of electronic device 1, electronic device 2, and electronic device 3.
  • After the mobile device automatically outputs the first prompt information, the mobile device further performs the following steps: detecting a first user input, where the first user input is used to switch the first wireless screen projection mode to the second wireless screen projection mode; in response to the first user input, the mobile device automatically outputs the identifiers of one or more electronic devices, where these electronic devices are the electronic devices detected by the mobile device that support the second wireless screen projection mode; detecting a second user input, where the second user input is used to select the identifier of one electronic device from the identifiers of the electronic devices; and, in response to the second user input, the mobile device switches the first wireless screen projection mode to the second wireless screen projection mode and uses the second wireless screen projection mode to project the screen to the selected electronic device.
  • Because the mobile device outputs the prompt information, the user is given an opportunity to choose whether to switch the screen projection mode, and to select the new screen projection mode and the electronic device that receives the screen projection. The user can choose according to the prompts.
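  • A minimal console sketch of this two-step interaction is given below, assuming a stubbed device list and plain text input; the class name, the hard-coded device names, and the console prompts are illustrative assumptions rather than part of the described user interface.

```java
import java.util.List;
import java.util.Scanner;

class SwitchPromptFlow {
    public static void main(String[] args) {
        // Devices detected as supporting the second wireless screen projection mode (stubbed here).
        List<String> sinks = List.of("Living-room smart screen", "Bedroom TV");
        Scanner in = new Scanner(System.in);

        System.out.println("Switch from mirror projection to online projection? (y/n)");
        if (!in.nextLine().trim().equalsIgnoreCase("y")) return;   // first user input

        for (int i = 0; i < sinks.size(); i++) {                   // output identifiers of candidate devices
            System.out.println("  [" + i + "] " + sinks.get(i));
        }
        System.out.println("Select a device:");
        int choice = Integer.parseInt(in.nextLine().trim());       // second user input

        System.out.println("Switching to online projection on: " + sinks.get(choice));
    }
}
```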
  • The mobile device further performs the following steps: after detecting that a second application belonging to the second type of application is started, or after detecting that the second application belonging to the second type of application is switched to the application running in the foreground, the mobile device automatically outputs second prompt information, where the second prompt information is used to prompt to switch the second wireless screen projection mode to the first wireless screen projection mode; or, the second wireless screen projection mode is switched to the first wireless screen projection mode, and the screen is wirelessly projected to the electronic device in the first wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the change and automatically prompts the user whether to change the screen projection mode, or changes it automatically, so that the new screen projection mode best suits the current application, improving the user experience.
  • The mobile device further performs the following steps: after detecting that the first application is a third-type application, the first application of the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the first application of the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the situation and automatically prompts the user whether to change the screen projection mode, or changes it automatically, so that the new screen projection mode best suits the screen projection of the current application, improving the user experience.
  • The mobile device further performs the following steps: after detecting that the first application belonging to the first type of application is started, or after detecting that the first application belonging to the first type of application is switched to the application running in the foreground, the mobile device automatically outputs third prompt information, where the third prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the situation and automatically prompts the user whether to change the screen projection mode back, or changes it back automatically, so that the restored screen projection mode best suits the screen projection of the current application, improving the user experience.
  • After the mobile device automatically outputs the first prompt information, the mobile device further performs the following steps: detecting a first user input, where the first user input is used to switch the first wireless screen projection mode to the second wireless screen projection mode; in response to the first user input, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode. In this way, another solution for changing the screen projection mode is provided, which can also improve the user experience.
  • After detecting that the first application belongs to the first type of application, the mobile device further performs the following steps: after detecting that the mobile device plays a network video through the first application, the mobile device automatically outputs the first prompt information, where the first prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • In this way, the mobile device also automatically prompts the user whether to change the projection mode, or changes it automatically, so that the new projection mode best suits the screen projection of the current application; this takes into account the requirements of different applications and the characteristics of mirror projection and online projection, improving the user experience.
  • the first type of application is a non-customized video application
  • the second type of application is a game application
  • the third type of application is a customized video application
  • the first wireless screen projection mode is the mirror screen projection mode
  • the second wireless screen projection mode is online screen projection mode
  • the one or more electronic devices include the electronic device, or the one or more electronic devices do not include the electronic device
  • the input forms of the first user input and the second user input include touch input and voice input.
  • the mobile device can ensure the picture quality of the network video played by the electronic device, so as to ensure that the user gets the best screen projection experience, and can also reduce the cost.
  • After the mobile device changes the screen projection mode, it can continue to project the screen to the original electronic device, or another electronic device can take over the screen projection, so as to meet different user needs.
  • the mirroring method may be miracast formulated by the Wi-Fi Alliance, and the online screen projection method may be DLNA.
  • the mobile device stores a whitelist
  • the whitelist is used to identify which applications belong to the first category of applications
  • the whitelist includes one or more applications of the first category of applications .
  • the whitelist is preset and can be updated.
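  • A minimal sketch of such a whitelist check is given below, assuming it is keyed on application package names; the class name and the package names are placeholders introduced for illustration and are not taken from the patent.

```java
import java.util.HashSet;
import java.util.Set;

// Illustration only: a preset, updatable whitelist used to decide whether the foreground
// application belongs to the first type of application.
class FirstTypeWhitelist {
    private final Set<String> packages = new HashSet<>(Set.of(
            "com.example.videoapp.a",   // hypothetical preset entries
            "com.example.videoapp.b"));

    /** True if the application running in the foreground is a first-type application. */
    boolean isFirstType(String foregroundPackage) {
        return packages.contains(foregroundPackage);
    }

    // The whitelist can be updated, e.g. pushed from a server or edited by the user.
    void add(String pkg)    { packages.add(pkg); }
    void remove(String pkg) { packages.remove(pkg); }

    public static void main(String[] args) {
        FirstTypeWhitelist whitelist = new FirstTypeWhitelist();
        System.out.println(whitelist.isFirstType("com.example.videoapp.a")); // true
        System.out.println(whitelist.isFirstType("com.example.game"));       // false
    }
}
```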
  • the first prompt information includes but is not limited to: interface elements displayed on the display screen, playing audio, flashing indicator lights, motor vibration, and the like.
  • a mobile device wirelessly projects a screen to the electronic device in a first wireless screen projection manner
  • the mobile device includes: a processor; a memory; and a computer program, where the computer program is stored in the memory, and when the computer program is executed by the processor, the mobile device is caused to perform the corresponding wireless screen projection steps.
  • a wireless screen projection method is provided.
  • the wireless screen projection method is applied to a mobile device.
  • the mobile device includes a processor and a memory.
  • the mobile device runs a first application in the foreground.
  • the method includes: after detecting that the first application belongs to the first type of application, the mobile device automatically outputs first prompt information, where the first prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • the wireless screen projection method further includes: detecting a first user input, where the first user input is used to switch the first wireless screen projection mode to the second wireless screen projection mode; in response to the first user input, the mobile device automatically outputs the identifiers of one or more electronic devices, where these electronic devices are the electronic devices detected by the mobile device that support the second wireless screen projection mode; detecting a second user input, where the second user input is used to select the identifier of one electronic device from the identifiers of the electronic devices; and, in response to the second user input, the mobile device switches the first wireless screen projection mode to the second wireless screen projection mode and uses the second wireless screen projection mode to project the screen to the selected electronic device.
  • Because the mobile device outputs the prompt information, the user is given an opportunity to choose whether to switch the screen projection mode, and to select the new screen projection mode and the electronic device that receives the screen projection. The user can choose according to the prompts.
  • the wireless screen projection method further includes: after detecting that a second application belonging to the second type of application is started, or after detecting that the second application belonging to the second type of application is switched to the application running in the foreground, the mobile device automatically outputs second prompt information, where the second prompt information is used to prompt to switch the second wireless screen projection mode to the first wireless screen projection mode; or, the second wireless screen projection mode is switched to the first wireless screen projection mode, and the screen is wirelessly projected to the electronic device in the first wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the change and automatically prompts the user whether to change the screen projection mode, or changes it automatically, so that the new screen projection mode best suits the current application, improving the user experience.
  • the wireless screen projection method further includes: after detecting that the first application is a third-type application, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the situation and automatically prompts the user whether to change the screen projection mode, or changes it automatically, so that the new screen projection mode best suits the screen projection of the current application, improving the user experience.
  • the wireless screen projection method further includes: after detecting that the first application belonging to the first type of application is started, or after detecting that the first application belonging to the first type of application is switched to the application running in the foreground, the mobile device automatically outputs third prompt information, where the third prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode.
  • In this way, the mobile device automatically recognizes the situation and automatically prompts the user whether to change the screen projection mode back, or changes it back automatically, so that the restored screen projection mode best suits the screen projection of the current application, improving the user experience.
  • the wireless screen projection method further includes: detecting a first user input, where the first user input is used to switch the first wireless screen projection mode to the second wireless screen projection mode; in response to the first user input, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode.
  • the wireless screen projection method further includes: after detecting that the mobile device plays a network video through the first application, the mobile device automatically outputs the first prompt information, where the first prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and, in the second wireless screen projection mode, wirelessly projects the screen to one or more electronic devices detected by the mobile device that support the second wireless screen projection mode.
  • In this way, the mobile device also automatically prompts the user whether to change the projection mode, or changes it automatically, so that the new projection mode best suits the screen projection of the current application; this takes into account the requirements of different applications and the characteristics of mirror projection and online projection, improving the user experience.
  • the first type of application is a non-customized video application
  • the second type of application is a game application
  • the third type of application is a customized video application
  • the first wireless screen projection mode is the mirror screen projection mode
  • the second wireless screen projection mode is online screen projection mode
  • the one or more electronic devices include the electronic device, or the one or more electronic devices do not include the electronic device
  • the input forms of the first user input and the second user input include touch input and voice input.
  • the mobile device can ensure the picture quality of the network video played by the electronic device, so as to ensure that the user gets the best screen projection experience, and can also reduce the cost.
  • After the mobile device changes the screen projection mode, it can continue to project the screen to the original electronic device, or another electronic device can take over the screen projection, so as to meet different user needs.
  • the mirroring method may be miracast formulated by the Wi-Fi Alliance, and the online screen projection method may be DLNA.
  • the mobile device stores a whitelist
  • the whitelist is used to identify which applications belong to the first type of applications
  • the whitelist includes one or more applications belonging to the first type of applications.
  • the whitelist is preset and can be updated.
  • the whitelist can be edited by the user, who may add or delete applications in it.
  • the first prompt information includes but is not limited to: interface elements displayed on the display screen, playing audio, flashing indicator lights, motor vibration, and the like.
  • the third aspect and any implementation manner of the third aspect correspond to the first aspect and any implementation manner of the first aspect, respectively.
  • For the technical effects corresponding to the third aspect and any implementation of the third aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation of the first aspect, which are not repeated here.
  • a wireless screen projection method is provided.
  • the wireless screen projection method is applied to a mobile device.
  • the mobile device includes a processor and a memory.
  • the mobile device wirelessly projects a screen to the electronic device in a first wireless screen projection manner.
  • the method includes: after detecting that a first application is started and that the first application belongs to the first type of application, the mobile device automatically outputs first prompt information, where the first prompt information is used to prompt to switch the first wireless screen projection mode to the second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode, and the screen is wirelessly projected to the electronic device in the second wireless screen projection mode.
  • the present application provides a wireless screen projection method.
  • the wireless screen projection method is applied to a mobile device, the mobile device includes a processor and a memory, and the mobile device wirelessly projects a screen to the electronic device in a first wireless screen projection manner;
  • the wireless screen projection method includes: the mobile device runs a first application in the foreground; the mobile device identifies the scene corresponding to the application running in the foreground and notifies the electronic device of the scene, so that the electronic device plays the multimedia content using the play strategy corresponding to the scene.
  • an embodiment of the present application provides a screen projection method, which is applied to an electronic device.
  • the method may include: the electronic device receives multimedia content sent by the mobile device in a first wireless screen projection manner; receives a scene that was recognized by the mobile device and sent by the mobile device; and plays the multimedia content using a playback strategy corresponding to the scene.
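  • A minimal sketch of this scene-to-strategy mapping on the sink side is given below; the scene names, the strategy names, and the class itself are illustrative assumptions, since the patent does not enumerate concrete scenes or playback strategies here.

```java
class SceneNotification {
    enum Scene { GAME, VIDEO, DOCUMENT }
    enum PlayStrategy { LOW_LATENCY, SMOOTH_AV_SYNC, STATIC_IMAGE_QUALITY }

    // Sink side: choose the playback strategy that corresponds to the received scene.
    static PlayStrategy strategyFor(Scene scene) {
        switch (scene) {
            case GAME:  return PlayStrategy.LOW_LATENCY;        // shrink buffering, favour latency
            case VIDEO: return PlayStrategy.SMOOTH_AV_SYNC;     // larger buffer, favour A/V sync
            default:    return PlayStrategy.STATIC_IMAGE_QUALITY;
        }
    }

    public static void main(String[] args) {
        Scene received = Scene.VIDEO;  // e.g. parsed from a control message sent by the mobile device
        System.out.println("Playing mirrored stream with strategy: " + strategyFor(received));
    }
}
```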
  • the present application provides a computer-readable storage medium comprising a computer program; when the computer program is run on a mobile device, the mobile device is caused to execute the method of any one of the third aspect, the fourth aspect, the fifth aspect, or any implementation of the third aspect.
  • the present application provides a computer program product; when the computer program product is run on a mobile device, the mobile device is enabled to perform the method of any one of the third aspect, the fourth aspect, the fifth aspect, or any implementation of the third aspect.
  • The technical solution provided in this application is implemented on the system side and does not require any adaptation by third-party applications.
  • The technical solution provided by this application automatically selects the optimal screen projection mode according to the type of the application running in the foreground, or even according to whether the application running in the foreground is playing a network video, or automatically outputs prompt information so that the user can choose independently. This makes the screen projection effect of the mobile device the best, improves the user's screen projection experience, reduces costs, and improves screen projection efficiency.
  • FIG. 1A is a schematic diagram of a mobile device and an electronic device sharing a network video based on mirror screen projection;
  • FIG. 1B is a schematic diagram of a mobile device and an electronic device sharing a network video based on online screen projection;
  • FIG. 2 is a schematic diagram of a scenario of a wireless screen projection method provided by an embodiment of the present application;
  • FIG. 3A is a schematic diagram of a hardware structure of a mobile device provided by an embodiment of the present application;
  • FIG. 3B is a schematic diagram of a software structure of a mobile device provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application;
  • FIGS. 5A-5C are schematic flowcharts of a wireless screen projection method provided by an embodiment of the present application;
  • FIGS. 6A-6C, 7A-7C, and 8A-8H are schematic diagrams of user interfaces of a mobile device in a wireless screen projection method provided by an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a user interface of an electronic device in a wireless screen projection method provided by an embodiment of the present application;
  • FIGS. 10A-10B are schematic diagrams of interaction between internal modules of a mobile device in a wireless screen projection method provided by an embodiment of the present application;
  • FIG. 11 is a schematic diagram of interaction between internal modules of a mobile device in a wireless screen projection method provided by an embodiment of the present application;
  • FIGS. 13A-13B are schematic diagrams of interaction between internal modules of a mobile device and an electronic device in another wireless screen projection method provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implying the number of indicated technical features. Thus, a feature defined as "first" or "second" may expressly or implicitly include one or more of that feature.
  • words such as “exemplarily” or “for example” are used to represent examples, illustrations or illustrations. Any embodiment or design described in the embodiments of the present application as “exemplarily” or “such as” should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as “exemplarily” or “such as” is intended to present the related concepts in a specific manner.
  • GUI graphical user interface
  • The content displayed by the mobile device can be displayed on the electronic device (for example, the electronic device continues to play the multimedia content), after which the mobile device can continue to display the content or stop displaying it.
  • the wireless screen projection can also be replaced by other words, such as multi-screen interaction, etc.
  • the embodiment of the present application does not limit it.
  • the mobile device may also be referred to as an output end or a source end (source end), and an electronic device may also be referred to as an input end or a sink end (sink end).
  • the multimedia content displayed or played by the electronic device may include any one or more of the following: video, text, picture, photo, audio or table, etc.
  • the multimedia content may be movies, TV series, short videos, musicals, and the like.
  • the multimedia content may be network multimedia content, local multimedia content, or a combination of network multimedia content and local multimedia content.
  • the network multimedia content refers to the multimedia content obtained by the mobile device from the network. For example, when a mobile device runs a video application, the video is obtained from a server that provides audio and video services.
  • Local multimedia content refers to the multimedia content locally stored or generated by the mobile device. For example, pictures or tables stored locally on the mobile device, etc.
  • A mobile device such as a smartphone or tablet computer establishes a communication connection with an electronic device (such as a smart TV or a smart screen); the mobile device obtains multimedia content from the network side or the local side, encodes the multimedia content, and then transmits it to the electronic device by means of wireless transmission such as point-to-point transmission; the electronic device decodes it and then outputs it (for example, displays or plays it).
  • an electronic device such as a smart TV, a smart screen, etc.
  • the multimedia content may be network multimedia content, local multimedia content, or a combination of both.
  • the point-to-point transmission method between the mobile device and the electronic device may include, but is not limited to: wireless fidelity direct (Wi-Fi direct) (also known as wireless fidelity peer-to-peer, Wi-Fi P2P) communication connection, Bluetooth communication connection, near field communication (near field communication, NFC) connection, etc.
  • Wi-Fi mirroring can include Miracast, formulated by the Wi-Fi Alliance, and private mirroring solutions developed by various companies, such as Huawei's Cast+ and Apple's AirPlay. Among them, Miracast is based on the basic technical standards developed by the Wi-Fi Alliance and on the real time streaming protocol (RTSP).
  • The Wi-Fi basic technical standards can include the wireless transmission technologies 802.11n and 802.11ac, Wi-Fi direct/Wi-Fi P2P, tunneled direct link setup (TDLS), WPA2 (Wi-Fi protected access 2) encryption for security management, the WMM (Wi-Fi multimedia) technology that provides quality of service and traffic management, and so on.
  • FIG. 1A exemplarily shows a scenario in which a mobile device and an electronic device share a network video based on mirror screen projection.
  • the mobile device establishes a communication connection (eg, a Wi-Fi P2P connection) with the electronic device.
  • the mobile device accesses the routing device, and obtains the streaming media from the server through the routing device.
  • the mobile device accessing the routing device may specifically be an access point (access point, AP) provided by the mobile device accessing the routing device.
  • The mobile device acquires the screen recording content and the audio recording content through screen recording, audio recording, and so on; after encoding them, it sends them to the electronic device over the Wi-Fi P2P connection, and after receiving them, the electronic device plays and displays them in real time.
  • The network video received by the electronic device therefore undergoes multiple network transmissions and codec conversions, and the process is cumbersome.
  • the picture quality (eg, resolution) of the network video played by the electronic device is limited by the mobile device, and the playback effect may be poor, which affects the user experience.
  • the audio and video synchronization effect of mirror projection is better.
  • mirror projection may also be referred to as, for example, full share projection, wireless display, and the like.
  • Both the mobile device and the electronic device are connected to the Internet or a local area network.
  • the mobile device only sends the network address corresponding to the multimedia resource to be projected, such as a uniform resource locator (URL), to the electronic device.
  • the electronic device obtains the corresponding multimedia content from the Internet side or the local area network side, so as to output (eg play, display).
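  • As a simplified stand-in for this address handoff, the sketch below shows the mobile device sending only the URL of the media resource to the sink over the local network, after which it drops out of the data path. Real deployments would use DLNA/UPnP control messages; the plain TCP socket, the "PLAY" line format, the URL, and the address 192.168.1.20:9000 are placeholders introduced for illustration.

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

class UrlHandoff {
    public static void main(String[] args) throws Exception {
        String mediaUrl = "https://media.example.com/video/12345.m3u8"; // placeholder URL
        // Send the network address of the multimedia resource to the electronic device on the LAN.
        try (Socket sink = new Socket("192.168.1.20", 9000);
             OutputStream out = sink.getOutputStream()) {
            out.write(("PLAY " + mediaUrl + "\n").getBytes(StandardCharsets.UTF_8));
        }
        // From here on, the electronic device fetches and decodes the stream itself.
    }
}
```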
  • the multimedia content may be network multimedia content, local multimedia content, or a combination of the two.
  • The electronic device and the mobile device can be connected to the local area network formed by the same wireless Wi-Fi access point (AP), or they can be connected to different networks, for example accessing different local area networks formed by different APs that are interconnected via the Internet.
  • Online screencasting can include DLNA, as well as private online screencasting solutions developed by various companies. For example, Google's Googlecast, Apple's AirPlay, etc. Among them, DLNA is built on the universal plug and play (universal plug and play, UPnP) protocol.
  • DLNA universal plug and play (universal plug and play, UPnP) protocol.
  • FIG. 1B exemplarily shows a scenario in which a mobile device and an electronic device share a network video based on online screen projection.
  • the mobile device and the electronic device are jointly connected to the local area network formed by the Wi-Fi AP, the mobile device sends the URL of the played network video to the electronic device through the AP, and then the electronic device obtains the multimedia content according to the URL.
  • the URL is the address of the server that provides network audio and video services.
  • The electronic device can directly obtain the multimedia content from the network side, without the multiple network transmissions and codec conversions of the network video; the process is simple, the electronic device can achieve a better playback effect when playing the network video, and the user experience is better.
  • the audio and video synchronization of online projection is relatively poor.
  • online screencasting may also be referred to as network screencasting or the like.
  • the mirror screen projection mode may be referred to as the first wireless screen projection mode
  • the online screen projection mode may be referred to as the second wireless screen projection mode.
  • FIG. 2 is a schematic diagram of a scenario of a wireless screen projection method provided by an embodiment of the present application.
  • the mobile device 100 can deliver multimedia content to the electronic device 200 through mirror projection, and can also deliver multimedia content to the electronic device 200 through online projection.
  • a Wi-Fi P2P connection can be established between the mobile device 100 and the electronic device 200; other short-distance communication direct connections, such as Bluetooth, ZigBee, etc., can also be established.
  • the multimedia content delivered by the mobile device 100 can come from itself; it can also come from the server 400 .
  • the mobile device 100 connects to the server 400 via the Wi-Fi access point 300 .
  • both the mobile device 100 and the electronic device 200 can access the Wi-Fi access point 300 and thus are in the same local area network.
  • the mobile device 100 and the electronic device 200 may also access different networks, which is not limited in this embodiment of the present application. It should be emphasized that, unless otherwise specified, in the following embodiments of the present application, both the mobile device 100 and the electronic device 200 access the same local area network through the Wi-Fi access point 300 .
  • the multimedia content delivered by the mobile device 100 comes from the server 400 .
  • the server 400 provides network audio and video services.
  • the server 400 may be a server that stores various multimedia contents.
  • the server 400 may be a Tencent video server that provides audio and video services.
  • the number of servers 400 may be one or more.
  • The mobile device 100 can, automatically or under the trigger of the user, switch mirror projection to online projection and continue to share the network video with the electronic device 200 through online projection.
  • the network video may be obtained from the server 400 when the mobile device 100 runs a video application (application, APP).
  • the mobile device 100 may also switch the online projection back to the mirror projection.
  • Mobile devices in the embodiments of the present application include, but are not limited to, smart phones, tablet computers, personal digital assistants (PDAs), wearable electronic devices with wireless communication functions (eg, smart watches, smart glasses), and the like.
  • Exemplary embodiments of mobile devices include, but are not limited to, portable electronic devices running Linux or other operating systems.
  • the above-mentioned mobile device may also be other portable electronic devices, such as a laptop computer (Laptop) and the like. It should also be understood that, in some other embodiments, the above-mentioned mobile device may not be a portable electronic device, but a desktop computer.
  • FIG. 3A shows the hardware structure of the mobile device 100 provided by this embodiment of the present application.
  • the mobile device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, key 190, motor 191, indicator 192, camera 193, a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • SIM subscriber identification module
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile device 100 .
  • the mobile device 100 may include more or less components than shown, or some components may be combined, or some components may be split, or different component arrangements.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be independent devices, or may be integrated into one or more processors.
  • AP application processor
  • GPU graphics processing unit
  • ISP image signal processor
  • DSP digital signal processor
  • NPU neural-network processing unit
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the wireless communication function of the mobile device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in mobile device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • The wireless communication module 160 can provide wireless communication solutions applied on the mobile device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared technology (IR), and other wireless communication solutions.
  • WLAN wireless local area network
  • BT Bluetooth
  • GNSS global navigation satellite system
  • FM frequency modulation
  • NFC near field communication
  • IR infrared technology
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the mobile device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • GPS global positioning system
  • GLONASS global navigation satellite system
  • BDS BeiDou navigation satellite system
  • QZSS quasi-zenith satellite system
  • SBAS satellite-based augmentation systems
  • The wireless communication module 160 can be used to establish a communication connection with the electronic device 200 (for example, a Wi-Fi direct communication connection or a Bluetooth communication connection) and, based on that communication connection, encode the data collected by the mobile device 100 through screen recording and audio recording and send it to the electronic device 200. That is, the wireless communication module 160 can support the sharing of multimedia content between the mobile device 100 and the electronic device 200 based on mirror projection (for example, Miracast).
  • mirror projection eg, miracast
  • The wireless communication module 160 can also be connected to a local area network or another network formed by the Wi-Fi access point 300, and can send the URL of the currently playing multimedia content to the electronic device 200 through the network; afterwards, the electronic device 200 can directly acquire the multimedia content through that URL. That is, the wireless communication module 160 can support the sharing of multimedia content between the mobile device 100 and the electronic device 200 based on online screen projection (for example, DLNA).
  • the processor 110 is configured to identify the current scene when the mobile device 100 shares multimedia content with the electronic device 200 based on mirror projection, and notify the electronic device 200 of the scene through the wireless communication module 160, so that the electronic device 200 adaptively selects a corresponding play strategy according to the scene to play the multimedia content.
  • the manner in which the processor 110 identifies the current scene and the manner in which the wireless communication module 160 notifies the electronic device 200 of the current scene may refer to the related descriptions of the subsequent method embodiments, which will not be described here.
  • the processor 110 is further configured to, during the process of the mobile device 100 sharing a network video with the electronic device 200 based on mirror projection, instruct the wireless communication module 160 to switch the mirror projection to online screen projection, either automatically or under the trigger of the user.
  • For the manner in which the wireless communication module 160 switches the mirror screen projection to the online screen projection, reference may be made to the related descriptions of the subsequent method embodiments, which will not be described here for the time being.
  • the processor 110 may also be used to instruct the wireless communication module 160 to switch the online screen projection back to the mirror screen projection.
  • For the manner in which the wireless communication module 160 switches the online screen projection back to the mirror screen projection, reference may likewise be made to the related descriptions of the subsequent method embodiments.
  • the mobile device 100 realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • mobile device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 is used to display the user interface implemented on the mobile device 100 mentioned in the embodiments of the present application.
  • For the specific implementation of the user interface, reference may be made to related descriptions of subsequent method embodiments.
  • Video codecs are used to compress or decompress digital video.
  • Mobile device 100 may support one or more video codecs.
  • the mobile device 100 can play or record videos in various encoding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the mobile device 100 may implement audio functions, such as music playback and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile device 100 in a different location than the display screen 194 .
  • the software system of the mobile device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the mobile device 100 .
  • FIG. 3B is a schematic block diagram of a software structure of the mobile device 100 provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include screencasting services, video applications, game applications, office applications, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, SMS, and other application programs.
  • In the following, a video application program is referred to as a video application for short, and a game application program is referred to as a game application for short.
  • the screen projection service provides a mirror projection function for the mobile device 100.
  • the screen projection service supports the mobile device 100 to share multimedia content with the electronic device 200 based on mirror projection.
  • the screen projection service can invoke the wireless communication module 160 of the mobile device 100 to provide a mirror projection function.
  • the video application is used to provide audio and video services to the mobile device 100.
  • the mobile device 100 can run a video application, and obtain network video from a server corresponding to the video application.
  • the number of video applications can be one or more.
  • the video application may include Tencent Video.
  • Video applications can provide online screencasting.
  • the video application supports the mobile device 100 to share multimedia content with the electronic device 200 through online screen projection. Specifically, when the mobile device 100 is running a video application and playing an online video therein, if the user enables the online screen projection function of the video application, the mobile device 100 can send the web address of the online video to the electronic device 200.
  • Game applications are used to provide game services to the mobile device 100.
  • the mobile device 100 can run a game application and acquire game resources locally or from a server corresponding to the game application.
  • the number of game applications can be one or more.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a scene awareness module, a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
  • the scene perception module is configured to monitor the currently used application program, and accordingly identify the application currently used on the mobile device 100, so as to determine the scene in which the mobile device 100 is located.
  • the scene awareness module is optional.
  • the functionality of the scene awareness module can be integrated into the screencasting service at the application layer.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • FIG. 3B is only a schematic example; the software structure of the mobile device 100 provided in this embodiment of the present application may also adopt other software structures, such as the software architecture of Linux or other operating systems.
  • the electronic devices in the embodiments of the present application include, but are not limited to, tablet computers, desktop computers, portable electronic devices (such as laptop computers), smart TVs (such as smart screens), car computers, smart speakers, augmented reality (AR) devices, virtual reality (VR) devices, electronic billboards with displays, projectors used alone (eg, projecting on a wall) or in combination with a display device (eg, a screen), other smart devices with displays, other smart devices with speakers, and the like.
  • Exemplary embodiments of the electronic devices include, but are not limited to, electronic devices carrying Linux or other operating systems.
  • the electronic device 200 may be a TV set equipped with a TV box, the TV box is configured to receive multimedia content from the mobile device 100 or the server 400 and provide a screen projection function, and the TV set only provides a display function.
  • the electronic device 200 may also be used in conjunction with a remote control. The remote controller and the electronic device 200 can communicate through infrared signals.
  • FIG. 4 shows the hardware structure of the electronic device 200 provided by the embodiment of the present application.
  • the electronic device 200 may include: a video codec 221, a processor 222, a memory 223, a wireless communication processing module 224, a power switch 225, a wired LAN communication processing module 226, a high definition multimedia interface (HDMI) communication processing module 227, a USB communication processing module 228, a display screen 229, and an audio module 230.
  • the individual modules can be connected via a bus, wherein:
  • the processor 222 may be used to read and execute computer readable instructions.
  • the processor 222 may mainly include a controller, an arithmetic unit, and a register.
  • the controller is mainly responsible for instruction decoding, and sends out control signals for the operations corresponding to the instructions.
  • the arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, and logical operations, and can also perform address operations and conversions.
  • Registers are mainly responsible for saving register operands and intermediate operation results temporarily stored during instruction execution.
  • the hardware architecture of the processor 222 may be an application specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
  • the wireless communication processing module 224 may include a WLAN communication processing module 224A, and may further include a Bluetooth (BT) communication processing module 224B, an NFC processing module 224C, a cellular mobile communication processing module (not shown), and the like.
  • the wireless communication processing module 224 may be configured to establish a communication connection with the mobile device 100 and receive encoded data sent by the mobile device 100 based on the communication connection.
  • For example, the WLAN communication processing module 224A can be used to establish a Wi-Fi Direct communication connection with the mobile device 100, the Bluetooth (BT) communication processing module 224B can be used to establish a Bluetooth communication connection with the mobile device 100, and the NFC processing module 224C can be used to establish an NFC connection with the mobile device 100, and so on. That is, the wireless communication processing module 224 can support the sharing of multimedia content between the mobile device 100 and the electronic device 200 through mirror projection (eg, miracast).
  • the wireless communication processing module 224 can monitor signals transmitted by the mobile device 100 such as probe requests and scan signals, discover the mobile device 100 , and establish a communication connection with the mobile device 100 .
  • the wireless communication processing module 224 can also transmit signals, such as probe requests and scan signals, so that the electronic device 200 can discover the mobile device 100 and establish a communication connection (such as a Wi-Fi P2P connection) with the mobile device 100.
  • the wireless communication processing module 224 may also receive a notification of the scene from the mobile device 100.
  • the processor 222 can parse and learn the scene, adaptively select a play strategy corresponding to the scene, and use the play strategy to call modules such as the display screen 229 and the audio module 230 to play the multimedia content sent by the mobile device 100 .
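As a rough illustration of how a receiving device might map a reported scene to a playback strategy, the following Java sketch is given; the scene names and the strategy fields are hypothetical and are not taken from this application.

```java
// Hypothetical mapping from a reported scene to a playback strategy;
// scene names and strategy fields are illustrative only.
public final class PlayStrategy {
    final int jitterBufferMs;        // how much buffering before rendering
    final boolean preferLowLatency;  // whether latency is prioritized over smoothness

    PlayStrategy(int jitterBufferMs, boolean preferLowLatency) {
        this.jitterBufferMs = jitterBufferMs;
        this.preferLowLatency = preferLowLatency;
    }

    static PlayStrategy forScene(String scene) {
        switch (scene) {
            case "GAME":  // low latency matters more than smoothness
                return new PlayStrategy(40, true);
            case "VIDEO": // smooth, artifact-free playback matters more
                return new PlayStrategy(300, false);
            default:      // generic mirroring of the home screen, documents, etc.
                return new PlayStrategy(120, false);
        }
    }
}
```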
  • the wireless communication processing module 224 can also access the local area network or other network formed by the Wi-Fi access point 300, receive, through the Wi-Fi access point 300, the URL of the web video sent by the mobile device 100, and then directly obtain the web video from the server corresponding to the web address. That is, the WLAN communication processing module 224A can support online video sharing between the mobile device 100 and the electronic device 200 through online screen projection (eg, DLNA).
  • the video codec 221 is used to compress or decompress digital video.
  • the video codec 221 may decompress the multimedia content from the mobile device 100 or the server 400 .
  • the electronic device 200 may support one or more video codecs, and may play videos in one or more encoding formats. For example: MPEG1, MPEG2, MPEG3, MPEG4, etc.
  • the processor 222 may be configured to parse the signal received by the wireless communication processing module 224, such as a broadcast probe request of the electronic device 200, and the like.
  • the processor 222 may be configured to perform corresponding processing operations according to the parsing result, such as generating a probe response, and the like.
  • the processor 222 may be used to drive the display screen 229 to perform display according to the decompression result of the video codec 221 .
  • Memory 223 is coupled to processor 222 for storing various software programs and/or sets of instructions.
  • memory 223 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • the memory 223 can store operating systems, such as embedded operating systems such as uCOS, VxWorks, RTLinux, Harmony, and Android.
  • Memory 223 may also store communication programs that may be used to communicate with the mobile device 100, one or more servers, or additional devices.
  • the power switch 225 may be used to control the power supplied by the power source to the electronic device 200 .
  • the wired LAN communication processing module 226 can be used to communicate with other devices in the same LAN through the wired LAN, and can also be used to connect to the WAN through the wired LAN, and can communicate with the devices in the WAN.
  • the HDMI communication processing module 227 may be used to communicate with other devices through an HDMI interface (not shown).
  • the USB communication processing module 228 may be used to communicate with other devices through a USB interface (not shown).
  • Display screen 229 may be used to display images, video, and the like.
  • the display screen 229 can be LCD, OLED, AMOLED, FLED, QLED and other display screens.
  • For the content displayed on the display screen 229, reference may be made to related descriptions of subsequent method embodiments.
  • the audio module 230 can be used to output audio signals through the audio output interface, so that the electronic device 200 can support audio playback.
  • the audio module 230 can also be used to receive audio data through the audio input interface.
  • the audio module 230 may include, but is not limited to, a microphone, a speaker, a receiver, and the like.
  • the electronic device 200 may also include a serial interface such as an RS-232 interface.
  • the serial interface can be connected to other devices, such as audio amplifiers such as speakers, so that the display and audio amplifiers can cooperate to play audio and video.
  • the structure shown in FIG. 4 does not constitute a specific limitation on the electronic device 200 .
  • the electronic device 200 may include more or fewer components than shown, or combine some components, or separate some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the software system of the electronic device 200 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture, or the like.
  • the software system of the electronic device 200 includes, but is not limited to, Linux, Huawei's Hongmeng system, or another operating system.
  • the application layer of the software system of the electronic device 200 may include a screen casting service and a screen casting player.
  • the screen projection service supports the electronic device 200 to receive multimedia content delivered by the mobile device 100 through mirror projection and online projection.
  • the screen projection service can call the wireless communication processing module 224 to provide the mirror projection function and the online projection function.
  • the screencasting player is used to play multimedia content from the mobile device 100 or the server 400 .
  • the screen-casting service may instruct the screen-casting player to play the multimedia content according to the playback strategy corresponding to the scene in which the mobile device 100 is currently located.
  • video applications can be divided into customized video applications and non-customized video applications.
  • a customized video application refers to an application that itself has the ability to initiate screen projection.
  • the customized video application itself integrates a software development kit (software development kit, SDK).
  • the screen projection status of the mobile device 100 may include: whether the mobile device 100 currently shares multimedia content with the electronic device 200 through mirror projection.
  • Non-customized video applications refer to applications that do not have the ability to initiate screen projection.
  • the non-customized video application itself does not integrate the SDK.
  • the wireless screen projection method provided by the embodiments of the present application can automatically identify the category of the application currently running in the foreground, and prompt the user to select a suggested wireless screen projection mode, or automatically switch to an appropriate wireless screen projection mode.
  • FIG. 5A is a schematic flowchart of a wireless screen projection method provided by an embodiment of the present application. As shown in Figure 5A, the method may include:
  • the mobile device 100 shares multimedia content with the electronic device 200 by mirroring the screen.
  • the mobile device 100 detects a user operation for enabling the mirroring function.
  • FIG. 6A and FIG. 6B exemplarily show a user operation detected by the mobile device 100 to enable the mirroring function.
  • FIG. 6A shows an exemplary user interface 61 on the mobile device 100 for presenting installed applications.
  • the user interface 61 displays a status bar, a calendar indicator, a weather indicator, a tray with icons of frequently used applications, a navigation bar, icons 601 of video applications, icons 602 of game applications, and icons of other applications.
  • the status bar may include: one or more signal strength indicators of mobile communication signals (also known as cellular signals), an operator name (such as "China Mobile"), one or more signal strength indicators of Wi-Fi signals, a battery status indicator, a time indicator, and the like.
  • the navigation bar may include system navigation keys such as the back key, the home screen key, and the multitasking key.
  • the user interface 61 exemplarily shown in FIG. 6A may be a home screen.
  • when the mobile device 100 detects a downward swipe gesture on the display screen, the mobile device 100 displays a window 603 on the user interface 61 in response to the swipe gesture.
  • a control 603 a may be displayed in the window 603 , and the control 603 a may receive an operation (eg, touch operation, click operation) for enabling/disabling the mirroring function of the mobile device 100 .
  • the presentation form of the control 603a may include icons and/or text (eg, the text "mirror projection", “wireless projection", “multi-screen interaction", etc.).
  • the window 603 may also display switch controls for other functions such as Wi-Fi, Bluetooth, flashlight, and the like.
  • As shown in FIG. 6B, the mobile device 100 can detect a user operation acting on the control 603a, that is, detect a user operation to enable the mirroring function. In some embodiments, after detecting the user operation acting on the control 603a, the mobile device 100 can change the display form of the control 603a, for example, adding a shadow when the control 603a is displayed.
  • the user may also input a downward swipe gesture on other interfaces to trigger the mobile device 100 to display a window.
  • the user operation of enabling the mirroring function can also be implemented in other forms, which are not limited in this embodiment of the present application.
  • the mobile device 100 may also display a setting interface provided by a settings application, and the setting interface may include a control provided for the user to enable/disable the mirroring function of the mobile device 100; the user may input a user operation on the control to enable the mirroring function of the mobile device 100.
  • the user can also bring the mobile device 100 close to the NFC tag of the electronic device 200 to trigger the mobile device 100 to enable the mirroring function.
  • the mobile device 100 discovers nearby electronic devices.
  • the mobile device 100 enables one or more of Wi-Fi direct connection (not shown in the figure), Bluetooth, or NFC in the wireless communication module 160, and uses one or more of Wi-Fi direct connection, Bluetooth, and NFC to discover electronic devices near the mobile device 100 that support mirror projection.
  • the mobile device 100 may discover the nearby electronic device 200 and other electronic devices through Wi-Fi Direct.
  • the mobile device 100 displays the identifiers of the found nearby electronic devices.
  • the mobile device 100 may also display other information, such as images of the found electronic devices, which are not limited in this embodiment of the present application.
  • A window 605 pops up on the mobile device 100, exemplarily as shown in FIG. 6C.
  • Window 605 includes interface indicators 605a, icons 605b, images 605c of one or more electronic devices, and logos 605d.
  • This embodiment of the present application does not limit the sequence of S002 and S003, and the two may be executed simultaneously or sequentially.
  • If no nearby electronic device is discovered, the number of electronic devices displayed in the window 605 is zero.
  • the mobile device 100 detects a user operation of selecting the electronic device 200 .
  • the user operation of selecting the electronic device 200 may be a user operation acting on the image 605c and/or the logo 605d corresponding to the electronic device 200 .
  • the user operation for selecting the electronic device 200 may also be implemented in other forms, which are not limited in this embodiment of the present application.
  • In response to the detected user operation of selecting the electronic device 200, the mobile device 100 and the electronic device 200 establish a communication connection.
  • the mobile device 100 may establish a communication connection with the electronic device 200 through one or more wireless communication technologies among Wi-Fi Direct, Bluetooth, and NFC.
  • the mobile device 100 and the electronic device 200 establish a Wi-Fi Direct communication connection.
  • they can negotiate capabilities based on the communication connection, including encoding formats, resolutions, and audio formats supported by both parties, so as to facilitate the subsequent transmission of multimedia content.
  • the mobile device 100 sends the currently displayed multimedia content to the electronic device 200 based on the communication connection with the electronic device 200 .
  • the mobile device 100 can acquire the currently displayed multimedia content (including images and/or audio) through screen recording, audio recording, etc., and then compress the acquired multimedia content and send it to the electronic device 200 through the communication connection.
  • Taking the mobile device 100 and the electronic device 200 sharing multimedia content based on miracast as an example, the mobile device 100 can obtain the image displayed on the display screen by recording the screen according to the provisions of the miracast protocol, and compress the image using the H.264 encoding algorithm.
  • For the audio played by the mobile device 100, the advanced audio coding (AAC) algorithm is used to compress the audio; the compressed audio data and image data are then encapsulated into a transport stream (TS), the TS stream is encoded according to the real-time transport protocol (RTP), and the encoded data is sent to the electronic device 200 through the Wi-Fi direct connection. That is, the multimedia content is transmitted by means of streaming media.
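As background, the video leg of such a pipeline is typically assembled on Android from the public MediaProjection and MediaCodec APIs, as in the minimal sketch below; this is an illustrative sketch under those assumptions, not the implementation of this application, and the TS encapsulation and RTP packetization described above are omitted.

```java
// Minimal sketch (not the patent's implementation) of the video leg of a
// miracast-style pipeline on Android: capture the screen into a Surface and
// let MediaCodec produce H.264.
import android.hardware.display.DisplayManager;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.projection.MediaProjection;
import android.view.Surface;

final class MirrorEncoder {
    static MediaCodec startScreenEncoder(MediaProjection projection,
                                         int width, int height, int dpi) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 6_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface input = encoder.createInputSurface();
        encoder.start();

        // Route the screen content into the encoder's input surface.
        projection.createVirtualDisplay("mirror-cast", width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, input, null, null);

        // Compressed H.264 access units are then drained from the encoder's
        // output buffers, wrapped into TS packets, and sent over RTP.
        return encoder;
    }
}
```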
  • the electronic device 200 plays the received multimedia content.
  • the electronic device 200 may perform decoding processing on the multimedia content, thereby acquiring the multimedia content.
  • the electronic device 200 can receive the RTP-encoded TS stream based on the Wi-Fi Direct communication connection with the mobile device 100, and can then perform the corresponding decoding processing in sequence to obtain the multimedia content.
  • S006 and S007 will continue to be executed until the mobile device 100 disables the mirror screen projection function, enables the online screen projection function, and the like.
  • the multimedia content played by the electronic device 200 is the same as the multimedia content played by the mobile device 100.
  • When the multimedia content played by the mobile device 100 changes, the multimedia content played by the electronic device 200 also changes accordingly.
  • the mobile device 100 prompts the user to select or automatically selects an appropriate screen projection mode according to the category of the currently activated application.
  • the mobile device 100 starts the application, or switches the application.
  • the application program started by the mobile device 100 depends on the user, and may be a game application or a video application, and the video application may be a customized video application or a non-customized video application.
  • the mobile device 100 may, in response to a user operation (such as a click operation or a touch operation) detected on the icon of an application program in the user interface 61 shown in FIG. 6A, start the application program corresponding to the icon, and may also start the corresponding application in response to another user operation (such as a voice command), which is not limited here.
  • the mobile device 100 may launch a video application in response to a user operation acting on the icon 601 of the video application on the home interface.
  • FIG. 7B exemplarily shows the user interface 71 displayed after the mobile device 100 starts the video application.
  • the user interface 71 is the main page provided by the video application. As shown in FIG. 7B, one or more video images 701 are displayed in the user interface 71.
  • the image of the video can be dynamic or static.
  • the user interface 71 may also display a bottom menu bar, a search box, a sub-channel entry, and the like, which are not limited in this embodiment of the present application.
  • the mobile device 100 can detect the user operation acting on the video image 701 , obtain the network video indicated by the video image 701 from the server corresponding to the video application through the network, and play the network video.
  • the network video indicated by the video image 701 on which the user operation is detected is the network video selected by the user.
  • the following embodiments are described by taking the server 400 as the server corresponding to the video application as an example. That is to say, the web address of the web video acquired by the mobile device 100 is the address of the server 400.
  • FIG. 7C exemplarily shows the user interface 72 displayed when the mobile device 100 plays the web video selected by the user.
  • the user interface 72 may be displayed by the mobile device 100 in response to the user's action of switching the mobile device 100 from the portrait state to the landscape state, or in response to the user clicking the full-screen playback control displayed in the lower right corner of the user interface displayed by the mobile device 100.
  • the user interface 72 may further include a switch control 702 for online screen projection, and the control 702 is used to monitor user operations (eg, click operations, touch operations, etc.) for enabling/disabling the online screen projection function of the video application.
  • the display state of the control 702 indicates that the online screen projection function of the mobile device 100 is currently enabled.
  • the mobile device 100 determines whether the activated application or the switched application is a video application.
  • the mobile device 100 can listen to the application running in the foreground or the application corresponding to the currently used window, and accordingly identify whether the application running in the foreground of the mobile device 100 is a video application. If it is a video application, execute S010.
  • the scene perception module in FIG. 10A and FIG. 10B can perceive application startup and application switching by calling operating system APIs. Taking the perception of application switching as an example, the scene perception module subscribes to application switching events by listening for the "android.ActivityState.CHANGE" message. After the scene perception module perceives an application switch, it calls the API provided by the operating system to query the name of the top-level app that the user sees, for example, querying the task and the PackageName corresponding to the task through the API provided by ActivityManager. The scene perception module then determines whether the application is a video application according to whether the queried application name is in a pre-made database or table. The database or table can be added to, deleted from, or updated by the user as required.
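As a rough sketch of the query described above, the following code obtains the package name of the top-level task through the public (though deprecated) ActivityManager API; the surrounding helper class is hypothetical.

```java
// Illustrative helper: query the package name of the task the user currently sees.
import android.app.ActivityManager;
import android.content.Context;
import java.util.List;

final class ForegroundAppQuery {
    /** Returns the package name of the top-level task, or null if it cannot be determined. */
    static String topPackageName(Context context) {
        ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        List<ActivityManager.RunningTaskInfo> tasks = am.getRunningTasks(1);
        if (tasks == null || tasks.isEmpty() || tasks.get(0).topActivity == null) {
            return null;
        }
        return tasks.get(0).topActivity.getPackageName();
    }
}
```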
  • the mobile device 100 determines whether the activated video application is a non-customized video application.
  • the mobile device 100 may store a whitelist, which may include one or more non-customized video applications. In this way, when the application running in the foreground is in the whitelist, the mobile device 100 can determine that the application running in the foreground by the mobile device 100 is a non-customized video application.
  • the non-customized video applications in the whitelist may be installed in the mobile device 100, or may not be installed in the mobile device 100.
  • the mobile device 100 can update the whitelist as needed.
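Such a whitelist could be held as a simple in-memory set, as in the hedged sketch below; the package names are placeholders and are not taken from this application.

```java
// Illustrative whitelist of non-customized video applications; the package
// names are placeholders, not real entries from the patent.
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

final class NonCustomizedVideoAppWhitelist {
    private final Set<String> packages = new CopyOnWriteArraySet<>();

    NonCustomizedVideoAppWhitelist() {
        packages.add("com.example.videoapp.one");   // placeholder entry
        packages.add("com.example.videoapp.two");   // placeholder entry
    }

    /** True if the foreground package is treated as a non-customized video application. */
    boolean isNonCustomizedVideoApp(String packageName) {
        return packages.contains(packageName);
    }

    // The list can be added to, removed from, or replaced as needed,
    // for example after a configuration update.
    void add(String packageName)    { packages.add(packageName); }
    void remove(String packageName) { packages.remove(packageName); }
}
```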
  • FIG. 5B depicts the process of switching the mirror projection currently used by the mobile device 100 to online projection for a non-customized video application.
  • when the mobile device 100 shares the network video provided by the non-customized video application with the electronic device 200 through mirror projection, it can switch the mirror projection to online projection under the trigger of the user, and continue to share the web video based on online projection.
  • the process shown in FIG. 5B may include:
  • the mobile device 100 outputs prompt information, where the prompt information is used to prompt the user to switch the mirrored screen projection to the online screen projection.
  • the implementation form of the prompt information may include, but is not limited to: interface elements displayed on the display screen of the mobile device 100, playing audio, flashing indicator lights, motor vibration, and the like.
  • the scenarios in which the mobile device 100 outputs prompt information may include the following three scenarios:
  • Scenario 1: After the mobile device 100 starts the video application, prompt information is output.
  • the mobile device 100 may adopt the method of S009, or the methods of S009 and S010; when recognizing that the application program running in the foreground of the mobile device 100 is a non-customized video application, the mobile device 100 outputs the prompt information.
  • FIG. 8A exemplarily shows prompt information displayed by the mobile device 100 in the scenario 1 .
  • the prompt information is window 703 .
  • Window 703 may include: text 703a.
  • the text 703a can be, for example, "click the online projection button when playing a video to switch the mirror projection to online projection", "you can click the online projection button when playing a video to obtain a clearer projection effect", etc.
  • the window 703 may further include an image 703b and an image 703c, and the image 703b and the image 703c are respectively used to indicate the effect of sharing network video based on mirror projection and online projection. It can be seen that the image 703c is clearer and better than the image 703b. In this way, the user can be reminded of the difference between mirror projection and online projection, which is beneficial for the user to choose a more suitable online projection to share online videos.
  • the prompt information displayed on the display screen of the mobile device 100 may disappear automatically after being displayed for a period of time (for example, 5 seconds) without user interaction.
  • the mobile device 100 may also stop displaying the prompt information and the like in response to the user's operation of clicking on other areas on the display screen other than the prompt information.
  • the mobile device 100 may sequentially display FIG. 7A , FIG. 8A , FIG. 7B and FIG. 7C according to the user operation.
  • the user can be automatically prompted to trigger the mobile device 100 to switch from mirror projection to online projection, so as to ensure the best projection experience when the user watches online videos.
  • "9:21" in FIG. 7C is used to indicate the last playback record, that is, when the playback of the network video reaches "9:21" last time, the playback of the network video is quit.
  • Scenario 2: After the mobile device 100 starts the video application and plays the network video selected by the user, it outputs prompt information.
  • the mobile device 100 may use the method of S009, or the methods of S009 and S010; when recognizing that the application running in the foreground of the mobile device 100 is a non-customized video application, the mobile device 100 outputs the prompt information.
  • FIG. 8B exemplarily shows prompt information displayed by the mobile device 100 in the scenario 2.
  • the prompt information 704 is a window.
  • the specific content of the prompt information 704 is similar to the window 703 in FIG. 8A , and will not be described in detail. It is understandable that when the prompt information is output in the scenario 2, the mobile device 100 may sequentially display FIG. 7A , FIG. 7B , FIG. 7C and FIG. 8B according to the user operation.
  • the user can be automatically prompted to trigger the mobile device 100 to switch the mirrored screen projection to the online screen projection, so as to ensure that the user can obtain the best screen projection experience when watching the online video.
  • If the window 703 is closed, or the online screen projection button is not clicked within a preset time period (for example, 20 seconds), the mirror screen projection mode is still used, or the online screen projection mode is automatically switched to.
  • 20 seconds is only a schematic example, and any duration may be a preset duration, which is not limited in this application.
  • Scenario 3: After the mobile device 100 continues to run the video application in the foreground for longer than the first preset duration, the mobile device 100 outputs prompt information.
  • the mobile device 100 may adopt the method of S009, or the methods of S009 and S010; after recognizing that the application program running in the foreground of the mobile device 100 is a non-customized video application and that its continuous running duration in the foreground exceeds the first preset duration, the mobile device 100 outputs the prompt information.
  • the first preset duration may be, for example, 10 seconds, 30 seconds, 1 minute, etc., which is not limited in the embodiment of the present application.
  • In scenario 3, the mobile device 100 may still display the home page provided by the video application, or may play the web video in response to the user's operation of selecting the web video. Therefore, the prompt information displayed by the mobile device 100 can be displayed on the main page 71 of the video application shown in FIG. 7B; as shown in FIG. 8B, the prompt information can also be displayed on the user interface 72 shown in FIG. 7C.
  • the mobile device 100 can be prompted and triggered to switch the mirror projection to online projection while the user is watching the online video, so as to ensure that the user can obtain the best projection experience when watching the online video.
  • the mobile device 100 receives a user operation for enabling the online screen projection function.
  • the user operation for enabling the online screen projection function of the video application may be, for example, a user operation (eg, click operation, touch operation, etc.) acting on the online screen projection control 702 .
  • the user operation for enabling the online screen projection control may also be in other forms, such as shaking gestures, voice commands, etc., which are not limited in this embodiment of the present application.
  • the mobile device 100 switches the mirror screen projection to the online screen projection, and continues to share the network video with the electronic device 200 based on the online screen projection.
  • the mobile device 100 may enable the online screen projection function of the video application.
  • the mobile device 100 discovers a nearby electronic device that supports online screen projection, and displays an identifier of the discovered electronic device.
  • the mobile device 100 may send a user datagram protocol (UDP) broadcast.
  • Nearby electronic devices that support online screen projection, such as the electronic device 200 connected to the Wi-Fi access point 300 and other electronic devices (not shown in the figure), can respond to the UDP broadcast by replying with a UDP message carrying their own relevant information (for example, a device identifier), so that the mobile device 100 can discover them.
  • the mobile device 100 displays an identifier of the discovered electronic device, and may also display other information such as an image of the electronic device, which is not limited in this embodiment of the present application.
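The following is a hedged sketch of one such discovery round: a probe is broadcast over UDP and replies are collected for a short window. The port number and the message payloads are invented for illustration, and real DLNA devices are in practice discovered through SSDP multicast rather than a plain broadcast.

```java
// Illustrative UDP-broadcast discovery round; port and payloads are made up.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketTimeoutException;
import java.nio.charset.StandardCharsets;

final class OnlineCastDiscovery {
    static void discover() throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setBroadcast(true);
            socket.setSoTimeout(3000);  // collect replies for 3 seconds

            byte[] probe = "CAST_DISCOVERY_PROBE".getBytes(StandardCharsets.UTF_8);
            socket.send(new DatagramPacket(probe, probe.length,
                    InetAddress.getByName("255.255.255.255"), 49152));

            byte[] buf = new byte[1024];
            DatagramPacket reply = new DatagramPacket(buf, buf.length);
            try {
                while (true) {
                    socket.receive(reply);
                    String deviceInfo = new String(reply.getData(), 0, reply.getLength(),
                            StandardCharsets.UTF_8);
                    System.out.println("Discovered " + deviceInfo + " at " + reply.getAddress());
                }
            } catch (SocketTimeoutException end) {
                // discovery window closed; show the collected identifiers to the user
            }
        }
    }
}
```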
  • FIG. 8D exemplarily shows a user interface displayed by the mobile device 100 in response to the user operation of enabling the online screen projection function detected in S102. As shown in FIG. 8D, the user interface may display the identifiers of the electronic devices discovered by the mobile device 100.
  • the mobile device 100 detects a user operation of selecting the electronic device 200 .
  • the user operation of selecting the electronic device 200 may be a user operation acting on the logo corresponding to the electronic device 200 in the user interface shown in FIG. 8D .
  • the user operation for selecting the electronic device 200 may also be implemented in other forms, which are not limited in this embodiment of the present application.
  • the mobile device executes S103 and S104 to share the online video with the electronic device selected by the user based on the online screen projection.
  • when it is not the first time that the mobile device 100 enables the online screen projection function of the video application, the mobile device 100 automatically connects to the electronic device used for the last online screen projection and shares the network video through online screen projection. That is, in another implementation manner, S103-S104 may not be performed, and after S102, S105 and its subsequent steps are directly performed.
  • when the mobile device 100 detects that only one nearby electronic device supports online screen projection, the mobile device 100 automatically projects the screen wirelessly to that electronic device in the online screen projection manner to share the network video.
  • the mobile device 100 establishes a transmission control protocol (TCP) connection with the electronic device 200.
  • the mobile device 100 sends the URL of the played web video to the electronic device 200 based on the TCP connection.
  • the website address of the online video may be a URL
  • the URL locates a server corresponding to the video application, such as server 400 .
  • the mobile device 100 may also send the time node of the currently played network video to the electronic device 200, so that the electronic device 200 continues to play the network video from the time node.
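A minimal sketch of sending the URL and the playback position over the established TCP connection is given below; the line-based message format is invented for illustration and is not specified by this application.

```java
// Illustrative sender: push the playback URL and resume position to the sink.
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

final class OnlineCastSender {
    static void sendPlaybackInfo(String sinkHost, int sinkPort,
                                 String videoUrl, long positionSeconds) throws Exception {
        try (Socket socket = new Socket(sinkHost, sinkPort);
             PrintWriter out = new PrintWriter(
                     new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8), true)) {
            out.println("URL " + videoUrl);              // where the sink should fetch the video
            out.println("POSITION " + positionSeconds);  // resume point, in seconds
        }
    }
}
```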
  • after establishing a TCP connection with the electronic device 200 and sending the URL of the web video to the electronic device 200 in response to the user operation of enabling the online screen projection function detected in S102, the mobile device 100 can also change the display form of the control 702, such as adding a shadow or changing the color of the control 702, so as to prompt the user that the mobile device 100 is currently sharing a network video based on online screen projection.
  • the mobile device 100 may also prompt the user that the mobile device 100 is currently sharing a network video based on online screen projection by displaying text or the like, which is not limited in this embodiment of the present application.
  • the electronic device 200 acquires the web video from the web address of the web video.
  • the electronic device 200 may request the server 400 to obtain the network video according to the URL, and the server 400 may respond to the request and send the encoded network video to the electronic device 200 through a network (eg, a local area network formed by the Wi-Fi access point 300).
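On the electronic device 200 side, fetching the video can be as simple as an HTTP GET against the received URL, as in the hedged sketch below; a real sink would normally hand the URL to a streaming media player rather than read the stream directly.

```java
// Illustrative fetch of the encoded video stream from the received URL.
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

final class SinkFetcher {
    static InputStream open(String videoUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(videoUrl).openConnection();
        conn.setRequestMethod("GET");
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(10000);
        return conn.getInputStream();  // encoded video stream from the server
    }
}
```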
  • the electronic device 200 plays the network video.
  • the user can continue to control the mobile device 100, and the user's manipulation of the mobile device 100 does not affect the electronic device 200 to continue playing the online video.
  • the user can manipulate the mobile device 100 to quit playing a web video, quit running a video application, and start a game application, and so on.
  • the mobile device 100 can switch the mirror projection to online projection when the user has selected an inappropriate mirror projection to share the network video, which improves the projection efficiency and ensures the quality of the network video played by the electronic device 200, so that the user obtains the best screen projection experience. That is, the screen projection method shown in FIG. 5B can lower the threshold for users and ensure user experience.
  • the mobile device 100 may also switch the online screencasting back to mirror screencasting in some cases.
  • the following describes the case of switching the online screen projection back to mirror screen projection through optional step S109.
  • the mobile device 100 switches the online screen projection to the mirror screen projection under the trigger of the user.
  • the mobile device 100 can use the method of S009, or the methods of S009 and S010; after detecting that the application program running in the foreground of the mobile device 100 is a game application, the mobile device 100 can output prompt information, and the prompt information can be used to ask the user whether to switch online screen projection to mirror screen projection.
  • FIG. 8E exemplarily shows prompt information displayed after the mobile device 100 starts the game application.
  • the prompt information may be a window 705 .
  • the window 705 may include: text 705a, controls 705b, and controls 705c.
  • the text 705a may be, for example, "Whether to switch online screen projection to mirror screen projection to obtain a smoother game screen projection experience?", which is used to ask the user whether to switch online screen projection to mirror screen projection.
  • the control 705b is used to monitor a user operation, and the mobile device 100 may, in response to the user operation, not perform the operation of switching the online screencasting to the mirror screencasting.
  • the control 705c is used to monitor the user operation, and the mobile device 100 can switch the online screen projection to the mirror screen projection in response to the user operation.
  • For the manner in which the mobile device 100 switches the online screen projection to the mirror screen projection, reference may be made to S002-S007 in the method shown in FIG. 5A, which will not be repeated here.
  • when the application running in the foreground is switched to a game application, or when a game application is started, the mobile device 100 automatically prompts the user to switch the online screen projection back to mirror projection. If the user chooses to switch the online projection back to mirror projection, a smoother, lower-latency game projection experience is provided for the user; if the user chooses to refuse to switch the online projection back to mirror projection, the user can watch the online video on the electronic device 200 and play the game on the mobile device 100.
  • S109 can also be replaced with: after detecting that the game application is started, or after detecting that the game application is switched to an application running in the foreground, the mobile device automatically switches back to mirroring. In this way, it is automatically switched back to mirror projection without the user's selection, which makes the user experience better.
  • FIG. 5C depicts the process of switching the mirror projection currently used by the mobile device 100 to online projection for a customized video application.
  • the mobile device 100 shares the network video provided by the customized video application with the electronic device 200 through mirror projection, it can automatically or under the trigger of the user, switch the mirror projection to online projection, and continue to use the online projection screen share the web video.
  • the process shown in FIG. 5C may include:
  • the mobile device 100 automatically or triggered by the user switches the mirror projection to the online projection, and continues to share the online video with the electronic device 200 based on the online projection.
  • the mobile device 100 directly establishes a TCP connection with the electronic device 200 .
  • Since the video application is a customized video application, it can query the screen projection status from the screen projection service through the SDK interface, and can also query that the device currently sharing the network video with the mobile device 100 is the electronic device 200. Therefore, in S201, the mobile device 100 can directly establish a TCP connection with the queried electronic device 200, without requiring the user to select the device as shown in FIG. 5A or the mobile device 100 defaulting to the device used in the last online screen projection.
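The SDK interface mentioned above could take roughly the following shape; the interface and the method names are hypothetical and merely illustrate the queries a customized video application might make against the screen projection service.

```java
// Hypothetical shape of the screen-projection SDK a customized video
// application might integrate; names are invented for illustration.
public interface ScreenCastSdk {
    /** Whether the device is currently sharing its screen by mirror projection. */
    boolean isMirrorCasting();

    /** Identifier of the sink device currently receiving the mirrored screen, or null. */
    String currentSinkDevice();

    /** Ask the projection service to hand playback of the given URL to that sink. */
    void switchToOnlineCast(String sinkDevice, String mediaUrl, long positionSeconds);
}
```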
  • S202-S204 are the same as S106-S108 in FIG. 5B, and will not be repeated here.
  • S201-S204 are performed automatically.
  • the mobile device 100 may output prompt information before the automatic switching, so as to prompt the user that the mobile device 100 should switch the mirrored screen projection to the online screen projection.
  • the mobile device 100 may output prompt information after the automatic switching to prompt the user that the current screencasting has been switched to online.
  • the implementation forms of the prompt information output by the mobile device 100 before and after the switching may include, but are not limited to: interface elements displayed on the display screen, audio, flashing indicator lights, motor vibration, and the like.
  • FIG. 8F exemplarily shows prompt information 1001 displayed before the mobile device 100 is automatically switched, and the prompt information 1001 is the text "I will switch to online screencasting for you soon”.
  • FIG. 8G exemplarily shows prompt information 1002 displayed after the mobile device 100 is automatically switched, and the prompt information 1002 is the text "Switched to online screencasting”.
  • the mobile device 100 may not output prompt information before or after the automatic switching, which is not limited in this embodiment of the present application.
  • the mobile device 100 may adopt the methods of S009, or the methods of S009 and S010.
  • When recognizing that the application program running in the foreground of the mobile device 100 is a customized video application, the mobile device 100 outputs the prompt information, and switches the mirror projection to online screen projection under the trigger of the user.
  • FIG. 8H exemplarily shows the prompt information output when the mobile device 100 recognizes that the network video is played.
  • the prompt information in FIG. 8H is the same as the prompt information in FIG. 8B , and details are not repeated here.
  • the mobile device 100 may detect a user operation (such as a click operation, a touch operation) acting on the control 702, and in response to the user operation, switch the mirror projection to online projection.
  • the mobile device 100 may also switch the mirrored screen projection to the online screen projection in response to other user operations, such as shaking gestures, voice commands, etc., which is not limited in this embodiment of the present application.
  • the display mode of the control 702 may be changed to prompt the user that the switch to online projection is currently performed. In this way, the user can be given sufficient options when switching.
  • the mobile device 100 may also switch the online screen projection back to mirror screen projection in some cases. That is, an optional step S205 may also be included.
  • the mobile device 100 may automatically switch the online screen projection back to mirror screen projection after the electronic device 200 finishes playing the online video.
  • Otherwise, the user would need to further operate the electronic device 200 to replay the web video or play other web videos.
  • FIG. 9 exemplarily shows a user interface displayed by the electronic device 200 after playing the network video.
  • the mobile device 100 can adaptively adjust the used wireless screen projection method to provide the user with the best screen projection experience.
  • the mobile device 100 can automatically perform mutual switching between online screen projection and mirrored screen projection in the background without user operation, which can bring a good screen projection experience to the user.
  • Alternatively, the mobile device 100 may, under the trigger of the user, switch the online screen projection back to mirror projection.
  • For details, reference may be made to the relevant description in S109, which will not be repeated here.
  • In the embodiments of the present application, non-customized video applications may be referred to as the first type of applications, game applications may be referred to as the second type of applications, and customized video applications may be referred to as the third type of applications.
  • the application started by the mobile device 100 in S008 of FIG. 5A may be referred to as the first application.
  • the prompt information output by the mobile device 100 in S101 of FIG. 5B may be referred to as the first prompt information
  • the user operation to enable the online screen projection function received by the mobile device 100 in S102 may be referred to as the first user input
  • the user operation of selecting the electronic device 200 detected by the mobile device 100 in S104 may be referred to as the second user input
  • the application detected by the mobile device 100 to be activated or switched to the foreground in S109 may be referred to as the second application
  • the game application may be referred to as the second type of application
  • the prompt information output by the mobile device 100 in S109 may be referred to as second prompt information.
  • after the method shown in FIG. 5A or FIG. 5B, the mobile device 100 may also switch the first application, which belongs to the first type of application, to run in the foreground, or start a third application belonging to the first type of application, or switch such a third application to run in the foreground, and then automatically output third prompt information, where the third prompt information is used to prompt switching the first wireless screen projection mode to the second wireless screen projection mode; alternatively, the mobile device 100 automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device 200 in the second wireless screen projection mode.
  • FIG. 10A shows an interaction flow between the internal modules when the mobile device 100 executes the flow shown in FIG. 5B. As shown in FIG. 10A, the flow may include:
  • Step 1 The screen projection service enables the mirror projection function.
  • in step 1, after the mobile device 100 detects the user operation to enable the mirror projection function, the screen projection service calls one or more of Wi-Fi Direct, Bluetooth, or NFC in the wireless communication module 160, discovers nearby electronic devices that can accept mirror projection through one or more of Wi-Fi Direct, Bluetooth, and NFC, establishes a communication connection with the electronic device 200 selected by the user, and shares multimedia content with the electronic device 200 (a Wi-Fi Direct discovery sketch follows below).
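  • The snippet below is a minimal Android sketch of the Wi-Fi Direct part of that discovery step; it uses the public WifiP2pManager API, while the Bluetooth and NFC paths (and the runtime permissions required on recent Android versions) are omitted. The helper class name and the assumption that the screen projection service holds a Context are illustrative only.

        import android.content.Context;
        import android.net.wifi.p2p.WifiP2pManager;
        import android.os.Looper;

        /** Sketch only: starts Wi-Fi Direct peer discovery so nearby mirror-projection sinks can be found. */
        final class MirrorDiscoveryHelper {
            static void discoverSinks(Context context) {
                WifiP2pManager p2p =
                        (WifiP2pManager) context.getSystemService(Context.WIFI_P2P_SERVICE);
                WifiP2pManager.Channel channel =
                        p2p.initialize(context, Looper.getMainLooper(), null);
                p2p.discoverPeers(channel, new WifiP2pManager.ActionListener() {
                    @Override public void onSuccess() {
                        // The peer list arrives later via WIFI_P2P_PEERS_CHANGED_ACTION broadcasts;
                        // the service can then list the discovered sinks (e.g. electronic device 200).
                    }
                    @Override public void onFailure(int reason) {
                        // Discovery could not start (e.g. Wi-Fi off); fall back or prompt the user.
                    }
                });
            }
        }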
  • Step 2 The video application is started, and the network video is obtained.
  • for the implementation of step 2, reference may be made to the related descriptions of FIGS. 7A-7C, which are not repeated here.
  • in step 2, after detecting the user operation for starting the video application, the mobile device 100 starts the video application and then acquires the network video in response to the user operation of selecting a network video to play.
  • Step 3 The scene perception module recognizes any one of the three scenes.
  • the three scenarios are the three scenarios mentioned in S101 of FIG. 5B, which may specifically include: (1) the mobile device 100 starts a non-customized video application; (2) the mobile device 100 starts a non-customized video application and plays the network video that the user selects from that application; (3) the duration for which the mobile device 100 continuously runs the non-customized video application in the foreground exceeds the first preset duration.
  • the scene perception module can, in real time or periodically, use the method of S009 (or of S009 and S010) to identify whether the application running in the foreground of the mobile device 100 is a non-customized video application, and thereby recognize the above three scenarios (one possible Android sketch follows below).
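  • The following Java sketch shows one way such foreground-app recognition could be done on Android, using UsageStatsManager and an illustrative whitelist of package names; the patent does not prescribe this API, the package names are invented, and the PACKAGE_USAGE_STATS special permission would be required.

        import android.app.usage.UsageStats;
        import android.app.usage.UsageStatsManager;
        import android.content.Context;
        import java.util.Arrays;
        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;

        /** Sketch only: detects whether the most recently used app is a whitelisted
         *  "non-customized" video application. Whitelist contents are illustrative. */
        final class ScenePerception {
            private static final Set<String> NON_CUSTOMIZED_VIDEO_APPS = new HashSet<>(
                    Arrays.asList("com.example.videoapp", "com.example.shortvideo"));

            static boolean foregroundAppIsNonCustomizedVideo(Context context, long windowMillis) {
                UsageStatsManager usm =
                        (UsageStatsManager) context.getSystemService(Context.USAGE_STATS_SERVICE);
                long now = System.currentTimeMillis();
                List<UsageStats> stats = usm.queryUsageStats(
                        UsageStatsManager.INTERVAL_DAILY, now - windowMillis, now);
                if (stats == null) {
                    return false; // usage-access permission not granted
                }
                UsageStats latest = null;
                for (UsageStats s : stats) {
                    if (latest == null || s.getLastTimeUsed() > latest.getLastTimeUsed()) {
                        latest = s; // the entry used most recently approximates the foreground app
                    }
                }
                return latest != null && NON_CUSTOMIZED_VIDEO_APPS.contains(latest.getPackageName());
            }
        }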
  • the scene awareness module can be integrated into the screen casting service. In this way, the screen casting service can be used to perform step 3.
  • Step 4 The scene perception module notifies the screen projection service of the identified scene.
  • Step 5 After the screencasting service learns the scene identified by the scene perception module, it outputs prompt information to prompt the user to switch to online screencasting.
  • specifically, when the screen projection service learns that the mobile device 100 has enabled the mirror projection function and that the mobile device 100 is in one of the three scenarios described above, it calls the hardware of the mobile device 100, such as the display screen, flash, or motor, to output prompt information.
  • the prompt information is used to prompt the user to switch from mirror projection to online projection.
  • for the implementation form of the prompt information, reference may be made to the relevant description of S101.
  • Step 6 The video application enables the online screen projection function under the trigger of the user.
  • for step 6, reference may be made to the related descriptions of S102-S106 in FIG. 5B.
  • the video application may receive an event that the user enables the online screen projection function of the video application.
  • the user may click on the control 702, and the click operation may be encapsulated as an event for enabling the online screen projection function of the video application, and transmitted to the video application from the bottom layer.
  • after that, the video application can enable the online screen projection function, that is, call the wireless communication module 160 to send a UDP broadcast, establish a TCP connection with the electronic device 200, and send the web address (URL) of the network video to the electronic device 200 over the TCP connection (a plain-socket sketch follows below).
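  • The sketch below illustrates that sequence with plain Java sockets: a UDP broadcast to look for sinks that support online projection, followed by a TCP connection that hands over the video URL. The port numbers and message strings are invented for illustration; the patent does not define a concrete wire format, and a DLNA-based implementation would use its own protocols.

        import java.io.OutputStreamWriter;
        import java.io.Writer;
        import java.net.DatagramPacket;
        import java.net.DatagramSocket;
        import java.net.InetAddress;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        /** Sketch only: discovery broadcast plus URL hand-off for online projection. */
        final class OnlineCastSketch {
            static void broadcastDiscovery() throws Exception {
                try (DatagramSocket socket = new DatagramSocket()) {
                    socket.setBroadcast(true);
                    byte[] payload = "WHO_SUPPORTS_ONLINE_CAST".getBytes(StandardCharsets.UTF_8);
                    socket.send(new DatagramPacket(payload, payload.length,
                            InetAddress.getByName("255.255.255.255"), 50000)); // illustrative port
                    // Sinks such as electronic device 200 reply over UDP with their identity.
                }
            }

            static void sendVideoUrl(String sinkAddress, String videoUrl) throws Exception {
                try (Socket tcp = new Socket(sinkAddress, 50001); // illustrative port
                     Writer out = new OutputStreamWriter(tcp.getOutputStream(), StandardCharsets.UTF_8)) {
                    out.write("PLAY_URL " + videoUrl + "\n"); // the sink then fetches the video itself
                    out.flush();
                }
            }
        }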
  • FIG. 10B shows another interaction flow between the internal modules when the mobile device 100 executes the flow shown in FIG. 5B. As shown in FIG. 10B, the flow may include:
  • Step 1 to Step 3 are the same as Step 1 to Step 3 in FIG. 10A , and are not repeated here.
  • Step 4 The scene perception module queries the screen projection service for the screen projection status of the mobile device 100 .
  • the scene perception module may query the screen projection service to find that the mirror projection function is currently enabled on the mobile device 100.
  • when the scene perception module learns that the mobile device 100 has enabled the mirror projection function and that the mobile device 100 is in one of the three scenarios in S101, it calls the hardware of the mobile device 100, such as the display screen, flash, or motor, to output prompt information.
  • the prompt information is used to prompt the user to switch from mirror projection to online projection.
  • Step 5 is the same as Step 5 in FIG. 10A and will not be repeated here.
  • the function of the scene perception module can be integrated into the screen projection service.
  • the steps performed by the scene perception module in FIG. 10A and FIG. 10B are all performed by the screen projection service, and the interaction steps between the two can also be omitted.
  • in the flow shown in FIG. 11 (which applies to the process of FIG. 5C), the screen projection service first enables the mirror projection function.
  • the video application is then started, and the network video is obtained.
  • Step 1 - Step 2 are the same as Step 1 - Step 2 in FIG. 10A , and will not be repeated here.
  • the video application queries the screencasting service for the screencasting status.
  • the video application can query the current screencasting status from the screencasting service through the SDK, and learn that the mobile device 100 is currently sharing network video based on mirror screencasting. In some embodiments, the video application can query the screen projection service through the SDK to find the electronic device 200 that is currently accepting the screen projection of the mobile device 100 .
  • the video application learns that the mobile device 100 shares the network video with the electronic device 200 based on mirror projection.
  • combining steps 2 and 3, the video application learns that the mobile device 100 is currently sharing the network video with the electronic device 200 based on mirror projection (a hypothetical SDK query is sketched below).
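  • Because the patent describes the SDK's capability but not its interface, the following is a purely hypothetical Java surface for such a query; every name here is invented for illustration.

        /** Hypothetical SDK surface: lets a customized video application ask the
         *  system screen projection service about the current projection state. */
        interface ScreenCastSdk {
            /** True if the device is currently mirroring its screen to a sink. */
            boolean isMirrorCastActive();

            /** Identifier of the sink currently receiving the projection, or null. */
            String getCurrentSinkId();
        }

        // Possible usage inside the video application before switching to online projection:
        //   if (sdk.isMirrorCastActive()) {
        //       String sink = sdk.getCurrentSinkId(); // e.g. electronic device 200
        //       // establish a TCP connection to that sink and send the video URL
        //   }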
  • the video application, automatically or when triggered by the user, enables the online screen projection function.
  • before or after automatically switching mirror projection to online projection, the video application may also call modules such as the display screen, the audio module, and the flashlight to output prompt information.
  • the video application can call the display screen to output prompt information to prompt the user to switch from mirror projection to online projection.
  • after the user inputs the user operation to enable online projection, the video application then enables the online projection function.
  • for the way the display screen outputs prompt information prompting the user to switch mirror projection to online projection, reference may be made to the window 704 in FIG. 8H.
  • specifically, the video application enables the online screen projection function, that is, calls the wireless communication module 160 to send a UDP broadcast, establishes a TCP connection with the electronic device 200, and sends the web address (URL) of the network video to the electronic device 200 over the TCP connection.
  • in this way, when the video application started by the mobile device 100 is a non-customized video application, the scene perception module can perceive the three scenarios mentioned in S101, and then the scene perception module or the screen projection service can output prompt information to prompt the user to switch from mirror projection to online projection.
  • the video application can enable the online screen projection function in response to the user operation input by the user after seeing the prompt information.
  • when the video application started by the mobile device 100 is a customized video application, the customized video application can obtain the screen projection status from the screen projection service through the SDK interface and, combined with its own running status, can identify the scenario in which the mobile device 100 shares a network video based on mirror projection. Therefore, the video application can enable the online projection function automatically or passively.
  • that is to say, executing the process shown in FIG. 5B does not require any modification of the video application; it is sufficient to improve the software system of the mobile device 100 by adding a scene perception module and adding, to the screen projection service or the scene perception module, the function of outputting prompt information, in order to implement the function of switching mirror projection to online projection while sharing a network video based on mirror projection.
  • executing the process shown in FIG. 5C requires modifying the video application program by adding an SDK, in order to implement the function of switching mirror projection to online projection while sharing a network video based on mirror projection.
  • customized video applications and non-customized video applications may not be distinguished.
  • in that case, as long as the application started by the mobile device 100 is a video application, the mobile device 100 uses the process shown in FIG. 5B to switch the wireless screen projection mode.
  • the wireless screen projection method provided by the present application does not necessarily include the processes shown in FIG. 5A , FIG. 5B and FIG. 5C .
  • the processes shown in FIG. 5A, FIG. 5B, and FIG. 5C, together with the corresponding parts of the text above, can also independently constitute the wireless screen projection method provided by the embodiments of the present application.
  • for example, the process shown in FIG. 5B and its corresponding text above can independently constitute the wireless screen projection method provided by an embodiment of the present application.
  • FIG. 12 is a schematic flowchart of another screen projection method provided by an embodiment of the present application.
  • while the mobile device 100 shares multimedia content with the electronic device 200 based on mirror projection, the mobile device 100 can identify the current scene and notify the electronic device 200 of that scene.
  • the electronic device 200 then adaptively selects the playback strategy corresponding to that scene to play the multimedia content.
  • the multimedia content shared between the mobile device 100 and the electronic device 200 may be network multimedia content, local multimedia content, or a combination of the two, which is not limited in this application.
  • the method may include the following steps:
  • S301-S307 are the same as S001-S007 in FIG. 5A, and will not be repeated here.
  • the application launched by the mobile device 100 may be any application that is installed.
  • the mobile device 100 may launch the video application in response to a user operation acting on the icon 601 of the video application on the main interface 61 shown in FIG. 6A .
  • the mobile device 100 may launch the game application in response to a user operation acting on the icon 602 of the game application on the main interface 61 shown in FIG. 6A .
  • a video application refers to an application program that provides audio and video services for the mobile device 100.
  • a game application refers to an application program that provides game services for the mobile device 100.
  • the mobile device 100 identifies a scene corresponding to the activated application.
  • the mobile device 100 can distinguish different scenarios according to the different applications that are started. For example, when the mobile device 100 starts a video application, it can recognize the scene of running a video application; when the mobile device 100 starts a game application, it can recognize the scene of running a game application.
  • the scene perception module or the screen projection service of the mobile device 100 may use the methods of S009 or S009 and S010 in real time or periodically to learn the application running in the foreground, thereby identifying the current scene.
  • the mobile device 100 is not limited to differentiating different scenarios according to the launched applications, and the mobile device 100 can also differentiate different scenarios according to other policies, which is not limited in this embodiment of the present application.
  • the mobile device 100 notifies the electronic device 200 of the recognized scene.
  • the mobile device 100 may carry the indication information of the recognized scene in a control instruction used by the mirror projection, so as to notify the electronic device 200 of the recognized scene.
  • the mobile device 100 may carry the indication information of the identified scene in the extension field of the RTSP control instruction.
  • for example, an extension-field value of 1 may indicate that the mobile device 100 has recognized the scene of running a video application, and a value of 0 may indicate the scene of running a game application (a message-building sketch follows below).
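  • As one hedged illustration of carrying that indication in an RTSP control instruction, the Java sketch below composes an RTSP SET_PARAMETER request whose body holds the scene value; the parameter name "x-cast-scene" and the overall framing are assumptions, since the patent only states that an extension field of the RTSP control instruction is used.

        import java.nio.charset.StandardCharsets;

        /** Sketch only: builds an RTSP SET_PARAMETER request carrying the scene flag
         *  (1 = video application scene, 0 = game application scene). */
        final class SceneNotification {
            static String buildSetParameter(String sinkRtspUrl, int cseq, boolean videoScene) {
                String body = "x-cast-scene: " + (videoScene ? 1 : 0) + "\r\n"; // invented parameter name
                int length = body.getBytes(StandardCharsets.UTF_8).length;
                return "SET_PARAMETER " + sinkRtspUrl + " RTSP/1.0\r\n"
                        + "CSeq: " + cseq + "\r\n"
                        + "Content-Type: text/parameters\r\n"
                        + "Content-Length: " + length + "\r\n"
                        + "\r\n"
                        + body;
            }
        }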
  • the electronic device 200 plays the multimedia content by using the play strategy corresponding to the scene.
  • the playback strategies corresponding to mirror projection may include: a real-time display strategy and an audio/video cache synchronization strategy.
  • when using the real-time display strategy, the electronic device 200 decodes and plays the data immediately after receiving it from the mobile device 100, giving priority to a low-latency projection experience in scenarios such as game projection.
  • when using the audio/video cache synchronization strategy, after receiving the encoded data sent by the mobile device 100, the electronic device 200 caches a certain amount of data to ensure smooth playback of the multimedia content, and automatically compares the audio and image presentation time stamps (PTS) in the multimedia content; when the difference between the audio and video timestamps exceeds a threshold, audio/video synchronization processing logic (such as double-speed audio playback or dropping image frames) is triggered to ensure the audio/video synchronization quality when the electronic device 200 plays the multimedia content (a decision sketch follows below).
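  • The Java sketch below shows the shape of that synchronization decision; the 100 ms threshold and the two corrective actions are illustrative values, not figures taken from the patent.

        /** Sketch only: audio/video cache synchronization decision on the sink side. */
        final class AvSync {
            private static final long PTS_THRESHOLD_US = 100_000; // 100 ms, illustrative

            enum Action { IN_SYNC, DROP_VIDEO_FRAMES, SPEED_UP_AUDIO }

            /** Compares the presentation time stamps (PTS) of the two streams being rendered. */
            static Action decide(long audioPtsUs, long videoPtsUs) {
                long diff = audioPtsUs - videoPtsUs;
                if (Math.abs(diff) <= PTS_THRESHOLD_US) {
                    return Action.IN_SYNC; // keep normal playback
                }
                // Audio ahead of video: drop late video frames to catch the pictures up.
                // Video ahead of audio: play audio slightly faster until the PTS values realign.
                return diff > 0 ? Action.DROP_VIDEO_FRAMES : Action.SPEED_UP_AUDIO;
            }
        }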
  • the real-time display strategy and the audio/video cache synchronization strategy are merely terms used in the embodiments of the present application; the meanings they represent are described in the embodiments, and their names do not constitute any limitation on the embodiments of the present application.
  • the real-time display sending strategy may also be referred to as other terms such as a low-latency fast display sending mode.
  • the audio and video cache synchronization strategy mentioned in the embodiments of the present application may also be referred to as other names such as cache mode in other embodiments.
  • Table 1 shows the correspondence between the scene where the mobile device 100 is located and the playback strategy used by the electronic device 200 on the premise that the mobile device 100 wirelessly projects the screen to the electronic device 200 .
  • the scenario of running the game application may correspond to the real-time display sending strategy
  • the scenario of running the video application may correspond to the audio-video cache synchronization strategy.
  • the real-time display strategy is more suitable for a scenario in which a game application is running.
  • in the scenario of running a video application, the fluency of the projected picture and the quality of audio/video synchronization are the main factors affecting the user's projection experience; therefore, the audio/video cache synchronization strategy is more suitable for that scenario.
  • the playback strategies are not limited to the real-time display strategy and the audio/video cache synchronization strategy; mirror projection may also correspond to other playback strategies, such as an ultra-low-latency mode, which is not limited in this embodiment of the present application (the Table 1 mapping is sketched in code below).
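  • The small Java sketch below simply encodes the Table 1 correspondence between the source device's scene and the sink's playback strategy; the enum and field names are invented for illustration, and further scenes or strategies could be added in the same way.

        /** Sketch only: the Table 1 scene-to-playback-strategy correspondence. */
        enum PlaybackStrategy { REAL_TIME_DISPLAY, AV_CACHE_SYNC }

        enum CastScene {
            RUNNING_GAME_APP(PlaybackStrategy.REAL_TIME_DISPLAY), // latency first
            RUNNING_VIDEO_APP(PlaybackStrategy.AV_CACHE_SYNC);    // smoothness and A/V sync first

            final PlaybackStrategy strategy;
            CastScene(PlaybackStrategy strategy) { this.strategy = strategy; }
        }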
  • the screen projection service in the electronic device 200 may instruct the screen projection player to play the multimedia content using the playback strategy corresponding to the scene identified by the mobile device 100 .
  • in this way, the electronic device 200 can adaptively select the corresponding playback strategy according to the scene where the mobile device 100 is located to play the multimedia content, that is, adjust the playback strategy according to the actual scene, so as to ensure the user's screen projection experience.
  • as shown in FIG. 13A, the mobile device 100 performs steps 1-5:
  • Step 1 The screen projection service enables the mirror projection function.
  • Step 2 The application starts.
  • Step 3 The scene perception module recognizes the scene corresponding to the started application.
  • for the implementation of step 1, refer to the operations of the mobile device 100 in S301-S307 of the method shown in FIG. 12; for the implementation of step 2, refer to S308; for the implementation of step 3, refer to S309.
  • Step 4 The scene perception module notifies the screen projection service of the identified scene.
  • Step 5 The screen projection service notifies the electronic device 200 of the scene identified by the scene perception module.
  • the screen projection service may notify the electronic device 200 of the scene after learning that the mobile device 100 has enabled the mirroring function and the scene identified by the scene perception module.
  • for the implementation of step 5, reference may be made to S310 in the method shown in FIG. 12.
  • Step 6 The screen projection service determines a playback strategy corresponding to the scene where the mobile device 100 is located, and transmits the playback strategy to the screen projection player.
  • Step 7 The screen projection player uses the playback strategy to play the multimedia content.
  • for the implementation of steps 6 and 7, reference may be made to S311 in the method shown in FIG. 12.
  • FIG. 13B shows another interaction flow between the internal modules when the mobile device 100 executes the flow shown in FIG. 12.
  • the difference between FIG. 13B and FIG. 13A is that in step 4, the scene perception module of the mobile device 100 queries the screen projection service for the screen projection status, and in step 5, the scene perception module notifies the electronic device 200 of the recognized scene.
  • the function of the scene perception module may be integrated into the screen projection service.
  • the steps performed by the scene perception module in FIG. 13A and FIG. 13B are all performed by the screen projection service, and the interaction steps between the two can also be omitted.
  • when the electronic device 200 is connected to the network, that is, when the conditions for online screen projection are met, the mobile device 100 can perform the process shown in FIG. 5C; if the electronic device 200 is not connected to the network, that is, when the electronic device 200 does not meet the conditions for online screen projection, the mobile device 100 can perform the screen projection method described in FIG. 12.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • when the computer program instructions are loaded and executed on a computer, the processes or functions described herein are produced in whole or in part.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk), and the like.
  • all or part of the processes may be completed by a computer program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium.
  • when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes any medium that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application relates to a mobile device, a wireless screen projection method, and a computer-readable storage medium. The mobile device runs a first application in the foreground and wirelessly projects its screen to an electronic device in a first wireless screen projection mode. The mobile device comprises: a processor; a memory; and a computer program stored in the memory which, when executed by the processor, causes the mobile device to perform the following: after detecting that the first application is a first-type application, the mobile device automatically outputs first prompt information, the first prompt information being used to prompt switching the first wireless screen projection mode to a second wireless screen projection mode; or, the mobile device automatically switches the first wireless screen projection mode to the second wireless screen projection mode and wirelessly projects the screen to the electronic device in the second wireless screen projection mode. The mobile device can automatically identify the scenario, determine the projection mode best suited to the application currently running in the foreground, and automatically output prompt information or automatically switch the projection mode, taking into account the characteristics of different applications and different projection modes, thereby improving user experience.

Description

无线投屏方法、移动设备及计算机可读存储介质
本申请要求于2020年10月30日提交中国专利局、申请号为202011198023.X、申请名称为“无线投屏方法、移动设备及计算机可读存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及投屏技术领域,尤其涉及无线投屏方法、移动设备及计算机可读存储介质。
背景技术
无线投屏使得诸如手机、平板等的移动设备可将本地或网络上的多媒体内容投放到PC、智慧屏等具有音频、视频等播放能力的电子设备上,在电子设备上播放上述多媒体数据。典型的无线投屏包括镜像投屏(如miracast)、在线投屏(如DLNA)等。
镜像投屏中,电子设备可以不需接入互联网或局域网,但整个过程中移动设备和电子设备均要实时地处理投放数据,比如移动设备要实时地进行投放数据的编码、发送等,电子设备要实时地进行投放数据的接收、解码等;如此导致参与设备较多,投放数据的时延较大,投屏效果易受移动设备的影响。在线投屏中,移动设备仅参与初始的网络地址传输,并不参与后续的过程,投放数据的时延较小,投屏效果不易受移动设备的影响,投屏效果较好,但需要电子设备接入互联网或局域网。
发明内容
发明人经过长期地研究发现,不同的应用侧重不同的要求,在投屏中却都采用同一投屏方式,这样带给用户的体验不好。比如,游戏应用侧重实时性,对低时延要求较高,音画同步要求相对较低;而视频应用(比如腾讯视频),侧重音画同步,实时性要求相对较低,即低时延要求相对较低。而若在投屏中都采用同一投屏方式,在用户切换不同的应用后,会使得同一投屏方式适合切换前的应用,但不适合切换后的应用。比如,用户在通过镜像投屏将手机画面投屏至智能电视(也称为大屏)上后,打开游戏应用,用手机和大屏配合打游戏,此时时延较低,用户体验较好;之后,用户将游戏应用切换至视频应用,此时会出现音画不同步的现象,用户体验不好。
为了解决上述技术问题,兼顾镜像投屏和在线投屏的特点,以及不同应用的要求,本申请提供一种无线投屏方法、移动设备及计算机可读存储介质,自动识别当前应用的类别,自动给出建议,并自动提示用户是否更改,甚至自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,从而提升用户体验。比如,用户在通过镜像投屏将手机画面投屏至大屏上后,打开游戏应用,用手机和大屏配合打游戏,此时自动识别场景,并判断出镜像投屏为最合适游戏应用的投屏方式,保持不变;之后,用户将游戏应用切换至视频应用(比如腾讯视频),此时自动识别场景,并判断出在线投屏是最适合视频应用的投屏方式,自动提示用户更改,甚至将当前的镜像投屏方式自动更改为在线投屏方式。这样,兼顾不同应用的要求,以及镜像投屏和在线投屏的特点,用户的体验较高。
第一方面,本申请提供一种移动设备,移动设备在前台运行第一应用,移动设备以第一无线投屏方式向电子设备无线投屏,移动设备包括:处理器;存储器;以及计算机程序,其中计算机程序存储在存储器上,当计算机程序被处理器执行时,使得移动设备执行以下步骤: 在检测到第一应用属于第一类应用后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,移动设备可以自动识别前台正在运行的应用的类别,并自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,从而兼顾不同应用的要求,以及镜像投屏和在线投屏的特点,提升用户体验。在该方案中,具体的执行主体为所述移动设备上当前运行的操作系统,或者所述移动设备上默认的系统级应用(比如,移动设备开机后即启动的系统级应用)。
示意性的举例说明第一方面的第三个方案。比如,移动设备的周围有电子设备1、电子设备2和电子设备3;移动设备运行第一应用,且移动设备以第一无线投屏方式向电子设备1无线投屏;在检测到第一应用属于第一类应用后,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,移动设备断开与电子设备1的无线投屏,以第二无线投屏方式,向电子设备2、电子设备3中的至少一个无线投屏。可替换地,在检测到第一应用属于第一类应用后,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,移动设备以第二无线投屏方式,向电子设备1、电子设备2和电子设备3中的至少一个无线投屏。
根据第一方面,在移动设备自动输出第一提示信息之后,移动设备还执行以下步骤:检测到第一用户输入,第一用户输入用于将第一无线投屏方式切换为第二无线投屏方式;响应于第一用户输入,移动设备自动输出一个或多个电子设备的标识,电子设备为移动设备检测到的支持第二无线投屏方式的电子设备;检测到第二用户输入,第二用户输入用于从电子设备的标识中选择一个电子设备的标识;响应于第二用户输入,移动设备将第一无线投屏方式切换为第二无线投屏方式,并以第二无线投屏方式向所选择的电子设备投屏。这样,在移动设备输出提示信息后,给用户提供了自行选择投屏方式是否切换,并选择切换后的投屏方式和接受投屏的电子设备的机会。用户可根据提示,进行选择。
根据第一方面,或者以上第一方面的任意一种实现方式,移动设备还执行以下步骤:在检测到属于第二类应用的第二应用启动后,或者,在检测到属于第二类应用的第二应用被切换为前台运行的应用后,移动设备自动输出第二提示信息,第二提示信息用于提示将第二无线投屏方式切换为第一无线投屏方式;或者,移动设备自动将第二无线投屏方式切换为第一无线投屏方式,以第一无线投屏方式向电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别再次变化后,移动设备自动识别,并自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第一方面,或者以上第一方面的任意一种实现方式,移动设备还执行以下步骤:在检测到第一应用为第三类应用后,移动设备的第一应用自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备的第一应用自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别变化后,移动设备自动识别,并自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第一方面,或者以上第一方面的任意一种实现方式,移动设备还执行以下步骤:在 检测到属于第一类应用的第一应用被切换为前台运行的应用后,或者,在检测到属于第一类应用的第三应用启动后,或者,在检测到属于第一类应用的第三应用被切换为前台运行的应用后,移动设备自动输出第三提示信息,第三提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别变化回去后,移动设备自动识别,并自动提示用户是否更改回去,或者自动将投屏方式更改回去,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第一方面,或者以上第一方面的任意一种实现方式,在移动设备自动输出第一提示信息之后,移动设备还执行以下步骤:检测到第一用户输入,第一用户输入用于将第一无线投屏方式切换为第二无线投屏方式;响应于第一用户输入,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,就提供了另外一种更改投屏方式的方案,也能提升用户体验。
根据第一方面,或者以上第一方面的任意一种实现方式,在检测到第一应用属于第一类应用后,移动设备还执行以下步骤:在检测到移动设备通过第一应用播放网络视频后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,移动设备也自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,从而兼顾不同应用的要求,以及镜像投屏和在线投屏的特点,提升用户体验。
根据第一方面,或者以上第一方面的任意一种实现方式,第一类应用为非定制类视频应用,第二类应用为游戏应用,第三类应用为定制类视频应用;第一无线投屏方式为镜像投屏方式,第二无线投屏方式为在线投屏方式;所述一个或多个电子设备包括所述电子设备,或者,所述一个或多个电子设备不包括所述电子设备;第一用户输入和第二用户输入的输入形式包括触摸输入和语音输入。这样,无需对提供视频服务的第一应用做改进,即可保证移动设备在运行第一应用时提示用户更改,甚至将当前的镜像投屏方式自动更改为在线投屏方式,提高了投屏效率,并且能够保证电子设备播放网络视频的画质,从而保证用户得到最佳的投屏体验,还可以降低成本。另外,移动设备更换投屏方式后,可以继续向原来的电子设备进行投屏,也可以更换其他电子设备进行投屏,满足用户的不同需求。
根据第一方面,或者以上第一方面的任意一种实现方式,镜像投屏的方式可以为Wi-Fi联盟制定的miracast,在线投屏的方式可以为DLNA。
根据第一方面,或者以上第一方面的任意一种实现方式,移动设备存储有白名单,白名单用于识别哪些应用属于第一类应用,白名单包括第一类应用的一个或多个应用。该白名单为预先设置的,且能够更新。
根据第一方面,或者以上第一方面的任意一种实现方式,第一提示信息包括但不限于:在显示屏上显示的界面元素、播放的音频、指示灯闪烁、马达震动等。
第二方面,提供一种移动设备。移动设备以第一无线投屏方式向电子设备无线投屏,移动设备包括:处理器;存储器;以及计算机程序,其中计算机程序存储在存储器上,当计算机程序被处理器执行时,使得移动设备执行以下步骤:在检测到第一应用启动,以及第一应用属于第一类应用后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏。
第三方面,提供一种无线投屏方法,无线投屏方法应用于移动设备,移动设备包括处理器和存储器,移动设备在前台运行第一应用,移动设备以第一无线投屏方式向电子设备无线投屏。该方法包括:在检测到第一应用属于第一类应用后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。
根据第三方面,在移动设备自动输出第一提示信息之后,无线投屏方法还包括:检测到第一用户输入,第一用户输入用于将第一无线投屏方式切换为第二无线投屏方式;响应于第一用户输入,移动设备自动输出一个或多个电子设备的标识,电子设备为移动设备检测到的支持第二无线投屏方式的电子设备;检测到第二用户输入,第二用户输入用于从电子设备的标识中选择一个电子设备的标识;响应于第二用户输入,移动设备将第一无线投屏方式切换为第二无线投屏方式,并以第二无线投屏方式向所选择的电子设备投屏。这样,在移动设备输出提示信息后,给用户提供了自行选择投屏方式是否切换,并选择切换后的投屏方式和接受投屏的电子设备的机会。用户可根据提示,进行选择。
根据第三方面,或者以上第三方面的任意一种实现方式,无线投屏方法还包括:在检测到属于第二类应用的第二应用启动后,或者,在检测到属于第二类应用的第二应用被切换为前台运行的应用后,移动设备自动输出第二提示信息,第二提示信息用于提示将第二无线投屏方式切换为第一无线投屏方式;或者,移动设备自动将第二无线投屏方式切换为第一无线投屏方式,以第一无线投屏方式向电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别再次变化后,移动设备自动识别,并自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第三方面,或者以上第三方面的任意一种实现方式,无线投屏方法还包括:在检测到第一应用为第三类应用后,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别变化后,移动设备自动识别,并自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第三方面,或者以上第三方面的任意一种实现方式,无线投屏方法还包括:在检测到属于第一类应用的第一应用被切换为前台运行的应用后,或者,在检测到属于第一类应用的第三应用启动后,或者,在检测到属于第一类应用的第三应用被切换为前台运行的应用后,移动设备自动输出第三提示信息,第三提示信息用于提示将第一无线投屏方式切换为第二无 线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏。这样,在移动设备的前台正在运行的应用类别变化回去后,移动设备自动识别,并自动提示用户是否更改回去,或者自动将投屏方式更改回去,使得更改后的投屏方式最适合当前应用的投屏,提升用户体验。
根据第三方面,或者以上第三方面的任意一种实现方式,在移动设备自动输出第一提示信息之后,无线投屏方法还包括:检测到第一用户输入,第一用户输入用于将第一无线投屏方式切换为第二无线投屏方式;响应于第一用户输入,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,就提供了另外一种更改投屏方式的方案,也能提升用户体验。
根据第三方面,或者以上第三方面的任意一种实现方式,在检测到第一应用属于第一类应用后,无线投屏方法还包括:在检测到移动设备通过第一应用播放网络视频后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向移动设备检测到的支持第二无线投屏方式的一个或多个电子设备无线投屏。这样,移动设备也自动提示用户是否更改,或者自动更改投屏方式,使得更改后的投屏方式最适合当前应用的投屏,从而兼顾不同应用的要求,以及镜像投屏和在线投屏的特点,提升用户体验。
根据第三方面,或者以上第三方面的任意一种实现方式,第一类应用为非定制类视频应用,第二类应用为游戏应用,第三类应用为定制类视频应用;第一无线投屏方式为镜像投屏方式,第二无线投屏方式为在线投屏方式;所述一个或多个电子设备包括所述电子设备,或者,所述一个或多个电子设备不包括所述电子设备;第一用户输入和第二用户输入的输入形式包括触摸输入和语音输入。这样,无需对提供视频服务的第一应用做改进,即可保证移动设备在运行第一应用时提示用户更改,甚至将当前的镜像投屏方式自动更改为在线投屏方式,提高了投屏效率,并且能够保证电子设备播放网络视频的画质,从而保证用户得到最佳的投屏体验,还可以降低成本。另外,移动设备更换投屏方式后,可以继续向原来的电子设备进行投屏,也可以更换其他电子设备进行投屏,满足用户的不同需求。
根据第三方面,或者以上第三方面的任意一种实现方式,镜像投屏的方式可以为Wi-Fi联盟制定的miracast,在线投屏的方式可以为DLNA。
根据第三方面,或者以上第三方面的任意一种实现方式,移动设备存储有白名单,白名单用于识别哪些应用属于第一类应用,白名单包括属于第一类应用的一个或多个应用。该白名单为预先设置的,且能够更新。白名单可为用户自行添加、删减等设置。
根据第三方面,或者以上第三方面的任意一种实现方式,第一提示信息包括但不限于:在显示屏上显示的界面元素、播放的音频、指示灯闪烁、马达震动等。
第三方面及第三方面的任意一种实现方式分别与第一方面及第一方面的任意一种实现方式相对应。第三方面以及第三方面中任意一种实现方式所对应的技术效果可参见上述第一方面以及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第四方面,提供一种无线投屏方法。无线投屏方法应用于移动设备,移动设备包括处理器和存储器,移动设备以第一无线投屏方式向电子设备无线投屏,该方法包括:在检测到第一应用启动,以及所述第一应用属于第一类应用后,移动设备自动输出第一提示信息,第一提示信息用于提示将第一无线投屏方式切换为第二无线投屏方式;或者,移动设备自动将第一无线投屏方式切换为第二无线投屏方式,以第二无线投屏方式向电子设备无线投屏。
第五方面,本申请提供一种无线投屏方法。无线投屏方法应用于移动设备,移动设备包括处理器和存储器,移动设备以第一无线投屏方式向电子设备无线投屏;无线投屏方法包括:移动设备在前台运行第一应用,移动设备识别在前台正在运行的应用所对应的场景,并将该场景通知给电子设备,以使得电子设备使用与该场景对应的播放策略来播放该多媒体内容。
第六方面,本申请实施例提供了一种投屏方法,应用于电子设备。该方法可包括:该电子设备以第一无线投屏方式接收移动设备发送的多媒体内容;接收到移动设备发送的该移动设备识别到的场景;使用和该场景对应的播放策略播放该多媒体内容。
第七方面,本申请提供一种计算机可读存储介质,包括计算机程序,其特征在于,当所述计算机程序在移动设备上运行时,使得所述移动设备执行第三方面、第四方面、第五方面以及第三方面中任意一种实现方式的方法。
第八方面,本申请提供一种计算机程序产品,当计算机程序产品在移动设备上运行时,使得移动设备执行第三方面、第四方面、第五方面以及第三方面的任意一种实现方式的方法。
本申请提供的技术方案,是从系统侧进行改进,无需第三方应用做任何适配。另外,本申请提供的技术方案,根据前台正在运行的应用的类型,甚至还根据前台正在运行的应用是否播放网络视频等,来自动选择最优投屏方式,或者,自动输出提示信息以让用户能够自主选择,使得移动设备投屏的效果得到最好,提升用户的投屏体验,还降低成本,提高投屏效率。
附图说明
图1A为提供的移动设备和电子设备基于镜像投屏共享网络视频的原理示意图;
图1B为提供的移动设备和电子设备基于在线投屏共享网络视频的原理示意图;
图2为本申请实施例提供的无线投屏方法的场景示意图;
图3A为本申请实施例提供的移动设备的硬件结构示意图;
图3B为本申请实施例提供的移动设备的软件结构示意图;
图4为本申请实施例提供的电子设备的硬件结构示意图;
图5A-图5C为本申请实施例提供的一种无线投屏方法的流程示意图;
图6A-图6C、图7A-图7C、图8A-图8H为本申请实施例提供的一种无线投屏方法中移动设备的用户界面示意图;
图9为本申请实施例提供的一种无线投屏方法中电子设备的用户界面示意图;
图10A-图10B为本申请实施例提供的一种无线投屏方法中移动设备的内部模块交互示意图;
图11为本申请实施例提供的一种无线投屏方法中移动设备的内部模块交互示意图;
图12为本申请实施例提供的另一种无线投屏方法的流程示意图;
图13A-图13B为本申请实施例提供的另一种无线投屏方法中移动设备和电子设备的内部模块交互示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思。例如,A/B可以表示A或B。文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系。例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
在本申请实施例中,“示例性地”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性地”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性地”或者“例如”等词旨在以具体方式呈现相关概念。
本申请以下实施例中的术语“用户界面(user interface,UI)”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面是通过java、可扩展标记语言(extensible markup language,XML)等特定计算机语言编写的源代码,界面源代码在移动设备或电子设备上经过解析,渲染,最终呈现为用户可以识别的内容。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在移动设备或电子设备的显示屏中显示的文本、图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
在本申请实施例中,通过无线投屏,可以将移动设备显示的内容(比如播放的多媒体内容)在电子设备上显示(比如继续播放多媒体内容),之后移动设备可以继续显示该内容,也可以不再显示该内容。无线投屏也可以采用其他词语替换,比如多屏互动等;本申请实施例不作限制。其中,移动设备也可以被称为输出端或源端(source端),电子设备也可以被称为输入端或接收端(sink端)。
无线投屏之后,电子设备显示或播放的多媒体内容可包括以下任意一项或多项:视频、文字、图片、照片、音频或表格等。例如,多媒体内容可以是电影、电视剧、短视频、音乐剧等。
多媒体内容可以是网络多媒体内容,也可以是本地多媒体内容,还可以是网络多媒体内容和本地多媒体内容的组合。其中,网络多媒体内容是指移动设备从网络中获取到的多媒体内容。例如移动设备运行视频应用时从提供音视频服务的服务器处获取到的视频。本地多媒体内容是指移动设备本地存储或生成的多媒体内容。例如,移动设备本地存储的图片或表格等。
首先,介绍本申请实施例涉及的两种投屏方式:镜像投屏(mirroring technology)和在线投屏。
1、镜像投屏:移动设备(如智能手机、平板电脑等)与电子设备(如智能电视、智能屏等)建立通信连接,该移动设备从网络侧或本地侧获取多媒体内容,然后将所述多媒体内容经过编码后,以点对点传输等无线传输的方式传输给电子设备;电子设备经过解码后进行输 出(如显示、播放等)。
其中,所述多媒体内容可以是网络多媒体内容、本地多媒体内容或两者的组合。所述移动设备和电子设备之间的点对点传输方式可包括但不限于:无线保真直连(wireless fidelity direct,Wi-Fi direct)(又称为无线保真点对点(wireless fidelity peer-to-peer,Wi-Fi P2P))通信连接、蓝牙通信连接、近场通信(near field communication,NFC)连接等。
镜像投屏可包括Wi-Fi联盟制定的miracast、各公司制定的私有镜像投屏解决方案例如华为公司的cast+、苹果公司的AirPlay等。其中,miracast建立在无线保真Wi-Fi联盟所发展的基础技术标准以及实时流传输协议(real time streaming protocol,RTSP)之上。Wi-Fi基础技术标准可包括无线传输技术802.11n、802.11ac、Wi-Fi direct/Wi-Fi P2P、通道直接链路建立(tunneled direct link setup,TDLS)、管理安全的WPA2(Wi-Fiprotected access 2)加密、提供服务质量及流量管理的WMM(Wi-Fimultimedia)技术等。
下面结合图1A,对镜像投屏进一步阐述说明。图1A示例性示出了移动设备和电子设备基于镜像投屏共享网络视频的场景。如图1A所示,移动设备与电子设备建立通信连接(如Wi-FiP2P连接)。同时,移动设备接入路由设备,并通过路由设备,从服务器处获取到流媒体。其中,移动设备接入路由设备,具体可为移动设备接入路由设备提供的接入点(access point,AP)。之后,移动设备在本身播放流媒体的过程中通过录屏、录音等方式获取到录屏内容以及录音内容,然后实时地将所述录屏内容和所述录音内容,分别经编码后,通过诸如Wi-Fi P2P连接发送给电子设备;电子设备接收到后,实时播放、显示。
从图1A可以看出,基于镜像投屏共享网络视频时,电子设备接收到的网络视频经过了多次的网络传递以及编解码转换,过程繁琐。并且,电子设备播放网络视频的画质(例如分辨率)受到移动设备的限制,播放效果可能较差,影响用户体验。不过,镜像投屏的音画同步效果较好。
在一些实施方式中,镜像投屏也可以被称为诸如全分享投屏、无线显示等。
2、在线投屏:移动设备和电子设备均接入互联网或局域网,移动设备仅将欲投屏的多媒体资源对应的网络地址,比如统一资源定位符(uniform resource locator,URL),发送给电子设备;电子设备根据该网络地址,从互联网侧或局域网侧获取对应的多媒体内容,从而进行输出(如播放、显示)。
其中,多媒体内容可以是网络多媒体内容、本地多媒体内容或两者的组合。这里,电子设备可以和移动设备接入同一个无线Wi-Fi接入点AP组建的局域网,也可以和移动设备连接不同的网络,比如接入不同的AP组建的不同的局域网,该不同的局域网通过互联网互联。
在线投屏可包括DLNA,以及各公司制定的私有在线投屏解决方案。比如,谷歌公司的Google cast、苹果公司的AirPlay等。其中,DLNA建立在通用即插即用(universal plug and play,UPnP)协议之上。
下面结合图1B,对在线投屏进一步阐述说明。图1B示例性示出了移动设备和电子设备基于在线投屏共享网络视频的场景。如图1B所示,移动设备和电子设备共同接入由Wi-Fi AP组建的局域网中,移动设备将播放的网络视频的网址通过AP发送给电子设备,然后电子设备根据该网址获取多媒体内容。该网址为提供网络音视频服务的服务器的地址。基于在线投屏共享网络视频时,电子设备可以直接从网络侧获取多媒体内容,无需对网络视频进行多次网络传递和编解码转换,过程简便,电子设备播放网络视频时可以有较好的播放效果,用户 体验较佳。不过,相对于镜像投屏而言,在线投屏的音画同步相对较差。
在一些实施方式中,在线投屏也可以被称为网络投屏等。
在本申请实施例中,镜像投屏方式可以被称为第一无线投屏方式,在线投屏方式可以被称为第二无线投屏方式。
图2为本申请实施例提供的无线投屏方法的场景示意图。如图2所示,移动设备100可以通过镜像投屏向电子设备200投放多媒体内容,也可以通过在线投屏向电子设备200投放多媒体内容。
在移动设备100通过镜像投屏向电子设备200投放多媒体内容时,移动设备100与电子设备200之间可以建立Wi-FiP2P连接;也可以建立其他的短距离通信直接连接,比如蓝牙、ZigBee等。在镜像投屏中,移动设备100投放的多媒体内容可以来自自身;也可以来自服务器400,此时移动设备100经Wi-Fi接入点300连接服务器400。
在移动设备100通过在线投屏向电子设备200投放多媒体内容时,移动设备100和电子设备200都可以接入Wi-Fi接入点300,从而处于同一个局域网内。在其他一些实施例中,移动设备100和电子设备200也可以接入不同的网络,本申请实施例对此不作限制。需要强调的是,在没有特殊说明的情况下,本申请以下实施例中移动设备100和电子设备200均通过Wi-Fi接入点300接入同一个局域网。在线投屏中,移动设备100投放的多媒体内容来自服务器400。
服务器400提供网络音视频服务。示例性地,服务器400可以为存储有多种多样的多媒体内容的服务器。比如,服务器400可以为提供音视频服务的腾讯视频服务器。服务器400的数量可以为一个,也可以为多个。
在一些实施例中,移动设备100在通过镜像投屏与电子设备200共享网络视频的过程中,可以自动或者在用户的触发下,将镜像投屏切换为在线投屏,通过在线投屏继续与电子设备200共享该网络视频。示例性地,该网络视频可以是移动设备100运行视频应用程序(application,APP)时从服务器400处获取到的。
在一些实施例中,移动设备100将镜像投屏切换为在线投屏后,在一些情况下,移动设备100还可以将在线投屏切换回镜像投屏。
本申请实施例的移动设备包括但不限于智能手机、平板电脑、个人数字助理(personal digital assistant,PDA)、具备无线通讯功能的可穿戴电子设备(如智能手表、智能眼镜)等。移动设备的示例性实施例包括但不限于搭载
Figure PCTCN2021124895-appb-000001
Linux或者其它操作系统的便携式电子设备。上述移动设备也可为其它便携式电子设备,诸如膝上型计算机(Laptop)等。还应当理解的是,在其他一些实施例中,上述移动设备也可以不是便携式电子设备,而是台式计算机。
示例性地,图3A示出了本申请实施例提供的移动设备100的硬件结构。如图3A所示,移动设备100可以包括:处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传 感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对移动设备100的具体限定。在本申请另一些实施例中,移动设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
移动设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。移动设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在移动设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在移动设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),NFC,红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,移动设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得移动设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple  access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
无线通信模块160可用于和电子设备200建立通信连接(例如Wi-Fi直连通信连接、蓝牙通信连接等),并基于该通信连接将移动设备100通过录屏、录音采集到的数据编码后发送给电子设备200。即,无线通信模块160可支持移动设备100和电子设备200之间基于镜像投屏(如miracast)来共享多媒体内容。
无线通信模块160还可以接入Wi-Fi接入点300组建的局域网或其他网络,并可以将当前播放的多媒体内容的网址通过网络发送给电子设备200。之后,电子设备200可以通过该网址直接获取多媒体内容。即,无线通信模块160可支持移动设备100和电子设备200之间基于在线投屏(如DLNA)来共享多媒体内容。
在本申请的一些实施例中,处理器110用于在移动设备100基于镜像投屏和电子设备200共享多媒体内容时,识别当前场景,并通过无线通信模块160将该场景通知给电子设备200,以使得电子设备200根据该场景适应性选择对应的播放策略来播放多媒体内容。处理器110识别当前场景的方式、无线通信模块160通知电子设备200当前场景的方式,可参考后续方法实施例的相关描述,在此暂不赘述。
在本申请的另一些实施例中,处理器110还用于在移动设备100基于镜像投屏和电子设备200共享网络视频的过程中,自动或者在用户的触发下,指示无线通信模块160将镜像投屏切换为在线投屏。无线通信模块160将镜像投屏切换为在线投屏的具体实现,可参考后续方法实施例的相关描述,在此暂不赘述。
在一些实施例中,无线通信模块160将镜像投屏切换为在线投屏后,在一些情况下,处理器110还可用于指示无线通信模块160将在线投屏切换回镜像投屏。无线通信模块160将在线投屏切换回镜像投屏的情况可参考后续方法实施例的相关描述。
移动设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或自动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,移动设备100可以包括1个或N个显示屏194,N为大于1的正整数。
在本申请实施例中,显示屏194用于显示本申请实施例提及的在移动设备100上实现的用户界面。该用户界面的具体实现可参考后续方法实施例的相关描述。
视频编解码器用于对数字视频压缩或解压缩。移动设备100可以支持一种或多种视频编 解码器。这样,移动设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
移动设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于移动设备100的表面,与显示屏194所处的位置不同。
移动设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明移动设备100的软件结构。图3B是本申请实施例提供的移动设备100的一种软件结构的示意性框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。应用程序层可以包括一系列应用程序包。
如图3B所示,应用程序包可以包括投屏服务、视频应用程序、游戏应用程序、办公类应用程序、相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,短信息等应用程序。为了描述简便,后续将视频应用程序简称为视频应用,将游戏应用程序简称为游戏应用。
投屏服务提供移动设备100的镜像投屏功能。投屏服务支持移动设备100基于镜像投屏和电子设备200共享多媒体内容。投屏服务可调用移动设备100的无线通信模块160来提供镜像投屏功能。
视频应用程序可以称为视频应用,用于为移动设备100提供音视频服务。移动设备100可以运行视频应用,并从该视频应用对应的服务器中获取网络视频。视频应用的数量可以为一个或多个。比如,视频应用可包括腾讯视频。
视频应用可以提供在线投屏功能。视频应用支持移动设备100通过在线投屏与电子设备200共享多媒体内容。具体地,移动设备100在运行视频应用且播放其中的网络视频中,若用户开启该视频应用的在线投屏功能,则移动设备100可以将该网络视频的网址发送给电子设备200。
游戏应用程序可以称为游戏应用,用于为移动设备100提供游戏服务。移动设备100可以运行游戏应用,并从本地或者该游戏应用对应的服务器中获取游戏资源。游戏应用的数量可以为一个或多个。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3B所示,应用程序框架层可以包括场景感知模块、窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
具体的,场景感知模块用于侦听当前使用的应用程序的操作,并据此识别出移动设备100当前使用的应用程序,从而确定移动设备100所处的场景。场景感知模块为可选模块。在一些实施例中,场景感知模块的功能可以集成到应用程序层的投屏服务中。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
需要强调的是,图3B仅为示意性举例;本申请实施例提供的移动设备100的软件结构还可采用其他的软件架构,比如
Figure PCTCN2021124895-appb-000002
Linux或者其它操作系统的软件架构。
本申请实施例的电子设备包括但不限于平板电脑、台式计算机、便携式电子设备(如膝上型计算机,Laptop)、智能电视(如智慧屏)、车载电脑、智能音箱、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、带有显示屏的电子广告牌、单独使用(比如,投影在墙壁上)或与显示装置(比如幕布)组合使用的投影仪、其他带有显示屏的智能设备、以及其他带有扬声器的智能设备等。电子设备的示例性实施例包括但不限于搭载
Figure PCTCN2021124895-appb-000003
Linux或者其它操作系统的便携式电子设备。在一些实施例中,电子设备200可以为配置有电视盒子的电视机,电视盒子用于接收来自移动设备100或服务器400的多媒体内容并提供投屏功能,电视机仅提供显示功能。在一些实施例中,电子设备200还可以与遥控器配合使用。遥控器与电子设备200之间可以通过红外线信号通信。
示例性地,图4示出了本申请实施例提供的电子设备200的硬件结构。如图4所示,电 子设备200可包括:视频编解码器221、处理器222、存储器223、无线通信处理模块224、电源开关225、有线LAN通信处理模块226、高清晰度多媒体接口(high definition multimedia interface,HDMI)通信处理模块227、USB通信处理模块228、显示屏229、音频模块230。各个模块可通过总线连接。其中:
处理器222可用于读取和执行计算机可读指令。具体实现中,处理器222可主要包括控制器、运算器和寄存器。其中,控制器主要负责指令译码,并为指令对应的操作发出控制信号。运算器主要负责执行定点或浮点算数运算操作、移位操作以及逻辑操作等,也可以执行地址运算和转换。寄存器主要负责保存指令执行过程中临时存放的寄存器操作数和中间操作结果等。具体实现中,处理器222的硬件架构可以是专用集成电路(ASIC)架构、MIPS架构、ARM架构或者NP架构等。
无线通信处理模块224可以包括WLAN通信处理模块224A,还可包括蓝牙(BT)通信处理模块224B、NFC处理模块224C、蜂窝移动通信处理模块(未示出)等。
在一些实施例中,无线通信处理模块224可用于与移动设备100建立通信连接,并基于该通信连接接收到移动设备100发送的经过编码的数据。例如,WLAN通信处理模块224A可用于与移动设备100建立Wi-Fi直连通信连接,蓝牙(BT)通信处理模块224B可用于与移动设备200建立蓝牙通信连接,NFC处理模块224C可用于与移动设备100建立NFC连接等。即,无线通信处理模块224可支持移动设备100与移动设备200之间通过镜像投屏(如miracast)来共享多媒体内容。
在一种实施方式中,无线通信处理模块224可以监听到移动设备100发射的信号如探测请求、扫描信号,发现移动设备100,并与移动设备100建立通信连接。在另一种实施方式中,无线通信处理模块224也可以发射信号,如探测请求、扫描信号,使得电子设备200可以发现移动设备100,并与移动设备100建立通信连接(如Wi-FiP2P连接)。
在一些实施例中,移动设备100与电子设备200之间通过镜像投屏(如miracast)来共享多媒体内容时,无线通信处理模块224(如WLAN通信处理模块224A)还可以接收到移动设备100通知的场景。处理器222可解析并获知该场景,并自适应地选择与该场景对应的播放策略,并以该播放策略来调用显示屏229、音频模块230等模块播放移动设备100发送的多媒体内容。
在一些实施例中,无线通信处理模块224(如WLAN通信处理模块224A)还可以接入Wi-Fi接入点300组建的局域网或其他网络,并通过Wi-Fi接入点300接收到移动设备100发送的网络视频的网址,之后可以直接从该网址对应的服务器处获取该网络视频。即,WLAN通信处理模块224A可支持移动设备100与电子设备200之间通过在线投屏(如DLNA)来共享网络视频。
视频编解码器221用于对数字视频压缩或解压缩。在本申请实施例中,视频编解码器221可以对来自移动设备100的或者服务器400的多媒体内容进行解压缩。电子设备200可以支持一种或多种视频编解码器,可以播放一种或多种编码格式的视频。例如:MPEG1,MPEG2,MPEG3,MPEG4等。
处理器222可以用于解析无线通信处理模块224接收到的信号,如电子设备200的广播的探测请求等。处理器222可以用于根据解析结果进行相应的处理操作,如生成探测响应,等。处理器222可用于根据视频编解码器221的解压缩结果来驱动显示屏229执行显示。
存储器223与处理器222耦合,用于存储各种软件程序和/或多组指令。具体实现中,存储器223可包括高速随机存取的存储器,并且也可包括非易失性存储器,例如一个或多个磁盘存储设备、闪存设备或其他非易失性固态存储设备。存储器223可以存储操作系统,例如uCOS、VxWorks、RTLinux、Harmony、Android等嵌入式操作系统。存储器223还可以存储通信程序,该通信程序可用于与电子设备200,一个或多个服务器,或附加设备进行通信。
电源开关225可用于控制电源向电子设备200的供电。
有线LAN通信处理模块226可用于通过有线LAN和同一个LAN中的其他设备进行通信,还可用于通过有线LAN连接到WAN,可与WAN中的设备通信。
HDMI通信处理模块227可用于通过HDMI接口(未示出)与其他设备进行通信。
USB通信处理模块228可用于通过USB接口(未示出)与其他设备进行通信。
显示屏229可用于显示图像,视频等。显示屏229可以采用LCD、OLED、AMOLED、FLED、QLED等显示屏。显示屏229所显示的内容可参考后续方法实施例的相关描述。
音频模块230可用于通过音频输出接口输出音频信号,这样可使得电子设备200支持音频播放。音频模块230还可用于通过音频输入接口接收音频数据。音频模块230可包括但不限于:麦克风、扬声器、受话器等。
在一些实施例中,电子设备200还可以包括RS-232接口等串行接口。该串行接口可连接至其他设备,如音箱等音频外放设备,使得显示器和音频外放设备协作播放音视频。
可以理解的是图4示意的结构并不构成对电子设备200的具体限定。在本申请另一些实施例中,电子设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
电子设备200的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构等。示例性地,电子设备200的软件系统包括但不限于
Figure PCTCN2021124895-appb-000004
Figure PCTCN2021124895-appb-000005
Linux或者其它操作系统。
Figure PCTCN2021124895-appb-000006
为华为的鸿蒙系统。
电子设备200的软件系统的应用层可包括投屏服务和投屏播放器。投屏服务支持电子设备200通过镜像投屏、在线投屏接收移动设备100投放的多媒体内容。具体的,投屏服务可调用无线通信处理模块224来提供镜像投屏功能和在线投屏功能。投屏播放器用于播放来自移动设备100或服务器400的多媒体内容。
在一些实施例中,电子设备200和移动设备100之间通过镜像投屏共享多媒体内容时,投屏服务可以根据移动设备100当前所处的场景指示投屏播放器按照对应的播放策略来播放该多媒体内容。
下面详细描述本申请实施例提供的无线投屏方法。在本申请实施例中,以有无定制在线投屏方式为基准,可以将视频应用分为:定制类视频应用和非定制类视频应用。其中,定制类视频应用是指应用本身具有发起投屏能力的应用。示例性地,定制类视频应用本身集成了软件开发工具包(software development kit,SDK)。比如,腾讯视频。移动设备100的投屏状态可包括:移动设备100当前是否通过镜像投屏与电子设备200共享多媒体内容。非定制类视频应用是指应用本身不具有发起投屏能力的应用。示例性地,非定制类视频应用本身未集成SDK。
没有配置能够查询当下设备的当前投屏状态的SDK的视频应用。
上文已经阐明,发明人经过长期地研究发现,不同的应用侧重不同的要求。比如,游戏应用侧重实时性,对低时延要求较高,音画同步要求相对较低;而视频应用,侧重音画同步,实时性要求相对较低,即低时延要求相对较低。因此,本申请实施例提供的无线投屏方法可自动识别当前前台正在运行的应用的类别,并提示用户选择建议的无线投屏方式,或自动更改合适的无线投屏方式。
图5A为本申请实施例提供的一种无线投屏方法的流程示意图。如图5A所示,该方法可包括:
S001-S007,移动设备100通过镜像投屏,与电子设备200共享多媒体内容。
S001,移动设备100检测到开启镜像投屏功能的用户操作。
图6A及图6B示例性示出了移动设备100检测到的开启镜像投屏功能的用户操作。图6A示出了移动设备100上的用于展示已安装应用程序的示例性用户界面61。该用户界面61显示有:状态栏、日历指示符、天气指示符、具有常用应用程序图标的托盘、导航栏、视频应用的图标601、游戏应用的图标602以及其他应用程序的图标等。其中,状态栏可包括:移动通信信号(又可称为蜂窝信号)的一个或多个信号强度指示符、运营商名称(例如“中国移动”)、Wi-Fi信号的一个或多个信号强度指示符,电池状态指示符、时间指示符等。导航栏可包括返回键、主屏幕键、多任务键等系统导航键。在一些实施例中,图6A示例性所示的用户界面61可以为主界面(Home screen)。
如图6A及图6B所示,当移动设备100检测到在显示屏上的向下滑动手势时,响应于该滑动手势,移动设备100在用户界面61上显示窗口603。如图6B所示,窗口603中可以显示有控件603a,控件603a可接收开启/关闭移动设备100的镜像投屏功能的操作(例如触摸操作、点击操作)。控件603a的表现形式可以包括图标和/或文本(例如文本“镜像投屏”、“无线投屏”、“多屏互动”等)。窗口603中还可以显示有其他功能例如Wi-Fi、蓝牙、手电筒等的开关控件。如图6B所示,移动设备100可以检测到作用于控件603a的用户操作,即检测到开启镜像投屏功能的用户操作。在一些实施例中,移动设备100检测到作用于控件603a的用户操作后,可以更改控件603a的显示形式,例如增加显示控件603a时的阴影等。
不限于在图6A所示的主界面上,用户还可以在其他界面上输入向下滑动的手势,触发移动设备100显示窗口。
不限于图6A及图6B示出的用户在窗口603中作用于控件603a的用户操作,在本申请实施例中,开启镜像投屏功能的用户操作还可以实现为其他形式,本申请实施例不作限制。
例如,移动设备100还可以显示设置(settings)应用提供的设置界面,该设置界面中可包括提供给用户的用于开启/关闭移动设备100的镜像投屏功能的控件,用户可通过在该控件上输入用户操作来开启移动设备100的镜像投屏功能。
又例如,用户还可以将移动设备100贴近电子设备200的NFC标签,触发移动设备100开启镜像投屏功能。
S002,移动设备100发现附近的电子设备。
检测到S001中开启镜像投屏功能的用户操作,移动设备100开启无线通信模块160中的Wi-Fi直连(图中未示出)、蓝牙或NFC中的一项或多项,并通过Wi-Fi直连、蓝牙、NFC中一项或多项发现该移动设备100附近的可投屏的电子设备。例如,移动设备100可以通过Wi-Fi直连发现附近的电子设备200以及其他电子设备。
S003,移动设备100显示发现的附近的电子设备的标识。
除了显示移动设备100发现的可接受镜像投屏的电子设备的标识,移动设备100还可以显示其他信息,例如发现的电子设备的图像等,本申请实施例不作限制。
之后,示例性地,如图6C所示,移动设备上弹出窗口605。窗口605包括:界面指示符605a、图标605b、一个或多个电子设备的图像605c和标识605d。
本申请实施例不限定S002和S003的先后顺序,两者可以同时执行,也可以先后执行。在移动设备100还未发现附近的电子设备时,窗口605中显示的电子设备的数量为0。
S004,移动设备100检测到选择电子设备200的用户操作。
示例性地,如图6C所示,选择电子设备200的用户操作可以是作用于电子设备200对应的图像605c和/或标识605d上的用户操作。选择电子设备200的用户操作还可以实现为其他形式,本申请实施例不作限制。
S005,响应于检测到的选择电子设备200的用户操作,移动设备100和电子设备200建立通信连接。
响应于该用户操作,移动设备100可以通过Wi-Fi直连、蓝牙、NFC中一项或多项无线通信技术和电子设备200建立通信连接。例如,移动设备100和电子设备200建立Wi-Fi直连通信连接。移动设备100和电子设备200建立通信连接后,可以基于该通信连接进行能力协商,包括双方支持的编码格式、分辨率、音频格式等,便于后续执行的多媒体内容的传输。
S006,移动设备100基于与电子设备200之间的通信连接,将当前显示的多媒体内容发送给电子设备200。
具体的,移动设备100可以通过录屏、录音等方式获取当前显示的多媒体内容(包含图像和/或音频),然后将获取的多媒体内容压缩后,通过和电子设备200之间的通信连接发送给电子设备200。以移动设备100和电子设备200基于miracast共享多媒体内容为例,移动设备100可以根据miracast协议中的规定,通过录屏的方式获取显示屏显示的图像,使用H.264编码算法对该图像进行压缩;采集移动设备100所播放的音频,使用高级音频编码(advanced audio coding,AAC)算法对该音频进行压缩;然后将压缩后的音频数据和图像数据封装为传输流(transport stream,TS),之后对TS流按照实时传送协议(real-time transport protocol,RTP)进行编码并将编码后得到的数据通过Wi-Fi直连连接发送给电子设备200。即,该多媒体内容通过流媒体的方式传输。
S007,电子设备200播放接收到的多媒体内容。
电子设备200接收到移动设备100基于通信连接发送的多媒体内容后,可以对该多媒体内容执行解码处理,从而获取多媒体内容。以移动设备100和电子设备200基于miracast共享多媒体内容为例,电子设备200可以基于与移动设备100之间的Wi-Fi直连通信连接接收到RTP编码的TS流,并可以按顺序对其执行RTP解码、TS流解封装、音画质处理/时延同步处理,最后输出音视频,即播放多媒体内容。可理解的,S006、S007将会持续执行,直至移动设备100关闭镜像投屏功能、开启在线投屏功能等。
电子设备200执行S007期间,电子设备200播放的多媒体内容和移动设备100播放的多媒体内容相同。在移动设备100播放的多媒体内容随着用户操作而变化时,电子设备200播放的多媒体内容也随之变化。
可理解的,S001-S007所示的移动设备100和电子设备200基于镜像投屏共享多媒体内 容的过程仅为示例;还可为其他实施方式,此处不再赘述。
S008-S012,移动设备100根据当前启动的应用的类别,提示用户选择或自动选择合适的投屏方式。
S008,移动设备100启动应用程序,或切换应用程序。
移动设备100启动的应用程序取决于用户,可以为游戏应用,也可以为视频应用,视频应用可以包括非定制类视频应用和非定制类视频应用。
移动设备100可以响应于在图6A所示的用户界面61中的应用程序的图标上检测到的用户操作(例如点击操作、触摸操作等),启动该图标对应的应用程序,还可以响应于其他用户操作(例如语音指令)启动对应的应用程序,此处不做限制。示例性地,如图7A-图7B所示,移动设备100可以响应于作用于主界面上的视频应用的图标601上的用户操作,启动视频应用。图7B示例性示出了移动设备100启动视频应用后所显示的用户界面71。该用户界面71是视频应用提供的主页面。如图7B所示,用户界面71中显示有一个或多个视频图像701。视频的图像可以是动态的,也可以是静态的。此外,用户界面71还可以显示有底部菜单栏、搜索框、子频道入口等,本申请实施例对此不作限制。如图7B所示,移动设备100可以检测到作用于视频图像701上的用户操作,通过网络从视频应用对应的服务器中获取该视频图像701所指示的网络视频,并播放该网络视频。监听到用户操作的视频图像701所指示的网络视频,即为用户选择的网络视频。以下实施例将以视频应用对应服务器400为例进行说明。也即是说,移动设备100获取到的网络视频的网址为服务器400的地址。
图7C示例性地示出了移动设备100播放用户选择的网络视频时所显示的用户界面72。该用户界面72可以是移动设备100响应于用户将移动设备100由竖屏状态切换为横屏状态的动作,或者,用户点击移动设备100所显示用户界面右下角所显示的全屏播放的控件而显示的。在一些实施例中,用户界面72中还可包括在线投屏的开关控件702,控件702用于监听开启/关闭视频应用的在线投屏功能的用户操作(例如点击操作、触摸操作等)。如图7C所示,控件702的显示状态表示当前移动设备100的在线投屏功能开启。
S009,移动设备100判断启动的应用程序,或切换后的应用程序,是否为视频应用。
具体的,移动设备100可以侦听前台正在运行的应用程序或当前使用窗口对应的应用程序,并据此识别到移动设备100在前台正在运行的应用程序是否为视频应用。若为视频应用,执行S010。
启动、切换应用程序后的具体识别,可以结合图10A、图10B阐述。图10A、图10B的场景感知模块可通过操作系统调用API感知应用启动、应用切换。比如,以感知应用切换为例,场景感知模块感知发送“android.ActivityState.CHANGE”消息订阅应用切换事件。场景感知模块感知到应用切换后,再调用操作系统提供的API查询用户看到的顶层APP名称。比如,通过ActivityManager提供的API查询任务及任务对应的PackageName。场景感知模块感知根据查询到的应用名称是否在预制的数据库或表格中判断是否是视频应用。该数据库或该表格可根据需要由用户增加、删减、更新等。
S010,移动设备100判断启动的视频应用是否为非定制类视频应用。
在一些实施例中,移动设备100可以存储有白名单,白名单可包括一个或多个非定制类视频应用。这样,当前台正在运行的应用程序在白名单中时,移动设备100可以确定移动设备100在前台运行的该应用程序为非定制类视频应用。白名单中的非定制类视频应用可以安 装在移动设备100中,也可以未安装在移动设备100中。移动设备100可以根据需要更新白名单。
若为非定制类视频应用,执行S011;若为定制类视频应用,执行S012。
S011,执行图5B的流程;
S012,执行图5C的流程。
接下来,介绍图5B的流程。图5B描述了针对非定制类视频应用时,将移动设备100当前使用的镜像投屏切换为在线投屏的流程。在图5B中,移动设备100通过镜像投屏与电子设备200共享非定制类视频应用提供的网络视频时,可以在用户的触发下,将镜像投屏切换为在线投屏,基于在线投屏共享该网络视频。图5B示出的流程,可包括:
S101,移动设备100输出提示信息,该提示信息用于提示用户将镜像投屏切换为在线投屏。
在S101中,提示信息的实现形式可包括但不限于:移动设备100在显示屏上显示的界面元素、播放的音频、指示灯闪烁、马达震动等。移动设备100输出提示信息的场景可包括以下3种场景:
场景1、移动设备100启动视频应用后,即输出提示信息。
移动设备100可以采用S009,或者S009和S010的方式,在识别到移动设备100在前台运行的应用程序为非定制类视频应用时,移动设备100输出提示信息。
图8A示例性地示出了移动设备100在场景1中显示的提示信息。该提示信息为窗口703。窗口703可包括:文本703a。文本703a例如可以为“播放视频时点击在线投屏按钮,可将镜像投屏切换为在线投屏”、“播放视频时可点击在线投屏按钮,获取更清晰的投屏效果”等。
在一些实施例中,窗口703还可以包括图像703b和图像703c,图像703b和图像703c分别用于指示基于镜像投屏、在线投屏共享网络视频的效果。可以看出,图像703c相较于图像703b更清晰,效果更好。这样可以提示用户镜像投屏和在线投屏的不同,有利于用户选择更加适合的在线投屏来共享网络视频。移动设备100在显示屏上显示的提示信息可以在显示一段时长(比如5秒)后自动消失,无需用户交互。移动设备100还可以响应于用户点击显示屏上该提示信息以外的其他区域的操作,停止显示该提示信息等。可理解的,在场景1下输出提示信息时,移动设备100可以随用户操作依次显示图7A、图8A、图7B和图7C。如此,可以在用户打开具体的视频应用后,自动提示用户触发移动设备100将镜像投屏切换为在线投屏,保证用户观看网络视频时得到最佳的投屏体验。图7C中的“9:21”用于表示上一次的播放记录,即上一次在播放至“9:21”时,退出该网络视频的播放。
场景2、移动设备100启动视频应用并且播放用户选择的网络视频后,输出提示信息。
同样地,移动设备100可以采用S009,或者S009和S010的方式,在识别到移动设备100在前台运行的应用程序为非定制类视频应用时,移动设备100输出提示信息。图8B示例性地示出了移动设备100在场景2中显示的提示信息。如图8B所示,该提示信息704为窗口。提示信息704的具体内容与图8A的窗口703类似,不再详述。可理解的,在场景2下输出提示信息时,移动设备100可以随用户操作依次显示图7A、图7B、图7C和图8B。如此,可以在开始播放具体的网络视频后,自动提示用户触发移动设备100将镜像投屏切换为在线投屏,保证用户观看网络视频时得到最佳的投屏体验。可选地,在图8B中,若在预设 时长内未点击在线投屏按钮,则窗口703关闭。可选地,在图8B中,若在预设时长(比如20秒)内未点击在线投屏按钮,则仍使用镜像投屏方式,或者,自动切换为在线投屏方式。上述20秒仅为示意性举例,任意时长均可为预设时长,本申请不作限定。
场景3、移动设备100在前台持续运行视频应用超过第一预设时长后,输出提示信息。
移动设备100可以采用S009,或者S009和S010的方式,在识别到移动设备100在前台运行的应用程序为非定制类视频应用,且在前台持续运行时长超过第一预设时长后,移动设备100输出提示信息。第一预设时长例如可以为10秒、30秒、1分钟等,本申请实施例不作限定。移动设备100在前台持续运行视频应用的时长超过第一预设时长后,移动设备100可能仍然显示视频应用提供的主页面,也可能响应于用户选择网络视频的操作在播放该网络视频。因此,如图8A所示,移动设备100显示的提示信息可以显示在图7B所示的视频应用主页面71中;如图8B所示,该提示信息也可以显示在图7C所示的用户界面72中。如此,可以在用户观看网络视频的过程中,提示触发移动设备100将镜像投屏切换为在线投屏,保证用户观看网络视频时得到最佳的投屏体验。
在没有特别说明以及未有矛盾的情况下,场景2和场景3未描述之处,均与场景1的内容相同,此处不再赘述。
S102,移动设备100接收到开启在线投屏功能的用户操作。
示例性地,参考图8C,开启视频应用的在线投屏功能的用户操作例如可以为作用于在线投屏控件702上的用户操作(例如点击操作、触摸操作等)。开启在线投屏控件的用户操作还可以为其他形式,例如摇晃手势、语音指令等,本申请实施例不作限制。
S103-S108,响应于在线投屏功能的用户操作,移动设备100将镜像投屏切换为在线投屏,基于在线投屏继续和电子设备200共享网络视频。
移动设备100响应于S102中开启视频应用的在线投屏功能的用户操作,可以开启视频应用的在线投屏功能。
S103,移动设备100发现附近支持在线投屏的电子设备,并显示发现的电子设备的标识。
示例性地,移动设备100可以发送用户数据报协议(user datagram protocol,UDP)广播。附近支持在线投屏的电子设备,例如连接到Wi-Fi接入点300的电子设备200、和其他电子设备(图中未示出)均可以响应于该UDP广播回复携带自己的相关信息(例如设备标识)的UDP报文,以使得移动设备100发现自己。移动设备100发现附近支持在线投屏的电子设备后,显示发现的电子设备的标识,还可以显示例如电子设备的图像等其他信息,本申请实施例不作限定。图8D示例性地示出了移动设备100响应于S101中检测到的开启镜像投屏功能的用户操作所显示的用户界面。如图8D所示,用户界面中可显示有移动设备100发现的电子设备的标识。
S104,移动设备100检测到选择电子设备200的用户操作。
示例性地,如图8D所示,选择电子设备200的用户操作可以是作用于图8D所示用户界面中电子设备200对应的标识上的用户操作。选择电子设备200的用户操作还可以实现为其他形式,本申请实施例不作限制。
在一种实施方式中,当移动设备100是首次开启视频应用的在线投屏功能时,移动设备执行S103和S104,与用户选择的电子设备基于在线投屏共享网络视频。
在另一种实施方式中,当移动设备100是非首次开启视频应用的在线投屏功能时,移动 设备100自动地向上一次在线投屏连接的电子设备,通过在线投屏共享网络视频。也就是说,在另一种实施方式中,可以不再执行S103-S104,在S102之后,直接执行S105及其后续步骤。
在又一种实施方式中,当移动设备100检测到附近支持在线投屏的电子设备仅为一个电子设备时,自动地以在线投屏的方式向该电子设备无线投屏,共享网络视频。
S105,移动设备100和电子设备200之间建立传输控制协议(transmission control protocol,TCP)连接。
S106,移动设备100基于TCP连接将播放的网络视频的网址发送给电子设备200。
这里,网络视频的网址可以为URL,该URL定位到视频应用对应的服务器,例如服务器400。
在一些实施例中,移动设备100还可以将当前播放的网络视频的时间节点发送给电子设备200,以使得电子设备200从该时间节点处继续播放该网络视频。
在一些实施例中,移动设备100响应于S102中检测到的开启在线投屏功能的用户操作,与电子设备200建立TCP连接并将网络视频的网址发送给电子设备200后,还可以更改控件702的显示形式,例如增加阴影、改变控件702的颜色等,这样可以提示用户移动设备100当前正在基于在线投屏共享网络视频。在其他实施例中,移动设备100还可以通过显示文本等方式提示用户移动设备100当前正在基于在线投屏共享网络视频,本申请实施例不作限制。
S107,电子设备200从网络视频的网址处获取网络视频。
电子设备200可以根据网址,向服务器400请求获取网络视频,服务器400可以响应于该请求,将网络视频编码后通过网络(例如Wi-Fi接入点300组建的局域网)发送给电子设备200。
S108,电子设备200播放网络视频。
移动设备100基于在线投屏和电子设备200共享网络视频的过程中,用户可以继续操控移动设备100,且用户对移动设备100的操控不影响电子设备200继续播放网络视频。例如,用户可以操控移动设备100退出播放网络视频、退出运行视频应用并启动游戏应用等。
通过上述步骤S101-S108,移动设备100可以在用户选择不合适的镜像投屏共享网络视频时,将镜像投屏切换为在线投屏,提高了投屏效率,并且能够保证电子设备200播放网络视频的质量,从而保证用户得到最佳的投屏体验。即,图5B所示的投屏方法可以降低用户门槛,保障用户体验。
在一些实施例中,移动设备100还可以在一些情况下,将在线投屏切换回镜像投屏。下面通过可选步骤S109来介绍将在线投屏切换回镜像投屏的情况。
S109,移动设备100在检测到游戏应用启动后,或者在检测到游戏应用被切换为前台正在运行的应用程序后,在用户的触发下将在线投屏切换为镜像投屏。
具体的,移动设备100可以采用S009,或者S009和S010的方式,在检测到移动设备100在前台运行的应用程序为游戏应用后,移动设备100可以输出提示信息,该提示信息可用于询问用户是否将在线投屏切换为镜像投屏。
图8E示例性地示出了移动设备100启动游戏应用后显示的提示信息。如图8E所示,该提示信息可以为窗口705。窗口705中可包括:文本705a、控件705b、控件705c。文本705a例如可以为“是否将在线投屏切换为镜像投屏,获取更流畅的游戏投屏体验?”,用于询问用 户是否将将在线投屏切换为镜像投屏。控件705b用于监听用户操作,移动设备100可响应于该用户操作,不执行将在线投屏切换为镜像投屏的操作。控件705c用于监听用户操作,移动设备100可响应于该用户操作,将在线投屏切换为镜像投屏。移动设备100将在线投屏切换为镜像投屏的操作,可参考图5A所示方法中的S002-S007,此处不再赘述。
这样,在无线投屏的前提下,当前台运行的应用切换为游戏应用时,或者游戏应用启动时,移动设备100自动提示用户将在线投屏切换回镜像投屏。若用户选择将在线投屏切换回镜像投屏,则为用户提供更加流畅且时延更低的游戏投屏体验;若用户选择拒绝将在线投屏切换回镜像投屏,则用户可以既在电子设备200上观看网络视频,又在移动设备100上玩游戏。
在一种替换方式中,S109还可被替换为:在检测到游戏应用启动后,或者在检测到游戏应用被切换为前台正在运行的应用程序后,移动设备自动切换回镜像投屏。这样,无需用户选择,自动地切换回镜像投屏,使得用户体验更好。
接下来,介绍图5C的流程。图5C描述了针对定制类视频应用时,将移动设备100当前使用的镜像投屏切换为在线投屏的流程。在图5C中,移动设备100通过镜像投屏与电子设备200共享定制类视频应用提供的网络视频时,可以自动或者在用户的触发下,将镜像投屏切换为在线投屏,继续基于在线投屏共享该网络视频。图5C示出的流程,可包括:
S201-S204,移动设备100在播放网络视频的过程中,自动或者在用户的触发下将镜像投屏切换为在线投屏,基于在线投屏继续和电子设备200共享网络视频。
S201,移动设备100直接和电子设备200建立TCP连接。
视频应用为定制类视频应用,移动设备100在运行该视频应用并播放网络视频的过程中,视频应用可通过SDK接口从投屏服务处查询到投屏状态,还可以查询到当前与移动设备100共享网络视频的设备是电子设备200。因此,在S201中,移动设备100可以直接与查询到的电子设备200建立TCP连接,而无须像图5A中由用户选择或者由移动设备100根据上一次的在线投屏交互来默认选择。
S202-S204,与图5B中的S106-S108相同,此处不再赘述。
在一种实施方式中,S201-S204自动执行。
在一些实施例中,移动设备100可以在自动切换前输出提示信息,以提示用户移动设备100要将镜像投屏切换为在线投屏。
在一些实施例中,移动设备100可以在自动切换后输出提示信息,以提示用户当前已切换为在线投屏。
移动设备100在切换前后输出的提示信息的实现形式均可包括但不限于:显示屏上显示的界面元素、音频、指示灯闪烁、马达震动等。
图8F示例性地示出了移动设备100自动切换前所显示的提示信息1001,提示信息1001为文本“即将为您切换至在线投屏”。图8G示例性地示出了移动设备100自动切换后所显示的提示信息1002,提示信息1002为文本“已切换至在线投屏”。
在其他的实施方式中,移动设备100自动切换前或后,也可以不输出提示信息,本申请实施例对此不作限制。
这样,不仅可以提高投屏效率,保证用户得到最佳的投屏体验,还可以减少用户操作, 使得用户更加简单便捷。
In another implementation, the mobile device 100 may use S009, or S009 and S010, to recognize that the application running in the foreground is a customized video application; the mobile device 100 then outputs prompt information and switches from mirror projection to online projection when triggered by the user.
FIG. 8H shows an example of the prompt information output when the mobile device 100 recognizes that an online video is being played. The prompt information in FIG. 8H is the same as that in FIG. 8B and is not repeated here. For example, as shown in FIG. 8H, the mobile device 100 may detect a user operation (such as a tap or touch) on the control 702 and, in response, switch from mirror projection to online projection. The mobile device 100 may also switch from mirror projection to online projection in response to other user operations, such as a shake gesture or a voice instruction; this is not limited in the embodiments of this application. In some embodiments, after switching from mirror projection to online projection in response to the user operation, the mobile device 100 may change the appearance of the control 702 to remind the user that online projection is now in use. This gives the user full control over the switch.
In some embodiments, after S202-S204, the mobile device 100 may also switch from online projection back to mirror projection in certain situations. That is, an optional step S205 may be included.
S205: The mobile device 100 switches from online projection back to mirror projection.
In one implementation of S205, the mobile device 100 may automatically switch from online projection back to mirror projection after the electronic device 200 finishes playing the online video. In some cases, after the electronic device 200 finishes playing the online video, further user operation on the electronic device 200 is needed to replay the video or play another online video. FIG. 9 shows an example of the user interface displayed after the electronic device 200 finishes playing the online video. In this way, the mobile device 100 can adaptively adjust the wireless projection mode in use and provide the user with the best projection experience. In addition, the mobile device 100 can automatically switch between online projection and mirror projection in the background, without user operation, which gives the user a good projection experience.
In another implementation of S205, after detecting that a game application is started, or that a game application is switched to the foreground, the mobile device 100 switches from online projection back to mirror projection when triggered by the user. For details, refer to the description of S109; details are not repeated here.
In the embodiments shown in FIG. 5A to FIG. 5C, a non-customized video application may be referred to as a first-type application, a game application may be referred to as a second-type application, and a customized video application may be referred to as a third-type application.
The application started by the mobile device 100 in S008 of FIG. 5A may be referred to as a first application.
The prompt information output by the mobile device 100 in S101 of FIG. 5B may be referred to as first prompt information; the user operation for enabling the online projection function received by the mobile device 100 in S102 may be referred to as a first user input; the user operation for selecting the electronic device 200 detected by the mobile device 100 in S104 may be referred to as a second user input; the application detected in S109 as being started or switched to the foreground may be referred to as a second application, a game application being a second-type application; and the prompt information output by the mobile device 100 in S109 may be referred to as second prompt information.
After the method shown in FIG. 5A or FIG. 5B, the mobile device 100 may further switch the first application, which belongs to the first-type applications, to the foreground, or start a third application belonging to the first-type applications, or switch a third application belonging to the first-type applications to the foreground, and automatically output third prompt information, the third prompt information being used to prompt switching the first wireless projection mode to the second wireless projection mode; alternatively, the mobile device 100 automatically switches the first wireless projection mode to the second wireless projection mode and wirelessly projects to the electronic device 200 in the second wireless projection mode.
The following describes in detail, with reference to the software structure of the mobile device 100 (for example, the software structure shown in FIG. 3B), the interaction among the internal modules when the mobile device 100 performs the procedure in FIG. 5B. It should be noted that although the software structure shown in FIG. 3B is used as an example here, a person skilled in the art should understand that other operating systems (such as Harmony) are also applicable.
FIG. 10A shows the interaction among the internal modules when the mobile device 100 performs the procedure shown in FIG. 5B. As shown in FIG. 10A, the interaction may include the following steps.
Step 1: The projection service enables the mirror projection function.
In step 1, after the mobile device 100 detects the user operation for enabling the mirror projection function, the projection service invokes one or more of Wi-Fi Direct, Bluetooth, or NFC in the wireless communication module 160, discovers nearby electronic devices that support mirror projection through one or more of Wi-Fi Direct, Bluetooth, or NFC, establishes a communication connection with the electronic device 200 selected by the user, and shares multimedia content with the electronic device 200.
Step 2: The video application starts and obtains an online video.
For the implementation of step 2, refer to the descriptions of FIG. 7A to FIG. 7C; details are not repeated here.
In step 2, after detecting the user operation for starting the video application, the mobile device 100 starts the video application, and then obtains the online video in response to the user operation of selecting the online video for playback.
Step 3: The scene-awareness module recognizes any one of three scenarios.
The three scenarios are those mentioned in S101 of FIG. 5B, and may specifically include: (1) the mobile device 100 starts a non-customized video application; (2) the mobile device 100 starts a non-customized video application and plays an online video selected by the user in that application; (3) the mobile device 100 keeps running a non-customized video application in the foreground for longer than a first preset duration.
The scene-awareness module may use S009, or S009 and S010, in real time or periodically, to determine whether the application running in the foreground of the mobile device 100 is a non-customized video application, and thereby recognize the three scenarios above.
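On Android, one way such a scene-awareness module could learn the foreground application is through `UsageStatsManager`; this is only one possible realization of S009/S010 and assumes the usage-access permission has been granted.

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context

// One possible way to find the current foreground package on Android, assuming
// android.permission.PACKAGE_USAGE_STATS has been granted to this app.
// The scene-awareness module could then compare the returned package against its
// list of non-customized video applications to recognize the three scenarios.
fun currentForegroundPackage(context: Context): String? {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val end = System.currentTimeMillis()
    // Look at the last minute of usage records and pick the most recently used app.
    val stats = usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, end - 60_000, end)
        ?: return null
    return stats.maxByOrNull { it.lastTimeUsed }?.packageName
}
```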
In some embodiments, the scene-awareness module may be integrated into the projection service. In that case, the projection service performs step 3.
Step 4: The scene-awareness module notifies the projection service of the recognized scenario.
Step 5: After learning the scenario recognized by the scene-awareness module, the projection service outputs prompt information to prompt the user to switch to online projection.
Specifically, when the projection service learns that the mobile device 100 has enabled the mirror projection function and the mobile device 100 is in one of the three scenarios above, it invokes hardware of the mobile device 100, such as the display, the flash, or the motor, to output prompt information, which prompts the user to switch from mirror projection to online projection. For the implementation of the prompt information, refer to the description of S101.
Step 6: The video application enables the online projection function when triggered by the user.
For step 6, refer to the descriptions of S102-S106 in FIG. 5B.
Specifically, the video application may receive an event indicating that the user has enabled the online projection function of the video application. For example, with reference to FIG. 8C, the user may tap the control 702; the tap may be encapsulated as an event for enabling the online projection function of the video application and passed from the lower layers to the video application. The video application then enables the online projection function, that is, it invokes the wireless communication module 160 to send a UDP broadcast, establishes a TCP connection with the electronic device 200, and sends the address of the online video to the electronic device 200 over that TCP connection.
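The UDP broadcast mentioned in step 6 can be pictured as a one-line discovery announcement on the local network. The sketch below is illustrative only; the port (18890) and the payload text are assumptions, not values defined by this application.

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress

// Minimal sketch of a discovery broadcast: announce that this device wants to start
// online projection, so that sinks on the LAN can answer with their address.
// The port number and the payload text are assumptions made for illustration.
fun broadcastOnlineProjectionDiscovery() {
    DatagramSocket().use { socket ->
        socket.broadcast = true
        val payload = "ONLINE_PROJECTION_DISCOVERY".toByteArray(Charsets.UTF_8)
        val packet = DatagramPacket(
            payload, payload.size,
            InetAddress.getByName("255.255.255.255"), 18890
        )
        socket.send(packet)
    }
}
```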
FIG. 10B shows another interaction among the internal modules when the mobile device 100 performs the procedure shown in FIG. 5B. As shown in FIG. 10B, the interaction may include the following steps.
Steps 1 to 3 are the same as steps 1 to 3 in FIG. 10A and are not repeated here.
Step 4: The scene-awareness module queries the projection service for the projection status of the mobile device 100.
Specifically, the scene-awareness module can learn from the projection service that the mobile device 100 currently has the mirror projection function enabled.
When the scene-awareness module learns that the mobile device 100 has enabled the mirror projection function and the mobile device 100 is in one of the three scenarios in S101, it invokes hardware of the mobile device 100, such as the display, the flash, or the motor, to output prompt information, which prompts the user to switch from mirror projection to online projection. For the implementation of the prompt information, refer to the description of S101.
Step 5 is the same as step 5 in FIG. 10A and is not repeated here.
It can be understood that, in the examples in FIG. 10A and FIG. 10B, the functions of the scene-awareness module may be integrated into the projection service. In that case, the steps performed by the scene-awareness module in FIG. 10A and FIG. 10B are performed by the projection service, and the interaction steps between the two may be omitted.
The following describes in detail, with reference to the software structure of the mobile device 100 (for example, the software structure shown in FIG. 3B), the interaction among the internal modules when the mobile device 100 performs the procedure in FIG. 5C. It should be noted that although the software structure shown in FIG. 3B is used as an example here, a person skilled in the art should understand that other operating systems (such as Harmony) are also applicable. As shown in FIG. 11, the interaction may include the following steps.
Step 1: The projection service enables the mirror projection function.
Step 2: The video application starts and obtains an online video.
Steps 1 and 2 are the same as steps 1 and 2 in FIG. 10A and are not repeated here.
Step 3: The video application queries the projection service for the projection status.
The video application may query the current projection status from the projection service through the SDK and learn that the mobile device 100 is currently sharing the online video through mirror projection. In some embodiments, the video application may also learn, through the SDK, that the electronic device 200 is the device currently receiving the projection from the mobile device 100.
Step 4: The video application learns that the mobile device 100 is sharing the online video with the electronic device 200 through mirror projection.
Combining steps 2 and 3, the video application can learn that the mobile device 100 is currently sharing the online video with the electronic device 200 through mirror projection.
Step 5: The video application enables the online projection function automatically or when triggered by the user.
In some embodiments, before or after automatically switching from mirror projection to online projection, the video application may also invoke modules such as the display, the audio module, or the flash to output prompt information. For the implementation of the prompt information, refer to the descriptions of FIG. 8F and FIG. 8G.
In other embodiments, after learning that the mobile device 100 is sharing the online video with the electronic device 200 through mirror projection, the video application may invoke the display to output prompt information prompting the user to switch from mirror projection to online projection, and enable the online projection function only after the user inputs the operation for enabling online projection. Here, for the way the display outputs the prompt information prompting the user to switch from mirror projection to online projection, refer to the window 704 in FIG. 8H.
Specifically, the video application enables the online projection function, that is, it invokes the wireless communication module 160 to send a UDP broadcast, establishes a TCP connection with the electronic device 200, and sends the address of the online video to the electronic device 200 over that TCP connection.
In this way, when the video application started by the mobile device 100 is a non-customized video application, the scene-awareness module can sense the three scenarios mentioned in S101, and then the scene-awareness module or the projection service can output prompt information to prompt the user to switch from mirror projection to online projection; the video application can then enable the online projection function in response to the user operation input after the user sees the prompt information. When the video application started by the mobile device 100 is a customized video application, the customized video application can learn the projection status from the projection service through the SDK interface and, combined with its own running state, recognize the scenario in which the mobile device 100 is sharing an online video through mirror projection. The video application can therefore enable the online projection function automatically or passively.
In other words, to perform the procedure shown in FIG. 5B, no change to the video application is needed; it is sufficient to modify the software system of the mobile device 100 by adding the scene-awareness module and adding the function of the projection service or scene-awareness module outputting prompt information, so as to switch from mirror projection to online projection while sharing an online video through mirror projection. To perform the procedure shown in FIG. 5C, the video application needs to be modified by adding the SDK, which likewise enables switching from mirror projection to online projection while sharing an online video through mirror projection.
In some embodiments, customized and non-customized video applications may not be distinguished. As long as the application started by the mobile device 100 is a video application, the mobile device 100 uses the procedure shown in FIG. 5B to switch the wireless projection mode.
It should be noted that the wireless projection method provided in this application does not necessarily include all of the procedures shown in FIG. 5A, FIG. 5B, and FIG. 5C. The procedures shown in FIG. 5A, FIG. 5B, and FIG. 5C, together with the corresponding text above, may each separately constitute the wireless projection method provided in the embodiments of this application. For example, the procedures shown in FIG. 5A and FIG. 5B together with the corresponding text; or the procedures shown in FIG. 5A and FIG. 5C together with the corresponding text; or a part of the procedure shown in FIG. 5A (for example, S009-S012 in FIG. 5A) together with the corresponding text, combined with the procedure shown in FIG. 5B and the corresponding text. Any of these combinations may separately constitute the wireless projection method provided in the embodiments of this application.
FIG. 12 is a schematic flowchart of another projection method provided in an embodiment of this application. As shown in FIG. 12, when the mobile device 100 shares multimedia content with the electronic device 200 through mirror projection, the mobile device 100 can recognize the current scenario and notify the electronic device 200 of the scenario; after learning the current scenario, the electronic device 200 adaptively selects the corresponding playback strategy to play the multimedia content. The multimedia content shared between the mobile device 100 and the electronic device 200 may be network multimedia content, local multimedia content, or a combination of the two; this is not limited in this application.
As shown in FIG. 12, the method may include the following steps.
S301-S307 are the same as S001-S007 in FIG. 5A and are not repeated here.
S308: The mobile device 100 starts an application.
The application started by the mobile device 100 may be any installed application. For example, the mobile device 100 may start the video application in response to a user operation on the icon 601 of the video application on the home screen 61 shown in FIG. 6A. As another example, the mobile device 100 may start the game application in response to a user operation on the icon 602 of the game application on the home screen 61 shown in FIG. 6A. Here, a video application is an application whose audio and video services are provided by a server, and a game application is an application whose game services are provided by a server.
S309: The mobile device 100 recognizes the scenario corresponding to the started application.
The mobile device 100 may distinguish different scenarios according to the application that is started. For example, when the mobile device 100 starts a video application, the mobile device 100 can recognize the scenario of running a video application; when the mobile device 100 starts a game application, the mobile device 100 can recognize the scenario of running a game application.
Specifically, the scene-awareness module or the projection service of the mobile device 100 may use S009, or S009 and S010, in real time or periodically, to learn which application is running in the foreground and thereby recognize the current scenario.
In some embodiments, the scenarios are not necessarily distinguished according to the started application; the mobile device 100 may also distinguish scenarios according to other policies, which is not limited in the embodiments of this application.
S310: The mobile device 100 notifies the electronic device 200 of the recognized scenario.
In some embodiments, the mobile device 100 may carry indication information of the recognized scenario in a control message of the mirror projection protocol in use, thereby notifying the electronic device 200 of the recognized scenario. Taking Miracast as an example, the mobile device 100 may carry the indication information of the recognized scenario in an extension field of an RTSP control message. For example, a field value of 1 may indicate that the mobile device 100 has recognized the scenario of running a video application, and a field value of 0 may indicate that the mobile device 100 has recognized the scenario of running a game application.
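For the Miracast example above, the scenario indication could ride in the body of an RTSP SET_PARAMETER request. The sketch below merely shows what such a message could look like as text; the parameter name `X-scene` is hypothetical, while the values follow the example mapping above (1 for the video-application scenario, 0 for the game-application scenario).

```kotlin
// Illustrative construction of an RTSP SET_PARAMETER request whose body carries the
// recognized scenario in an extension parameter. "X-scene" is a hypothetical name;
// the values 1 (video app) and 0 (game app) follow the example in the text above.
fun buildSceneParameterRequest(sessionUri: String, cseq: Int, isVideoScene: Boolean): String {
    val body = "X-scene: ${if (isVideoScene) 1 else 0}\r\n"
    return buildString {
        append("SET_PARAMETER $sessionUri RTSP/1.0\r\n")
        append("CSeq: $cseq\r\n")
        append("Content-Type: text/parameters\r\n")
        append("Content-Length: ${body.toByteArray(Charsets.UTF_8).size}\r\n")
        append("\r\n")
        append(body)
    }
}
```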
S311: The electronic device 200 plays the multimedia content using the playback strategy corresponding to the scenario.
When the mobile device 100 and the electronic device 200 share multimedia content through mirror projection, multiple playback strategies are possible. For example, the playback strategies corresponding to mirror projection may include a real-time display strategy and an audio/video buffering and synchronization strategy. With the real-time display strategy, the electronic device 200 decodes and plays the data sent by the mobile device 100 immediately upon receipt, giving priority to a low-latency projection experience in scenarios such as game projection. With the audio/video buffering and synchronization strategy, after receiving the encoded data sent by the mobile device 100, the electronic device 200 buffers a certain amount of data to ensure smooth playback of the multimedia content, and at the same time automatically compares the presentation time stamps (presentation time stamp, PTS) of the audio and the images in the multimedia content; when the difference between the audio and video time stamps exceeds a threshold, audio/video synchronization processing (such as playing audio at a higher speed or dropping image frames) is triggered to ensure audio/video synchronization quality when the electronic device 200 plays the multimedia content.
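The PTS comparison at the heart of the audio/video buffering and synchronization strategy can be reduced to a simple decision. The sketch below isolates that decision; the 80 ms threshold and the action names are placeholders rather than values taken from this application.

```kotlin
// Sketch of the PTS comparison in the buffering/synchronization strategy: once the
// audio/video time-stamp gap exceeds a threshold, trigger a corrective action such as
// speeding up audio or dropping video frames. Threshold and actions are placeholders.
enum class SyncAction { NONE, SPEED_UP_AUDIO, DROP_VIDEO_FRAMES }

fun chooseSyncAction(audioPtsUs: Long, videoPtsUs: Long, thresholdUs: Long = 80_000): SyncAction {
    val driftUs = audioPtsUs - videoPtsUs
    return when {
        driftUs > thresholdUs -> SyncAction.DROP_VIDEO_FRAMES   // video lags behind audio
        driftUs < -thresholdUs -> SyncAction.SPEED_UP_AUDIO     // audio lags behind video
        else -> SyncAction.NONE
    }
}
```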
It can be understood that "real-time display strategy" and "audio/video buffering and synchronization strategy" are merely terms used in the embodiments of this application; their meanings have been described herein, and the names do not constitute any limitation on the embodiments of this application. In some other embodiments, the real-time display strategy may also be called, for example, a low-latency fast display mode, and the audio/video buffering and synchronization strategy may also be called, for example, a buffering mode.
For example, refer to Table 1, which shows the correspondence between the scenario of the mobile device 100 and the playback strategy used by the electronic device 200 while the mobile device 100 wirelessly projects to the electronic device 200. As shown in Table 1, the scenario of running a game application may correspond to the real-time display strategy, and the scenario of running a video application may correspond to the audio/video buffering and synchronization strategy.
Table 1
Scenario                          Playback strategy
Running a game application        Real-time display strategy
Running a video application       Audio/video buffering and synchronization strategy
In the scenario where the mobile device 100 runs a game application, real-time performance is the main factor affecting the user's projection experience, so the real-time display strategy is better suited to that scenario. In the scenario where the mobile device 100 runs a video application, smoothness and audio/video synchronization quality are the main factors affecting the user's projection experience, so the audio/video buffering and synchronization strategy is better suited to that scenario. The playback strategies are not limited to the real-time display strategy and the audio/video buffering and synchronization strategy; in other embodiments, mirror projection may correspond to further playback strategies, such as an ultra-low-latency mode, and other playback strategies may be defined by those skilled in the art. The embodiments of this application likewise do not limit the correspondence between scenarios and playback strategies; other correspondences are possible in other embodiments.
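Read as code, Table 1 is a lookup that the sink's projection service applies once it is told which scenario the source is in. The sketch below encodes that mapping; additional scenarios and strategies could be added as noted above.

```kotlin
// Table 1 as a lookup on the sink side: the scenario reported by the mobile device
// selects the playback strategy used by the projection player.
enum class Scene { GAME_APP, VIDEO_APP }
enum class PlaybackStrategy { REAL_TIME_DISPLAY, AV_BUFFER_SYNC }

fun strategyFor(scene: Scene): PlaybackStrategy = when (scene) {
    Scene.GAME_APP -> PlaybackStrategy.REAL_TIME_DISPLAY  // low latency first
    Scene.VIDEO_APP -> PlaybackStrategy.AV_BUFFER_SYNC     // smoothness and A/V sync first
}
```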
When S311 is implemented, the projection service in the electronic device 200 may instruct the projection player to play the multimedia content using the playback strategy corresponding to the scenario recognized by the mobile device 100.
With the projection method shown in FIG. 12, when multimedia content is shared through mirror projection, the electronic device 200 can adaptively select the playback strategy corresponding to the scenario of the mobile device 100, that is, adjust the playback strategy according to the actual scenario, which safeguards the user's projection experience.
The following describes in detail, with reference to the software structure of the mobile device 100 (for example, the software structure shown in FIG. 3B), the interaction among the internal modules when the mobile device 100 performs the procedure in FIG. 12. It should be noted that although the software structure shown in FIG. 3B is used as an example here, a person skilled in the art should understand that other operating systems (such as Harmony) are also applicable. As shown in FIG. 13A, the interaction may include the following steps.
The mobile device 100 performs steps 1 to 5:
Step 1: The projection service enables the mirror projection function.
Step 2: An application starts.
Step 3: The scene-awareness module recognizes the scenario corresponding to the started application.
For the implementation of step 1, refer to the operations of the mobile device 100 in S301-S307 of the method shown in FIG. 12; for step 2, refer to S308; for step 3, refer to S309.
Step 4: The scene-awareness module notifies the projection service of the recognized scenario.
Step 5: The projection service notifies the electronic device 200 of the scenario recognized by the scene-awareness module.
Specifically, after learning that the mobile device 100 has enabled the mirror projection function and learning the scenario recognized by the scene-awareness module, the projection service may notify the electronic device 200 of that scenario.
For the implementation of step 5, refer to S310 in the method shown in FIG. 12.
The electronic device 200 performs steps 6 and 7:
Step 6: The projection service determines the playback strategy corresponding to the scenario of the mobile device 100 and passes the playback strategy to the projection player.
Step 7: The projection player plays the multimedia content using that playback strategy.
For the implementation of steps 6 and 7, refer to S311 in the method shown in FIG. 12.
FIG. 13B shows another interaction among the internal modules when the mobile device 100 performs the procedure in FIG. 12. FIG. 13B differs from FIG. 13A in that, in step 4, the scene-awareness module of the mobile device 100 queries the projection service for the projection status, and in step 5, the scene-awareness module notifies the electronic device 200 of the recognized scenario.
It can be understood that, in the examples in FIG. 13A and FIG. 13B, the functions of the scene-awareness module may be integrated into the projection service. In that case, the steps performed by the scene-awareness module in FIG. 13A and FIG. 13B are performed by the projection service, and the interaction steps between the two may be omitted.
It can be understood that, in the embodiments of this application, the procedure shown in FIG. 5B and the procedure shown in FIG. 5C may each be implemented in combination with the method shown in FIG. 12, or implemented separately.
For example, when the procedure shown in FIG. 5B is implemented in combination with FIG. 12, while the mobile device 100 shares an online video with the electronic device 200 through mirror projection, if the electronic device 200 is connected to the network, that is, the electronic device 200 meets the conditions for online projection, the mobile device 100 may perform the procedure shown in FIG. 5B; if the electronic device 200 is not connected to the network, that is, the electronic device 200 does not meet the conditions for online projection, the mobile device 100 may perform the projection method described in FIG. 12.
Similarly, when the procedure shown in FIG. 5C is implemented in combination with FIG. 12, while the mobile device 100 shares an online video with the electronic device 200 through mirror projection, if the electronic device 200 is connected to the network, that is, the electronic device 200 meets the conditions for online projection, the mobile device 100 may perform the procedure shown in FIG. 5C; if the electronic device 200 is not connected to the network, the mobile device 100 may perform the projection method described in FIG. 12.
All or some of the implementations and embodiments of this application may be combined arbitrarily and freely.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used, the embodiments may be implemented entirely or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to this application are generated entirely or partly. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or a data center integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state drive), or the like.
A person of ordinary skill in the art can understand that all or some of the procedures of the methods in the foregoing embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the procedures of the foregoing method embodiments may be included. The foregoing storage medium includes any medium that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.
In conclusion, the foregoing descriptions are merely embodiments of the technical solutions of the present invention and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (13)

  1. A mobile device, wherein the mobile device runs a first application in the foreground and wirelessly projects to an electronic device in a first wireless projection mode, and wherein the mobile device comprises:
    a processor;
    a memory;
    and a computer program, wherein the computer program is stored in the memory, and when the computer program is executed by the processor, the mobile device is caused to perform the following steps:
    after detecting that the first application belongs to a first type of applications,
    the mobile device automatically outputs first prompt information, wherein the first prompt information is used to prompt switching the first wireless projection mode to a second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to a second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  2. The mobile device according to claim 1, wherein after the mobile device automatically outputs the first prompt information, the mobile device further performs the following steps:
    detecting a first user input, wherein the first user input is used to switch the first wireless projection mode to the second wireless projection mode;
    in response to the first user input, the mobile device automatically outputs identifiers of one or more electronic devices, wherein the one or more electronic devices are electronic devices detected by the mobile device that support the second wireless projection mode;
    detecting a second user input, wherein the second user input is used to select an identifier of one electronic device from the identifiers of the electronic devices;
    in response to the second user input, the mobile device switches the first wireless projection mode to the second wireless projection mode and projects to the selected electronic device in the second wireless projection mode.
  3. The mobile device according to claim 1 or 2, wherein the mobile device further performs the following steps:
    after detecting that a second application belonging to a second type of applications is started, or after detecting that a second application belonging to a second type of applications is switched to the application running in the foreground,
    the mobile device automatically outputs second prompt information, wherein the second prompt information is used to prompt switching the second wireless projection mode to the first wireless projection mode; or
    the mobile device automatically switches the second wireless projection mode to the first wireless projection mode and wirelessly projects to the electronic device in the first wireless projection mode.
  4. The mobile device according to claim 3, wherein the mobile device further performs the following steps:
    after detecting that the first application belonging to the first type of applications is switched to the application running in the foreground, or after detecting that a third application belonging to the first type of applications is started, or after detecting that a third application belonging to the first type of applications is switched to the application running in the foreground,
    the mobile device automatically outputs third prompt information, wherein the third prompt information is used to prompt switching the first wireless projection mode to the second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to the second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  5. The mobile device according to any one of claims 1 to 4, wherein the first type of applications are non-customized video applications and the second type of applications are game applications; the first wireless projection mode is a mirror projection mode and the second wireless projection mode is an online projection mode; the one or more electronic devices comprise the electronic device, or the one or more electronic devices do not comprise the electronic device; input forms of the first user input and the second user input comprise touch input and voice input; and a non-customized video application does not itself have the capability of initiating projection.
  6. A mobile device, wherein the mobile device wirelessly projects to an electronic device in a first wireless projection mode, and wherein the mobile device comprises:
    a processor;
    a memory;
    and a computer program, wherein the computer program is stored in the memory, and when the computer program is executed by the processor, the mobile device is caused to perform the following steps:
    after detecting that a first application is started and that the first application belongs to a first type of applications,
    the mobile device automatically outputs first prompt information, wherein the first prompt information is used to prompt switching the first wireless projection mode to a second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to a second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  7. A wireless projection method, applied to a mobile device, wherein the mobile device comprises a processor and a memory, the mobile device runs a first application in the foreground, and the mobile device wirelessly projects to an electronic device in a first wireless projection mode; the method comprises:
    after detecting that the first application belongs to a first type of applications,
    the mobile device automatically outputs first prompt information, wherein the first prompt information is used to prompt switching the first wireless projection mode to a second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to a second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  8. The method according to claim 7, wherein after the mobile device automatically outputs the first prompt information, the method further comprises:
    detecting a first user input, wherein the first user input is used to switch the first wireless projection mode to the second wireless projection mode;
    in response to the first user input, the mobile device automatically outputs identifiers of one or more electronic devices, wherein the one or more electronic devices are electronic devices detected by the mobile device that support the second wireless projection mode;
    detecting a second user input, wherein the second user input is used to select an identifier of one electronic device from the identifiers of the electronic devices;
    in response to the second user input, the mobile device switches the first wireless projection mode to the second wireless projection mode and projects to the selected electronic device in the second wireless projection mode.
  9. The method according to claim 7 or 8, wherein the method further comprises:
    after detecting that a second application belonging to a second type of applications is started, or after detecting that a second application belonging to a second type of applications is switched to the application running in the foreground,
    the mobile device automatically outputs second prompt information, wherein the second prompt information is used to prompt switching the second wireless projection mode to the first wireless projection mode; or
    the mobile device automatically switches the second wireless projection mode to the first wireless projection mode and wirelessly projects to the electronic device in the first wireless projection mode.
  10. The method according to claim 9, wherein the method further comprises:
    after detecting that the first application belonging to the first type of applications is switched to the application running in the foreground, or after detecting that a third application belonging to the first type of applications is started, or after detecting that a third application belonging to the first type of applications is switched to the application running in the foreground,
    the mobile device automatically outputs third prompt information, wherein the third prompt information is used to prompt switching the first wireless projection mode to the second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to the second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  11. A wireless projection method, applied to a mobile device, wherein the mobile device comprises a processor and a memory, and the mobile device wirelessly projects to an electronic device in a first wireless projection mode; the method comprises:
    after detecting that a first application is started and that the first application belongs to a first type of applications,
    the mobile device automatically outputs first prompt information, wherein the first prompt information is used to prompt switching the first wireless projection mode to a second wireless projection mode; or
    the mobile device automatically switches the first wireless projection mode to a second wireless projection mode and wirelessly projects to the electronic device in the second wireless projection mode.
  12. A computer-readable storage medium comprising a computer program, wherein when the computer program runs on a mobile device, the mobile device is caused to perform the method according to any one of claims 7 to 11.
  13. A computer program product, wherein when the computer program product runs on a mobile device, the mobile device is caused to perform the method according to any one of claims 7 to 11.
PCT/CN2021/124895 2020-10-30 2021-10-20 Wireless screen projection method, mobile device, and computer-readable storage medium WO2022089271A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21884999.0A EP4227792A4 (en) 2020-10-30 2021-10-20 WIRELESS SCREEN SHARING METHOD, MOBILE DEVICE AND COMPUTER READABLE STORAGE MEDIUM
US18/251,119 US20230385008A1 (en) 2020-10-30 2021-10-20 Wireless Projection Method, Mobile Device, and Computer-Readable Storage Medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011198023.XA CN114442971A (zh) 2020-10-30 2020-10-30 无线投屏方法、移动设备及计算机可读存储介质
CN202011198023.X 2020-10-30

Publications (1)

Publication Number Publication Date
WO2022089271A1 true WO2022089271A1 (zh) 2022-05-05

Family

ID=81357670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/124895 WO2022089271A1 (zh) 2020-10-30 2021-10-20 无线投屏方法、移动设备及计算机可读存储介质

Country Status (4)

Country Link
US (1) US20230385008A1 (zh)
EP (1) EP4227792A4 (zh)
CN (1) CN114442971A (zh)
WO (1) WO2022089271A1 (zh)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115174990B (zh) * 2022-06-30 2024-06-11 深圳创维-Rgb电子有限公司 语音投屏方法、装置、设备及计算机可读存储介质
CN115209213B (zh) * 2022-08-23 2023-01-20 荣耀终端有限公司 一种无线投屏方法及移动设备
CN115695928B (zh) * 2022-09-26 2024-06-07 抖音视界有限公司 一种投屏方法、装置、电子设备及存储介质


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140372621A1 (en) * 2013-06-18 2014-12-18 Huawei Technologies Co., Ltd. AirSharing Method, AirSharing Apparatus, and Terminal Device
CN113504851A (zh) * 2018-11-14 2021-10-15 华为技术有限公司 一种播放多媒体数据的方法及电子设备
CN110109636B (zh) * 2019-04-28 2022-04-05 华为技术有限公司 投屏方法、电子设备以及系统
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN111628847B (zh) * 2020-05-06 2022-04-08 上海幻电信息科技有限公司 数据传输方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020014880A1 (zh) * 2018-07-17 2020-01-23 华为技术有限公司 一种多屏互动方法及设备
CN109120970A (zh) * 2018-09-30 2019-01-01 珠海市君天电子科技有限公司 一种无线投屏方法、终端设备及存储介质
CN110049368A (zh) * 2019-04-26 2019-07-23 北京奇艺世纪科技有限公司 一种显示方法及相关设备
CN110248224A (zh) * 2019-05-24 2019-09-17 南京苏宁软件技术有限公司 投屏连接建立方法、装置、计算机设备和存储介质
CN110536008A (zh) * 2019-08-19 2019-12-03 维沃移动通信有限公司 一种投屏方法及移动终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4227792A4

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979756A (zh) * 2022-05-13 2022-08-30 北京字跳网络技术有限公司 一种实现一分多的投屏独立显示和交互方法、装置及设备
CN114979756B (zh) * 2022-05-13 2024-05-07 北京字跳网络技术有限公司 一种实现一分多的投屏独立显示和交互方法、装置及设备
CN115729504A (zh) * 2022-10-08 2023-03-03 珠海金智维信息科技有限公司 基于远程操控大屏的数据可视化系统、方法和装置
CN115729504B (zh) * 2022-10-08 2023-07-21 珠海金智维信息科技有限公司 基于远程操控大屏的数据可视化系统、方法和装置
WO2024119365A1 (zh) * 2022-12-06 2024-06-13 广州视源电子科技股份有限公司 一种无线传屏方法、设备及存储介质
CN115599335A (zh) * 2022-12-13 2023-01-13 佳瑛科技有限公司(Cn) 基于多屏模式下共享版式文件的方法及系统
CN115599335B (zh) * 2022-12-13 2023-08-22 佳瑛科技有限公司 基于多屏模式下共享版式文件的方法及系统
CN116069284A (zh) * 2023-02-27 2023-05-05 南京芯驰半导体科技有限公司 投屏方法、硬件系统及存储介质

Also Published As

Publication number Publication date
EP4227792A1 (en) 2023-08-16
CN114442971A (zh) 2022-05-06
EP4227792A4 (en) 2024-02-21
US20230385008A1 (en) 2023-11-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21884999; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 18251119; Country of ref document: US)
ENP Entry into the national phase (Ref document number: 2021884999; Country of ref document: EP; Effective date: 20230509)
NENP Non-entry into the national phase (Ref country code: DE)