WO2021083280A1 - Cross-device content projection method and electronic device - Google Patents

Cross-device content projection method and electronic device

Info

Publication number
WO2021083280A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
content
nfc tag
projection
electronic
Prior art date
Application number
PCT/CN2020/124854
Other languages
English (en)
French (fr)
Inventor
王宇冬
伍晓晖
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to US17/773,346 (published as US11818420B2)
Priority to EP20881192.7A (published as EP4044609A4)
Publication of WO2021083280A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 Additional display device, e.g. video projector
    • H04N21/4131 Home appliance, e.g. lighting, air conditioning system, metering devices
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615 Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637 Involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N21/439 Processing of audio elementary streams
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N21/482 End-user interface for program selection
    • H04N21/485 End-user interface for client configuration
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8106 Involving special audio data, e.g. different tracks for different languages
    • H04N21/8126 Involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133 Involving additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content

Definitions

  • This application relates to the field of terminals, and in particular, to a cross-device content projection method and an electronic device.
  • In daily life, a user or a family often owns multiple electronic devices, and the user frequently needs to switch between them. For example, a user watching a video on a mobile phone may wish to switch the video to the TV to continue watching after returning home. As another example, a user working on a laptop at home may wish to switch the files on the laptop to the mobile phone to continue working after leaving home.
  • At present, the user is usually required to manually project the content on one device to one or more other devices.
  • For example, the user can connect a mobile phone, a smart TV, a speaker, and other electronic devices to the same Wi-Fi network.
  • Then, the user can use the screen projection function or screen projection software in the mobile phone to search for the electronic devices in the current Wi-Fi network.
  • Further, the user can select, from the searched electronic devices, the target device that will receive the projected content this time.
  • Finally, the mobile phone can project images, videos, audio, and other projected content to the target device through the Wi-Fi network. Obviously, this process of switching projected content between multiple devices is time-consuming and cumbersome, and the user experience is poor.
  • the present application provides a cross-device content projection method and an electronic device.
  • the electronic device can easily and quickly project the projected content to multiple other electronic devices for playback, thereby improving the work efficiency of collaboration between multiple devices during content projection.
  • In a first aspect, the present application provides a cross-device content projection method, including: a first electronic device starts to play first content, where the first content may include display content and/or audio content; further, when the distance between the first electronic device and an NFC tag is close enough, the first electronic device can obtain, from the NFC tag, N (N is an integer greater than 1) second electronic devices bound to the NFC tag; in this way, the first electronic device can project, according to a preset projection strategy, the first content onto at least one of the N second electronic devices to continue playing.
  • In other words, by approaching or touching the NFC tag, the first electronic device can easily and quickly determine the multiple target devices for this content projection and automatically start projecting content to those target devices. This simplifies the user's operation process when projecting content across devices, improves and enriches the user experience, and at the same time improves the efficiency of collaboration between multiple devices during content projection.
  • the first content projected by the first electronic device to the second electronic device may include part or all of the display content being displayed on the display interface of the first electronic device.
  • the first electronic device may project all the displayed content in the first interface (for example, the desktop) being displayed as the first content to the second electronic device.
  • the first electronic device may project an image in a certain video in the playing interface being displayed as the first content to the second electronic device.
  • the first content projected by the first electronic device to the second electronic device may also include audio content being played by the first electronic device, for example, music being played by the first electronic device or audio synchronized with a video being played.
  • After the first electronic device projects the first content to the second electronic device, if the first electronic device starts to play other content (for example, second content) in response to a user operation, the first electronic device can continue to project the second content to the second electronic device for playing.
  • In a possible implementation, the first electronic device obtaining, from the NFC tag, the N second electronic devices bound to the NFC tag includes: in response to a touch operation of the first electronic device approaching or touching the NFC tag, the first electronic device reads the identifiers of the N second electronic devices stored in the NFC tag to determine the N second electronic devices bound to the NFC tag; or, after the NFC chip of the first electronic device detects an NFC signal from the NFC tag, the first electronic device reads, through the NFC signal, the identifiers of the N second electronic devices stored in the NFC tag to determine the N second electronic devices bound to the NFC tag.
  • In other words, the user can trigger the first electronic device, by approaching or touching the NFC tag, to read through its NFC function the identifiers of the second electronic devices stored in the NFC tag, thereby determining the N second electronic devices that will perform this content projection with the first electronic device (a read sketch is given below).
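  • For illustration only, the following is a minimal Android-style sketch of how a source device might read the bound device identifiers from such an NFC tag. It assumes the binding relationship is stored as an NDEF record whose payload is a comma-separated list of device identifiers; this record format is an assumption made for the example, not a format mandated by this application.

```java
import android.content.Intent;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.nfc.Tag;
import android.nfc.tech.Ndef;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class TagReader {

    // Called when the phone approaches or touches the NFC tag and the
    // foreground activity receives the NFC discovery intent.
    public List<String> readBoundDeviceIds(Intent intent) throws Exception {
        Tag tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
        Ndef ndef = Ndef.get(tag);              // null if the tag is not NDEF-formatted
        if (ndef == null) {
            return Collections.emptyList();
        }
        ndef.connect();
        try {
            NdefMessage message = ndef.getNdefMessage();
            // Assumption: the first record's payload is a comma-separated list of
            // device identifiers (e.g. MAC addresses) written by the binding flow.
            NdefRecord record = message.getRecords()[0];
            String payload = new String(record.getPayload(), StandardCharsets.UTF_8);
            return Arrays.asList(payload.split(","));
        } finally {
            ndef.close();
        }
    }
}
```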
  • In a possible implementation, the first electronic device projecting, according to the preset projection strategy, the first content to at least one of the N second electronic devices to continue playing includes: the first electronic device sends, according to the preset projection strategy, the first content to at least one of the N second electronic devices for playing.
  • That is, the first electronic device can serve as the master device for this content projection and control the second electronic devices to complete the content projection.
  • In a possible implementation, the foregoing N second electronic devices may include a first speaker and a second speaker. In this case, the first electronic device sending, according to the preset projection strategy, the first content to at least one of the N second electronic devices for playing includes: the first electronic device sends the first content to the first speaker for playing, where the first speaker is the speaker closest to the first electronic device; or, the first electronic device sends the first content to both the first speaker and the second speaker for playing.
  • For example, the first electronic device can compare its distances to the first speaker and the second speaker. If the distance between the first speaker and the first electronic device is less than a preset value and the distance between the second speaker and the first electronic device is greater than the preset value, it means that the first speaker is closer to the first electronic device and the second speaker is farther away. In this case, the first electronic device can send the first content to the first speaker for playing to complete this content projection.
  • Alternatively, the first electronic device can send the first content to both the first speaker and the second speaker, so as to project the first content to the two speakers for playback.
  • Certainly, the first electronic device may also determine, according to the stored projection strategy, which device or devices to send the first content to for playing, and the embodiment of the present application does not impose any limitation on this.
  • The first electronic device sending the first content to the first speaker and the second speaker for playback includes: the first electronic device sends the first audio component in the first content to the first speaker for playback, and the first electronic device sends the second audio component in the first content to the second speaker for playback.
  • If there is a third speaker, the first electronic device (for example, a mobile phone) can also send a third audio component in the first content to the third speaker for playback. That is to say, the first electronic device can send the corresponding audio component of the projected content to each speaker, so that the multiple speakers each play the audio component they receive, achieving a stereo or surround-sound playback effect (a channel-splitting sketch follows below).
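  • As a purely illustrative sketch (this application does not prescribe any particular audio pipeline), the following Java snippet shows one way a source device could split an interleaved stereo PCM buffer into a first (left) component for the first speaker and a second (right) component for the second speaker before sending each over the network; the SpeakerLink interface is a hypothetical placeholder for the actual transport.

```java
/** Splits interleaved 16-bit stereo PCM into per-speaker mono components. */
public class AudioComponentSplitter {

    /** Hypothetical transport hook; in practice this would stream over Wi-Fi. */
    interface SpeakerLink {
        void send(short[] monoPcm);
    }

    public void project(short[] interleavedStereo, SpeakerLink firstSpeaker, SpeakerLink secondSpeaker) {
        int frames = interleavedStereo.length / 2;
        short[] left = new short[frames];   // first audio component
        short[] right = new short[frames];  // second audio component
        for (int i = 0; i < frames; i++) {
            left[i] = interleavedStereo[2 * i];      // even samples: left channel
            right[i] = interleavedStereo[2 * i + 1]; // odd samples: right channel
        }
        firstSpeaker.send(left);   // e.g. the speaker on the user's left
        secondSpeaker.send(right); // e.g. the speaker on the user's right
    }
}
```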
  • In a possible implementation, the above N second electronic devices may include a speaker (there may be one or more speakers) and a television (there may be one or more televisions). The first electronic device sending, according to the preset projection strategy, the first content to at least one of the N second electronic devices for playing includes: the first electronic device can send the display content (such as images or videos) in the first content to the TV for playback and send the audio content in the first content to the speaker for playback; or, the first electronic device can send the display content in the first content to the TV for playback and send the audio content in the first content to both the TV and the speaker for playback.
  • In a possible implementation, the method further includes: the first electronic device determines a master device among the above N second electronic devices. In this case, the first electronic device projecting, according to the preset projection strategy, the first content to at least one of the N second electronic devices to continue playing includes: the first electronic device sends the first content to the master device, so that the master device controls, according to the preset projection strategy, at least one of the N second electronic devices to play the first content.
  • the first electronic device may determine a master device among the N second electronic devices, and the master device controls the N second electronic devices to realize the content projection this time.
  • For example, the foregoing N second electronic devices may include a television and a lamp. In this case, the first electronic device determining the master device among the N second electronic devices includes: the first electronic device determines the television as the master device among the N second electronic devices. The preset projection strategy may then include: the television plays the display content and the audio content in the first content, and the television sends control instructions to the lamp according to the first content, so that the brightness or color of the lamp produces different lighting effects.
  • the method further includes: the first electronic device sends the stored projection strategy to the master device.
  • Certainly, the master device can also obtain the above projection strategy from another electronic device or a server.
  • In a possible implementation, the method further includes: the first electronic device performs time synchronization with the N second electronic devices; the first content sent by the first electronic device carries a timestamp, and the timestamp is used to indicate the playback progress of the first content. Because the clocks of the devices agree after the first electronic device and the N second electronic devices perform time synchronization, when each second electronic device plays the projected content according to the timestamp in the first content, the playback progress can be kept the same across the second electronic devices (see the synchronization sketch below).
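  • For illustration, here is a minimal sketch, under the assumption that the clocks are already synchronized, of how a receiving device could use the carried timestamp to decide which media position to play right now; the packet fields and class names are hypothetical.

```java
/** Aligns local playback to a timestamped projected frame, assuming synchronized clocks. */
public class PlaybackSynchronizer {

    /** Hypothetical packet: media position (ms) plus the sender's wall-clock time (ms). */
    public static final class TimedContent {
        final long mediaPositionMs;   // playback progress of the first content
        final long senderClockMs;     // when the sender emitted this packet
        TimedContent(long mediaPositionMs, long senderClockMs) {
            this.mediaPositionMs = mediaPositionMs;
            this.senderClockMs = senderClockMs;
        }
    }

    /** Returns the media position this receiver should be at right now. */
    public long targetPositionMs(TimedContent packet, long localClockMs) {
        long elapsedSinceSent = localClockMs - packet.senderClockMs; // valid because clocks agree
        return packet.mediaPositionMs + Math.max(0L, elapsedSinceSent);
    }
}
```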
  • In a possible implementation, the method further includes: the first electronic device receives a projection strategy input by the user for the above N second electronic devices.
  • the user can manually set corresponding projection strategies for multiple devices participating in the content projection during the content projection process.
  • In a second aspect, the present application provides a cross-device content projection method, including: a first electronic device displays a binding interface of an NFC tag, where the binding interface includes a list of candidate devices waiting to be bound with the NFC tag, and the candidate devices in the list and the first electronic device are located in the same communication network; if the first electronic device detects a first operation in which the user selects M (M is an integer greater than 0) second electronic devices from the candidate device list, then in response to the first operation, the first electronic device may prompt the user to bring the first electronic device close to or touch the aforementioned NFC tag, so that the first electronic device can write the identifiers of the M second electronic devices into the NFC tag to establish a binding relationship between the NFC tag and the M second electronic devices.
  • In this way, during subsequent content projection, the one or more second electronic devices bound to the NFC tag, that is, the target devices of the content projection, can be determined by reading the identifiers of the devices bound to the NFC tag (a sketch of the tag-writing step follows below).
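  • As an illustration of the tag-writing step, the following Android-style sketch writes the selected device identifiers into the NFC tag as a single NDEF record. The MIME type and the comma-separated payload format are assumptions made for this example only.

```java
import android.nfc.FormatException;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.Tag;
import android.nfc.tech.Ndef;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class TagBinder {

    // MIME type chosen here for illustration only; this application does not fix a record format.
    private static final String MIME_TYPE = "application/vnd.example.projection-binding";

    /** Writes the selected device identifiers (e.g. MAC addresses) into the NFC tag. */
    public void bindDevices(Tag tag, List<String> deviceIds) throws IOException, FormatException {
        String payload = String.join(",", deviceIds);
        NdefRecord record = NdefRecord.createMime(MIME_TYPE,
                payload.getBytes(StandardCharsets.UTF_8));
        NdefMessage message = new NdefMessage(record);

        Ndef ndef = Ndef.get(tag);   // assumes an NDEF-formatted, writable tag
        ndef.connect();
        try {
            ndef.writeNdefMessage(message);  // establishes the binding relationship on the tag
        } finally {
            ndef.close();
        }
    }
}
```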
  • In a possible implementation, the first electronic device displaying the binding interface of the NFC tag includes: the first electronic device reads a preset flag bit in the NFC tag; if the value of the flag bit is a first preset value, it indicates that the NFC tag has not been bound to any electronic device, and the first electronic device can open a preset projection application to display the binding interface of the NFC tag.
  • In a possible implementation, after the binding relationship is established, the method further includes: the first electronic device modifies the value of the flag bit from the first preset value to a second preset value, thereby indicating that the NFC tag has been bound to one or more electronic devices (a small flag-handling sketch is shown below).
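  • The flag bit could be as simple as a single reserved byte on the tag; the following sketch, whose constant values are purely hypothetical, shows the check-then-update logic described above.

```java
/** Illustrative flag-bit handling; the preset values and storage layout are assumptions. */
public class BindingFlag {

    public static final byte UNBOUND = 0x00;  // first preset value: tag not yet bound
    public static final byte BOUND   = 0x01;  // second preset value: tag already bound

    /** Decides whether to open the binding interface based on the flag byte read from the tag. */
    public boolean shouldShowBindingInterface(byte flag) {
        return flag == UNBOUND;
    }

    /** Value to write back once the device identifiers have been stored on the tag. */
    public byte flagAfterBinding() {
        return BOUND;
    }
}
```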
  • In a possible implementation, the method further includes: the first electronic device displays a setting interface of the projection strategy; the first electronic device receives, in the setting interface, a projection strategy input by the user for the above M second electronic devices, and saves the projection strategy. That is, after the first electronic device establishes the corresponding binding relationship in the NFC tag, the user can continue to set, in the projection application, the projection strategy used when content is projected to the M second electronic devices bound to the NFC tag.
  • For example, the aforementioned projection strategy may include correspondences between different NFC operations and projection instructions, such as a correspondence between touching the NFC tag once and projection instruction 1, and a correspondence between touching the NFC tag twice and projection instruction 2.
  • In a possible implementation, the aforementioned projection strategy may include content projection rules set for each second electronic device.
  • For example, if the above M second electronic devices include a TV, a speaker, and a lamp, the user can separately set, in the setting interface, the specific projection rules used when projecting to the TV, the speaker, and the lamp.
  • For example, for the speakers, the projection strategy may be: use the speaker closest to the source device to play the projected content; or, the projection strategy may be: use the first speaker to play the first audio component in the projected content and use the second speaker to play the second audio component in the projected content.
  • For example, for the TV and the speaker, the projection strategy may be: use the TV to play the display content in the projected content and use the speaker to play the audio content in the projected content; or, use the TV to play the display content in the projected content and use both the speaker and the TV to play the audio content in the projected content.
  • For example, for the TV and the lamp, the projection strategy may be: use the TV to play the projected content, and have the TV control the lighting effect of the lamp (a configuration sketch of such strategies follows below).
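  • Purely as an illustration of how such per-device rules might be represented in software (this application leaves the encoding open), the following sketch models a projection strategy as a list of routing rules that map content components to bound target devices; all type names, field names, and device identifiers are hypothetical.

```java
import java.util.List;

/** Hypothetical in-memory model of a projection strategy bound to an NFC tag. */
public class ProjectionStrategy {

    public enum Component { DISPLAY, AUDIO, AUDIO_LEFT, AUDIO_RIGHT, LIGHTING_CONTROL }

    /** One rule: which content component goes to which bound device. */
    public record Rule(Component component, String targetDeviceId) { }

    private final List<Rule> rules;

    public ProjectionStrategy(List<Rule> rules) {
        this.rules = List.copyOf(rules);
    }

    /** Example strategy: TV shows the picture and drives the lamp, speakers split the audio. */
    public static ProjectionStrategy livingRoomExample() {
        return new ProjectionStrategy(List.of(
                new Rule(Component.DISPLAY, "tv-01"),
                new Rule(Component.AUDIO_LEFT, "speaker-left"),
                new Rule(Component.AUDIO_RIGHT, "speaker-right"),
                new Rule(Component.LIGHTING_CONTROL, "tv-01")  // TV adjusts the lamp per the content
        ));
    }

    public List<Rule> rules() {
        return rules;
    }
}
```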
  • In a possible implementation, the method further includes: the first electronic device sends the binding relationship between the NFC tag and the M second electronic devices to another electronic device or a server.
  • the first electronic device can share the aforementioned binding relationship with other electronic devices for use, or the user can also obtain the aforementioned binding relationship when logging in to the server using other electronic devices.
  • the candidate device in the candidate device list and the first electronic device may be located in the same Wi-Fi network, or the candidate device in the candidate device list and the first electronic device may be bound to the same account.
  • In a possible implementation, the first electronic device writing the identifiers of the second electronic devices into the NFC tag includes: in response to a touch operation of the first electronic device approaching or touching the NFC tag, the first electronic device writes the identifiers of the second electronic devices into the NFC tag; or, after the first electronic device uses its NFC chip to detect the NFC signal from the NFC tag, it writes the identifiers of the second electronic devices into the NFC tag.
  • the user can trigger the first electronic device to write the identification of the second electronic device into the NFC tag by approaching or touching the NFC tag.
  • reading the preset flag in the NFC tag by the first electronic device includes: in response to the touch operation of the first electronic device approaching or touching the NFC tag, the first electronic device can read the preset flag in the NFC tag Flag bit; or, after the first electronic device uses its NFC chip to detect the NFC signal from the NFC tag, it can read the flag bit preset in the NFC tag. In other words, the user can trigger the first electronic device to read the preset flag in the NFC tag by approaching or touching the NFC tag.
  • In a third aspect, the present application provides a content projection system, including a first electronic device, N second electronic devices, and an NFC tag, where N is an integer greater than 1; the NFC tag stores a binding relationship between the NFC tag and the above N second electronic devices; and the first electronic device is configured to execute the cross-device content projection method described in any one of the above.
  • In a possible implementation, the foregoing N second electronic devices include a master device, where the master device is configured to: receive the first content sent by the first electronic device; and control, according to a preset projection strategy, at least one of the N second electronic devices to play the first content.
  • Alternatively, the first electronic device can serve as the master device and control, according to a preset projection strategy, at least one of the N second electronic devices to play the first content.
  • In a fourth aspect, the present application provides an electronic device, including: a touch screen, a communication interface, one or more processors, a memory, and one or more computer programs. The processors are coupled with the touch screen, the communication interface, and the memory; the one or more computer programs are stored in the memory, and when the electronic device runs, the processors execute the one or more computer programs stored in the memory, so that the electronic device executes any one of the above cross-device content projection methods.
  • In a fifth aspect, the present application provides a computer storage medium, including computer instructions, where when the computer instructions run on an electronic device, the electronic device is caused to execute the cross-device content projection method described in any one of the above.
  • In a sixth aspect, this application provides a computer program product, where when the computer program product runs on an electronic device, the electronic device is caused to execute the cross-device content projection method described in any one of the above.
  • The content projection system described in the third aspect, the electronic device described in the fourth aspect, the computer-readable storage medium described in the fifth aspect, and the computer program product described in the sixth aspect provided above are all configured to execute the corresponding methods provided above. Therefore, for the beneficial effects that they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
  • FIG. 1 is a schematic diagram 1 of the architecture of a content projection system provided by an embodiment of the application;
  • FIG. 2 is a second schematic diagram of the architecture of a content projection system provided by an embodiment of this application.
  • FIG. 3 is a third schematic diagram of the architecture of a content projection system provided by an embodiment of the application.
  • FIG. 4 is a fourth schematic diagram of the architecture of a content projection system provided by an embodiment of this application.
  • FIG. 5 is a first structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 6 is a schematic structural diagram of an operating system in an electronic device provided by an embodiment of this application.
  • FIG. 7 is a schematic diagram 1 of an application scenario of a cross-device content projection method provided by an embodiment of this application;
  • FIG. 8 is a first schematic flowchart of a cross-device content projection method according to an embodiment of the application.
  • FIG. 9 is a schematic diagram 2 of an application scenario of a cross-device content projection method provided by an embodiment of the application.
  • FIG. 10 is a third schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application.
  • FIG. 11 is a fourth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram 5 of an application scenario of a cross-device content projection method provided by an embodiment of this application;
  • FIG. 13 is a sixth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application.
  • FIG. 14 is a schematic diagram 7 of an application scenario of a cross-device content projection method provided by an embodiment of this application;
  • FIG. 15 is an eighth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application;
  • FIG. 16 is a second schematic flowchart of a cross-device content projection method according to an embodiment of the application;
  • FIG. 17 is a schematic diagram 9 of an application scenario of a cross-device content projection method provided by an embodiment of the application.
  • FIG. 18 is a tenth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application.
  • FIG. 19 is a schematic diagram eleventh of an application scenario of a cross-device content projection method provided by an embodiment of this application.
  • FIG. 20 is a twelfth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application;
  • FIG. 21 is a thirteenth schematic diagram of an application scenario of a cross-device content projection method provided by an embodiment of this application.
  • FIG. 22 is a second structural diagram of an electronic device provided by an embodiment of this application.
  • the cross-device content projection method provided by the embodiment of the present application can be applied to the communication system (also referred to as the content projection system) 100 shown in FIG. 1.
  • the communication system 100 may include N (N is an integer greater than 1) electronic devices. These N electronic devices can be interconnected through a communication network.
  • the foregoing communication network may be a wired network or a wireless network.
  • the aforementioned communication network may be a local area network (LAN), or a wide area network (wide area network, WAN), such as the Internet.
  • the above-mentioned communication network can be implemented using any known network communication protocol.
  • For example, the above-mentioned network communication protocol may be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FireWire (FIREWIRE), global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol.
  • various electronic devices in the communication system 100 may establish a Wi-Fi connection through a Wi-Fi protocol.
  • each electronic device in the communication system 100 can be interconnected through one or more servers after logging into the same account (for example, a Huawei account).
  • the above-mentioned communication system 100 may include a first electronic device 101 and a second electronic device 102.
  • the first electronic device 101 can be used as a source device
  • the second electronic device 102 can be used as a target device of the first electronic device 101.
  • That is, the first electronic device 101 can project the content displayed or played by the first electronic device 101 to the second electronic device 102.
  • the specific content projected from one electronic device to another electronic device is called projected content.
  • the projected content can include text, pictures, videos, audio, animation, lighting effects, or web pages.
  • the electronic device can send the projected content such as text, picture, video, audio, animation or webpage to another electronic device for display or playback; or the electronic device can also send the light control instruction as the projected content to another electronic device.
  • the aforementioned communication system 100 may also include a third electronic device 103.
  • When the first electronic device 101 is the source device, both the second electronic device 102 and the third electronic device 103 can serve as target devices of the first electronic device 101 to receive the projected content sent by the first electronic device 101.
  • the first electronic device 101 can simultaneously project the projected content to multiple electronic devices for display or playback.
  • a mobile phone can project its audio files to multiple speakers for playback at the same time.
  • the mobile phone can project the displayed video screen to the TV for display, and at the same time project the audio content corresponding to the video screen to the speaker for playback.
  • the source device in the communication system 100 can project the projected content to one or more target devices, so as to realize cross-device interaction when content projection is performed between multiple devices.
  • An electronic tag bound to one or more electronic devices may also be provided in the aforementioned communication system 100; the electronic tag may also be referred to as a radio frequency tag or an RFID (radio frequency identification) tag.
  • Electronic equipment can read the information stored in the electronic tag by sending radio frequency signals.
  • the above electronic tags may include three implementation forms, namely: passive tags, semi-active tags, and active tags.
  • the above electronic tag may be any one of a passive tag, a semi-active tag, or an active tag.
  • (1) Passive tag: When the electronic tag is a passive tag, there is no internal power supply in the electronic tag. When the electronic tag is close to the NFC (near field communication) chip of another device, it can receive the electromagnetic wave signal sent by the NFC chip of that device. At this time, the internal integrated circuit (IC) of the electronic tag is driven by the received electromagnetic wave signal. When the electronic tag receives an electromagnetic wave signal of sufficient strength, it can send the data stored in the electronic tag, such as the device information of a bound electronic device, to the NFC chip of the other device.
  • (2) Semi-active tag: The working mode of a semi-active tag is similar to that of a passive tag.
  • When the electronic tag is a semi-active tag, the electronic tag includes a small battery whose power is sufficient to drive the internal IC of the electronic tag, so that the IC is in a working state. Because the semi-active tag includes the above-mentioned small battery, the reaction speed of a semi-active tag is faster than that of a passive tag.
  • (3) Active tag: When the electronic tag is an active tag, the electronic tag includes an internal power supply that supplies the power required by the internal IC and generates external signals. Generally speaking, an active tag allows radio frequency communication over a longer distance, and an active tag has a larger storage space, which can be used to store data transmitted by the NFC chip of another device.
  • the above-mentioned electronic tag may specifically be an NFC tag 301 implemented using NFC technology (the NFC tag may also be referred to as an NFC patch).
  • When an electronic device such as a mobile phone approaches the NFC tag 301, the NFC chip in the phone can detect the NFC signal sent by the NFC tag 301 and then read, through the NFC signal, the information stored in the NFC tag 301.
  • the mobile phone can obtain the information stored in the NFC tag 301 from the NFC tag 301 in response to a touch operation of approaching or contacting the NFC tag 301.
  • a coil is generally provided in the NFC tag 301, and the binding relationship between the NFC tag 301 and one or more electronic devices can be stored through the coil.
  • One electronic device can be bound to one or more NFC tags 301.
  • For example, each NFC tag 301 uniquely corresponds to an NFC card number; the NFC card number and the identifier of electronic device A can be written into the coil of the NFC tag 301 in advance, so as to establish, in the NFC tag 301, the binding relationship between the NFC tag 301 and electronic device A.
  • The binding relationship stored in the NFC tag 301 may be preset when the NFC tag 301 is shipped from the factory, or may be manually set by the user during use (for example, when using the NFC tag 301 for the first time), which is not limited in the embodiments of the present application.
  • Take the binding of the NFC tag 301 to the TV (also called a smart TV) in the communication system 100 as an example. As shown in FIG. 3, when the user needs to project, as the projected content, the content displayed or played in the source device (such as a mobile phone) to the smart TV, the user can turn on the NFC function of the mobile phone and bring the phone close to or touch the NFC tag 301.
  • the mobile phone can read the binding relationship between the NFC tag 301 and the smart TV from the NFC tag 301 by transmitting a near-field signal.
  • the mobile phone can read the identification of the smart TV from the NFC tag 301.
  • The identifier may be the MAC (media access control) address, the device name, or the IP address of the smart TV.
  • In this way, the mobile phone can determine, by reading the above-mentioned binding relationship, that the target device for this content projection is the smart TV. Then, as the source device, the mobile phone can start sending the projected content to the smart TV according to the read identifier of the smart TV, so that the smart TV can serve as the target device to display or play the projected content and complete this content projection process.
  • the above-mentioned TV may be an analog TV that uses analog signals, or a digital TV that uses digital signals, or any display output device that can play images, audio, or video.
  • the above-mentioned TV may also be referred to as a smart screen or a large-screen device.
  • the NFC tag 301 can record the binding relationship between the NFC tag 301 and multiple electronic devices.
  • the NFC tag 301 can be bound to both a smart TV and a speaker (also called a smart speaker).
  • In this case, when the mobile phone approaches or touches the NFC tag 301, the identifiers of the smart TV and the smart speaker can be read, indicating that the user wants to project the content of the mobile phone to the smart TV and the smart speaker.
  • Then, the mobile phone can project the display content of the projected content to the smart TV for display according to a preset strategy, and at the same time project the audio content of the projected content to the smart speaker for playback, to complete this content projection process (a routing sketch is given below).
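  • To make this routing step concrete, here is an illustrative sketch (all interfaces are hypothetical and the actual network transport is left abstract) of how a source device might split the projected content into a display part and an audio part and dispatch each part to the corresponding bound device.

```java
/** Illustrative dispatcher that routes projected content per a simple strategy. */
public class ProjectionDispatcher {

    /** Hypothetical handle to a bound target device reachable over the local network. */
    public interface TargetDevice {
        void playDisplay(byte[] encodedVideoFrame);
        void playAudio(byte[] encodedAudioFrame);
    }

    private final TargetDevice smartTv;
    private final TargetDevice smartSpeaker;

    public ProjectionDispatcher(TargetDevice smartTv, TargetDevice smartSpeaker) {
        this.smartTv = smartTv;
        this.smartSpeaker = smartSpeaker;
    }

    /** Sends the display content to the TV and the audio content to the speaker. */
    public void project(byte[] encodedVideoFrame, byte[] encodedAudioFrame) {
        smartTv.playDisplay(encodedVideoFrame);    // display content -> smart TV
        smartSpeaker.playAudio(encodedAudioFrame); // audio content  -> smart speaker
    }
}
```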
  • the source device can easily and quickly determine the target device for this content projection, so as to automatically start projecting the content to the target device. It simplifies the user's operation process when performing content projection across devices, improves and enriches the user experience, and at the same time improves the work efficiency of collaboration between multiple devices during content projection.
  • The electronic devices in the aforementioned communication system 100 may specifically be mobile phones, tablet computers, TVs, laptops, smart home devices, wearable electronic devices, vehicle-mounted devices, virtual reality devices, and the like, which is not limited in the embodiments of this application.
  • For example, the smart home device may specifically be: a TV, a speaker, an air conditioner (also known as a smart air conditioner), a refrigerator (also known as a smart refrigerator), an electric light (also known as a smart light or a smart bulb), a curtain (also known as a smart curtain), and so on.
  • FIG. 5 shows a schematic diagram of the structure of the mobile phone.
  • The mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, and so on.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • For example, the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G, etc., which are applied to mobile phones.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • The wireless communication module 160 can provide wireless communication solutions applied to the mobile phone, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • For example, the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • For example, the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the mobile phone realizes the display function through GPU, display screen 194, and application processor.
  • the GPU is an image processing microprocessor, which is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), or the like.
  • the mobile phone may include one or N display screens 194, and N is a positive integer greater than one.
  • the mobile phone can realize the shooting function through ISP, camera 193, video codec, GPU, display 194 and application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the mobile phone may include one or N cameras 193, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the phone can support one or more video codecs.
  • In this way, the mobile phone can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the mobile phone can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
  • the mobile phone can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the voice can be heard by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "voice tube", is used to convert sound signals into electrical signals.
  • the user can speak close to the microphone 170C to input a sound signal into the microphone 170C.
  • the mobile phone can be equipped with at least one microphone 170C.
  • the mobile phone can be equipped with two microphones 170C, in addition to collecting sound signals, it can also achieve noise reduction.
  • the mobile phone can also be equipped with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the mobile phone may also include a charging management module, a power management module, a battery, a button, an indicator, and one or more SIM card interfaces, etc.
  • the above-mentioned mobile phone software system may adopt a layered architecture, event-driven architecture, micro-core architecture, micro-service architecture, or cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of a mobile phone by way of example.
  • FIG. 6 shows a software structure block diagram of a mobile phone according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • For example, the applications may include camera, gallery, calendar, call, memo, browser, contacts, map, Bluetooth, music, video, short message, and other APPs.
  • a projection application can also be installed in the application layer. Users can open the projection application from the desktop, settings, or drop-down menus.
  • the above projection application can be used as a bridge between the mobile phone (i.e., the source device) and other electronic devices (i.e., the target devices) when content is projected, and sends the projection content of the application that needs to be projected in the mobile phone to the target device.
  • For example, the projection application can receive a projection event reported by the application framework layer. The projection application can then interact with the running application (such as a video APP) and send the content being displayed or played in that application to the target device as the projection content through wireless communication such as Wi-Fi.
  • the user can also use the above projection application to set the binding relationship between the NFC tag and one or more electronic devices.
  • the projection application can display a list of electronic devices to be bound.
  • the mobile phone can be brought close to the NFC tag that needs to be bound. In this way, the mobile phone can write the identifiers of the electronic devices selected by the user in the projection application into the NFC tag through an NFC signal, thereby establishing, in the NFC tag, a binding relationship between the NFC tag and the one or more electronic devices.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the NFC service (NFC service) can be run in the application framework layer.
  • the mobile phone can start to run the NFC service in the application framework layer after turning on the NFC function.
  • the NFC service can call the NFC driver of the kernel layer to read the binding relationship stored in the NFC tag, thereby obtaining the target device for this content projection.
  • the NFC service can report the projection event to the above-mentioned projection application, thereby triggering the projection application to send the content being displayed or played by the mobile phone as the projection content to the target device to start the content projection process.
  • the application framework layer may also include Wi-Fi service, window manager, content provider, view system, phone manager, resource manager, etc., in the embodiment of the present application There are no restrictions on this.
  • Wi-Fi services can be used to provide Wi-Fi-related functions such as joining a Wi-Fi network or establishing a Wi-Fi P2P connection with another electronic device.
  • the above-mentioned window manager is used to manage window programs. The window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the above-mentioned content provider is used to store and obtain data, and make these data accessible to applications. The data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the above-mentioned view system includes visual controls, such as controls that display text, controls that display pictures, and so on. The view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the above-mentioned telephone manager is used to provide the communication function of the mobile phone. For example, the management of the call status (including connecting, hanging up, etc.).
  • the aforementioned resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the system library can include multiple functional modules.
  • For example, the system library may include a surface manager, media libraries, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • each NFC tag 701 can store its own NFC card number in the NFC tag 701 when it leaves the factory.
  • a flag bit can be preset in each NFC tag 701, and the flag bit can be used to indicate whether the NFC tag 701 has established a binding relationship with an electronic device. For example, when the flag bit in the NFC tag 701 is 00, it means that the NFC tag 701 has not been bound to any electronic device; when the flag bit in the NFC tag 701 is 01, it means that the NFC tag 701 has been bound to one or more electronic devices.
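  • As a minimal illustration of how a source device might interpret this flag bit, the following Java sketch assumes the flag is read back as a single byte; the constant names and the byte encoding are assumptions for illustration, not something specified by the patent.

```java
/** Minimal sketch of the preset flag bit described above; names and encoding are assumptions. */
class NfcTagFlag {
    static final byte UNBOUND = 0x00; // 00: the NFC tag has not been bound to any electronic device
    static final byte BOUND   = 0x01; // 01: the NFC tag has been bound to one or more electronic devices

    /** Returns true if the tag already stores a binding relationship. */
    static boolean isBound(byte flagBit) {
        return flagBit == BOUND;
    }
}
```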
  • When the user uses the NFC tag 701 for the first time, he can use a preset projection application to establish, in the NFC tag 701, a binding relationship between the NFC tag 701 and one or more electronic devices.
  • the method of using the projection application to establish the above-mentioned binding relationship in the NFC tag 701 may include the following steps:
  • the mobile phone displays an NFC tag binding interface of the projection application, and the interface includes a list of devices to be bound.
  • the mobile phone and the NFC tag 701 can interact through the NFC signal, so that the mobile phone can read the NFC card number and the preset flag in the NFC tag 701. If the flag bit is 00, it means that the NFC tag 701 has not been bound to the electronic device. Furthermore, as shown in FIG. 9, the mobile phone may prompt the user to establish a binding relationship between the NFC tag 701 and one or more electronic devices.
  • the mobile phone can open the projection application and automatically jump to the binding interface 1001 of the NFC tag 701.
  • the mobile phone can display a device list 1002 composed of one or more electronic devices.
  • the electronic devices in the device list 1002 are all devices that can be bound with the NFC tag 701.
  • the electronic device in the device list 1002 may be one or more devices that log in to the same account (for example, a Huawei account) as a mobile phone.
  • the electronic device in the device list 1002 may be one or more devices that access the same Wi-Fi network as the mobile phone. The user can select the electronic device that needs to be bound with the NFC tag 701 in the device list 1002.
  • the NFC tag 701 may be bound to one or more electronic devices.
  • the user can select one or more electronic devices in the device list 1002 as the binding device of the NFC tag 701.
  • the binding option 1101 of a single electronic device and the binding option 1102 of multiple electronic devices can be preset in the projection application.
  • the mobile phone can prompt the user to select an electronic device from the device list to bind with the NFC tag 701 in the corresponding binding interface.
  • the mobile phone can display one or more preset device groups 1103 in the corresponding binding interface, and each device group includes multiple electronic devices. For example, smart TV and smart speaker 1 are a device group, smart speaker 1 and smart speaker 2 are a device group, and smart TV and smart light bulb are a device group. In this way, the user can trigger the mobile phone to bind the NFC tag 701 with multiple electronic devices in the device group by selecting the device group in the binding interface.
  • the mobile phone receives the user's first operation of selecting a bound device in the foregoing device list.
  • In step S802, after the mobile phone displays the binding interface of the above projection application, the user can select, from the device list or device groups listed in the binding interface, one or more electronic devices to be bound to the NFC tag 701.
  • One or more electronic devices selected by the user may be referred to as a binding device of the NFC tag 701.
  • After the mobile phone detects that the user has selected the bound device on the binding interface, it can continue to perform the following steps S803-S804.
  • In response to the first operation, the mobile phone prompts the user to bring the mobile phone close to the NFC tag 701 to be bound.
  • the mobile phone detects that the user has selected the smart TV and the smart light bulb in the above binding interface, and can determine the binding relationship between the NFC tag 701 and the smart TV and the smart light bulb. At this time, the mobile phone needs to write the binding relationship into the NFC tag 701. Since the mobile phone and the NFC tag 701 need to communicate through a short-distance NFC signal, as shown in Figure 12, if the mobile phone does not detect the NFC signal sent by the NFC tag 701, the mobile phone can display a prompt 1201 in the projection application. The prompt 1201 is used to guide the user to bring the mobile phone close to or touch the NFC tag 701 waiting to be bound with the smart TV and the smart light bulb.
  • the mobile phone writes the identification of the aforementioned bound device into the NFC tag 701 to establish a binding relationship between the NFC tag 701 and the bound device.
  • the user can approach or touch the NFC tag 701 according to the prompt shown in FIG. 12.
  • the mobile phone can detect the NFC signal sent by the NFC tag 701.
  • the mobile phone can write the identification of the binding device set by the user in the binding interface into the NFC tag 701.
  • the mobile phone can write the MAC address, device name, or IP address of the bound device into the NFC tag 701. In this way, the binding relationship between the NFC tag 701 and the binding device is established in the NFC tag 701.
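  • The patent does not specify the on-tag data format, but on Android the binding could plausibly be written as an NDEF record. The sketch below is one possible implementation using the standard android.nfc APIs; the MIME type and the semicolon-separated payload layout are assumptions for illustration, not the patent's defined format.

```java
import android.nfc.FormatException;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.Tag;
import android.nfc.tech.Ndef;

import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class TagBindingWriter {
    // Hypothetical MIME type marking the binding record; not defined by the patent.
    static final String MIME_BINDING = "application/vnd.example.projection-binding";

    /** Writes the identifiers (e.g. MAC addresses) of the selected bound devices into the NFC tag. */
    public static void writeBoundDevices(Tag tag, String[] deviceIds) throws IOException, FormatException {
        String payload = String.join(";", deviceIds);           // e.g. "AA:BB:CC:DD:EE:FF;11:22:33:44:55:66"
        NdefRecord record = NdefRecord.createMime(MIME_BINDING, payload.getBytes(StandardCharsets.UTF_8));
        NdefMessage message = new NdefMessage(new NdefRecord[]{record});

        Ndef ndef = Ndef.get(tag);                              // null if the tag is not NDEF-formatted
        if (ndef == null) {
            throw new IOException("Tag does not support NDEF");
        }
        ndef.connect();
        try {
            if (!ndef.isWritable() || ndef.getMaxSize() < message.toByteArray().length) {
                throw new IOException("Tag is read-only or too small for the binding record");
            }
            ndef.writeNdefMessage(message);                     // the binding relationship is now stored in the tag
        } finally {
            ndef.close();
        }
    }
}
```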
  • a source device such as a mobile phone that performs content projection can determine one or more electronic devices bound to the NFC tag 701 by reading the identifier of the bound device in the NFC tag 701, that is, the target device that performs content projection.
  • the NFC tag 701 can modify the preset flag bit from 00 to 01 to indicate that the current NFC tag 701 has been bound to one or more electronic devices.
  • the user can continue to set the projection policy of the bound device bound to the NFC tag 701 in the projection application when performing content projection.
  • the mobile phone can provide different projection instructions corresponding to different NFC operations in the setting interface 1301 for the user to choose. For example, the user can set that when the NFC tag 701 is touched once, the corresponding projection instruction is to start projection. For example, the user can set that when the NFC tag 701 is continuously touched twice, the corresponding projection instruction is to play the next episode (or the next song). For another example, the user can set that when the NFC tag 701 is touched for more than a preset time, the corresponding projection instruction is to quit this content projection.
  • After the mobile phone receives the projection strategy set by the user in the setting interface 1301, it can establish the association among the NFC tag 701, the smart TV, and the above projection strategy. Subsequently, the user can trigger the mobile phone, by approaching or touching the NFC tag 701, to perform content projection to the smart TV according to the projection strategy set by the user, thereby simplifying the operation process of cross-device content projection.
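  • One way the phone could represent the association between NFC operations and projection instructions is a simple lookup, sketched below in plain Java; the operation and instruction names are placeholders, and how touches are actually counted or timed is not shown here.

```java
/** Hypothetical NFC operations detected by the source device. */
enum NfcOperation { SINGLE_TOUCH, DOUBLE_TOUCH, LONG_TOUCH }

/** Hypothetical projection instructions that the user can associate with each operation. */
enum ProjectionInstruction { START_PROJECTION, PLAY_NEXT_EPISODE, EXIT_PROJECTION }

class ProjectionPolicy {
    /** Maps a detected NFC operation to the projection instruction configured in setting interface 1301. */
    ProjectionInstruction resolve(NfcOperation op) {
        switch (op) {
            case DOUBLE_TOUCH: return ProjectionInstruction.PLAY_NEXT_EPISODE; // tag touched twice in a row
            case LONG_TOUCH:   return ProjectionInstruction.EXIT_PROJECTION;   // tag touched longer than a preset time
            default:           return ProjectionInstruction.START_PROJECTION;  // tag touched once
        }
    }
}
```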
  • When the NFC tag 701 is bound to multiple devices, the user can also set the projection strategy used when these bound devices perform content projection. For example, when projecting content to the smart TV, smart speaker, and smart bulb, the user can set in the setting interface 1401 that the display content of the source device is projected to the smart TV for display, the audio content of the source device is projected to the smart speaker for playback, and the smart bulb changes its lighting effect according to the display content or the audio content.
  • the user can further set a specific projection strategy when projecting display content in a smart TV, a specific projection strategy when projecting audio content in a speaker, etc.
  • the embodiment of the present application does not impose any limitation on this.
  • After the mobile phone receives the projection strategy set by the user in the setting interface 1401, it can establish the association among the NFC tag 701, the bound devices (that is, the smart TV, the smart speaker, and the smart bulb), and the above projection strategy. Subsequently, the user can trigger the mobile phone, by approaching or touching the NFC tag 701, to perform content projection to the above three bound devices according to the projection strategy set by the user, thereby simplifying the operation process of cross-device content projection.
  • the projection strategy used by the bound devices of the NFC tag 701 during content projection can be manually set by the user using the projection application, or it can be preset by the mobile phone according to the type, location, and device capabilities of the bound devices.
  • For example, when the bound devices of the NFC tag 701 are smart speaker 1 and smart speaker 2, the mobile phone can set the projection strategy by default to use the smart speaker closest to the user for content projection.
  • the above-mentioned projection strategy may also be dynamically set by the source device during content projection to the bound device of the NFC tag 701.
  • For example, when a mobile phone projects content to the bound devices of the NFC tag 701 (such as a smart TV and a smart speaker), it can dynamically acquire the audio playback capabilities of the smart TV and the smart speaker.
  • the mobile phone can determine to project audio content on the smart TV and/or the smart speaker according to the audio playback capabilities of the smart TV and the smart speaker.
  • the embodiment of the present application does not impose any restrictions on the specific content of the projection strategy and the specific setting method of the projection strategy.
  • the user can set corresponding one or more binding devices for different NFC tags according to the above method.
  • Subsequently, the user can turn on the NFC function of the source device and bring it close to or touch the corresponding NFC tag, so that the one or more bound devices of that NFC tag are used as the target devices for this content projection, thereby starting the content projection process.
  • the method may include the following steps:
  • In response to a touch operation between the mobile phone and the NFC tag 701, the mobile phone obtains one or more bound devices bound to the NFC tag 701.
  • By performing the foregoing steps, the mobile phone has set the corresponding bound devices for the NFC tag 701. Then, when the user wants to project the content (such as display content or audio content) in the mobile phone (i.e., the source device) to the bound devices of the NFC tag 701, as shown in FIG. 17, the user can turn on the NFC function of the mobile phone and touch (or approach) the NFC tag 701 with it, that is, perform the touch operation between the mobile phone and the NFC tag 701.
  • In response to the touch operation between the mobile phone and the NFC tag 701, the mobile phone can read, from the NFC tag 701, the identifiers of one or more bound devices that have been bound to the NFC tag 701, and these bound devices can serve as the target devices of the mobile phone to participate in this content projection.
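  • Correspondingly, when the phone touches the tag, it could read the bound-device identifiers back from the NDEF record written earlier. The following sketch mirrors the hypothetical format used in the write example above; it is one possible implementation, not the patent's specified mechanism.

```java
import android.nfc.FormatException;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.Tag;
import android.nfc.tech.Ndef;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class TagBindingReader {
    /** Reads the identifiers of the bound devices from the NFC tag; empty if no binding record is present. */
    public static List<String> readBoundDevices(Tag tag) throws IOException, FormatException {
        Ndef ndef = Ndef.get(tag);
        if (ndef == null) {
            return Collections.emptyList();
        }
        ndef.connect();
        try {
            NdefMessage message = ndef.getNdefMessage();          // may be null for an empty tag
            if (message != null) {
                for (NdefRecord record : message.getRecords()) {
                    // Same hypothetical MIME type that was used when the binding was written.
                    if (TagBindingWriter.MIME_BINDING.equals(record.toMimeType())) {
                        String payload = new String(record.getPayload(), StandardCharsets.UTF_8);
                        return Arrays.asList(payload.split(";")); // e.g. ["AA:BB:CC:DD:EE:FF", "11:22:33:44:55:66"]
                    }
                }
            }
            return Collections.emptyList();
        } finally {
            ndef.close();
        }
    }
}
```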
  • In other words, the user's operation of touching the NFC tag with the source device can trigger the source device to obtain the target devices participating in this content projection and then automatically complete the subsequent content projection process with those target devices, which simplifies the operation process of content projection and improves the efficiency of multi-device collaborative work.
  • the mobile phone can establish the correspondence between the NFC tag 701 and the corresponding bound device by performing the above steps S801-S804.
  • the mobile phone sends the projected content to the bound device to start the current content projection.
  • If the mobile phone reads the identifier of only one bound device in the NFC tag 701, it means that only one bound device is bound to the NFC tag 701. Then, the target device for this content projection is that bound device.
  • Taking the smart TV as an example, the mobile phone can use the smart TV as the target device for this content projection and send the current projection content to the smart TV to start content projection.
  • the projected content may include content being played by the mobile phone, for example, audio content and/or display content being played by the mobile phone.
  • the display content may include pictures, pictures in videos, or part or all of the content in the current display interface, and so on.
  • the mobile phone can query whether a smart TV is included in the currently connected Wi-Fi network according to the identifier of the smart TV. If a smart TV is included, it means that the smart TV has been connected to the Wi-Fi network, and the mobile phone can dynamically send the projected content to the smart TV through the Wi-Fi network. If the smart TV is not included, indicating that the smart TV has not been connected to the Wi-Fi network where the mobile phone is located, the mobile phone can prompt the user to connect the smart TV to the same Wi-Fi network where the mobile phone is located. Furthermore, the mobile phone can dynamically send the current projected content to the smart TV through the Wi-Fi network.
  • the mobile phone can also automatically establish a wireless communication connection with the smart TV according to the read identifier of the smart TV (for example, the MAC address of the smart TV).
  • the mobile phone can establish a Bluetooth connection or a Wi-Fi P2P connection with a smart TV, etc.
  • the embodiment of the present application does not impose any restriction on this.
  • the projection content sent by the mobile phone to the smart TV may include the display content of the mobile phone.
  • a mobile phone can send each frame of images displayed in real time to a smart TV by mirroring the screen, and the smart TV will display the display interface of the mobile phone synchronously.
  • a mobile phone can send part of the display content such as videos and pictures in the display interface of the mobile phone to a smart TV for display through a screen projection method of DLNA (digital living network alliance).
  • For example, when the mobile phone touches or approaches the aforementioned NFC tag 701, if the mobile phone is displaying the playback interface of video A and the bound device of the NFC tag 701 is a smart TV, the mobile phone can, as the source device, send the entire playback interface (i.e., all the display content in the display interface) to the smart TV as the projection content, or send only the video image of video A in the playback interface (i.e., part of the display content in the display interface) to the smart TV as the projection content.
  • Alternatively, when the mobile phone touches or is close to the aforementioned NFC tag 701, if the mobile phone is displaying the playlist of the video APP and the bound device of the NFC tag 701 is a smart TV, the mobile phone can also, as the source device, send the playlist to the smart TV as the projection content. Subsequently, if the mobile phone detects that the user selects video A in the playlist for playback, the mobile phone can continue to send the playback interface of video A or the video image of video A to the smart TV as the projection content.
  • the projected content sent by the mobile phone to the smart TV may also include audio content being played by the mobile phone.
  • the audio content may be an audio file corresponding to the video screen being displayed by the mobile phone.
  • the smart TV can display or play the screencast content to complete this content projection.
  • Still taking the smart TV as the target device for this content projection as an example, while the mobile phone is projecting content to the smart TV, the user can interact with the NFC tag 701 through the mobile phone to trigger the mobile phone to send corresponding content or instructions to the smart TV, so as to realize the corresponding control functions during the content projection process.
  • the projection policy associated with the NFC tag 701 and the binding device can be preset.
  • the projection strategy includes different projection instructions corresponding to different NFC operations.
  • the projection instruction corresponding to the NFC operation that the mobile phone continuously touches the NFC tag 701 twice can be set to play the next episode (or the next song).
  • Then, when the mobile phone detects that it has continuously touched the NFC tag 701 twice, the mobile phone can send a projection instruction of playing the next episode (or the next song) to the smart TV.
  • the smart TV can perform the operation of playing the next episode (or the next song) in response to the projection instruction.
  • the user can use the source device to input different NFC operations to the NFC tag to implement corresponding control functions, thereby enriching the user experience in the content projection scenario.
  • the mobile phone determines the main device for this content projection.
  • the master device (master) of this content projection may be the source device (ie, mobile phone), or may be one of multiple bound devices bound to the NFC tag 701.
  • the master device can be used as a control node to connect and interact with other devices (ie slave devices) through a star topology.
  • When there are multiple bound devices of the NFC tag 701, the mobile phone can determine the specific master device according to the device types and device capabilities of the multiple bound devices. For example, the mobile phone can query the computing capabilities of these bound devices and determine the bound device with the strongest computing capability as the master device for this content projection. At this time, the mobile phone and the other bound devices can act as slave devices of the master device.
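  • As a rough sketch of the capability-based choice described above (assuming each bound device exposes some numeric computing-capability score, which the patent does not define), the master could be chosen like this:

```java
import java.util.Comparator;
import java.util.List;

/** Minimal model of a bound device with an assumed "computing capability" score. */
class BoundDevice {
    final String id;
    final int computingCapability;   // higher means stronger; how this score is obtained is not specified in the patent

    BoundDevice(String id, int computingCapability) {
        this.id = id;
        this.computingCapability = computingCapability;
    }
}

class MasterSelector {
    /**
     * Chooses the bound device with the strongest computing capability as the master device
     * for this content projection; the phone and the remaining devices then act as slaves.
     */
    static BoundDevice selectMaster(List<BoundDevice> boundDevices) {
        return boundDevices.stream()
                .max(Comparator.comparingInt((BoundDevice d) -> d.computingCapability))
                .orElseThrow(() -> new IllegalArgumentException("no bound devices"));
    }
}
```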
  • the mobile phone may be preset with specific master devices corresponding to different content projection scenes.
  • For example, when the bound devices are a smart TV and a smart bulb, the master device may be the smart TV, and the slave devices are the mobile phone and the smart bulb; when the bound devices are smart speaker 1 and smart speaker 2, the master device may be the mobile phone, and the slave devices are smart speaker 1 and smart speaker 2; when the bound devices are a smart TV and a smart speaker, the master device may be the mobile phone, and the slave devices are the smart TV and the smart speaker.
  • In this way, the mobile phone can determine, according to the identifiers of the multiple bound devices read from the NFC tag 701, the specific master device corresponding to the content projection scene composed of these bound devices.
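  • Alternatively, for the preset mapping from device combinations to master devices described above, a lookup table is one straightforward representation. In the sketch below, the device-name keys are simply the examples from the text, and the fallback to the phone as default master is an assumption.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** Hypothetical preset table of master devices for different content projection scenes. */
class PresetMasterTable {
    private final Map<Set<String>, String> presets = new HashMap<>();

    PresetMasterTable() {
        presets.put(scene("smart TV", "smart bulb"), "smart TV");                 // the TV drives the bulb's light effects
        presets.put(scene("smart speaker 1", "smart speaker 2"), "mobile phone"); // the phone coordinates the two speakers
        presets.put(scene("smart TV", "smart speaker"), "mobile phone");          // the phone splits display and audio content
    }

    /** Returns the preset master for the bound devices read from the NFC tag, defaulting to the phone. */
    String masterFor(Set<String> boundDevices) {
        return presets.getOrDefault(boundDevices, "mobile phone");
    }

    private static Set<String> scene(String... devices) {
        return new HashSet<>(Arrays.asList(devices));
    }
}
```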
  • If the mobile phone is the master device, the mobile phone sends the projection content to each bound device according to the projection strategy.
  • If the mobile phone determines that the master device of this content projection is the mobile phone itself (i.e., the source device), the mobile phone can act as the control node of this content projection and, according to a certain projection strategy, send the current projection content to each bound device (i.e., each target device) in real time. After receiving the projection content, each bound device starts to play or display the projection content.
  • The above projection strategy can be preset by the user when binding the NFC tag 701, or preset by the mobile phone according to the device types and device capabilities of the bound devices, or dynamically generated after the mobile phone determines that it is the master device, which is not limited in the embodiment of this application.
  • For example, when the bound devices of the NFC tag 701 are smart speaker 1 and smart speaker 2, the mobile phone can act as the master device when projecting content to smart speaker 1 and smart speaker 2, and smart speaker 1 and smart speaker 2 can act as slave devices of the mobile phone.
  • the projection strategy can be set to be related to the distance between the mobile phone and the smart speaker 1 and smart speaker 2.
  • the mobile phone can detect the distance between the mobile phone and the smart speaker 1 and the smart speaker 2 respectively. When the distance between the mobile phone and the smart speaker 1 is less than the preset value, and the distance between the mobile phone and the smart speaker 2 is greater than the preset value, it indicates that the user is closer to the smart speaker 1 and farther from the smart speaker 2. Then, the mobile phone can be used as the main device to send the projected content this time to the smart speaker 1, and the smart speaker 1 will play the projected content this time to complete the content projection.
  • the mobile phone can also send the projected content to the smart speaker closest to the mobile phone by default.
  • the mobile phone can send the projected content to the smart speaker 1 and the smart speaker 2 respectively according to the projection strategy of stereo playback.
  • For example, the mobile phone can send the low-frequency components of the projection content to smart speaker 1, which plays the low-frequency components, and send the high-frequency components of the projection content to smart speaker 2, which plays the high-frequency components.
  • Alternatively, the mobile phone can send the audio file corresponding to the left channel of the projection content to smart speaker 1 and, at the same time, the audio file corresponding to the right channel to smart speaker 2, so that smart speaker 1 and smart speaker 2 respectively play the left-channel and right-channel audio of the projection content. When there are more bound speakers, the mobile phone can likewise send the corresponding audio component of the projection content to each speaker, so that the multiple speakers play their received audio components separately to achieve a stereo or surround sound playback effect.
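  • For the left/right-channel case, splitting interleaved stereo PCM into two mono streams is a simple operation. The sketch below assumes 16-bit interleaved samples with an even sample count and leaves out how the resulting buffers are actually transmitted to each speaker.

```java
/**
 * Splits an interleaved 16-bit stereo PCM buffer into a left-channel buffer and a right-channel
 * buffer, so that the left samples can be sent to smart speaker 1 and the right samples to
 * smart speaker 2. Assumes interleaved.length is even (left/right sample pairs).
 */
class ChannelSplitter {
    static short[][] splitStereo(short[] interleaved) {
        short[] left = new short[interleaved.length / 2];
        short[] right = new short[interleaved.length / 2];
        for (int i = 0; i < interleaved.length / 2; i++) {
            left[i] = interleaved[2 * i];       // even-indexed samples belong to the left channel
            right[i] = interleaved[2 * i + 1];  // odd-indexed samples belong to the right channel
        }
        return new short[][]{left, right};
    }
}
```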
  • the mobile phone before the mobile phone sends the projected content to the smart speaker 1 and the smart speaker 2, it can also send a synchronization command to the smart speaker 1 and the smart speaker 2, and the smart speaker 1 and the smart speaker 2 can synchronize time with the mobile phone according to the synchronization command.
  • the mobile phone can mark one or more timestamps in the projected content to be sent, and send the projected content and the timestamps in the projected content to the smart speaker 1 and the smart speaker 2 together.
  • In this way, smart speaker 1 and smart speaker 2 can play each segment of the projection content according to the timestamps in the projection content, ensuring that the playback progress of smart speaker 1 and smart speaker 2 is the same.
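  • A minimal way to attach timestamps to the projection content, assuming it is cut into fixed-size segments of known duration (both assumptions made here for illustration), could look like this:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/** A segment of projection content tagged with the play position it belongs to. */
class TimedSegment {
    final long timestampMs;   // play position of this segment on the clock synchronized with the phone
    final byte[] data;

    TimedSegment(long timestampMs, byte[] data) {
        this.timestampMs = timestampMs;
        this.data = data;
    }
}

class TimestampMarker {
    /**
     * Cuts the projection content into fixed-length segments and marks each segment with a
     * timestamp, so that speakers which have synchronized time with the phone can play every
     * segment at the same progress.
     */
    static List<TimedSegment> mark(byte[] content, int segmentBytes, long msPerSegment) {
        List<TimedSegment> segments = new ArrayList<>();
        for (int offset = 0, index = 0; offset < content.length; offset += segmentBytes, index++) {
            int end = Math.min(offset + segmentBytes, content.length);
            byte[] chunk = Arrays.copyOfRange(content, offset, end);
            segments.add(new TimedSegment(index * msPerSegment, chunk));
        }
        return segments;
    }
}
```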
  • the mobile phone can also calculate the transmission delay when the smart speaker 1 and the smart speaker 2 respond to the aforementioned synchronization command.
  • the transmission delay of the smart speaker 1 when responding to the aforementioned synchronization command is 300 ms
  • the transmission delay of the smart speaker 2 when responding to the aforementioned synchronization command is 500 ms.
  • the mobile phone can calculate the distance between the mobile phone and the smart speaker 1 and the smart speaker 2 according to the transmission delay.
  • the mobile phone can also detect the distance between the mobile phone and the smart speaker 1 and the smart speaker 2 through a distance sensor, an infrared sensor, etc., and the embodiment of the present application does not impose any limitation on this.
  • In addition, the mobile phone may also send the projection content to smart speaker 1 and smart speaker 2 according to the transmission delays of smart speaker 1 and smart speaker 2. Still taking a transmission delay of 300 ms for smart speaker 1 and 500 ms for smart speaker 2 as an example, the mobile phone can start sending the same projection content to smart speaker 2 200 ms earlier than it sends the projection content to smart speaker 1. In this way, smart speaker 1 and smart speaker 2 can receive the projection content sent by the mobile phone simultaneously and start content projection at the same time.
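  • The delay compensation described above amounts to starting the send to the slower speaker earlier by the difference in measured delays. A sketch follows, assuming the per-speaker delays have already been measured in milliseconds and that some sender callback actually transmits the data; both the Sender interface and the map layout are assumptions for illustration.

```java
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

class DelayCompensatedSender {
    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

    /**
     * Sends the same projection content to several speakers, starting the send to the slower
     * speaker earlier so that all speakers receive the content at roughly the same time.
     * transmissionDelaysMs maps each speaker to its measured delay, e.g. {"speaker1": 300, "speaker2": 500}.
     */
    void sendCompensated(byte[] content, Map<String, Long> transmissionDelaysMs, Sender sender) {
        long maxDelay = transmissionDelaysMs.values().stream().mapToLong(Long::longValue).max().orElse(0);
        transmissionDelaysMs.forEach((speaker, delay) -> {
            // The 500 ms speaker starts at offset 0; the 300 ms speaker starts 200 ms later.
            long startOffset = maxDelay - delay;
            scheduler.schedule(() -> sender.send(speaker, content), startOffset, TimeUnit.MILLISECONDS);
        });
    }

    interface Sender {
        void send(String speakerId, byte[] content);
    }
}
```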
  • In some embodiments, when the mobile phone is the master device for this content projection and smart speaker 1 and smart speaker 2 are slave devices of the mobile phone, the mobile phone can display a setting interface for the projection strategy.
  • the user can manually set which smart speaker is used to play the projected content sent by the mobile phone during this content projection in the setting interface.
  • the mobile phone can save the projection strategy set by the user for the mobile phone, smart speaker 1 and smart speaker 2.
  • Subsequently, when the mobile phone acts as the master device to project content to smart speaker 1 and smart speaker 2, it can perform content projection based on the stored projection strategy. In other words, the user can manually set corresponding projection strategies for the multiple devices participating in the content projection during the content projection process.
  • For another example, when the bound devices of the NFC tag 701 are a smart TV and a smart speaker, the mobile phone can act as the master device when projecting content to the smart TV and the smart speaker, and the smart TV and the smart speaker can act as slave devices of the mobile phone.
  • the projection strategy can be set to use a smart TV to play the display content in the projected content, and use a smart speaker to play the audio content in the projected content.
  • the mobile phone can be used as the main device to send the display content in this projected content to the smart TV, and the smart TV starts to display the display content.
  • the mobile phone can send the audio content in the projected content to the smart speaker, and the smart speaker starts to play the audio content.
  • the mobile phone can be used as the main device to send the display content and audio content in the current projected content to the smart TV, and the smart TV will play the display content and audio content.
  • the mobile phone can send the audio content in the projected content to the smart speaker, and the smart speaker starts to play the audio content. That is, the smart TV and smart speakers can play the audio content projected this time at the same time.
  • the above-mentioned smart TV may include one or more, and the above-mentioned smart speaker may also include one or more, which is not limited in the embodiment of the present application.
  • the mobile phone can perform time synchronization with the smart TV and smart speaker before sending the above display content and audio content to the smart TV and smart speaker. Furthermore, the mobile phone can send the time-stamped display content and audio content to the smart TV and the smart speaker, respectively, so that the smart TV and the smart speaker can simultaneously perform content projection according to the time stamp.
  • the projection strategy when the mobile phone projects content to the smart TV and smart speakers may be dynamically set.
  • For example, the mobile phone, as the master device, can acquire the device capabilities of the smart TV and the smart speaker. Taking a smart TV with display and audio playback capabilities and a smart speaker with audio playback capability as an example, the mobile phone can dynamically determine to project the display content of this projection to the smart TV for display, and at the same time project the audio content of this projection to both the smart TV and the smart speaker for playback.
  • the mobile phone can be used as the main device to send the display content and audio content in the current projection content to the smart TV, and at the same time send the audio content in the current projection content to the smart speaker.
  • the mobile phone sends the projected content to the main device, and the main device controls other bound devices to start this content projection according to the projection policy.
  • If the mobile phone determines that the master device of this content projection is one of the multiple bound devices of the NFC tag 701, the mobile phone can send the current projection content to the master device, and the master device controls the other bound devices to start content projection according to a certain projection strategy.
  • For example, when the bound devices of the NFC tag 701 are a smart TV and a smart light bulb, the smart TV can be used as the master device for content projection, and the smart light bulb can be used as the slave device of the smart TV.
  • the projection strategy can be set to use a smart TV to display and play the projected content, and the smart TV controls the light effect of the smart bulb.
  • In this case, in addition to sending the projection content to the smart TV (i.e., the master device), the mobile phone (i.e., the source device) can also send the projection strategy of this content projection to the smart TV.
  • the smart TV may pre-store the projection strategy when the slave device is a smart bulb, and the embodiment of the present application does not impose any restriction on this.
  • the smart TV can be used as the main device to start displaying and playing the projected content sent by the mobile phone.
  • the smart TV can send corresponding control instructions to the smart bulb according to the projected content, so that the smart bulb can project different lighting effects during the content projection process.
  • For example, when the smart TV starts to display and play the projection content, the smart TV can send a light-off instruction to the smart light bulb to control the smart light bulb to turn off its light source.
  • For another example, the smart TV can obtain the type of video being played. If a horror video is being played, the smart TV can control the smart light bulb to display a blue light source; if a romance video is being played, the smart TV can control the smart light bulb to display a pink light source, and so on, so that the user can get a better scene experience during the content projection process.
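  • How the smart TV maps the video type to a bulb instruction is not specified beyond the two examples above; a trivial sketch of such a mapping, with RGB values chosen arbitrarily, might be:

```java
/** Hypothetical mapping from the type of video being played to a bulb control color. */
class LightEffectController {
    enum VideoType { HORROR, ROMANCE, OTHER }

    /** Returns the RGB color the smart TV (as master device) asks the smart bulb to show. */
    static int[] colorFor(VideoType type) {
        switch (type) {
            case HORROR:  return new int[]{0, 0, 255};     // blue light source for a horror video
            case ROMANCE: return new int[]{255, 192, 203}; // pink light source for a romance video
            default:      return new int[]{255, 255, 255}; // neutral white otherwise
        }
    }
}
```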
  • It should be noted that, when the mobile phone reads the NFC tag 701 and learns that multiple bound devices are bound to the NFC tag 701, the mobile phone can also act as the master device in the content projection process by default. At this time, the mobile phone no longer needs to perform the above steps S1503 and S1505, and can send the projection content to each bound device according to the projection strategy using the relevant method in step S1504 to complete this content projection.
  • It can be seen that the user can easily and quickly project the projection content in the source device to the target devices required by the user by touching the NFC tag, realizing a "one-touch projection" function.
  • Moreover, the source device can project the projection content to multiple target devices at one time, and achieve different projection effects in different projection scenarios through the cooperation of the multiple target devices, which improves the user experience and the efficiency of collaborative work between multiple devices.
  • the mobile phone can also back up the binding relationship between the NFC tag 701 and the bound device to the application server of the projection application. For example, the mobile phone can send the NFC card number of the NFC tag 701 and the identification of one or more bound devices bound to the NFC tag 701 to the application server, so that the application server establishes a connection between the NFC tag 701 and the corresponding bound device. Binding relationship.
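  • A backup like this could be a simple HTTP POST of the NFC card number and the bound-device identifiers to the projection application's server. The endpoint URL, the JSON field names, and the response handling in the sketch below are all assumptions for illustration; the patent only states that the binding relationship is sent to the application server.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

class BindingBackup {
    /** Uploads the NFC card number and the identifiers of the bound devices to the application server. */
    static void backup(String serverUrl, String nfcCardNumber, String[] boundDeviceIds) throws IOException {
        // Hypothetical JSON shape; a real client would use a proper JSON library.
        String json = "{\"nfcCardNumber\":\"" + nfcCardNumber + "\",\"boundDevices\":[\""
                + String.join("\",\"", boundDeviceIds) + "\"]}";

        HttpURLConnection conn = (HttpURLConnection) new URL(serverUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
            throw new IOException("Backup failed: HTTP " + conn.getResponseCode());
        }
        conn.disconnect();
    }
}
```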
  • In this way, a new source device can obtain the binding relationship between the NFC tag 701 and the corresponding bound devices from the application server of the projection application. Then, when the user touches the NFC tag 701 with the new source device, the new source device can also perform the above steps S1501-S1505 to project content to the corresponding bound devices.
  • the NFC tag 701, the corresponding binding device and the corresponding projection strategy can also be shared with other users.
  • the user A can share the NFC tag 701, the bound device, and the projection strategy to the family of the user A (for example, the parents of the user A) through WeChat or the like.
  • After the mobile phone of user A's parent receives the shared content, it can save the correspondence among the NFC tag 701, the bound devices, and the projection strategy.
  • the mobile phone can also perform the above steps S1501-S1505 to project content to the corresponding bound device.
  • In some embodiments, when setting the projection strategy for the devices bound to the NFC tag 701, the user can also set the specific projection content, projection time, and the like in the projection strategy. For example, the user can set the projection content corresponding to the NFC tag 701 to learning video A for the user's child, with a projection time of 1 hour. Then, when the user touches the NFC tag 701 with the mobile phone, or after the user shares the projection strategy with the parents and the parents touch the NFC tag 701 with their mobile phone, the mobile phone can project content to the corresponding bound devices according to the projection content and projection time set in the projection strategy. In this way, the mobile phone completes the content projection in a targeted manner, reducing the difficulty of operation when the elderly and children perform content projection.
  • the embodiment of the application discloses an electronic device including a processor, and a memory, a communication interface, an input device, and an output device connected to the processor.
  • the input device and the output device can be integrated into one device.
  • a touch sensor can be used as an input device
  • a display screen can be used as an output device
  • the touch sensor and display screen can be integrated into a touch screen.
  • the above-mentioned electronic device may include: a touch screen 2201 including a touch sensor 2206 and a display screen 2207; one or more processors 2202; a memory 2203; one or more application programs (not shown); a communication interface 2208; and one or more computer programs 2204. The above-mentioned devices can be connected through one or more communication buses 2205.
  • the one or more computer programs 2204 are stored in the aforementioned memory 2203 and are configured to be executed by the one or more processors 2202. The one or more computer programs 2204 include instructions, and the instructions can be used to execute the steps in the foregoing embodiments. All the relevant content of the steps involved in the above method embodiments can be cited in the functional descriptions of the corresponding physical devices, which will not be repeated here.
  • the above-mentioned processor 2202 may specifically be the processor 110 shown in FIG. 5, the above-mentioned memory 2203 may specifically be the internal memory 121 shown in FIG. 5, and the above-mentioned display screen 2207 may specifically be the display screen 194 shown in FIG. 5.
  • the above-mentioned touch sensor may specifically be a touch sensor in the sensor module 180 shown in FIG. 5, which is not limited in the embodiment of the present application.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk and other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application provides a cross-device content projection method and an electronic device, relating to the field of terminals. The electronic device can conveniently and quickly project projection content to multiple electronic devices for playback, improving the efficiency of collaborative work among multiple devices during content projection. The method includes: a first electronic device starts to play first content; the first electronic device obtains, from an NFC tag, N second electronic devices bound to the NFC tag, where N is an integer greater than 1; and the first electronic device projects, according to a preset projection strategy, the first content onto at least one of the N second electronic devices to continue playback.

Description

一种跨设备的内容投射方法及电子设备
本申请要求于2019年10月30日提交国家知识产权局、申请号为201911047072.0、发明名称为“一种跨设备的内容投射方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端领域,尤其涉及一种跨设备的内容投射方法及电子设备。
背景技术
随着智能终端技术的发展,一个用户或家庭中往往具备多个电子设备,并且,用户经常需要在多个电子设备之间进行切换。例如,用户使用手机观看视频,回家后可能希望将视频切换至电视继续观看。又例如,用户在家可使用笔记本电脑办公,当用户离开家时可能希望将笔记本电脑中的文件切换至手机中继续办公。
在这种跨设备交互的场景下,通常需要用户手动的将一个设备中的内容投射到另一个或多个设备上。例如,用户可以将手机、智能电视、音箱等电子设备接入同一Wi-Fi网络,当用户需要将手机中的内容投射至其他电子设备上时,用户可以使用手机中的投屏功能或投屏软件搜索当前Wi-Fi网络内的多个电子设备。进而,用户可在搜索到的多个电子设备中选择本次接收投射内容的目标设备。这样,手机可通过Wi-Fi网络向目标设备投射图片、视频、音频等投射内容。显然,这种在多个设备之间切换投射内容的过程较耗时较长、操作较为繁琐,用户的使用体验不高。
发明内容
本申请提供一种跨设备的内容投射方法及电子设备,电子设备可方便快捷将投射内容投射至其他多个电子设备中播放,提高内容投射时多设备之间协同的工作效率。
为达到上述目的,本申请采用如下技术方案:
第一方面,本申请提供一种跨设备的内容投射方法,包括:第一电子设备开始播放第一内容,例如,第一内容可以包括显示内容和/或音频内容;进而,当第一电子设备与NFC标签之间的距离足够近时,第一电子设备可从NFC标签中获取与该NFC标签绑定的N(N为大于1的整数)个第二电子设备;这样,第一电子设备可按照预设的投射策略将第一内容投射至上述N个第二电子设备中的至少一个第二电子设备上继续播放。
也就是说,第一电子设备通过读取NFC标签中绑定的第二电子设备,可方便、快速的确定出本次内容投射的多个目标设备,从而自动开始向多个目标设备投射本次投射内容,简化了跨设备进行内容投射时用户的操作流程,提高和丰富了用户的使用体验,同时提高了内容投射时多设备之间协同的工作效率。
其中,第一电子设备向第二电子设备投射的第一内容可以包括第一电子设备正在显示界面中显示的部分或全部显示内容。例如,第一电子设备可将正在显示的第一界 面(例如桌面)中的所有显示内容作为第一内容投射至第二电子设备中。又例如,第一电子设备可将正在显示的播放界面中某一视频中的图像作为第一内容投射至第二电子设备中。
或者,第一电子设备向第二电子设备投射的第一内容也可以包括第一电子设备正在播放音频内容,例如第一电子设备正在播放的音乐或正在播放的与视频同步的音频。当然,当第一电子设备将第一内容投射至第二电子设备后,如果第一电子设备响应用户操作开始播放其他内容(例如第二内容),则第一电子设备可继续将第二内容投射至第二电子设备中播放。
在一种可能的实现方式中,第一电子设备从NFC标签中获取与NFC标签绑定的N个第二电子设备,包括:响应于第一电子设备靠近或接触该NFC标签的碰一碰操作,第一电子设备读取NFC标签中存储的N个第二电子设备的标识,以确定与该NFC标签绑定的N个第二电子设备;或者,第一电子设备使用第一电子设备的NFC芯片检测到来自NFC标签的NFC信号后,通过该NFC信号读取NFC标签中存储的N个第二电子设备的标识,以确定与该NFC标签绑定的N个第二电子设备。
也就是说,用户可以通过靠近或触碰NFC标签的方式,触发第一电子设备通过NFC功能读取NFC标签中存储的第二电子设备的标识,从而确定本次与第一电子设备进行内容投射的N个第二电子设备。
在一种可能的实现方式中,第一电子设备按照预设的投射策略将第一内容投射至该N个第二电子设备中的至少一个第二电子设备上继续播放,包括:第一电子设备按照预设的投射策略,将第一内容发送给N个第二电子设备中的至少一个第二电子设备播放。也就是说,可由第一电子设备作为本次内容投射的主设备,控制第二电子设备进行内容投射。
示例性的,上述N个第二电子设备可包括第一音箱和第二音箱;其中,第一电子设备按照预设的投射策略,将第一内容发送给N个第二电子设备中的至少一个第二电子设备播放,包括:第一电子设备将第一内容发送给第一音箱播放,第一音箱为与第一电子设备距离最近的音箱;或者,第一电子设备将第一内容发送给第一音箱和第二音箱播放。
例如,第一电子设备可比较自身与第一音箱和第二音箱之间的距离。如果第一音箱与第一电子设备之间的距离小于预设值,而第二音箱与第一电子设备之间的距离大于预设值,说明第一音箱距离第一电子设备较近而第二音箱距离第一电子设备较远,则第一电子设备可将第一内容发送给第一音箱播放,完成本次内容投射。
又例如,如果第一音箱与第一电子设备之间的距离小于预设值,且第二音箱与第一电子设备之间的距离也小于预设值,说明第一音箱和第二音箱距离第一电子设备都很近,则第一电子设备可将第一内容发送给第一音箱和第二音箱这两个设备,从而将第一内容投射至第一音箱和第二音箱上播放。当然,第一电子设备也可以根据存储的投射策略确定将第一内容发送给具体哪个或哪些设备上播放,本申请实施例对此不做任何限制。
在一种可能的实现方式中,第一电子设备将第一内容发送给第一音箱和第二音箱播放,包括:第一电子设备将第一内容中的第一音频分量发送给第一音箱播放;并且, 第一电子设备将第一内容中的第二音频分量发送给第二音箱播放。当然,如果上述N个第二电子设备中还包括第三音箱时,手机可将第一内容中的第三音频分量发送给第三音箱播放。也就是说,第一电子设备可向每个音箱发送本次投射内容中对应的音频分量,使得多个音箱分别播放接收到的音频分量,实现立体声或环绕声的播放效果。
在一种可能的实现方式中,上述N个第二电子设备可包括音箱(音箱可以有一个或多个)和电视(电视可以有一个或多个);其中,第一电子设备按照预设的投射策略,将第一内容发送给N个第二电子设备中的至少一个第二电子设备播放,包括:第一电子设备可将第一内容中的显示内容(例如图像或视频)发送给电视播放;并且,第一电子设备将第一内容中的音频内容发送给音箱播放;或者,第一电子设备可将第一内容中的显示内容发送给电视播放;并且,第一电子设备将第一内容中的音频内容发送给电视和音箱播放。
在一种可能的实现方式中,在第一电子设备从NFC标签中获取与NFC标签绑定的N个第二电子设备之后,还包括:第一电子设备在上述N个第二电子设备中确定主设备;其中,第一电子设备按照预设的投射策略将第一内容投射至N个第二电子设备中的至少一个第二电子设备上继续播放,包括:第一电子设备将第一内容发送给主设备,以使得主设备按照预设的投射策略控制上述N个第二电子设备中的至少一个第二电子设备播放第一内容。也就是说,第一电子设备可在上述N个第二电子设备中确定一个主设备,由主设备控制上述N个第二电子设备实现本次内容投射。
示例性的,上述N个第二电子设备中可包括电视和灯;其中,第一电子设备在N个第二电子设备中确定主设备,包括:第一电子设备将电视确定为上述N个第二电子设备中的主设备;此时,预设的投射控策略可以包括:由电视播放第一内容中的显示内容和音频内容,由电视播根据第一内容向灯发送控制指令,以制灯发光的亮度或颜色,实现不同的灯管效果。
在一种可能的实现方式中,在第一电子设备在N个第二电子设备中确定主设备之后,还包括:第一电子设备将存储的投射策略发送给该主设备。当然,主设备也可以从其他电子设备或服务器中获取上述投射策略。
在一种可能的实现方式中,在第一电子设备按照预设的投射策略将第一内容投射至N个第二电子设备中的至少一个第二电子设备上继续播放之前,还包括:第一电子设备与这N个第二电子设备进行时间同步;其中,第一电子设备发送的第一内容中携带有时间戳,该时间戳用于指示第一内容的播放进度。由于第一电子设备与上述N个第二电子设备进行时间同步后各个设备的时间是同步的,因此,当第二电子设备按照第一内容中的时间戳播放投射内容时,可以保证各个第二电子设备之间的播放进度相同。
在一种可能的实现方式中,在第一电子设备从NFC标签中获取与NFC标签绑定的N个第二电子设备之后,还包括:第一电子设备接收用户对上述N个第二电子设备输入的投射策略。也就是说,用户可以在进行内容投射的过程中手动为本次参与内容投射的多个设备设置相应的投射策略。
第二方面,本申请提供一种跨设备的内容投射方法,包括:第一电子设备显示NFC标签的绑定界面,该绑定界面中包括等待与该NFC标签绑定的候选设备列表,候选设 备列表中的候选设备与第一电子设备位于同一通信网络内;如果第一电子设备检测到用户在上述候选设备列表中选择M(M为大于0的整数)个第二电子设备的第一操作,则响应于第一操作,第一电子设备可提示用户将第一电子设备靠近或接触上述NFC标签,使得第一电子设备可向该NFC标签中写入上述M个第二电子设备的标识,以建立该NFC标签与这M个第二电子设备之间的绑定关系。
这样,当第一电子设备后续需要进行内容投射时,可通过读取NFC标签中绑定设备的标识,确定出与NFC标签绑定的一个或多个第二电子设备,即进行内容投射的目标设备。
在一种可能的实现方式中,第一电子设备显示NFC标签的绑定界面,包括:第一电子设备读取NFC标签中预设的标志位;若该标志位中的取值为第一预设值,说明NFC标签还未与任何电子设备绑定,则第一电子设备可打开预设的投射应用显示该NFC标签的绑定界面。
在一种可能的实现方式中,在第一电子设备向NFC标签中写入上述M个第二电子设备的标识之后,还包括:第一电子设备将上述标志位的取值从第一预设值修改为第二预设值,从而指示该NFC标签已经与一个或多个电子设备完成绑定。
在一种可能的实现方式中,在第一电子设备向NFC标签中写入M个第二电子设备的标识之后,还包括:第一电子设备显示投射策略的设置界面;第一电子设备接收用户在该设置界面中对上述M个第二电子设备输入的投射策略,并保存该投射策略。也就是说,第一电子设备在NFC标签中建立相应的绑定关系后,用户可继续在投射应用中设置与该NFC标签绑定的M个第二电子设备在进行内容投射时的投射策略。
例如,当M=1时,上述投射策略可以包括不同NFC操作与投射指令之间的对应关系。例如,触碰一次NFC标签与投射指令1之间的对应关系;触碰二次NFC标签与投射指令2之间的对应关系。
又例如,当M>1时,上述投射策略可以包括为每一个第二电子设备设置的内容投射规则。例如,上述M个第二电子设备包括电视、音箱和灯,则用户可在设置界面中分别设置向电视、音箱和灯投射时的具体投射规则。
示例性的,当上述M个第二电子设备包括第一音箱和第二音箱时,上述投射策略可以为:使用距离源设备最近的音箱播放投射内容,或者,该投射策略为:使用第一音箱播放投射内容中的第一音频分量并使用第二音箱播放投射内容中的第二音频分量;
又例如,当上述M个第二电子设备包括电视和音箱时,上述投射策略可以为:使用电视播放投射内容中的显示内容,并使用音箱播放投射内容中的音频内容;或者,使用电视播放投射内容中的显示内容,并使用音箱和电视播放投射内容中的音频内容;
又例如,当上述M个第二电子设备包括电视和灯时,上述投射策略为:使用电视播放投射内容,并由电视控制灯的灯光效果。
在一种可能的实现方式中,在第一电子设备向NFC标签中写入上述M个第二电子设备的标识之后,还包括:第一电子设备将NFC标签与上述M个第二电子设备之间的绑定关系发送给其他电子设备或服务器。这样,第一电子设备可将上述绑定关系分享给其他电子设备使用,或者,用户使用其他电子设备登录服务器时也可获取到上述绑定关系。
示例性的,上述候选设备列表中的候选设备与第一电子设备可位于同一Wi-Fi网络内,或者,上述候选设备列表中的候选设备与第一电子设备可绑定在同一账号下。
示例性的,第一电子设备向NFC标签中写入第二电子设备的标识,包括:响应于第一电子设备靠近或接触该NFC标签的碰一碰操作,第一电子设备向NFC标签中写入第二电子设备的标识;或者,第一电子设备使用其NFC芯片检测到来自NFC标签的NFC信号后,可向该NFC标签中写入第二电子设备的标识。也就是说,用户可以通过靠近或触碰NFC标签的方式,触发第一电子设备向NFC标签中写入第二电子设备的标识。
类似的,第一电子设备读取NFC标签中预设的标志位,包括:响应于第一电子设备靠近或接触NFC标签的碰一碰操作,第一电子设备可读取NFC标签中预设的标志位;或者,第一电子设备使用其NFC芯片检测到来自NFC标签的NFC信号后,可读取到NFC标签中预设的标志位。也就是说,用户可以通过靠近或触碰NFC标签的方式,触发第一电子设备读取NFC标签中预设的标志位。
第三方面,本申请提供一种内容投射系统,包括第一电子设备、N个第二电子设备以及NFC标签,N为大于1的整数;该NFC标签中存储有该NFC标签与上述N个第二电子设备之间的绑定关系;其中,第一电子设备用于执行上述任一项所述的跨设备的内容投射方法。
在一种可能的实现方式中,上述N个第二电子设备中包括主设备;其中,主设备用于:接收第一电子设备发送的第一内容;按照预设的投射策略控制这N个第二电子设备中的至少一个第二电子设备播放第一内容。或者,可由第一电子设备作为主设备,按照预设的投射策略控制这N个第二电子设备中的至少一个第二电子设备播放第一内容。
第四方面,本申请提供一种电子设备,包括:触摸屏、通信接口、一个或多个处理器、存储器、以及一个或多个计算机程序;其中,处理器与触摸屏、通信接口以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,该处理器执行该存储器存储的一个或多个计算机程序,以使电子设备执行上述任一项所述的跨设备的内容投射方法。
第五方面,本申请提供一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述任一项所述的跨设备的内容投射方法。
第六方面,本申请提供一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行上述任一项所述的跨设备的内容投射方法。
可以理解地,上述提供的第三方面所述的内容投射系统、第四方面所述的电子设备、第五方面所述的计算机可读存储介质,以及第六方面所述的计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种内容投射系统的架构示意图一;
图2为本申请实施例提供的一种内容投射系统的架构示意图二;
图3为本申请实施例提供的一种内容投射系统的架构示意图三;
图4为本申请实施例提供的一种内容投射系统的架构示意图四;
图5为本申请实施例提供的一种电子设备的结构示意图一;
图6为本申请实施例提供的一种电子设备中操作系统的架构示意图;
图7为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图一;
图8为本申请实施例提供的一种跨设备的内容投射方法的流程示意图一;
图9为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图二;
图10为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图三;
图11为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图四;
图12为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图五;
图13为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图六;
图14为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图七;
图15为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图八;
图16为本申请实施例提供的一种跨设备的内容投射方法的流程示意图二;
图17为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图九;
图18为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图十;
图19为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图十一;
图20为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图十二;
图21为本申请实施例提供的一种跨设备的内容投射方法的应用场景示意图十三;
图22为本申请实施例提供的一种电子设备的结构示意图二。
具体实施方式
下面将结合附图对本实施例的实施方式进行详细描述。
本申请实施例提供的一种跨设备的内容投射方法,可应用于图1所示的通信系统(也可称为内容投射系统)100中。如图1所示,该通信系统100中可包括N(N为大于1的整数)个电子设备。这N个电子设备之间可通过通信网络互联。
示例性的,上述通信网络可以是有线网络,也可以是无线网络。例如,上述通信网络可以是局域网(local area networks,LAN),也可以是广域网(wide area networks,WAN),例如互联网。上述通信网络可使用任何已知的网络通信协议来实现,上述网络通信协议可以是各种有线或无线通信协议,诸如以太网、通用串行总线(universal serial bus,USB)、火线(FIREWIRE)、全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE)、蓝牙、无线保真(wireless fidelity,Wi-Fi)、NFC、基于互联网协议的语音通话(voice over Internet protocol,VoIP)、支持网络切片架构的通信协议或任何其他合适的通信协议。
示例性地,在一些实施例中,通信系统100中的各个电子设备之间可通过Wi-Fi协议建立Wi-Fi连接。在另一些实施例中,通信系统100中的各个电子设备登陆同一账号(例如华为账号)后可通过一个或多个服务器互联。
示例性的,上述通信系统100中可以包括第一电子设备101和第二电子设备102。 例如,如图2中的(a)所示,第一电子设备101可作为源设备,第二电子设备102可以作为第一电子设备101的目标设备。电子设备101可将其显示或播放的内容投射至第二电子设备102中。后续实施例中可将一个电子设备投射至另一个电子设备上的具体内容称为投射内容,例如,该投射内容可以包括文本、图片、视频、音频、动画、灯效或网页等。示例性的,电子设备可以将文本、图片、视频、音频、动画或网页等投射内容发送给另一个电子设备进行显示或播放;或者,电子设备也可将灯光的控制指令作为投射内容发送给另一个电子设备,从而控制灯光产生相应的灯效。
在一些实施例中,第一电子设备101的目标设备可以有多个。例如,上述通信系统100中除了第一电子设备101和第二电子设备102外还可以包括第三电子设备103。如图2中的(b)所示,当第一电子设备101为源设备时,第二电子设备102和第三电子设备103均可作为第一电子设备101的目标设备接收第一电子设备101发来的投射内容。这样,第一电子设备101可将投射内容同时投射至多个电子设备上显示或播放。例如,手机可将其音频文件同时投射至多个音箱中播放。又例如,手机可将显示的视频画面投射至电视中显示,同时将与视频画面对应的音频内容投射至音箱中播放。
也就是说,通信系统100中的源设备可将投射内容投射至一个或多个目标设备中,实现多设备之间进行内容投射时的跨设备交互。
在本申请实施例中,还可以在上述通信系统100中设置与一个或多个电子设备绑定的电子标签,也可称为射频标签或RFID(radio frequency identification,无线电射频识别)标签。电子设备通过发送射频信号可读取电子标签中存储的信息。
为了方便本领域技术人员理解,本申请实施例这里对上述电子标签的工作原理进行介绍。
示例性的,上述电子标签可以包括三种实现形式,即:被动式标签、半主动式标签和主动式标签。本申请实施例中,上述电子标签可以是被动式标签、半主动式标签或者主动式标签中的任一种。
(1)被动式标签:当电子标签为被动式标签时,电子标签中没有内部供电电源。电子标签与其他设备的NFC(near field communication,近场通信)芯片靠近时,可以接收到其他设备的NFC芯片发送的电磁波信息。此时,电子标签的内部集成电路(integrated circuit,IC)通过接收到的电磁波信号进行驱动。当电子标签接收到足够强度的电磁波信号时,可以向其他设备的NFC芯片发送电子标签中保存的数据,如上述笔记本电脑的设备信息。
(2)半主动式标签:半主动式标签的工作方式与被动式标签的工作方式类似。当电子标签为半主动式标签时,电子标签中包括一个小型电池,该小型电池的电力足以驱动电子标签的内部IC,使得IC处于工作的状态。由于半主动式标签中包括上述小型电池;因此相比于被动式标签,半主动式标签的反应速度更快。
(3)主动式标签:当电子标签为主动式标签时,电子标签中包括内部供电电源,用以供应内部IC所需电源以产生对外的讯号。一般来说,主动式标签允许在较长的距离进行射频通信,并且主动式标签拥有较大的存储空间,可以用来储存其他设备的NFC芯片传输过来的数据。
如图3所示,上述电子标签具体可以为使用NFC技术实现的NFC标签301(NFC 标签也可称为NFC贴片)。当电子设备(例如手机)中的NFC芯片与NFC标签301接触或靠近时,手机中的NFC芯片可检测到NFC标签301发出的NFC信号,进而通过该NFC信号读取到NFC标签301中存储的信息。也就是说,手机可响应与NFC标签301靠近或接触的碰一碰操作,从NFC标签301中获取NFC标签301中存储的信息。
示例性的,NFC标签301中一般设置有线圈,通过该线圈可存储NFC标签301与一个或多个电子设备之间的绑定关系。一个电子设备可以与一个或多个NFC标签301绑定。例如,每个NFC标签301均唯一对应一个NFC卡号,那么,可预先在NFC标签301的线圈中写入其NFC卡号和电子设备A的标识,从而在NFC标签301中建立该NFC标签301与电子设备A之间的绑定关系。
可以理解的是,NFC标签301中存储的绑定关系可以是NFC标签301出厂时预先设置的,也可以是用户在使用(例如首次使用)NFC标签301时手动设置的,本申请实施例对此不做任何限制。
以NFC标签301与通信系统100中的电视(也可称为智能电视)绑定举例,仍如图3所示,当用户需要将源设备(例如手机)中显示或播放的内容作为投射内容投射至智能电视(即目标设备)上时,可打开手机的NFC功能靠近或触碰NFC标签301。当手机与NFC标签301之间的距离足够近时,手机通过发射近场信号可从NFC标签301中读取到NFC标签301与智能电视之间的绑定关系。例如,手机可从NFC标签301中读取到智能电视的标识。该标识可以为智能电视的MAC(media access control,媒体访问控制)地址、设备名称或IP地址等。
也就是说,手机通过读取上述绑定关系可确定出本次进行内容投射的目标设备为智能电视。那么,手机作为源设备,可根据读取到的智能电视的标识开始向智能电视发送本次投射内容,使得智能电视可作为目标设备显示或播放该投射内容,完成本次内容投射过程。
其中,上述电视(或智能电视)可以是使用模拟信号工作的模拟电视机,也可以是使用数字信号工作的数字电视机,还可以是能够播放图像、音频或视频的任意显示输出设备。在一些场景中,也可以将上述电视(或智能电视)称为智慧屏或大屏设备。
在一些实施例中,NFC标签301可以记录该NFC标签301与多个电子设备之间的绑定关系。例如,NFC标签301可与智能电视和音箱(也可称为智能音箱)均绑定。那么,如图4所示,当用户打开手机的NFC功能靠近或触碰NFC标签301时,可读取到智能电视和智能音箱的标识,说明用户本次希望将手机中的投射内容投射至智能电视和智能音箱中。进而,手机可按照预设的策略将投射内容中的显示内容投射至智能电视中显示,同时将投射内容中的音频内容投射至智能音箱中播放,完成本次内容投射过程。
可以看出,用户通过使用源设备与NFC标签“碰一碰”的方式,使源设备可方便、快速的确定出本次内容投射的目标设备,从而自动开始向目标设备投射本次投射内容,简化了跨设备进行内容投射时用户的操作流程,提高和丰富了用户的使用体验,同时提高了内容投射时多设备之间协同的工作效率。
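示例性的，为便于理解上述“碰一碰”读取绑定关系的过程，下面给出一段示意性的Java代码草图，示出源设备如何从NFC标签返回的NDEF消息中解析出绑定设备的标识列表。其中自定义的MIME类型"application/vnd.projection.binding"以及“以分号分隔的设备标识字符串”这一payload编码方式均为本示例的假设，并非对实际实现方式的限定。

    import android.nfc.NdefMessage;
    import android.nfc.NdefRecord;
    import android.nfc.Tag;
    import android.nfc.tech.Ndef;
    import java.nio.charset.StandardCharsets;
    import java.util.ArrayList;
    import java.util.List;

    public class TagBindingReader {
        // 假设的自定义MIME类型，用于标识存放绑定关系的NDEF记录
        private static final String MIME_BINDING = "application/vnd.projection.binding";

        /** 从NFC标签中读取与其绑定的设备标识（例如MAC地址）列表，读取失败时返回空列表 */
        public static List<String> readBoundDevices(Tag tag) {
            List<String> deviceIds = new ArrayList<>();
            Ndef ndef = Ndef.get(tag);
            if (ndef == null) {
                return deviceIds; // 标签不支持NDEF格式
            }
            try {
                ndef.connect();
                NdefMessage message = ndef.getNdefMessage();
                if (message == null) {
                    return deviceIds;
                }
                for (NdefRecord record : message.getRecords()) {
                    if (MIME_BINDING.equals(record.toMimeType())) {
                        // 假设payload为"标识1;标识2;..."形式的UTF-8字符串
                        String payload = new String(record.getPayload(), StandardCharsets.UTF_8);
                        for (String id : payload.split(";")) {
                            if (!id.isEmpty()) {
                                deviceIds.add(id);
                            }
                        }
                    }
                }
            } catch (Exception e) {
                // 读取异常时返回已解析到的内容
            } finally {
                try { ndef.close(); } catch (Exception ignored) { }
            }
            return deviceIds;
        }
    }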
示例性的，上述通信系统100中的电子设备具体可以为手机、平板电脑、电视、笔记本电脑、智能家居设备、可穿戴电子设备、车载设备、虚拟现实设备等，本申请实施例对此不做任何限制。其中，智能家居设备具体可以为：电视、音箱、空调（也可称为智能空调）、冰箱（也可称为智能冰箱）、电灯（也可称为智能灯或智能灯泡）或窗帘（也可称为智能窗帘）等。
以手机作为上述电子设备举例,图5示出了手机的结构示意图。
手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180等。
可以理解的是,本发明实施例示意的结构并不构成对手机的具体限定。在本申请另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在手机上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接 收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏（liquid crystal display，LCD），有机发光二极管（organic light-emitting diode，OLED），有源矩阵有机发光二极体或主动矩阵有机发光二极体（active-matrix organic light emitting diode，AMOLED），柔性发光二极管（flex light-emitting diode，FLED），Miniled，MicroLed，Micro-oLed，量子点发光二极管（quantum dot light emitting diodes，QLED）等。在一些实施例中，手机可以包括1个或N个显示屏194，N为大于1的正整数。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行 算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机可以支持一种或多种视频编解码器。这样,手机可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。手机可以设置至少一个麦克风170C。在另一些实施例中,手机可 以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,手机还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
传感器模块180中可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
当然,手机还可以包括充电管理模块、电源管理模块、电池、按键、指示器以及1个或多个SIM卡接口等,本申请实施例对此不做任何限制。
上述手机的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明手机的软件结构。
仍以手机为上述电子设备举例,图6示出了本申请实施例的手机的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序。
如图6所示,应用程序层中可以安装通话,备忘录,浏览器,联系人,相机,图库,日历,地图,蓝牙,音乐,视频,短信息等APP(应用,application)。
在本申请实施例中,仍如图6所示,应用程序层中还可以安装投射应用。用户可以从桌面、设置功能或下拉菜单等入口打开投射应用。
上述投射应用可作为内容投射时手机(即源设备)与其他电子设备(即目标设备)之间的桥梁,将手机中需要投射的应用内的投射内容发送给目标设备。例如,投射应用可接收应用程序框架层上报的投屏事件,进而,投射应用可与正在运行的应用(例如视频APP)交互,将该应用中正在显示或播放的内容作为投射内容通过Wi-Fi等无线通信方式发送给目标设备。
另外,用户还可以使用上述投射应用设置NFC标签与一个或多个电子设备之间的绑定关系。例如,可在投射应用中设置一个用于绑定NFC标签的选项。手机检测到用户打开该选项后,投射应用可显示待绑定的电子设备的列表。用户在该列表中选中需要绑定的一个或多个电子设备后,可将手机靠近需要绑定的NFC标签。这样,手机通过NFC信号可将用户在投射应用中选中的电子设备的标识写入NFC标签中,从而在NFC标签内建立该NFC标签与一个或多个电子设备之间的绑定关系。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
在本申请实施例中,如图6所示,应用程序框架层中可以运行NFC服务(NFC service)。
示例性的,手机开启NFC功能后可在应用程序框架层中开始运行NFC服务。当手机靠近或触碰NFC标签时,NFC服务可调用内核层的NFC驱动读取NFC标签中存储的绑定关系,从而获取到进行本次内容投射的目标设备。进而,NFC服务可向上述投射应用上报投射事件,从而触发投射应用将手机正在显示或播放的内容作为投射内容发送给目标设备,开始本次内容投射过程。
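示例性的，下面给出一段示意性的Java代码草图，用于说明应用侧如何注册前台NFC读卡回调，并在检测到NFC标签后向投射应用上报投射事件，与上文NFC服务读取标签、投射应用开始投射的流程相对应。其中广播action "com.example.ACTION_PROJECTION_EVENT"、extra名称以及TagBindingReader（见前文示例）均为本示例的假设，并非系统或投射应用中已有的接口。

    import android.app.Activity;
    import android.content.Intent;
    import android.nfc.NfcAdapter;
    import android.nfc.Tag;
    import java.util.List;

    public class TapToProjectActivity extends Activity implements NfcAdapter.ReaderCallback {
        // 假设的广播action与extra名称，仅用于示意"上报投射事件"这一步骤
        private static final String ACTION_PROJECTION_EVENT = "com.example.ACTION_PROJECTION_EVENT";
        private static final String EXTRA_TARGET_DEVICES = "target_devices";

        private NfcAdapter nfcAdapter;

        @Override
        protected void onResume() {
            super.onResume();
            nfcAdapter = NfcAdapter.getDefaultAdapter(this);
            if (nfcAdapter != null) {
                // 前台读卡模式：手机靠近或接触NFC标签时回调onTagDiscovered
                nfcAdapter.enableReaderMode(this, this, NfcAdapter.FLAG_READER_NFC_A, null);
            }
        }

        @Override
        protected void onPause() {
            super.onPause();
            if (nfcAdapter != null) {
                nfcAdapter.disableReaderMode(this);
            }
        }

        @Override
        public void onTagDiscovered(Tag tag) {
            // 解析标签中存储的绑定设备标识（参见前文TagBindingReader的示例）
            List<String> targets = TagBindingReader.readBoundDevices(tag);
            if (!targets.isEmpty()) {
                // 向投射应用上报投射事件，由投射应用将正在显示或播放的内容发送给目标设备
                Intent intent = new Intent(ACTION_PROJECTION_EVENT);
                intent.putExtra(EXTRA_TARGET_DEVICES, targets.toArray(new String[0]));
                sendBroadcast(intent);
            }
        }
    }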
当然,如图6所示,应用程序框架层还可以包括Wi-Fi服务(Wi-Fi service)、窗口管理器,内容提供器,视图系统,电话管理器,资源管理器等,本申请实施例对此不做任何限制。
其中,Wi-Fi服务可用于提供加入Wi-Fi网络或与其他电子设备建立Wi-Fi P2P连接等Wi-Fi相关的功能。上述窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。上述内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。上述视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。上述电话管理器用于提供手机的通信功能。例如通话状态的管理(包括接通,挂断等)。上述资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
如图6所示,系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。2D图形引擎是2D绘图的绘图引擎。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动等,本申请实施例对此不做任何限制。
以下将结合附图详细阐述本申请实施例提供的一种跨设备的内容投射方法。
示例性的,如图7所示,每个NFC标签701在出厂时均可在NFC标签701内存储自身的NFC卡号。并且,如图7所示,每个NFC标签701中可预先设置一个标志位,该标志位可用于指示NFC标签701是否与电子设备建立了绑定关系。例如,当 NFC标签701中的标志位为00时,说明该NFC标签701还未与电子设备绑定;当NFC标签701中的标志位为01时,说明该NFC标签701已经与一个或多个电子设备绑定。
用户在首次使用NFC标签701时,可使用预设的投射应用在NFC标签701内建立该NFC标签701与一个或多个电子设备之间的绑定关系。
以手机中安装有上述投射应用举例,如图8所示,使用投射应用在NFC标签701内建立上述绑定关系的方法可包括以下步骤:
S801、手机显示投射应用的NFC标签绑定界面,该界面中包括待绑定的设备列表。
示例性的,用户在首次使用NFC标签701时,可打开手机的NFC功能靠近或触碰NFC标签701。此时,手机与NFC标签701可通过NFC信号进行交互,使得手机读取到NFC标签701中的NFC卡号以及预设的标志位。如果该标志位为00,说明NFC标签701还未与电子设备绑定。进而,如图9所示,手机可提示用户建立NFC标签701与一个或多个电子设备之间的绑定关系。
如果检测到用户点击图9所示的确认按钮901,如图10所示,则手机可打开投射应用并自动跳转至NFC标签701的绑定界面1001。在绑定界面1001中,手机可显示一个或多个电子设备组成的设备列表1002。设备列表1002中的电子设备均为可以与NFC标签701绑定的设备。例如,设备列表1002中的电子设备可以为与手机登录同一账号(例如华为账号)的一个或多个设备。又例如,设备列表1002中的电子设备可以为与手机接入同一Wi-Fi网络的一个或多个设备。用户在设备列表1002中可选择需要与NFC标签701绑定的电子设备。
在本申请实施例中,NFC标签701可与一个或多个电子设备绑定。也就是说,用户可以在上述设备列表1002中选择一个或多个电子设备作为NFC标签701的绑定设备。
或者,如图11所示,投射应用中可预先设置单个电子设备的绑定选项1101和多个电子设备的绑定选项1102。如果用户选中绑定选项1101,则手机可在对应的绑定界面中提示用户从设备列表中选择一个电子设备与NFC标签701绑定。如果用户选中绑定选项1102,仍如图11所示,手机可在对应的绑定界面中显示预先设置的一个或多个设备组1103,每个设备组中包括多个电子设备。例如,智能电视和智能音箱1为一个设备组,智能音箱1和智能音箱2为一个设备组,智能电视和智能灯泡为一个设备组。这样,用户通过在绑定界面中选择设备组,可触发手机将NFC标签701与设备组中的多个电子设备绑定。
S802、手机接收用户在上述设备列表中选择绑定设备的第一操作。
在步骤S802中,手机显示出上述投射应用的绑定界面后,用户可在绑定界面列出的设备列表或设备组中选择与NFC标签701绑定的一个或多个电子设备。用户选择的一个或多个电子设备可称为NFC标签701的绑定设备。手机检测到用户在绑定界面上选择了绑定设备后,可继续执行下述步骤S803-S804。
S803、响应于第一操作,手机提示用户使用手机靠近待绑定的NFC标签701。
以绑定设备为智能电视和智能灯泡举例,手机检测到用户在上述绑定界面选择了智能电视和智能灯泡后,可确定NFC标签701与智能电视和智能灯泡之间的绑定关系。此时,手机需要将该绑定关系写入NFC标签701内。由于手机与NFC标签701之间 需要通过短距离的NFC信号进行通信,因此,如图12所示,如果手机没有检测到NFC标签701发出的NFC信号,则手机可在投射应用中显示提示1201,提示1201用于指导用户将手机靠近或接触等待与智能电视和智能灯泡绑定的NFC标签701。
S804、手机将上述绑定设备的标识写入NFC标签701中,以建立NFC标签701与绑定设备之间的绑定关系。
示例性的,用户可按照图12所示的提示将手机靠近或接触NFC标签701。当手机与NFC标签701之间的距离足够近时,手机可检测到NFC标签701发出的NFC信号。进而,如图13所示,手机可将用户在绑定界面中设置的绑定设备的标识写入NFC标签701中。例如,手机可将绑定设备的MAC地址、设备名称或IP地址等写入NFC标签701中。这样,在NFC标签701中便建立了NFC标签701与绑定设备之间的绑定关系。后续,手机等进行内容投射的源设备通过读取NFC标签701中绑定设备的标识,可确定出与NFC标签701绑定的一个或多个电子设备,即进行内容投射的目标设备。
另外,手机将上述绑定设备的标识写入NFC标签701后,NFC标签701可将预设的标志位从00修改为01,以指示当前NFC标签701已经与一个或多个电子设备绑定。
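示例性的，下面给出一段示意性的Java代码草图，说明手机如何把用户选中的绑定设备的标识写入NFC标签，并将预设的标志位从00改写为01。其中两条NDEF记录的MIME类型、以分号拼接设备标识的编码方式均为本示例的假设。

    import android.nfc.NdefMessage;
    import android.nfc.NdefRecord;
    import android.nfc.Tag;
    import android.nfc.tech.Ndef;
    import android.text.TextUtils;
    import java.nio.charset.StandardCharsets;
    import java.util.List;

    public class TagBindingWriter {
        // 假设的MIME类型：一条记录存放标志位，一条记录存放绑定设备标识列表
        private static final String MIME_FLAG = "application/vnd.projection.flag";
        private static final String MIME_BINDING = "application/vnd.projection.binding";

        /** 将绑定设备标识写入NFC标签并把标志位置为01，返回是否写入成功 */
        public static boolean writeBinding(Tag tag, List<String> deviceIds) {
            Ndef ndef = Ndef.get(tag);
            if (ndef == null) {
                return false; // 标签不支持NDEF格式
            }
            try {
                ndef.connect();
                if (!ndef.isWritable()) {
                    return false;
                }
                NdefRecord flagRecord = NdefRecord.createMime(
                        MIME_FLAG, "01".getBytes(StandardCharsets.UTF_8));
                NdefRecord bindingRecord = NdefRecord.createMime(
                        MIME_BINDING, TextUtils.join(";", deviceIds).getBytes(StandardCharsets.UTF_8));
                ndef.writeNdefMessage(new NdefMessage(flagRecord, bindingRecord));
                return true;
            } catch (Exception e) {
                return false;
            } finally {
                try { ndef.close(); } catch (Exception ignored) { }
            }
        }
    }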
在一些实施例中,手机将绑定设备的标识写入NFC标签701后,用户可继续在投射应用中设置与NFC标签701绑定的绑定设备在进行内容投射时的投射策略。
以NFC标签701的绑定设备为智能电视举例,手机将智能电视的标识写入NFC标签701后,可在投射应用中提示用户设置向智能电视进行内容投射时的投射策略。如图14所示,手机可在设置界面1301中提供不同NFC操作所对应的不同投射指令供用户选择。例如,用户可设置触碰NFC标签701一次时,对应的投射指令为开始投射。例如,用户可设置连续触碰NFC标签701两次时,对应的投射指令为播放下一集(或下一首)。又例如,用户可设置触碰NFC标签701超过预设时间时,对应的投射指令为退出本次内容投射。
那么,手机接收到用户在设置界面1301中设置的投射策略后,可建立NFC标签701、智能电视以及上述投射策略之间的关联关系。后续,手机可通过靠近或触碰NFC标签701触发手机按照用户设置的投射策略向智能电视进行内容投射,从而简化跨设备进行内容投射时的操作流程。
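示例性的，为说明“不同NFC操作对应不同投射指令”这类投射策略的一种可能的数据组织方式，下面给出一段示意性的Java代码草图。其中的操作枚举、指令枚举以及默认映射关系均为本示例的假设，实际可由用户在设置界面中自行配置。

    import java.util.EnumMap;
    import java.util.Map;

    public class NfcOperationPolicy {
        /** 用户对NFC标签可能输入的操作方式（取值为示例假设） */
        public enum NfcOperation { SINGLE_TAP, DOUBLE_TAP, LONG_TOUCH }

        /** 对应的投射指令（取值为示例假设） */
        public enum ProjectionCommand { START_PROJECTION, PLAY_NEXT, EXIT_PROJECTION }

        private final Map<NfcOperation, ProjectionCommand> policy =
                new EnumMap<>(NfcOperation.class);

        public NfcOperationPolicy() {
            // 一种可能的默认策略：触碰一次开始投射，连续触碰两次播放下一集，长时间触碰退出投射
            policy.put(NfcOperation.SINGLE_TAP, ProjectionCommand.START_PROJECTION);
            policy.put(NfcOperation.DOUBLE_TAP, ProjectionCommand.PLAY_NEXT);
            policy.put(NfcOperation.LONG_TOUCH, ProjectionCommand.EXIT_PROJECTION);
        }

        /** 用户在设置界面修改策略时更新映射关系 */
        public void set(NfcOperation operation, ProjectionCommand command) {
            policy.put(operation, command);
        }

        /** 源设备检测到某种NFC操作后，查询应向目标设备发送的投射指令 */
        public ProjectionCommand resolve(NfcOperation operation) {
            return policy.get(operation);
        }
    }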
以NFC标签701的绑定设备为智能电视、智能音箱和智能灯泡举例,手机将智能电视、智能音箱和智能灯泡的标识写入NFC标签701后,也可在投射应用中提示用户设置向这三个绑定设备进行内容投射时的投射策略。示例性的,如图15所示,用户可在设置界面1401中设置在向智能电视、智能音箱和智能灯泡进行内容投射时,将源设备的显示内容投射至智能电视中显示,将源设备的音频内容投射至智能音箱中播放,并且,智能灯泡可根据显示内容或音频内容进行灯效变化。当然,用户还可以进一步设置在智能电视中投射显示内容时的具体投射策略、在音箱中投射音频内容时的具体投射策略等,本申请实施例对此不做任何限制。
类似的,手机接收到用户在设置界面1401中设置的投射策略后,可建立NFC标签701、绑定设备(即智能电视、智能音箱和智能灯泡)以及上述投射策略之间的关联关系。后续,手机可通过靠近或触碰NFC标签701触发手机按照用户设置的投射策 略向上述三个绑定设备进行内容投射,从而简化跨设备进行内容投射时的操作流程。
需要说明的是,NFC标签701的绑定设备在进行内容投射时的投射策略可以是用户使用投射应用手动设置的,也可以是手机根据绑定设备的类型、位置、设备能力等信息预先设置的。例如,当NFC标签701的绑定设备为智能音箱1和智能音箱2时,手机可默认投射策略为使用距离用户最近的智能音箱进行内容投射。
在另一些实施例中,上述投射策略也可以是源设备在向NFC标签701的绑定设备进行内容投射的过程中动态设置的。例如,当手机向NFC标签701的绑定设备(例如智能电视和智能音箱)进行内容投射时,可动态的获取智能电视和智能音箱的音频播放能力。进而,手机可根据智能电视和智能音箱的音频播放能力确定在智能电视和/或智能音箱上投射音频内容。本申请实施例对投射策略的具体内容和投射策略的具体设置方式不做任何限制。
示例性的,用户可按照上述方法为不同的NFC标签分别设置对应的一个或多个绑定设备。当用户需要在某一个或某一组绑定设备上进行内容投射时,可开启源设备的NFC功能靠近或触碰对应的NFC标签,从而将NFC标签中已绑定的一个或多个绑定设备作为本次进行内容投射时的目标设备开始内容投射过程。
以下将以手机为源设备举例阐述手机通过触碰NFC标签701向目标设备进行内容投射的方法,如图16所示,该方法可包括以下步骤:
S1501、响应于手机与NFC标签701的碰一碰操作,手机获取与NFC标签701绑定的一个或多个绑定设备。
示例性的,通过上述步骤S801-S804,手机已经为NFC标签701设置了对应的绑定设备。那么,当用户希望将手机(即源设备)中的内容(例如显示内容、音频内容)投射至NFC标签701的绑定设备时,如图17所示,用户可开启手机的NFC功能触碰(或靠近)NFC标签701,即执行手机与NFC标签701的碰一碰操作。
响应于手机与NFC标签701的碰一碰操作,手机可从NFC标签701中读取到已经与NFC标签701绑定的一个或多个绑定设备的标识,该绑定设备可作为手机的目标设备参与本次内容投射。也就是说,用户使用源设备触碰NFC标签的碰一碰操作,可触发源设备获取到参与本次内容投射的目标设备,从而自动与目标设备完成后续的内容投射过程,简化了内容投射时的操作流程,提高了多设备协同工作时的效率。
当然,如果NFC标签701中没有存储绑定设备的标识,则手机可通过执行上述步骤S801-S804建立NFC标签701与相应绑定设备之间的对应关系。
S1502、当NFC标签701的绑定设备为一个电子设备时,手机向该绑定设备发送投射内容开始本次内容投射。
当手机读取到NFC标签701中只有一个绑定设备的标识时,说明与NFC标签701绑定的绑定设备只有一个,那么,本次进行内容投射的目标设备即为该绑定设备。
以绑定设备为智能电视举例,手机读取到NFC标签701中智能电视的标识后,如图18所示,手机可将智能电视作为本次内容投射的目标设备,向智能电视发送本次的投射内容开始内容投射。其中,该投射内容可以包括手机正在播放的内容,例如,手机正在播放的音频内容和/或显示内容。该显示内容可以包括图片、视频中的画面或当前显示界面中的部分或全部内容等。
例如,手机可根据智能电视的标识查询当前接入的Wi-Fi网络中是否包含智能电视。如果包含智能电视,说明智能电视已经接入Wi-Fi网络,那么,手机可通过该Wi-Fi网络向智能电视动态的发送本次的投射内容。如果不包含智能电视,说明智能电视还未接入手机所在的Wi-Fi网络,则手机可提示用户将智能电视接入手机所在的同一Wi-Fi网络中。进而,手机可通过该Wi-Fi网络向智能电视动态的发送本次的投射内容。
或者,如果手机所在的Wi-Fi网络中不包含智能电视,手机也可以根据读取到的智能电视的标识(例如智能电视的MAC地址)自动与智能电视建立无线通信连接。例如,手机可与智能电视建立蓝牙连接或Wi-Fi P2P连接等,本申请实施例对此不做任何限制。
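示例性的，下面给出一段示意性的Java代码草图，说明源设备读取到目标设备标识后选择传输通道的一种可能逻辑：先在当前Wi-Fi网络内查找该标识，找不到时尝试建立点对点连接，仍不成功则提示用户把目标设备接入同一网络。其中DeviceDiscovery接口及其三个方法均为本示例假设的辅助接口，并非Android系统中已有的API。

    import java.util.Set;

    public class TargetChannelSelector {

        /** 假设的辅助接口：设备发现、点对点连接建立以及用户提示 */
        public interface DeviceDiscovery {
            Set<String> discoverDevicesOnCurrentWlan();
            boolean createP2pConnection(String deviceId);
            void promptUserToJoinSameNetwork(String deviceId);
        }

        private final DeviceDiscovery discovery;

        public TargetChannelSelector(DeviceDiscovery discovery) {
            this.discovery = discovery;
        }

        /** 返回true表示已经可以向目标设备发送投射内容 */
        public boolean prepareChannel(String targetDeviceId) {
            // 1. 目标设备已在同一Wi-Fi网络内：直接通过该网络发送投射内容
            if (discovery.discoverDevicesOnCurrentWlan().contains(targetDeviceId)) {
                return true;
            }
            // 2. 尝试根据标识（例如MAC地址）与目标设备建立蓝牙或Wi-Fi P2P连接
            if (discovery.createP2pConnection(targetDeviceId)) {
                return true;
            }
            // 3. 否则提示用户将目标设备接入手机所在的Wi-Fi网络
            discovery.promptUserToJoinSameNetwork(targetDeviceId);
            return false;
        }
    }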
另外，手机向智能电视发送的投射内容可以包括手机的显示内容。例如，手机可通过镜像投屏的方式，将实时显示的每一帧图像发送给智能电视，由智能电视同步显示手机的显示界面。又例如，手机可通过DLNA（digital living network alliance，数字生活网络联盟）投屏的方式，将手机显示界面中的视频、图片等部分显示内容发送给智能电视进行显示。
示例性的,当手机接触或靠近上述NFC标签701时,如果手机正在显示视频A的播放界面,则当NFC标签701的绑定设备为智能电视时,手机可作为源设备将整个播放界面(即显示界面中全部显示内容)作为投射内容发送给智能电视,或者,手机可作为源设备将播放界面中视频A的视频图像(即显示界面中部分显示内容)作为投射内容发送给智能电视。
又例如,当手机接触或靠近上述NFC标签701时,如果手机正在显示视频APP的播放列表,则当NFC标签701的绑定设备为智能电视时,手机也可作为源设备将正在显示的播放列表作为投射内容发送给智能电视。后续,如果手机检测到用户在上述播放列表中选择播放视频A,则手机可继续将视频A的播放界面或视频A的视频图像作为投射内容发送给智能电视。
当然,手机向智能电视发送的投射内容也可以包括手机正在播放的音频内容,例如,该音频内容可以是与手机正在显示的视频画面对应的音频文件。智能电视接收到手机实时发来的投屏内容后可显示或播放该投屏内容,以完成本次内容投射。
在一些实施例中,仍以本次内容投射的目标设备为智能电视举例,手机在向智能电视进行内容投射的过程中,用户通过手机与NFC标签701的交互,可触发手机向智能电视发送对应的投射指令,从而实现内容投射过程中相应的控制功能。
示例性的,用户在投射应用中为NFC标签701设置绑定设备时,可预先设置与NFC标签701和绑定设备关联的投射策略。例如,该投射策略包括不同NFC操作所对应的不同投射指令。示例性的,可设置手机连续触碰NFC标签701两次这一NFC操作对应的投射指令为播放下一集(或下一首)。
那么,在手机向智能电视进行内容投射的过程中,如果手机检测到用户输入连续触碰NFC标签701两次的操作,则手机可向智能电视发送播放下一集(或下一首)的投射指令。智能电视可响应该投射指令执行播放下一集(或下一首)的操作。也就是说,在内容投射过程中用户可使用源设备向NFC标签输入不同的NFC操作实现相应的控制功能,从而丰富内容投射场景下用户的使用体验。
S1503、当NFC标签701的绑定设备为多个电子设备时,手机确定本次内容投射的主设备。
其中,本次内容投射的主设备(master)可以为源设备(即手机),也可以为与NFC标签701绑定的多个绑定设备中的一个。主设备可作为控制节点通过星型拓扑结构与其他设备(即从设备)连接并交互。
在一些实施例中,当NFC标签701的绑定设备有多个时,手机可根据这多个绑定设备的设备类型、设备能力等信息确定具体的主设备。例如,手机可查询这多个绑定设备的计算能力,并将计算能力最强的绑定设备确定为本次内容投射的主设备,此时,手机与其他绑定设备可作为主设备的从设备。
在另一些实施例中,手机可预先设置在不同内容投射场景下对应的具体主设备。例如,可设置当绑定设备为智能电视和智能灯泡时,主设备为智能电视,从设备为手机和智能灯泡。又例如,可设置当绑定设备为智能音箱1和智能音箱2时,主设备为手机,从设备为智能音箱1和智能音箱2。又例如,可设置当绑定设备为智能电视和智能音箱时,主设备为手机,从设备为智能电视和智能音箱。那么,手机可根据在NFC标签701中读取到的多个绑定设备的标识,确定出在这几个绑定设备组成的内容投射场景下对应的具体主设备。
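示例性的，下面给出一段示意性的Java代码草图，说明按设备能力在源设备与多个绑定设备中确定主设备的一种可能实现：将计算能力评分最高的设备选为主设备，其余设备作为从设备。其中computeScore这一能力评分字段仅为本示例的假设。

    import java.util.List;

    public class MasterSelector {

        /** 参与本次内容投射的设备及其能力描述（评分规则为示例假设） */
        public static class DeviceInfo {
            public final String deviceId;
            public final int computeScore; // 计算能力评分，数值越大能力越强

            public DeviceInfo(String deviceId, int computeScore) {
                this.deviceId = deviceId;
                this.computeScore = computeScore;
            }
        }

        /** 在源设备与N个绑定设备中选出计算能力最强的设备作为主设备，其余设备作为从设备 */
        public static DeviceInfo selectMaster(DeviceInfo source, List<DeviceInfo> boundDevices) {
            DeviceInfo master = source;
            for (DeviceInfo candidate : boundDevices) {
                if (candidate.computeScore > master.computeScore) {
                    master = candidate;
                }
            }
            return master;
        }
    }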
S1504、若手机为主设备,则手机按照投射策略向各个绑定设备发送投射内容。
如果手机确定出本次内容投射的主设备为手机(即源设备),则手机可作为本次内容投射的控制节点,按照一定的投射策略向各个绑定设备(即目标设备)实时发送本次的投射内容,使得各个绑定设备接收到投射内容后开始播放或显示该投射内容。其中,上述投射策略可以是用户在绑定NFC标签701时预先设置的,也可以是手机根据绑定设备的设备类型、设备能力等信息预先设置的,也可以是手机确定自己为主设备后动态生成的,本申请实施例对此不做任何限制。
示例性的,如图19所示,当NFC标签701的绑定设备为智能音箱1和智能音箱2时,手机向智能音箱1和智能音箱2进行内容投射时可作为主设备,智能音箱1和智能音箱2可作为手机的从设备。在这种投射场景下,可设置投射策略与手机距离智能音箱1和智能音箱2的距离相关。
例如,手机可检测手机分别与智能音箱1和智能音箱2之间的距离。当手机与智能音箱1之间的距离小于预设值,而手机与智能音箱2之间的距离大于预设值时,说明用户距离智能音箱1较近而距离智能音箱2较远。那么,手机可作为主设备将本次的投射内容发送给智能音箱1,由智能音箱1播放本次投射内容完成内容投射。当然,手机也可默认向与手机距离最近的智能音箱发送本次投射内容。
或者,如果手机与智能音箱1和智能音箱2之间的距离均小于预设值,说明用户与智能音箱1和智能音箱2的距离均较近。那么,手机可按照立体声播放的投射策略分别向智能音箱1和智能音箱2发送投射内容。例如,手机可向智能音箱1发送投射内容中的低频分量,由智能音箱1播放投射内容中的低频分量,同时,手机可向智能音箱2发送投射内容中的高频分量,由智能音箱2播放投射内容中的高频分量。又例如,手机可向智能音箱1发送投射内容中与左声道对应的音频文件,同时向智能音箱2发送投射内容中与右声道对应的音频文件,使得智能音箱1和智能音箱2分别播放 投射内容中左声道和右声道的音频文件。当然,如果上述绑定设备中还包括除智能音箱1和智能音箱2之外的更多智能音箱时,手机可按照上述方法向每个智能音箱发送本次投射内容中对应的音频分量,使得多个音箱分别播放接收到的音频分量,实现立体声或环绕声的播放效果。
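示例性的，下面给出一段示意性的Java代码草图，说明如何把按左右声道交织存放的双声道PCM数据拆分为左声道和右声道两路音频分量，以便分别发送给智能音箱1和智能音箱2播放。其中“16位、双声道交织”这一采样格式为本示例的假设；若按高低频分量拆分，可用滤波器替换此处的声道拆分逻辑。

    public class StereoSplitter {

        /** 左右声道拆分结果 */
        public static class Channels {
            public final short[] left;
            public final short[] right;

            public Channels(short[] left, short[] right) {
                this.left = left;
                this.right = right;
            }
        }

        /**
         * 输入为按"左、右、左、右…"交织存放的16位PCM采样，
         * 输出为可分别投射给两个音箱播放的左声道和右声道数据。
         */
        public static Channels split(short[] interleavedStereo) {
            int frames = interleavedStereo.length / 2;
            short[] left = new short[frames];
            short[] right = new short[frames];
            for (int i = 0; i < frames; i++) {
                left[i] = interleavedStereo[2 * i];
                right[i] = interleavedStereo[2 * i + 1];
            }
            return new Channels(left, right);
        }
    }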
示例性的,手机向智能音箱1和智能音箱2发送投射内容之前,还可以向智能音箱1和智能音箱2发送同步指令,智能音箱1和智能音箱2可根据该同步指令与手机进行时间同步,以保证智能音箱1和智能音箱2的播放进度相同。例如,手机可在准备发送的投射内容中标记一个或多个时间戳,并将投射内容和投射内容中的时间戳一并发送给智能音箱1和智能音箱2。由于智能音箱1、智能音箱2以及手机进行时间同步后这三个设备的时间是同步的,因此,智能音箱1和智能音箱2可按照投射内容中的时间戳播放每一段投射内容,保证智能音箱1和智能音箱2的播放进度相同。
另外,手机还可以计算智能音箱1和智能音箱2响应上述同步指令时的传输时延。例如,智能音箱1响应上述同步指令时的传输时延为300ms,智能音箱2响应上述同步指令时的传输时延为500ms。那么,手机可根据该传输时延分别计算手机与智能音箱1和智能音箱2之间的距离。当然,手机也可以通过距离传感器、红外传感器等检测手机与智能音箱1和智能音箱2之间的距离,本申请实施例对此不做任何限制。
在一些实施例中,为了保证智能音箱1和智能音箱2能够同步播放手机发送的投射内容,手机还可以按照智能音箱1和智能音箱2的传输时延分别向智能音箱1和智能音箱2发送投射内容。仍以智能音箱1的传输时延为300ms,智能音箱2的传输时延为500ms举例,手机可在向智能音箱1发送投射内容前提前200ms向智能音箱2发送相同的投射内容,这样,智能音箱1和智能音箱2能够同时接收到手机发来的投射内容开始内容投射。
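示例性的，下面给出一段示意性的Java代码草图，说明根据各音箱响应同步指令时测得的传输时延计算发送时刻偏移的一种可能做法：对时延较大的设备先发送，对时延较小的设备按时延差值推迟发送，使各设备大致同时收到投射内容。其中Sender回调接口为本示例假设的发送接口。

    import java.util.Map;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class DelayCompensatedSender {

        /** 假设的发送接口：向指定设备发送一段投射内容 */
        public interface Sender {
            void send(String deviceId, byte[] content);
        }

        private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        /**
         * delaysMs为各设备响应同步指令时测得的传输时延（毫秒），
         * 例如智能音箱1为300ms、智能音箱2为500ms。
         * 时延最大的设备立即发送，其余设备按时延差值推迟发送，
         * 使各设备大致在同一时刻收到投射内容。
         */
        public void sendSynchronized(Map<String, Long> delaysMs, byte[] content, Sender sender) {
            long maxDelay = 0;
            for (long d : delaysMs.values()) {
                maxDelay = Math.max(maxDelay, d);
            }
            for (Map.Entry<String, Long> entry : delaysMs.entrySet()) {
                long offset = maxDelay - entry.getValue(); // 时延越小，发送时刻越晚
                scheduler.schedule(
                        () -> sender.send(entry.getKey(), content),
                        offset, TimeUnit.MILLISECONDS);
            }
        }
    }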
又或者,仍如图19所示,当手机为本次内容投射的主设备,智能音箱1和智能音箱2为手机的从设备时,手机可显示投射策略的设置界面。用户可在该设置界面中手动设置本次内容投射时使用哪个智能音箱播放手机发来的投射内容。并且,手机可保存用户为手机、智能音箱1和智能音箱2设置的投射策略,后续,当手机再次作为主设备向智能音箱1和智能音箱2进行内容投射时,手机可根据已存储的上述投射策略进行内容投射。也就是说,用户可以在进行内容投射的过程中手动为本次参与内容投射的多个设备设置相应的投射策略。
示例性的,如图20所示,当NFC标签701的绑定设备为智能电视和智能音箱时,手机向智能电视和智能音箱进行内容投射时可作为主设备,智能电视和智能音箱可作为手机的从设备。在这种投射场景下,可设置投射策略为使用智能电视播放投射内容中的显示内容,使用智能音箱播放投射内容中的音频内容。
那么,手机可作为主设备将本次投射内容中的显示内容发送给智能电视,由智能电视开始显示该显示内容。同时,手机可将本次投射内容中的音频内容发送给智能音箱,由智能音箱开始播放该音频内容。
或者，手机可作为主设备将本次投射内容中的显示内容和音频内容发送给智能电视，由智能电视播放该显示内容和音频内容。同时，手机可将本次投射内容中的音频内容发送给智能音箱，由智能音箱开始播放该音频内容。即智能电视和智能音箱可同时播放本次投射的音频内容。其中，上述智能电视可以包括一个或多个，上述智能音箱也可以包括一个或多个，本申请实施例对此不做任何限制。
类似的,为了保证智能电视显示的显示内容与智能音箱播放的音频内容同步,手机在向智能电视和智能音箱发送上述显示内容和音频内容之前,可与智能电视和智能音箱进行时间同步。进而,手机可将添加有时间戳的显示内容和音频内容分别发送给智能电视和智能音箱,使得智能电视和智能音箱能够同步的按照时间戳进行内容投射。
或者,手机向智能电视和智能音箱进行内容投射时的投射策略可以是动态设置的。例如,手机可作为主设备获取智能电视和智能音箱的设备能力。以智能电视具有显示和音频播放能力、智能音箱具有音频播放能力举例,手机可动态的确定将本次投射内容中的显示内容投射至智能电视中显示,并将本次投射内容中的音频内容同时投射至智能电视和智能音箱中播放。进而,手机可作为主设备将本次投射内容中的显示内容和音频内容发送给智能电视,同时将本次投射内容中的音频内容发送给智能音箱。
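示例性的，下面给出一段示意性的Java代码草图，说明主设备如何根据各绑定设备的设备能力动态决定显示内容与音频内容的投射去向：显示内容发给具备显示能力的设备，音频内容发给具备音频播放能力的设备。其中canDisplay、canPlayAudio等能力描述字段为本示例的假设。

    import java.util.ArrayList;
    import java.util.List;

    public class CapabilityRouter {

        /** 绑定设备的能力描述（字段为示例假设） */
        public static class TargetDevice {
            public final String deviceId;
            public final boolean canDisplay;   // 是否具备显示能力，例如智能电视
            public final boolean canPlayAudio; // 是否具备音频播放能力，例如智能电视、智能音箱

            public TargetDevice(String deviceId, boolean canDisplay, boolean canPlayAudio) {
                this.deviceId = deviceId;
                this.canDisplay = canDisplay;
                this.canPlayAudio = canPlayAudio;
            }
        }

        /** 投射去向：显示内容发给displayTargets，音频内容发给audioTargets */
        public static class Route {
            public final List<TargetDevice> displayTargets = new ArrayList<>();
            public final List<TargetDevice> audioTargets = new ArrayList<>();
        }

        /** 显示内容投射给具备显示能力的设备，音频内容投射给具备音频播放能力的设备 */
        public static Route route(List<TargetDevice> devices) {
            Route route = new Route();
            for (TargetDevice device : devices) {
                if (device.canDisplay) {
                    route.displayTargets.add(device);
                }
                if (device.canPlayAudio) {
                    route.audioTargets.add(device);
                }
            }
            return route;
        }
    }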
S1505、若手机不是主设备,则手机将投射内容发送给主设备,由主设备按照投射策略控制其他绑定设备开始本次内容投射。
如果手机确定出本次内容投射的主设备为NFC标签701的多个绑定设备中的一个,则手机可将本次投射内容发送给主设备,由该主设备按照一定的投射策略控制其他各个绑定设备开始内容投射。
示例性的,如图21所示,当NFC标签701的绑定设备为智能电视和智能灯泡时,智能电视可作为内容投射时的主设备,智能灯泡可作为智能电视的从设备。在这种投射场景下,可设置投射策略为使用智能电视显示和播放投射内容,并由智能电视控制智能灯泡的灯效。
那么，手机（即源设备）可将本次需要投射的投射内容发送给智能电视（即主设备）。当然，手机也可将本次内容投射的投射策略发送给智能电视。或者，智能电视内可预先存储从设备为智能灯泡时的投射策略，本申请实施例对此不做任何限制。进而，智能电视可作为主设备开始显示和播放手机发来的投射内容。同时，智能电视可根据投射内容向智能灯泡发送相应的控制指令，使智能灯泡在内容投射过程中投射出不同的灯效。
例如,当智能电视开始显示和播放投射内容时,智能电视可向智能灯泡发送关灯指令,以控制智能灯泡关闭灯源。又例如,智能电视可获取到正在播放的视频的类型。如果正在播放恐怖类型的视频,则智能电视可控制智能灯泡显示蓝色的光源;如果正在播放爱情类型的视频,则智能电视可控制智能灯泡显示粉色的光源等,使得用户在内容投射过程中获得较好的场景体验。
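示例性的，下面给出一段示意性的Java代码草图，说明主设备（例如智能电视）根据投射进度和正在播放的视频类型向智能灯泡下发控制指令的一种可能逻辑。其中视频类型枚举、指令字符串以及SmartBulb接口均为本示例的假设。

    public class LightEffectController {

        /** 正在播放的投射内容的类型（取值为示例假设） */
        public enum VideoGenre { HORROR, ROMANCE, OTHER }

        /** 假设的灯泡控制接口：向智能灯泡发送一条控制指令 */
        public interface SmartBulb {
            void sendCommand(String command);
        }

        /** 内容投射开始时，控制智能灯泡关闭灯源 */
        public static void onProjectionStart(SmartBulb bulb) {
            bulb.sendCommand("TURN_OFF");
        }

        /** 根据正在播放的视频类型切换灯光颜色 */
        public static void onGenreDetected(SmartBulb bulb, VideoGenre genre) {
            switch (genre) {
                case HORROR:
                    bulb.sendCommand("COLOR_BLUE");
                    break;
                case ROMANCE:
                    bulb.sendCommand("COLOR_PINK");
                    break;
                default:
                    // 其他类型保持当前灯效不变
                    break;
            }
        }
    }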
在另一些实施例中,手机通过读取NFC标签701获取到与NFC标签701绑定的绑定设备有多个时,手机也可默认自身为本次内容投射过程中的主设备,此时,手机无需再执行上述步骤S1503和S1505,可按照步骤S1504中的相关方法按照投射策略向各个绑定设备发送投射内容,完成本次内容投射。
可以看出，在本申请实施例提供的内容投射方法中，用户可以通过触碰NFC标签的方式，方便、快捷的将源设备中的投射内容投射至用户所需的目标设备中，实现“碰一碰投射”的功能。并且，源设备可一次性将投射内容同时投射至多个目标设备中，通过多个目标设备的协同配合在不同投射场景下实现不同的投射效果，提高用户的使用体验以及多设备之间协同的工作效率。
在一些实施例中,用户在手机的投射应用中设置NFC标签701的绑定设备后,手机还可以将NFC标签701与绑定设备之间的绑定关系备份至投射应用的应用服务器中。例如,手机可以将NFC标签701的NFC卡号,以及与NFC标签701绑定的一个或多个绑定设备的标识发送给应用服务器,使得应用服务器建立该NFC标签701与对应绑定设备之间的绑定关系。
这样,当用户更换手机(即源设备)时,用户可在新的源设备上安装并登录投射应用,进而,新的源设备可从投射应用的应用服务器中重新获取到NFC标签701与对应绑定设备之间的绑定关系。那么,用户使用新的源设备触碰NFC标签701时,新的源设备同样可执行上述步骤S1501-S1505向对应的绑定设备进行内容投射。
在一些实施例中,用户在手机的投射应用中设置NFC标签701的绑定设备和投射策略后,还可以将NFC标签701、对应的绑定设备以及对应的投射策略分享给其他用户。例如,用户A可将NFC标签701、绑定设备以及投射策略通过微信等方式分享给用户A的家人(例如用户A的父母)。那么,用户A父母的手机接收到该分享内容后可保存NFC标签701、绑定设备以及投射策略之间的对应关系。后续,用户A的父母使用其手机触碰NFC标签701时,该手机也可同样执行上述步骤S1501-S1505向对应的绑定设备进行内容投射。
另外,用户在为NFC标签701的绑定设备设置投射策略时,还可以在投射策略中设置具体的投射内容、投射时间等。例如,用户可为自己孩子设置与NFC标签701对应的投射内容为学习视频A,投射时间为1小时。那么,用户使用其手机触碰NFC标签701时,或者,用户将该投射策略分享给父母,父母使用其手机触碰NFC标签701时,手机可按照用户在投射策略中设置的投射内容和投射时间向对应的绑定设备进行内容投射,使手机可以有针对性的完成本次内容投射,降低老人和小孩进行内容投射时的操作难度。
本申请实施例公开了一种电子设备,包括处理器,以及与处理器相连的存储器、通信接口、输入设备和输出设备。其中,输入设备和输出设备可集成为一个设备,例如,可将触摸传感器作为输入设备,将显示屏作为输出设备,并将触摸传感器和显示屏集成为触摸屏。
此时,如图22所示,上述电子设备可以包括:触摸屏2201,所述触摸屏2201包括触摸传感器2206和显示屏2207;一个或多个处理器2202;存储器2203;一个或多个应用程序(未示出);通信接口2208;以及一个或多个计算机程序2204,上述各器件可以通过一个或多个通信总线2205连接。其中该一个或多个计算机程序2204被存储在上述存储器2203中并被配置为被该一个或多个处理器2202执行,该一个或多个计算机程序2204包括指令,上述指令可以用于执行上述实施例中的各个步骤。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应实体器件的功能描述,在此不再赘述。
示例性的，上述处理器2202具体可以为图5所示的处理器110，上述存储器2203具体可以为图5所示的内部存储器121，上述显示屏2207具体可以为图5所示的显示屏194，上述触摸传感器具体可以为图5所示的传感器模块180中的触摸传感器，本申请实施例对此不做任何限制。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (25)

  1. 一种跨设备的内容投射方法,其特征在于,包括:
    第一电子设备开始播放第一内容;
    所述第一电子设备从近场通信NFC标签中获取与所述NFC标签绑定的N个第二电子设备,N为大于1的整数;
    所述第一电子设备按照预设的投射策略将所述第一内容投射至所述N个第二电子设备中的至少一个第二电子设备上继续播放。
  2. 根据权利要求1所述的方法,其特征在于,所述第一电子设备从NFC标签中获取与所述NFC标签绑定的N个第二电子设备,包括:
    响应于所述第一电子设备靠近或接触所述NFC标签的碰一碰操作,所述第一电子设备读取所述NFC标签中存储的N个第二电子设备的标识,以确定与所述NFC标签绑定的N个第二电子设备;或者,
    所述第一电子设备使用NFC芯片检测到来自所述NFC标签的NFC信号后,读取所述NFC标签中存储的N个第二电子设备的标识,以确定与所述NFC标签绑定的N个第二电子设备;其中,所述NFC芯片包含在所述第一电子设备中。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第一电子设备按照预设的投射策略将所述第一内容投射至所述N个第二电子设备中的至少一个第二电子设备上继续播放,包括:
    所述第一电子设备按照预设的投射策略,将所述第一内容发送给所述N个第二电子设备中的至少一个第二电子设备播放。
  4. 根据权利要求3所述的方法,其特征在于,所述N个第二电子设备包括第一音箱和第二音箱;
    其中,所述第一电子设备按照预设的投射策略,将所述第一内容发送给所述N个第二电子设备中的至少一个第二电子设备播放,包括:
    所述第一电子设备将所述第一内容发送给所述第一音箱播放,所述第一音箱为与所述第一电子设备距离最近的音箱;或者,
    所述第一电子设备将所述第一内容发送给所述第一音箱和所述第二音箱播放。
  5. 根据权利要求4所述的方法,其特征在于,所述第一电子设备将所述第一内容发送给所述第一音箱和所述第二音箱播放,包括:
    所述第一电子设备将所述第一内容中的第一音频分量发送给所述第一音箱播放;并且,所述第一电子设备将所述第一内容中的第二音频分量发送给所述第二音箱播放。
  6. 根据权利要求3所述的方法,其特征在于,所述N个第二电子设备包括音箱和电视;
    其中,所述第一电子设备按照预设的投射策略,将所述第一内容发送给所述N个第二电子设备中的至少一个第二电子设备播放,包括:
    所述第一电子设备将所述第一内容中的显示内容发送给所述电视播放;并且,所述第一电子设备将所述第一内容中的音频内容发送给所述音箱播放;或者,
    所述第一电子设备将所述第一内容中的显示内容发送给所述电视播放;并且,所述第一电子设备将所述第一内容中的音频内容发送给所述电视和所述音箱播放。
  7. 根据权利要求1或2所述的方法,其特征在于,在所述第一电子设备从NFC标签中获取与所述NFC标签绑定的N个第二电子设备之后,还包括:
    所述第一电子设备在所述N个第二电子设备中确定主设备;
    其中,所述第一电子设备按照预设的投射策略将所述第一内容投射至所述N个第二电子设备中的至少一个第二电子设备上继续播放,包括:
    所述第一电子设备将所述第一内容发送给所述主设备,以使得所述主设备按照预设的投射策略控制所述N个第二电子设备中的至少一个第二电子设备播放所述第一内容。
  8. 根据权利要求7所述的方法,其特征在于,所述N个第二电子设备包括电视和灯;
    其中,所述第一电子设备在所述N个第二电子设备中确定主设备,包括:
    所述第一电子设备将所述电视确定为所述N个第二电子设备中的主设备;
    其中，预设的投射策略包括：由所述电视播放所述第一内容中的显示内容和音频内容，由所述电视根据所述第一内容向所述灯发送控制指令，以控制所述灯发光的亮度或颜色。
  9. 根据权利要求7或8所述的方法,其特征在于,在所述第一电子设备在所述N个第二电子设备中确定主设备之后,还包括:
    所述第一电子设备将存储的投射策略发送给所述主设备。
  10. 根据权利要求3-9中任一项所述的方法,其特征在于,在所述第一电子设备按照预设的投射策略将所述第一内容投射至所述N个第二电子设备中的至少一个第二电子设备上继续播放之前,还包括:
    所述第一电子设备与所述N个第二电子设备进行时间同步;
    其中,所述第一电子设备发送的所述第一内容中携带有时间戳,所述时间戳用于指示所述第一内容的播放进度。
  11. 根据权利要求1-10中任一项所述的方法,其特征在于,在所述第一电子设备从NFC标签中获取与所述NFC标签绑定的N个第二电子设备之后,还包括:
    所述第一电子设备接收用户对所述N个第二电子设备输入的投射策略。
  12. 一种跨设备的内容投射方法,其特征在于,包括:
    第一电子设备显示近场通信NFC标签的绑定界面,所述绑定界面中包括等待与所述NFC标签绑定的候选设备列表,所述候选设备列表中的候选设备与所述第一电子设备位于同一通信网络内;
    所述第一电子设备检测到用户在所述候选设备列表中选择M个第二电子设备的第一操作,M为大于0的整数;
    响应于所述第一操作,所述第一电子设备提示用户将所述第一电子设备靠近或接触所述NFC标签;
    所述第一电子设备向所述NFC标签中写入所述M个第二电子设备的标识,以建立所述NFC标签与所述M个第二电子设备之间的绑定关系。
  13. 根据权利要求12所述的方法,其特征在于,所述第一电子设备显示NFC标签的绑定界面,包括:
    所述第一电子设备读取所述NFC标签中预设的标志位;
    若所述标志位中的取值为第一预设值,则所述第一电子设备打开预设的投射应用显示所述NFC标签的绑定界面。
  14. 根据权利要求13所述的方法,其特征在于,在所述第一电子设备向所述NFC标签中写入所述M个第二电子设备的标识之后,还包括:
    所述第一电子设备将所述标志位的取值从所述第一预设值修改为第二预设值。
  15. 根据权利要求12-14中任一项所述的方法,其特征在于,在所述第一电子设备向所述NFC标签中写入所述M个第二电子设备的标识之后,还包括:
    所述第一电子设备显示投射策略的设置界面;
    所述第一电子设备接收用户在所述设置界面中对所述M个第二电子设备输入的投射策略,并保存所述投射策略。
  16. 根据权利要求15所述的方法,其特征在于,当M=1时,所述投射策略包括不同NFC操作与投射指令之间的对应关系。
  17. 根据权利要求15所述的方法,其特征在于,当M>1时,所述投射策略包括为每一个第二电子设备设置的内容投射规则。
  18. 根据权利要求17所述的方法,其特征在于,
    当所述M个第二电子设备包括第一音箱和第二音箱时,所述投射策略为:使用距离源设备最近的音箱播放投射内容,或者,所述投射策略为:使用所述第一音箱播放投射内容中的第一音频分量并使用所述第二音箱播放投射内容中的第二音频分量;
    当所述M个第二电子设备包括电视和音箱时,所述投射策略为:使用所述电视播放投射内容中的显示内容,并使用所述音箱播放投射内容中的音频内容;或者,使用所述电视播放投射内容中的显示内容,并使用所述音箱和所述电视播放投射内容中的音频内容;
    当所述M个第二电子设备包括电视和灯时,所述投射策略为:使用所述电视播放投射内容,并由所述电视控制所述灯的灯光效果。
  19. 根据权利要求12-18中任一项所述的方法,其特征在于,在所述第一电子设备向所述NFC标签中写入所述M个第二电子设备的标识之后,还包括:
    所述第一电子设备将所述NFC标签与所述M个第二电子设备之间的绑定关系发送给其他电子设备或服务器。
  20. 根据权利要求12-19中任一项所述的方法,其特征在于,
    所述候选设备列表中的候选设备与所述第一电子设备位于同一Wi-Fi网络内,或者,所述候选设备列表中的候选设备与所述第一电子设备绑定在同一账号下;或者,
    所述第一电子设备向所述NFC标签中写入所述第二电子设备的标识,包括:响应于所述第一电子设备靠近或接触所述NFC标签的碰一碰操作,所述第一电子设备向所述NFC标签中写入所述第二电子设备的标识;或者,所述第一电子设备使用NFC芯片检测到来自所述NFC标签的NFC信号后,向所述NFC标签中写入所述第二电子设备的标识,所述NFC芯片包含在所述第一电子设备中;或者,
    所述第一电子设备读取所述NFC标签中预设的标志位，包括：响应于所述第一电子设备靠近或接触所述NFC标签的碰一碰操作，所述第一电子设备读取所述NFC标签中预设的标志位；或者，所述第一电子设备使用NFC芯片检测到来自所述NFC标签的NFC信号后，读取所述NFC标签中预设的标志位。
  21. 一种内容投射系统,其特征在于,包括第一电子设备、N个第二电子设备以及NFC标签,N为大于1的整数;所述NFC标签中存储有所述NFC标签与所述N个第二电子设备之间的绑定关系;其中,所述第一电子设备用于执行如权利要求1-11或权利要求12-20中任一项所述的跨设备的内容投射方法。
  22. 根据权利要求21所述的系统,其特征在于,所述N个第二电子设备中包括主设备;
    其中,所述主设备用于:接收所述第一电子设备发送的第一内容;按照预设的投射策略控制所述N个第二电子设备中的至少一个第二电子设备播放所述第一内容。
  23. 一种电子设备,其特征在于,包括:
    触摸屏,所述触摸屏包括触摸传感器和显示屏;
    一个或多个处理器;
    通信接口;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求1-11或权利要求12-20中任一项所述的跨设备的内容投射方法。
  24. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-11或权利要求12-20中任一项所述的跨设备的内容投射方法。
  25. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-11或权利要求12-20中任一项所述的跨设备的内容投射方法。