WO2022252842A1 - Media file transmission method and device - Google Patents

Media file transmission method and device

Info

Publication number
WO2022252842A1
Authority
WO
WIPO (PCT)
Prior art keywords
transmission
media file
end device
transmission mode
capability information
Prior art date
Application number
PCT/CN2022/086907
Other languages
English (en)
French (fr)
Inventor
提纯利
韦家毅
孙瑞囡
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022252842A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware

Definitions

  • the embodiments of the present application relate to the field of communications, and in particular, to a method and device for transmitting media files.
  • When media files are transmitted between electronic devices, after the sending end device obtains a media file, it decompresses and de-DRMs the media file to obtain the bare code stream (a code stream that can be played directly), and then uses the transmission interface to transmit the uncompressed bare code stream, or a losslessly or lightly compressed bare code stream, directly to the receiving end device, where it is displayed and played.
  • the media file transmission method and device provided by the present application reduce the amount of data to be transmitted when transmitting media files, thereby saving transmission bandwidth.
  • In a first aspect, a method for transmitting a media file is provided. The method is applied to a sending end device, and the sending end device supports a first transmission mode, a second transmission mode, and a third transmission mode. In the first transmission mode, the sending end device performs all of the operations of restoring the media file to be playable; in the second transmission mode, the sending end device performs part of the operations of restoring the media file to be playable; in the third transmission mode, the sending end device transparently transmits the media file. The transmission channel corresponding to a transmission mode includes the units that perform, in that transmission mode, the operations of restoring a media file to be playable.
  • The method includes: acquiring a media file; according to the first capability information, determining, from the transmission channels corresponding to the transmission modes supported by the first capability information, a first transmission channel for transmitting the media file according to rules, where the first capability information is used to indicate the media processing capability of the receiving end device, or is used to indicate the media processing capabilities of the sending end device and the receiving end device; and transmitting, by the sending end device, the media file to the receiving end device through the units of the first transmission channel in the sending end device.
  • The media file transmission method provided by the embodiments of the present application supports multiple transmission modes between the sending end device and the receiving end device. According to the capability information of the sending end device and the receiving end device, it is determined that the sending end device performs part or all of the multiple operations of restoring the media file to be playable, and the receiving end device performs the remaining operations. In different transmission modes, the sending end device performs different operations among the multiple operations of restoring the media file to be playable. Therefore, during the transmission of media files, the computing and media processing capabilities of the receiving end device can be utilized to reduce the amount of data on the transmission interface between the sending and receiving ends and save bandwidth.
  • When there are multiple media files requested by the sending end device, the method includes: determining, from the transmission channels corresponding to the transmission modes supported by the first capability information, a first transmission channel for transmitting each media file according to the rules; and transmitting the corresponding media file to the receiving end device through the units of each first transmission channel in the sending end device.
  • This embodiment describes that when the sending device requests multiple media files at the same time, a transmission channel is determined for each media file, and each media file is transmitted to the receiving device.
  • The method provided in this application may further include: acquiring second capability information, where the second capability information is used to indicate one or more of the following capabilities: the transmission capability of the interface, the computing capability of the sending end device, and the computing capability of the receiving end device.
  • The method provided by the present application further includes: if the capability indicated by the second capability information does not support the resources occupied by the multiple first transmission channels, adjusting the first transmission channel for transmitting a first media file until the capability indicated by the second capability information supports the resources occupied by the multiple first transmission channels.
  • the first media file is one or more of the multiple media files.
  • This embodiment describes that when the capability indicated by the second capability information does not support the resources occupied by the multiple first transmission channels, the first transmission channel for transmitting the first media file is adjusted until the capability indicated by the second capability information supports the resources occupied by the multiple first transmission channels, so as to realize the simultaneous transmission of multiple media files.
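The adjustment described above can be sketched as a simple loop: while the resources occupied by the chosen channels exceed what the second capability information indicates, re-select the channel of one media file at a time. This is an illustrative sketch; only interface bandwidth is modelled, and all names (`total_bandwidth`, `adjust_until_supported`, the `bandwidth_mbps` field) are assumptions, not terminology from the application.

```python
def total_bandwidth(assignments):
    """Sum the interface bandwidth occupied by the currently chosen channels."""
    return sum(ch["bandwidth_mbps"] for ch in assignments.values())

def adjust_until_supported(assignments, candidates, bandwidth_budget_mbps):
    """Re-select the channel of one media file at a time until the combined
    interface bandwidth fits the indicated capability (compute budgets on the
    two devices would be handled the same way)."""
    for name in list(assignments):
        if total_bandwidth(assignments) <= bandwidth_budget_mbps:
            break
        # switch this file's transmission to the lowest-bandwidth candidate channel
        assignments[name] = min(candidates, key=lambda ch: ch["bandwidth_mbps"])
    if total_bandwidth(assignments) > bandwidth_budget_mbps:
        raise RuntimeError("no channel assignment fits the indicated capability")
    return assignments
```

Only as many channels are downgraded as needed, so files whose channels already fit keep their originally selected transmission mode.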
  • The transmission channel corresponding to the first transmission mode includes the units in the sending end device that perform the operations of restoring the media file to be playable;
  • the transmission channel corresponding to the second transmission mode includes the units in the sending end device that perform the operations before the first operation among the operations of restoring the media file to be playable, and the units in the receiving end device that perform the first operation and the operations after the first operation;
  • the first operation is any operation, among the operations of restoring the media file to be playable, that is supported by the receiving end device;
  • the transmission channel corresponding to the third transmission mode includes the units in the receiving end device that perform the operations of restoring the media file to be playable.
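The division of restore operations across the three transmission modes can be sketched as follows. The Python names (`RESTORE_OPS`, `TransmissionChannel`, `channel_for_mode`) and the two-operation pipeline are illustrative assumptions, not terminology from the application.

```python
from dataclasses import dataclass

# Ordered operations that restore a media file to playable (assumed pipeline).
RESTORE_OPS = ["decode", "render"]

@dataclass
class TransmissionChannel:
    mode: int           # 1, 2, or 3
    sender_ops: list    # restore operations performed on the sending end device
    receiver_ops: list  # restore operations performed on the receiving end device

def channel_for_mode(mode: int, first_op_index: int = 0) -> TransmissionChannel:
    """Mode 1: the sender performs all restore operations.
    Mode 2: the sender performs the operations before the first operation;
            the receiver performs the first operation and everything after it.
    Mode 3: the sender transparently transmits; the receiver performs all operations."""
    if mode == 1:
        return TransmissionChannel(1, RESTORE_OPS[:], [])
    if mode == 2:
        return TransmissionChannel(2, RESTORE_OPS[:first_op_index], RESTORE_OPS[first_op_index:])
    if mode == 3:
        return TransmissionChannel(3, [], RESTORE_OPS[:])
    raise ValueError("unknown transmission mode")
```

In mode 2 the split point (`first_op_index`) corresponds to the "first operation": any restore operation the receiving end device supports.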
  • From the transmission channels corresponding to the transmission modes supported by the first capability information, the first transmission channel for transmitting each media file is determined according to the rules. This can be specifically implemented as follows: according to the first capability information, determine the third capability information of each transmission channel corresponding to the transmission modes supported by the first capability information, where the third capability information is used to indicate the characteristics of the media files that each transmission channel supports transmitting; and determine a transmission channel whose third capability information satisfies the rules as the first transmission channel.
  • the above rules may include one or more of the following: playback effect priority, minimum computing resource consumption of the sending end device, minimum computing resource consumption of the receiving end device, and minimum interface transmission bandwidth occupation.
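A channel-selection rule can be applied over the per-channel third capability information as a simple argmax/argmin. The records below are illustrative: the field names (`max_resolution`, `sender_load`, `receiver_load`, `bandwidth_mbps`) and numeric values are assumptions for the sketch, not data from the application.

```python
# Hypothetical third-capability records for the three candidate channels.
channels = [
    {"mode": 1, "max_resolution": 2160, "sender_load": 3, "receiver_load": 0, "bandwidth_mbps": 12000},
    {"mode": 2, "max_resolution": 2160, "sender_load": 2, "receiver_load": 1, "bandwidth_mbps": 400},
    {"mode": 3, "max_resolution": 1080, "sender_load": 0, "receiver_load": 3, "bandwidth_mbps": 25},
]

def select_channel(channels, rule):
    """Pick the channel whose third capability information best satisfies one rule."""
    if rule == "playback_effect_priority":
        return max(channels, key=lambda c: c["max_resolution"])
    if rule == "min_sender_compute":
        return min(channels, key=lambda c: c["sender_load"])
    if rule == "min_receiver_compute":
        return min(channels, key=lambda c: c["receiver_load"])
    if rule == "min_interface_bandwidth":
        return min(channels, key=lambda c: c["bandwidth_mbps"])
    raise ValueError(f"unknown rule: {rule}")
```

Note how the rules pull in different directions: minimizing interface bandwidth favors pushing restore operations to the receiver, while minimizing receiver compute favors the opposite, which is why the rule set is configurable.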
  • If the first capability information indicates that the receiving end device does not support the second operation, where the second operation is the last operation among the operations of restoring the media file to be playable, it is determined that the supported transmission modes include the first transmission mode; or, if the first capability information indicates that the receiving end device supports the second operation, it is determined that the supported transmission modes include the first transmission mode and the second transmission mode; or, if the first capability information indicates that the receiving end device supports all of the operations of restoring the media file to be playable, it is determined that the supported transmission modes include the first transmission mode, the second transmission mode, and the third transmission mode.
  • This embodiment illustrates that the transmission mode supported between the sending end device and the receiving end device is determined according to the processing capability of the receiving end device for media files.
  • the operation of restoring the media file to be playable includes: a decoding operation and a rendering operation.
  • The decoding operation includes a decompression operation, or the decoding operation includes a decompression operation and a de-DRM operation.
  • the transmission channel corresponding to the first transmission mode includes the decoding unit and the rendering unit in the sending end device;
  • the transmission channel corresponding to the second transmission mode includes the decoding unit in the sending end device and the rendering unit in the receiving end device;
  • the transmission channel corresponding to the third transmission mode includes the decoding unit and the rendering unit in the receiving end device.
  • The method provided in this application may further include: if the first capability information indicates that the receiving end device does not support the rendering operation, determining that the transmission modes supported by the first capability information include the first transmission mode; or, if the first capability information indicates that the receiving end device supports the rendering operation but does not support the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode and the second transmission mode; or, if the first capability information indicates that the receiving end device supports both the rendering operation and the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode, the second transmission mode, and the third transmission mode. This embodiment describes the determination of the transmission modes supported between the sending end device and the receiving end device according to the specific operations in media file processing supported by the receiving end device.
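The decode/render decision above maps directly to a small function. This is a sketch under the assumption that decoding and rendering are the only two restore operations; the function and parameter names are illustrative.

```python
def supported_modes(receiver_supports_render: bool, receiver_supports_decode: bool) -> set:
    """Return the set of transmission modes (1, 2, 3) the device pair can use,
    following the capability-based decision described above."""
    if not receiver_supports_render:
        # Receiver cannot finish restoration: sender must do everything.
        return {1}
    if not receiver_supports_decode:
        # Receiver can render but not decode: sender decodes, receiver renders.
        return {1, 2}
    # Receiver can do everything: sender may also transparently transmit.
    return {1, 2, 3}
```

A more capable receiver never loses modes: each added capability strictly enlarges the supported set, which is consistent with mode 1 always being available.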
  • The first capability information includes one or more of the following: codec capability, de-DRM capability, interface encryption and decryption capability, audio and video rendering capability, and display and playback capability.
  • The media file includes a bare-code-stream media file to be displayed on the sending end device, or the media file includes a compressed media file to be displayed on the sending end device.
  • In a second aspect, a method for transmitting media files is provided. The method is applied to a receiving end device, and the receiving end device supports a first transmission mode, a second transmission mode, and a third transmission mode. In the first transmission mode, the receiving end device receives playable media files; in the second transmission mode, the receiving end device performs part of the operations of restoring the media file to be playable; in the third transmission mode, the receiving end device performs all of the operations of restoring the media file to be playable.
  • The method may include: according to the first capability information, the receiving end device determines, from the transmission channels corresponding to the transmission modes supported by the first capability information, a first transmission channel for transmitting the media file according to rules; the first capability information is used to indicate the media processing capability of the sending end device, or is used to indicate the media processing capabilities of the sending end device and the receiving end device; a transmission channel corresponding to a transmission mode includes the units that perform, in that transmission mode, the operations of restoring the media file to be playable. The receiving end device obtains the playable media file through the units of the first transmission channel in the receiving end device, and displays and plays the playable media file.
  • The media file transmission method provided by the embodiments of the present application supports multiple transmission modes between the sending end device and the receiving end device. According to the capability information of the sending end device and the receiving end device, it is determined that the sending end device performs part or all of the multiple operations of restoring the media file to be playable, and the receiving end device performs the remaining operations. In different transmission modes, the sending end device performs different operations among the multiple operations of restoring the media file to be playable. Therefore, during the transmission of media files, the computing and media processing capabilities of the receiving end device can be utilized to reduce the amount of data on the transmission interface between the sending and receiving ends and save bandwidth.
  • When there are multiple media files requested by the sending end device, the method includes: determining, from the transmission channels corresponding to the transmission modes supported by the first capability information, a first transmission channel for transmitting each media file according to the rules; and transmitting the corresponding media file to the receiving end device through the units of each first transmission channel in the sending end device.
  • This embodiment describes that when the sending device requests multiple media files at the same time, a transmission channel is determined for each media file, and each media file is transmitted to the receiving device.
  • The method provided in this application may further include: acquiring second capability information, where the second capability information is used to indicate one or more of the following capabilities: the transmission capability of the interface, the computing capability of the sending end device, and the computing capability of the receiving end device.
  • The method provided by the present application further includes: if the capability indicated by the second capability information does not support the resources occupied by the multiple first transmission channels, adjusting the first transmission channel for transmitting a first media file until the capability indicated by the second capability information supports the resources occupied by the multiple first transmission channels.
  • the first media file is one or more of the multiple media files.
  • This embodiment describes that when the capability indicated by the second capability information does not support the resources occupied by the multiple first transmission channels, the first transmission channel for transmitting the first media file is adjusted until the capability indicated by the second capability information supports the resources occupied by the multiple first transmission channels, so as to realize the simultaneous transmission of multiple media files.
  • The transmission channel corresponding to the first transmission mode includes the unit in the receiving end device that performs the playback operation; the transmission channel corresponding to the second transmission mode includes the units in the sending end device that perform the operations before the first operation among the operations of restoring the media file to be playable, and the units in the receiving end device that perform the first operation and the operations after the first operation among the operations of restoring the media file to be playable; the first operation is any operation, among the operations of restoring the media file to be playable, that is supported by the receiving end device; the transmission channel corresponding to the third transmission mode includes the units in the receiving end device that perform the operations of restoring the media file to be playable, and the unit that performs the playback operation.
  • This embodiment describes the media processing units in the sending end device and the receiving end device respectively corresponding to the three transmission modes.
  • From the transmission channels corresponding to the transmission modes supported by the first capability information, the first transmission channel for transmitting each media file is determined according to the rules. This can be specifically implemented as follows: according to the first capability information, determine the third capability information of each transmission channel corresponding to the transmission modes supported by the first capability information, where the third capability information is used to indicate the characteristics of the media files that each transmission channel supports transmitting; and determine a transmission channel whose third capability information satisfies the rules as the first transmission channel.
  • the above rules may include one or more of the following: playback effect priority, minimum computing resource consumption of the sending end device, minimum computing resource consumption of the receiving end device, and minimum interface transmission bandwidth occupation.
  • If the receiving end device does not support the second operation, where the second operation is the last operation among the operations of restoring the media file to be playable, it is determined that the supported transmission modes include the first transmission mode; or, if the receiving end device supports the second operation, the supported transmission modes include the first transmission mode and the second transmission mode; or, if the receiving end device supports all of the operations of restoring the media file to be playable, it is confirmed that the supported transmission modes include the first transmission mode, the second transmission mode, and the third transmission mode.
  • This embodiment illustrates that the transmission mode supported between the sending end device and the receiving end device is determined according to the processing capability of the receiving end device for media files.
  • the operation of restoring the media file to be playable includes: a decoding operation and a rendering operation.
  • The decoding operation includes a decompression operation, or the decoding operation includes a decompression operation and a de-DRM operation.
  • the transmission channel corresponding to the first transmission mode includes the decoding unit and the rendering unit in the sending end device;
  • the transmission channel corresponding to the second transmission mode includes the decoding unit in the sending end device and the rendering unit in the receiving end device;
  • the transmission channel corresponding to the third transmission mode includes the decoding unit and the rendering unit in the receiving end device.
  • The method provided in this application may further include: if the first capability information indicates that the receiving end device does not support the rendering operation, determining that the transmission modes supported by the first capability information include the first transmission mode; or, if the first capability information indicates that the receiving end device supports the rendering operation but does not support the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode and the second transmission mode; or, if the first capability information indicates that the receiving end device supports both the rendering operation and the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode, the second transmission mode, and the third transmission mode. This embodiment illustrates that the transmission modes supported between the sending end device and the receiving end device are determined according to the specific operations in media file processing supported by the receiving end device.
  • The first capability information includes one or more of the following: codec capability, de-DRM capability, interface encryption and decryption capability, audio and video rendering capability, and display and playback capability.
  • The media file includes a bare-code-stream media file to be displayed on the sending end device, or the media file includes a compressed media file to be displayed on the sending end device.
  • playing the playable media file includes: synchronously playing the playable media file according to time stamp information in the media file.
  • This embodiment illustrates that the receiver device can play multiple playable media files synchronously.
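Timestamp-driven synchronization can be modelled as merging already-decoded frames from several media files into a single presentation order. This toy sketch assumes each frame is a `(timestamp_ms, stream_label)` pair and each stream is already sorted by timestamp; the names are illustrative, not from the application.

```python
import heapq

def interleave_by_timestamp(*streams):
    """Merge frames from several playable media files into one presentation
    order by their timestamps, modelling synchronized playback of multiple
    files on the receiving end device."""
    return list(heapq.merge(*streams))
```

A real player would additionally pace presentation against a shared clock; the merge only establishes the order in which frames become due.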
  • In a third aspect, a media file transmission device is provided. The media file transmission device is deployed on a sending end device that executes the media file transmission method provided in the first aspect or any possible implementation manner of the first aspect.
  • The media file transmission apparatus may include a first acquiring unit, a first determining unit, and a processing unit. Wherein:
  • the first acquiring unit is configured to acquire the media file according to the first capability information, where the first capability information is used to indicate the media processing capability of the receiver device communicating with the sender device, or the first capability information is used to indicate the sending The media processing capability of the end device and the receiving end device.
  • The first determining unit is configured to determine, from the transmission channels corresponding to the transmission modes supported by the first capability information, the first transmission channel for transmitting media files according to the rules; a transmission channel corresponding to a transmission mode includes the units that perform, in that transmission mode, the operations of restoring the media file to be playable.
  • the processing unit is configured to transmit the media file to the receiving end device through the unit in the sending end device through the first transmission channel.
  • In a fourth aspect, a media file transmission device is provided. The media file transmission device is deployed on a receiving end device that executes the media file transmission method provided in the second aspect or any possible implementation manner of the second aspect.
  • The receiving end device may include a first determining unit, an acquiring unit, and a playing unit. Wherein:
  • The first determining unit is configured to determine, from the transmission channels corresponding to the transmission modes supported by the first capability information, the first transmission channel for transmitting media files with the sending end device according to the rules; a transmission channel corresponding to a transmission mode includes the units that, in that transmission mode, restore media files to playable media files.
  • the acquiring unit is configured to acquire playable media files through the first transmission channel.
  • the playback unit is used to play a playable media file.
  • The functions of the units in the fourth aspect are the same as those described in the method of the second aspect, and will not be repeated here.
  • In a fifth aspect, a computer-readable storage medium is provided, including instructions which, when run on a computer, cause the computer to execute the media file transmission method provided in the above-mentioned first aspect or any possible implementation thereof.
  • In a sixth aspect, a computer-readable storage medium is provided, including instructions which, when run on a computer, cause the computer to execute the media file transmission method provided in the above-mentioned second aspect or any possible implementation thereof.
  • a computer program product containing instructions is provided, and when it is run on a computer, it causes the computer to execute the media file transmission method provided in the above first aspect or any possible implementation thereof.
  • a computer program product containing instructions which, when run on a computer, causes the computer to execute the media file transmission method provided in the above second aspect or any possible implementation manner thereof.
  • the present application provides a chip system, which includes a processor and may further include a memory, configured to implement corresponding functions in the above method.
  • the system-on-a-chip may consist of chips, or may include chips and other discrete devices.
  • A media file transmission system is provided, including the sending end device as described in the third aspect and the receiving end device as described in the fourth aspect, and has the functions of the above first and second aspects and any possible implementations thereof.
  • FIG. 1 is a schematic diagram of a scenario of media file transmission provided by an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a mobile phone provided in an embodiment of the present application.
  • FIG. 4 is a schematic framework diagram of a system for transmitting media files provided by an embodiment of the present application.
  • FIG. 5 is a schematic framework diagram of another system for transmitting media files provided by an embodiment of the present application.
  • FIG. 6 is a schematic flow diagram of a method for media file transmission provided in an embodiment of the present application.
  • FIG. 7 is a schematic flow chart of another method for media file transmission provided by the embodiment of the present application.
  • FIG. 8 is a schematic diagram of an AR/VR scene provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a media file transmission scenario provided by an embodiment of the present application.
  • FIG. 10 is a schematic framework diagram of another system for transmitting media files provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a device for transmitting media files provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of another device for transmitting media files provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of a sending end device provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of another device for transmitting media files provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of a receiver device provided by an embodiment of the present application.
  • Words such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. Those skilled in the art can understand that words such as "first" and "second" do not limit the number or execution order, nor do they necessarily indicate a difference. There is no sequence or magnitude relationship among the technical features described by "first" and "second".
  • Words such as "exemplary" or "for example" are used to present examples, illustrations, or descriptions. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner for ease of understanding.
  • "At least one" may also be described as one or more, and "multiple" may be two, three, four, or more, which is not limited in this application.
  • the network architecture and scenarios described in the embodiments of the present application are for more clearly illustrating the technical solutions of the embodiments of the present application, and do not constitute limitations on the technical solutions provided by the embodiments of the present application.
  • With the evolution of the network architecture and the emergence of new business scenarios, the technical solutions provided by the embodiments of the present application are also applicable to similar technical problems.
  • a media file refers to a file containing audio and video content.
  • Bare stream refers to audio and video content that can be played directly.
  • the media files obtained by the electronic device from the local or the server can be encrypted and encoded audio and video content, which can be decrypted and decompressed to obtain the bare code stream.
  • the scene of media file transmission is shown in Figure 1.
  • The communication unit of the sending end device (a terminal device such as a mobile phone or a personal computer) obtains the media file (a compressed media stream), which is decoded by the decoder to obtain the bare code stream and metadata; the rendering unit then renders the bare code stream according to the metadata, and the rendered bare code stream can optionally be displayed and played directly on the sending end device.
  • the sender device obtains the playable bare code stream of the rendered media file, it directly passes through the media interface (such as high definition multimedia interface (high definition multimedia interface, HDMI), universal serial bus (universal serial bus, USB) and other media transmission interfaces. ), transmitted to the receiving end device (such as smart TV, car display, etc.) for display and playback.
  • the media interface such as high definition multimedia interface (high definition multimedia interface, HDMI), universal serial bus (universal serial bus, USB) and other media transmission interfaces.
  • The communication unit of the sending end device may acquire media file data from a network or from a local storage unit. If the media file data is digital-copyright-protected content, a de-DRM operation is also required to obtain the decoded bare code stream.
  • In this process, the media file data transmitted over the media interface is fully decoded, so the amount of data is large and the occupied bandwidth is high.
  • the computing and media processing capabilities of the receiving end devices are also constantly improving, and the current media file transmission process does not make full use of the capabilities of the receiving end devices.
  • the present application provides a media file transmission method, which can be applied to the process of transmitting media files between the sending end device and the receiving end device.
  • multiple transmission modes are supported between the sending end device and the receiving end device.
  • Capability information of the sending and receiving parties is used to determine that the sending end device performs some or all of the multiple operations that restore the media file to be playable, and that the receiving end device performs the remaining operations.
  • In different transmission modes, the operations that the sending end device executes among the multiple operations that restore the media file to be playable are different. In this way, during the transmission of media files, the computing and media processing capabilities of the receiving end device can be used to reduce the amount of data on the transmission interface between the sending and receiving ends and save bandwidth.
  • In addition, the solution provided by this application allows the de-DRM operation to be performed in the receiving end device, so that high-quality transmission of copyright-protected media files can be achieved.
  • Further, the sending end device can transparently transmit the media file data to the receiving end device, which decodes, de-DRMs (optionally), and renders the media file, and then displays and plays it.
  • Compared with the current approach, in which the sending end device processes the media file data until it is playable and then transmits it to the receiving end device, the amount of data transmitted on the media interface is greatly reduced, which saves transmission bandwidth.
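The division of labor described above can be sketched as a simple model: the operations that restore a media file to a playable state form an ordered pipeline, and a split point decides how many of them the sending end device performs. The operation names and the `split_pipeline` helper below are illustrative, not part of this application.

```python
# Hypothetical pipeline of operations that restore a media file to playable.
OPS = ["decode", "de_drm", "render"]

def split_pipeline(ops, split):
    """Return (sender_ops, receiver_ops): the sending end device performs the
    first `split` operations and the receiving end device performs the rest."""
    return ops[:split], ops[split:]

# split == len(ops): the sender does everything (first transmission mode)
# 0 < split < len(ops): the work is shared (second transmission mode)
# split == 0: transparent transmission (third transmission mode)
```

The less work the sender performs before the media interface, the smaller the transmitted data, since compressed content crosses the interface instead of a decoded bare code stream.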
  • the communication system may include a sending end device 201 and a receiving end device 202 .
  • the sending end device 201 transmits media files with the receiving end device 202 through the media interface 203 .
  • the communication system shown in FIG. 2 may be a screen projection scenario, a screen mirroring scenario, or a scenario of transmitting media files at a user end in an AR/VR scenario.
  • the embodiments of the present application do not limit the application scenarios of the solutions provided in this application.
  • The sending end device 201 may be a mobile phone, a tablet computer, a desktop computer, a laptop, a handheld computer, a cellular phone, a set-top box, or another device with mobile communication capabilities.
  • the receiver device 202 may be a TV, a mobile phone, a tablet computer, a desktop, a laptop, a handheld computer, a vehicle display, a head-mounted display device, a projector, and other devices capable of displaying and playing.
  • the embodiment of the present application does not specifically limit the specific product forms of the above-mentioned sending end device 201 and receiving end device 202 .
  • the media interface 203 is a physical connection between the sending end device 201 and the receiving end device 202 .
  • the media interface 203 may be in the form of a wired electrical signal interface, a wired optical signal interface, or a wireless signal interface, and the embodiment of the present application does not limit the type of the media interface 203 .
  • the media interface 203 may be connected to the output interface of the sending end device 201 and connected to the input interface of the receiving end device 202 .
  • the output interface in the sending end device 201 and the input interface in the receiving end device 202 can be loaded in the chip.
  • the sending end device 201 is a mobile phone
  • the receiving end device 202 is a smart TV.
  • the mobile phone sends the video media file played by the mobile phone to the smart TV through the screen projection operation.
  • the mobile phone 100 is used as an example to introduce the transmitting device and the receiving device provided in the embodiment of the present application.
  • The mobile phone 100 shown in FIG. 3 is merely an example; the mobile phone may have more or fewer components than shown, may combine two or more components, or may have a different component configuration.
  • the various components shown in Figure 3 may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the mobile phone 100 may specifically include: a processor 101, a radio frequency (radio frequency, RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, wireless fidelity (wireless fidelity, WI-FI) device 107, positioning device 108, audio circuit 109, peripheral interface 110, power supply system 111, fingerprint reader 112 and other components. These components may communicate via one or more communication buses or signal lines (not shown in Figure 3).
  • Each component of the mobile phone 100 is specifically introduced below in conjunction with FIG. 3 :
  • The processor 101 is the control center of the mobile phone 100. It uses various interfaces and lines to connect the various parts of the mobile phone 100, and executes the various functions of the mobile phone 100 and processes data by running or executing the application programs (Apps) stored in the memory 103 and calling the data and instructions stored in the memory 103.
  • the processor 101 may include one or more processing units; the processor 101 may also integrate an application processor and a modem processor. Among them, the application processor mainly handles the operating system, user interface and application programs. The modem processor primarily handles wireless communications. It can be understood that the foregoing modem processor may not be integrated into the processor 101 .
  • the processor 101 may acquire a media file, and perform operations such as decoding, de-DRM, and rendering. Alternatively, the processor 101 may transparently transmit the media file to other devices, or send a playable media file, or send a partially processed media file.
  • the radio frequency circuit 102 can be used for receiving and sending wireless signals during sending and receiving information or talking. Specifically, the radio frequency circuit 102 may receive the downlink data from the base station and send it to the processor 101 for processing. In addition, data related to uplink is sent to the base station. Generally, the radio frequency circuit 102 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit 102 can also communicate with other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, Email, Short Message Service, etc.
  • the memory 103 is used to store application programs and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running the application programs and data stored in the memory 103 .
  • the memory 103 mainly includes an area for storing programs and an area for storing data.
  • the stored program area can store an operating system and at least one application program required by a function (such as a sound playing function, an image playing function, etc.).
  • the storage data area can store data (such as audio data, phone book, etc.) created according to the use of the mobile phone 100 .
  • the memory 103 may include a high-speed random access memory, and may also include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • The memory 103 can store various operating systems, such as the operating system developed by Apple Inc., the operating system developed by Google, and so on.
  • the touch screen 104 may include a touch-sensitive surface 104-1 and a display 104-2.
  • The touch-sensitive surface 104-1 (such as a touch panel) can collect touch events performed by the user of the mobile phone 100 on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object on or near the touch-sensitive surface 104-1), and can send the collected touch information to other components such as the processor 101.
  • a user's touch event near the touch-sensitive surface 104-1 may be called a floating touch.
  • the hovering touch may mean that the user does not need to directly touch the touchpad in order to select, move or drag objects (such as icons, etc.), but the user only needs to be near the mobile terminal to perform desired functions.
  • the terms "touch”, “contact”, etc. do not imply direct contact with the touch screen, but contact near or close to it.
  • the touch-sensitive surface 104-1 capable of floating touch can be implemented by capacitive, infrared light sensing, ultrasonic waves, and the like.
  • the touch-sensitive surface 104-1 may include two parts, a touch detection device and a touch controller.
  • The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 101. The touch controller can also receive and execute instructions sent by the processor 101.
  • the touch-sensitive surface 104-1 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • a display (also referred to as a display screen) 104 - 2 may be used to display information entered by or provided to a user and various menus of the handset 100 .
  • the display 104-2 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the touch-sensitive surface 104-1 can be covered on the display 104-2, and when the touch-sensitive surface 104-1 detects a touch event on or near it, it is sent to the processor 101 to determine the type of the touch event, and then the processor 101 may provide a corresponding visual output on the display 104-2 according to the type of the touch event.
  • Although the touch-sensitive surface 104-1 and the display screen 104-2 are shown as two independent components that realize the input and output functions of the mobile phone 100, in some embodiments the touch-sensitive surface 104-1 may be integrated with the display screen 104-2 to realize the input and output functions of the mobile phone 100.
  • the touch screen 104 is formed by stacking multiple layers of materials. In the embodiment of the present application, only the touch-sensitive surface (layer) and the display screen (layer) are shown, and other layers are not described in the embodiment of the present application.
  • The touch-sensitive surface 104-1 may cover the display 104-2, and the size of the touch-sensitive surface 104-1 may be larger than that of the display screen 104-2, so that the display screen 104-2 is entirely covered by the touch-sensitive surface 104-1. Alternatively, the touch-sensitive surface 104-1 may be arranged on the front of the mobile phone 100 in the form of a full panel, so that any touch by the user on the front of the mobile phone 100 can be sensed by the mobile phone 100, realizing a full touch experience on the front of the mobile phone 100.
  • the touch-sensitive surface 104-1 is configured on the front of the mobile phone 100 in the form of a full panel
  • the display screen 104-2 can also be configured in the form of a full panel on the front of the mobile phone 100, so that on the front of the mobile phone 100 A frameless structure can be realized.
  • the mobile phone 100 may also include a bluetooth device 105 for implementing data exchange between the mobile phone 100 and other short-distance mobile terminals (such as mobile phones, smart watches, etc.).
  • the Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip or the like.
  • Cell phone 100 may also include at least one sensor 106, such as a light sensor, a motion sensor, and other sensors.
  • The light sensor can include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display on the touch screen 104 according to the brightness of the ambient light, and the proximity sensor can turn off the power of the display when the mobile phone 100 is moved to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), and can detect the magnitude and direction of gravity when stationary. It can be used to identify the attitude of the mobile phone 100 (for example, horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). Other sensors, such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, may also be configured on the mobile phone 100 and are not described in detail here.
  • the mobile phone 100 may also have a fingerprint identification function.
  • the fingerprint reader 112 can be configured on the back of the mobile phone 100 (such as the bottom of the rear camera), or the fingerprint reader 112 can be configured on the front of the mobile phone 100 (such as the bottom of the touch screen 104, such as on the home screen key of the mobile phone 100).
  • the fingerprint recognition function can also be implemented by configuring the fingerprint reader 112 in the touch screen 104 , that is, the fingerprint reader 112 can be integrated with the touch screen 104 to realize the fingerprint recognition function of the mobile phone 100 .
  • the fingerprint reader 112 may be configured in the touch screen 104 , may be a part of the touch screen 104 , or may be configured in the touch screen 104 in other ways.
  • the fingerprint reader 112 can also be implemented as a full-panel fingerprint reader, therefore, the touch screen 104 can be regarded as a panel that can collect fingerprints at any position.
  • the fingerprint reader 112 can process the collected fingerprints.
  • the fingerprint identifier 112 may perform fingerprint verification and other processing on the collected fingerprints.
  • the fingerprint identifier 112 may also send the processing result of the fingerprint verification (such as whether the fingerprint verification is passed) to the processor 101, so that the processor 101 makes a corresponding response according to the received fingerprint verification result.
  • the fingerprint identifier 112 may also send the collected fingerprint to the processor 101, so that the processor 101 processes the fingerprint (for example, fingerprint verification, etc.).
  • the main component of the fingerprint reader 112 in the embodiment of the present application is a fingerprint sensor, which can adopt any type of sensing technology, including but not limited to optical, capacitive, piezoelectric or ultrasonic sensing technologies.
  • the WI-FI device 107 is used to provide the mobile phone 100 with network access following WI-FI-related standard protocols.
  • The mobile phone 100 can access a WI-FI access point through the WI-FI device 107, thereby helping users send and receive e-mails, browse web pages, and access streaming media; it provides users with wireless broadband Internet access.
  • the WI-FI device 107 can also serve as a WI-FI wireless access point, and can provide WI-FI network access for other mobile terminals.
  • the positioning device 108 is configured to provide a geographic location for the mobile phone 100 . It can be understood that the positioning device 108 may specifically be a receiver of a positioning system such as a global positioning system (global positioning system, GPS) and a Beidou satellite navigation system. After receiving the geographic location sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or to the memory 103 for storage. In some other embodiments, the positioning device 108 may be an assisted global positioning system (AGPS) receiver. AGPS is an operation mode for GPS positioning with certain assistance.
  • Using the AGPS system as an assistance server, the positioning device 108 can obtain positioning assistance by communicating with an auxiliary positioning server (such as the positioning server of the mobile phone 100), which assists the positioning device 108 in completing ranging and positioning services.
  • the audio circuit 109 , the speaker 113 , and the microphone 114 can provide an audio interface between the user and the mobile phone 100 .
  • The audio circuit 109 can transmit the electrical signal converted from the received audio data to the speaker 113, and the speaker 113 converts it into a sound signal for output. Conversely, the microphone 114 converts the collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data; the audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
  • the peripheral interface 110 is used to provide various interfaces for external input/output devices (such as keyboard, mouse, external display, external memory, SIM card, etc.). For example, it is connected to a mouse through a universal serial bus interface, and is connected to a subscriber identity module (Subscriber Identity Module, SIM) card provided by a telecom operator through a metal contact on a subscriber identity module card slot.
  • the peripheral interface 110 may be used to couple the aforementioned external input/output peripherals to the processor 101 and the memory 103 .
  • the mobile phone 100 can also include a power supply device 111 (such as a battery and a power management chip) for supplying power to various components.
  • The battery can be logically connected to the processor 101 through the power management chip, so that the power supply device 111 can implement functions such as charging management, discharging management, and power consumption management.
  • the mobile phone 100 may also include a camera (front camera and/or rear camera), a flash, a micro projection device, a near field communication (near field communication, NFC) device, etc., which will not be described in detail here.
  • The memory 103 of the mobile phone 100 can store an operating system.
  • For example, the operating system may be a Linux-based mobile device operating system that works with the above hardware in the mobile phone 100 to realize various functions.
  • Regarding the software architecture of the stored operating system, it should be noted that the embodiment of this application only uses one operating system as an example to illustrate the software environment required by the mobile terminal to implement the technical solution of this embodiment; those skilled in the art can understand that the embodiment of the present application can also be implemented with other operating systems.
  • the present application provides a media file transmission method, which is applied in the process of sending a media file from a sending end device to a receiving end device.
  • The media file described in this application may include a bare-code media file to be displayed on the sending end device; or it may include compressed content to be displayed on the sending end device; or it may include a compressed network media file acquired by the sending end device.
  • the sending end device and/or the receiving end device supports the first transmission mode, the second transmission mode and the third transmission mode.
  • the sending device restores the media file to be playable, and the receiving device receives the playable media file.
  • the sending end device performs a part of the operation of restoring the media file to be playable, and the receiving end device performs part of the operation of restoring the media file to be playable.
  • the operation of restoring the media file to be playable includes A, B, C, and D (A, B, C, and D do not include display and playback operations)
  • The second transmission mode may include: the sending end device executes A, and the receiving end device executes B, C, and D; or the sending end device executes A and B, and the receiving end device executes C and D; or the sending end device executes A, B, and C, and the receiving end device executes D.
  • the sending device transparently transmits the media file, and the receiving device restores the media file to be playable.
  • the operation of restoring the media file to be playable includes: a decoding operation and a rendering operation.
  • The decoding operation includes a decompression operation, or the decoding operation includes a decompression operation and a de-DRM operation.
  • Each end device also includes a communication unit, a decoding unit, a rendering unit, and a display unit.
  • The first transmission mode can be mode 1 shown in Figure 4: the sending end device decodes and renders the media file, and sends the rendered bare code stream through the media interface to the display unit of the receiving end device for display. The transmission path of the media file in the first transmission mode is shown by the arrow marked with mode 1 in FIG. 4.
  • the second transmission mode can be the mode 2 shown in FIG. 4.
  • In the second transmission mode, the sending end device decodes the media file to obtain the bare code stream of the media file and the metadata used for audio and video rendering, encapsulates this content into a bit stream, and sends the bit stream to the receiving end device through the media interface. The receiving end device decapsulates the bit stream to obtain the bare code stream and the metadata, renders the bare code stream with the metadata in the rendering unit, and then displays and plays it through the display unit.
  • the transmission path of the media file in the second transmission mode may be shown by the arrow marked with mode 2 in FIG. 4 .
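As a rough illustration of the encapsulation step in the second transmission mode, the bare code stream and its rendering metadata could be packed into a single bit stream with length-prefixed fields. The on-wire layout below is an assumption made for this sketch; the application does not define a concrete format.

```python
import struct

def encapsulate(bare_stream: bytes, metadata: bytes) -> bytes:
    """Pack the bare code stream and the rendering metadata into one bit
    stream: two big-endian 32-bit lengths followed by the two payloads."""
    header = struct.pack(">II", len(bare_stream), len(metadata))
    return header + bare_stream + metadata

def decapsulate(bitstream: bytes):
    """Recover (bare_stream, metadata) on the receiving end device."""
    n_bare, n_meta = struct.unpack(">II", bitstream[:8])
    bare = bitstream[8:8 + n_bare]
    meta = bitstream[8 + n_bare:8 + n_bare + n_meta]
    return bare, meta
```

The receiving end device would feed `bare` to its rendering unit and use `meta` as the audio and video rendering metadata.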
  • the third transmission mode may be the mode 3 shown in FIG. 4.
  • In the third transmission mode, the sending end device encapsulates the compressed media file into a bit stream and sends it to the receiving end device through the media interface. The receiving end device decapsulates the bit stream to obtain the compressed media file, and its decoding unit decodes the compressed media file to obtain the bare code stream of the media file and the metadata used for audio and video rendering. After the rendering unit renders the bare code stream with the metadata, the display unit displays and plays it.
  • the transmission path of the media file in the third transmission mode may be shown by the arrow marked with mode 3 in FIG. 4 .
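The three modes of Figure 4 differ mainly in what crosses the media interface. A minimal summary, with illustrative function and payload names:

```python
def interface_payload(mode: int) -> str:
    """What each transmission mode of Figure 4 places on the media interface."""
    if mode == 1:
        # sending end device decodes and renders; only display remains
        return "rendered bare code stream"
    if mode == 2:
        # sending end device decodes; receiving end device renders
        return "bare code stream + rendering metadata"
    if mode == 3:
        # transparent transmission; receiving end device does everything
        return "compressed media file"
    raise ValueError("unsupported transmission mode")
```

Moving from mode 1 toward mode 3 shifts work to the receiving end device and shrinks the payload on the interface, which is the bandwidth saving the application aims at.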
  • the sending end device and/or the receiving end device can support the first transmission mode, the second transmission mode and the third transmission mode.
  • the embodiment does not limit the specific configuration process.
  • the connection method can be a wired electrical signal, a wired optical signal, or a wireless signal, which is not limited in this embodiment of the application.
  • The devices at both ends negotiate capabilities, so that the sending end device acquires first capability information. The first capability information may be used to indicate the media processing capability of the receiving end device communicating with the sending end device, or it may be used to indicate the media processing capabilities of both the sending end device and the receiving end device.
  • the media processing capability includes the processing capability of each media processing unit in the device for media files.
  • the media processing capability may include: one or more of codec capability, DRM solution capability, audio and video rendering capability, display playback capability, and interface encryption and decryption capability.
  • the codec capability may include: one or more of the types of compression protocols that can be decoded, and the resolution, frame rate, and number of bits of the media that can be decoded in real time.
  • The DRM solution capability may include one or more of: the supported DRM protocol types, and the media resolution, frame rate, bit number, and the like, for which the de-DRM operation can be performed in real time.
  • Audio and video rendering capabilities may include one or more of: the supported rendering protocol types (such as HDR and 3D Audio), and the resolution, frame rate, and number of bits that can be rendered in real time.
  • Display playback capabilities may include one or more of: playable resolution, frame rate, number of bits, display peak brightness, contrast, number and arrangement of audio playback channels, supported sound playback modes, etc.
  • Interface encryption and decryption capabilities may include one or more of: the supported interface encryption and decryption protocol types, and the media resolution, frame rate, and bit number on which interface encryption and decryption can be performed in real time.
  • the above media processing capability is only an example and does not constitute a specific limitation. In actual applications, the content of the media processing capability may be configured according to actual requirements.
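One way to picture the first capability information is as a structured record of the capabilities listed above. The field names and helper methods below are hypothetical; the application does not prescribe any particular encoding.

```python
from dataclasses import dataclass, field

@dataclass
class MediaCapability:
    """Hypothetical container for the first capability information."""
    codec_protocols: list = field(default_factory=list)      # e.g. decodable compression protocols
    drm_protocols: list = field(default_factory=list)        # supported DRM protocol types
    rendering_protocols: list = field(default_factory=list)  # e.g. HDR, 3D Audio
    interface_crypto: list = field(default_factory=list)     # interface encryption/decryption protocols
    max_resolution: str = ""                                 # e.g. "3840x2160", decodable in real time
    max_frame_rate: int = 0                                  # frames per second

    def supports_decoding(self) -> bool:
        return bool(self.codec_protocols)

    def supports_rendering(self) -> bool:
        return bool(self.rendering_protocols)
```

A sending end device could inspect such a record to decide which operations the receiving end device can take over.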
  • the sending end device may determine the transmission mode supported by the first capability information.
  • the transmission mode supported by the first capability information is one or more of the first transmission mode, the second transmission mode, and the third transmission mode.
  • Determining the transmission mode supported by the first capability information means that the sending end device determines, according to the media processing capability of the receiving end device indicated by the first capability information, which of the operations of restoring the media file to be playable are supported by the receiving end device, and then determines the transmission mode supported by the first capability information according to the operations supported by the receiving end device.
  • If the receiving end device supports all of the operations, the first capability information supports the first transmission mode, the second transmission mode, and the third transmission mode. If, among all the operations, one or more consecutive operations counted from the last operation are supported by the receiving end device, the first capability information supports the first transmission mode and the second transmission mode. If the receiving end device does not support the last operation among all the operations, the first capability information supports only the first transmission mode.
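This rule about consecutive operations counted from the last one can be sketched generically: count how long a suffix of the operation list the receiving end device supports, and map that count to the supported transmission modes. The function and operation names are illustrative.

```python
def supported_modes(ops, receiver_supports):
    """ops: ordered operations that restore the media file to playable.
    receiver_supports: the set of operations the receiving end device can
    perform. Returns the set of supported transmission modes (1, 2, 3)."""
    suffix = 0
    for op in reversed(ops):          # count supported ops from the last one
        if op in receiver_supports:
            suffix += 1
        else:
            break
    if suffix == 0:
        return {1}          # receiver cannot take over even the last operation
    if suffix == len(ops):
        return {1, 2, 3}    # receiver can perform every operation
    return {1, 2}           # receiver can take over a proper suffix
```

For example, with `ops = ["decode", "render"]`, a receiver that only renders yields modes {1, 2}, matching the decoding/rendering case described below.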
  • determining the transmission mode supported by the first capability information may include:
  • If the first capability information indicates that the receiving end device does not support the second operation, where the second operation is the last operation among the operations of restoring the media file to be playable, it is determined that the supported transmission modes only include the first transmission mode.
  • the first capability information indicates that the receiver device supports the second operation, it is determined that the transmission modes supported by the first capability information include the first transmission mode and the second transmission mode.
  • the first capability information indicates that the receiving device supports all operations in restoring the media file to be playable, it is determined that the transmission modes supported by the first capability information include the first transmission mode, the second transmission mode and the third transmission mode.
  • the operation of restoring the media file to be playable includes: a decoding operation and a rendering operation.
  • The decoding operation includes a decompression operation, or the decoding operation includes a decompression operation and a de-DRM operation.
  • Determining the transmission mode supported by the first capability information may include: if the first capability information indicates that the receiving end device does not support the rendering operation, determining that the transmission mode supported by the first capability information includes the first transmission mode; or, if the first capability information indicates that the receiving end device supports the rendering operation but does not support the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode and the second transmission mode; or, if the first capability information indicates that the receiving end device supports both the rendering operation and the decoding operation, determining that the transmission modes supported by the first capability information include the first transmission mode, the second transmission mode, and the third transmission mode.
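Building on the decision rules above, a sending end device might then pick one concrete mode, for example preferring to push work to the receiving end device so that less data crosses the media interface. That preference order is an assumption made here; the application only defines which modes the capability information supports.

```python
def choose_mode(receiver_can_decode: bool, receiver_can_render: bool) -> int:
    """Pick a transmission mode, preferring the one that moves the most
    processing to the receiving end device (an illustrative policy)."""
    if receiver_can_render and receiver_can_decode:
        return 3   # transparent transmission: compressed file on the interface
    if receiver_can_render:
        return 2   # sender decodes, receiver renders
    return 1       # sender restores the media file to playable itself
```

Other policies are possible; for example, a sender might prefer mode 1 when the receiver's real-time decoding capability is below the media file's resolution or frame rate.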
  • the sending device establishes a connection with the receiving device to transmit the media file
  • The above operations of establishing the connection, obtaining the first capability information, and determining the transmission mode supported by the first capability information may be performed in advance, before any media file is transmitted. Then, when a media file needs to be transmitted, the solution shown in FIG. 6 is executed.
  • The sending end device may also send attribute information of the media file to be transmitted to the receiving end device; the attribute information can be used to indicate transmission-related characteristics of the media file.
  • the attribute information may include but not limited to one or more of the following information: size, digital copyright protection or not, rendering requirements, and so on.
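As a hypothetical example of such attribute information (the keys mirror the examples above but are not a defined format):

```python
# Illustrative attribute information for a media file to be transmitted.
media_attributes = {
    "size_bytes": 734_003_200,   # size of the compressed media file
    "drm_protected": True,       # digital copyright protection or not
    "rendering": ["HDR10"],      # rendering requirements
}

def requires_de_drm(attrs: dict) -> bool:
    """True if the transmission must include a de-DRM operation somewhere."""
    return bool(attrs.get("drm_protected"))
```

The receiving end device could use such attributes, together with the negotiated capabilities, to prepare the units on its side of the transmission channel.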
  • The transmission path of a media file may be called a transmission channel. A transmission channel corresponds to a transmission mode and includes the units that perform each operation of restoring the media file to a playable state. Therefore, the units in the transmission channel corresponding to the first transmission mode are all in the sending end device; some of the units in the transmission channel corresponding to the second transmission mode are in the sending end device and the others are in the receiving end device; and the units in the transmission channel corresponding to the third transmission mode are all in the receiving end device.
  • the transmission channel corresponding to the first transmission mode includes a unit in the sending device that performs an operation of restoring the media file to a playable state.
  • The transmission channel corresponding to the second transmission mode includes the units in the sending end device that perform the operations preceding the first operation among the operations for restoring the media file to a playable state, and the units in the receiving end device that perform the first operation and the operations after it.
  • The transmission channel corresponding to the third transmission mode includes the units in the receiving end device that perform the operations of restoring the media file to a playable state.
  • The operations for restoring the media file to a playable state include a decoding operation and a rendering operation.
  • The decoding operation includes a decompression operation, or the decoding operation includes a decompression operation and a de-DRM operation.
  • The transmission channel corresponding to the first transmission mode includes the decoding unit and the rendering unit in the sending end device;
  • the transmission channel corresponding to the second transmission mode includes the decoding unit in the sending end device and the rendering unit in the receiving end device;
  • the transmission channel corresponding to the third transmission mode includes the decoding unit and the rendering unit in the receiving end device.
  • one transmission mode may correspond to multiple transmission channels.
  • Since the multiple transmission channels corresponding to one transmission mode work on the same principle, they will not be described one by one in the subsequent content.
  • The sending end device 51 includes a video receiving unit 511, a video decompression unit 512, a de-DRM unit 513, a media rendering unit 514, an interface encryption unit 515, and an output interface 516.
  • The receiving end device 52 includes an input interface 521, a video decompression unit 522, a de-DRM unit 523, a media rendering unit 524, an interface decryption unit 525, and a display and playback unit 526. It should be noted that, in the system shown in FIG. 5, only one unit performing each operation is shown in the sending end device and the receiving end device; therefore, one transmission mode corresponds to one transmission channel.
  • the video receiving unit 511 is configured to receive video stream information from a video source through a wired or wireless network, bus or interface.
  • the video decompression unit 512 is used to decompress the media files on the sending end device.
  • the de-DRM unit 513 (optional) is configured to de-DRM the DRM-encrypted media file in the sending device.
  • the media rendering unit 514 (optional) is configured to perform rendering processing such as HDR and three-dimensional sound on the video.
  • the interface encryption unit 515 (optional) is used to perform interface encryption on the video signal.
  • the output interface 516 is used to convert media files into physical layer signals through interface coding, modulation, etc., and send them to the transmission channel.
  • the transmission channel may be a wired electric signal, an optical signal, or a radio signal, and the like.
  • the transmission channel can aggregate the original audio and video data signals, compressed audio and video signals and other data signals, as well as control signals and handshake signals.
  • The input interface 521 is used to receive physical layer signals from the transmission channel and to perform operations such as demodulation and interface decoding to recover the media file.
  • the interface decryption unit 525 (optional) is used to perform interface decryption on the received media file.
  • the media rendering unit 524 (optional) is used to perform rendering processing such as HDR and three-dimensional sound on the video.
  • the display and playback unit 526 is used for display and playback of playable media files.
  • the video decompression unit 522 is used to decompress the media files on the receiving end device.
  • the de-DRM unit 523 (optional) is used to de-DRM the DRM-encrypted media file at the receiving end.
  • the sending end device 51 may also include a content request unit 517, configured to request a corresponding video source according to application requirements and capability negotiation results.
  • the sending end device 51 may also include a capability negotiation unit 518
  • the receiving end device 52 may also include a capability negotiation unit 527
  • The capability negotiation units 518/527 are used to negotiate the display, rendering, decompression, interface encryption and decryption, de-DRM, and other capabilities of the two ends to obtain the first capability information.
  • The first transmission mode corresponds to transmission channel 1, which includes the video decompression unit 512, the de-DRM unit 513, the media rendering unit 514, the interface encryption unit 515, the output interface 516, the input interface 521, the interface decryption unit 525, and the display and playback unit 526.
  • The second transmission mode corresponds to transmission channel 2, which includes the video decompression unit 512, the de-DRM unit 513, the interface encryption unit 515, the output interface 516, the input interface 521, the interface decryption unit 525, the media rendering unit 524, and the display and playback unit 526.
  • The third transmission mode corresponds to transmission channel 3, which includes the output interface 516, the input interface 521, the video decompression unit 522, the de-DRM unit 523, the media rendering unit 524, and the display and playback unit 526.
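The three channel compositions above can be summarized in a small table (illustrative only; the dictionary, the unit identifiers, and the helper function are assumptions built from the reference numerals in the text):

```python
# Ordered units each transmission channel of FIG. 5 passes through.
# 51x units sit in the sending end device, 52x units in the receiving end device.
CHANNELS = {
    1: ["video_decompression_512", "de_drm_513", "media_rendering_514",
        "interface_encryption_515", "output_interface_516",
        "input_interface_521", "interface_decryption_525",
        "display_playback_526"],
    2: ["video_decompression_512", "de_drm_513",
        "interface_encryption_515", "output_interface_516",
        "input_interface_521", "interface_decryption_525",
        "media_rendering_524", "display_playback_526"],
    3: ["output_interface_516", "input_interface_521",
        "video_decompression_522", "de_drm_523",
        "media_rendering_524", "display_playback_526"],
}

def units_on_receiver(channel: int) -> list:
    """Units of the given channel that are located in the receiving end device."""
    receiver_ids = ("521", "522", "523", "524", "525", "526")
    return [u for u in CHANNELS[channel] if u.endswith(receiver_ids)]
```

As the mapping shows, channel 1 keeps all restore operations at the sender, channel 2 splits them, and channel 3 moves them all to the receiver.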
  • the method for media file transmission may include:
  • The sending end device acquires the media file to be transmitted according to the first capability information.
  • the sending end device may directly acquire the media file to be transmitted, and there is no limitation on the features of the acquired media file.
  • The sending end device may determine the transmission modes supported by the first capability information according to the obtained first capability information, then determine the transmission channels corresponding to those transmission modes, and then analyze and record the third capability information of each such transmission channel, where the third capability information is used to indicate the highest or lowest capability of the transmission channel for transmitting media files.
  • The third capability information may include one or more of the following: resolution, frame rate, bit depth, whether DRM encryption is applied, the rendering capability supported by the channel, and the like.
  • The sending end device requests, according to the transmission channels corresponding to the transmission modes supported by the first capability information, a media file that conforms to the third capability information of those transmission channels.
  • The sending end device may make this request, through the content request unit 517 in the architecture shown in FIG. 5, from a media file source (such as a network server or local storage, which may provide content resources in multiple formats/parameters).
  • The media file conforming to the transmission capability of the transmission channels corresponding to the transmission modes supported by the first capability information may refer to a media file that can be transmitted by the transmission channel with the weakest third capability information among those channels.
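One possible reading of this request step can be sketched as follows (a hedged illustration only; the field names `resolution`, `frame_rate`, `max_resolution`, and `max_frame_rate` are assumptions, not terms from the patent): from the stream slices a content source offers, request the best slice that does not exceed the third capability information of the chosen transmission channel.

```python
def pick_slice(slices, channel_cap):
    """slices: list of dicts with 'resolution' and 'frame_rate' (ints).
    channel_cap: dict giving the channel's maximum supported values
    (its third capability information)."""
    fitting = [s for s in slices
               if s["resolution"] <= channel_cap["max_resolution"]
               and s["frame_rate"] <= channel_cap["max_frame_rate"]]
    # Prefer the highest quality among the slices the channel can carry.
    return max(fitting, key=lambda s: (s["resolution"], s["frame_rate"]),
               default=None)
```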
  • the sending end device may acquire multiple media files from one or more content sources at the same time.
  • The media file acquired by the sending end device may include a bare-code media file to be displayed on the sending end device, a compressed media file of content to be displayed on the sending end device, or a compressed network media file obtained by the sending end device.
  • the media file described in the embodiment of the present application may be a code stream or a video signal, or in other forms, and the embodiment of the present application does not limit the type of the media file.
  • The media files to be transmitted acquired by the sending end device may include, but are not limited to, one or more of the following: media files locally stored by the sending end device, media files stored externally at the sending end or transmitted through an interface, media files from the network (such as media files of applications such as on-demand, live broadcast, video calls, and video conferences), and media files generated by local rendering or recording (such as games, screen recordings, and media files captured by cameras).
  • The sending end device determines, according to the rules, the first transmission channel for transmitting the media file from the transmission channels corresponding to the transmission modes supported by the first capability information.
  • The first transmission channel determined in this way may be used for all media files to be transmitted after the sending end device and the receiving end device establish the current connection.
  • S602 may be performed before S601.
  • the embodiment of the present application does not limit the execution order of each step in the solution, and can be configured according to actual needs.
  • In S602, the sending end device determines, according to the rules, the first transmission channel for transmitting the media file obtained in S601 to the receiving end device from the transmission channels corresponding to the transmission modes supported by the first capability information.
  • the first transmission channel determined in S602 refers to the transmission channel corresponding to a certain transmission mode supported by the first capability information.
  • When there are multiple units performing the same operation in the sending end device/receiving end device, multiple first transmission channels corresponding to a certain transmission mode may first be determined according to the rules in S602, and then the finally determined first transmission channel may be selected from the multiple first transmission channels according to a load balancing mechanism, a polling mechanism, or other methods.
  • The process of selecting one transmission channel from the multiple transmission channels corresponding to a transmission mode is not specifically limited.
  • Otherwise, the first transmission channel corresponding to a certain transmission mode determined according to the rules is the finally determined first transmission channel.
  • The third capability information is used to indicate the characteristics of the media files that each transmission channel supports transmitting; the transmission channel whose third capability information satisfies the rules is determined as the first transmission channel.
  • the above rules are used to indicate the needs of the user, and may be preset by the system or manually input by the user, and the method of obtaining the rules is not limited in this embodiment of the present application.
  • the above-mentioned rules may include one or more of the following contents: playing effect is given priority, computing resource consumption of the sending end device is the lowest, computing resource consumption of the receiving end device is the lowest, and interface transmission bandwidth usage is the lowest.
  • For example, when the rule is that interface transmission bandwidth usage is lowest, the transmission channel whose third capability information satisfies the rule is the one that transmits the smallest media file.
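The rule-based selection of S602 can be sketched as follows (an illustrative assumption throughout: the channel records, their cost fields, and the rule names are invented for the example and do not appear in the patent):

```python
def pick_channel(channels, rule):
    """channels: list of dicts with 'quality', 'tx_cost', 'rx_cost',
    'bandwidth'. rule: which of the four criteria above to optimize."""
    key = {
        "best_playback":     lambda c: -c["quality"],    # playing effect first
        "min_sender_load":   lambda c: c["tx_cost"],     # least sender compute
        "min_receiver_load": lambda c: c["rx_cost"],     # least receiver compute
        "min_bandwidth":     lambda c: c["bandwidth"],   # smallest media file
    }[rule]
    return min(channels, key=key)
```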
  • When there are multiple media files to be transmitted (simultaneously transmitted), in S602 the sending end device determines, according to the rules, the first transmission channel of each media file from the transmission channels corresponding to the transmission modes supported by the first capability information.
  • the method provided by this application may further include S602a.
  • the sending end device acquires the second capability information.
  • The second capability information is used to indicate the transmission capabilities of the sending end device and the receiving end device, and may indicate one or more of the following: the transmission capability of the interface between the sending end device and the receiving end device, the computing power of the sending end device, and the computing power of the receiving end device.
  • the method provided by the present application may further include S602b.
  • The sending end device judges whether the capability indicated by the second capability information can support the resources occupied by the multiple first transmission channels.
  • If the capability indicated by the second capability information can support the resources occupied by the multiple first transmission channels, S603 is performed; otherwise, S602c is performed first and then S603.
  • the sending end device adjusts the first transmission channel of the first media file until the capability indicated by the second capability information supports resources occupied by multiple first transmission channels.
  • the first media file is one or more of the multiple media files.
  • The method of selecting the first media file is not limited in this embodiment of the present application; it may be selected arbitrarily, or the media file occupying the most resources may be selected, or other methods may be used.
  • The first transmission channels of the multiple media files selected in S602 may be the same or different, which is not limited in this embodiment of the present application.
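The S602a-S602c loop above can be sketched as follows (a minimal sketch under stated assumptions: only interface bandwidth is modeled as the second capability information, and "adjusting" a file is assumed to mean switching it to channel 3, the compressed transparent-transmission channel, which is the cheapest in bandwidth):

```python
def fit_to_bandwidth(assignments, cost, max_bandwidth):
    """assignments: {file_id: channel}; cost[channel]: bandwidth one file
    occupies on that channel. Downgrade the costliest file to channel 3
    until the total fits within max_bandwidth (or nothing cheaper remains)."""
    def total():
        return sum(cost[ch] for ch in assignments.values())
    while total() > max_bandwidth:
        # Pick the first media file currently occupying the most bandwidth.
        worst = max(assignments, key=lambda f: cost[assignments[f]])
        if assignments[worst] == 3:
            break  # already on the cheapest channel; cannot adjust further
        assignments[worst] = 3
    return assignments
```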
  • The sending end device transmits the media file to the receiving end device through the first transmission channel via the units in the sending end device.
  • The sending end device performs the corresponding operations on the media file through the units of the first transmission channel that are in the sending end device, and then transmits the media file to the receiving end device through an output interface (also referred to as a media interface).
  • In the first transmission mode, the sending end device restores the media file to a playable state, performs interface encryption on the playable media file, and then transmits it to the receiving end device through the output interface.
  • In the second transmission mode, the sending end device performs the decompression operation and the de-DRM operation on the media file, then performs interface encryption on the raw code stream and metadata, and transmits the interface-encrypted raw code stream and metadata to the receiving end device through the output interface; the rendering operation is performed on the receiving end device.
  • In the third transmission mode, the sending end device performs transparent transmission through the output interface, and the operations for restoring the media file to a playable state are performed in the receiving end device.
  • When there are multiple media files, the first transmission channel for transmitting each media file is obtained in S602, and in S603 the sending end device transmits each media file to the receiving end device through the units, in the sending end device, of that media file's first transmission channel.
  • Each media file may carry time stamp information used for synchronously displaying the multiple media files on the receiving end device.
  • The receiving end device determines, according to the rules, the first transmission channel for transmitting the media file with the sending end device from the transmission channels corresponding to the transmission modes supported by the first capability information.
  • the first capability information, the transmission mode supported by the first capability information, and the transmission channel corresponding to the transmission mode supported by the first capability information have all been described in detail in the foregoing content, and will not be repeated here.
  • For the specific implementation of the receiving end device determining, according to the rules, the first transmission channel for transmitting the media file with the sending end device, reference may be made to the specific implementation of the sending end device determining the first transmission channel according to the rules in S602, which will not be repeated here.
  • the first transmission channel determined in S604 corresponds to the same transmission mode as the first transmission channel determined in S602.
  • The receiving end device acquires the playable media file through the units of the first transmission channel that are in the receiving end device.
  • The receiving end device receives, through an input interface (also called a media interface), the media file transmitted by the sending end device; the received media file is a playable media file or a non-playable media file, depending on the transmission mode between the sending end device and the receiving end device.
  • In the first transmission mode, the sending end device restores the media file to a playable state, performs interface encryption on the playable media file, and transmits it to the receiving end device through the output interface; the receiving end device receives it through the input interface and obtains the playable media file after interface decryption.
  • In the second transmission mode, the sending end device decompresses and de-DRMs the media file, then performs interface encryption on the raw code stream and metadata, and transmits the interface-encrypted raw code stream and metadata to the receiving end device through the output interface.
  • The receiving end device receives the encrypted raw code stream and metadata through the input interface, performs interface decryption, and then performs the rendering operation in the rendering unit according to the metadata to obtain the playable media file.
  • In the third transmission mode, after acquiring the media file, the sending end device transparently transmits it through the output interface.
  • The receiving end device receives the media file through the input interface, decompresses it through the decompression unit, and de-DRMs the decoded media file through the de-DRM unit to obtain the bare code stream and metadata; the rendering unit then performs the rendering operation according to the metadata to obtain the playable media file.
  • When there are multiple media files, the first transmission channel for transmitting each media file is obtained in S604, and in S605 the receiving end device acquires each corresponding playable media file through the units, in the receiving end device, of that media file's first transmission channel.
  • the receiving end device plays the playable media file.
  • the receiving end device plays the playable media file obtained in S605.
  • the media files are played synchronously according to the time stamp information carried in the media files.
  • With the media file transmission method provided by this application, multiple transmission modes are supported between the sending end device and the receiving end device. According to the capability information of the two parties, it is determined that the sending end device performs some or all of the multiple operations for restoring the media file to a playable state, and the receiving end device performs the remaining operations. In different transmission modes, the operations performed by the sending end device among these operations are different. In this way, during the transmission of media files, the computing and media processing capabilities of the receiving end device can be utilized to reduce the amount of data on the transmission interface between the two ends and save bandwidth.
  • Embodiment 1 is used to share the mobile communication capability of the sending-end device with the receiving-end device, and use the display and playback capability of the receiving-end device to obtain a better media playing experience.
  • the sending end device can be a mobile phone, tablet, wireless communication router, etc. with 5G and other mobile communication capabilities
  • the receiving end device can be a TV, vehicle display, PC, head-mounted display device, projector, etc.
  • the media file transmission system in this scenario can have the architecture shown in Figure 5.
  • the output interface, the transmission channel in the sending end device, and the input interface in the receiving end device constitute a transmission interface system, and the form of the transmission interface system can be wired transmission.
  • the video receiving unit 511 shown in FIG. 5 may be a wireless network receiving unit.
  • the media files mainly come from the network, including video on demand, live broadcast, video call, multi-party video conference and so on.
  • the network video source contains compressed video content.
  • Media files usually include video stream slices V1, V2, ..., Vn of various qualities; different stream slices usually correspond to different combinations of resolution, whether DRM encryption is applied, whether HDR or 3D audio metadata is included, and so on.
  • In Embodiment 1, after the sending end device establishes a connection with the receiving end device, the respective capability negotiation units negotiate, according to the user's content selection preference and the wireless network transmission status, the encoding/decoding, de-DRM, and rendering capabilities of the two parties, the encryption and decryption capabilities of the interface, and the user's selection of content, and obtain the first capability information.
  • The sending end device selects, according to the first capability information, the video stream slice VM (M belongs to 1 to n) most suitable for the current situation.
  • RrDec is also the maximum resolution supported by channel 3, and min(Rt, RtDec) is the maximum resolution supported by channel 1 and channel 2.
  • the transmission channel for transmitting the video stream slice VM is selected (that is, the first transmission channel is determined).
  • channel 3 can be selected to directly transmit the compressed code stream of the VM to the receiving device, saving resources.
  • Channel 1 can be selected so that Vx is decoded in the sending end device.
  • When RtDec is lower than R0, it means that the capability of the sending end device is weaker than that of the receiving end device, and channel 2 can be selected to transmit the media file and render it on the receiving end device.
  • When the sending end device cannot perform the corresponding DRM decryption but the receiving end has the corresponding DRM decryption capability, channel 3 is selected to transparently transmit the compressed code stream, and the receiving end device decompresses, de-DRMs, renders, and then displays it.
  • the computing resources of the receiving end device can be preferentially used to save the power consumption of the mobile phone end.
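One possible reading of the Embodiment 1 selection logic above can be sketched as follows. This is a hedged illustration only: the symbols r0, rt, rt_dec, and rr_dec follow the text's Rt/RtDec/RrDec notation, but the tie-breaking order and the rendering-capability comparison are assumptions, not the patent's definitive rule.

```python
def choose_channel(r0, rt, rt_dec, rr_dec, sender_render, receiver_render):
    """r0: resolution of the selected slice VM; rt: maximum bare-code
    resolution the interface can carry; rt_dec / rr_dec: decoding limits of
    the sending / receiving end; *_render: relative rendering capability."""
    if r0 <= rr_dec:
        return 3  # receiver can decode: transparently transmit compressed stream
    if r0 <= min(rt, rt_dec):
        # Sender must decode; render wherever the capability is stronger.
        return 2 if sender_render < receiver_render else 1
    return None  # no channel supports this slice: request a lower-quality one
```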
  • the capabilities of the capability negotiating unit and the content requesting unit may be used as service modules for distributed application integration.
  • The media file transmission system provided by this application offers a variety of transmission modes, and through capability negotiation integrates the computing resources and the decompression, decryption, rendering, and playback capabilities of the devices at both ends to achieve the optimal playing effect.
  • Using the transparent transmission mode shown in channel 3 of this application can save the computing power of the sending end device; even when the sending end device is unable to decompress and decrypt high-resolution media files, the communication capability of the sending end device can be shared with the receiving end device, relying on the receiving end device's stronger decompression, decryption, and other capabilities to process and play the media file.
  • The sending end device may be a mobile phone, tablet, PC, set-top box, game console, and so on.
  • the sending end device and the receiving end device form a media playback system as shown in Figure 5 through the transmission interface system, which supports simultaneous transmission of multiple media files.
  • the transmission interface system may be the output interface 516 in the sending end device, the transmission channel, and the input interface 521 in the receiving end device shown in FIG. 5 .
  • the form of the transmission interface system may be an electrical signal interface for wired transmission, an optical signal interface for optical fiber transmission, a radio signal interface, a wireless optical signal interface, and the like.
  • the source of the media file may be local storage, network server, mobile communication, audio and video rendered by applications such as a game engine at the sending end, and the like.
  • the multiple media files transmitted by the sending device to the receiving device may be media files from different angles of the same content in the AR/VR scene.
  • The sending end device can select two or more media files (which may be media files of multiple angles of the same content) according to the scheme provided by this application, select for each media file the transmission channel for transmitting it (any one of channel 1, channel 2, and channel 3 in FIG. 5), and transmit them separately.
  • the transmission channels of each media file can be the same or different.
  • the receiver device can receive two or more media files through the transmission interface system for display and playback.
  • For example, when the mobile phone transmits two channels of video to the AR/VR glasses at the same time and the bandwidth of the transmission interface system is not enough to carry two channels of bare-code media files, one channel can use channel 1 to transmit the bare code stream, and the other can use channel 3 to transmit the compressed data, which is decoded and rendered by the receiving end device, so as to make full use of the transmission bandwidth.
  • When the transmission interface system supports the transmission of two channels of bare-code media files but the computing power of the receiving end device and that of the sending end device can each render only one of the media files in real time, one channel can use channel 1 to render and transmit the bare code, and the other channel can transmit the bare-code data and metadata through channel 2 and render in the receiving end device, so as to make full use of the computing resources at both ends and achieve real-time playback.
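The two combination examples above can be generalized into a small feasibility search (illustrative assumptions throughout: the bandwidth numbers, the per-end rendering budgets of one stream each, and the channel-to-renderer mapping are invented for the example):

```python
from itertools import product

def feasible_combos(n_streams, bw_limit, bw, tx_render, rx_render,
                    tx_budget=1, rx_budget=1):
    """bw[ch]: bandwidth one stream occupies on channel ch.
    tx_render / rx_render: sets of channels whose rendering runs at the
    sending / receiving end. Returns all channel assignments that respect
    the interface bandwidth and each end's real-time rendering budget."""
    combos = []
    for combo in product((1, 2, 3), repeat=n_streams):
        if sum(bw[c] for c in combo) > bw_limit:
            continue  # interface cannot carry this combination
        if sum(c in tx_render for c in combo) > tx_budget:
            continue  # sender cannot render this many streams in real time
        if sum(c in rx_render for c in combo) > rx_budget:
            continue  # receiver cannot render this many streams in real time
        combos.append(combo)
    return combos
```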
  • Bit streams can be aggregated and packaged to produce an aggregated signal; the physical layer signal is obtained after modulating the bit stream or the aggregated signal and is transmitted through the physical channel in the transmission interface system.
  • Each bit stream is obtained after demodulation and decapsulation operations; after unpacking and channel decoding, the video stream signals v1-vm are recovered. The media files transmitted through channel 3 undergo processing such as decompression, de-DRM, and rendering as needed, and the media signals transmitted through channel 2 undergo rendering processing.
  • Each media file may be displayed in the form of multiple windows, picture-in-picture, multi-view, VR, and so on.
  • the display and playback unit of the receiving device buffers these signals, and uses information such as time stamps carried in the media files to align the time and then display and play synchronously.
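The synchronization step above can be sketched as follows (a minimal illustration; the buffering model of one frame per timestamp per stream is an assumption): the display and playback unit releases only those timestamps for which every buffered stream has a frame, so the streams stay time-aligned.

```python
def sync_timestamps(streams):
    """streams: {stream_name: {timestamp: frame}}. Returns, in order, the
    timestamps at which all streams can be displayed synchronously."""
    if not streams:
        return []
    # A timestamp is playable only if every stream has buffered a frame for it.
    common = set.intersection(*(set(frames) for frames in streams.values()))
    return sorted(common)
```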
  • the capabilities of the capability negotiation unit and the content requesting unit may be used as service modules for distributed application integration.
  • the media file transmission system provided by the present application provides multiple transmission modes, and multiple media files can be combined and selected in each transmission mode to realize an optimal transmission combination mode.
  • The compressed-data transparent transmission mode of channel 3, corresponding to the third transmission mode, occupies a low bandwidth.
  • real-time transmission of two-channel and multi-channel media can be realized under the condition of limited transmission bandwidth.
  • data processing requires high software and hardware resources.
  • This solution can effectively coordinate the computing resources at the sending and receiving ends to realize real-time processing of multi-channel media. Compared with the prior art, this solution can use bandwidth resources more flexibly, and coordinate the allocation of media processing capabilities and computing resources at both ends.
  • one or more receiver devices can be connected to multiple sender devices at the same time, and use multiple devices to acquire multi-channel or multi-modal video/image information to collaboratively complete complex machine vision tasks.
  • a sending device transmits multiple media files to a receiving device (a display device such as a TV, a vehicle host, etc.).
  • The sending end device and the receiving end device form a media playback system as shown in FIG. 10 through the transmission interface system, which supports the simultaneous transmission of multiple media files and supports mirroring the sending end device's own content to the receiving end device while delivering media files, so that it can be displayed in the form of a picture-in-picture or window, realizing the collaborative interaction of multiple screens.
  • the transmission interface system may be the output interface 516 in the sending end device, the transmission channel, and the input interface 521 in the receiving end device shown in FIG. 10 .
  • the form of the transmission interface system may be an electrical signal interface for wired transmission, an optical signal interface for optical fiber transmission, a radio signal interface, a wireless optical signal interface, and the like.
  • the source of the media file may be local storage, network server, mobile communication, audio and video rendered by applications such as a game engine at the sending end, and the like.
  • the display content of the sending device may be used as one of the media files to be transmitted (media file X).
  • Embodiment 3 mirrors the content displayed by the sending end device itself to the receiving end device while transmitting the one or more media files of Embodiment 1 and Embodiment 2, and displays it in the form of a window or a picture-in-picture, etc., realizing multi-screen collaborative display interaction.
  • the specific implementation of transmitting one or more media files from the sending device to the receiving device is the same as that in the first and second embodiments, and will not be repeated here.
  • the manner of transmitting the media file X in the third embodiment will be described below.
  • the media file X displayed by the sender device may be the raw (uncompressed) video information to be displayed in the sender device's display drive unit.
  • the sending end device may further include a display control unit 519 .
  • the display control unit 519 may transmit the media file X used for display in the sending device to the output interface 516 .
  • the sending end device may use the media rendering unit 514 to perform operations such as resolution adjustment and brightness adjustment on the media file X, and then transmit it to the receiving end device through channel 1.
  • the sending end device may further include a compression encoding unit 520 .
  • the display control unit 519 can transmit the media file X used for display to the compression encoding unit 520 for compression encoding, and then transmit the compressed media file X to the receiving end device through the channel 3 .
  • the multiple channels of media information transmitted in the three embodiments may be packaged and then aggregated and transmitted in the transmission channel.
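As one way to picture the packaged-and-aggregated transmission described in the bullet above, the sketch below multiplexes several per-channel packet streams onto a single transport link and splits them apart again at the receiving end. The 7-byte header layout (channel id, sequence number, payload length) and the function names are illustrative assumptions, not a format defined by the embodiments:

```python
import struct

# Per-packet header: channel id (1 byte), sequence number (2 bytes),
# payload length (4 bytes), big-endian -- 7 bytes total.
HEADER = struct.Struct(">BHI")

def mux(packets):
    """Interleave (channel_id, payload) tuples into one aggregated byte stream."""
    stream = bytearray()
    seq = {}
    for channel_id, payload in packets:
        n = seq.get(channel_id, 0)
        seq[channel_id] = n + 1
        stream += HEADER.pack(channel_id, n, len(payload)) + payload
    return bytes(stream)

def demux(stream):
    """Split the aggregated stream back into per-channel payload lists."""
    channels = {}
    offset = 0
    while offset < len(stream):
        channel_id, _n, length = HEADER.unpack_from(stream, offset)
        offset += HEADER.size
        channels.setdefault(channel_id, []).append(stream[offset:offset + length])
        offset += length
    return channels
```

The sequence number lets the receiving end detect packet loss or reordering per channel; real transmission interfaces would add framing and error handling beyond this sketch.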
  • the receiving end device can display other media files received on the display and playback unit of the receiving end device together with the screen-casting media file X in the form of picture-in-picture, multi-window, etc.
  • the sending end device may further include a display interaction unit 521 .
  • the sending end device can use the interactive capabilities such as the touch screen of the display interaction unit 521 to control the receiving end device.
  • the capabilities of the capability negotiating unit and the content requesting unit may be used as service modules for distributed application integration.
  • Embodiment 3 adds the capability of mirroring screen projection, uses windows or picture-in-picture to synchronously display the display content of the sending device, and realizes multi-screen collaborative interaction.
  • the foregoing mainly introduces the solutions provided by the embodiments of the present application from the perspective of the working principles of the sending end device and the receiving end device.
  • the above-mentioned sending-end device and the receiving-end device include hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the embodiment of the present application can divide the functional modules of the sending end device and the receiving end device provided by the application according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be divided into integrated in one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 11 shows a possible structural diagram of a media file transmission apparatus 110 deployed in the sending end device involved in the above embodiment.
  • the media file transmission apparatus 110 may be a functional module or a chip.
  • the media file transmission apparatus 110 may include: a first acquiring unit 1101 , a first determining unit 1102 , and a processing unit 1103 .
  • the first acquisition unit 1101 is used to execute the process S601 in FIG. 6 or FIG. 7;
  • the first determination unit 1102 is used to execute the process S602 in FIG. 6 or FIG. 7; the processing unit 1103 is used to execute the process S603 in FIG. 6 or FIG. 7 .
  • all relevant content of each step involved in the above-mentioned method embodiment can be referred to the function description of the corresponding function module, and will not be repeated here.
  • the media file transmission apparatus 110 may further include a second acquiring unit 1104 and a second determining unit 1105 .
  • the second acquiring unit 1104 is configured to execute the process S602a in FIG. 7 .
  • the second determining unit 1105 is configured to determine the transmission mode supported by the first capability information.
  • FIG. 13 shows a possible structural diagram of the sending end device 130 involved in the above embodiment.
  • the sending end device 130 may be the sending end device described in the foregoing method embodiments.
  • the sending end device 130 may include: a processing module 1301 and a communication module 1302 .
  • the processing module 1301 is used to control and manage the actions of the sending end device 130, and the communication module 1302 is used to communicate with other devices.
  • the processing module 1301 is configured to execute any one of the processes S601 to S603 in FIG. 6 or FIG. 7 , or the processes S602a to S602c in FIG. 7 .
  • the sending end device 130 may also include a storage module 1303 for storing program codes and data of the sending end device 130 .
  • the processing module 1301 may be the processor 101 in the physical structure shown in FIG. 3 , and may be a processor or a controller. For example, it may be a CPU, a general processor, DSP, ASIC, FPGA or other programmable logic devices, transistor logic devices, hardware components or any combination thereof. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processing module 1301 may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
  • the communication module 1302 may be the radio frequency circuit 102 in the physical structure shown in FIG. 3 .
  • the communication module 1302 may be a communication port, or may be a transceiver, a transceiver circuit, or a communication interface.
  • the above-mentioned communication interface may realize communication with other devices through the above-mentioned components having the function of sending and receiving.
  • the above-mentioned elements having the function of sending and receiving may be realized by an antenna and/or a radio frequency device.
  • the storage module 1303 may be the memory 103 in the physical structure shown in FIG. 3 .
  • the processing module 1301 is a processor
  • the communication module 1302 is a radio frequency circuit
  • the storage module 1303 is a memory
  • the sending end device 130 involved in FIG. 13 in the embodiment of the present application may be the sending end device shown in FIG. 3 .
  • the media file transmission device 110 or the sending end device 130 provided by the embodiment of the present application can be used to implement the corresponding functions in the methods implemented by the above embodiments of the present application.
  • FIG. 14 shows a possible structural diagram of a media file transmission apparatus 140 deployed in the receiving end device involved in the above embodiment.
  • the media file transmission device 140 may be a functional module or a chip.
  • the media file transmission apparatus 140 may include: a first determining unit 1401 , an acquiring unit 1402 and a playing unit 1403 .
  • the first determination unit 1401 is used to execute the process S604 in FIG. 6 or FIG. 7;
  • the acquisition unit 1402 is used to execute the process S605 in FIG. 6 or FIG. 7;
  • the playback unit 1403 is used to execute the process S606 in FIG. 6 or FIG. 7 .
  • all relevant content of each step involved in the above-mentioned method embodiment can be referred to the function description of the corresponding function module, and will not be repeated here.
  • FIG. 15 shows a possible structural diagram of the receiving end device 150 involved in the above embodiment.
  • the receiving end device 150 may be the receiving end device described in the foregoing method embodiments.
  • the receiver device 150 may include: a processing module 1501 and a communication module 1502 .
  • the processing module 1501 is used to control and manage the actions of the receiver device 150, and the communication module 1502 is used to communicate with other devices.
  • the processing module 1501 is configured to execute any one of the processes S604 to S606 in FIG. 6 or FIG. 7 .
  • the receiver device 150 may also include a storage module 1503 for storing program codes and data of the receiver device 150 .
  • the processing module 1501 may be the processor 101 in the physical structure shown in FIG. 3 , and may be a processor or a controller. For example, it may be a CPU, a general processor, DSP, ASIC, FPGA or other programmable logic devices, transistor logic devices, hardware components or any combination thereof. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processing module 1501 may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
  • the communication module 1502 may be the radio frequency circuit 102 in the physical structure shown in FIG. 3 .
  • the communication module 1502 may be a communication port, or may be a transceiver, a transceiver circuit, or a communication interface.
  • the above-mentioned communication interface may realize communication with other devices through the above-mentioned components having the function of sending and receiving.
  • the above-mentioned elements having the function of sending and receiving may be realized by an antenna and/or a radio frequency device.
  • the storage module 1503 may be the memory 103 in the physical structure shown in FIG. 3 .
  • the processing module 1501 is a processor
  • the communication module 1502 is a radio frequency circuit
  • the storage module 1503 is a memory
  • the receiving end device 150 involved in FIG. 15 of the embodiment of the present application may be the receiving end device shown in FIG. 3 .
  • the media file transmission device 140 or the receiving end device 150 provided by the embodiment of the present application can be used to implement the corresponding functions in the methods implemented by the above embodiments of the present application.
  • a computer-readable storage medium on which instructions are stored, and when the instructions are executed, the media file transmission method in the foregoing method embodiments is executed.
  • a computer program product containing instructions is provided; when the computer program product runs on a computer, the computer is caused to execute the media file transmission method in the above method embodiments.
  • An embodiment of the present application further provides a chip system, where the chip system includes a processor, configured to implement the technical methods of the embodiments of the present application.
  • the system-on-a-chip further includes a memory for storing necessary program instructions and data of the embodiments of the present application.
  • the system-on-a-chip further includes a memory, which is used for the processor to call the application program code stored in the memory.
  • the system-on-a-chip may consist of one or more chips, and may also include chips and other discrete devices, which is not specifically limited in this embodiment of the present application.
  • the steps of the methods or algorithms described in connection with the disclosure of this application can be implemented in the form of hardware, or can be implemented in the form of a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, and software modules can be stored in RAM, flash memory, ROM, erasable programmable read-only memory (erasable programmable ROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), registers, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • the ASIC may be located in the core network interface device.
  • the processor and the storage medium may also exist in the core network interface device as discrete components.
  • the memory may be coupled to the processor, for example, the memory may exist independently and be connected to the processor through a bus. Memory can also be integrated with the processor.
  • the memory may be used to store application program codes for executing the technical solutions provided by the embodiments of the present application, and the execution is controlled by the processor.
  • the processor is used to execute the application program code stored in the memory, so as to realize the technical solution provided by the embodiment of the present application.
  • the above-mentioned transmitting-end device/receiving-end device includes corresponding hardware structures and/or software modules for performing various functions.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the example units and algorithm steps described in the embodiments disclosed herein. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Professionals and technicians may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the embodiments of the present application.
  • the embodiment of the present application also provides a sending end device/receiving end device that implements the above method embodiments.
  • the sending end device/receiving end device can be divided into functional modules; for example, each function can be divided into a separate functional module, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • the disclosed system, device and method may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • the integrated unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk, and other various media capable of storing program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A media file transmission method and apparatus, relating to the communications field, which reduce the amount of data transmitted when transmitting a media file and thereby save transmission bandwidth. The method includes: acquiring a media file according to first capability information; determining, according to a rule, a first transmission channel for transmitting the media file from among the transmission channels corresponding to the transmission modes supported by the first capability information; and transmitting the media file to a receiving end device through the units of the first transmission channel that reside in the sending end device. The sending end device supports a first transmission mode, a second transmission mode and a third transmission mode: in the first transmission mode the sending end device restores the media file to a playable state, in the second transmission mode the sending end device performs some of the operations for restoring the media file to a playable state, and in the third transmission mode the sending end device passes the media file through transparently.

Description

Media file transmission method and apparatus
This application claims priority to Chinese Patent Application No. 202110600628.5, entitled "Media file transmission method and apparatus" and filed with the China National Intellectual Property Administration on May 31, 2021, which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of this application relate to the communications field, and in particular to a media file transmission method and apparatus.
Background
As users' expectations for audio and video experience keep rising, ultra-high-definition audio and video media file resources with effects such as high dynamic range (high-dynamic-range, HDR) and three-dimensional audio (3 dimensions audio, 3D Audio) are becoming increasingly abundant. The demand for copyright protection of high-quality content is also growing, and technologies such as digital rights management (digital rights management, DRM) and encrypted interface transmission are spreading rapidly. As a result, the decompression, DRM removal, and HDR and 3D Audio rendering performed when transmitting media files all require the support of substantial computing resources.
At present, when a media file is transmitted between electronic devices, the sending end device obtains the media file, decompresses it and removes DRM to obtain a raw stream (a stream that can be played directly), and then uses the transmission interface to transmit the uncompressed raw stream, or a losslessly lightly compressed raw stream, directly to the receiving end, where it is displayed and played.
Therefore, when media files are transmitted between electronic devices, the amount of transmitted data is large and the occupied transmission bandwidth is high; if multiple high-resolution media files are transmitted at the same time, the problem becomes even more pronounced.
Summary
The media file transmission method and apparatus provided in this application reduce the amount of data transmitted when transmitting a media file, thereby saving transmission bandwidth.
To achieve the above objective, this application adopts the following technical solutions:
According to a first aspect, a media file transmission method is provided. The method is applied to a sending end device, and the sending end device supports a first transmission mode, a second transmission mode and a third transmission mode. In the first transmission mode the sending end device restores the media file to a playable state; in the second transmission mode the sending end device performs some of the operations for restoring the media file to a playable state; in the third transmission mode the sending end device passes the media file through transparently. The transmission channel corresponding to a transmission mode comprises the units that, in that transmission mode, perform the operations of restoring the media file to a playable state. The method includes: acquiring a media file according to first capability information, where the first capability information indicates the media processing capability of the receiving end device, or indicates the media processing capabilities of the sending end device and the receiving end device; determining, according to a rule, a first transmission channel for transmitting the media file from among the transmission channels corresponding to the transmission modes supported by the first capability information; and transmitting, by the sending end device, the media file to the receiving end device through the units of the first transmission channel that reside in the sending end device.
With the media file transmission method provided in the embodiments of this application, multiple transmission modes are supported between the sending end device and the receiving end device. Based on the capability information of the two devices, it is determined that the sending end device performs some or all of the operations for restoring the media file to a playable state, while the receiving end device performs the remaining operations; the operations performed by the sending end device differ between transmission modes. Thus, when transmitting a media file, the computing and media processing capability of the receiving end device can be exploited to reduce the amount of data on the transmission interface between the two ends and save bandwidth.
In a possible implementation, the sending end device requests multiple media files, including: determining, according to the rule, a first transmission channel for transmitting each media file from among the transmission channels corresponding to the transmission modes supported by the first capability information; and transmitting each media file to the receiving end device through the units of the corresponding first transmission channel that reside in the sending end device. This implementation shows that when the sending end device requests multiple media files at the same time, a transmission channel is determined for each media file and each media file is transmitted to the receiving end device.
In a possible implementation, the method provided in this application may further include: acquiring second capability information, where the second capability information indicates one or more of the following capabilities: the transmission capability of the interface between the sending end device and the receiving end device, the computing capability of the sending end device, and the computing capability of the receiving end device. Correspondingly, after the first transmission channel for transmitting each media file is determined according to the rule, if the capability indicated by the second capability information cannot support the resources occupied by the multiple first transmission channels, the method further includes: adjusting the first transmission channel of a first media file until the capability indicated by the second capability information supports the resources occupied by the multiple first transmission channels, where the first media file is one or more of the multiple media files. This implementation shows that when the capability indicated by the second capability information cannot support the resources occupied by the multiple first transmission channels, the first transmission channel carrying the first media file is adjusted until it can, so that multiple media files can be transmitted simultaneously.
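A minimal sketch of the channel-adjustment loop described in this implementation: starting from the preferred first transmission channel of every media file, channels are re-chosen one file at a time until the combined load fits what the second capability information allows. Only an interface-bandwidth budget is modeled here; the dictionaries and the 'bandwidth' field are hypothetical stand-ins for the capability information, not structures defined by the embodiments:

```python
def fit_channels(selected, candidates, bandwidth_limit):
    """Adjust the chosen first transmission channel of individual media files
    until the interface can carry their combined load.

    selected:   {file_id: channel dict} -- initially preferred channel per file
    candidates: {file_id: [channel dicts]} -- channels its supported modes offer
    Each channel dict carries a hypothetical 'bandwidth' cost (e.g. Mbit/s).
    """
    chosen = dict(selected)
    while sum(ch["bandwidth"] for ch in chosen.values()) > bandwidth_limit:
        # Re-route the one file that offers the largest bandwidth saving.
        best = None
        for file_id, current in chosen.items():
            cheaper = [c for c in candidates[file_id]
                       if c["bandwidth"] < current["bandwidth"]]
            if not cheaper:
                continue
            alt = min(cheaper, key=lambda c: c["bandwidth"])
            saving = current["bandwidth"] - alt["bandwidth"]
            if best is None or saving > best[0]:
                best = (saving, file_id, alt)
        if best is None:
            raise RuntimeError("no channel assignment fits the interface")
        _, file_id, alt = best
        chosen[file_id] = alt
    return chosen
```

For example, if two files both prefer a raw-stream channel that together exceeds the link budget, the loop moves one of them onto a compressed (mode 2 or mode 3) channel before giving up.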
In a possible implementation, the transmission channel corresponding to the first transmission mode includes the units in the sending end device that perform the operations of restoring the media file to a playable state; the transmission channel corresponding to the second transmission mode includes the units in the sending end device that perform the operations before a first operation among the operations of restoring the media file to a playable state, and the units in the receiving end device that perform the first operation and the operations after it, where the first operation is any operation, among the operations of restoring the media file to a playable state, that the receiving end device supports; the transmission channel corresponding to the third transmission mode includes the units in the receiving end device that perform the operations of restoring the media file to a playable state. This implementation describes the media processing units, in the sending end device and the receiving end device, used by each of the three transmission modes.
In a possible implementation, determining, according to the rule, the first transmission channel for transmitting each media file from among the transmission channels corresponding to the transmission modes supported by the first capability information may specifically be implemented as: determining, according to the first capability information, third capability information of each transmission channel corresponding to the transmission modes supported by the first capability information, where the third capability information indicates the characteristics of the media files each transmission channel supports transmitting; and determining a transmission channel whose third capability information satisfies the rule as the first transmission channel. This implementation shows that after the transmission modes are determined, the transmission channel whose third capability information satisfies the rule is determined as the first transmission channel, so that the determined transmission channel meets the user's requirements.
In a possible implementation, the rule may include one or more of the following: playback effect first, lowest computing resource consumption on the sending end device, lowest computing resource consumption on the receiving end device, and lowest interface transmission bandwidth occupation.
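The four selection rules above (playback effect first, lowest sender compute, lowest receiver compute, lowest interface bandwidth) could be applied to a set of candidate transmission channels roughly as follows. The metric names 'quality', 'sender_load', 'receiver_load' and 'bandwidth' are hypothetical per-channel scores assumed for illustration, not fields defined by the embodiments:

```python
def pick_channel(channels, rule):
    """Select the first transmission channel from the mode-supported candidates.

    channels: list of dicts, one per candidate channel, each carrying
    hypothetical metrics: 'quality' (playback effect score), 'sender_load',
    'receiver_load', and 'bandwidth' (interface occupation).
    rule: which of the four selection rules to apply.
    """
    if rule == "playback_effect_first":
        # Best playback effect wins, regardless of resource cost.
        return max(channels, key=lambda c: c["quality"])
    metric = {
        "min_sender_compute": "sender_load",
        "min_receiver_compute": "receiver_load",
        "min_interface_bandwidth": "bandwidth",
    }[rule]
    return min(channels, key=lambda c: c[metric])
```

A fully rendered channel (mode 1) would typically score highest on quality but worst on interface bandwidth, while a passthrough channel (mode 3) inverts that trade-off, which is what makes a rule necessary at all.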
In a possible implementation, if the first capability information indicates that the receiving end device does not support a second operation, the second operation being the last of the operations for restoring the media file to a playable state, it is determined that the supported transmission modes include the first transmission mode; or, if the first capability information indicates that the receiving end device supports the second operation, it is determined that the supported transmission modes include the first transmission mode and the second transmission mode; or, if the first capability information indicates that the receiving end device supports all of the operations for restoring the media file to a playable state, it is determined that the supported transmission modes include the first, second and third transmission modes. This implementation shows that the transmission modes supported between the sending end device and the receiving end device are determined according to the receiving end device's media file processing capability.
In a possible implementation, the operations for restoring the media file to a playable state include a decoding operation and a rendering operation. The decoding operation includes a decompression operation, or includes a decompression operation and a DRM removal operation. The transmission channel corresponding to the first transmission mode includes the decoding unit and the rendering unit in the sending end device; the transmission channel corresponding to the second transmission mode includes the decoding unit in the sending end device and the rendering unit in the receiving end device; the transmission channel corresponding to the third transmission mode includes the decoding unit and the rendering unit in the receiving end device. Correspondingly, the method provided in this application may further include: if the first capability information indicates that the receiving end device does not support the rendering operation, determining that the transmission modes supported by the first capability information include the first transmission mode; or, if the first capability information indicates that the receiving end device supports the rendering operation but not the decoding operation, determining that the supported transmission modes include the first and second transmission modes; or, if the first capability information indicates that the receiving end device supports both the rendering operation and the decoding operation, determining that the supported transmission modes include the first, second and third transmission modes. This implementation shows that the transmission modes supported between the two devices are determined according to the specific media file processing operations supported by the receiving end device.
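The capability-to-mode mapping in this implementation can be stated compactly. The dictionary keys 'render' and 'decode' below are assumed boolean flags standing in for the first capability information; they are illustrative names, not part of the embodiments:

```python
def supported_modes(receiver_caps):
    """Derive the transmission modes the link supports from the receiving end
    device's capabilities (first capability information).

    Mode 1: sender decodes and renders (always available, since it asks
            nothing of the receiver beyond display and playback).
    Mode 2: sender decodes, receiver renders.
    Mode 3: sender passes the file through; receiver decodes and renders.
    """
    modes = [1]
    if receiver_caps.get("render"):
        modes.append(2)
        if receiver_caps.get("decode"):
            modes.append(3)
    return modes
```
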
In a possible implementation, the first capability information includes one or more of the following: the codec capability, DRM removal capability, interface encryption/decryption capability, audio/video rendering capability, and display and playback capability of the sending end device and the receiving end device.
In a possible implementation, the media file includes a raw-stream media file to be displayed in the sending end device, or the media file includes a media file obtained by compressing the content to be displayed in the sending end device.
第二方面,提供了一种媒体文件传输方法,该方法应用于接收端设备,接收端设备支持第一传输模式、第二传输模式和第三传输模式;第一传输模式为接收端设备接收可播放的媒体文件,第二传输模式由接收端设备执行将媒体文件恢复至可播放的操作中的部分操作,第三传输模式由接收端设备执行将媒体文件恢复至可播放的操作中的全部操作。该方法可以包括:接收端设备根据第一能力信息,从第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输媒体文件的第一传输通道;其中,第一能力信息用于指示发送端设备的媒体处理能力,或者,第一能力信息用于指示发送端设备和接收端设备的媒体处理能力;一个传输模式对应的传输通道包括一个传输模式下执行将媒体文件恢复至可播放的操作的单元;接收端设备通过第一传输通道在接收端设备中的单元,获取可播放的媒体文件,显示播放可播放的媒体文件。
本申请实施例提供的媒体文件传输方法,发送端设备与接收端设备间支持多种传输模式,根据发送端设备和接收端设备的能力信息,确定由发送端设备执行将媒体文件恢复为可播放的多个操作中的部分或全部,由接收端设备执行其余操作,不同传输模式中,发送端设备执行的将媒体文件恢复为可播放的多个操作中的操作不同。从而在媒体文件传输时,能够利用接收端设备的计算和媒体处理能力,降低收发端之间传输接口上的数据量,节约带宽。
在一种可能的实施方式中,发送端设备请求的媒体文件为多个,包括:从第一能力信息支持的传输模式对应的传输通道中,按照规则分别确定传输每个媒体文件的第 一传输通道;通过每个第一传输通道在发送端设备中的单元,分别向接收端设备传输对应的媒体文件。该实施方式说明了在发送端设备同时请求多个媒体文件时,为每个媒体文件确定传输通道,向接收端设备传输每个媒体文件。
在一种可能的实施方式中,本申请提供的方法还可以包括:获取第二能力信息,第二能力信息用于指示下述能力中的一项或多项:发送端设备与接收端设备间接口的传输能力、发送端设备的计算能力、接收端设备的计算能力。相应的,在按照规则分别确定传输每个媒体文件的第一传输通道之后,若第二能力信息指示的能力不支持多个第一传输通道占用的资源,本申请提供的方法还包括:将第一媒体文件的第一传输通道进行调整,直至第二能力信息指示的能力支持所述多个所述第一传输通道占用的资源。第一媒体文件为多个媒体文件中一个或多个。该实施方式说明了当第二能力信息指示的能力不支持多个第一传输通道占用的资源时,调整传输第一媒体文件的第一传输通道,直到第二能力信息指示的能力支持多个第一传输通道占用的资源,从而实现多个媒体文件的同时传输。
在一种可能的实施方式中,第一传输模式对应的传输通道,包括接收端设备中执行播放操作的单元;第二传输模式对应的传输通道,包括发送端设备执行将媒体文件恢复至可播放的操作中第一操作之前的操作的单元,以及接收端设备中执行将媒体文件恢复至可播放的操作中第一操作以及第一操作之后的操作的单元;第一操作为将媒体文件恢复至可播放的操作中接收端设备支持的任一操作;第三传输模式对应的传输通道,包括接收端设备中执行将媒体文件恢复至可播放的操作的单元,以及执行播放操作的单元。该实施方式说明了三种传输模式分别对应的所使用的发送端设备和接收端设备中的媒体处理单元。
在一种可能的实施方式中,上述从第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输每个媒体文件的第一传输通道,具体可以实现为:根据第一能力信息,确定第一能力信息支持的传输模式对应的每个传输通道的第三能力信息,第三能力信息用于指示每个传输通道支持传输的媒体文件的特征;确定第三能力信息满足规则的传输通道,作为第一传输通道。该实施方式说明了确定传输模式后,将第三能力信息满足规则的传输通道确定为第一传输通道,以使得确定的传输通道满足用户的需求。
在一种可能的实施方式中,上述规则可以包括下述内容中一项或多项:播放效果优先、发送端设备计算资源消耗最低、接收端设备计算资源消耗最低、接口传输带宽占用最低。
在一种可能的实施方式中,若接收端设备不支持第二操作,第二操作为将媒体文件恢复至可播放操作中最后一个操作,确定支持的传输模式包括第一传输模式;或者,若接收端设备支持第二操作,确定支持的传输模式包括第一传输模式和第二传输模式;或者,若接收端设备支持将媒体文件恢复至可播放的操作中所有操作,确定支持的传输模式包括第一传输模式、第二传输模式和第三传输模式。该实施方式说明了根据接收端设备对媒体文件的处理能力,确定发送端设备和接收端设备之间支持的传输模式。
在一种可能的实施方式中,将媒体文件恢复至可播放的操作包括:解码操作、渲染操作。解码操作包括解压缩操作,或者,解码操作包括解压缩操作和解DRM操作。 第一传输模式对应的传输通道包括发送端设备中的解码单元和渲染单元;第二传输模式对应的传输通道包括发送端设备中的解码单元和接收端设备中的渲染单元;第三传输模式对应的传输通道包括接收端设备中的解码单元和渲染单元。相应的,本申请提供的方法还可以包括:若第一能力信息指示接收端设备不支持渲染操作,确定第一能力信息支持的传输模式包括第一传输模式;或者,若第一能力信息指示接收端设备支持渲染操作但不支持解码操作,确定第一能力信息支持的传输模式包括第一传输模式和第二传输模式;或者,若第一能力信息指示接收端设备支持渲染操作且支持解码操作,确定第一能力信息支持的传输模式包括第一传输模式、第二传输模式和第三传输模式。该实施方式说明了根据接收端设备所支持的媒体文件处理中的具体操作,确定发送端设备和接收端设备之间支持的传输模式。
在一种可能的实施方式中,第一能力信息包括下述内容中一项或多项:发送端设备和接收端设备的编解码能力、解DRM能力、接口加解密能力、音视频渲染能力以及显示播放能力。
在一种可能的实施方式中,媒体文件包括发送端设备中待显示的裸码媒体文件,或者,媒体文件包括发送端设备中待显示的内容压缩后的媒体文件。
In a possible implementation, playing the playable media files includes: playing the playable media files synchronously according to timestamp information in the media files. This implementation shows that the receiving end device can play multiple playable media files in synchronization.
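A minimal sketch of timestamp-driven synchronous playback, assuming each received stream arrives as a list of (timestamp, frame) pairs: merging them into one presentation-time order lets frames carrying equal timestamps be presented together. The data shapes are hypothetical, chosen only to illustrate the idea:

```python
def presentation_order(streams):
    """Merge per-stream frame lists into one timestamp-ordered schedule.

    streams: {stream_name: [(pts_ms, frame), ...]}, each list sorted by its
    presentation timestamps (pts). Frames that share a timestamp end up
    adjacent, so the playback unit can present them at the same instant.
    """
    merged = [(pts, name, frame)
              for name, frames in streams.items()
              for pts, frame in frames]
    merged.sort(key=lambda entry: entry[0])  # stable: ties keep stream order
    return merged
```
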
第三方面,提供一种媒体文件传输装置,该媒体文件传输装置部署于执行第一方面或第一方面中任意一种可能的实现方式所提供的媒体文件传输方法的发送端设备。例如,该媒体文件传输装置可以包括第一获取单元、第一确定单元、处理单元。其中:
第一获取单元,用于根据第一能力信息,获取媒体文件,该第一能力信息用于指示与发送端设备通信的接收端设备的媒体处理能力,或者,该第一能力信息用于指示发送端设备和接收端设备的媒体处理能力。
第一确定单元,用于从第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输媒体文件的第一传输通道;一个传输模式对应的传输通道包括一个传输模式下执行将媒体文件恢复至可播放的操作的单元。
处理单元,用于通过第一传输通道在发送端设备中的单元,向接收端设备传输媒体文件。
需要说明的是,第三方面的各个单元具体实现同第一方面的方法描述,这里不再赘述。
第四方面,提供一种媒体文件传输装置,该媒体文件传输装置部署于于执行第二方面或第二方面中任意一种可能的实现方式所提供的媒体文件传输方法的接收端设备。例如,该接收端设备可以包括第一确定单元、获取单元、播放单元。其中:
第一确定单元,用于从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与发送端设备传输媒体文件的第一传输通道;一个传输模式对应的传输通道包括一个传输模式下执行将媒体文件恢复至可播放的媒体文件。
获取单元,用于通过第一传输通道获取可播放的媒体文件。
播放单元,用于播放可播放的媒体文件。
需要说明的是,第四方面的各个单元具体实现同第二方面的方法描述,这里不再赘述。
第五方面,提供了一种计算机可读存储介质,包括指令,当其在计算机上运行时,使得计算机执行上述第一方面或其任一种可能的实现方式提供的媒体文件传输方法。
第六方面,提供了一种计算机可读存储介质,包括指令,当其在计算机上运行时,使得计算机执行上述第二方面或其任一种可能的实现方式提供的媒体文件传输方法。
第七方面,提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第一方面或其任一种可能的实现方式提供的媒体文件传输方法。
第八方面,提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第二方面或其任一种可能的实现方式提供的媒体文件传输方法。
第九方面,本申请提供了一种芯片系统,该芯片系统包括处理器,还可以包括存储器,用于实现上述方法中相应的功能。该芯片系统可以由芯片构成,也可以包含芯片和其他分立器件。
第十方面,提供了一种媒体文件传输系统,包括如第三方面所述的发送端设备以及如第四方面所述的接收端设备,具备上述第一方面和第二方面以及任一可能实现方式的功能。
其中,需要说明的是,上述各个方面中的任意一个方面的各种可能的实现方式,在方案不矛盾的前提下,均可以进行组合。
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a media file transmission scenario provided by an embodiment of this application;
FIG. 2 is a schematic structural diagram of a communication system provided by an embodiment of this application;
FIG. 3 is a schematic structural diagram of a mobile phone provided by an embodiment of this application;
FIG. 4 is a schematic framework diagram of a media file transmission system provided by an embodiment of this application;
FIG. 5 is a schematic framework diagram of another media file transmission system provided by an embodiment of this application;
FIG. 6 is a schematic flowchart of a media file transmission method provided by an embodiment of this application;
FIG. 7 is a schematic flowchart of another media file transmission method provided by an embodiment of this application;
FIG. 8 is a schematic diagram of an AR/VR scenario provided by an embodiment of this application;
FIG. 9 is a schematic diagram of a media file transmission scenario provided by an embodiment of this application;
FIG. 10 is a schematic framework diagram of yet another media file transmission system provided by an embodiment of this application;
FIG. 11 is a schematic structural diagram of a media file transmission apparatus provided by an embodiment of this application;
FIG. 12 is a schematic structural diagram of another media file transmission apparatus provided by an embodiment of this application;
FIG. 13 is a schematic structural diagram of a sending end device provided by an embodiment of this application;
FIG. 14 is a schematic structural diagram of another media file transmission apparatus provided by an embodiment of this application;
FIG. 15 is a schematic structural diagram of a receiving end device provided by an embodiment of this application.
Detailed Description
在本申请实施例中,为了便于清楚描述本申请实施例的技术方案,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。该“第一”、第二”描述的技术特征间无先后 顺序或者大小顺序。
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念,便于理解。
在本申请实施例中,至少一个还可以描述为一个或多个,多个可以是两个、三个、四个或者更多个,本申请不做限制。
此外,本申请实施例描述的网络架构以及场景是为了更加清楚的说明本申请实施例的技术方案,并不构成对于本申请实施例提供的技术方案的限定,本领域普通技术人员可知,随着网络架构的演变和新业务场景的出现,本申请实施例提供的技术方案对于类似的技术问题,同样适用。
在描述本申请的实施例之前,此处先对本申请涉及的名词统一进行解释说明,后续不再一一进行说明。
A media file is a file containing audio and/or video content.
A raw stream is audio/video content that can be played directly. The media file that an electronic device obtains locally or from a server may be encrypted and encoded audio/video content; decrypting and decompressing it yields the raw stream.
To keep the description of the following embodiments clear and concise, a brief introduction to the related art is given first:
At present, the media file transmission scenario is as shown in FIG. 1. After the communication unit of the sending end device (a terminal device such as a mobile phone or personal computer) obtains the media file (a compressed media stream), a decoder decodes it to obtain the raw stream and metadata, and a rendering unit renders the raw stream according to the metadata; the rendered raw stream can be displayed and played directly on the sending end device (optional). After the sending end device obtains the playable raw stream of the rendered media file, it transmits it directly through a media interface (a media transmission interface such as a high definition multimedia interface (HDMI) or universal serial bus (USB)) to the receiving end device (such as a smart TV or in-vehicle display) for display and playback. It should be noted that the communication unit of the sending end device may obtain the media file data from the network or from a local storage unit. If the media file data is digital-rights-protected content, DRM removal must also be performed on the decoded raw stream.
In the scenario shown in FIG. 1, all media file data transmitted over the media interface has already been decoded, so the data volume is large and the bandwidth occupation is high.
However, with the spread of intelligent devices, the computing and media processing capabilities of receiving end devices keep improving, and the current media file transmission process does not exploit the receiving end device's capabilities.
Furthermore, for a media file containing digital-rights-protected content, if DRM removal is performed in the sending end device and the file is then transmitted over the media interface, the restrictions on transmitting copyright-protected content require symmetric interface encryption and decryption on both sides of the interface. If the two sides do not support symmetric interface encryption and decryption, the current media file transmission method cannot transmit copyright-protected high-quality media files, which limits the application scenarios of media file transmission.
On this basis, this application provides a media file transmission method that can be applied in the process of transmitting media files between a sending end device and a receiving end device. In this method, multiple transmission modes are supported between the sending end device and the receiving end device. Based on the capability information of the two sides, it is determined that the sending end device performs some or all of the operations for restoring the media file to a playable state, while the receiving end device performs the remaining operations; the operations performed by the sending end device differ between transmission modes. In this way, when media files are transmitted, the computing and media processing capability of the receiving end device can be exploited to reduce the amount of data on the transmission interface between the two ends and save bandwidth.
Further, for a media file containing digital-rights-protected content, even if the two sides do not support symmetric interface encryption and decryption, with the solution provided in this application the DRM removal can be performed in the receiving end device, so that copyright-protected high-quality media files can still be transmitted.
For example, in one transmission mode, when the receiving end device's media processing capability is superior to the sending end device's, after the sending end device obtains the undecoded media file data it can pass that data through the media interface directly to the receiving end device, and the receiving end device decodes it, removes DRM (optional) and renders it before display and playback. Compared with the current approach, in which the sending end device processes the media file data to a playable state and then transmits it to the receiving end device over the media interface, the amount of data transmitted over the media interface is greatly reduced, which saves transmission bandwidth.
本申请提供的方案,可以应用于图2所示的通信系统中。如图2所示,该通信系统可以包括发送端设备201以及接收端设备202。发送端设备201通过媒体接口203与接收端设备202传输媒体文件。
具体的,图2示意的通信系统,可以为投屏场景、屏幕镜像场景或者AR/VR场景中用户端的传输媒体文件的场景,本申请实施例对于本申请提供的方案的应用场景不予限定。
其中,发送端设备201可以为手机、平板电脑、桌面型、膝上型、手持计算机、蜂窝电话、以及机顶盒等具有移动通信能力的设备。接收端设备202,可以为电视、手机、平板电脑、桌面型、膝上型、手持计算机、车载显示器、头戴式显示设备、以及投影仪等具有显示播放能力的设备。本申请实施例对上述发送端设备201和接收端设备202的具体产品形态不作具体限制。
媒体接口203是发送端设备201与接收端设备202之间的物理连接。媒体接口203可以是有线电信号接口、有线光信号接口或者无线信号接口等形式,本申请实施例对于媒体接口203的类型不予限定。
媒体接口203可以与发送端设备201中的输出接口连接,与接收端设备202中的输入接口连接。发送端设备201中的输出接口、接收端设备202中的输入接口可以加载于芯片中。
示例性的,发送端设备201为手机,接收端设备202为智能电视。用户使用手机播放视频时,通过投屏操作,由手机向智能电视发送手机播放的视频媒体文件。
请参考图3,本申请实施例这里以发送端设备、接收端设备为手机100为例,对本申请实施例提供的发送端设备、接收端设备进行介绍。其中,本领域技术人员可以理解,图3所示的手机100仅仅是一个范例,并不构成对手机的限定,并且手机可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图3中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
如图3所示,手机100具体可以包括:处理器101、射频(radio frequency,RF)电路102、存储器103、触摸屏104、蓝牙装置105、一个或多个传感器106、无线保真(wireless fidelity,WI-FI)装置107、定位装置108、音频电路109、外设接口110、电源系统111以及指纹识别器112等部件。这些部件可通过一根或多根通信总线或信 号线(图3中未示出)进行通信。
下面结合图3对手机100的各个部件进行具体的介绍:
处理器101是手机100的控制中心,利用各种接口和线路连接手机100的各个部分,通过运行或执行存储在存储器103内的应用程序(application,App),以及调用存储在存储器103内的数据和指令,执行手机100的各种功能和处理数据。在一些实施例中,处理器101可包括一个或多个处理单元;处理器101还可以集成应用处理器和调制解调处理器。其中,应用处理器主要处理操作系统、用户界面和应用程序等。调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器101中。例如,处理器101可以获取媒体文件,进行解码、解DRM、渲染等操作。或者,处理器101可以向其他设备透传媒体文件,或者发送可播放的媒体文件,或者发送进行了部分处理的媒体文件。
射频电路102可用于在收发信息或通话过程中,无线信号的接收和发送。具体地,射频电路102可以将基站的下行数据接收后,给处理器101处理。另外,将涉及上行的数据发送给基站。通常,射频电路102包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频电路102还可以通过无线通信和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统、通用分组无线服务、码分多址、宽带码分多址、长期演进、电子邮件、短消息服务等。
存储器103用于存储应用程序以及数据，处理器101通过运行存储在存储器103的应用程序以及数据，执行手机100的各种功能以及数据处理。存储器103主要包括存储程序区以及存储数据区。其中，存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)。存储数据区可以存储根据使用手机100时所创建的数据(比如音频数据、电话本等)。此外，存储器103可以包括高速随机存取存储器，还可以包括非易失存储器，例如磁盘存储器件、闪存器件或其他易失性固态存储器件等。存储器103可以存储各种操作系统，例如苹果公司所开发的 [Figure PCTCN2022086907-appb-000001] 操作系统，谷歌公司所开发的 [Figure PCTCN2022086907-appb-000002] 操作系统等。
触摸屏104可以包括触敏表面104-1和显示器104-2。其中,触敏表面104-1(例如触控面板)可采集手机100的用户在其上或附近的触摸事件(比如用户使用手指、触控笔等任何适合的物体在触敏表面104-1上或在触敏表面104-1附近的操作),并将采集到的触摸信息发送给其他器件例如处理器101。其中,用户在触敏表面104-1附近的触摸事件可以称之为悬浮触控。悬浮触控可以是指,用户无需为了选择、移动或拖动目标(例如图标等)而直接接触触控板,而只需用户位于移动终端附近以便执行所想要的功能。在悬浮触控的应用场景下,术语“触摸”、“接触”等不会暗示用于直接接触触摸屏,而是在其附近或接近的接触。能够进行悬浮触控的触敏表面104-1可以采用电容式、红外光感以及超声波等实现。触敏表面104-1可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再发送给处理器101,触摸控制器还可以接收处理器101发送的指令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种 类型来实现触敏表面104-1。显示器(也称为显示屏)104-2可用于显示由用户输入的信息或提供给用户的信息以及手机100的各种菜单。可以采用液晶显示器、有机发光二极管等形式来配置显示器104-2。触敏表面104-1可以覆盖在显示器104-2之上,当触敏表面104-1检测到在其上或附近的触摸事件后,传送给处理器101以确定触摸事件的类型,随后处理器101可以根据触摸事件的类型在显示器104-2上提供相应的视觉输出。虽然在图3中,触敏表面104-1与显示屏104-2是作为两个独立的部件来实现手机100的输入和输出功能,但是在某些实施例中,可以将触敏表面104-1与显示屏104-2集成而实现手机100的输入和输出功能。可以理解的是,触摸屏104是由多层材料堆叠而成,本申请实施例中只展示出了触敏表面(层)和显示屏(层),其他层在本申请实施例中不予记载。另外,在本申请其他一些实施例中,触敏表面104-1可以覆盖在显示器104-2之上,并且触敏表面104-1的尺寸大于显示屏104-2的尺寸,使得显示屏104-2全部覆盖在触敏表面104-1下面,或者,上述触敏表面104-1可以以全面板的形式配置在手机100的正面,也即用户在手机100正面的触摸均能被手机100感知,这样就可以实现手机100正面的全触控体验。在其他一些实施例中,触敏表面104-1以全面板的形式配置在手机100的正面,显示屏104-2也可以以全面板的形式配置在手机100的正面,这样在手机100的正面就能够实现无边框的结构。
手机100还可以包括蓝牙装置105,用于实现手机100与其他短距离的移动终端(例如手机、智能手表等)之间的数据交换。本申请实施例中的蓝牙装置可以是集成电路或者蓝牙芯片等。
手机100还可以包括至少一种传感器106,比如光传感器、运动传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节触摸屏104的显示器的亮度,接近传感器可在手机100移动到耳边时,关闭显示器的电源。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机100姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机100还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不予赘述。
在本申请各个实施例中,手机100还可以具有指纹识别功能。例如,可以在手机100的背面(例如后置摄像头的下方)配置指纹识别器112,或者在手机100的正面(例如触摸屏104的下方,再例如手机100的主屏幕键上)配置指纹识别器112。另外,也可以通过在触摸屏104中配置指纹识别器112来实现指纹识别功能,即指纹识别器112可以与触摸屏104集成在一起来实现手机100的指纹识别功能。在这种情况下,该指纹识别器112可以配置在触摸屏104中,可以是触摸屏104的一部分,也可以以其他方式配置在触摸屏104中。另外,该指纹识别器112还可以被实现为全面板指纹识别器,因此,可以把触摸屏104看成是任何位置都可以进行指纹采集的一个面板。在一些实施例中,该指纹识别器112可以对采集到的指纹进行处理。例如,指纹识别器112可以对采集到的指纹进行指纹验证等处理。指纹识别器112还可以将指纹验证的处理结果(如指纹验证是否通过)发送给处理器101,以便处理器101根据接收到的指纹验证的结果进行相应的响应。在其他一些实施例中,该指纹识别器112也可以 将采集到的指纹发送给处理器101,以便处理器101对该指纹进行处理(例如指纹验证等)。本申请实施例中的指纹识别器112的主要部件是指纹传感器,该指纹传感器可以采用任何类型的感测技术,包括但不限于光学式、电容式、压电式或超声波传感技术等。
WI-FI装置107,用于为手机100提供遵循WI-FI相关标准协议的网络接入,手机100可以通过WI-FI装置107接入到WI-FI接入点,进而帮助用户收发电子邮件、浏览网页和访问流媒体等,它为用户提供了无线的宽带互联网访问。在其他一些实施例中,该WI-FI装置107也可以作为WI-FI无线接入点,可以为其他移动终端提供WI-FI网络接入。
定位装置108,用于为手机100提供地理位置。可以理解的是,该定位装置108具体可以是全球定位系统(global positioning system,GPS)、北斗卫星导航系统等定位系统的接收器。定位装置108在接收到上述定位系统发送的地理位置后,将该信息发送给处理器101处理,或者发送给存储器103保存。在另外的一些实施例中,该定位装置108可以是辅助全球卫星定位系统(assisted global positioning system,AGPS)的接收器。AGPS是一种在一定辅助配合下进行GPS定位的运行方式,它可以利用基站的信号,配合GPS卫星信号,可以让手机100定位的速度更快;在AGPS系统中,该定位装置108可通过与辅助定位服务器(例如手机100定位服务器)的通信而获得定位辅助。AGPS系统通过作为辅助服务器来协助定位装置108完成测距和定位服务,在这种情况下,辅助定位服务器通过无线通信网络与移动终端例如手机100的定位装置108(即GPS接收器)通信而提供定位协助。
音频电路109、扬声器113、麦克风114可提供用户与手机100之间的音频接口。音频电路109可将接收到的音频数据转换后的电信号,传输到扬声器113,由扬声器113转换为声音信号输出;另一方面,麦克风114将收集的声音信号转换为电信号,由音频电路109接收后转换为音频数据,再将音频数据输出至RF电路102以发送给比如另一手机,或者将音频数据输出至存储器103以便进一步处理。
外设接口110,用于为外部的输入/输出设备(例如键盘、鼠标、外接显示器、外部存储器、用户识别模块卡等)提供各种接口。例如通过通用串行总线接口与鼠标连接,通过用户识别模块卡卡槽上的金属触点与电信运营商提供的用户识别模块(Subscriber Identity Module,SIM)卡连接。外设接口110可以被用来将上述外部的输入/输出外围设备耦接到处理器101和存储器103。
手机100还可以包括给各个部件供电的电源装置111(比如电池和电源管理芯片),电池可以通过电源管理芯片与处理器101逻辑相连,从而通过电源装置111实现管理充电、放电、以及功耗管理等功能。
尽管图3未示出,手机100还可以包括摄像头(前置摄像头和/或后置摄像头)、闪光灯、微型投影装置、近场通信(near field communication,NFC)装置等,在此不予赘述。
示例性地,在手机100的存储器103中可以存储操作系统。该操作系统是一个以Linux为基础的移动设备操作系统,并结合手机100中的上述硬件实现各种各样的功能。下面,将详细说明该存储的操作系统的软件架构。需要说明的是,本申请实施例仅以该操作系统为示例来说明移动终端要实现本实施例的技术方案所需的软件环境,本领域技术人员可以理解,本申请实施例也可以以其它操作系统来实现。
下面将结合附图,对本申请中的技术方案进行描述。
一方面,本申请提供一种媒体文件传输方法,应用于发送端设备向接收端设备发送媒体文件的过程中。
一种可能的实现方式中,本申请描述的媒体文件,可以包括发送端设备中待显示的裸码媒体文件,或者,本申请描述的媒体文件,可以包括发送端设备中待显示的内容压缩后的媒体文件,或者,本申请描述的媒体文件,可以包括发送端设备获取的压缩后的网络媒体文件。
其中,发送端设备和/或接收端设备支持第一传输模式、第二传输模式和第三传输模式。
第一传输模式由发送端设备将媒体文件恢复至可播放,接收端设备接收可播放的媒体文件。
第二传输模式由发送端设备执行将媒体文件恢复至可播放的操作中的部分操作,接收端设备执行将媒体文件恢复至可播放的操作中的部分操作。应理解,第二传输模式可以有一种或多种。比如,将媒体文件恢复至可播放的操作中包括A、B、C及D(A、B、C及D中不包含显示播放操作),第二传输模式可以包括发送端设备执行A,接收端设备执行B、C及D;或者,发送端设备执行A和B,接收端设备执行C和D;或者,发送端设备执行A、B和C,接收端设备执行D。
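上述在收发两端之间切分恢复操作的各种可能方式,可以用如下示意性代码枚举(操作名A、B、C、D沿用正文示例,仅为说明,非本申请限定的具体操作):

```python
# 将媒体文件恢复至可播放的操作序列(沿用正文示例 A、B、C、D)
OPS = ["A", "B", "C", "D"]

def enumerate_splits(ops):
    """枚举'发送端执行前 k 个操作、接收端执行其余操作'的所有切分:
    k == len(ops) 对应第一传输模式,k == 0 对应第三传输模式,
    0 < k < len(ops) 的各种切分对应正文所述的一种或多种第二传输模式。"""
    return [{"sender": ops[:k], "receiver": ops[k:]} for k in range(len(ops) + 1)]

splits = enumerate_splits(OPS)
# 发送端与接收端各执行一部分操作的切分,即第二传输模式的各种形态
mode2_splits = [s for s in splits if s["sender"] and s["receiver"]]
```

例如,发送端执行A、接收端执行B、C及D,即为其中一种第二传输模式的切分。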
第三传输模式由发送端设备透传媒体文件,由接收端设备将媒体文件恢复至可播放。
示例性的,假设将媒体文件恢复至可播放的操作包括:解码操作、渲染操作。该解码操作包括解压缩操作,或者,该解码操作包括解压缩操作和解DRM操作,在图4示意的传输媒体文件的系统中,发送端设备包括通信单元、解码单元、渲染单元和显示单元,接收端设备也包括通信单元、解码单元、渲染单元和显示单元。
在图4示意的系统中,第一传输模式可以为图4示意的模式1,由发送端设备将媒体文件进行解码及渲染,将渲染后的裸码流通过媒体接口发送至接收端设备的显示单元进行显示,第一传输模式中媒体文件的传输路径可以如图4中标注了模式1的箭头所示。
在图4示意的系统中,第二传输模式可以为图4示意的模式2,由发送端设备将媒体文件进行解码,得到媒体文件的裸码流和用于音视频渲染的元数据,将这些内容封装为bit流,通过媒体接口发送至接收端设备,接收端设备进行解封bit流获得媒体文件的裸码流和用于音视频渲染的元数据,利用元数据在渲染单元进行渲染后,由显示单元显示播放。第二传输模式中媒体文件的传输路径可以如图4中标注了模式2的箭头所示。
在图4示意的系统中,第三传输模式可以为图4示意的模式3,由发送端设备将压缩的媒体文件封装为bit流,通过媒体接口发送至接收端设备,接收端设备进行解封bit流获得压缩的媒体文件,接收端设备的解码单元对压缩的媒体文件进行解码,得到媒体文件的裸码流和用于音视频渲染的元数据,利用元数据在渲染单元进行渲染后,由显示单元显示播放。第三传输模式中媒体文件的传输路径可以如图4中标注了模式3的箭头所示。
具体的,可以通过配置发送端设备、接收端设备间的接口类型或者接口能力,使得发送端设备和/或接收端设备可以支持第一传输模式、第二传输模式和第三传输模式,本申请实施例对于具体配置过程不予限定。
其中,发送端设备与接收端设备进行通信之前,需先通过双方间的接口建立物理连接,连接方式可以为有线电信号、有线光信号或无线信号等方式,本申请实施例不予限定。在发送端设备与接收端设备建立物理连接之后,发送端设备与接收端设备通过设备发现协议和握手协议发现彼此并建立通信连接。
在两者建立通信连接之后,两端设备通过能力协商,以使得发送端设备获取第一能力信息,该第一能力信息可以用于指示与发送端设备通信的接收端设备的媒体处理能力,或者,该第一能力信息可以用于指示发送端设备和接收端设备的媒体处理能力。
其中,媒体处理能力包括设备内各媒体处理单元对于媒体文件的处理能力。
示例性的,媒体处理能力可以包括:编解码能力、解DRM能力、音视频渲染能力、显示播放能力以及接口加解密能力中的一项或多项。
其中,编解码能力可以包括:能够解码的压缩协议类型,可实时解码媒体的分辨率、帧率、bit位数等中的一项或多项。
解DRM能力可以包括:支持的解DRM协议类型,可实时解码的媒体分辨率、帧率、bit位数等中的一项或多项。
音视频渲染能力可以包括:支持HDR、3D Audio等渲染的协议类型,可实时渲染的分辨率、帧率、bit位数等中的一项或多项。
显示播放能力可以包括:可播放的分辨率、帧率、bit位数,显示峰值亮度、对比度,音频播放的声道数量、排布,支持的声音播放模式等中的一项或多项。
接口加解密能力可以包括:支持接口加解密协议类型,可实时进行接口加解密的媒体分辨率、帧率、bit位数等中的一项或多项。
需要说明的是,上述媒体处理能力仅为示例,并不构成具体限定,在实际应用中,可以根据实际需求配置媒体处理能力的内容。
进一步的,发送端设备在获取到第一能力信息之后,就可以确定第一能力信息支持的传输模式。其中,第一能力信息支持的传输模式为第一传输模式、第二传输模式、第三传输模式中的一种或多种。
具体的,发送端设备确定第一能力信息支持的传输模式,是指根据第一能力信息指示的接收端设备的媒体处理能力,确定将媒体文件恢复至可播放的操作中,哪些是接收端设备支持的,然后根据接收端设备支持的操作,确定第一能力信息支持的传输模式。
示例性的,若所有操作都是接收端设备支持的,第一能力信息就支持第一传输模式、第二传输模式和第三传输模式。若所有操作中,从最后一个操作开始,连续的一个或多个操作是接收端设备支持的,第一能力信息就支持第一传输模式和第二传输模式;若接收端设备不支持所有操作中最后一个操作,第一能力信息就仅支持第一传输模式。
一种可能的实现方式中,确定第一能力信息支持的传输模式可以包括:
若第一能力信息指示接收端设备不支持第二操作,第二操作为将媒体文件恢复至可播放的操作中最后一个操作,确定第一能力信息支持的传输模式仅包括第一传输模式。或者,若第一能力信息指示接收端设备支持第二操作,确定第一能力信息支持的传输模式包括第一传输模式和第二传输模式。或者,若第一能力信息指示接收端设备支持将媒体文件恢复至可播放的操作中所有操作,确定第一能力信息支持的传输模式包括第一传输模式、第二传输模式和第三传输模式。
示例性的,将媒体文件恢复至可播放的操作包括:解码操作、渲染操作。该解码操作包括解压缩操作,或者,该解码操作包括解压缩操作和解DRM操作。确定第一能力信息支持的传输模式可以包括:若第一能力信息指示接收端设备不支持渲染操作,确定第一能力信息支持的传输模式包括第一传输模式;或者,若第一能力信息指示接收端设备支持渲染操作但不支持解码操作,确定第一能力信息支持的传输模式包括第一传输模式和第二传输模式;或者,若第一能力信息指示接收端设备支持渲染操作且支持解码操作,确定第一能力信息支持的传输模式包括第一传输模式、第二传输模式和第三传输模式。
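上述判断逻辑可以用如下示意性代码表示(假设恢复操作为解码操作与渲染操作,函数名与字段名render、decode均为示例假设,非本申请限定):

```python
def supported_modes(receiver_caps):
    """根据第一能力信息中接收端设备的媒体处理能力,返回支持的传输模式集合。
    receiver_caps 的字段 render/decode 为示例假设,分别表示接收端
    是否支持渲染操作(最后一个操作)和解码操作。"""
    modes = {"第一传输模式"}            # 由发送端恢复至可播放,总是可用
    if receiver_caps.get("render"):
        modes.add("第二传输模式")       # 接收端支持渲染,可由接收端执行部分操作
        if receiver_caps.get("decode"):
            modes.add("第三传输模式")   # 接收端支持全部操作,可透传压缩媒体文件
    return modes
```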
应理解,发送端设备与接收端设备建立连接传输媒体文件之前,就可以进行上述建立连接、获取第一能力信息以及确定第一能力信息支持的传输模式的操作,等待传输媒体文件。然后,在需要传输媒体文件时,执行如图6所示的方案。
进一步的,发送端设备与接收端设备在建立连接,或者在能力协商获取第一能力信息的过程中,还可以由发送端设备向接收端设备发送即将传输的媒体文件的属性信息,该属性信息可以用于指示媒体文件的与传输相关的特征。例如,该属性信息可以包括但不限于下述信息中一项或多项:大小、是否数字版权保护、渲染需求等。
进一步的,可以将媒体文件的传输路径称为传输通道,一种传输模式对应的传输通道,包括执行将媒体文件恢复至可播放的每个操作的单元。因此,第一传输模式对应的传输通道中的单元都在发送端设备中;第二传输模式对应的传输通道中的单元一部分在发送端设备中,其他部分在接收端设备中;第三传输模式对应的传输通道中的单元全部在接收端设备中。
具体的,第一传输模式对应的传输通道中,包括发送端设备中执行将媒体文件恢复至可播放的操作的单元。第二传输模式对应的传输通道中,包括发送端设备中执行将媒体文件恢复至可播放的操作中第一操作之前的操作的单元,以及接收端设备中执行将媒体文件恢复至可播放的操作中第一操作以及第一操作之后的操作的单元;其中,第一操作为将媒体文件恢复至可播放的操作中接收端设备支持的任一操作。第三传输模式对应的传输通道中,包括接收端设备中执行将媒体文件恢复至可播放的操作的单元。
示例性的,将媒体文件恢复至可播放的操作包括:解码操作、渲染操作。该解码操作包括解压缩操作,或者,该解码操作包括解压缩操作和解DRM操作。第一传输模式对应的传输通道包括发送端设备中的解码单元和渲染单元;第二传输模式对应的传输通道包括发送端设备中的解码单元和接收端设备中的渲染单元;第三传输模式对应的传输通道包括接收端设备中的解码单元和渲染单元。
需要说明的是,在发送端设备和/或接收端设备中,执行相同操作的单元可以有一个或多个,因此,一个传输模式可以对应多个传输通道。对于一个传输模式对应的多个传输通道,由于其工作原理相同,后续内容不再一一说明。
示例性的,在图5示意的传输媒体文件的系统中,发送端设备51包括视频接收单元511、视频解压缩单元512、解DRM单元513、媒体渲染单元514、接口加密单元515以及输出接口516;接收端设备52包括输入接口521、视频解压缩单元522、解DRM单元523、媒体渲染单元524、接口解密单元525以及显示播放单元526。需要说明的是,在图5示意的系统中,发送端设备和接收端设备中,执行相同操作的单元只示意了一个,因此,一个传输模式则对应一个传输通道。
其中,视频接收单元511用于通过有线、无线网络、总线或接口,接收来自视频源的视频流信息。视频解压缩单元512用于在发送端设备对媒体文件进行解压缩。解DRM单元513(可选)用于在发送端设备中对DRM加密的媒体文件进行相应的解DRM。媒体渲染单元514(可选)用于对视频进行HDR、三维声等渲染处理。接口加密单元515(可选)用于对视频信号进行接口加密。输出接口516用于通过接口编码、调制等将媒体文件转化为物理层信号,发送到传输信道。传输信道可以是有线电信号、光信号或无线电信号等。传输信道可以对原始的音视频数据信号、压缩的音视频信号和其他数据信号以及控制信号、握手信号做数据聚合。
其中,输入接口521用于从传输信道接收物理层信号,继续解调、接口解密等操作恢复出媒体文件。接口解密单元525(可选)用于对接收到的媒体文件进行接口解密。媒体渲染单元524(可选)用于对视频进行HDR、三维声等渲染处理。显示播放单元526用于对可播放的媒体文件进行显示播放。视频解压缩单元522用于在接收端设备对媒体文件进行解压缩。解DRM单元523(可选)用于在接收端对DRM加密的媒体文件进行相应的解DRM。
进一步的,如图5所示,发送端设备51还可以包括内容请求单元517,用于根据应用需求和能力协商结果请求相应视频源。
进一步的,如图5所示,发送端设备51还可以包括能力协商单元518,接收端设备52还可以包括能力协商单元527,能力协商单元518/527用于对两端的显示、渲染、解压缩、接口加解密、解DRM等能力进行协商,以获取第一能力信息。
在图5示意的传输媒体文件的系统中,传输受DRM保护的媒体文件时,第一传输模式对应传输通道1,其包括视频解压缩单元512、解DRM单元513、媒体渲染单元514、接口加密单元515以及输出接口516、输入接口521、接口解密单元525以及显示播放单元526。第二传输模式对应传输通道2,其包括视频解压缩单元512、解DRM单元513、接口加密单元515以及输出接口516、输入接口521、接口解密单元525、媒体渲染单元524以及显示播放单元526。第三传输模式对应传输通道3,其包括输出接口516、输入接口521、视频解压缩单元522、解DRM单元523、媒体渲染单元524以及显示播放单元526。
如图6所示,本申请提供的媒体文件传输的方法可以包括:
S601、发送端设备根据第一能力信息,获取待传输的媒体文件。
一种可能的实现方式中,S601中发送端设备可以直接获取待传输的媒体文件,且对获取的媒体文件的特征不进行限定。
另一种可能的实现方式中,S601中发送端设备可以根据已经获取的第一能力信息,确定第一能力信息支持的传输模式,进而确定第一能力信息支持的传输模式对应的传输通道,然后分析并记录第一能力信息支持的传输模式对应的每个传输通道的第三能力信息,该第三能力信息用于指示传输通道传输媒体文件的最高能力或者最低能力。示例性的,第三能力信息可以包括下述内容中的一项或多项:分辨率、帧率、bit位数、是否有DRM加密、该通道支持的渲染能力等。S601中,发送端设备根据第一能力信息支持的传输模式对应的传输通道,请求获取符合第一能力信息支持的传输模式对应的传输通道的第三能力信息的媒体文件。
示例性的,发送端设备可以通过如图5示意的架构中的内容请求单元517,从媒体文件源(如网络服务器、本地存储等,可能提供多格式/参数的内容资源),请求符合第一能力信息支持的传输模式对应的传输通道的第三能力信息的媒体文件。
一种可能的实现方式中,符合第一能力信息支持的传输模式对应的传输通道的传输能力的媒体文件,可以指第一能力信息支持的传输模式对应的传输通道中第三能力信息最弱的传输通道可以传输的媒体文件。
另一种可能的实现方式中,S601中发送端设备可以从一个或多个内容源同时获取多个媒体文件。
可选的,发送端设备获取的媒体文件可以包括发送端设备中待显示的裸码媒体文件,或者发送端设备中待显示的内容压缩后的媒体文件,或者发送端设备获取的压缩后的网络媒体文件。
可选的,本申请实施例描述的媒体文件,可以为码流或者视频信号,或者其他形式,本申请实施例对于媒体文件的类型不予限定。
示例性的,S601中发送端设备获取的待传输的媒体文件可以包括但不限于下述媒体文件中的一项或多项:发送端设备本地存储的媒体文件、发送端外部存储或通过接口传输的媒体文件、来自网络的媒体文件(如点播、直播、视频通话、视频会议等应用的媒体文件)、本地渲染或录制生成的媒体文件(如游戏、录屏、摄像头拍的等媒体文件)。
S602、发送端设备从第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输媒体文件的第一传输通道。
一种可能的实现方式中,S602中发送端设备从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与接收端设备传输媒体文件的第一传输通道,该媒体文件可以为发送端设备与接收端设备本次建立连接后进行传输的所有媒体文件。相应的,在该实现方式中,S602可以在S601之前执行。当然,本申请实施例对于方案中各个步骤的执行顺序并不予限定,均可以根据实际需求配置。
一种可能的实现方式中,S602中发送端设备从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与接收端设备传输S601中获取的待传输媒体文件的第一传输通道。
如前述,一个传输模式对应的传输通道可以有一个或多个,S602中确定的第一传输通道,是指第一能力信息支持的某一种传输模式对应的传输通道。
一种可能的实现方式中,当发送端设备/接收端设备中执行同一个操作的单元有多个时,S602中先根据规则确定某一传输模式对应的多个第一传输通道,此时可以根据负载平衡或者轮询机制或者其他方式,从多个第一传输通道中选取最终确定的一个第一传输通道。本申请实施例,对于从一个传输模式对应的多个传输通道中,选取一个传输通道的过程,不予具体限定。
另一种可能的实现方式中,当发送端设备/接收端设备中执行同一个操作的单元只有一个时,S602中根据规则确定某一传输模式对应的一个第一传输通道,就是最终确定的第一传输通道。
一种可能的实现方式中,S602中从第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输媒体文件的第一传输通道,具体可以实现为:根据第一能力信息,确定第一能力信息支持的传输模式对应的每个传输通道的第三能力信息,该第三能力信息用于指示每个传输通道支持传输的媒体文件的特征;确定第三能力信息满足规则的传输通道,作为第一传输通道。
其中,上述规则用于指示用户的需求,可以为系统预设也可以为用户手动输入,本申请实施例对于该规则的获取方式不予限定。
示例性的,上述规则可以包括下述内容中一项或多项:播放效果优先、发送端设备计算资源消耗最低、接收端设备计算资源消耗最低、接口传输带宽占用最低。
示例性的,假设规则为发送端设备计算资源消耗最低,第三能力信息满足规则的传输通道就是传输媒体文件的大小最小的传输通道。
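上述"确定第三能力信息满足规则的传输通道"的选取过程,可以用如下示意性代码表示(各通道的开销字段sender_cost、recv_cost及其数值均为示例假设,对应第三能力信息中记录的内容):

```python
def select_channel(channels, rule):
    """按照规则,从第一能力信息支持的传输模式对应的候选通道中选取第一传输通道。"""
    key_funcs = {
        "发送端计算资源消耗最低": lambda c: c["sender_cost"],
        "接收端计算资源消耗最低": lambda c: c["recv_cost"],
    }
    return min(channels, key=key_funcs[rule])

# 各通道的开销数值仅为示意
channels = [
    {"id": "通道1", "sender_cost": 3, "recv_cost": 1},  # 发送端解码并渲染,传裸码流
    {"id": "通道2", "sender_cost": 2, "recv_cost": 2},  # 发送端解码,接收端渲染
    {"id": "通道3", "sender_cost": 1, "recv_cost": 3},  # 透传,接收端解码并渲染
]
```

规则还可以扩展为播放效果优先、接口传输带宽占用最低等,只需在key_funcs中增加相应的度量。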
另一种可能的实现方式中,当待传输的媒体文件(同时传输)为多个时,S602中发送端设备从第一能力信息支持的传输模式对应的传输通道中,按照规则分别确定传输每个媒体文件的第一传输通道。
进一步可选的,当待传输的媒体文件为多个时,如图7所示,本申请提供的方法还可以包括S602a。
S602a、发送端设备获取第二能力信息。
其中,第二能力信息用于指示下述能力中的一项或多项:发送端设备与接收端设备间接口的传输能力、发送端设备的计算能力、接收端设备的计算能力。
相应的,在S602之后,如图7所示,本申请提供的方法还可以包括S602b。
S602b、发送端设备判断第二能力信息指示的能力,是否支持多个第一传输通道占用的资源。
若第二能力信息指示的能力支持多个第一传输通道占用的资源,执行S603;否则,执行S602c,然后执行S603。
S602c、发送端设备将第一媒体文件的第一传输通道进行调整,直至第二能力信息指示的能力支持多个第一传输通道占用的资源。
其中,第一媒体文件为多个媒体文件中一个或多个。
需要说明的是,对于选取第一媒体文件的方式,本申请实施例不予限定,可以随机选取,或者选取占用资源最高的媒体文件,或者其他方式。
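上述S602b、S602c中判断并调整第一传输通道的循环,可以用如下示意性代码表示(以选取占用资源最高的媒体文件为例,资源以单一数值示意,字段名均为示例假设):

```python
def fit_channels(assignments, capacity, alternatives):
    """assignments: {媒体文件: 其第一传输通道占用的资源};capacity: 第二能力信息
    指示的可用资源;alternatives: {媒体文件: [开销更低的候选通道资源]}。
    当总占用超过 capacity 时,调整占用最高的媒体文件的传输通道,直至满足。"""
    while sum(assignments.values()) > capacity:
        heaviest = max(assignments, key=assignments.get)  # 占用资源最高的媒体文件
        if not alternatives.get(heaviest):
            raise RuntimeError("无可用的替代传输通道")
        assignments[heaviest] = alternatives[heaviest].pop(0)  # 换用开销更低的通道
    return assignments
```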
还需要说明的是,S602中选择的多个媒体文件的第一传输通道,可以相同,也可以不同,本申请实施例对此不予限定。
S603、发送端设备通过第一传输通道在发送端设备中的单元,向接收端设备传输媒体文件。
具体的,S603中,发送端设备通过第一传输通道在发送端设备中的单元,对媒体文件进行对应的操作后,通过输出接口(也可称为媒体接口),向接收端设备传输媒体文件。
示例性的,如图5所述的系统,若第一传输通道为第一传输模式对应的通道1,S603中发送端设备将媒体文件恢复为可播放,然后将可播放的媒体文件进行接口加密后,通过输出接口,向接收端设备传输可播放的媒体文件。
示例性的,如图5所述的系统,若第一传输通道为第二传输模式对应的通道2,S603中发送端设备对媒体文件进行解压缩操作和解DRM操作,然后将裸码流和元数据进行接口加密,通过输出接口,向接收端设备传输接口加密后的裸码流和元数据,渲染操作在接收端设备中进行。
示例性的,如图5所述的系统,若第一传输通道为第三传输模式对应的通道3,S603中发送端设备接收媒体文件后,通过输出接口进行透传,在接收端设备中进行将媒体文件恢复至可播放的操作。
一种可能的实现方式中,若待传输的媒体文件为多个,S602中获取了传输每个媒体文件的第一传输通道,S603中发送端设备则通过传输每个媒体文件的第一传输通道在发送端设备中的单元,分别向接收端设备传输对应媒体文件。
另一种可能的实现方式中,若待传输的媒体文件为多个,可以在每个媒体文件中包括时间戳信息,用于在接收端设备中同步显示该多个媒体文件。
S604、接收端设备从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与发送端设备传输媒体文件的第一传输通道。
其中,第一能力信息、第一能力信息支持的传输模式、第一能力信息支持的传输模式对应的传输通道都已经在前述内容中进行了详细描述,此处不再赘述。
需要说明的是,S604中接收端设备按照规则确定与发送端设备传输媒体文件的第一传输通道的具体实现,可以参照S602中发送端设备按照规则确定与接收端设备传输媒体文件的第一传输通道的具体实现,此处不再赘述。
一种可能的实现方式中,若待传输的媒体文件为多个,S604中分别获取传输每个媒体文件的第一传输通道。
进一步的,S604中确定的第一传输通道,与S602中确定的第一传输通道,对应于同一个传输模式。
S605、接收端设备通过第一传输通道在接收端设备中的单元,获取可播放的媒体文件。
具体的,S605中,接收端设备通过输入接口(也可称为媒体接口)接收发送端设备传输的媒体文件,接收的媒体文件为可播放的媒体文件,或者不可播放的媒体文件,取决于发送端设备与接收端设备之间的传输模式。
示例性的,如图5所述的系统,若第一传输通道为第一传输模式对应的通道1,S605中发送端设备将媒体文件恢复为可播放,然后将可播放的媒体文件进行接口加密后,通过输出接口,向接收端设备传输可播放的媒体文件,接收端设备通过输入接口接收到可播放的媒体文件,进行接口解密后,获取可播放的媒体文件。
示例性的,如图5所述的系统,若第一传输通道为第二传输模式对应的通道2,S605中发送端设备对媒体文件进行解压缩操作和解DRM操作,然后将裸码流和元数据进行接口加密,通过输出接口,向接收端设备传输接口加密后的裸码流和元数据。接收端设备通过输入接口接收加密后的裸码流和元数据,进行接口解密,然后在渲染单元按照元数据进行渲染操作得到可播放的媒体文件。
示例性的,如图5所述的系统,若第一传输通道为第三传输模式对应的通道3,S605中发送端设备接收媒体文件后,通过输出接口进行透传。接收端设备通过输入接口接收压缩的媒体文件,然后接收端设备通过解压缩单元对媒体文件进行解压缩操作,通过解DRM单元对解压缩后的媒体文件进行解DRM操作,得到裸码流和元数据,然后在渲染单元按照元数据进行渲染操作得到可播放的媒体文件。
一种可能的实现方式中,若待传输的媒体文件为多个,S604中获取了传输每个媒体文件的第一传输通道,S605中接收端设备则通过传输每个媒体文件的第一传输通道在接收端设备中的单元,分别获取对应的可播放的媒体文件。
S606、接收端设备播放可播放的媒体文件。
具体的,S606中接收端设备播放S605中获取的可播放的媒体文件。
一种可能的实现方式中,当多个媒体文件需要同步播放时,S606中根据媒体文件中携带的时间戳信息同步后播放。
另一种可能的实现方式中,当多个媒体文件需要同时显示时,可以利用窗口或画中画的形式进行显示。
通过本申请提供的媒体文件传输方法,发送端设备与接收端设备间支持多种传输模式,根据收发双方的能力信息,确定由发送端设备执行将媒体文件恢复为可播放的多个操作中的部分或全部,由接收端设备执行其余操作;不同传输模式中,发送端设备执行的将媒体文件恢复为可播放的多个操作中的操作不同。这样一来,在媒体文件传输时,可以利用接收端设备的计算和媒体处理能力,降低收发端之间传输接口上的数据量,节约带宽。
下面通过具体的实施例,对本申请提供的媒体文件传输方法进行详细说明。
实施例一
实施例一用于将发送端设备的移动通信能力共享给接收端设备,利用接收端设备的显示播放能力获得更好的媒体播放体验。在该场景中,发送端设备可以为具有5G等移动通信能力的手机、平板、无线通信路由器等,接收端设备可以为电视、车载显示器、PC、头戴显示设备、投影仪等。
该场景的媒体文件传输系统可以如图5所示的架构,发送端设备中的输出接口、传输信道以及接收端设备中的输入接口构成传输接口系统,该传输接口系统的形态可以为有线传输的电信号接口、光纤传输的光信号接口、无线电信号接口、无线光信号接口等。在该实施例中,图5中示意的视频接收单元511可以为无线网络接收单元。
本实施例中媒体文件主要来自于网络,包括视频点播、直播、视频通话、多方视频会议等。网络视频源中为经过压缩的视频内容。其中,媒体文件通常包括多种质量的视频码流切片V1、V2…Vn,不同的码流切片通常对应不同的分辨率、是否有DRM加密、是否含有HDR、3D Audio元数据的组合等。
在实施例一中,发送端设备与接收端设备建立连接后,通过各自的能力协商单元,根据用户对内容的选择倾向、无线网络传输状态,对收发双方的编解码能力、解DRM能力、渲染能力、接口加解密能力进行能力协商,获取到第一能力信息,发送端设备根据第一能力信息,选择当前情况下最适合的视频码流切片VM(M属于1到n)。
例如,用户选择视频质量优先(最高分辨率),则根据视频源内码流分辨率R1-Rn、接收端支持显示分辨率Rp、接收端支持的解压缩分辨率RrDec(通道3可透传的分辨率)、发送端支持的解压缩分辨率RtDec、接口系统支持传输的裸码媒体分辨率Rt,按如下公式选择共同支持的最高分辨率:R0=min(max(R1,…,Rn),Rp,max(RrDec,min(Rt,RtDec)))。其中,RrDec也为通道3支持的最大分辨率,min(Rt,RtDec)为通道1和通道2支持的最大分辨率。
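上述公式可以直接写成如下计算(变量名沿用正文记号,分辨率以单一数值示意):

```python
def common_max_resolution(src_res, Rp, RrDec, Rt, RtDec):
    """按正文公式 R0 = min(max(R1,…,Rn), Rp, max(RrDec, min(Rt, RtDec)))
    计算收发双方共同支持的最高分辨率。src_res 为视频源内各码流切片的分辨率 R1…Rn。"""
    # RrDec 为通道3支持的最大分辨率,min(Rt, RtDec) 为通道1和通道2支持的最大分辨率
    return min(max(src_res), Rp, max(RrDec, min(Rt, RtDec)))
```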
然后,根据各个参数以及收发双方的能力以及规则,选取传输视频码流切片VM的传输通道(即确定第一传输通道)。
例如,当RtDec>=R0且接收端设备支持视频码流切片VM(R=R0)所需的解DRM和渲染能力时,说明发送端设备的解码分辨率足够大,且接收端设备具备该视频所需的能力,可以选择通道3直接向接收端设备传输VM的压缩码流,节约资源。
例如,当接收端设备不支持所需渲染能力时,选择Vx(视频码流切片Vx的分辨率R=min(R0,RrDec))作为新的码流,选择通道1在发送端设备中将Vx解压缩、解DRM(若需要)并渲染,然后进行接口加密后向接收端设备传输。
例如,当RtDec<R0时,说明发送端设备的能力比接收端设备的能力弱,可以选择通道2传输媒体文件,在接收端设备中进行渲染。
例如,当待播放的媒体文件被某种DRM加密,发送端设备无法进行相应的DRM解密,而接收端具备相应的DRM解密能力时,选择通道3进行压缩码流的透传,由接收端设备进行解压缩、解DRM以及渲染后显示。
例如,当接收端设备和发送端设备中有采用自身电池供电的设备时,可以根据电源使用的规划,选择优先使用某一端设备的计算资源。例如,使用手机为发送端向电视/车机传输媒体文件时,可优先使用接收端设备的计算资源,以节省手机端的电源消耗。
进一步可选的,在实施例一中,如图5所示,发送端设备与接收端设备中,可以将能力协商单元和内容请求单元的能力作为服务化模块,供分布式应用集成。
在实施例一的场景中,本申请所提供的媒体文件传输系统,提供了多种传输模式,通过能力协商综合两端设备的计算资源和解压、解密、渲染、播放等能力,实现最优的播放效果。其中,在采用本申请通道3所示的透传模式时,可以节省发送端设备的算力,在发送端设备不具备解压缩、解密高分辨率媒体文件的条件下,实现将发送端设备的通信能力共享给接收端设备,依靠接收端设备更好的解压缩、解密等能力对媒体文件进行处理播放。
实施例二
在图8所示的AR/VR场景中,发送端设备(手机、平板、PC、机顶盒、游戏主机等)向接收端设备(电视、车载主机、AR/VR头显等显示设备)传输多个媒体文件。发送端设备与接收端设备通过传输接口系统组成如图5所示的媒体播放系统,支持多路媒体文件的同时传输。该传输接口系统可以为图5中示意的发送端设备中的输出接口516、传输信道以及接收端设备中的输入接口521。
在实施例二中,传输接口系统的形态可以是有线传输的电信号接口、光纤传输的光信号接口、无线电信号接口、无线光信号接口等。
在该实施例中,媒体文件的来源可以是本地存储、网络服务器、移动通信、发送端游戏引擎等应用渲染生产的音视频等。
例如,在实施例二中,发送端设备向接收端设备传输的多个媒体文件,可以为AR/VR场景中同一内容的不同角度的媒体文件。
发送端设备可以按照本申请提供的方案,将两个或两个以上的媒体文件(可以为同一内容的多个角度的媒体文件),分别选择传输每个媒体文件的传输通道(图5中的通道1、通道2、通道3中的任一个),分别传输。每个媒体文件的传输通道可以相同,也可以不同。
接收端设备可以通过传输接口系统,接收两个或两个以上的媒体文件,进行显示播放。
例如,在手机向AR/VR眼镜同时传输两路视频,且传输接口系统的带宽不足以传输两路裸码的媒体文件时,可以将其中1路采用通道1传输裸码流,另一路通过通道3传输压缩数据,由接收端设备解码并渲染,以充分利用传输带宽。
例如,传输接口系统支持传输两路裸码的媒体文件,但接收端设备的计算能力和发送端设备的计算能力都只能实时渲染其中一路媒体文件时,可以将其中1路采用通道1传输渲染后的裸码,另一路通过通道2传输裸码数据和元数据,在接收端设备中进行渲染,以充分利用两端的算力资源实现实时播放。
具体的,在发送端设备的输出接口中,可以对各个待传输的m个媒体文件v1-vm(m<=N)进行信道编码并打包为bit流b1~bm。可选的,可以对其中的若干路bit流进行聚合封装生成聚合信号,对bit流或聚合信号进行调制等操作后获得物理层信号,通过传输接口系统中的物理信道传输。物理层信号传输到接收端输入接口后,进行解调和解封装操作后得到各个bit流,进行拆包和信道解码后,恢复出视频流信号v1-vm,根据需要对其中通道3传输的媒体文件进行解压、解DRM和渲染等处理,对其中采用通道2传输的媒体信号进行渲染处理。最后,根据应用需要或用户选择,对各个媒体文件采用多窗口、画中画、多视角、VR等方式进行显示。
进一步的,当某2路或多路媒体文件需要同步显示时,在接收端设备的显示播放单元对这几路信号进行缓存,利用媒体文件中携带的时间戳等信息对齐时间后同步显示播放。
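上述按时间戳对齐后同步显示播放的过程,可以用如下示意性代码表示(数据结构为示例假设,仅演示对齐后的播放顺序):

```python
def align_streams(streams):
    """streams: {流名称: [(时间戳, 帧数据), ...]}。
    将缓存的多路媒体帧按时间戳合并排序,得到同步显示播放的顺序。"""
    merged = []
    for name, frames in streams.items():
        merged.extend((ts, name, frame) for ts, frame in frames)
    merged.sort(key=lambda item: item[0])  # 以时间戳为基准对齐各路媒体
    return merged
```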
进一步可选的,在实施例二中,如图5所示,发送端设备与接收端设备中,可以将能力协商单元和内容请求单元的能力作为服务化模块,供分布式应用集成。
在实施例二的场景中,本申请所提供的媒体文件传输系统,提供了多种传输模式,多个媒体文件可以在各个传输模式中进行组合选择,实现最优的传输组合方式。其中,第三传输模式对应的通道3压缩数据透传模式占用带宽低,通过与其他模式组合,可以在传输带宽受限的情况下实现两路及多路媒体的实时传输。同时,播放多路媒体文件时,数据处理需要的软硬件资源高,本方案能够有效协同收发两端的计算资源,实现多路媒体实时处理。本方案相较于现有技术能够更加灵活地使用带宽资源,协调两端媒体处理能力和计算资源的分配。
进一步的,一个或多个接收端设备作为计算处理中心,可以同时连接多个发送端设备,利用多个发送端设备获取多路或多模态的视频/图像信息,协同完成复杂的机器视觉任务。
实施例三
如图9所示媒体文件传输场景中,发送端设备(手机、平板等)向接收端设备(电视、车载主机等显示设备)传输多个媒体文件。发送端设备与接收端设备通过传输接口系统组成如图10所示的媒体播放系统,支持多路媒体文件的同时传输,支持在投送媒体文件的同时将自身内容镜像到接收端设备,以画中画或窗口等形式显示,并实现多屏的协同交互。
该传输接口系统可以为图10中示意的发送端设备中的输出接口516、传输信道以及接收端设备中的输入接口521。
在实施例三中,传输接口系统的形态可以是有线传输的电信号接口、光纤传输的光信号接口、无线电信号接口、无线光信号接口等。
在该实施例中,媒体文件的来源可以是本地存储、网络服务器、移动通信、发送端游戏引擎等应用渲染生产的音视频等。另外,在实施例三中,发送端设备的显示内容可以作为待传输的媒体文件之一(媒体文件X)。
应理解,实施例三是在实施例一和实施例二传输一路或多路媒体文件的同时,将自身显示的内容镜像投送到接收端设备,并以窗口或画中画等形式显示,实现多屏协同显示交互。
需要说明的是,实施例三中,发送端设备向接收端设备传输一路或多路媒体文件的具体实现,与实施例一和实施例二相同,不再赘述。下面对实施例三中传输媒体文件X的方式进行说明。
一种可能的实现方式中,实施例三中,发送端设备传输的自身显示的媒体文件X,可以为发送端设备显示驱动单元中待显示的裸码视频信息。
如图10所示,发送端设备中还可以包括显示控制单元519。
一种可能的实现方式中,显示控制单元519可以将发送端设备中用于显示的媒体文件X传输到输出接口516。
可选的,发送端设备可以利用媒体渲染单元514对媒体文件X进行分辨率调整、亮度调整等操作后,利用通道1向接收端设备传输。
另一种可能的实现方式中,如图10所示,发送端设备中还可以包括压缩编码单元520。显示控制单元519可以将用于显示的媒体文件X传输到压缩编码单元520进行压缩编码,之后,利用通道3向接收端设备传输压缩后的媒体文件X。
可选的,实施例三中传输的多路媒体信息可以打包后,在传输信道中进行聚合传输。
进一步的,接收端设备可以将接收到的其他媒体文件,与投屏的媒体文件X以画中画、多窗口等形式一起显示在接收端设备的显示播放单元上。
进一步的,如图10所示,发送端设备中还可以包括显示交互单元521。发送端设备可以利用显示交互单元521的触控屏幕等交互能力,对接收端设备进行控制。
进一步可选的,在实施例三中,如图5所示,发送端设备与接收端设备中,可以将能力协商单元和内容请求单元的能力作为服务化模块,供分布式应用集成。
该实施例三在实施例一和实施例二的基础上,增加了镜像投屏能力,利用窗口或画中画同步显示发送端设备的显示内容,并且实现了多屏协同交互。
应当理解的是,本申请中对技术特征、技术方案或类似语言的描述并不是暗示在任意的单个实施例中可以实现所有的特点。还可以任何适当的方式组合本实施例中所描述的技术特征和技术方案。
上述主要从发送端设备、接收端设备的工作原理角度对本申请实施例提供的方案进行了介绍。可以理解的是,上述发送端设备、接收端设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对执行本申请提供的发送端设备、接收端设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图11示出了上述实施例中所涉及的发送端设备中部署的媒体文件传输装置110的一种可能的结构示意图。该媒体文件传输装置110可以为功能模块或者芯片。如图11所示,媒体文件传输装置110可以包括:第一获取单元1101、第一确定单元1102、处理单元1103。其中,第一获取单元1101用于执行图6或图7中的过程S601;第一确定单元1102用于执行图6或图7中的过程S602、图7中的过程S602b、S602c;处理单元1103用于执行图6或图7中的过程S603。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。
可选的,如图12所示,媒体文件传输装置110还可以包括第二获取单元1104和第二确定单元1105。其中,第二获取单元1104用于执行图7中的过程S602a。第二确定单元1105用于确定第一能力信息支持的传输模式。
在采用集成的单元的情况下,图13示出了上述实施例中所涉及的发送端设备130的一种可能的结构示意图。该发送端设备130可以为上述方法实施例中描述的发送端设备。发送端设备130可以包括:处理模块1301、通信模块1302。处理模块1301用于对发送端设备130的动作进行控制管理,通信模块1302用于与其他设备通信。例如,处理模块1301用于执行图6或图7中的过程S601至S603中任一过程、或图7中的过程S602a至S602c。发送端设备130还可以包括存储模块1303,用于存储发送端设备130的程序代码和数据。
其中,处理模块1301可以为图3所示的实体结构中的处理器101,可以是处理器或控制器。例如可以是CPU,通用处理器,DSP,ASIC,FPGA或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理模块1301也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,DSP和微处理器的组合等等。通信模块1302可以为图3所示的实体结构中的射频电路102,通信模块1302可以是通信端口,或者可以是收发器、收发电路或通信接口等。或者,上述通信接口可以通过上述具有收发功能的元件,实现与其他设备的通信。上述具有收发功能的元件可以由天线和/或射频装置实现。存储模块1303可以是图3所示的实体结构中的存储器103。
当处理模块1301为处理器,通信模块1302为射频电路,存储模块1303为存储器时,本申请实施例图13所涉及的发送端设备130可以为图3所示的发送端设备。
如前述,本申请实施例提供的媒体文件传输装置110或发送端设备130可以用于实施上述本申请各实施例实现的方法中相应的功能,为了便于说明,仅示出了与本申请实施例相关的部分,具体技术细节未揭示的,请参照本申请各实施例。
在采用对应各个功能划分各个功能模块的情况下,图14示出了上述实施例中所涉及的接收端设备中部署的媒体文件传输装置140的一种可能的结构示意图。该媒体文件传输装置140可以为功能模块或者芯片。如图14所示,媒体文件传输装置140可以包括:第一确定单元1401、获取单元1402以及播放单元1403。其中,第一确定单元1401用于执行图6或图7中的过程S604;获取单元1402用于执行图6或图7中的过程S605;播放单元1403用于执行图6或图7中的过程S606。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。
在采用集成的单元的情况下,图15示出了上述实施例中所涉及的接收端设备150的一种可能的结构示意图。该接收端设备150可以为上述方法实施例中描述的接收端设备。接收端设备150可以包括:处理模块1501、通信模块1502。处理模块1501用于对接收端设备150的动作进行控制管理,通信模块1502用于与其他设备通信。例如,处理模块1501用于执行图6或图7中的过程S604至S606中任一过程。接收端设备150还可以包括存储模块1503,用于存储接收端设备150的程序代码和数据。
其中,处理模块1501可以为图3所示的实体结构中的处理器101,可以是处理器或控制器。例如可以是CPU,通用处理器,DSP,ASIC,FPGA或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理模块1501也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,DSP和微处理器的组合等等。通信模块1502可以为图3所示的实体结构中的射频电路102,通信模块1502可以是通信端口,或者可以是收发器、收发电路或通信接口等。或者,上述通信接口可以通过上述具有收发功能的元件,实现与其他设备的通信。上述具有收发功能的元件可以由天线和/或射频装置实现。存储模块1503可以是图3所示的实体结构中的存储器103。
当处理模块1501为处理器,通信模块1502为射频电路,存储模块1503为存储器时,本申请实施例图15所涉及的接收端设备150可以为图3所示的接收端设备。
如前述,本申请实施例提供的媒体文件传输装置140或接收端设备150可以用于实施上述本申请各实施例实现的方法中相应的功能,为了便于说明,仅示出了与本申请实施例相关的部分,具体技术细节未揭示的,请参照本申请各实施例。
作为本实施例的另一种形式,提供一种计算机可读存储介质,其上存储有指令,该指令被执行时执行上述方法实施例中的媒体文件传输方法。
作为本实施例的另一种形式,提供一种包含指令的计算机程序产品,当该计算机程序产品在计算机上运行时,使得该计算机执行时执行上述方法实施例中的媒体文件传输方法。
本申请实施例再提供一种芯片系统,该芯片系统包括处理器,用于实现本发明实施例的技术方法。在一种可能的设计中,该芯片系统还包括存储器,用于保存本发明实施例必要的程序指令和/或数据。在一种可能的设计中,该芯片系统还包括存储器,用于处理器调用存储器中存储的应用程序代码。该芯片系统,可以由一个或多个芯片构成,也可以包含芯片和其他分立器件,本申请实施例对此不作具体限定。
结合本申请公开内容所描述的方法或者算法的步骤可以硬件的方式来实现,也可以是由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存放于RAM、闪存、ROM、可擦除可编程只读存储器(erasable programmable ROM,EPROM)、电可擦可编程只读存储器(electrically EPROM,EEPROM)、寄存器、硬盘、移动硬盘、只读光盘(CD-ROM)或者本领域熟知的任何其它形式的存储介质中。一种示例性的存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。当然,存储介质也可以是处理器的组成部分。处理器和存储介质可以位于ASIC中。另外,该ASIC可以位于核心网接口设备中。当然,处理器和存储介质也可以作为分立组件存在于核心网接口设备中。或者,存储器可以与处理器耦合,例如存储器可以是独立存在,通过总线与处理器相连接。存储器也可以和处理器集成在一起。存储器可以用于存储执行本申请实施例提供的技术方案的应用程序代码,并由处理器来控制执行。处理器用于执行存储器中存储的应用程序代码,从而实现本申请实施例提供的技术方案。
可以理解的是,上述发送端设备/接收端设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例还提供一种实现上述各方法实施例的发送端设备/接收端设备,具体的,可以对该发送端设备/接收端设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际 实现时可以有另外的划分方式。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例所提供的几个实施例中,应该理解到,所揭露的系统,装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (29)

  1. 一种媒体文件传输方法,其特征在于,应用于发送端设备,所述发送端设备支持第一传输模式、第二传输模式和第三传输模式;所述第一传输模式由所述发送端设备将媒体文件恢复至可播放,所述第二传输模式由所述发送端设备执行将媒体文件恢复至可播放的操作中的部分操作,所述第三传输模式由所述发送端设备透传媒体文件;所述方法包括:
    根据第一能力信息,获取媒体文件;其中,所述第一能力信息用于指示与所述发送端设备通信的接收端设备的媒体处理能力,或者,所述第一能力信息用于指示所述发送端设备和所述接收端设备的媒体处理能力;
    从所述第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输所述媒体文件的第一传输通道;一个传输模式对应的传输通道包括所述一个传输模式下执行将媒体文件恢复至可播放的操作的单元;
    通过所述第一传输通道在所述发送端设备中的单元,向所述接收端设备传输所述媒体文件。
  2. 根据权利要求1所述的方法,其特征在于,所述媒体文件为多个;
    从所述第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输所述媒体文件的第一传输通道,包括:从所述第一能力信息支持的传输模式对应的传输通道中,按照规则分别确定传输每个媒体文件的第一传输通道;
    所述通过所述第一传输通道在所述发送端设备中的单元,向所述接收端设备传输所述媒体文件,包括:通过每个所述第一传输通道在所述发送端设备中的单元,分别向所述接收端设备传输对应所述媒体文件。
  3. 根据权利要求2所述的方法,其特征在于,
    所述方法还包括:获取第二能力信息,所述第二能力信息用于指示下述能力中的一项或多项:所述发送端设备与所述接收端设备间接口的传输能力、所述发送端设备的计算能力、所述接收端设备的计算能力;
    在所述按照规则分别确定传输每个媒体文件的第一传输通道之后,若所述第二能力信息指示的能力不支持所述多个所述第一传输通道占用的资源,所述方法还包括:将第一媒体文件的第一传输通道进行调整,直至所述第二能力信息指示的能力支持所述多个所述第一传输通道占用的资源;所述第一媒体文件为多个媒体文件中一个或多个。
  4. 根据权利要求1所述的方法,其特征在于,
    所述第一传输模式对应的传输通道,包括所述发送端设备中执行将媒体文件恢复至可播放的操作的单元;
    所述第二传输模式对应的传输通道,包括所述发送端设备中执行将媒体文件恢复至可播放的操作中第一操作之前的操作的单元,以及所述接收端设备中执行将媒体文件恢复至可播放的操作中所述第一操作以及所述第一操作之后的操作的单元;所述第一操作为将媒体文件恢复至可播放的操作中所述接收端设备支持的任一操作;
    所述第三传输模式对应的传输通道,包括所述接收端设备中执行将媒体文件恢复至可播放的操作的单元。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述从所述第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输所述媒体文件的第一传输通道,包括:
    根据所述第一能力信息,确定所述第一能力信息支持的传输模式对应的每个传输通道的第三能力信息,所述第三能力信息用于指示每个传输通道支持传输的媒体文件的特征;
    确定所述第三能力信息满足所述规则的传输通道,作为所述第一传输通道。
  6. 根据权利要求5所述的方法,其特征在于,所述规则包括下述内容中一项或多项:
    播放效果优先、发送端设备计算资源消耗最低、接收端设备计算资源消耗最低、接口传输带宽占用最低。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述方法还包括:
    若所述第一能力信息指示所述接收端设备不支持第二操作,所述第二操作为将媒体文件恢复至可播放的操作中最后一个操作,确定所述支持的传输模式包括所述第一传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持所述第二操作,确定所述支持的传输模式包括所述第一传输模式和所述第二传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持将媒体文件恢复至可播放的操作中所有操作,确定所述支持的传输模式包括所述第一传输模式、所述第二传输模式和所述第三传输模式。
  8. 根据权利要求1-6任一项所述的方法,其特征在于,所述将媒体文件恢复至可播放的操作包括:解码操作、渲染操作;所述解码操作包括解压缩操作,或者,所述解码操作包括解压缩操作和解数字版权管理DRM操作;所述第一传输模式对应的传输通道包括所述发送端设备中的解码单元和渲染单元;所述第二传输模式对应的传输通道包括所述发送端设备中的解码单元和所述接收端设备中的渲染单元;所述第三传输模式对应的传输通道包括所述接收端设备中的解码单元和渲染单元;
    所述方法还包括:
    若所述第一能力信息指示所述接收端设备不支持渲染操作,确定所述支持的传输模式包括所述第一传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持渲染操作但不支持解码操作,确定所述支持的传输模式包括所述第一传输模式和所述第二传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持渲染操作且支持解码操作,确定所述支持的传输模式包括所述第一传输模式、所述第二传输模式和所述第三传输模式。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述第一能力信息包括下述内容中一项或多项:
    所述发送端设备和所述接收端设备的编解码能力、解DRM能力、接口加解密能力、音视频渲染能力以及显示播放能力。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述媒体文件包括所述发送端设备中待显示的裸码媒体文件,或者,所述媒体文件包括所述发送端设备中待显示的内容压缩后的媒体文件。
  11. 一种媒体文件传输方法,其特征在于,应用于接收端设备,所述接收端设备支持第一传输模式、第二传输模式和第三传输模式;所述第一传输模式为所述接收端设备接收可播放的媒体文件,所述第二传输模式由所述接收端设备执行将媒体文件恢复至可播放的操作中的部分操作,所述第三传输模式由所述接收端设备执行将媒体文件恢复至可播放的操作中的全部操作;所述方法包括:
    从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与发送端设备传输媒体文件的第一传输通道;其中,所述第一能力信息用于指示与所述接收端设备通信的发送端设备的媒体处理能力,或者,所述第一能力信息用于指示所述发送端设备和所述接收端设备的媒体处理能力;一个传输模式对应的传输通道包括所述一个传输模式下执行将媒体文件恢复至可播放的操作的单元;
    通过所述第一传输通道在所述接收端设备中单元,获取可播放的媒体文件;
    播放可播放的媒体文件。
  12. 根据权利要求11所述的方法,其特征在于,所述媒体文件为多个,所述播放可播放的媒体文件包括:
    根据所述媒体文件中的时间戳信息,同步播放可播放的媒体文件。
  13. 一种媒体文件传输装置,其特征在于,所述装置部署于发送端设备,所述发送端设备支持第一传输模式、第二传输模式和第三传输模式;所述第一传输模式由所述发送端设备将媒体文件恢复至可播放,所述第二传输模式由所述发送端设备执行将媒体文件恢复至可播放的操作中的部分操作,所述第三传输模式由所述发送端设备透传媒体文件;所述装置包括:
    第一获取单元,用于根据第一能力信息,获取媒体文件;其中,所述第一能力信息用于指示与所述发送端设备通信的接收端设备的媒体处理能力,或者,所述第一能力信息用于指示所述发送端设备和所述接收端设备的媒体处理能力;
    第一确定单元,用于从所述第一能力信息支持的传输模式对应的传输通道中,按照规则确定传输所述媒体文件的第一传输通道;一个传输模式对应的传输通道包括所述一个传输模式下执行将媒体文件恢复至可播放的操作的单元;
    处理单元,用于通过所述第一传输通道在所述发送端设备中的单元,向所述接收端设备传输所述媒体文件。
  14. 根据权利要求13所述的装置,其特征在于,所述媒体文件为多个;
    所述第一确定单元具体用于:从所述第一能力信息支持的传输模式对应的传输通道中,按照规则分别确定传输每个媒体文件的第一传输通道;
    所述处理单元具体用于:通过每个所述第一传输通道在所述发送端设备中的单元,分别向所述接收端设备传输对应所述媒体文件。
  15. 根据权利要求14所述的装置,其特征在于,
    所述装置还包括第二获取单元,用于获取第二能力信息,所述第二能力信息用于指示下述能力中的一项或多项:所述发送端设备与所述接收端设备间接口的传输能力、所述发送端设备的计算能力、所述接收端设备的计算能力;
    所述第一确定单元还用于,在所述按照规则分别确定传输每个媒体文件的第一传输通道之后,若所述第二能力信息指示的能力不支持所述多个所述第一传输通道占用的资源,将第一媒体文件的第一传输通道进行调整,直至所述第二能力信息指示的能力支持所述多个所述第一传输通道占用的资源;所述第一媒体文件为多个媒体文件中一个或多个。
  16. 根据权利要求13所述的装置,其特征在于,
    所述第一传输模式对应的传输通道,包括所述发送端设备中执行将媒体文件恢复至可播放的操作的单元;
    所述第二传输模式对应的传输通道,包括所述发送端设备中执行将媒体文件恢复至可播放的操作中第一操作之前的操作的单元,以及所述接收端设备中执行将媒体文件恢复至可播放的操作中所述第一操作以及所述第一操作之后的操作的单元;所述第一操作为将媒体文件恢复至可播放的操作中所述接收端设备支持的任一操作;
    所述第三传输模式对应的传输通道,包括所述接收端设备中执行将媒体文件恢复至可播放的操作的单元。
  17. 根据权利要求13-16任一项所述的装置,其特征在于,所述第一确定单元具体用于:
    根据所述第一能力信息,确定所述第一能力信息支持的传输模式对应的每个传输通道的第三能力信息,所述第三能力信息用于指示每个传输通道支持传输的媒体文件的特征;
    确定所述第三能力信息满足所述规则的传输通道,作为所述第一传输通道。
  18. 根据权利要求17所述的装置,其特征在于,所述规则包括下述内容中一项或多项:
    播放效果优先、发送端设备计算资源消耗最低、接收端设备计算资源消耗最低、接口传输带宽占用最低。
  19. 根据权利要求13-18任一项所述的装置,其特征在于,所述装置还包括第二确定单元,用于:
    若所述第一能力信息指示所述接收端设备不支持第二操作,所述第二操作为将媒体文件恢复至可播放的操作中最后一个操作,确定所述支持的传输模式包括所述第一传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持所述第二操作,确定所述支持的传输模式包括所述第一传输模式和所述第二传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持将媒体文件恢复至可播放的操作中所有操作,确定所述支持的传输模式包括所述第一传输模式、所述第二传输模式和所述第三传输模式。
  20. 根据权利要求13-18任一项所述的装置,其特征在于,所述将媒体文件恢复至可播放的操作包括:解码操作、渲染操作;所述解码操作包括解压缩操作,或者,所述解码操作包括解压缩操作和解数字版权管理DRM操作;所述第一传输模式对应的传输通道包括所述发送端设备中的解码单元和渲染单元;所述第二传输模式对应的传输通道包括所述发送端设备中的解码单元和所述接收端设备中的渲染单元;所述第三传输模式对应的传输通道包括所述接收端设备中的解码单元和渲染单元;所述装置还包括第二确定单元,用于:
    若所述第一能力信息指示所述接收端设备不支持渲染操作,确定所述支持的传输模式包括所述第一传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持渲染操作但不支持解码操作,确定所述支持的传输模式包括所述第一传输模式和所述第二传输模式;
    或者,
    若所述第一能力信息指示所述接收端设备支持渲染操作且支持解码操作,确定所述支持的传输模式包括所述第一传输模式、所述第二传输模式和所述第三传输模式。
  21. 根据权利要求13-20任一项所述的装置,其特征在于,所述第一能力信息包括下述内容中一项或多项:
    所述发送端设备和所述接收端设备的编解码能力、解DRM能力、接口加解密能力、音视频渲染能力以及显示播放能力。
  22. 根据权利要求13-21任一项所述的装置,其特征在于,所述媒体文件包括所述发送端设备中待显示的裸码媒体文件,或者,所述媒体文件包括所述发送端设备中待显示的内容压缩后的媒体文件。
  23. 一种媒体文件传输装置,其特征在于,所述装置部署于接收端设备,所述接收端设备支持第一传输模式、第二传输模式和第三传输模式;所述第一传输模式为所述接收端设备接收可播放的媒体文件,所述第二传输模式由所述接收端设备执行将媒体文件恢复至可播放的操作中的部分操作,所述第三传输模式由所述接收端设备执行将媒体文件恢复至可播放的操作中的全部操作;所述装置包括:
    第一确定单元,用于从第一能力信息支持的传输模式对应的传输通道中,按照规则确定与发送端设备传输媒体文件的第一传输通道;其中,所述第一能力信息用于指示与所述接收端设备通信的发送端设备的媒体处理能力,或者,所述第一能力信息用于指示所述发送端设备和所述接收端设备的媒体处理能力;一个传输模式对应的传输通道包括所述一个传输模式下执行将媒体文件恢复至可播放的操作的单元;
    获取单元,用于通过所述第一传输通道获取可播放的媒体文件;
    播放单元,用于播放可播放的媒体文件。
  25. 根据权利要求23所述的装置,其特征在于,所述媒体文件为多个,所述播放单元具体用于:
    根据所述媒体文件中的时间戳信息,同步播放可播放的媒体文件。
  25. 一种发送端设备,其特征在于,所述发送端设备包括:处理器和存储器;
    所述存储器与所述处理器连接;所述存储器用于存储计算机指令,当所述处理器执行所述计算机指令时,所述发送端设备执行如权利要求1至10中任一项所述的方法。
  26. 一种接收端设备,其特征在于,所述接收端设备包括:处理器和存储器;
    所述存储器与所述处理器连接;所述存储器用于存储计算机指令,当所述处理器执行所述计算机指令时,所述接收端设备执行如权利要求11或12所述的方法。
  27. 一种媒体文件传输系统,其特征在于,包括如权利要求25所述的发送端设备以及如权利要求26所述的接收端设备。
  28. 一种计算机可读存储介质,其特征在于,包括指令,当其在计算机上运行时,使得计算机执行权利要求1至12中任一项所述的方法。
  29. 一种计算机程序产品,其特征在于,包括指令,当其在计算机上运行时,使得计算机执行如权利要求1至12中任一项所述的方法。
PCT/CN2022/086907 2021-05-31 2022-04-14 一种媒体文件传输方法及装置 WO2022252842A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110600628.5A CN115426338A (zh) 2021-05-31 2021-05-31 一种媒体文件传输方法及装置
CN202110600628.5 2021-05-31

Publications (1)

Publication Number Publication Date
WO2022252842A1 true WO2022252842A1 (zh) 2022-12-08

Family

ID=84195439

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/086907 WO2022252842A1 (zh) 2021-05-31 2022-04-14 一种媒体文件传输方法及装置

Country Status (2)

Country Link
CN (1) CN115426338A (zh)
WO (1) WO2022252842A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105191326A (zh) * 2013-05-08 2015-12-23 高通股份有限公司 无线通信系统中的视频流送
CN110392047A (zh) * 2019-07-02 2019-10-29 华为技术有限公司 数据传输方法、装置及设备
CN111258526A (zh) * 2020-05-06 2020-06-09 上海幻电信息科技有限公司 投屏方法和系统
CN112019877A (zh) * 2020-10-19 2020-12-01 深圳乐播科技有限公司 基于vr设备的投屏方法、装置、设备及存储介质


Also Published As

Publication number Publication date
CN115426338A (zh) 2022-12-02

Similar Documents

Publication Publication Date Title
US11711623B2 (en) Video stream processing method, device, terminal device, and computer-readable storage medium
US9264478B2 (en) Home cloud with virtualized input and output roaming over network
US9609427B2 (en) User terminal apparatus, electronic device, and method for controlling the same
WO2022121775A1 (zh) 一种投屏方法及设备
US8606183B2 (en) Method and apparatus for remote controlling bluetooth device
US9509947B2 (en) Method and apparatus for transmitting file during video call in electronic device
EP3007455A1 (en) Information processing device and information processing method
US11068148B2 (en) Information processing device
US9955197B2 (en) Encrypted screencasting
CN109168032B (zh) 视频数据的处理方法、终端、服务器及存储介质
CN111510757A (zh) 一种共享媒体数据流的方法、装置以及系统
CN111010588B (zh) 直播处理方法、装置、存储介质及设备
CN113709493B (zh) 一种kvm系统的视频流数据加密装置、方法及设备
CN116170629A (zh) 一种传输码流的方法、电子设备及计算机可读存储介质
WO2022252842A1 (zh) 一种媒体文件传输方法及装置
CN112272319A (zh) 音视频数据的传输方法、装置、存储介质与电子设备
CN109714628B (zh) 播放音视频的方法、装置、设备、存储介质及系统
US11825235B2 (en) Electronic device for processing image and image processing method thereof
CN103428526A (zh) 视频传输装置、视频传输方法以及视频传输系统
CN115550683A (zh) 一种视频数据的传输方法及装置
US20170048532A1 (en) Processing encoded bitstreams to improve memory utilization
WO2022033379A1 (zh) 一种媒体信息传输方法及装置
EP4135263A1 (en) Data communication method and related apparatus
WO2024007998A1 (zh) 一种数据传输的方法、电子设备和通信系统
CN117156190A (zh) 投屏管理方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22814889

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22814889

Country of ref document: EP

Kind code of ref document: A1