WO2024060719A1 - Data transmission method, apparatus, electronic device, and storage medium (一种数据传输的方法、装置、电子设备及存储介质)


Info

Publication number
WO2024060719A1
Authority
WO
WIPO (PCT)
Prior art keywords
data stream
code rate
request message
information
video
Application number
PCT/CN2023/100759
Other languages
English (en)
French (fr)
Inventor
贾宇航
Original Assignee
腾讯科技(深圳)有限公司
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2024060719A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/233 Processing of audio elementary streams
    • H04N 21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N 21/2343 Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N 21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N 21/24 Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests

Definitions

  • the present application relates to the field of communications, and specifically, to a data transmission method, device, electronic equipment and storage medium.
  • Panoramic video is different from the single viewing angle of traditional video, allowing users to watch freely in 360 degrees.
  • VR panoramic video also allows users to move freely when watching videos, providing a 360-degree free viewing angle at any position in the scene.
  • users can follow the video content and immerse themselves in highly realistic scenes, gaining an unprecedented experience. Because of this, panoramic video is becoming more and more popular.
  • To give users a good experience, panoramic video is generally transmitted at a high bit rate, which requires considerable network resources. Therefore, how to transmit panoramic video so that users obtain good quality of experience while saving network resources is a problem that needs to be solved urgently.
  • Embodiments of the present application provide a data transmission method, which can improve the user's experience when watching panoramic videos while saving network resources.
  • embodiments of the present application provide a data transmission method.
  • the method is applied to a first server and includes: receiving a request message sent by a second server, where the request message is used to request an audio data stream or a video data stream and includes the identification information of the terminal device; configuring, according to the request message and the bandwidth detection result, the code rate for audio data stream transmission or the code rate for video data stream transmission; and sending the audio data stream to the terminal device according to the code rate of the audio data stream and the data transmission protocol, or sending the video data stream to the terminal device according to the code rate of the video data stream and the data transmission protocol.
  • embodiments of the present application provide a data transmission method.
  • the method is applied to a second server and includes: receiving a first request message sent by a terminal device for requesting a data stream.
  • the first request message includes motion data of the terminal device; determining viewing angle information of the terminal device based on the motion data of the terminal device; determining code rate information of the data stream based on the viewing angle information; sending a second request message to the first server, where the second request message includes the code rate information and the viewing angle information, and also includes identification information of the terminal device.
  • a data transmission device including:
  • a transceiver unit configured to receive a request message sent by the second server, where the request message is used to request an audio data stream or a video data stream, where the request message includes identification information of the terminal device;
  • a processing unit configured to configure the code rate for data stream transmission according to the request message and the bandwidth detection result
  • the transceiver unit is also configured to send the data stream to the terminal device according to the code rate and data transmission protocol.
  • an embodiment of the present application provides a data transmission device, including:
  • a transceiver unit configured to receive a first request message sent by the terminal device for requesting a data stream, where the first request message includes motion data of the terminal device;
  • a processing unit configured to determine the viewing angle information of the terminal device according to the motion data of the terminal device
  • the processing unit is also configured to determine the code rate information of the data stream according to the viewing angle information
  • the transceiver unit is also configured to send a second request message to the first server.
  • the second request message includes the code rate information and the viewing angle information.
  • the second request message also includes the identification information of the terminal device.
  • embodiments of the present application provide an electronic device, including:
  • a processor adapted to implement computer instructions
  • the memory stores computer instructions, and the computer instructions are suitable for being loaded by the processor and executing the method of the first aspect.
  • embodiments of the present application provide a computer-readable storage medium.
  • the computer-readable storage medium stores computer instructions; when the computer instructions are read and executed by a processor of a computer device, the computer device performs the method of the first aspect.
  • embodiments of the present application provide a computer program product or computer program.
  • the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method of the first aspect.
  • the first server determines the code rate of the transmitted video data stream or audio data stream based on the request message sent by the second server and the bandwidth detection result, and transmits the video data stream or audio data stream to the terminal device according to the determined code rate and the data transmission protocol, which improves the user experience; transmitting the video data stream and the audio data stream separately also reduces the occupation of network resources.
  • Figure 1 is an optional schematic diagram of the system architecture involved in the embodiment of the present application.
  • Figure 2 is an optional schematic diagram of the system architecture involved in the embodiment of the present application.
  • Figure 3 is a schematic flow chart of a data transmission method provided by an embodiment of the present application.
  • Figure 4 is a schematic flow chart of a data transmission method provided by an embodiment of the present application.
  • Figure 5 is a schematic flow chart of a data transmission method provided by an embodiment of the present application.
  • Figure 6 is a schematic flow chart of a data transmission method provided by an embodiment of the present application.
  • Figure 7 is a schematic block diagram of a device according to an embodiment of the present application.
  • Figure 8 is a schematic block diagram of a device according to an embodiment of the present application.
  • Figure 9 is a schematic block diagram of a device according to an embodiment of the present application.
  • Figure 10 is a schematic block diagram of an electronic device provided by an embodiment of the present application.
  • Fig. 1 is an optional schematic diagram of a system architecture 100 involved in an embodiment of the present application.
  • the system architecture 100 includes a second terminal 110, a communication device 120, a core network device 130, a server 140, an audio streaming server 150, a video streaming server 160 and a first terminal 170, wherein different devices can communicate and interact with each other in a wired or wireless manner.
  • the second terminal 110 may be a device that displays panoramic video, such as a virtual reality (Virtual Reality, VR) device, an augmented reality (Augmented Reality, AR) device, a mixed reality (Mixed Reality, MR) device or similar devices.
  • VR devices can be VR glasses, VR headsets and other devices that apply VR technology
  • AR devices can be AR glasses, AR TVs, AR headsets and other devices that apply AR technology
  • MR devices can be MR glasses, MR terminals, MR headsets, MR wearable devices and other devices that apply MR technology, but are not limited to these.
  • the second terminal may also be a (cloud) server with a display function.
  • the communication device 120 mainly refers to an active communication device that can serve as a transmission source. It is the access device through which terminals access the network wirelessly, and is mainly responsible for radio resource management and quality of service (QoS) management on the air interface side, data compression and encryption, and so on.
  • For example: a NodeB base station, an evolved NodeB (eNodeB), a base station in a 5G mobile communication system or a new radio (NR) communication system, a base station in a future mobile communication system, etc.
  • Core network equipment 130 is responsible for processing and forwarding information and may include a 4G/5G core network or other gateways, such as a user plane function (UPF) network element, an access and mobility management function (AMF) network element, a session management function (SMF) network element, a policy control function (PCF) network element, etc.
  • the server 140 is responsible for receiving the code rate information and motion data sent from the second terminal 110 .
  • the server 140 may determine the viewing angle information of the second terminal 110 according to the motion data.
  • the audio streaming server 150 is responsible for receiving the audio stream, allowing the second terminal 110 and the server 140 to pull the audio stream.
  • the video streaming server 160 is responsible for receiving the video stream, allowing the second terminal 110 and the server 140 to pull the video stream.
  • the second terminal device 110 and servers such as the server 140, the audio streaming server 150 and the video streaming server 160, may not be in the same local area network.
  • the server 140, the audio streaming server 150 and the video streaming server 160 can be cloud servers, which can provide basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, a content delivery network (CDN), and big data and artificial intelligence platforms.
  • the first terminal 170 may be a device that captures video or images, such as a camera, a sensor, a millimeter wave radar, a lidar, a PC, a (cloud) server, etc.
  • the first terminal transmits the local high-resolution panoramic video to the audio streaming server 150 and video streaming server 160.
  • FIG. 2 is an optional schematic diagram of another system architecture 200 involved in the embodiment of the present application.
  • the system architecture 200 includes a second terminal 210 , a server 220 , an audio streaming server 230 , a video streaming server 240 and a first terminal 250 .
  • the second terminal 210, the server 220, the audio streaming server 230, the video streaming server 240 and the first terminal 250 included in the system architecture 200 can be understood with reference to the second terminal 110, the server 140, the audio streaming server 150, the video streaming server 160 and the first terminal 170 included in the system architecture 100.
  • System architecture 200 is an exemplary system architecture diagram of devices in the same local area network.
  • bit rate is the number of data bits transmitted per unit time during data transmission.
  • the commonly used unit is kbps, that is, kilobits per second.
  • At a given resolution, the bit rate is directly proportional to the clarity: the higher the bit rate, the clearer the image; the lower the bit rate, the less clear the image.
  • transmitting data at a high bit rate will occupy more network resources. Because the user's head moves, the user's viewing angle often changes. Therefore, how to select the bit rate for transmitting audio and video information under different network bandwidths and for different viewing angles is an issue that needs to be solved urgently.
  • this application proposes a data transmission method, which is applied to a server.
  • the server is a server that sends a video data stream or an audio data stream to a terminal device.
  • the server can determine the code rate of the transmitted video data stream or audio data stream based on viewing angle information and/or code rate information, as well as the bandwidth detection result, and transmit the video data stream or audio data stream to the terminal device according to the determined code rate and data transmission protocol to improve the user experience.
  • Figure 3 is a schematic flow chart of a data transmission method 300 provided by an embodiment of the present application.
  • Method 300 may be performed by any electronic device with data processing capabilities.
  • the electronic device may be implemented as a server or a computer.
  • the following description takes the electronic device as the first server as an example.
  • method 300 may include steps 310 to 330.
  • the first server receives a request message sent by the second server.
  • the request message is used to request an audio data stream or a video data stream.
  • the request message includes identification information of the terminal device.
  • the first server configures the code rate for audio data stream transmission or the code rate for video data stream transmission according to the request message and the bandwidth detection result.
  • the first server sends the audio data stream to the terminal device according to the code rate and data transmission protocol of the audio data stream, or sends the video data stream to the terminal device according to the code rate and data transmission protocol of the video data stream.
  • the first server may be a video streaming media server or an audio streaming media server, which is related to the functions provided by the first server.
  • the first server can be a video streaming server and an audio streaming server at the same time, but the video stream and the audio stream are processed and transmitted separately.
  • the first server can determine the code rate of the transmitted data stream according to the request message and the bandwidth detection result, and transmit the data stream to the terminal device according to the determined code rate and the data transmission protocol to improve the user experience; the video data stream and the audio data stream are transmitted separately, reducing the usage of network resources.
  • the first server is a video streaming server
  • the request message is used to request the video data stream.
  • the request message also includes perspective information.
  • the request message is used to request the video stream from the first server.
  • configuring the code rate for data stream transmission includes: configuring, based on the viewing angle information and the bandwidth detection result, the transmission code rate of the first data stream as a first code rate, where the first data stream is a data stream within a first threshold of the viewing angle range; and configuring, based on the viewing angle information and the bandwidth detection result, the transmission code rate of the second data stream as a second code rate, where the second data stream is a data stream outside the first threshold of the viewing angle range, and the first code rate is greater than the second code rate.
  • The user's viewing angle covers an area. Within that area, the user is more sensitive to the image in the central region of the viewing angle and less sensitive to the image in the edge region. Therefore, according to the viewing angle information and the bandwidth detection result, the transmission code rate of the first data stream, corresponding to the region within the first threshold of the viewing angle range, is configured as the first code rate, and the transmission code rate of the second data stream, corresponding to the region outside the first threshold, is configured as the second code rate. The first code rate is greater than the second code rate, that is, the first data stream is transmitted at a higher code rate and the second data stream at a lower code rate. Transmitting the data streams of different regions within the viewing range at different code rates based on the viewing angle information and the bandwidth detection result can reduce the occupation of network resources while ensuring the user experience.
  • the transmission bit rate of the second data stream corresponding to the viewing angle range outside the first threshold can also be appropriately increased so that the user can obtain better viewing quality.
  • the first code rate and the second code rate determined by the first server should be within a range supported by the terminal device.
  • Within that code rate range, the first server can configure the transmission code rate of the first data stream, corresponding to the region within the first threshold of the viewing angle range, as the first code rate, and configure the transmission code rate of the second data stream, corresponding to the region outside the first threshold, as the second code rate.
  • the first server is a video streaming server.
  • the request message is used to request a video data stream.
  • the request message also includes viewing angle information and code rate information.
  • the request message is used to request a video stream from the first server.
  • configuring the code rate for data stream transmission includes: configuring, based on the viewing angle information, the code rate information and the bandwidth detection result, the transmission code rate of the first data stream as a first code rate, where the first data stream is a data stream within the first threshold of the viewing angle range; and configuring, based on the viewing angle information, the code rate information and the bandwidth detection result, the transmission code rate of the second data stream as a second code rate, where the second data stream is a data stream outside the first threshold of the viewing angle range, and the first code rate is greater than the second code rate.
  • The request message may include the code rate information requested by the terminal device. The first server can determine the code rates in combination with the code rate information requested by the terminal device, to avoid using a code rate that the terminal device does not support. Therefore, according to the viewing angle information, the code rate information and the bandwidth detection result, the transmission code rate of the first data stream, corresponding to the region within the first threshold of the viewing angle range, is configured as the first code rate, and the transmission code rate of the second data stream, corresponding to the region outside the first threshold, is configured as the second code rate, the first code rate being greater than the second code rate; that is, the first data stream is transmitted at a higher code rate and the second data stream at a lower code rate. Within the code rate range supported by the terminal device, data streams of different regions within the viewing range are transmitted at different code rates based on the viewing angle information, the code rate information and the bandwidth detection result, which ensures the user experience while reducing the occupation of network resources, as sketched below.
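  • The following is a minimal, illustrative sketch of this viewport-aware rate configuration. The function name, the 70/30 bandwidth split between the two streams, and the clamping policy are assumptions for illustration, not values specified by this application.

```python
def configure_stream_rates(bandwidth_kbps, supported_range_kbps, in_view_share=0.7):
    """Pick a higher rate for the first data stream (inside the viewing-angle
    threshold) and a lower rate for the second data stream (outside it),
    clamped to the code-rate range reported by the terminal device."""
    lo, hi = supported_range_kbps
    clamp = lambda rate: max(lo, min(hi, rate))
    first_rate = clamp(int(bandwidth_kbps * in_view_share))         # inside the viewport
    second_rate = clamp(int(bandwidth_kbps * (1 - in_view_share)))  # outside the viewport
    if first_rate <= second_rate:            # keep the invariant: first rate > second rate
        second_rate = max(lo, first_rate - 1)
    return first_rate, second_rate

# Example: 8 Mbps detected downlink, terminal supports 500 kbps - 6 Mbps.
print(configure_stream_rates(8000, (500, 6000)))  # -> (5600, 2400)
```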
  • the video data stream includes multiple video frames, and the multiple video frames include I frames, B frames and P frames, where the data transmission protocol adopted for I frames is the Transmission Control Protocol (TCP), and the data transmission protocol adopted for B frames and P frames is the User Datagram Protocol (UDP).
  • video frames are divided into I frames, B frames and P frames.
  • I frames are intra-coded frames, also known as key frames
  • P frames are forward predicted frames, also known as forward reference frames
  • B frames are bidirectional interpolated frames, also known as bidirectional reference frames.
  • An I frame is a complete picture, while P frames and B frames record changes relative to the I frame; without the I frame, P frames and B frames cannot be decoded.
  • the I frame is transmitted through a separate channel using TCP to ensure that no frame is lost during the transmission process, avoiding mosaics, blurs, etc. in the video; B frames and P frames are transmitted through a UDP data channel, and forward error correction (FEC) redundancy is performed according to the network detection status.
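  • A minimal sketch of this transport split, assuming a TCP socket for I frames and a UDP socket for B/P frames, is shown below. The MTU size and the repetition-based redundancy (standing in for real forward error correction such as Reed-Solomon) are illustrative assumptions only.

```python
import socket

def send_frame(frame_type: str, payload: bytes, tcp_sock: socket.socket,
               udp_sock: socket.socket, udp_addr, fec_ratio: float) -> None:
    """Route one encoded video frame onto the channel used for its type."""
    if frame_type == "I":
        # Key frames use the reliable TCP channel so that none is lost,
        # avoiding mosaics and blur caused by missing reference frames.
        tcp_sock.sendall(payload)
        return
    # B and P frames use UDP; add redundancy based on the network detection result.
    mtu = 1200
    chunks = [payload[i:i + mtu] for i in range(0, len(payload), mtu)]
    k = int(len(chunks) * fec_ratio)          # number of redundant packets
    redundant = chunks[-k:] if k > 0 else []  # naive repetition, not real FEC
    for packet in chunks + redundant:
        udp_sock.sendto(packet, udp_addr)
```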
  • the first server is an audio streaming server.
  • the request message is used to request the audio data stream.
  • the request message also includes code rate information.
  • the code rate for data stream transmission is configured as follows: according to the code rate information and the bandwidth detection result, the transmission code rate of the audio stream is configured as a third code rate.
  • the third code rate can be a low code rate or a high code rate.
  • the data transmission protocol of the audio data stream is Transmission Control Protocol TCP.
  • the data stream includes a time stamp.
  • when the first server is a video streaming server, the data stream is a video data stream and the video data stream includes a time stamp; or, when the first server is an audio streaming server, the data stream is an audio data stream and the audio data stream includes a time stamp.
  • the data stream includes a time stamp and is suitable for audio and video synchronization on the terminal device side.
  • the first server detects the downlink bandwidth of the first server and the terminal device to obtain the bandwidth detection result.
  • downlink refers to the process of the first server sending the data stream to the terminal device.
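  • As an illustration of this bandwidth detection step, a hypothetical downlink probe is sketched below: the server sends a burst of padding packets and derives an estimate from the bytes the terminal acknowledges per elapsed second. The callbacks, burst size, packet size and the 0.9 safety factor are assumptions, not details taken from this application.

```python
import time

def probe_downlink_kbps(send_padding, wait_for_acked_bytes,
                        burst_packets: int = 50, packet_bytes: int = 1200,
                        safety: float = 0.9) -> int:
    """Rough estimate of the maximum available downlink bandwidth in kbps."""
    start = time.monotonic()
    for _ in range(burst_packets):
        send_padding(b"\x00" * packet_bytes)      # padding traffic toward the terminal
    acked_bytes = wait_for_acked_bytes()          # bytes the terminal has confirmed
    elapsed = max(time.monotonic() - start, 1e-3)
    return int(acked_bytes * 8 / 1000 / elapsed * safety)
```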
  • FIG. 4 is a schematic flow chart of a data transmission method 400 provided by an embodiment of the present application.
  • Method 400 may be performed by any electronic device with data processing capabilities.
  • the electronic device may be implemented as a server or a computer.
  • the following description takes the electronic device as the second server as an example, which is mainly used to determine the user's perspective based on the motion data sent by the terminal device.
  • method 400 may include steps 410 to 440.
  • the second server receives a first request message sent by the terminal device for requesting a data stream, where the first request message includes motion data of the terminal device.
  • the second server determines the viewing angle information of the terminal device based on the motion data of the terminal device.
  • the second server determines the code rate information of the data stream based on the perspective information.
  • the second server sends a second request message to the first server.
  • the second request message includes code rate information, viewing angle information, and identification information of the terminal device.
  • the second server receives the first request message sent by the terminal device and sends a second request message to the first server.
  • the second request message includes the code rate information and the viewing angle information, so that the first server can determine the code rate of the transmitted data stream according to the viewing angle information, the code rate information and the bandwidth detection result, and transmit the data stream to the terminal device at the determined code rate, improving the user experience.
  • the first request message also includes video bit rate information
  • determining the bit rate information of the data stream based on the viewing angle information includes: determining the bit rate information of the data stream based on the viewing angle information and the video bit rate information.
  • the first request message sent by the terminal device received by the second server is used to request a video stream.
  • the first request message includes video code rate information.
  • the video code rate information may be a code rate range.
  • the second server can determine a code rate based on the viewing angle information, within the range indicated by the video code rate information.
  • the first request message also includes audio code rate information and video code rate information.
  • Determining the code rate information of the data stream according to the viewing angle information includes: determining the video code rate information of the data stream according to the viewing angle information and the video code rate information.
  • the method also includes: determining the audio code rate information of the data stream according to the audio code rate information.
  • the first request message sent by the terminal device received by the second server is used to request a video stream and an audio stream.
  • the first request message includes video code rate information, and the video code rate information may be a code rate range,
  • the second server may determine a video code rate within the range indicated by the video code rate information, based on the viewing angle information.
  • the first request message also includes audio code rate information.
  • the audio code rate information may be a code rate range.
  • the second server may determine audio code rate information within the range indicated by the audio code rate information.
  • In different embodiments, the content of the first request message received by the second server differs, because the terminal device can determine according to the service type whether the first request message is used to request a video stream only, or a video stream and an audio stream. For example, when audio can be ignored for the service running on the terminal device, the first request message is used to request only the video stream, such as in a remote driving scenario where audio is not of high concern. When audio cannot be ignored for the service running on the terminal device, the first request message is used to request both the video stream and the audio stream, such as in remote video conferencing and live broadcast scenarios, where both the audio stream and the video stream are very important.
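  • A hedged sketch of the second server's handling of the first request message is given below: it derives viewing angle information from the terminal's motion data, chooses code rates inside the ranges the terminal requested, and builds the second request message for the first server. The field names and the simple "use the top of the requested range" policy are assumptions, not the exact algorithm of this application.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FirstRequest:
    terminal_id: str
    motion_6dof: Tuple[float, float, float, float, float, float]  # x, y, z, yaw, pitch, roll
    video_rate_range_kbps: Tuple[int, int]                        # (min, max) requested by the terminal
    audio_rate_range_kbps: Optional[Tuple[int, int]] = None       # present only if audio is requested

def build_second_request(req: FirstRequest) -> dict:
    x, y, z, yaw, pitch, roll = req.motion_6dof
    view = {"yaw": yaw, "pitch": pitch}                  # viewing angle information
    msg = {
        "terminal_id": req.terminal_id,                  # identification information
        "view": view,
        "video_rate_kbps": req.video_rate_range_kbps[1]  # start from the top of the requested range
    }
    if req.audio_rate_range_kbps is not None:
        msg["audio_rate_kbps"] = req.audio_rate_range_kbps[1]
    return msg                                           # second request message sent to the first server
```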
  • Figure 5 is a schematic flow chart of a data transmission method 500 provided by an embodiment of the present application.
  • Method 500 may be performed by any electronic device with data processing capabilities.
  • the electronic device may be implemented as a terminal device or a computer.
  • the following description takes the electronic device as a terminal device as an example.
  • method 500 may include steps 510 to 530.
  • the terminal device obtains motion data.
  • The motion data can be six-degrees-of-freedom (6DoF) data.
  • An object has six degrees of freedom in space, namely the degrees of freedom of translation along the three orthogonal coordinate axes x, y and z, and the degrees of freedom of rotation about these three axes.
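  • As a small illustration of how 6DoF motion data can be interpreted, the three translations give the position and the three rotations give the gaze direction used as viewing angle information. The yaw/pitch convention below is an assumption; devices may report orientation differently (for example, as quaternions).

```python
import math

def gaze_direction(yaw_deg: float, pitch_deg: float):
    """Unit vector the user is looking along, from yaw (left/right) and
    pitch (up/down); roll does not change the gaze direction."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

# Example: looking 30 degrees to the left and 10 degrees up.
print(gaze_direction(30.0, 10.0))
```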
  • the terminal device sends a first request message to the second server according to the service type.
  • the first request message is used to request a video stream, or the first request message is used to request a video stream and an audio stream.
  • the first request message includes motion data.
  • the terminal device requests different content from the second server according to the service type, which can effectively alleviate the pressure on the transmission bandwidth between the terminal device and the streaming media server and improve the transmission efficiency.
  • the first request message is used to request a video stream, and the first request message also includes video bit rate information.
  • the first request message is used to request a video stream and an audio stream, and the first request message also includes video bit rate information and audio bit rate information.
  • the method also includes: the terminal device receiving an audio data stream, where the audio data stream includes a time stamp; the terminal device receiving a video data stream, where the video data stream includes a time stamp; and the terminal device synchronizing the video data and the audio data according to the time stamps.
  • Figures 3 to 5 above describe the method embodiments of the present application in detail from the perspective of each individual device.
  • a data transmission method provided by the embodiment of the present application is described below from an interactive perspective through FIG. 6 .
  • Figure 6 is a schematic flow chart of a data transmission method 600 provided by an embodiment of the present application.
  • Method 600 may include steps 601 to 612.
  • the first terminal separates the video stream and audio stream of the local high-resolution panoramic video to obtain the audio stream and the video stream.
  • Audio and video are processed separately. Firstly, it can alleviate the pressure on the transmission bandwidth between the first terminal and the streaming media server. Secondly, according to business needs, subsequent selection can be made on demand at the second terminal to improve transmission efficiency.
  • the video streaming server starts bandwidth detection to identify and measure the maximum available bandwidth of the current downlink.
  • the video streaming server can perform bandwidth detection periodically or in real time.
  • the audio streaming server starts bandwidth detection, identifies and measures the maximum available bandwidth of the current downlink.
  • the audio streaming server can perform bandwidth detection periodically or in real time.
  • the first terminal device pushes the video stream to the video streaming server.
  • the first terminal device pushes the audio stream to the audio streaming server.
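  • As one possible illustration of separating the local high-resolution panoramic recording into an audio stream and a video stream and pushing each to its streaming server, the sketch below shells out to ffmpeg. The server URLs, stream keys and the choice of RTMP are assumptions for illustration; the embodiments are not limited to a particular streaming protocol or tool.

```python
import subprocess

SOURCE = "panorama.mp4"                                   # local high-resolution panoramic video
VIDEO_URL = "rtmp://video-server.example/live/cam1"       # hypothetical video streaming server
AUDIO_URL = "rtmp://audio-server.example/live/cam1"       # hypothetical audio streaming server

# Video-only push: keep the video track, drop audio (-an), copy the encoded stream.
video_cmd = ["ffmpeg", "-re", "-i", SOURCE, "-map", "0:v", "-c:v", "copy",
             "-an", "-f", "flv", VIDEO_URL]
# Audio-only push: keep the audio track, drop video (-vn), re-encode to AAC.
audio_cmd = ["ffmpeg", "-re", "-i", SOURCE, "-map", "0:a", "-c:a", "aac",
             "-vn", "-f", "flv", AUDIO_URL]

procs = [subprocess.Popen(cmd) for cmd in (video_cmd, audio_cmd)]
for p in procs:
    p.wait()
```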
  • the type of streaming media server is not limited, and the streaming media server can be a Real-Time Streaming Protocol (RTSP) server, a Real-Time Messaging Protocol (RTMP) server, a Hypertext Transfer Protocol (HTTP) server, etc.
  • the second terminal device determines the motion data
  • the second terminal device sends the first request message to the server.
  • the second terminal device sends a first request message to the server according to the service type.
  • the first request message is used to request a video stream, or the first request message is used to request a video stream and an audio stream.
  • the first request message includes the motion data. When the first request message is used to request a video stream, it further includes video code rate information; when the first request message is used to request a video stream and an audio stream, it further includes video code rate information and audio code rate information.
  • the server sends a second request message requesting the video data stream to the video streaming server.
  • the second request message includes viewing angle information, or the second request message includes viewing angle information and video stream code rate information.
  • the server determines the viewing angle information of the second terminal device based on the motion data of the second terminal device, and determines the video stream bit rate information based on the determined viewing angle information.
  • the motion server sends the viewing angle information and the code rate information to the video streaming server, or the motion server sends only the viewing angle information to the video streaming server.
  • the server sends a third request message requesting the audio data stream to the audio streaming server.
  • the third request message includes the audio stream code rate information.
  • the video streaming media server determines the code rate of the video data stream transmission and the video transmission protocol adopted based on the second request message and the bandwidth detection result, and sends the video data stream to the second terminal device.
  • the audio streaming media server determines the code rate of the audio data stream transmission and the adopted transmission protocol based on the third request message and the bandwidth detection result, and sends the audio data stream to the second terminal device.
  • the second terminal device receives the video data stream and the audio data stream, synchronizes the audio and video, and plays the audio and video.
  • the synchronization is performed based on the time stamp information PTS of the audio data stream and video data stream received by the second terminal device. If there is an audio stream, the audio stream clock is used as the reference clock to synchronize the video stream. For example, if the audio is slow, some video frames are discarded or the delay is increased; if the video is slow, some non-I-frame video frames are discarded.
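  • A sketch of this timestamp (PTS) based synchronization, with the audio clock as the reference, is shown below. The 40 ms drift threshold and the frame fields are illustrative assumptions.

```python
def sync_video_to_audio(video_frame, audio_clock_pts_ms: float,
                        max_drift_ms: float = 40.0):
    """Return 'render', 'drop', or a wait time in milliseconds for one decoded frame."""
    drift = video_frame.pts_ms - audio_clock_pts_ms
    if drift < -max_drift_ms and video_frame.frame_type != "I":
        return "drop"        # video is behind the audio clock: discard non-I frames
    if drift > max_drift_ms:
        return drift         # video is ahead of the audio clock: wait before rendering
    return "render"
```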
  • In the above flow, the second terminal device sends motion data to the server, and the server determines the viewing angle information based on the motion data, then determines the code rate information, and sends a request message to the audio streaming server or the video streaming server. It should be understood that, when the second terminal device has computing capability, the second terminal device can itself determine the viewing angle information and the code rate information based on the motion data and send a request message, which includes the viewing angle information and/or the code rate information, to the audio streaming server or the video streaming server; that is, the second terminal device can also perform the functions of the server.
  • the size of the sequence numbers of the above processes does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic and does not constitute any limitation on the implementation of the embodiments of the present application. It is to be understood that these sequence numbers are interchangeable where appropriate, so that the described embodiments of the application can be practiced in orders other than those illustrated or described.
  • FIG. 7 is a schematic block diagram of a device 700 according to an embodiment of the present application.
  • the device 700 can implement the function of the first server in the above method, that is, a video streaming server or an audio streaming server.
  • the device 700 may include a transceiver unit 710 and a processing unit 720 .
  • the transceiver unit 710 is configured to receive a request message sent by the second server, where the request message is used to request an audio data stream or a video data stream, where the request message includes identification information of the terminal device.
  • the processing unit 720 is configured to configure the code rate of the audio data stream transmission or the code rate of the video data stream transmission according to the request message and the bandwidth detection result.
  • the transceiver unit 710 sends the audio data stream to the terminal device according to the code rate of the audio data stream and the data transmission protocol, or sends the video data stream to the terminal device according to the code rate of the video data stream and the data transmission protocol.
  • the request message is used to request a video data stream, and the request message also includes perspective information.
  • the processing unit 720 is specifically used to:
  • configure, based on the viewing angle information and the bandwidth detection result, the transmission code rate of the first data stream as a first code rate, where the first data stream is a data stream within a first threshold of the viewing angle range;
  • configure, based on the viewing angle information and the bandwidth detection result, the transmission code rate of the second data stream as a second code rate, where the second data stream is a data stream outside the first threshold of the viewing angle range, and the first code rate is greater than the second code rate.
  • the request message is used to request a video data stream, and the request message also includes viewing angle information and code rate information.
  • the processing unit 720 is specifically used to:
  • configure, based on the viewing angle information, the code rate information and the bandwidth detection result, the transmission code rate of the first data stream as a first code rate, where the first data stream is a data stream within a first threshold of the viewing angle range;
  • configure, based on the viewing angle information, the code rate information and the bandwidth detection result, the transmission code rate of the second data stream as a second code rate, where the second data stream is a data stream outside the first threshold of the viewing angle range, and the first code rate is greater than the second code rate.
  • the video data stream includes a plurality of video frames, and the plurality of video frames include I frames, B frames and P frames, where the data transmission protocol adopted for the I frames is the Transmission Control Protocol, and the data transmission protocol adopted for the B frames and the P frames is the User Datagram Protocol.
  • the request message is used to request an audio data stream, and the request message also includes code rate information.
  • the processing unit 720 is specifically configured to: configure, according to the code rate information and the bandwidth detection result, the transmission code rate of the audio stream as a third code rate.
  • the data transmission protocol of the audio data stream is a transmission control protocol.
  • the video data stream includes a time stamp and the audio data stream includes a time stamp.
  • the processing unit 720 is also configured to detect the downlink bandwidth of the first server and the terminal device to obtain the bandwidth detection result.
  • the device embodiments and the method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, they will not be repeated here.
  • the device 700 for data processing in this embodiment may correspond to the execution subject of the method 300 of the embodiments of the present application, and the foregoing and other operations and/or functions of the modules in the device 700 are respectively intended to implement the corresponding processes of the method in Figure 3; for brevity, they are not repeated here.
  • FIG 8 is a schematic block diagram of a device 800 according to an embodiment of the present application.
  • the device 800 can implement the function of the second server, that is, the motion server in the above method.
  • the device 800 may include a transceiver unit 810 and a processing unit 820.
  • the transceiver unit 810 is configured to receive a first request message sent by the terminal device for requesting a data stream, where the first request message includes motion data of the terminal device.
  • the processing unit 820 is configured to determine the viewing angle information of the terminal device according to the motion data of the terminal device.
  • the processing unit 820 is configured to determine the code rate information of the data stream according to the perspective information.
  • the transceiver unit 810 is also configured to send a second request message to the first server, where the second request message includes the code rate information, the viewing angle information and the identification information of the terminal device.
  • the first request message also includes video code rate information
  • the processing unit 820 is specifically configured to determine the bit rate information of the data stream according to the viewing angle information and the video bit rate information.
  • the first request message also includes audio code rate information and video code rate information
  • the processing unit 820 is specifically configured to: determine the video code rate information of the data stream according to the viewing angle information and the video code rate information;
  • the processing unit 820 is also configured to determine the audio code rate information of the data stream according to the audio code rate information.
  • the device embodiments and the method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, they will not be repeated here.
  • the device 800 for data processing in this embodiment may correspond to the execution subject of the method 400 of the embodiments of the present application, and the foregoing and other operations and/or functions of the modules in the device 800 are respectively intended to implement the corresponding processes of the method in Figure 4; for brevity, they are not repeated here.
  • Figure 9 is a schematic block diagram of a device 900 according to an embodiment of the present application.
  • the device 900 can implement the functions of the terminal device, that is, the second terminal device, in the above method.
  • the device 900 may include a processing unit 910 and a transceiver unit 920.
  • the processing unit 910 is configured to obtain motion data.
  • the transceiver unit 920 is configured to send a first request message to the second server according to the service type.
  • the first request message is used to request a video stream, or the first request message is used to request a video stream and an audio stream.
  • the first request message includes motion data.
  • the first request message is used to request a video stream, and the first request message also includes video bit rate information.
  • the first request message is used to request a video stream and an audio stream, and the first request message also includes video bit rate information and audio bit rate information.
  • the transceiver unit 920 is also configured to receive an audio data stream, where the audio data stream includes a time stamp, and to receive a video data stream, where the video data stream includes a time stamp; the processing unit 910 is also configured to synchronize the video data and the audio data according to the time stamps.
  • the device embodiments and the method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, they will not be repeated here.
  • the device 900 for data processing in this embodiment may correspond to the execution subject of the method 500 of the embodiments of the present application, and the foregoing and other operations and/or functions of the modules in the device 900 are respectively intended to implement the corresponding processes of the method in Figure 5; for brevity, they are not repeated here.
  • the software module may be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, register, etc.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps in the above method embodiment in combination with its hardware.
  • Figure 10 is a schematic block diagram of an electronic device 1000 provided by an embodiment of the present application.
  • the electronic device 1000 may include:
  • a memory 1010 and a processor 1020, where the memory 1010 is configured to store a computer program and transmit the program code to the processor 1020.
  • the processor 1020 can call and run the computer program from the memory 1010 to implement the method in the embodiment of the present application.
  • the processor 1020 may be configured to execute steps of each execution subject in the above method 300 according to instructions in the computer program.
  • the processor 1020 may include but is not limited to:
  • a digital signal processor (DSP);
  • an application-specific integrated circuit (ASIC);
  • a field-programmable gate array (FPGA).
  • the memory 1010 includes but is not limited to:
  • The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory.
  • The volatile memory can be a random access memory (RAM), which is used as an external cache, such as a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDR SDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchronous link dynamic random access memory (SLDRAM) or a direct Rambus random access memory (DR RAM).
  • the computer program can be divided into one or more modules, and the one or more modules are stored in the memory 1010 and executed by the processor 1020 to complete the tasks provided by this application.
  • the one or more modules may be a series of computer program instruction segments capable of completing specific functions. The instruction segments are used to describe the execution process of the computer program in the electronic device 1000 .
  • the electronic device 1000 may also include:
  • a communication interface 1030, where the communication interface 1030 can be connected to the processor 1020 or the memory 1010.
  • the processor 1020 can control the communication interface 1030 to communicate with other devices. Specifically, it can send information or data to other devices, or receive information or data sent by other devices.
  • communication interface 1030 may include a transmitter and a receiver.
  • the communication interface 1030 may further include an antenna, and the number of antennas may be one or more.
  • The components of the electronic device 1000 are connected by a bus system, where in addition to a data bus, the bus system also includes a power bus, a control bus and a status signal bus.
  • a communication device including a processor and a memory, wherein the memory is used to store a computer program, and the processor is used to call and run the computer program stored in the memory, so that the encoder executes the method of the above method embodiment.
  • a computer storage medium is provided, on which a computer program is stored; when the computer program is executed by a computer, the computer can perform the method of the above method embodiments.
  • embodiments of the present application also provide a computer program product containing instructions, which when executed by a computer causes the computer to perform the method of the above method embodiments.
  • a computer program product or computer program including computer instructions stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method of the above method embodiment.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center in a wired manner (such as coaxial cable, optical fiber or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio or microwave).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, tape), optical media (eg, digital video disc (DVD)), or semiconductor media (eg, solid state disk (SSD)), etc.
  • B corresponding to A means that B is associated with A.
  • B can be determined based on A.
  • determining B based on A does not mean determining B only based on A.
  • B can also be determined based on A and/or other information.
  • "At least one" means one or more, and "a plurality of" means two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural.
  • The character "/" generally indicates that the associated objects are in an "or" relationship.
  • "At least one of the following" or a similar expression refers to any combination of these items, including any combination of a single item or multiple items. For example, at least one of a, b or c may mean: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b and c may each be singular or plural.
  • The modules and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled professionals may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
  • the disclosed equipment, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules is only a logical function division; in actual implementation, there may be other division manners. For example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, indirect coupling or communication connection of devices or modules, and may be in electrical, mechanical or other forms.
  • Modules described as separate components may or may not be physically separated, and components shown as modules may or may not be physical modules, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, each functional module in each embodiment of the present application can be integrated into a processing module, or each module can exist physically alone, or two or more modules can be integrated into one module.

Abstract

This application provides a data transmission method, apparatus, electronic device, and storage medium, relating to the technical field of data transmission. In the method, a first server receives a request message sent by a second server, where the request message is used to request an audio data stream or a video data stream and includes identification information of a terminal device; the first server configures, according to the request message and a bandwidth detection result, the code rate for transmission of the audio data stream or the code rate for transmission of the video data stream; and the first server sends the audio data stream to the terminal device according to the code rate of the audio data stream and a data transmission protocol, or sends the video data stream to the terminal device according to the code rate of the video data stream and a data transmission protocol. By configuring the code rate of data stream transmission based on the request information and the bandwidth detection result, the method improves the quality of experience of users watching panoramic video and can save network resources.

Description

一种数据传输的方法、装置、电子设备及存储介质
本申请要求于2022年09月19日提交中国专利局、申请号为202211139880.1、申请名称为“一种数据传输的方法、装置、电子设备及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及通信领域,并且具体的,涉及一种数据传输的方法、装置、电子设备及存储介质。
背景技术
全景视频有别于传统视频单一的观看视角，让用户可以360度自由观看。而VR全景视频在此基础上，还允许用户在观看视频时自由移动观看，提供场景中任意位置的360度自由视角，用户在观看全景视频时跟随视频内容沉浸式感受高逼真的场景，获得前所未有的体验。正因如此，全景视频受到越来越多的追捧。为了使用户获得良好的体验，全景视频传输时一般采用高码率传输，需要较高的网络资源。因此，如何传输全景视频，使得用户在观看全景视频时获得较好的体验质量，并且可以节省网络资源，是一个亟待解决的问题。
发明内容
本申请实施例提供了一种数据传输的方法,实现了在节省网络资源的同时,提高用户观看全景视频时的体验感。
第一方面,本申请实施例提供了一种数据传输的方法,所述方法应用于第一服务器,包括:接收第二服务器发送的请求消息,所述请求消息用于请求音频数据流或视频数据流,所述请求消息包括终端设备的标识信息;根据所述请求消息和带宽检测结果,配置所述音频数据流传输的码率或所述视频数据流传输的码率;根据所述音频数据流的码率和数据传输协议,向所述终端设备发送所述音频数据流,或者根据所述视频数据流的码率和数据传输协议,向所述终端设备发送所述视频数据流。
第二方面，本申请实施例提供了一种数据传输的方法，所述方法应用于第二服务器，包括：接收终端设备发送的用于请求数据流的第一请求消息，所述第一请求消息包括终端设备的运动数据；根据所述终端设备的运动数据，确定所述终端设备的视角信息；根据所述视角信息，确定所述数据流的码率信息；向第一服务器发送第二请求消息，所述第二请求消息包括所述码率信息和所述视角信息，所述第二请求消息还包括所述终端设备的标识信息。
第三方面,本申请实施例提供了一种数据传输的装置,包括:
收发单元,用于接收第二服务器发送的请求消息,所述请求消息用于请求音频数据流或视频数据流,所述请求消息包括终端设备的标识信息;
处理单元,用于根据所述请求消息和带宽检测结果,配置数据流传输的码率;
所述收发单元还用于根据所述码率和数据传输协议,向所述终端设备发送所述数据流。
第四方面,本申请实施例提供了一种数据传输的装置,包括:
收发单元,用于接收终端设备发送的用于请求数据流的第一请求消息,所述第一请求消息包括终端设备的运动数据;
处理单元,用于根据所述终端设备的运动数据,确定所述终端设备的视角信息;
所述处理单元还用于根据所述视角信息,确定所述数据流的码率信息;
所述收发单元还用于向第一服务器发送第二请求消息,所述第二请求消息包括所述码率信息和所述视角信息,所述第二请求消息还包括所述终端设备的标识信息。
第五方面，本申请实施例提供了一种电子设备，包括：
处理器,适于实现计算机指令;以及,
存储器,存储有计算机指令,计算机指令适于由处理器加载并执行上述第一方面的方法。
第六方面,本申请实施例提供了一种计算机可读存储介质,该计算机可读存储介质存储有计算机指令,该计算机指令被计算机设备的处理器读取并执行时,使得计算机设备执行上述第一方面的方法。
第七方面,本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述第一方面的方法。
通过上述技术方案,第一服务器根据第二服务器发送的请求消息以及带宽检测结果,确定传输的视频数据流或音频数据流的码率,根据确定的传输视频数据流或音频数据流的码率和数据传输协议向终端设备传输视频数据流或音频数据流来提高用户的体验度,并且分开传输视频数据流或音频数据流可以减小对网络资源的占用。
附图说明
图1为本申请实施例涉及的系统架构的一个可选的示意图;
图2为本申请实施例涉及的系统架构的一个可选的示意图;
图3为本申请实施例提供的一种数据传输的方法的示意性流程图;
图4为本申请实施例提供的一种数据传输的方法的示意性流程图;
图5为本申请实施例提供的一种数据传输的方法的示意性流程图;
图6为本申请实施例提供的一种数据传输的方法的示意性流程图;
图7是本申请实施例的装置的示意性框图;
图8是本申请实施例的装置的示意性框图;
图9是本申请实施例的装置的示意性框图;
图10为本申请实施例提供的电子设备的示意性框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。
图1为本申请实施例涉及的系统架构100的一个可选的示意图。如图1所示,系统架构100包括第二终端110、通信设备120、核心网设备130、服务器140、音频流媒体服务器150、视频流媒体服务器160和第一终端170,其中,不同设备之间可以通过有线或无线方式进行通信交互。
其中，第二终端110可以是显示全景视频的设备，如虚拟现实（Virtual Reality，VR）设备、增强现实（Augmented Reality，AR）设备、混合现实（Mixed Reality，MR）设备或其他类似设备。例如VR设备可以是VR眼镜、VR头显等应用了VR技术的设备，AR设备可以是AR眼镜、AR电视、AR头显等应用了AR技术的设备，MR设备可以是MR眼镜、MR终端、MR头显、MR可穿戴设备等应用了MR技术的设备，但并不局限于此。例如，第二终端也可以是具备显示功能的（云）服务器。
通信设备120主要指可以充当发射源的有源通信设备,是终端通过无线方式接入到网络中的接入设备,主要负责空口侧的无线资源管理、服务质量(quality of service,QoS)管理、数据压缩和加密等。例如:基站NodeB、演进型基站eNodeB、5G移动通信系统或新一代无线(new radio,NR)通信系统中的基站、未来移动通信系统中的基站等。
核心网设备130负责处理转发信息,可以包括4G/5G核心网或其他网关,如用户面功能(user plane function,UPF)网元、接入和移动性管理功能(access  and mobility management function,AMF)网元、会话管理功能(session management function,SMF)网元、策略控制功能(policy control function,PCF)网元等。
服务器140负责接收由第二终端110发来的码率信息和运动数据。服务器140可以根据运动数据确定第二终端110的视角信息。
音频流媒体服务器150负责接收音频流,允许第二终端110和服务器140拉取音频流。
视频流媒体服务器160负责接收视频流,允许第二终端110和服务器140拉取视频流。
应理解,第二终端设备110和服务器,如服务器140、音频流媒体服务器150和视频流媒体服务器160,可以不在同一局域网内。
还应理解,服务器140、音频流媒体服务器150和视频流媒体服务器160可以为云服务器,云服务器是可以提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(Content Delivery Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。
第一终端170可以是捕获视频或图像的设备,如摄像头、传感器、毫米波雷达、激光雷达、PC、(云)服务器等,第一终端将本地高分辨率全景视频传至音频流媒体服务器150和视频流媒体服务器160上。
图2为本申请实施例涉及的另一个系统架构200的一个可选的示意图。如图2所示,系统架构200包括第二终端210、服务器220、音频流媒体服务器230、视频流媒体服务器240和第一终端250。
应理解，系统架构200包括的第二终端210、服务器220、音频流媒体服务器230、视频流媒体服务器240和第一终端250，可以参考系统架构100包括的第二终端110、服务器140、音频流媒体服务器150、视频流媒体服务器160和第一终端170进行理解。系统架构200为设备在同一局域网内的示例性的系统架构图。
在目前的360度视频传输过程中，为了能够获得更好的观看体验，往往需要高码率传输数据，码率就是数据传输时单位时间传送的数据位数，一般用的单位是kbps，即千位每秒。在分辨率一定的情况下，码率与清晰度成正比关系，码率越高，图像越清晰；码率越低，图像越不清晰。但高码率传输数据，会占用更多的网络资源。由于用户头部姿态的变化，用户的视角常常发生变化。因此，在不同的网络带宽下，针对不同视角，如何选择码率传输音视频信息，是一项亟待解决的问题。
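作为一个直观的换算示例（其中码率、带宽的数值仅为说明用途而假设，并非对本申请实施例的限定），下面用一段简单的Python代码说明码率与带宽占用的关系：

# 假设一路全景视频以 20000 kbps（即 20 Mbps）的码率传输
code_rate_kbps = 20000
bits_per_second = code_rate_kbps * 1000           # 每秒传输的比特数
megabytes_per_second = bits_per_second / 8 / 1e6  # 换算为每秒传输的数据量（MB）
print(megabytes_per_second)                       # 输出 2.5，即每秒约 2.5 MB 数据

# 若下行可用带宽为 50 Mbps，则同时传输两路这样的高码率视频约占用 40 Mbps，
# 已接近带宽上限，这也说明了按视角配置码率、节省网络资源的必要性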
因此,本申请提出了一种数据传输方法,该方法应用于服务器,该服务器是向终端设备发送视频数据流或音频数据流的服务器,该服务器可以根据视角信息和/或码率信息,以及带宽检测结果,确定传输的视频数据流或音频数据流的码率,根据确定的传输视频数据流或音频数据流的码率和数据传输协议向终端设备传输视频数据流或音频数据流来提高用户的体验度。
以下结合附图对本申请实施例提供的方案进行描述。
图3为本申请实施例提供的一种数据传输的方法300的示意性流程图。方法300可以由任何具有数据处理能力的电子设备执行。例如,该电子设备可实施为服务器或计算机。下面以该电子设备是第一服务器为例进行说明。如图3所示,方法300可以包括步骤310至330。
310,第一服务器接收第二服务器发送的请求消息,请求消息用于请求音频数据流或视频数据流,请求消息包括终端设备的标识信息。
320,第一服务器根据请求消息和带宽检测结果,配置音频数据流传输的码率或视频数据流传输的码率。
330,第一服务器根据音频数据流的码率和数据传输协议,向终端设备发送音频数据流,或者根据视频数据流的码率和数据传输协议,向终端设备发送视频数据流。
应理解,第一服务器可以是视频流媒体服务器,也可以是音频流媒体服务器,这和第一服务器提供的功能有关。当然,第一服务器可以同时是视频流媒体服务器和音频流媒体服务器,只不过视频流和音频流分开处理传输。
第一服务器可以根据请求消息,以及带宽检测结果,确定传输的数据流的码率,根据确定的传输数据流的码率和数据传输协议向终端设备传输数据流来提高用户的体验度,并且视频数据流和音频数据流是分开传输的,降低了对网络资源的占用率。
可选的,第一服务器为视频流服务器,请求消息用于请求视频数据流,请求消息还包括视角信息,该请求消息用于向第一服务器请求视频流,根据请求消息和带宽检测结果,配置数据流传输的码率,包括:根据视角信息和带宽检测结果,配置第一数据流的传输码率为第一码率,第一数据流为视角范围第一阈值内的数据流;根据视角信息和带宽检测结果,配置第二数据流的传输码率为第二码率,第二数据流为视角范围第一阈值外的数据流,第一码率大于第二码率。
具体而言，用户的视角为一个区域，在视角区域内，用户对处于视角中心区域的图像感知比较敏感，而对处于视角边缘区域的图像感知不够敏感，因此，根据视角信息和带宽检测，配置视角范围第一阈值内对应的第一数据流的传输码率为第一码率，配置视角范围第一阈值外对应的第二数据流的传输码率为第二码率，第一码率大于第二码率，即第一数据流采用较高的码率传输，第二数据流采用较低的码率传输。根据视角信息和带宽检测信息，将视角范围内不同区域的数据流采用不同的码率传输，可以在保障用户体验度的同时，降低网络资源的占用率。
应理解,在带宽足够的情况下,也可以适当提高视角范围第一阈值外对应的第二数据流的传输码率,以便于用户获得更好的观看质量。
还应理解,第一服务器确定的第一码率和第二码率应当在终端设备支持的范围内。例如,事先约定终端设备支持的码率范围,第一服务器可以在该码率范围内配置视角范围第一阈值内对应的第一数据流的传输码率为第一码率,配置视角范围第一阈值外对应的第二数据流的传输码率为第二码率。
可选的,第一服务器为视频流服务器,请求消息用于请求视频数据流,请求消息还包括视角信息和码率信息,该请求消息用于向第一服务器请求视频流,根据请求消息和带宽检测结果,配置数据流传输的码率,包括:根据视角信息、码率信息和带宽检测结果,配置第一数据流的传输码率为第一码率,第一数据流为视角范围第一阈值内的数据流;根据视角信息、码率信息和带宽检测结果,配置第二数据流的传输码率为第二码率,第二数据流为视角范围第一阈值外的数据流,第一码率大于所述第二码率。
具体而言,请求消息中可以包括终端设备请求的码率信息,第一服务器在确定传输视频数据流的码率时,可以结合终端设备请求的码率信息来确定,以避免终端设备不支持第一服务器采用的码率信息的情况。因此,根据视角信息、码率信息和带宽检测,配置视角范围第一阈值内对应的第一数据流的传输码率为第一码率,配置视角范围第一阈值外对应的第二数据流的传输码率为第二码率,第一码率大于第二码率,即第一数据流采用较高的码率传输,第二数据流采用较低的码率传输。根据视角信息、码率信息和带宽检测信息,在终端设备支持的码率范围内,将视角范围内不同区域的数据流采用不同的码率传输,可以在保障用户体验度的同时,降低网络资源的占用率。
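为便于理解上述按视角和带宽配置两档码率的过程，下面给出一段示意性的Python代码（仅为在若干假设下的简化示意，并非本申请实施例的限定实现，其中 inner_share 的取值、CodeRateConfig 等名称均为示例假设）：

from dataclasses import dataclass

@dataclass
class CodeRateConfig:
    first_rate_kbps: int   # 第一码率：视角范围第一阈值内的第一数据流
    second_rate_kbps: int  # 第二码率：视角范围第一阈值外的第二数据流

def configure_code_rate(bandwidth_kbps: int,
                        supported_range_kbps: tuple = None,
                        inner_share: float = 0.7) -> CodeRateConfig:
    """根据带宽检测结果（以及可选的终端请求的码率范围）配置第一码率和第二码率。"""
    first_rate = int(bandwidth_kbps * inner_share)           # 大部分带宽分配给视角阈值内的数据流
    second_rate = int(bandwidth_kbps * (1.0 - inner_share))  # 其余带宽分配给视角阈值外的数据流
    if supported_range_kbps is not None:
        lo, hi = supported_range_kbps
        # 将两档码率限制在终端支持的码率范围内
        first_rate = max(lo, min(first_rate, hi))
        second_rate = max(lo, min(second_rate, hi))
    if second_rate >= first_rate:
        second_rate = max(1, first_rate - 1)  # 保证第一码率大于第二码率
    return CodeRateConfig(first_rate, second_rate)

# 示例：带宽检测结果约为 40 Mbps，终端请求的码率范围为 2~20 Mbps
print(configure_code_rate(40000, supported_range_kbps=(2000, 20000)))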
可选的,视频数据流包括多个视频帧,多个视频帧中的每个视频帧包括I帧、B帧和P帧,其中,I帧采用的数据传输协议是传输控制协议(transmission control protocol,TCP),B帧和P帧采用的数据传输协议是用户数据包协议(user datagram protocol,UDP)。
具体而言，视频帧分为I帧，B帧和P帧。其中，I帧是内部编码帧，也称为关键帧；P帧是前向预测帧，也称为前向参考帧，B帧是双向内插帧，也称为双向参考帧。简单地讲，I帧是一个完整的画面，而P帧和B帧记录的是相对于I帧的变化。如果没有I帧，P帧和B帧就无法解码。为了提高用户的观看质量，I帧采用单独通道传输，使用TCP进行传输，保证传输过程不丢帧，避免视频出现马赛克、模糊等现象；B帧和P帧则通过一个UDP的数据通道进行传输，同时根据网络检测状况进行前向纠错（forward error correction，FEC）冗余。
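下面给出一段示意性的Python代码，说明按帧类型选择传输通道的基本思路（仅为一种简化示意，并非本申请实施例的限定实现；其中接收端地址、端口以及帧类型的表示方式均为示例假设，FEC冗余数据的生成此处从略）：

import socket

SERVER_ADDR = ("192.0.2.1", 9000)  # 示例中假设的接收端地址和端口

def create_channels():
    tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp_sock.connect(SERVER_ADDR)      # I帧通道：TCP，保证传输过程不丢帧
    udp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # B帧/P帧通道：UDP
    return tcp_sock, udp_sock

def send_frame(tcp_sock, udp_sock, frame_type: str, payload: bytes):
    if frame_type == "I":
        tcp_sock.sendall(payload)               # 关键帧通过可靠通道发送
    else:  # "B" 或 "P"
        udp_sock.sendto(payload, SERVER_ADDR)   # 非关键帧可容忍少量丢包
        # 实际系统中可在此处根据网络检测状况附加FEC冗余数据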
可选的,第一服务器为音频流服务器,请求消息用于请求音频数据流,请求消息还包括码率信息,根据请求消息和带宽检测结果,配置数据流传输的码率,包括:根据码率信息和带宽检测结果,配置音频流的传输码率为第三码率。
应理解,第三码率可以是低码率,也可以是高码率。
可选的,音频数据流的数据传输协议是传输控制协议TCP。
可选的,数据流包括时间标识。
具体而言，第一服务器为视频流服务器时，数据流是视频数据流，视频数据流包括时间标识，或者，第一服务器为音频流服务器时，数据流是音频数据流，音频数据流包括时间标识。数据流包括时间标识，以便于在终端设备侧进行音视频同步。
可选的,第一服务器检测第一服务器与终端设备的下行带宽,以获取带宽检测结果。
应理解,下行指的是第一服务器向终端设备发送数据流的过程。
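带宽检测的具体方式本申请实施例不作限定。下面给出一个最简化的下行带宽估计思路的Python示意（其中 send_probe 为示例中假设的发送回调，实际系统中通常会采用更精细的带宽估计算法）：

import time

def estimate_downlink_bandwidth(send_probe, probe_size_bytes: int = 256 * 1024) -> float:
    """通过发送已知大小的探测数据并计时，粗略估计下行可用带宽（单位：kbps）。"""
    start = time.monotonic()
    send_probe(probe_size_bytes)              # 将指定字节数的探测数据发送给终端设备
    elapsed = time.monotonic() - start
    if elapsed <= 0:
        return float("inf")
    return probe_size_bytes * 8 / elapsed / 1000  # 字节数换算为比特，再除以耗时得到 kbps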
图4为本申请实施例提供的一种数据传输的方法400的示意性流程图。方法400可以由任何具有数据处理能力的电子设备执行。例如,该电子设备可实施为服务器或计算机。下面以该电子设备是第二服务器为例进行说明,主要用于根据终端设备发送的运动数据确定用户视角。如图4所示,方法400可以包括步骤410至440。
410,第二服务器接收终端设备发送的用于请求数据流的第一请求消息,第一请求消息包括终端设备的运动数据。
420,第二服务器根据终端设备的运动数据,确定终端设备的视角信息。
430,第二服务器根据视角信息,确定数据流的码率信息。
440,第二服务器向第一服务器发送第二请求消息,第二请求消息包括码率信息、视角信息和终端设备的标识信息。
第二服务器接收终端设备发送的第一请求消息，并向第一服务器发送第二请求消息，第二请求消息包括码率信息和视角信息，以便于第一服务器可以根据视角信息和码率信息，以及带宽检测结果，确定传输的数据流的码率，根据确定的传输数据流的码率向终端设备传输数据流来提高用户的体验度。
可选的,第一请求消息还包括视频码率信息,根据视角信息,确定数据流的码率信息,包括:根据视角信息和视频码率信息,确定数据流的码率信息。
具体而言,第二服务器接收到的终端设备发送的第一请求消息用于请求视频流,第一请求消息包括了视频码率信息,该视频码率信息可以是一个码率范围,第二服务器可以根据视角信息,在该视频码率信息指示的范围内,确定一个码率信息。
可选的，第一请求消息还包括音频码率信息和视频码率信息，根据视角信息，确定终端设备的码率信息，包括：根据视角信息和所述视频码率信息，确定数据流的视频码率信息。该方法还包括：根据音频码率信息，确定数据流的音频码率信息。
具体而言,第二服务器接收到的终端设备发送的第一请求消息用于请求视频流和音频流,第一请求消息包括了视频码率信息,该视频码率信息可以是一个码率范围,第二服务器可以根据视角信息,在该视频码率信息指示的范围内,确定一个码率信息。第一请求消息还包括了音频码率信息,该音频码率信息可以是一个码率范围,第二服务器可以在该音频码率信息指示的范围内,确定一个音频码率信息。
第二服务器收到的第一请求消息请求的内容不一样，这是因为终端设备可以根据业务类型确定第一请求消息用于请求视频流，或者第一请求消息用于请求视频流和音频流。例如，当终端设备运行的业务中音频可忽略时，第一请求消息用于请求视频流，如远程驾驶场景，对于音频的关注度不高。当终端设备运行的业务中音频不可忽略时，第一请求消息用于请求视频流和音频流，如远程视频会议、直播场景，音频流和视频流均非常重要。
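对于第二服务器在终端请求的码率范围内确定码率的过程，下面给出一段示意性的Python代码（仅为一种可能的选取策略示意，并非本申请实施例的限定实现，其中视角宽度与码率的映射关系为示例假设）：

def select_video_rate(view_width_deg: float, rate_range_kbps: tuple) -> int:
    """根据视角信息，在第一请求消息携带的视频码率范围 (min, max) 内确定一个码率。"""
    lo, hi = rate_range_kbps
    # 示例策略：视角越宽，需要覆盖的画面区域越大，选取的码率越高
    ratio = min(max(view_width_deg / 360.0, 0.0), 1.0)
    return int(lo + (hi - lo) * ratio)

# 例如：水平视角约110度，终端请求的码率范围为 2~20 Mbps，
# select_video_rate(110, (2000, 20000)) 约为 7500 kbps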
图5为本申请实施例提供的一种数据传输的方法500的示意性流程图。方法500可以由任何具有数据处理能力的电子设备执行。例如，该电子设备可实施为终端设备或计算机。下面以该电子设备是终端设备为例进行说明。如图5所示，方法500可以包括步骤510至530。
510,终端设备获取运动数据。
运动数据可以是6自由度数据（6 degree of freedom，6DoF）。物体在空间具有六个自由度，即沿x、y、z三个直角坐标轴方向的移动自由度与绕该三个坐标轴的转动自由度。
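下面用一段示意性的Python代码给出一种可能的6DoF运动数据结构，以及由转动自由度推算视角中心的方式（字段命名与推算方式均为示例假设，并非本申请实施例的限定实现）：

from dataclasses import dataclass

@dataclass
class MotionData6DoF:
    x: float      # 沿x轴方向的位置
    y: float      # 沿y轴方向的位置
    z: float      # 沿z轴方向的位置
    yaw: float    # 绕竖直轴的转动角度（度）
    pitch: float  # 俯仰角（度）
    roll: float   # 翻滚角（度）

def viewport_center(motion: MotionData6DoF) -> tuple:
    """由转动自由度推算用户当前视角中心（水平角、俯仰角），供服务器确定视角信息。"""
    return motion.yaw % 360.0, max(-90.0, min(90.0, motion.pitch))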
520,终端设备根据业务类型,向第二服务器发送第一请求消息,第一请求消息用于请求视频流,或者第一请求消息用于请求视频流和音频流,第一请求消息包括运动数据。
终端设备根据业务类型,向第二服务器请求不同的内容,可以有效缓解终端设备和流媒体服务器之间的传输带宽的压力,提高传输效率。
可选的,第一请求消息用于请求视频流,第一请求消息还包括视频码率信息。
可选的,第一请求消息用于请求视频流和音频流,第一请求消息还包括视频码率信息和音频码率信息。
可选的,该方法还包括终端设备接收音频数据流,该音频数据流包括时间标识;终端设备接收视频数据流,该视频数据流包括时间标识;终端设备根据时间标识,同步视频数据和音频数据。
上文图3至图5,是从单侧的角度详细描述了本申请的方法实施例。为了更清楚的理解本申请实施例,下面通过图6从交互的角度描述本申请实施例提供的一种数据传输的方法。如图6所示,图6为本申请实施例提供的一种数据传输的方法600的示意性流程图。方法600可以包括步骤601至612。
601,第一终端将本地高分辨率全景视频的视频流和音频流进行分离,得到音频流和视频流。
音频和视频单独处理,一是可以缓解第一终端和流媒体服务器之间的传输带宽的压力,二是根据业务需要,后续可以在第二终端按需选择,提高传输效率。
602,视频流媒体服务器开启带宽检测,识别并测量当前下行链路的最大可用带宽。
视频流媒体服务器可以周期性的进行带宽检测,也可以实时性的进行带宽检测。
603,音频流媒体服务器开启带宽检测,识别并测量当前下行链路的最大可用带宽。
音频流媒体服务器可以周期性的进行带宽检测,也可以实时性的进行带宽检测。
604,第一终端设备将视频流推送至视频流媒体服务器。
605,第一终端设备将音频流推送至音频流媒体服务器。
应理解，本申请对流媒体服务器的类型不作限定，流媒体服务器可以是实时流传输协议（real-time streaming protocol，RTSP）服务器，实时信息传送协议（real-time messaging protocol，RTMP）服务器，超文本传输协议（hyper text transfer protocol，HTTP）服务器等中的一种。
606，第二终端设备确定运动数据。
607,第二终端设备向服务器发送第一请求消息。
第二终端设备根据业务类型,向服务器发送第一请求消息,第一请求消息用于请求视频流,或者第一请求消息用于请求视频流和音频流,第一请求消息包括运动数据,第一请求消息还包括视频码率信息,或者第一请求消息还包括视频码率信息和音频码率信息。
608,服务器向视频流媒体服务器发送请求视频数据流的第二请求消息,第二请求消息包括视角信息,或者第二请求消息包括视角信息和视频流码率信息。
服务器基于第二终端设备的运动数据，确定第二终端设备的视角信息，并根据确定的视角信息确定视频流码率信息；随后，服务器将视角信息和码率信息发送至视频流媒体服务器，或者仅将视角信息发送至视频流媒体服务器。
609,服务器向音频流媒体服务器发送请求音频数据流的第三请求消息,第三请求消息包括音频流码率信息。
610,视频流媒体服务器根据第二请求消息和带宽检测结果,确定视频数据流传输的码率和采用的视频传输协议,并向第二终端设备发送视频数据流。
该步骤可以参考方法300中的对应段落进行理解,此处不再赘述。
611，音频流媒体服务器根据第三请求消息和带宽检测结果，确定音频数据流传输的码率和采用的音频传输协议，并向第二终端设备发送音频数据流。
612,第二终端设备接收视频数据流和音频数据流,并进行音视频同步,播放音视频。
应理解，同步的原则如下：
以第二终端设备接收到的音频数据流和视频数据流的时间标识信息PTS为基准进行同步。若存在音频流，则以音频流时钟为基准时钟，同步视频流，如：若音频慢，则丢弃部分视频帧或增加时延；若视频慢，则丢弃部分非I帧视频帧。
对于起播阶段，特别是实时数据流，由于视频解码需要依赖第一个I帧，而音频可以实时输出，可能出现视频PTS超前音频PTS较多的情况，此时若直接同步，势必造成较为明显的慢同步。处理这种情况的较好方法是将多余的音频数据丢弃，尽量减少起播阶段的音视频差距。
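下面给出一段示意性的Python代码，说明以音频时钟为基准的同步判断以及起播阶段丢弃多余音频数据的思路（阈值取值、帧对象的属性名均为示例假设，并非本申请实施例的限定实现）：

SYNC_THRESHOLD_MS = 40  # 示例阈值：音视频PTS差超过该值才进行调整

def sync_step(audio_pts_ms: int, video_frame) -> str:
    """video_frame 需提供 pts_ms 与 frame_type 两个属性（示例中假设的接口）。"""
    diff = video_frame.pts_ms - audio_pts_ms
    if abs(diff) <= SYNC_THRESHOLD_MS:
        return "render"   # 差距在阈值内，直接渲染
    if diff > 0:
        return "wait"     # 视频超前（音频慢）：增加时延等待音频时钟
    if video_frame.frame_type != "I":
        return "drop"     # 视频落后：丢弃非I帧以追赶音频时钟
    return "render"       # I帧不丢弃，直接渲染

def drop_startup_audio(audio_pts_ms: int, first_video_pts_ms: int) -> bool:
    """起播阶段：若音频PTS明显早于第一个I帧的PTS，则丢弃这部分多余的音频数据。"""
    return audio_pts_ms < first_video_pts_ms - SYNC_THRESHOLD_MS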
在上述描述中，第二终端设备需要向服务器发送运动数据，由服务器根据运动数据确定视角信息，进而确定码率信息，向音频流媒体服务器或视频流媒体服务器发送请求消息。应理解，在第二终端设备具备计算功能时，第二终端设备可以根据运动数据确定视角信息和码率信息，第二终端设备向音频流媒体服务器或视频流媒体服务器发送请求消息，该请求消息包括视角信息和/或码率信息，即第二终端设备可以兼具服务器的功能。
以上结合附图详细描述了本申请的具体实施方式,但是,本申请并不限于上述实施方式中的具体细节,在本申请的技术构思范围内,可以对本申请的技术方案进行多种简单变型,这些简单变型均属于本申请的保护范围。例如,在上述具体实施方式中所描述的各个具体技术特征,在不矛盾的情况下,可以通过任何合适的方式进行组合,为了避免不必要的重复,本申请对各种可能的组合方式不再另行说明。又例如,本申请的各种不同的实施方式之间也可以进行任意组合,只要其不违背本申请的思想,其同样应当视为本申请所公开的内容。
还应理解,在本申请的各种方法实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。应理解这些序号在适当情况下可以互换,以便描述的本申请的实施例能够以除了在图示或描述的那些以外的顺序实施。
上文结合图1至图6,详细描述了本申请的方法实施例,下文结合图7至图10,详细描述本申请的装置实施例。
图7是本申请实施例的装置700的示意性框图,该装置700可以实现上述方法中第一服务器,即视频流媒体服务器或音频流媒体服务器的功能。如图7所示,装置700可包括收发单元710和处理单元720。
收发单元710,用于接收第二服务器发送的请求消息,所述请求消息用于请求音频数据流或视频数据流,所述请求消息包括终端设备的标识信息。
处理单元720,用于根据所述请求消息和带宽检测结果,配置所述音频数据流传输的码率或所述视频数据流传输的码率。
所述收发单元710还用于根据所述音频数据流的码率和数据传输协议，向所述终端设备发送所述音频数据流，或者根据所述视频数据流的码率和数据传输协议，向所述终端设备发送所述视频数据流。
在一些实施例中,所述请求消息用于请求视频数据流,所述请求消息还包括视角信息,所述处理单元720具体用于:
根据所述视角信息和所述带宽检测结果,配置第一数据流的传输码率为第一码率,所述第一数据流为视角范围第一阈值内的数据流;
根据所述视角信息和所述带宽检测结果，配置第二数据流的传输码率为第二码率，所述第二数据流为视角范围第一阈值外的数据流，所述第一码率大于所述第二码率。
在一些实施例中,所述请求消息用于请求视频数据流,所述请求消息还包括视角信息和码率信息,所述处理单元720具体用于:
根据所述视角信息、所述码率信息和所述带宽检测结果,配置第一数据流的传输码率为第一码率,所述第一数据流为视角范围第一阈值内的数据流;根据所述视角信息、所述码率信息和所述带宽检测结果,配置第二数据流的传输码率为第二码率,所述第二数据流为视角范围第一阈值外的数据流,所述第一码率大于所述第二码率。
在一些实施例中,所述视频数据流包括多个视频帧,所述多个视频帧中的每个视频帧包括I帧、B帧和P帧,其中,所述I帧采用的数据传输协议是传输控制协议,所述B帧和所述P帧采用的数据传输协议是用户数据包协议。
在一些实施例中,所述请求消息用于请求音频数据流,所述请求消息还包括码率信息,所述处理单元720具体用于:根据所述码率信息和所述带宽检测结果,配置所述音频流的传输码率为第三码率。
在一些实施例中,所述音频数据流的数据传输协议是传输控制协议。
在一些实施例中,所述视频数据流包括时间标识,所述音频数据流包括时间标识。
在一些实施例中,所述处理单元720还用于检测所述第一服务器与所述终端设备的下行带宽,以获取所述带宽检测结果。
应理解,装置实施例与方法实施例可以相互对应,类似的描述可以参照方法实施例。为避免重复,此处不再赘述。具体地,当在该实施例中数据处理的装置700可以对应于执行本申请实施例的方法300的执行主体时,装置700中的各个模块的前述和其它操作和/或功能分别为了实现图3中的各个方法相应流程,为了简洁,在此不再赘述。
图8是本申请实施例的装置800的示意性框图,该装置800可以实现上述方法中第二服务器,即运动服务器的功能。如图8所示,装置800可包括收发单元810和处理单元820。
收发单元810,用于接收终端设备发送的用于请求数据流的第一请求消息,所述第一请求消息包括终端设备的运动数据。
处理单元820,用于根据所述终端设备的运动数据,确定所述终端设备的视角信息。
处理单元820,用于根据所述视角信息,确定所述数据流的码率信息。
收发单元810，用于向第一服务器发送第二请求消息，所述第二请求消息包括所述码率信息、所述视角信息和所述终端设备的标识信息。
在一些实施例中,所述第一请求消息还包括视频码率信息,
所述处理单元820具体用于:根据所述视角信息和所述视频码率信息,确定所述数据流的码率信息。
在一些实施例中,所述第一请求消息还包括音频码率信息和视频码率信息,所述处理单元具体用于:根据所述视角信息和所述视频码率信息,确定所述数据流的视频码率信息;
所述处理单元820还用于:根据所述音频码率信息,确定所述数据流的音频码率信息。
应理解,装置实施例与方法实施例可以相互对应,类似的描述可以参照方法实施例。为避免重复,此处不再赘述。具体地,当在该实施例中数据处理的装置800可以对应于执行本申请实施例的方法400的执行主体时,装置800中的各个模块的前述和其它操作和/或功能分别为了实现图4中的各个方法相应流程,为了简洁,在此不再赘述。
图9是本申请实施例的装置900的示意性框图,该装置900可以实现上述方法中终端设备,即第二终端设备的功能。如图9所示,装置900可包括处理单元910和收发单元920。
处理单元910,用于获取运动数据。
收发单元920,用于根据业务类型,向第二服务器发送第一请求消息,第一请求消息用于请求视频流,或者第一请求消息用于请求视频流和音频流,第一请求消息包括运动数据。
在一些实施例中,第一请求消息用于请求视频流,第一请求消息还包括视频码率信息。
在一些实施例中,第一请求消息用于请求视频流和音频流,第一请求消息还包括视频码率信息和音频码率信息。
在一些实施例中，收发单元920还用于接收音频数据流，该音频数据流包括时间标识；收发单元920还用于接收视频数据流，该视频数据流包括时间标识；处理单元910还用于根据时间标识，同步视频数据和音频数据。
应理解,装置实施例与方法实施例可以相互对应,类似的描述可以参照方法实施例。为避免重复,此处不再赘述。具体地,当在该实施例中数据处理的装置900可以对应于执行本申请实施例的方法500的执行主体时,装置900中的各个模块的前述和其它操作和/或功能分别为了实现图5中的各个方法相应流程,为了简洁,在此不再赘述。
上文中结合附图从功能模块的角度描述了本申请实施例的装置和系统。 应理解,该功能模块可以通过硬件形式实现,也可以通过软件形式的指令实现,还可以通过硬件和软件模块组合实现。具体地,本申请实施例中的方法实施例的各步骤可以通过处理器中的硬件的集成逻辑电路和/或软件形式的指令完成,结合本申请实施例公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。可选地,软件模块可以位于随机存储器,闪存、只读存储器、可编程只读存储器、电可擦写可编程存储器、寄存器等本领域的成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法实施例中的步骤。
如图10是本申请实施例提供的电子设备1000的示意性框图。
如图10所示,该电子设备1000可包括:
存储器1010和处理器1020,该存储器1010用于存储计算机程序,并将该程序代码传输给该处理器1020。换言之,该处理器1020可以从存储器1010中调用并运行计算机程序,以实现本申请实施例中的方法。
例如,该处理器1020可用于根据该计算机程序中的指令执行上述方法300中各执行主体的步骤。
在本申请的一些实施例中,该处理器1020可以包括但不限于:
通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等等。
在本申请的一些实施例中,该存储器1010包括但不限于:
易失性存储器和/或非易失性存储器。其中,非易失性存储器可以是只读存储器(Read-Only Memory,ROM)、可编程只读存储器(Programmable ROM,PROM)、可擦除可编程只读存储器(Erasable PROM,EPROM)、电可擦除可编程只读存储器(Electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(Random Access Memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(Static RAM,SRAM)、动态随机存取存储器(Dynamic RAM,DRAM)、同步动态随机存取存储器(Synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(Double Data Rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(Enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synch link DRAM,SLDRAM)和直接内存总线随机存取存储器(Direct Rambus RAM,DR RAM)。
在本申请的一些实施例中,该计算机程序可以被分割成一个或多个模块,该一个或者多个模块被存储在该存储器1010中,并由该处理器1020执行,以完成本申请提供的方法。该一个或多个模块可以是能够完成特定功能的一系列计算机程序指令段,该指令段用于描述该计算机程序在该电子设备1000中的执行过程。
可选的,该电子设备1000还可包括:
通信接口1030,该通信接口1030可连接至该处理器1020或存储器1010。
其中,处理器1020可以控制该通信接口1030与其他设备进行通信,具体地,可以向其他设备发送信息或数据,或接收其他设备发送的信息或数据。示例性的,通信接口1030可以包括发射机和接收机。通信接口1030还可以进一步包括天线,天线的数量可以为一个或多个。
应当理解,该电子设备1000中的各个组件通过总线系统相连,其中,总线系统除包括数据总线之外,还包括电源总线、控制总线和状态信号总线。
根据本申请的一个方面，提供了一种通信装置，包括处理器和存储器，该存储器用于存储计算机程序，该处理器用于调用并运行所述存储器中存储的计算机程序，使得所述通信装置执行上述方法实施例的方法。
根据本申请的一个方面,提供了一种计算机存储介质,其上存储有计算机程序,该计算机程序被计算机执行时使得该计算机能够执行上述方法实施例的方法。或者说,本申请实施例还提供一种包含指令的计算机程序产品,该指令被计算机执行时使得计算机执行上述方法实施例的方法。
根据本申请的另一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述方法实施例的方法。
换言之，当使用软件实现时，可以全部或部分地以计算机程序产品的形式实现。该计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行该计算机程序指令时，全部或部分地产生按照本申请实施例所述的流程或功能。该计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。该计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一个计算机可读存储介质传输，例如，该计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线（例如同轴电缆、光纤、数字用户线（digital subscriber line，DSL））或无线（例如红外、无线、微波等）方式向另一个网站站点、计算机、服务器或数据中心进行传输。该计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质（例如，软盘、硬盘、磁带）、光介质（例如数字视频光盘（digital video disc，DVD））、或者半导体介质（例如固态硬盘（solid state disk，SSD））等。
应理解,在本申请实施例中,“与A对应的B”表示B与A相关联。在一种实现方式中,可以根据A确定B。但还应理解,根据A确定B并不意味着仅仅根据A确定B,还可以根据A和/或其它信息确定B。
在本申请的描述中,除非另有说明,“至少一个”是指一个或多个,“多个”是指两个或多于两个。另外,“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
还应理解,本申请实施例中出现的第一、第二等描述,仅作示意与区分描述对象之用,没有次序之分,也不表示本申请实施例中对设备个数的特别限定,不能构成对本申请实施例的任何限制。
还应理解,说明书中与实施例有关的特定特征、结构或特性包括在本申请的至少一个实施例中。此外,这些特定的特征、结构或特性可以任意适合的方式结合在一个或多个实施例中。
此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、系统、产品或服务器不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
可以理解的是,在本申请的具体实施方式中,可能涉及到用户信息等相关的数据。当本申请以上实施例运用到具体产品或技术中时,需要获得用户许可或者同意,且相关数据的收集、使用和处理需要遵守相关国家和地区的相关法律法规和标准。
本领域普通技术人员可以意识到，结合本文中所公开的实施例描述的各示例的模块及算法步骤，能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行，取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能，但是这种实现不应认为超出本申请的范围。
在本申请所提供的几个实施例中,应该理解到,所揭露的设备、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,该模块的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个模块或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或模块的间接耦合或通信连接,可以是电性,机械或其它的形式。
作为分离部件说明的模块可以是或者也可以不是物理上分开的,作为模块显示的部件可以是或者也可以不是物理模块,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。例如,在本申请各个实施例中的各功能模块可以集成在一个处理模块中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个模块中。
以上仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以该权利要求的保护范围为准。

Claims (15)

  1. 一种数据传输的方法,所述方法应用于第一服务器,其特征在于,包括:
    接收来自第二服务器的请求消息,所述请求消息用于请求音频数据流或视频数据流,所述请求消息包括终端设备的标识信息;
    根据所述请求消息和带宽检测结果,配置所述音频数据流传输的码率或所述视频数据流传输的码率;
    根据所述音频数据流的码率和数据传输协议,向所述终端设备发送所述音频数据流,或者根据所述视频数据流的码率和数据传输协议,向所述终端设备发送所述视频数据流。
  2. 根据权利要求1所述的方法,其特征在于,所述请求消息用于请求视频数据流,所述请求消息还包括视角信息,所述根据所述请求消息和带宽检测结果,配置数据流传输的码率,包括:
    根据所述视角信息和所述带宽检测结果,配置第一数据流的传输码率为第一码率,所述第一数据流为视角范围第一阈值内的数据流;
    根据所述视角信息和所述带宽检测结果,配置第二数据流的传输码率为第二码率,所述第二数据流为视角范围第一阈值外的数据流,所述第一码率大于所述第二码率。
  3. 根据权利要求1所述的方法,其特征在于,所述请求消息用于请求视频数据流,所述请求消息还包括视角信息和码率信息,所述根据所述请求消息和带宽检测结果,配置数据流传输的码率,包括:
    根据所述视角信息、所述码率信息和所述带宽检测结果,配置第一数据流的传输码率为第一码率,所述第一数据流为视角范围第一阈值内的数据流;
    根据所述视角信息、所述码率信息和所述带宽检测结果,配置第二数据流的传输码率为第二码率,所述第二数据流为视角范围第一阈值外的数据流,所述第一码率大于所述第二码率。
  4. 根据权利要求2或3所述的方法,其特征在于,所述视频数据流包括多个视频帧,所述多个视频帧中的每个视频帧包括I帧、B帧和P帧,
    其中,所述I帧采用的数据传输协议是传输控制协议,所述B帧和所述P帧采用的数据传输协议是用户数据包协议。
  5. 根据权利要求1所述的方法,其特征在于,所述请求消息用于请求 音频数据流,所述请求消息包括码率信息,所述根据所述请求消息和带宽检测结果,配置数据流传输的码率,包括:
    根据所述码率信息和所述带宽检测结果,配置所述音频流的传输码率为第三码率。
  6. 根据权利要求5所述的方法,其特征在于,所述音频数据流的数据传输协议是传输控制协议。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述音频数据流包括时间标识,所述视频数据流包括时间标识。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,还包括:
    检测所述第一服务器与所述终端设备的下行带宽,以获取所述带宽检测结果。
  9. 一种数据传输的方法,所述方法应用于第二服务器,其特征在于,包括:
    接收来自终端设备的用于请求数据流的第一请求消息,所述第一请求消息包括所述终端设备的运动数据;
    根据所述终端设备的运动数据,确定所述终端设备的视角信息;
    根据所述视角信息,确定所述数据流的码率信息;
    向第一服务器发送第二请求消息,所述第二请求消息包括所述码率信息、所述视角信息和所述终端设备的标识信息。
  10. 根据权利要求9所述的方法,其特征在于,所述第一请求消息还包括视频码率信息,
    所述根据所述视角信息,确定所述数据流的码率信息,包括:
    根据所述视角信息和所述视频码率信息,确定所述数据流的码率信息。
  11. 根据权利要求9所述的方法,其特征在于,所述第一请求消息还包括音频码率信息和视频码率信息,
    所述根据所述视角信息,确定所述终端设备的码率信息,包括:
    根据所述视角信息和所述视频码率信息,确定所述数据流的视频码率信息;
    所述方法还包括:
    根据所述音频码率信息,确定所述数据流的音频码率信息。
  12. 一种数据传输的装置,其特征在于,包括:
    收发单元,用于接收来自第二服务器的请求消息,所述请求消息用于请求音频数据流或视频数据流,所述请求消息包括终端设备的标识信息;
    处理单元,用于根据所述请求消息和带宽检测结果,配置数据流传输的码率;
    所述收发单元还用于根据所述码率和数据传输协议,向所述终端设备发送所述数据流。
  13. 一种数据传输的装置,其特征在于,包括:
    收发单元,用于接收来自终端设备的用于请求数据流的第一请求消息,所述第一请求消息包括所述终端设备的运动数据;
    处理单元,用于根据所述终端设备的运动数据,确定所述终端设备的视角信息;
    所述处理单元还用于根据所述视角信息,确定所述数据流的码率信息;
    所述收发单元还用于向第一服务器发送第二请求消息,所述第二请求消息包括所述码率信息、所述视角信息和所述终端设备的标识信息。
  14. 一种电子设备,其特征在于,包括处理器和存储器,所述存储器中存储有指令,所述处理器运行所述指令时,使得所述处理器执行权利要求1-8任一项所述的方法,或者执行权利要求9-11任一项所述的方法。
  15. 一种计算机存储介质,其特征在于,包括指令,当其在计算机上运行时,使得所述计算机执行权利要求1-8任一项所述的方法,或者执行权利要求9-11任一项所述的方法。
PCT/CN2023/100759 2022-09-19 2023-06-16 一种数据传输的方法、装置、电子设备及存储介质 WO2024060719A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211139880.1A CN117768669A (zh) 2022-09-19 2022-09-19 一种数据传输的方法、装置、电子设备及存储介质
CN202211139880.1 2022-09-19

Publications (1)

Publication Number Publication Date
WO2024060719A1 true WO2024060719A1 (zh) 2024-03-28

Family

ID=90318589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/100759 WO2024060719A1 (zh) 2022-09-19 2023-06-16 一种数据传输的方法、装置、电子设备及存储介质

Country Status (2)

Country Link
CN (1) CN117768669A (zh)
WO (1) WO2024060719A1 (zh)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190246104A1 (en) * 2016-10-26 2019-08-08 Autel Robotics Co., Ltd. Panoramic video processing method, device and system
CN107529064A (zh) * 2017-09-04 2017-12-29 北京理工大学 一种基于vr终端反馈的自适应编码方法
CN110519652A (zh) * 2018-05-22 2019-11-29 华为软件技术有限公司 Vr视频播放方法、终端及服务器
CN114651449A (zh) * 2020-04-26 2022-06-21 华为技术有限公司 一种流媒体参数动态自适应网络的调整方法及装置
CN115022546A (zh) * 2022-05-31 2022-09-06 咪咕视讯科技有限公司 全景视频传输方法、装置、终端设备以及存储介质

Also Published As

Publication number Publication date
CN117768669A (zh) 2024-03-26

Similar Documents

Publication Publication Date Title
US11770594B2 (en) 360-degree video delivery over next generation network
JP5746392B2 (ja) モバイルデバイスからワイヤレスディスプレイにコンテンツを送信するシステムおよび方法
US11368731B2 (en) Method and apparatus for segmenting data
JP2020519094A (ja) ビデオ再生方法、デバイス、およびシステム
JP6663437B2 (ja) Mmtpストリームをmpeg−2 tsに変換する方法及び装置
JP2018509060A5 (zh)
US11956159B2 (en) Transmission device, transmission method, reception device, and reception method
JP7063985B2 (ja) 上りリンクストリーミング向けのネットワーク支援
CN108574816B (zh) 一种视联网终端以及基于视联网终端的通信方法、装置
JP2015138990A (ja) 受信装置、送信装置及び通信システム
US10165311B2 (en) Non-transitory computer-readable recording medium having program recorded therein for providing network-adaptive content and apparatus for providing network-adaptive content
JP2014513452A (ja) デジタルテレビ技術を実施するための方法およびWi−Fiホットスポット装置
WO2024060719A1 (zh) 一种数据传输的方法、装置、电子设备及存储介质
JP5376350B2 (ja) チャネル切替方法、デバイス、およびシステム
WO2022206016A1 (zh) 一种数据分层传输方法、装置及系统
WO2018171567A1 (zh) 播放媒体流的方法、服务器及终端
US11265357B2 (en) AV1 codec for real-time video communication
KR101883554B1 (ko) Mmt-기반 방송을 위한 시그널 메시지 송출 스케줄링 방법
US11558776B2 (en) Devices and system for transmitting and receiving compressed bitstream via wireless stream and handling transmission error
EP4319176A1 (en) Method for transmitting streaming media data and related device
KR101823377B1 (ko) 시점 예측에 따라 동영상을 제공하는 미디어 서버
CN117956170A (zh) 一种数据传输的方法、装置、电子设备及存储介质
WO2024088599A1 (en) Transporting multimedia immersion and interaction data in a wireless communication system
WO2024088600A1 (en) Split-rendering configuration for multimedia immersion and interaction data in a wireless communication system