CN115037958B - MEC collaborative transmission VR video method based on 5G network - Google Patents
MEC collaborative transmission VR video method based on 5G network
- Publication number: CN115037958B
- Application number: CN202210281030.9A
- Authority
- CN
- China
- Prior art keywords
- mec
- video
- home
- view
- code
- Prior art date
- Legal status: Active (an assumption by Google, not a legal conclusion)
Classifications
- H04N 21/23106 — content storage operations involving caching
- H04L 65/80 — responding to QoS in real-time packet communication
- H04N 13/194 — transmission of stereoscopic/multi-view image signals
- H04N 13/366 — image reproducers using viewer tracking
- H04N 21/2393 — interfacing the upstream path: handling client requests
- H04N 21/43637 — adapting the video stream to a wireless local network
- H04N 21/64769 — network-issued control signals directed to the server for rate control
- Y02D 30/70 — reducing energy consumption in wireless communication networks
Abstract
The invention discloses a method for MEC collaborative transmission of VR video based on a 5G network, and relates to the technical field of VR video transmission. The invention proposes an MEC collaboration domain: a plurality of base stations are grouped into one collaboration area according to regional characteristics, and the MECs in the domain cooperate to jointly process user content requests within the domain. One low-bitrate full-view version of each VR video is stored per collaboration domain, and each MEC independently runs a caching strategy for high-bitrate tile blocks. Specifically, whether to cache the high-bitrate version of a tile block is determined by the request frequency of the FOVs containing that tile among users in the MEC's jurisdiction and by the similarity between those FOVs and the preferences of those users. The invention overcomes the limited computing and storage capability of a single MEC: multiple MECs jointly process user requests, making full use of MEC computing and storage resources, increasing the maximum number of users that can be served, reducing the average content request delay, and reducing backhaul link traffic.
Description
Technical Field
The invention relates to the technical field of VR video transmission, and in particular to a method for MEC (Multi-access Edge Computing) collaborative transmission of VR video based on a 5G network.
Background
VR video transmission mainly follows two technical routes: transmission based on the full view, and transmission based on the Field of View (FOV).

The full-view transmission scheme transmits the entire 360° surround picture to the terminal; when the user turns their head to switch the picture, all processing is completed locally on the terminal. At the same monocular resolution, the bitrate of VR video is much larger than that of ordinary planar video because of frame rate, quantization level, 360° surround, and other factors; the former is typically 5-10 times the latter, which poses a great challenge for transmission and greatly increases cost.

However, a user watching a video has a limited viewing angle and cannot see the whole content at once, so the full-view transmission scheme wastes a great deal of bandwidth.

The FOV transmission scheme mainly transmits the picture visible within the current viewing angle. Generally, the 360° panorama is divided into a plurality of views, and each view generates a video file containing only high-resolution visual information within that view and low-resolution information for the surrounding parts; the terminal requests the corresponding view file from the server according to the user's current head pose. A typical example is the TWS (Tile-Wise Streaming) scheme, a block-based adaptive transmission scheme.
Chinese patent publication CN110266664A, published on September 20, 2019 and titled "Cloud VR video live broadcast system based on 5G and MEC", discloses a system in which the content layer comprises a VR video shooting system that provides real-time VR video content to the platform layer; the platform layer comprises a VR live broadcast system that imports, transcodes, slices, and streams VR video in real time in the Cloud VR video live broadcast; the network layer, based on an operator 5G/4G wireless-network MEC system, provides stable transmission for the Cloud VR video service and realizes data interaction between the platform layer and the terminal layer and between the platform layer and the content layer; and the terminal layer comprises a Cloud VR terminal that realizes VR video content presentation, network access, and user authentication, accesses the MEC system through the network layer, and directly accesses VR video streams in the cloud platform layer or the MEC system through the local breakout function.
In the prior art, an MEC server is deployed near the base station and provides computing and storage resources at the edge of the network to give users local support and improve their quality of experience. However, the computing and storage capability of a single MEC server is very limited compared with the cloud center. In order to reduce backhaul link traffic and better promote user QoE (Quality of Experience), the present application provides a method for MEC collaborative transmission of VR video based on a 5G network.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the invention provides a method for MEC collaborative transmission of VR video based on a 5G network, which aims to reduce backhaul link traffic and better promote user QoE. The invention proposes an MEC collaboration domain: a plurality of base stations are grouped into one collaboration area according to regional characteristics, and the MECs in the domain cooperate to jointly process user content requests within the domain. One low-bitrate full-view version of each VR video is stored per collaboration domain, and each MEC independently runs a caching strategy for high-bitrate tile blocks. Specifically, whether to cache the high-bitrate version of a tile block is determined by the request frequency of the FOVs containing that tile among users in the MEC's jurisdiction and by the similarity between those FOVs and the preferences of those users. The invention overcomes the limited computing and storage capability of a single MEC: multiple MECs jointly process user requests, making full use of MEC computing and storage resources, increasing the maximum number of users that can be served, reducing the average content request delay, and reducing backhaul link traffic.

In order to solve the problems in the prior art, the invention is realized by the following technical scheme.
A method for MEC collaborative transmission of VR video based on a 5G network, the method comprising the following steps:

S1, the central cloud distributes VR video content to the MEC collaboration domain. The MEC collaboration domain is formed by grouping the wireless-access-point base stations of several 5G fronthaul networks into one collaboration area according to regional characteristics, deploying an MEC server near each base station, and establishing communication connections among the MEC servers in the area.

S2, the MEC servers in the MEC collaboration domain cooperate to process the user-requested field of view, specifically comprising the following steps:

S201, the VR terminal device tracks the motion trajectory of the user's head. When the user's viewing angle changes, the device must request the set of high-bitrate tile blocks corresponding to the new view angle and the low-bitrate full-view video file corresponding to it; if the video files required by the new view angle are missing from the local cache of the VR terminal device, proceed to S202.

S202, the VR terminal device sends a video request for the view angle to its Home MEC. Requests fall into two types according to whether the device has partially cached the high-bitrate tile blocks of the view angle: 1) no partial cache: the request covers all high-bitrate tile blocks of the view angle and the corresponding low-bitrate full-view video; 2) partial cache: the request covers only the high-bitrate tile blocks the device is missing.
If the Home MEC stores the requested content, it responds to the VR terminal device directly.

If the Home MEC does not store the requested content, it forwards the video request to the other MECs in the MEC collaboration domain.

If another MEC stores the requested content, it responds to the Home MEC, which, upon receipt, responds to the VR terminal device.

If no MEC server in the MEC collaboration domain stores the requested content, the Home MEC forwards the request to the central cloud, obtains the content, and responds to the VR terminal device.

When the Home MEC obtains the requested video content from another MEC server in the domain or from the central cloud, it caches what it obtained. When its cache space is insufficient, it computes the popularity of the currently requested content and of the already-cached content, and decides by popularity whether to delete cached content in order to cache the current request.

S203, having prepared the video requested by the VR terminal device through step S202, the Home MEC responds to the device; at the same time, it pushes to the device the high-bitrate tile blocks of the other view angles most frequently requested together with the currently requested view angle.

S204, once the VR terminal device has the set of high-bitrate tile blocks and the low-bitrate full-view video file required by the current view angle, it decodes the full-view video file, independently decodes each high-bitrate tile block, and then renders and plays. Tile blocks pushed by the Home MEC are stored by the VR terminal device under a FIFO buffering strategy.
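The lookup cascade of step S202 (Home MEC, then the other MECs in the collaboration domain, then the central cloud) can be sketched as follows. All class and function names are illustrative, and the admission step is deliberately simplified — the popularity-based replacement described in S202 is omitted here:

```python
# Illustrative sketch of the S202 lookup cascade; names are assumptions.

class MEC:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # view_id -> video content

    def admit(self, view_id, content):
        # Simplified admission: cache only while space remains; the real
        # policy compares popularities and may evict (see S202).
        if len(self.cache) < self.capacity:
            self.cache[view_id] = content


def handle_request(home_mec, domain_mecs, central_cloud, view_id):
    """Resolve a view-angle request: Home MEC, then the collaboration
    domain, then the central cloud. Returns (content, source)."""
    if view_id in home_mec.cache:                 # hit on the Home MEC
        return home_mec.cache[view_id], "home"
    for mec in domain_mecs:                       # ask the other MECs
        if view_id in mec.cache:
            content = mec.cache[view_id]
            home_mec.admit(view_id, content)      # Home MEC caches the fetch
            return content, "domain"
    content = central_cloud[view_id]              # fall back to the cloud
    home_mec.admit(view_id, content)
    return content, "cloud"
```

A second request for the same view angle then hits the Home MEC directly, which is the mechanism by which the collaboration domain reduces backhaul traffic.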
Further, in step S1, if the central cloud has learned the VR video-type preferences of the users in the MEC collaboration domain, it issues to the domain, according to those preferences, the high-bitrate tile sets of the hotspot view angles in the relevant VR videos together with their low-bitrate full-view videos.

Further, if the central cloud has no preference information for the users in the MEC collaboration domain, it issues to the domain the high-bitrate tile sets of the hotspot view angles in the most popular VR videos, together with their low-bitrate full-view videos, according to network-wide VR video popularity.

Further, in step S201, if the set of high-bitrate tile blocks of the new view angle is stored in the local cache of the VR terminal device, the device directly decodes the low-bitrate full-view video file, independently decodes each high-bitrate tile block, and then renders and plays.

Further, in step S202, if the popularity comparison decides to cache the currently requested video content, the popularities of the cached view angles are compared, and the high-bitrate tile sets and low-bitrate full-view videos of the least popular view angles are deleted from the Home MEC cache until the cache space suffices to store the current request.

Further, in step S202, if the Home MEC obtained the low-bitrate full-view video of the requested view angle from the central cloud, it must cache that video. If the Home MEC's cache space is sufficient, the low-bitrate full-view video is cached on the Home MEC directly; if the space is insufficient, the cached content of less popular view angles is deleted from the Home MEC cache according to view-angle popularity, and the acquired low-bitrate full-view video is then stored. A low-bitrate full-view video is deleted from the MEC collaboration domain only when its corresponding view angle no longer exists in the domain.

Further, when deleting the video of a view angle, if the set of high-bitrate tile blocks of the deleted view angle contains a tile block needed by a view angle that remains cached or is about to be cached, that tile block is retained.
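Under stated assumptions (views cached as sets of tile ids, cache space counted in tiles), the eviction rule above — delete the least popular view angles first, but retain any tile still needed by a view that stays cached or is being admitted — might be sketched like this; all names are hypothetical:

```python
# Hypothetical sketch: popularity-ordered eviction with tile retention.

def evict(cached, popularity, incoming_tiles, space_needed):
    """cached: {view_id: set of tile ids}; popularity: {view_id: float};
    incoming_tiles: tiles of the view being admitted.
    Frees at least `space_needed` tiles, least popular views first, and
    returns the set of tile ids actually removed."""
    freed = set()
    for view in sorted(cached, key=lambda v: popularity[v]):
        if len(freed) >= space_needed:
            break
        tiles = cached.pop(view)
        # Tiles still referenced by remaining views or by the incoming
        # view are retained, as the rule above requires.
        still_needed = incoming_tiles.union(*cached.values())
        freed |= tiles - still_needed
    return freed
```

Because shared tiles are never freed, evicting one view of an overlapping pair costs only the tiles unique to it, which matches the tile-level storage saving claimed later in the beneficial effects.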
The popularity $P_v$ of a view angle $v$ is obtained jointly from its similarity $S_v$ to the preferences of the users in the Home MEC's jurisdiction and from its access frequency $F_v$ in the most recent time slot $t$. The popularity calculation formula is:

$$P_v = S_v \cdot F_v \tag{1}$$

When calculating the similarity $S_v$, the preference of the users in the Home MEC's jurisdiction is expressed as a keyword vector $U = (w_1, w_2, \dots, w_K)$, where $w_i$ is the weight of keyword $i$, and the view angle $v$ is expressed as a vector $V = (q_1, q_2, \dots, q_K)$, where $q_i$ is the importance of keyword $i$ to view angle $v$. Specifically, the similarity is the cosine of the two vectors:

$$S_v = \frac{\sum_{i=1}^{K} w_i q_i}{\sqrt{\sum_{i=1}^{K} w_i^2}\,\sqrt{\sum_{i=1}^{K} q_i^2}} \tag{2}$$

The access frequency is

$$F_v = \frac{n_v}{N} \tag{3}$$

where $n_v$ is the number of times view angle $v$ has been requested on the Home MEC, and $N$ is the total number of view-angle requests on the Home MEC.

Further, the access frequency $F_v$ in the most recent time slot $t$ of a view angle newly acquired by the Home MEC is set to 1.
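A minimal sketch of formulas (1)-(3), assuming the similarity is the cosine of the keyword vectors and the popularity is the product of similarity and frequency; keyword names and the dict-based sparse-vector representation are illustrative:

```python
import math

def cosine_similarity(pref, view):
    """pref, view: sparse keyword vectors as {keyword: weight} dicts
    (formula (2))."""
    dot = sum(w * view.get(k, 0.0) for k, w in pref.items())
    norm_p = math.sqrt(sum(w * w for w in pref.values()))
    norm_v = math.sqrt(sum(q * q for q in view.values()))
    return dot / (norm_p * norm_v) if norm_p and norm_v else 0.0

def popularity(pref, view, n_view, n_total):
    """P_v = S_v * F_v with F_v = n_view / n_total (formulas (1), (3));
    a newly acquired view angle with no request history gets F_v = 1."""
    freq = n_view / n_total if n_total else 1.0
    return cosine_similarity(pref, view) * freq
```

For example, a view angle perfectly matching the regional preference that accounts for half of all requests gets popularity 0.5, while a freshly fetched view angle is scored on similarity alone.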
Compared with the prior art, the invention brings the following beneficial technical effects:

1. The invention overcomes the limited computing and storage capability of a single MEC: multiple MECs jointly process user requests, making full use of MEC computing and storage resources, increasing the maximum number of users that can be served, reducing the average content request delay, and reducing backhaul link traffic.

2. The invention stores VR video files with the tile block as the basic unit, saving MEC storage space. Since FOVs may overlap, they share some high-bitrate tile blocks; in the invention only one copy of each such tile block needs to be stored on the user's Home MEC, which saves more space than storing each FOV independently.

3. The invention further pushes the FOV of the next view angle the user may access to the terminal device, reducing the user's average content request delay.

4. The invention decides whether to cache a FOV according to its popularity; compared with traditional caching algorithms, this makes fuller use of MEC storage space and improves the cache hit rate.
Drawings
FIG. 1 is an end-to-end system framework diagram of 360° panoramic VR video according to the invention;

FIG. 2 is the architecture model of MEC collaborative transmission of VR video based on a 5G network according to the invention.
Detailed Description
The technical scheme of the invention is further elaborated below with reference to the description, the drawings, and specific embodiments. It is apparent that the described embodiments are only some, not all, embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of protection of the invention.
Referring to FIGS. 1 and 2 of the description, this embodiment provides a method for MEC collaborative transmission of VR video based on a 5G network, specifically comprising the following steps:

S1, the central cloud distributes VR video content to the MEC collaboration domain. The MEC collaboration domain is formed by grouping the wireless-access-point base stations of several 5G fronthaul networks into one collaboration area according to regional characteristics, deploying an MEC server near each base station, and establishing communication connections among the MEC servers in the area.

S2, the MEC servers in the MEC collaboration domain cooperate to process the user-requested field of view, specifically comprising the following steps:

S201, the VR terminal device tracks the motion trajectory of the user's head. When the user's viewing angle changes, the device must request the set of high-bitrate tile blocks corresponding to the new view angle and the low-bitrate full-view video file corresponding to it; if the video files required by the new view angle are missing from the local cache of the VR terminal device, proceed to S202.

S202, the VR terminal device sends a video request for the view angle to its Home MEC. Requests fall into two types according to whether the device has partially cached the high-bitrate tile blocks of the view angle: 1) no partial cache: the request covers all high-bitrate tile blocks of the view angle and the corresponding low-bitrate full-view video; 2) partial cache: the request covers only the high-bitrate tile blocks the device is missing.
If the Home MEC stores the requested content, it responds to the VR terminal device directly.

If the Home MEC does not store the requested content, it forwards the video request to the other MECs in the MEC collaboration domain.

If another MEC stores the requested content, it responds to the Home MEC, which, upon receipt, responds to the VR terminal device.

If no MEC server in the MEC collaboration domain stores the requested content, the Home MEC forwards the request to the central cloud, obtains the content, and responds to the VR terminal device.

When the Home MEC obtains the requested video content from another MEC server in the domain or from the central cloud, it caches what it obtained. When its cache space is insufficient, it computes the popularity of the currently requested content and of the already-cached content, and decides by popularity whether to delete cached content in order to cache the current request.

S203, having prepared the video requested by the VR terminal device through step S202, the Home MEC responds to the device; at the same time, it pushes to the device the high-bitrate tile blocks of the other view angles most frequently requested together with the currently requested view angle.

S204, once the VR terminal device has the set of high-bitrate tile blocks and the low-bitrate full-view video file required by the current view angle, it decodes the full-view video file, independently decodes each high-bitrate tile block, and then renders and plays. Tile blocks pushed by the Home MEC are stored by the VR terminal device under a FIFO buffering strategy.
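The terminal-side FIFO policy of S204 for pushed tile blocks can be sketched in a few lines; the class name and a capacity measured in tile count are assumptions:

```python
from collections import OrderedDict

class FifoTileBuffer:
    """FIFO buffer for tile blocks pushed by the Home MEC: when full,
    the oldest pushed tile is discarded first (as in S204)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.tiles = OrderedDict()  # insertion order = arrival order

    def push(self, tile_id, data):
        if tile_id in self.tiles:
            return  # already buffered; FIFO order unchanged
        if len(self.tiles) >= self.capacity:
            self.tiles.popitem(last=False)  # evict the oldest tile
        self.tiles[tile_id] = data

    def get(self, tile_id):
        return self.tiles.get(tile_id)
```

FIFO is a reasonable fit here because pushed tiles are speculative: if the user never turns toward a pushed view angle, the tile simply ages out without any bookkeeping.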
As one implementation of this embodiment, as shown in FIG. 2, the entire transmission architecture is composed of four parts: the central cloud, the 5G network, the MEC collaboration domain, and the user VR terminal devices. Their functions are described below.
Central cloud: after part of the VR video service's computing power sinks to the network edge, the central cloud is mainly responsible for the early-stage production and distribution of VR video, including video stitching, VR video mapping and encoding, and distribution of the encoded content to the MECs. The stitched 360° panoramic video is divided into segments of 1-second duration. Each video segment is projected onto a two-dimensional plane by equirectangular projection (ERP), and the two-dimensional plane video is then divided by rows and columns into N×M independently codable blocks, hereinafter called tile blocks. The user's field of view (FOV), i.e., the region of the picture the eyes focus on, is composed of several tile blocks. Since at any moment the user attends only to the picture within the current FOV and pays no special attention to the picture outside it, the tile blocks constituting the FOV are transmitted at a high bitrate and the picture outside the FOV at a low bitrate; for simplicity, a full-view video of the whole picture at a low bitrate serves as the peripheral background of the user's current FOV. Moreover, the user's FOV is not fixed, and the picture region of any tile block may become part of the FOV, so the invention prepares a high-bitrate version of every tile block. Because the delay of transmitting the whole low-bitrate full-view video is almost the same as (or even lower than) that of transmitting the high-bitrate tile blocks of one FOV, the central cloud distributes only one low-bitrate full-view video to the MEC collaboration domain, which suffices to serve the requests of the whole domain.
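As an illustration of the ERP tiling just described, the following sketch maps a head pose (yaw, pitch) to the set of grid tiles its FOV covers. The grid size (8×4) and the 90°×90° FOV extent are assumed values for illustration, not taken from the patent:

```python
# Illustrative FOV-to-tile mapping on an ERP-projected N x M tile grid.

def fov_tiles(yaw_deg, pitch_deg, fov_w=90.0, fov_h=90.0, n_cols=8, n_rows=4):
    """Return the (row, col) tiles overlapped by an FOV centred at
    (yaw, pitch). Yaw in [0, 360), pitch in [-90, 90]; ERP maps both
    linearly onto the plane, and yaw wraps around the 0/360 seam."""
    tile_w, tile_h = 360.0 / n_cols, 180.0 / n_rows
    # Vertical extent in ERP coordinates (0 = bottom pole), clamped at poles.
    v_lo = max(0.0, pitch_deg - fov_h / 2 + 90.0)
    v_hi = min(180.0, pitch_deg + fov_h / 2 + 90.0) - 1e-9
    r0, r1 = int(v_lo // tile_h), int(v_hi // tile_h)
    # Horizontal extent: step across the FOV, wrapping modulo 360 degrees.
    cols = set()
    c = yaw_deg - fov_w / 2
    c_hi = yaw_deg + fov_w / 2
    while c < c_hi:
        cols.add(int((c % 360.0) // tile_w))
        c += tile_w
    cols.add(int(((c_hi - 1e-9) % 360.0) // tile_w))
    return {(r, col) for r in range(r0, r1 + 1) for col in cols}
```

For instance, looking straight ahead at yaw 0° covers the two middle rows of the first and last columns, because the FOV wraps across the 0°/360° seam; the terminal would request high-bitrate versions of exactly these tiles.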
5G network: the 5G network can provide a larger downlink bandwidth for the VR video system and, while transmitting the currently watched VR viewpoint downlink, can simultaneously package and send the video content of several surrounding viewpoints. Structurally, the VR video transmission network based on 5G can be divided into three parts: the fronthaul network, the backhaul network, and the core network. The fronthaul network is composed of a number of wireless-access-node base stations (BS) that solve the last-mile problem. Typically a base station is responsible for a small area (e.g., several residential and commercial zones), and users establish a connection with it when using the mobile network; in the invention this base station is called the Home BS of those users. The backhaul network connects a number of base stations to a mobile switching node, through which the 5G core network is reached; the core network in turn connects to other networks such as the Internet. The fronthaul and backhaul networks together form the Radio Access Network (RAN) of the 5G network. Large data centers (also called cloud nodes), typically configured with massive storage space and computing resources, are deployed near the core switching nodes of the backbone network, such as the interface between the 5G core network and the Internet, for better performance and to serve more users.
MEC collaboration domain: in a conventional RAN, when a user requests a video, the data traverses multiple networks, including the fronthaul network, backhaul network, core network, and the Internet. The drawbacks of this mode are: 1) the user waits longer, which is unfriendly to delay-sensitive applications; 2) the radio access network, and in particular its backhaul portion, faces heavy traffic pressure. One solution is to deploy MEC servers near the base stations; by providing computing and storage resources at the network edge, they give users local support and improve user Quality of Experience (QoE). However, the computing and storage capability of a single MEC server is very limited compared with the cloud center, so, in order to reduce backhaul link traffic and better promote user QoE, we propose the concept of the MEC collaboration domain: a plurality of base stations are grouped into one collaboration area according to regional characteristics, called a collaboration domain for short. The MECs in one collaboration domain cooperate to jointly process the user content requests within the domain. Each MEC has the computing power for encoding and decoding and the storage capacity for caching hot content. One low-bitrate full-view version of each VR video is stored per collaboration domain, and each MEC independently runs a caching strategy for high-bitrate tile blocks. Specifically, whether to cache the high-bitrate version of a tile block is determined by the request frequency of the FOVs containing that tile among users in the MEC's jurisdiction and by the similarity between those FOVs and the preferences of those users.
VR video terminal: the VR video terminal comprises a Wi-Fi module, a video frame processing module, a playing operation control module, a display processing module and the like, and has the capabilities of decoding, gesture sensing, motion trail prediction, real-time model rendering, presentation and the like. At present, the main stream resolution of VR video is 4K, but the high-quality VR video resolution is at least more than 8K and can reach 30K higher. Therefore, the typical monocular resolution of the terminal needs to reach 2K and above, the hardware needs to support 8K decoding capability, the terminal head motion sensing delay should be less than 20ms, and meanwhile, the terminal head motion sensing delay also needs to have high-performance communication capability. In addition, the VR video terminal further has a certain storage capability, and is configured to receive the high-bitrate tile corresponding to the FOV and the low-bitrate full-view video corresponding to the FOV, which may be accessed immediately, from the home MEC.
As an implementation manner of this embodiment, as shown in fig. 2, the operation flow of the MEC collaborative VR video transmission architecture is divided into two main parts: first, at the initial operation of the collaboration system, the central cloud distributes the processed video content to the MEC collaboration domains in advance; second, the MECs in a collaboration domain cooperate to complete the processing of the FOV requested by the user.
The first part, the flow of the central cloud distributing video content to the MEC collaboration domain is as follows:
If the central cloud knows the preferences of the users of an MEC collaboration domain for VR video types, the hotspot FOVs of the relevant VR videos and their low-code-rate full-view videos are distributed to that MEC collaboration domain according to those preferences;
if the central cloud does not know the preferences of the users in an MEC collaboration domain, the hotspot FOVs of the most popular VR videos and their low-code-rate full-view videos are distributed to the collaboration domain according to network-wide VR video popularity.
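The two distribution branches above can be sketched as one selection routine. This is an illustrative sketch only; the function and field names (`select_content_for_domain`, `hotspot_fovs`, the top-10 cutoff) are hypothetical and not part of the claimed method:

```python
def select_content_for_domain(domain_prefs, popularity_rank, catalog):
    """Pick the VR videos whose hotspot FOVs (plus low-code-rate
    full-view copies) the central cloud pre-distributes to a domain.

    domain_prefs: dict of known per-domain type preferences, or None
                  if the domain's user preferences are unknown.
    popularity_rank: video id -> network-wide popularity rank (1 = top).
    """
    if domain_prefs:
        # Preferences known: keep only matching types, best-liked first.
        videos = [v for v in catalog if v["type"] in domain_prefs]
        videos.sort(key=lambda v: domain_prefs[v["type"]], reverse=True)
    else:
        # Preferences unknown: fall back to network-wide popularity.
        videos = sorted(catalog, key=lambda v: popularity_rank[v["id"]])
    # Each selected video contributes its hotspot FOVs for distribution.
    return [(v["id"], v["hotspot_fovs"]) for v in videos[:10]]
```

A domain with known sports-heavy preferences would receive sports hotspot FOVs; a fresh domain receives the globally most popular content.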
The second part, when the user requests a FOV, the transmission operation flow of the MEC collaboration domain is as follows:
The VR terminal device tracks the motion trajectory of the user's head; when the user's viewing angle changes, the FOV corresponding to the new viewing angle is required. In this embodiment, whenever a high-code-rate tile block exists in the buffer of the VR terminal device, the low-code-rate full-view video of the FOV to which that tile belongs is buffered as well. Therefore, if the high-code-rate tile block set T corresponding to the required FOV is already in the local buffer, the VR terminal device directly decodes the low-code-rate full-view video file, independently decodes each tile block, and then performs the rendering and playing operations.
If none of the tile blocks in the set T is stored locally, this also means that the low-code-rate full-view video corresponding to the FOV is not stored locally; in this case the request must cover both the complete high-code-rate tile block set of the view angle and the low-code-rate full-view video of the FOV. Otherwise, only the high-code-rate tile blocks missing from the set T need to be requested from the Home MEC. The VR terminal device sends one of these two types of video request for the view angle to the Home MEC.
If the Home MEC stores the information corresponding to the video request, it responds to the VR terminal device directly.
If the information corresponding to the video request is not stored on the Home MEC, the Home MEC sends the video request to the other MECs in the MEC collaboration domain; if another MEC stores the information, it responds to the Home MEC, which in turn responds to the VR terminal device after receiving the information; if no MEC server in the MEC collaboration domain stores the information, the Home MEC initiates the video request to the central cloud, obtains the information corresponding to the video request, and responds to the VR terminal device.
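The three-tier lookup above (Home MEC, then the other MECs in the collaboration domain, then the central cloud) can be sketched as follows; the dict-based stores and all names are hypothetical simplifications of the protocol, not the patent's implementation:

```python
def resolve_fov_request(home_mec, domain_mecs, cloud, missing_tiles):
    """Resolve a FOV request tile by tile through the three tiers.

    home_mec / each peer in domain_mecs / cloud: dicts mapping
    tile ids to cached tile data. Returns the collected data and
    the farthest tier that had to be contacted.
    """
    result, tier = {}, "home"
    for tile in missing_tiles:
        if tile in home_mec:                 # 1) hit on the Home MEC
            result[tile] = home_mec[tile]
            continue
        for peer in domain_mecs:             # 2) ask peers in the domain
            if tile in peer:
                result[tile] = peer[tile]
                tier = "domain"
                break
        else:                                # 3) fall back to central cloud
            result[tile] = cloud[tile]
            tier = "cloud"
    return result, tier
```

Only tiles missed by the whole collaboration domain generate traffic toward the central cloud, which is the backhaul saving the architecture targets.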
The Home MEC prepares the video file requested by the VR terminal device through the above process and then responds to the VR terminal device. Meanwhile, the Home MEC pushes to the VR terminal device the high-code-rate tile blocks of other view angles that are frequently requested in association with the currently requested view angle, so as to reduce subsequent transmission bandwidth consumption.
After the high-code-rate tile block set and the low-code-rate full-view video file required by the current view angle are ready, the VR terminal device decodes the low-code-rate full-view video file, independently decodes each high-code-rate tile block, and then performs the rendering and playing operations. For the high-code-rate tile blocks pushed by the Home MEC, the VR terminal device stores them under a FIFO buffering strategy: once all the high-code-rate tile blocks corresponding to a low-code-rate full-view video have been removed from the buffer, that low-code-rate full-view video is removed as well.
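A minimal sketch of the terminal-side FIFO buffer with the linked eviction rule (drop the low-code-rate full-view copy once its last high-code-rate tile leaves the buffer); the class and attribute names are hypothetical:

```python
from collections import OrderedDict

class TerminalTileCache:
    """FIFO cache for pushed high-code-rate tiles. A FOV's low-code-rate
    full-view copy is tracked by reference count and dropped once none
    of its high-code-rate tiles remain in the buffer."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.tiles = OrderedDict()   # tile_id -> fov_id, insertion order
        self.low_rate = {}           # fov_id -> count of its cached tiles

    def push(self, tile_id, fov_id):
        if tile_id in self.tiles:
            return
        while len(self.tiles) >= self.capacity:
            _, old_fov = self.tiles.popitem(last=False)  # evict oldest (FIFO)
            self.low_rate[old_fov] -= 1
            if self.low_rate[old_fov] == 0:
                del self.low_rate[old_fov]   # last tile gone: drop the
                                             # low-code-rate full view too
        self.tiles[tile_id] = fov_id
        self.low_rate[fov_id] = self.low_rate.get(fov_id, 0) + 1
```

`OrderedDict.popitem(last=False)` removes the oldest entry, which is exactly the FIFO discipline the text specifies.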
As an implementation manner of this embodiment, when the Home MEC obtains the video information corresponding to a request from another MEC server in the MEC collaboration domain or from the central cloud, and its cache space can accommodate the high-code-rate tile blocks, the Home MEC caches the obtained video information directly.
When the cache space of the Home MEC is insufficient, the popularity of the currently requested video information and of the cached video information must be calculated, and whether to delete cached video information and cache the currently requested video information is decided according to popularity.
If the popularity comparison determines that the currently requested video information should be cached, the popularity of the cached video information of each view angle is compared, and the high-code-rate tile block sets and corresponding low-code-rate full-view videos of the least popular view angles are deleted from the cache of the Home MEC until the cache space is sufficient to store the currently requested video information.
If what the Home MEC obtained from the central cloud is the low-code-rate full-view video corresponding to the requested view angle, the obtained low-code-rate full-view video is cached: if the cache space of the Home MEC is sufficient, it is cached directly on the current Home MEC; if the cache space is insufficient, the cached video information of less popular view angles is deleted from the cache of the Home MEC according to the popularity of the video of each view angle, and the obtained low-code-rate full-view video is then stored. A low-code-rate full-view video is deleted from the MEC collaboration domain only when no corresponding view angle of it remains in the MEC collaboration domain.
When the video of a view angle is deleted, if the high-code-rate tile block set of the deleted view angle contains a high-code-rate tile block that is still needed by a view angle that remains cached or is about to be cached, that high-code-rate tile block is retained.
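The eviction rule of this and the preceding paragraphs — delete the least popular view angles first, but retain any high-code-rate tile still needed by a cached or to-be-cached view angle — can be sketched as follows (all names hypothetical):

```python
def evict_least_popular(tile_store, fov_index, popularity, needed, space_needed):
    """Free Home MEC cache space by evicting the least popular FOVs.

    tile_store: tile_id -> size in bytes of cached high-code-rate tiles.
    fov_index:  fov_id -> set of tile_ids belonging to that view angle.
    needed:     tile_ids still required by cached or incoming view angles.
    Returns the number of bytes freed.
    """
    freed = 0
    # Iterate view angles from least to most popular.
    for fov in sorted(fov_index, key=lambda f: popularity[f]):
        for tile in fov_index.pop(fov):
            if tile in needed:
                continue  # shared tile: keep it cached for the other FOV
            freed += tile_store.pop(tile, 0)
        if freed >= space_needed:
            break
    return freed
```

Deleting a FOV's entry while keeping shared tiles reflects the retention rule stated above; the caller would also drop the FOV's low-code-rate full view when appropriate.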
As a further implementation of this embodiment, the popularity P_f of a view angle f is obtained jointly from its similarity S_f to the preferences of the users in the Home MEC region and its access frequency F_f in its most recent time slot t. The popularity calculation formula is:

P_f = S_f × F_f (1)

When calculating the similarity S_f, the preferences of the users in the Home MEC region are expressed as a keyword vector U, where u_k is the weight of keyword k, and the view angle f is expressed as a vector V_f, where v_{f,k} represents the importance of keyword k to view angle f. Specifically, the similarity is the cosine similarity:

S_f = (Σ_k u_k · v_{f,k}) / (‖U‖ · ‖V_f‖) (2)

The access frequency of view angle f in its most recent time slot t is:

F_f = n_f / N (3)

where n_f is the number of times view angle f is requested on the Home MEC in slot t, and N is the total number of view-angle requests on the Home MEC in slot t.
To avoid disadvantaging newly arrived FOVs on an MEC, the access frequency of a newly arrived FOV in its most recent time slot t is not calculated with Equation (3) but is set to 1; that is, its popularity depends only on its similarity to the users' preference keyword vector.
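Under this reading — popularity as the product of a preference similarity and a slot access frequency, with the frequency of a newly arrived FOV fixed to 1 — the computation can be sketched as follows. The product form, the cosine formula, and the helper names are assumptions consistent with the description, not quoted from the patent:

```python
import math

def similarity(user_pref, fov_vec):
    """Cosine similarity between the region's user-preference keyword
    vector and a view angle's keyword-importance vector.
    Both arguments are dicts mapping keyword -> weight."""
    keys = set(user_pref) | set(fov_vec)
    dot = sum(user_pref.get(k, 0.0) * fov_vec.get(k, 0.0) for k in keys)
    norm_u = math.sqrt(sum(w * w for w in user_pref.values()))
    norm_v = math.sqrt(sum(w * w for w in fov_vec.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def popularity(user_pref, fov_vec, requests_for_fov, total_requests, is_new):
    """Popularity of a view angle: similarity times access frequency.
    A newly arrived FOV gets frequency 1 so it is not penalized."""
    freq = 1.0 if is_new else requests_for_fov / total_requests
    return similarity(user_pref, fov_vec) * freq
```

A freshly pushed FOV thus competes for cache space purely on how well it matches local user tastes, until it accumulates request statistics of its own.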
Claims (8)
1. A method for MEC cooperative transmission of VR video based on a 5G network, comprising the steps of:
S1, a central cloud distributes VR video content to MEC collaboration domains, wherein an MEC collaboration domain is formed by grouping the base stations connected to the wireless access nodes of several 5G forwarding networks into one cooperation area according to regional characteristics, deploying an MEC server near each such base station, and establishing communication connections among the MEC servers in the cooperation area;
S2, the MEC servers in the MEC collaboration domain cooperate with each other to complete the processing of the view angle requested by the user; this specifically comprises the following steps:
S201, the VR terminal device tracks the motion trajectory of the user's head; when the user's viewing angle changes, the high-code-rate tile block set corresponding to the new viewing angle and the low-code-rate full-view video file corresponding to the new viewing angle are required; if the video files required by the new viewing angle are missing from the local cache of the VR terminal device, go to S202;
S202, the VR terminal device sends a video request for the view angle to the Home MEC; the request falls into two types according to whether the VR terminal device has partially cached the high-code-rate tile blocks of the view angle: 1) no partial cache: the requested content comprises all high-code-rate tile blocks of the view angle and the low-code-rate full-view video of the view angle; 2) partial cache: the video request comprises the high-code-rate tile blocks missing on the VR terminal device;
if the Home MEC stores the information corresponding to the video request, directly responding to the VR terminal device;
if the information corresponding to the video request is not stored in the Home MEC, the Home MEC sends the video request to other MECs in the MEC cooperation domain;
if the other MECs store the information corresponding to the video request, the information is responded to the Home MEC, and after the Home MEC receives the information, the information is responded to the VR terminal device;
if the MEC servers in the MEC collaboration domain do not store the information corresponding to the video request, the Home MEC initiates the video request to the central cloud, and the information corresponding to the video request is obtained and responded to the VR terminal device;
when the Home MEC acquires the video information corresponding to the request from other MEC servers or central clouds in the MEC collaborative domain, the Home MEC caches the acquired video information; when the Home MEC cache space is insufficient, the popularity of the currently requested video information and the popularity of the cached video information need to be calculated, and whether to delete the cached video information and cache the currently requested video information is determined according to the popularity;
S203, the Home MEC prepares the video requested by the VR terminal device through step S202 and then responds to the VR terminal device; meanwhile, the Home MEC pushes to the VR terminal device the high-code-rate tile blocks of other view angles that are frequently requested in association with the currently requested view angle;
s204, after the VR terminal equipment prepares a high-code-rate tile block set and a low-code-rate full-view video file required by the current view angle, decoding the low-code-rate full-view video file, independently decoding each high-code-rate tile block, and then executing rendering and playing operations; for a high code rate tile block pushed by the Home MEC, the VR terminal device adopts a FIFO buffer strategy for storage;
in step S202, the popularity P_f of a view angle f is obtained jointly from its similarity S_f to the preferences of the users in the Home MEC region and its access frequency F_f in its most recent time slot t; the popularity calculation formula is: P_f = S_f × F_f;
when calculating the similarity S_f, the preferences of the users in the Home MEC region are expressed as a keyword vector U, where u_k is the weight of keyword k, and the view angle f is expressed as a vector V_f, where v_{f,k} represents the importance of keyword k to view angle f; specifically, the calculation formula of the similarity is: S_f = (Σ_k u_k · v_{f,k}) / (‖U‖ · ‖V_f‖);
the access frequency F_f of view angle f in its most recent time slot t is calculated as: F_f = n_f / N, where n_f is the number of times view angle f is requested on the Home MEC in slot t and N is the total number of view-angle requests on the Home MEC in slot t.
2. The method for MEC cooperative transmission of VR video based on 5G network of claim 1, wherein: in the step S1, if the central cloud learns the preference of the user of the MEC collaboration domain for the VR video type, the high code rate tile set corresponding to the hotspot view angle in the relevant VR video and the low code rate full view video thereof are issued to the collaboration domain according to the preference of the user of the MEC collaboration domain.
3. The method for MEC cooperative transmission of VR video based on 5G network as set forth in claim 1 or 2, wherein: in the step S1, the center cloud issues a high-code-rate tile set corresponding to a hotspot view angle in the most popular VR video and a low-code-rate full-view video thereof to the collaboration domain according to the popularity of the full-network VR video under the condition that the user' S preference in the MEC collaboration domain is not available.
4. The method for MEC cooperative transmission of VR video based on 5G network of claim 1, wherein: in step S201, if the VR terminal device caches the high code rate tile block set corresponding to the new view angle, the VR terminal device directly decodes the low code rate full view video file and independently decodes each tile block, and then performs rendering and playing operations.
5. The method for MEC cooperative transmission of VR video based on 5G network of claim 1, wherein: in step S202, if the popularity comparison is performed, it is determined to cache the currently requested video information, then popularity of the cached video information corresponding to each view angle is compared, and a high-bitrate tile set corresponding to a portion of the view angles with the lowest popularity and a low-bitrate full view video corresponding to the portion of the view angles are deleted from the cache of the Home MEC until the cache space is enough to store the currently requested video information.
6. The method for MEC cooperative transmission of VR video based on 5G network of claim 1, 2, 4 or 5, wherein: in step S202, if what the Home MEC obtained from the central cloud is the low-code-rate full-view video corresponding to the requested view angle, the obtained low-code-rate full-view video needs to be cached; if the cache space of the Home MEC is sufficient, the low-code-rate full-view video is cached directly on the current Home MEC; if the cache space is insufficient, the cached video information of less popular view angles is deleted from the cache of the Home MEC according to the popularity of the video of each view angle, and the obtained low-code-rate full-view video is then stored; a low-code-rate full-view video is deleted from the MEC collaboration domain only when no corresponding view angle of it remains in the MEC collaboration domain.
7. The method for MEC cooperative transmission of VR video based on 5G network of claim 6, wherein: when deleting the video corresponding to the view angle, if a high code rate tile block needed by the view angle which belongs to a certain cache table or is to be cached exists in the high code rate tile block set in the deleted view angle, the high code rate tile block is reserved.
8. The method for MEC cooperative transmission of VR video based on 5G network of claim 1, wherein: the access frequency in the last time slot of the newly acquired view angle on the Home MEC is set to 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210281030.9A CN115037958B (en) | 2022-03-22 | 2022-03-22 | MEC collaborative transmission VR video method based on 5G network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115037958A CN115037958A (en) | 2022-09-09 |
CN115037958B true CN115037958B (en) | 2023-06-23 |
Family
ID=83119086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210281030.9A Active CN115037958B (en) | 2022-03-22 | 2022-03-22 | MEC collaborative transmission VR video method based on 5G network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115037958B (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107909108B (en) * | 2017-11-15 | 2021-06-11 | 东南大学 | Edge cache system and method based on content popularity prediction |
CN110266664B (en) * | 2019-06-05 | 2021-07-09 | 中国联合网络通信有限公司广州市分公司 | Cloud VR video live broadcast system based on 5G and MEC |
CN110730471B (en) * | 2019-10-25 | 2022-04-01 | 重庆邮电大学 | Mobile edge caching method based on regional user interest matching |
CN113014961A (en) * | 2019-12-19 | 2021-06-22 | 中兴通讯股份有限公司 | Video pushing and transmitting method, visual angle synchronizing method and device and storage medium |
CN111586191B (en) * | 2020-05-25 | 2022-08-19 | 安徽大学 | Data cooperation caching method and system and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||