CN115314730A - Video streaming transmission method and device applied to virtual reality VR scene

Video streaming transmission method and device applied to virtual reality VR scene

Info

Publication number
CN115314730A
Authority
CN
China
Prior art keywords
live broadcast
fov
client
unicast
stream
Prior art date
Legal status
Pending
Application number
CN202210958183.2A
Other languages
Chinese (zh)
Inventor
曾其妙
庄一嵘
梁洁
潘庆
陈麒
Current Assignee
China Telecom Corp Ltd
Original Assignee
China Telecom Corp Ltd
Priority date
Filing date
Publication date
Application filed by China Telecom Corp Ltd
Priority to CN202210958183.2A
Publication of CN115314730A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/21805 Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393 Interfacing the upstream path of the transmission network involving handling client requests
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients; Communication protocols; Addressing
    • H04N21/64 Addressing
    • H04N21/6408 Unicasting
    • H04N21/643 Communication protocols
    • H04N21/6437 Real-time Transport Protocol [RTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Embodiments of the present application provide a video streaming transmission method and device applied to a virtual reality (VR) scene. The method includes: receiving a plurality of slice streams of a target live broadcast channel delivered by a VR live broadcast server through a multicast network, where the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode and the live broadcast request contains FOV (field of view) view information; selecting, from the plurality of slice streams, the target slice streams that conform to the FOV view information and combining the target slice streams to generate a unicast FOV video stream; and sending the unicast FOV video stream to the VR client. With this technical solution, the number of slice streams the VR client has to receive can be reduced according to the VR client's FOV view information, which shortens the time the VR client needs to receive the VR scene video and thereby improves user satisfaction.

Description

Video streaming transmission method and device applied to virtual reality VR scene
Technical Field
The application relates to the technical field of virtual reality, in particular to a video streaming transmission method and device applied to a Virtual Reality (VR) scene.
Background
With the continuous development of science and technology, virtual reality technology has advanced rapidly. At present, virtual reality technology generally creates a full-field-of-view VR scene video from acquired VR video materials so that a user can experience and watch the VR scene video through a VR client.
Because the range a person can see is limited, to enhance the sense of immersion when watching a VR scene video, the VR client usually presents the full-field-of-view VR scene video within a certain field-of-view range. However, after receiving a full-field-of-view VR scene video, the VR client only plays the portion within a certain field of view and does not play the portions at other view angles, which wastes network download resources and increases the delay in receiving the VR scene video.
Disclosure of Invention
To solve the technical problem, embodiments of the present application provide a video streaming method and apparatus applied to a virtual reality VR scene, a computer-readable storage medium, and an electronic device.
According to an aspect of the embodiments of the present application, a video streaming method applied to a virtual reality VR scene is provided. The method is applied to a video streaming gateway and includes: receiving a plurality of slice streams of a target live broadcast channel delivered by a VR live broadcast server through a multicast network, where the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode and the live broadcast request contains FOV view information; selecting, from the plurality of slice streams, the target slice streams that conform to the FOV view information and combining the target slice streams to generate a unicast FOV video stream; and sending the unicast FOV video stream to the VR client.
According to an aspect of the embodiments of the present application, a video streaming apparatus applied to a virtual reality VR scene is provided, including: a receiving module configured to receive a plurality of slice streams of a target live broadcast channel delivered by a VR live broadcast server through a multicast network, where the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode and the live broadcast request contains FOV view information; a synthesizing module configured to select, from the plurality of slice streams, the target slice streams that conform to the FOV view information and combine them to generate a unicast FOV video stream; and a sending module configured to send the unicast FOV video stream to the VR client.
In some embodiments of the present application, based on the foregoing solution, the receiving module is further configured to: receiving the live broadcast request sent by the VR client; the live broadcast request carries a URL address of the target live broadcast channel and the FOV view angle information; and accessing a multicast network corresponding to the VR live broadcast server according to the URL address.
In some embodiments of the present application, based on the foregoing solution, the receiving module is further configured to: receiving a unicast request sent by the VR client; forwarding the unicast request to a VR live broadcast server so that the VR live broadcast server inquires a trusted gateway list, and returning redirection service information to the video stream transmission gateway if the inquired trusted gateway list contains public network information of the video stream transmission gateway; the redirection service information comprises URL addresses corresponding to a plurality of live channels in the VR live server; and forwarding the redirection service information to the VR client so that the VR client initiates the live broadcast request according to the redirection service information.
In some embodiments of the present application, based on the foregoing solution, the video streaming apparatus applied to a virtual reality VR scene further includes: a registration module configured to send a registration request to the VR server; the registration request comprises public network information and a credible certificate corresponding to the video streaming gateway; and the VR server stores the public network information in a trusted gateway list corresponding to the VR server based on the trusted certificate.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module is further configured to: judging whether the VR client accesses the multicast network for the first time; if so, initializing the received FOV visual angle information into default FOV visual angle information; selecting a target slice stream from the plurality of slice streams that conforms to the default FOV view information, the target slice stream being combined to generate a unicast FOV video stream.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module is further configured to: removing an RTP packet header contained in the target slice stream and rewriting metadata information in the target slice stream to obtain a processed target slice stream; and combining the processed target slice streams to generate a video stream, and encapsulating the video stream by adopting a specified protocol to generate the unicast FOV video stream.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module is further configured to: receiving visual angle change information sent by the VR client; reselecting a target slice stream from the plurality of slice streams according to the view change information to generate the unicast FOV video stream to be sent to the VR client.
According to an aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to execute a video streaming method applied to a virtual reality VR scene as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; a storage device to store one or more programs that, when executed by the one or more processors, cause the electronic device to implement the video streaming method applied to a virtual reality VR scene as described in the embodiments above.
In the technical solutions of the embodiments of the present application, the video streaming gateway receives a plurality of slice streams of a target live broadcast channel delivered by the VR live broadcast server through a multicast network. Because the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode and the live broadcast request contains FOV view information, the gateway selects the target slice streams that conform to the FOV view information from the plurality of slice streams, combines them to generate a unicast FOV video stream, and sends the unicast FOV video stream to the VR client. In this way, only part of the slice streams are selected and combined into the unicast FOV video stream according to the needs of the VR client, and the stream is delivered to the VR client in a unicast mode, which reduces the number of slice streams the VR client has to receive and shortens the time it takes the VR client to receive the VR scene video.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 is an architectural diagram of an implementation environment to which the present application relates.
Fig. 2 is a flowchart illustrating a video streaming method applied to a virtual reality VR scene according to an exemplary embodiment of the present application.
Fig. 3 is a flowchart illustrating a video streaming method applied to a virtual reality VR scene according to another exemplary embodiment of the present application.
Fig. 4 is a flowchart of another exemplary video streaming method applied to a virtual reality VR scene, which is proposed based on the embodiment shown in fig. 3.
Fig. 5 is a flow chart of step S220 in an example embodiment in the embodiment shown in fig. 2.
Fig. 6 is a flowchart illustrating a video streaming method applied to a virtual reality VR scene according to still another exemplary embodiment of the present application.
Fig. 7 is a flowchart illustrating a registration process in a video streaming method applied to a virtual reality VR scene according to an exemplary embodiment of the present application.
Fig. 8 is a flowchart illustrating a video streaming method applied to a virtual reality VR scene according to another exemplary embodiment of the present application.
Fig. 9 is a block diagram of a video streaming apparatus applied to a virtual reality VR scene according to an exemplary embodiment of the present application.
Fig. 10 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the embodiments of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It should be noted that: reference herein to "a plurality" means two or more. "and/or" describe the association relationship of the associated objects, meaning that there may be three relationships, e.g., A and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It should be noted that the present application relates to VR (Virtual Reality) technology, a computer simulation system that can create and let users experience a virtual environment: a computer constructs the virtual environment from data, so that the user can be immersed in that environment.
Fig. 1 is an architectural diagram of an exemplary virtual reality VR system. As shown in fig. 1, a virtual reality VR system 100 includes a VR live server 110, a video streaming gateway 120, and a VR client 130.
The VR live broadcast server 110 may create a full-field-of-view VR scene video from the acquired VR video material and deliver it through the network; the video streaming gateway 120 is configured to establish a network connection between the VR live broadcast server 110 and the VR client 130, so that the VR client 130 can obtain, through the network connection, the VR scene video delivered by the VR live broadcast server 110 for the user of the VR client 130 to experience and watch.
The VR live broadcast server 110 may be implemented as an independent physical server, may also be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a CDN (Content Delivery Network), a big data and artificial intelligence platform, and the like.
The field of view corresponds to a viewing range, and the full field of view represents a 360° surround viewing range. Since the visual range of the human eye is limited, the VR client 130 usually presents, within a certain field-of-view range, the part of the full-field-of-view VR scene video that falls within the current field of view, so as to improve the user's sense of immersion. When the user's current field-of-view range changes, the displayed part of the VR scene video changes accordingly. Therefore, when the VR client 130 receives a full-field-of-view VR scene video but only projects the part corresponding to a certain field of view, network resources are wasted on receiving the parts that are never projected, and the latency of receiving the VR scene video increases, which in turn reduces user satisfaction.
It should be understood that fig. 1 above is only an architectural schematic diagram of an exemplary virtual reality VR system 100 and does not limit the architecture of the virtual reality VR system 100. In a practical application scenario, the virtual reality VR system 100 may include components different from the architecture shown in fig. 1, for example more or fewer components, which is not limited herein.
In order to solve the above technical problem, a video stream transmission method applied to a virtual reality VR scene is provided in the technical solution of the embodiment of the present application, and specifically refer to fig. 2. The method may be performed by the video streaming gateway 120 shown in fig. 1. The method at least comprises steps S210 to S230, and the following is detailed:
in step S210, a plurality of slice streams of a target live channel sent by the VR live server through the multicast network are received.
The VR live broadcast server is a server that provides live broadcast data services for VR clients and may, for example, provide video stream data corresponding to a plurality of live broadcast channels. Each live broadcast channel can correspond to a plurality of slice streams, which makes it convenient for the VR live broadcast server to manage and deliver different VR scene videos, widens the range of VR scene videos a user can choose from, and improves user satisfaction with the virtual reality VR system. It should be noted that a slice stream is obtained as follows: while the VR live broadcast server constructs a full-field-of-view VR scene video from received VR video material, it divides the VR scene video into a plurality of slice streams, so that VR scene videos at different view angles can be combined and constructed from different slice streams. A slice stream may also be referred to as a Tile stream. The number of slice streams and the field-of-view range each one covers may be defined according to the service capability of the VR server and are not limited herein.
In addition, the VR live broadcast server may issue the slice stream in a multicast manner in order to increase the rate at which the client receives the slice stream. Illustratively, the VR live broadcast server sets a corresponding multicast network for each live broadcast channel, and the VR live broadcast server issues a plurality of slice streams included in the corresponding live broadcast channel to the device connected to the multicast network through the multicast network, so as to prevent the device from receiving other content unrelated to the live broadcast channel.
In the embodiment of the application, after the video streaming gateway accesses the multicast network, a plurality of slice streams of a target live channel sent by the VR live broadcast server through the multicast network can be received.
The target live broadcast channel corresponds to a live broadcast request initiated by the VR client in a unicast mode; in the unicast mode, the video streaming gateway returns a corresponding live video for the live broadcast request initiated by the VR client. Specifically, the VR client may carry, in the live broadcast request it initiates, information of the target live broadcast channel and the FOV (Field of View) view information of the VR client, so that after receiving the live broadcast request, the video streaming gateway accesses the corresponding target live broadcast channel for that request, processes the obtained slice streams into a video stream that conforms to the FOV view information, and returns it to the VR client for display.
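As a purely illustrative sketch of the multicast side of step S210, the following Python snippet joins a per-channel multicast group and reads one packet of a slice stream; the multicast address, port, and use of a plain UDP socket are assumptions of this illustration and are not prescribed by the application.

```python
import socket
import struct

# Hypothetical multicast address/port for one live channel; the application
# itself does not fix these values.
CHANNEL_MCAST_ADDR = "239.1.1.1"
CHANNEL_MCAST_PORT = 5004

def join_channel_multicast(mcast_addr: str, port: int) -> socket.socket:
    """Join the multicast group of one live channel and return a socket
    from which the channel's slice (Tile) streams can be read."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # IP_ADD_MEMBERSHIP expects the group address followed by the local interface.
    mreq = struct.pack("4s4s", socket.inet_aton(mcast_addr),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

if __name__ == "__main__":
    sock = join_channel_multicast(CHANNEL_MCAST_ADDR, CHANNEL_MCAST_PORT)
    packet, sender = sock.recvfrom(2048)   # one RTP packet of one slice stream
    print(f"received {len(packet)} bytes from {sender}")
```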
In step S220, a target slice stream that matches the FOV view information is selected from the plurality of slice streams, and a unicast FOV video stream is generated from the target slice streams in combination.
The FOV view angle information is a view angle range set by the VR client to improve the immersion of the user. And the unicast FOV video stream is a video stream returned by the video stream transmission gateway according to FOV view information carried by a live broadcast request initiated by the VR client in a unicast mode.
In an embodiment of the present application, the video streaming gateway may select a target slice stream that conforms to FOV view information from the plurality of slice streams, and combine the target slice streams to generate a unicast FOV video stream.
In an example, a unicast FOV video stream is generated by combining the target slice streams as follows: first, the RTP (Real-time Transport Protocol) header contained in each target slice stream is removed and the metadata information in the target slice stream is rewritten to obtain a processed target Tile stream; the processed target slice streams are then combined to generate a video stream, and the video stream is encapsulated with a specified protocol to generate the unicast FOV video stream. The RTP header indicates that the target slice stream uses the transport protocol of the multicast network, and the metadata information refers to the data information of an individual slice stream, including but not limited to a video parameter set, a sequence parameter set, a picture parameter set, or an adaptation parameter set. The specified protocol is used to encapsulate the slice streams; for example, the RTSP (Real Time Streaming Protocol) protocol may be used and the slice streams encapsulated into the TS format.
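The RTP de-encapsulation step mentioned above can be illustrated with a short Python sketch that removes the RTP header from one packet of a target slice stream, following the RFC 3550 packet layout. How the resulting payloads are then rewritten and re-encapsulated (for example into TS for RTSP delivery) is codec- and implementation-specific and is not shown; the function name and error handling are assumptions of the sketch.

```python
def strip_rtp_header(packet: bytes) -> bytes:
    """Return the RTP payload of one packet of a target slice stream.

    The 12-byte fixed header, the optional CSRC list and the optional header
    extension are removed; padding bytes (if the P bit is set) are dropped
    from the tail, per RFC 3550.
    """
    if len(packet) < 12:
        raise ValueError("packet shorter than an RTP fixed header")
    first_byte = packet[0]
    padding = (first_byte >> 5) & 0x01
    extension = (first_byte >> 4) & 0x01
    csrc_count = first_byte & 0x0F

    offset = 12 + 4 * csrc_count          # fixed header + CSRC identifiers
    if extension:
        # Extension header: 2 bytes profile id + 2 bytes length (in 32-bit words).
        ext_len_words = int.from_bytes(packet[offset + 2:offset + 4], "big")
        offset += 4 + 4 * ext_len_words

    end = len(packet)
    if padding:
        end -= packet[-1]                 # last byte gives the padding length
    return packet[offset:end]
```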
In step S230, the unicast FOV video stream is sent to the VR client.
In an embodiment of the application, after the unicast FOV video stream is generated according to the FOV view information of the VR client, the unicast FOV video stream may be sent to the VR client, so that the VR client decodes the unicast FOV video stream after receiving the unicast FOV video stream, and projects a picture corresponding to the unicast FOV video stream according to the FOV view information.
When there are a plurality of VR clients, the video streaming gateway may generate a corresponding unicast FOV video stream for each client according to the FOV view information carried in the live broadcast request initiated by that VR client, and send the different unicast FOV video streams to the corresponding VR clients respectively.
In addition, pulling the plurality of slice streams of the target live broadcast channel from the VR live broadcast server over a multicast network saves traffic on the wider network, but a VR client on a WiFi network tends to receive streams delivered by multicast at a reduced rate. Introducing the video streaming gateway to convert multicast into unicast therefore keeps the traffic savings of the multicast network without affecting the use of the VR client in a WiFi scenario.
Through the above embodiment, the video streaming gateway receives, according to a live broadcast request initiated by a VR client, a plurality of slice streams of a target live broadcast channel delivered by the VR live broadcast server through a multicast network, processes the slice streams according to the FOV view information carried in the live broadcast request, combines the processed slice streams into a unicast FOV video stream, and sends it to the VR client. This reduces the number of slice streams the VR client has to receive without affecting the user experience, shortens the time it takes the VR client to receive the VR scene video, and thus improves user satisfaction.
In addition, when the VR client selects different FOV view information, the slice streams that conform to the new FOV view information can be selected and combined into a unicast FOV video stream on the video streaming gateway side, which is closest to the VR client. Since multicast is converted into unicast there, the VR client no longer needs to interact with the VR live broadcast server to obtain the slice streams conforming to the new FOV view information, which further shortens the time it takes the VR client to receive the VR scene video.
Referring to fig. 3, fig. 3 is a flowchart of a video streaming method applied to a virtual reality VR scene according to another exemplary embodiment. As shown in fig. 3, before step S210 in the embodiment shown in fig. 2, the method may further include steps S310 to S320, which are described in detail as follows:
in step S310, a live broadcast request sent by the VR client is received.
It should be noted that, in order to further improve the experience of the user on the virtual reality VR system, the VR live broadcast server may issue different VR scene videos on different live broadcast channels, so as to meet different requirements of the user. Therefore, in an embodiment of the present application, in order to determine a target live channel to be accessed by a VR client, a live request sent by the VR client may be received to determine a URL (Uniform Resource Locator) address of the target live channel. The URL address indicates an access address of the multicast network corresponding to the target live channel.
In an example, the VR client may directly select a URL address of the target live channel from a preset database, and add the URL address of the target live channel to the live request, so as to determine the URL address of the target live channel according to the live request, that is, the database may pre-store URL addresses corresponding to a plurality of live channels.
In another example, the VR client may establish a network connection with the VR live broadcast server through the video streaming gateway, and then obtain the URL address of the target live broadcast channel from the VR live broadcast server through the network connection, so as to add the URL address of the target live broadcast channel to the live broadcast request, thereby achieving the purpose of determining the URL address of the target live broadcast channel according to the live broadcast request.
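For illustration only, a live broadcast request carrying the URL address of the target live broadcast channel together with the client's FOV view information might be serialized as a small JSON message such as the one below; the field names and the URL scheme are assumptions of this sketch, not a format defined by the application.

```python
import json

# Hypothetical unicast live broadcast request sent by the VR client to the gateway.
live_request = {
    "channel_url": "rtp://239.1.1.1:5004/live/channel-7",            # URL of the target live channel
    "fov": {"yaw": 0.0, "pitch": 10.0, "h_deg": 110, "v_deg": 90},   # FOV view information
}

payload = json.dumps(live_request)
parsed = json.loads(payload)
print(parsed["channel_url"], parsed["fov"]["h_deg"])
```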
In step S320, the multicast network corresponding to the VR live broadcast server is accessed according to the URL address.
In the embodiment of the application, after the URL address is determined, the multicast network corresponding to the VR live broadcast server can be accessed according to the URL address.
Through this embodiment, when the VR server contains a plurality of live broadcast channels, carrying the URL address of the target live broadcast channel in the live broadcast request speeds up the video streaming gateway's determination of the multicast network corresponding to the target live broadcast channel and improves the accuracy with which the gateway determines the target live broadcast channel.
Referring to fig. 4, fig. 4 is a flowchart illustrating a video streaming method applied to a virtual reality VR scene according to another exemplary embodiment. As shown in fig. 4, before step S310 in the embodiment shown in fig. 3, the method may further include steps S410 to S430, which are described in detail as follows:
in step S410, a unicast request sent by the VR client is received.
It should be noted that, if the VR client does not locally store the URL addresses of the live broadcast channels of the VR live broadcast server, the VR client may send a unicast request to the video streaming gateway in order to obtain, through the unicast request, the URL address of each live broadcast channel in the VR live broadcast server.
In step S420, the unicast request is forwarded to the VR live server.
In the embodiment of the application, after receiving a unicast request sent by a VR client, the unicast request is forwarded to a VR live broadcast server, so that the VR live broadcast server returns URL addresses of various live broadcast channels to the VR client.
The VR live broadcast server returns the URL addresses of the live broadcast channels to the VR client as follows: after receiving the unicast request forwarded by the video streaming gateway, the VR live broadcast server queries whether the trusted gateway list contains the public network information of the video streaming gateway, and if so, returns to the video streaming gateway the URL addresses corresponding to the plurality of live broadcast channels in the VR live broadcast server. The trusted gateway list is a list of video streaming gateways that have completed registration with the VR live broadcast server. In addition, the VR live broadcast server may combine the VR unicast service information recorded when the video streaming gateway registered with the URL addresses corresponding to the plurality of live broadcast channels to generate redirection service information, which is sent to the video streaming gateway; after the video streaming gateway returns the redirection service information to the VR client, the VR client initiates a live broadcast request in a unicast mode based on the VR unicast service information. The VR unicast service information includes, but is not limited to, the transport protocol, intranet IP, port, and the like corresponding to the video streaming gateway.
In an example, the video streaming gateway may send a registration request to the VR server that carries the public network information and the VR unicast service information corresponding to the video streaming gateway, so that the VR server stores the public network information in the trusted gateway list and records the VR unicast service information, thereby completing the registration process. Meanwhile, to improve the security of the registration process, a corresponding trusted certificate may be carried in the registration request, so that the VR server stores the public network information in its trusted gateway list on the basis of the trusted certificate.
In addition, in an example, the video streaming gateway may forward the unicast request to the VR live broadcast server in NAT (Network Address Translation) mode, that is, the video streaming gateway forwards the unicast request using its own IP address, so that the VR live broadcast server can query the public network information in the trusted gateway list directly by the IP address associated with the unicast request, which speeds up the VR live broadcast server's query of the trusted gateway list.
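A hedged sketch of the two server-side behaviours just described, registration against a trusted certificate and trusted-gateway lookup keyed by the NAT source IP, is given below in Python; the class and field names are placeholders invented for this illustration, and certificate validation is reduced to a stub.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class RedirectionInfo:
    channel_urls: Dict[str, str]          # live channel id -> URL address
    unicast_service: Dict[str, str]       # VR unicast service info recorded at registration

@dataclass
class VrLiveServer:
    channel_urls: Dict[str, str]
    trusted_gateways: Dict[str, Dict[str, str]] = field(default_factory=dict)

    def register_gateway(self, public_ip: str, unicast_service: Dict[str, str],
                         certificate: bytes) -> bool:
        """Store the gateway's public network info if its certificate is accepted."""
        if not self._certificate_is_trusted(certificate):   # placeholder check
            return False
        self.trusted_gateways[public_ip] = unicast_service
        return True

    def handle_unicast_request(self, source_ip: str) -> Optional[RedirectionInfo]:
        """Answer a unicast request forwarded by a gateway in NAT mode.

        Because the gateway forwards with its own IP address, the source IP
        can be used directly as the key into the trusted gateway list."""
        service = self.trusted_gateways.get(source_ip)
        if service is None:
            return None                    # gateway not registered: no redirection
        return RedirectionInfo(dict(self.channel_urls), dict(service))

    @staticmethod
    def _certificate_is_trusted(certificate: bytes) -> bool:
        # Real validation (signature chain, expiry, ...) is out of scope here.
        return bool(certificate)
```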
In step S430, the redirection service information is forwarded to the VR client.
In the embodiment of the application, after receiving redirection service information returned by the VR server, the redirection service information may be forwarded to the VR client, so that the VR client initiates a live broadcast request pointing to a target live broadcast channel in a unicast manner based on a URL address and VR unicast service information carried in the redirection service information.
Referring to fig. 5, fig. 5 is a flowchart of step S220 in an exemplary embodiment in the embodiment shown in fig. 2. As shown in fig. 5, the process of selecting a target slice stream that conforms to FOV view information from a plurality of slice streams and generating a unicast FOV video stream from the target slice stream combination may include steps S510 to S530, which are described in detail as follows:
in step S510, it is determined whether the VR client is accessing the multicast network for the first time.
In an embodiment of the present application, after receiving multiple slice streams of a target live channel, it may be determined whether a VR client accesses a multicast network for the first time.
In an example, the video streaming gateway may determine whether the preset operation log includes a connection record of the VR client accessing the multicast network, determine that the VR client accesses the multicast network for the first time if the connection record does not exist, and determine that the VR client does not access the multicast network for the first time if the connection record does exist.
If yes, in step S520, the received FOV view angle information is initialized to the default FOV view angle information.
It should be noted that, to give users a better experience when presenting a full-field-of-view VR scene video, one view angle is usually selected as the initial view angle for display, so that the user can determine the current position within the VR scene video.
In the embodiment of the application, in the process of determining whether the VR client accesses the multicast network for the first time, if so, initializing the received FOV view information to default FOV view information.
For example, if the FOV view information carried in the received live broadcast request points to the upper part of the full-field-of-view VR scene video picture, then when it is determined that the VR client is accessing the multicast network for the first time, the received FOV view information is initialized to the FOV view information corresponding to the middle of the full-field-of-view VR scene video picture; that is, the FOV view information corresponding to the middle of the full-field-of-view picture is used as the default FOV view information, which improves the user's viewing comfort.
In step S530, a target slice stream that matches the default FOV view information is selected from the plurality of slice streams, and a unicast FOV video stream is generated from the target slice streams in combination.
In an embodiment of the present application, after initializing FOV view information to default FOV view information, a target slice stream that conforms to the default FOV view information is selected from the plurality of slice streams, and a unicast FOV video stream is generated from the target slice stream combination.
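The first-access branch of steps S510 to S530 can be pictured with the following minimal Python sketch, which falls back to a default FOV aimed at the middle of the full-field-of-view picture when no connection record exists for the client; the grid dimensions and the in-memory connection log are assumptions of the sketch.

```python
from typing import Set, Tuple

GRID_ROWS, GRID_COLS = 5, 6          # assumed Tile layout of the full-FOV picture
FOV_ROWS, FOV_COLS = 2, 3            # assumed number of Tiles covered by one FOV

connection_log: Set[str] = set()     # clients that have already joined the multicast

def default_fov() -> Tuple[int, int]:
    """Top-left Tile (row, col) of a FOV centred on the middle of the picture."""
    return ((GRID_ROWS - FOV_ROWS) // 2, (GRID_COLS - FOV_COLS) // 2)

def effective_fov(client_id: str, requested_fov: Tuple[int, int]) -> Tuple[int, int]:
    """Initialise the FOV to the default on the client's first access (S510/S520)."""
    if client_id not in connection_log:
        connection_log.add(client_id)
        return default_fov()
    return requested_fov

print(effective_fov("client-1", (0, 3)))   # first access -> default (1, 1)
print(effective_fov("client-1", (0, 3)))   # subsequent access -> requested (0, 3)
```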
Referring to fig. 6, fig. 6 is a flowchart of a video streaming method applied to a virtual reality VR scene according to still another exemplary embodiment. As shown in fig. 6, after step S230 in the embodiment shown in fig. 2, the method may further include steps S610 to S620, which are described in detail as follows:
in step S610, perspective change information transmitted by the VR client is received.
It should be noted that, while the VR client is presenting the VR scene video, the view angle at which the VR client presents the video may change as the user turns. Therefore, in the embodiment of the present application, the view change information sent by the VR client may be received, so that the returned unicast FOV video stream is adjusted according to the view change information.
In step S620, a target slice stream that conforms to the view change information is reselected from the plurality of slice streams according to the view change information to generate a unicast FOV video stream to be transmitted to the VR client.
In an embodiment of the application, after receiving the view angle change information sent by the VR client, the target slice stream that meets the view angle change information may be reselected from the multiple slice streams according to the view angle change information to generate a unicast FOV video stream, which is sent to the VR client, so that the picture of the VR scene video shown by the VR client is changed accordingly.
For example, suppose a full-field-of-view VR scene video is divided into 28 slice streams arranged in 5 rows and 6 columns, and the range corresponding to the FOV view information covers 6 slice streams. When the VR client presents the VR scene video for the first time, the slice streams corresponding to columns 3, 4 and 5 of rows 2 and 3 are selected by initialization. If the user turns the VR client upwards, view change information is generated whose range corresponds to the slice streams of columns 3, 4 and 5 of rows 1 and 2; the slice streams of columns 3, 4 and 5 of rows 1 and 2 that conform to the view change information are then selected from the plurality of slice streams, combined into a unicast FOV video stream and sent to the VR client, thereby improving the user's sense of immersion when watching the VR scene video.
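The worked example above can be made concrete with a short Python sketch that lists the Tiles covered by a FOV spanning 2 rows and 3 columns of a 5-row, 6-column grid and recomputes them after an upward turn; note that a full 5x6 grid holds 30 Tiles, so the counts here are illustrative assumptions rather than values fixed by the application.

```python
from typing import List, Tuple

def tiles_for_fov(top_row: int, left_col: int,
                  fov_rows: int = 2, fov_cols: int = 3) -> List[Tuple[int, int]]:
    """Return the (row, column) pairs, 1-based, of the Tiles covered by a FOV."""
    return [(r, c)
            for r in range(top_row, top_row + fov_rows)
            for c in range(left_col, left_col + fov_cols)]

# Initial display: rows 2-3, columns 3-5, as in the example above.
initial = tiles_for_fov(top_row=2, left_col=3)

# The user tilts the headset upwards: the view change information now maps
# to rows 1-2, columns 3-5, and the target slice streams are reselected.
after_upward_turn = tiles_for_fov(top_row=1, left_col=3)

print(initial)            # [(2, 3), (2, 4), (2, 5), (3, 3), (3, 4), (3, 5)]
print(after_upward_turn)  # [(1, 3), (1, 4), (1, 5), (2, 3), (2, 4), (2, 5)]
```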
Referring to fig. 7, fig. 7 shows a registration process of a video streaming gateway in a VR live broadcast server in an embodiment of the present invention, where a specific implementation method may include the following steps:
and step S710, the VR live broadcast server sends service information to the VR live broadcast platform.
It should be noted that, to improve the load-bearing capacity and transmission efficiency of the VR live broadcast service, a plurality of VR live broadcast servers may be deployed, or VR live broadcast servers may be deployed in different regions to serve users in those regions. Therefore, when a plurality of VR live broadcast servers are deployed, a VR live broadcast platform is set up to integrate the service information of the plurality of VR live broadcast servers, so that the VR live broadcast servers can be managed conveniently. The service information includes the URL address of each live broadcast channel contained in a VR live broadcast server and the number of slice streams of the VR scene video corresponding to each live broadcast channel.
Step S720, the video stream transmission gateway sends a registration request to the VR live broadcast platform.
Because the VR live broadcast platform performs integrated management of the service information of a plurality of VR live broadcast servers, the video streaming gateway can, in order to improve its service capability, send a registration request to the VR live broadcast platform to complete registration on the VR live broadcast platform, and send the VR unicast service information corresponding to the video streaming gateway to the VR live broadcast platform, which makes it convenient for the VR live broadcast platform to generate the redirection service information.
A specific application scenario of the embodiment of the present application is described in detail below:
please refer to fig. 8. Fig. 8 is a flowchart of a method according to an embodiment of the present application, and as shown in fig. 8, the video streaming method applied to a virtual reality VR scene at least includes steps S810 to S870, which are described in detail as follows:
step S810, the VR client sends a unicast request to the video streaming gateway to obtain the URL address of the target live channel.
The video streaming gateway may receive and process request information sent by the VR client through a preset signaling module, where the request information includes, but is not limited to, a unicast request, a live broadcast request, view angle request information, encapsulation request information, network protocol request information, and VR live broadcast service request information.
Step S820, the video streaming gateway forwards the unicast request to the VR live broadcast platform in an NAT manner, so that the VR live broadcast platform determines whether the record of the video streaming gateway exists in the trusted gateway list according to the IP address carried in the unicast request.
Step S830, the VR live broadcast platform returns redirection service information to the VR client, wherein the redirection service information comprises URL addresses corresponding to all live broadcast channels and VR unicast service information.
And step S840, the VR client initiates a live broadcast request to the video streaming gateway, where the live broadcast request is generated based on the redirection service information and carries the URL address of the target live broadcast channel and the FOV view information corresponding to the VR client.
Step S850, the video stream transmission gateway initiates a multicast network access request to the VR live broadcast server based on the URL address carried in the live broadcast request.
And step S860, the VR live broadcast server issues a plurality of slice streams corresponding to the target live broadcast channel to the video stream transmission gateway.
And the video streaming gateway can also simultaneously receive a plurality of slice streams corresponding to a plurality of live channels contained in the VR live broadcast server through a preset Tile stream receiver.
Step S870, the video streaming gateway processes the multiple slice streams based on the FOV view information carried in the live broadcast request to obtain a unicast FOV video stream.
For a specific process of processing the slice streams according to the FOV view information, please refer to the description in the embodiment of step S220, which is not described herein again.
In addition, the video stream transmission gateway can respectively generate corresponding unicast FOV video streams according to FOV visual angle information of different VR client sides through a preset Tile visual angle synthesis module, and then the preset FOV unicast distribution module sends the different unicast FOV video streams to the corresponding VR client sides according to the requirements of the VR client sides.
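As a final illustration of the per-client fan-out performed by the Tile view synthesis and FOV unicast distribution roles described above, the toy Python sketch below maps each client's FOV to its own combined stream; the data structures and function names are assumptions of this sketch, not components defined by the application.

```python
from typing import Dict, Tuple

Fov = Tuple[int, int]                 # assumed (top_row, left_col) of the FOV
TileStreams = Dict[Tuple[int, int], bytes]

def synthesize_fov_stream(tiles: TileStreams, fov: Fov) -> bytes:
    """Toy stand-in for the Tile view synthesis module: concatenate the
    payloads of the Tiles covered by a 2x3 FOV starting at `fov`."""
    top, left = fov
    wanted = [(r, c) for r in (top, top + 1) for c in range(left, left + 3)]
    return b"".join(tiles[pos] for pos in wanted if pos in tiles)

def distribute(tiles: TileStreams, client_fovs: Dict[str, Fov]) -> Dict[str, bytes]:
    """Toy stand-in for the FOV unicast distribution module: one stream per client."""
    return {client: synthesize_fov_stream(tiles, fov)
            for client, fov in client_fovs.items()}

# Example: two clients watching the same channel with different FOVs.
tiles = {(r, c): bytes([r * 10 + c]) for r in range(1, 6) for c in range(1, 7)}
streams = distribute(tiles, {"client-A": (2, 3), "client-B": (1, 1)})
print({k: len(v) for k, v in streams.items()})   # each client gets its own 6-Tile stream
```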
Step S880, the video streaming gateway sends the unicast FOV video stream to the VR client, so that after receiving the unicast FOV video stream, the VR client renders the picture of the unicast FOV video stream to the corresponding position according to the FOV view information.
Embodiments of the apparatus of the present application are described below, and the apparatus may be used to perform the video streaming method applied to a virtual reality VR scene in the above embodiments of the present application. For details not disclosed in the apparatus embodiments of the present application, reference is made to the embodiments of the video streaming method applied to a virtual reality VR scene described above.
Fig. 9 shows a block diagram of a video streaming apparatus 900 applied to a virtual reality VR scene according to an embodiment of the application.
Referring to fig. 9, a video streaming apparatus 900 applied to a virtual reality VR scene according to an embodiment of the present application includes: a receiving module 910 configured to receive a plurality of slice streams of a target live broadcast channel delivered by a VR live broadcast server through a multicast network, where the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode and the live broadcast request contains FOV view information; a synthesizing module 920 configured to select, from the plurality of slice streams, the target slice streams that conform to the FOV view information and combine them to generate a unicast FOV video stream; and a sending module 930 configured to send the unicast FOV video stream to the VR client.
In some embodiments of the present application, based on the foregoing solution, the receiving module 910 is further configured to: receiving a live broadcast request sent by a VR client; the live broadcast request carries a URL address and FOV visual angle information of a target live broadcast channel; and accessing a multicast network corresponding to the VR live broadcast server according to the URL address.
In some embodiments of the present application, based on the foregoing solution, the receiving module 910 is further configured to: receiving a unicast request sent by a VR client; forwarding the unicast request to a VR (virtual reality) live broadcast server so that the VR live broadcast server inquires a trusted gateway list, and returning redirection service information to the video stream transmission gateway if the inquired trusted gateway list contains public network information of the video stream transmission gateway; the redirection service information comprises URL addresses corresponding to a plurality of live channels in the VR live server; and forwarding the redirection service information to the VR client so that the VR client initiates a live broadcast request according to the redirection service information.
In some embodiments of the present application, based on the foregoing solution, the video streaming apparatus 900 applied to a virtual reality VR scene further includes: a registration module configured to send a registration request to a VR server; the registration request comprises public network information and a credible certificate corresponding to the video streaming gateway; and the VR server stores the public network information in a trusted gateway list corresponding to the VR server based on the trusted certificate.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module 920 is further configured to: judge whether the VR client accesses the multicast network for the first time; if so, initialize the received FOV view information to the default FOV view information; and select the target slice streams that conform to the default FOV view information from the plurality of slice streams and combine them to generate the unicast FOV video stream.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module 920 is further configured to: removing an RTP packet header contained in the target slice stream and rewriting metadata information in the target slice stream to obtain a processed target slice stream; and combining the processed target slice streams to generate a video stream, and encapsulating the video stream by adopting a specified protocol to generate a unicast FOV video stream.
In some embodiments of the present application, based on the foregoing scheme, the synthesis module 920 is further configured to: receiving visual angle change information sent by a VR client; and reselecting a target slice stream which accords with the view change information from the plurality of slice streams according to the view change information to generate a unicast FOV video stream to be sent to the VR client.
It should be noted that the video streaming apparatus 900 applied to the virtual reality VR scenario provided in the foregoing embodiment and the video streaming method applied to the virtual reality VR scenario provided in the foregoing embodiment belong to the same concept, and specific manners in which the modules and units perform operations have been described in detail in the method embodiment, and are not repeated herein.
FIG. 10 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
It should be noted that the computer system 1000 of the electronic device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 10, the computer system 1000 includes a Central Processing Unit (CPU) 1001, which can perform various appropriate actions and processes, such as executing the method in the above-described embodiment, according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage portion 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other via a bus 1004. An Input/Output (I/O) interface 1005 is also connected to the bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output section 1007 including a Display panel such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 1008 including a hard disk and the like; and a communication section 1009 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The driver 1010 is also connected to the I/O interface 1005 as necessary. A removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is mounted into the storage section 1008 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication part 1009 and/or installed from the removable medium 1011. When the computer program is executed by a Central Processing Unit (CPU) 1001, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, in contrast, may include a propagated data signal with a computer program embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium other than a computer readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: a wireless medium, a wired medium, or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not, in some cases, constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer readable medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A video streaming method applied to a Virtual Reality (VR) scene, wherein the method is applied to a video streaming gateway, and the method comprises the following steps:
receiving a plurality of slice streams of a target live broadcast channel transmitted by a VR live broadcast server through a multicast network; wherein the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode, and the live broadcast request contains FOV viewing angle information;
selecting, from the plurality of slice streams, target slice streams that conform to the FOV viewing angle information, and combining the target slice streams to generate a unicast FOV video stream;
sending the unicast FOV video stream to the VR client.
2. The method of claim 1, wherein prior to receiving the plurality of slice streams of the target live broadcast channel transmitted by the VR live broadcast server through the multicast network, the method further comprises:
receiving the live broadcast request sent by the VR client; wherein the live broadcast request carries a URL address of the target live broadcast channel and the FOV viewing angle information;
and accessing a multicast network corresponding to the VR live broadcast server according to the URL address.
3. The method of claim 2, wherein prior to receiving the live broadcast request sent by the VR client, the method further comprises:
receiving a unicast request sent by the VR client;
forwarding the unicast request to the VR live broadcast server, so that the VR live broadcast server queries a trusted gateway list and returns redirection service information to the video streaming gateway if the queried trusted gateway list contains public network information of the video streaming gateway; wherein the redirection service information comprises URL addresses corresponding to a plurality of live broadcast channels in the VR live broadcast server;
and forwarding the redirection service information to the VR client so that the VR client initiates the live broadcast request according to the redirection service information.
4. The method of claim 3, further comprising:
sending a registration request to the VR live broadcast server; wherein the registration request comprises public network information and a trusted certificate corresponding to the video streaming gateway, and the VR live broadcast server stores the public network information in a trusted gateway list corresponding to the VR live broadcast server based on the trusted certificate.
5. The method of claim 1, wherein the selecting, from the plurality of slice streams, target slice streams that conform to the FOV viewing angle information and combining the target slice streams to generate a unicast FOV video stream comprises:
determining whether the VR client accesses the multicast network for the first time;
if it is determined that the VR client accesses the multicast network for the first time, initializing the received FOV viewing angle information to default FOV viewing angle information;
and selecting, from the plurality of slice streams, target slice streams that conform to the default FOV viewing angle information, and combining the target slice streams to generate the unicast FOV video stream.
6. The method of claim 1, wherein the combining the target slice streams to generate a unicast FOV video stream comprises:
removing the RTP packet header contained in the target slice streams and rewriting metadata information in the target slice streams to obtain processed target slice streams;
and combining the processed target slice streams to generate a video stream, and encapsulating the video stream using a specified protocol to generate the unicast FOV video stream.
7. The method of claim 1, wherein after the sending the unicast FOV video stream to the VR client, the method further comprises:
receiving view change information sent by the VR client;
and reselecting, from the plurality of slice streams according to the view change information, target slice streams to generate the unicast FOV video stream to be sent to the VR client.
8. A video streaming apparatus, comprising:
a receiving module configured to receive a plurality of slice streams of a target live broadcast channel transmitted by a VR live broadcast server through a multicast network; wherein the target live broadcast channel corresponds to a live broadcast request initiated by a VR client in a unicast mode, and the live broadcast request contains FOV viewing angle information;
a compositing module configured to select, from the plurality of slice streams, target slice streams that conform to the FOV viewing angle information and combine the target slice streams to generate a unicast FOV video stream; and
a sending module configured to send the unicast FOV video stream to the VR client.
9. A storage medium having stored thereon computer readable instructions which, when executed by a processor of a computer, cause the computer to perform the video streaming method applied to a virtual reality VR scene of any one of claims 1-7.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the video streaming method applied to a virtual reality VR scene according to any one of claims 1 to 7.
CN202210958183.2A 2022-08-10 2022-08-10 Video streaming transmission method and device applied to virtual reality VR scene Pending CN115314730A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210958183.2A CN115314730A (en) 2022-08-10 2022-08-10 Video streaming transmission method and device applied to virtual reality VR scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210958183.2A CN115314730A (en) 2022-08-10 2022-08-10 Video streaming transmission method and device applied to virtual reality VR scene

Publications (1)

Publication Number Publication Date
CN115314730A true CN115314730A (en) 2022-11-08

Family

ID=83861145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210958183.2A Pending CN115314730A (en) 2022-08-10 2022-08-10 Video streaming transmission method and device applied to virtual reality VR scene

Country Status (1)

Country Link
CN (1) CN115314730A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116668779A (en) * 2023-08-01 2023-08-29 中国电信股份有限公司 Virtual reality view field distribution method, system, device, equipment and medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1909509A (en) * 2006-07-19 2007-02-07 华为技术有限公司 System, method and user terminal for realizing video live broadcast in media distributing network
CN104641575A (en) * 2013-09-03 2015-05-20 华为技术有限公司 Method and device for transmitting media stream and user equipment
CN110149542A (en) * 2018-02-13 2019-08-20 华为技术有限公司 Transfer control method
WO2020034082A1 (en) * 2018-08-14 2020-02-20 海能达通信股份有限公司 Slicing-based rtp stream transmission method, device, terminal and server
CN113438495A (en) * 2021-06-23 2021-09-24 中国联合网络通信集团有限公司 VR live broadcast method, device, system, equipment and storage medium
WO2021190401A1 (en) * 2020-03-27 2021-09-30 华为技术有限公司 Program playing method and apparatus
WO2022002043A1 (en) * 2020-06-30 2022-01-06 中兴通讯股份有限公司 Data retransmission method, network device, and computer readable storage medium
CN114035672A (en) * 2020-07-20 2022-02-11 华为技术有限公司 Video processing method and related equipment for virtual reality VR scene

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116668779A (en) * 2023-08-01 2023-08-29 中国电信股份有限公司 Virtual reality view field distribution method, system, device, equipment and medium
CN116668779B (en) * 2023-08-01 2023-10-10 中国电信股份有限公司 Virtual reality view field distribution method, system, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination