CN111246237A - Panoramic video live broadcast method and device

Panoramic video live broadcast method and device

Info

Publication number
CN111246237A
Authority
CN
China
Prior art keywords
panoramic
video
frame
panoramic image
frames
Prior art date
Legal status
Pending
Application number
CN202010075800.5A
Other languages
Chinese (zh)
Inventor
靳伟明
覃才俊
韩杰
王艳辉
Current Assignee
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd
Priority to CN202010075800.5A
Publication of CN111246237A


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 - Server components or server architectures
    • H04N 21/218 - Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 - Live feed
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 - Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/434 - Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream

Abstract

The embodiment of the invention provides a panoramic video live broadcast method and device applied to a video network. The method comprises the following steps: after receiving a panoramic live video stream, acquiring at least one frame of panoramic image in the panoramic live video stream; decomposing each frame of the panoramic image into at least two frames of planar images; and sending the planar images to at least two video networking terminals in a live broadcast mode so that each terminal displays its respective planar image. The method enables at least two video networking terminals to watch the live broadcast from different angles, realizes the mapping of the video networking panorama onto a plurality of surfaces, removes redundant data from the panoramic live video stream at the terminal, and makes it convenient to schedule each frame of planar image in any direction of the panoramic live video stream to form a live video stream for that direction.

Description

Panoramic video live broadcast method and device
Technical Field
The invention relates to the technical field of video networking, in particular to a panoramic video live broadcast method and device.
Background
Because video networking technology can realize real-time transmission of high-definition audio and video across the entire network, many creative applications have been derived against this background, and the mapping of a panoramic image onto a cube is one of the creative applications derived from video networking technology.
Currently, since a panoramic image includes image data of multiple directions, when image data of a certain direction is scheduled, the scheduling speed may be slow due to redundancy of the image data.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a panoramic video live broadcasting method and apparatus that overcome or at least partially solve the above problems.
In order to solve the above problem, an embodiment of the present invention provides a panoramic video live broadcast method, which is applied to a video network, and the method includes:
after receiving a panoramic live video stream, acquiring at least one frame of panoramic image in the panoramic live video stream;
decomposing each frame of the panoramic image into at least two frames of planar images;
and respectively sending the two frames of plane images to at least two video network terminals in a live broadcast mode so as to enable the two video network terminals to respectively display the plane images.
Optionally, the obtaining at least one frame of panoramic image in the panoramic live video stream includes:
extracting a panoramic image stream from the panoramic live video stream;
and carrying out demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
Optionally, the decomposing the panoramic image of each frame into at least two planar images includes:
acquiring the number of rows and columns of the panoramic image of each frame;
combining the line number and the column number of each frame of panoramic image, and performing coordinate transformation processing on pixel points in each frame of panoramic image;
and responding to the coordinate transformation processing, and respectively obtaining at least two frames of plane images of each frame of panoramic image under at least two plane coordinate systems.
Optionally, the sending the two frames of plane images to at least two video network terminals in a live broadcast manner respectively includes:
encoding and packaging the at least two frames of plane images;
and sending the at least two frames of plane images subjected to the encoding and packaging processing to the at least two video network terminals in a live broadcast mode.
In order to solve the above problem, an embodiment of the present invention provides a panoramic video live broadcasting device, which is applied to a video network, and the device includes:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring at least one frame of panoramic image in a panoramic live video stream after receiving the panoramic live video stream;
the decomposition module is used for decomposing each frame of panoramic image into at least two frames of plane images;
and the sending module is used for respectively sending the two frames of plane images to at least two video network terminals in a live broadcast mode so as to enable the two video network terminals to respectively display the plane images.
Optionally, the obtaining module includes:
the extraction submodule is used for extracting a panoramic image stream from the panoramic live broadcast video stream;
and the first processing submodule is used for carrying out demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
Optionally, the decomposition module comprises:
the acquisition submodule is used for acquiring the number of rows and the number of columns of the panoramic image of each frame;
the second processing submodule is used for carrying out coordinate transformation processing on pixel points in each frame of panoramic image by combining the line number and the column number of each frame of panoramic image;
and the obtaining submodule is used for responding to the coordinate transformation processing and respectively obtaining at least two frames of plane images of each frame of panoramic image under at least two plane coordinate systems.
Optionally, the sending module includes:
the encoding submodule is used for encoding and packaging the at least two frames of plane images;
and the sending submodule is used for sending the at least two frames of plane images subjected to the coding and packaging processing to the at least two video network terminals in a live broadcast mode.
In order to solve the above problem, an embodiment of the present invention provides an electronic device, including:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the panoramic video live method described above.
In order to solve the above problem, an embodiment of the present invention provides a computer-readable storage medium storing a computer program that causes a processor to execute the above panoramic video live broadcasting method.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, after receiving a panoramic live video stream, a virtual terminal acquires at least one frame of panoramic image from the stream, decomposes each frame of panoramic image into at least two frames of planar images, and then sends the planar images to at least two video networking terminals in a live broadcast form so that each terminal displays its respective planar image. In this way, at least two video networking terminals can watch the live broadcast from different angles, the mapping of the video networking panorama onto multiple surfaces can be realized, redundant data in the panoramic live video stream can be removed at the terminal, and each frame of planar image in any direction of the panoramic live video stream can conveniently be scheduled to form a live video stream for that direction.
Drawings
Fig. 1 is a flowchart illustrating steps of a panoramic video live broadcast method according to an embodiment of the present invention;
FIG. 2 is a schematic view illustrating a panoramic video live broadcast in a video networking scene according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating steps of a panoramic video live broadcast method according to a second embodiment of the present invention;
FIG. 4 is a schematic view illustrating a panoramic video live broadcast in another video networking scenario provided by an embodiment of the present invention;
fig. 5 shows a block diagram of a panoramic video live broadcast apparatus according to a third embodiment of the present invention;
FIG. 6 illustrates a networking diagram of a video network of the present invention;
FIG. 7 is a diagram illustrating a hardware architecture of a node server according to the present invention;
fig. 8 shows a hardware architecture diagram of an access switch of the present invention;
fig. 9 is a schematic diagram illustrating a hardware structure of an ethernet protocol conversion gateway according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video network adopts the most advanced real-time high-definition video switching technology in the world, so that real-time transmission of 4K images can be realized. Against this background, many creative applications have been derived, and the mapping of a panoramic image onto a cube is one of them.
Referring to fig. 1, a flowchart illustrating steps of a panoramic video live broadcast method provided in an embodiment of the present invention is shown, where the method may be applied to a video network, and specifically may include the following steps:
step 501, after receiving a panoramic live video stream, acquiring at least one frame of panoramic image in the panoramic live video stream.
In the present invention, the technical solution of the present embodiment can be described in detail with reference to fig. 2.
Referring to fig. 2, a schematic diagram of a panoramic video live broadcast in a video networking scene provided by an embodiment of the present invention is shown. As shown in fig. 2, during a panoramic video live broadcast, a panoramic video networking terminal 10 publishes the live broadcast to a video networking server 20, and the video networking server 20 may forward the live broadcast to a virtual terminal 30. The virtual terminal 30 may be configured to receive the panoramic live video stream; after receiving it, the virtual terminal 30 extracts a panoramic image stream from the panoramic live video stream and performs demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
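The demultiplexing step can be illustrated with a short sketch. The video networking stream encapsulation is proprietary and is not modelled here; the sketch below assumes the received panoramic image stream is available in a standard container readable by FFmpeg (for example a recorded MPEG-TS segment or a stream URL) and uses PyAV to demultiplex and decode it into individual panoramic frames.

```python
import av  # PyAV, a Python binding for FFmpeg


def extract_panorama_frames(source):
    """Demultiplex a panoramic image stream and yield one decoded panoramic frame at a time.

    `source` is assumed to be a file path or URL readable by FFmpeg; the proprietary
    video-networking encapsulation is not modelled in this sketch.
    """
    container = av.open(source)
    video_stream = container.streams.video[0]
    for frame in container.decode(video_stream):
        # Each decoded frame is one equirectangular panoramic image
        yield frame.to_ndarray(format='bgr24')
```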
After acquiring at least one frame of panoramic image in the panoramic live video stream, step 502 is performed.
Step 502, decomposing each frame of panoramic image into at least two frames of planar images.
In the embodiment of the present invention, the virtual terminal 30 may first obtain the number of rows and columns of each frame of panoramic image, further perform coordinate transformation processing on the pixel points in each frame of panoramic image by combining the number of rows and columns of each frame of panoramic image, and finally obtain at least two frames of planar images of each frame of panoramic image in at least two planar coordinate systems in response to the coordinate transformation processing.
After the coordinate transformation processing is performed on the pixel points in each frame of panoramic image, at least two frames of planar images in at least two planar coordinate systems can be output using a cross-platform computer vision library (OpenCV) algorithm.
Optionally, each frame of panoramic image may be decomposed into two frames of planar images, each frame of panoramic image may also be decomposed into three frames of planar images, and each frame of panoramic image may also be decomposed into six frames of planar images.
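As a hedged illustration of the coordinate transformation, the sketch below extracts one planar view from an equirectangular panoramic frame with OpenCV. The patent does not specify the exact transformation, so the face size, viewing angles and interpolation mode are assumptions, not the patent's actual parameters.

```python
import cv2
import numpy as np


def pano_face(pano, face_size, yaw_deg, pitch_deg=0.0):
    """Extract one planar (cube-face-like) view from an equirectangular panorama."""
    h, w = pano.shape[:2]  # number of rows and columns of the panoramic frame
    # Pixel grid of the target planar face, normalised to [-1, 1]
    xs = np.linspace(-1, 1, face_size, dtype=np.float32)
    grid_x, grid_y = np.meshgrid(xs, xs)
    # Ray direction for each face pixel (face plane placed at unit distance)
    dirs = np.stack([grid_x, grid_y, np.ones_like(grid_x)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate the rays towards the requested viewing direction
    yaw, pitch = np.deg2rad(yaw_deg), np.deg2rad(pitch_deg)
    ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]], dtype=np.float32)
    rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch), np.cos(pitch)]], dtype=np.float32)
    dirs = dirs @ (ry @ rx).T
    # Spherical coordinates -> source coordinates in the equirectangular panorama
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])         # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))    # [-pi/2, pi/2]
    map_x = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(np.float32)
    map_y = ((lat / np.pi + 0.5) * (h - 1)).astype(np.float32)
    return cv2.remap(pano, map_x, map_y, cv2.INTER_LINEAR, borderMode=cv2.BORDER_WRAP)
```

Calling the helper with yaw angles 0°, 90°, 180°, 270° plus pitch ±90° yields the six cube faces; the two- or three-face decompositions mentioned above are obtained the same way with fewer viewing directions.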
After decomposing each frame of the panoramic image into at least two frames of planar images, step 503 is performed.
Step 503, respectively sending the two frames of plane images to at least two video network terminals in a live broadcast manner, so that the two video network terminals respectively display the plane images.
As shown in fig. 2, the virtual terminal 30 may respectively send two frames of plane images to the video network server 20 in a live broadcast manner, and then the video network server 20 issues two paths of live broadcasts corresponding to the two frames of plane images to at least two video network terminals 40, so that the two video network terminals 40 respectively display the corresponding plane images.
In the embodiment of the invention, after receiving a panoramic live video stream, the virtual terminal can acquire at least one frame of panoramic image from the stream, decompose each frame of panoramic image into at least two frames of planar images, and then send the planar images to at least two video networking terminals in a live broadcast mode so that each terminal displays its respective planar image. In this way, at least two video networking terminals can watch the live broadcast from different angles, the mapping of the panoramic live broadcast of the video network onto multiple surfaces can be realized, redundant data in the panoramic live video stream can be removed at the terminal, and each frame of planar image in any direction of the panoramic live video stream can conveniently be scheduled to form a live video stream for that direction.
Referring to fig. 3, a flowchart illustrating steps of a panoramic video live broadcast method according to a second embodiment of the present invention is shown, where the method may be applied to a video network, where the video network includes a first management system and a second management system, the first management system includes a first autonomous server, and the second management system includes a second autonomous server, and specifically may include the following steps:
step 601, after receiving the panoramic live video stream, acquiring at least one frame of panoramic image in the panoramic live video stream.
In the embodiment of the present invention, a technical solution of the embodiment of the present invention may be described in detail with reference to fig. 4.
Referring to fig. 4, a schematic view of a panoramic video live broadcast in another video networking scenario provided by an embodiment of the present invention is shown. As shown in fig. 4, during a panoramic video live broadcast, a panoramic video networking terminal publishes the live broadcast to a video networking server, which forwards the live broadcast to a virtual terminal. The virtual terminal receives the panoramic live video stream; after receiving it, the virtual terminal extracts a panoramic image stream from the panoramic live video stream and performs demultiplexing (demux) processing on the panoramic image stream to obtain at least one frame of panoramic image (frame).
After at least one frame of panoramic image in the panoramic live video stream is acquired, step 602 is performed.
Step 602, decomposing each frame of panoramic image into at least two frames of plane images.
As shown in fig. 4, after at least one frame of panoramic image (avframe) is obtained, an image decomposition process (demix) may be performed on it. First, the number of rows and columns of each frame of panoramic image is obtained. Then, combining the number of rows and columns of each frame of panoramic image, coordinate transformation processing is performed on the pixel points in each frame of panoramic image and the source coordinate of each pixel point is calculated. Finally, in response to the coordinate transformation processing, each coordinate is mapped onto a plane, and at least two frames of planar images of each frame of panoramic image under at least two planar coordinate systems are obtained respectively.
After the coordinate transformation processing is performed on the pixel points in each frame of panoramic image, at least two frames of planar images in at least two planar coordinate systems can be output using a cross-platform computer vision library (OpenCV) algorithm.
Of course, in this embodiment, each frame of panoramic image may be decomposed into two frames of planar images, each frame of panoramic image may also be decomposed into three frames of planar images, and each frame of panoramic image may also be decomposed into six frames of planar images.
After decomposing each frame of the panoramic image into at least two frames of planar images, step 603 is performed.
Step 603, performing encoding and packaging processing on at least two frames of plane images.
As shown in fig. 4, at least two frames of planar images obtained by decomposition are sent to a multiplexing module (mux), and the at least two frames of planar images are encoded and encapsulated, so as to reduce the occupied bandwidth and ensure that the terminal of the video network can receive the corresponding planar images.
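The encoding and encapsulation step can be sketched as follows. The video-networking packaging format is proprietary, so this illustrative sketch encodes one decomposed planar image sequence into an H.264 stream in a standard MP4 container with PyAV; the codec, container and frame rate here are assumptions rather than the patent's actual parameters.

```python
import av


def encode_face_stream(frames, out_path, fps=25):
    """Encode a sequence of decomposed planar frames (numpy BGR arrays) into one stream.

    Frame dimensions should be even for yuv420p encoding.
    """
    container = av.open(out_path, mode='w')
    stream = container.add_stream('h264', rate=fps)
    stream.pix_fmt = 'yuv420p'
    first = True
    for img in frames:
        if first:
            stream.width, stream.height = img.shape[1], img.shape[0]
            first = False
        frame = av.VideoFrame.from_ndarray(img, format='bgr24')
        for packet in stream.encode(frame):
            container.mux(packet)
    for packet in stream.encode():  # flush the encoder
        container.mux(packet)
    container.close()
```

Each encoded face stream then corresponds to one live path that the server can deliver to a video networking terminal.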
After the encoding and packing process is performed on at least two frames of planar images, step 604 is performed.
And step 604, sending the at least two frames of plane images subjected to the encoding and packaging processing to at least two video network terminals in a live broadcast mode.
As shown in fig. 4, the at least two frames of plane images subjected to the encoding and packaging process are generated into a video network live stream in a live form and sent to at least two video network terminals.
When each frame of panoramic image is decomposed into six frames of plane images, the mapping from the panoramic image to six surfaces of a cube can be completed, namely the panoramic image is live broadcast at six angles.
In the embodiment of the invention, after receiving a panoramic live video stream, the virtual terminal acquires at least one frame of panoramic image from the stream, decomposes each frame of panoramic image into at least two frames of planar images, encodes and encapsulates the planar images, and sends the encoded and encapsulated planar images to at least two video networking terminals in a live broadcast form. In this way, the at least two video networking terminals can watch the live broadcast from different angles, the mapping of the video networking panorama onto multiple surfaces can be realized, redundant data in the panoramic live video stream can be removed at the terminal, and each frame of planar image in any direction of the panoramic live video stream can conveniently be scheduled to form a live video stream for that direction.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 5, a block diagram of a panoramic video live broadcast apparatus according to a third embodiment of the present invention is shown, where the panoramic video live broadcast apparatus 700 may be applied to a video network, and the apparatus specifically includes:
an obtaining module 701, configured to obtain at least one frame of panoramic image in a panoramic live video stream after the panoramic live video stream is received;
a decomposition module 702, configured to decompose each frame of panoramic image into at least two frames of planar images;
a sending module 703, configured to send the two frames of plane images to at least two video networking terminals in a live broadcast manner, so that the two video networking terminals display the plane images respectively.
Optionally, the obtaining module includes:
the extraction submodule is used for extracting a panoramic image stream from the panoramic live broadcast video stream;
and the first processing submodule is used for carrying out demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
Optionally, the decomposition module comprises:
the acquisition submodule is used for acquiring the number of rows and the number of columns of each frame of panoramic image;
the second processing submodule is used for carrying out coordinate transformation processing on pixel points in each frame of panoramic image by combining the line number and the column number of each frame of panoramic image;
and the obtaining submodule is used for responding to the coordinate transformation processing and respectively obtaining at least two frames of plane images of each frame of panoramic image under at least two plane coordinate systems.
Optionally, the sending module includes:
the encoding submodule is used for encoding and packaging at least two frames of plane images;
and the sending submodule is used for sending the at least two frames of plane images subjected to the coding and packaging processing to at least two video network terminals in a live broadcast mode.
In the embodiment of the invention, after the panoramic live video stream is received, the virtual terminal acquires at least one frame of panoramic image from the stream through the acquisition module, decomposes each frame of panoramic image into at least two frames of planar images through the decomposition module, and sends the planar images to at least two video networking terminals in a live broadcast form through the sending module, so that each terminal displays its respective planar image. In this way, at least two video networking terminals can watch the live broadcast from different angles, the mapping of the video networking panorama onto multiple surfaces can be realized, redundant data in the panoramic live video stream can be removed at the terminal, and each frame of planar image in any direction of the panoramic live video stream can conveniently be scheduled to form a live video stream for that direction.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, including:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform a panoramic video live method as described above.
An embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program for enabling a processor to execute the panoramic video live broadcasting method.
The video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, and it pushes many Internet applications toward high-definition video and high-definition face-to-face interaction.
The video networking adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication and data, on one system platform over a network platform, for example high-definition video conferencing, video monitoring, intelligent monitoring analysis, emergency command, digital broadcast television, delayed television, network teaching, live broadcast, video on demand (VOD), television mail, Personal Video Recorder (PVR), intranet (self-office) channels, intelligent video broadcast control and information distribution, and realizes high-definition-quality video broadcast through a television or a computer.
To better understand the embodiments of the present invention, the following description refers to the internet of view:
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
Network technology innovation in the video network improves on traditional Ethernet to cope with the potentially enormous video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, the video networking technology adopts Packet Switching to meet the demands of Streaming. The video networking technology has the flexibility, simplicity and low cost of packet switching while also providing the quality and security guarantees of circuit switching, thereby realizing seamless connection of whole-network switched virtual circuits and the data format.
Switching Technology (Switching Technology)
The video network adopts two advantages of asynchronism and packet switching of the Ethernet, eliminates the defects of the Ethernet on the premise of full compatibility, has end-to-end seamless connection of the whole network, is directly communicated with a user terminal, and directly bears an IP data packet. The user data does not require any format conversion across the entire network. The video networking is a higher-level form of the Ethernet, is a real-time exchange platform, can realize the real-time transmission of the whole-network large-scale high-definition video which cannot be realized by the existing Internet, and pushes a plurality of network video applications to high-definition and unification.
Server Technology (Server Technology)
The server technology on the video networking and unified video platform is different from the traditional server, the streaming media transmission of the video networking and unified video platform is established on the basis of connection orientation, the data processing capacity of the video networking and unified video platform is independent of flow and communication time, and a single network layer can contain signaling and data transmission. For voice and video services, the complexity of video networking and unified video platform streaming media processing is much simpler than that of data processing, and the efficiency is greatly improved by more than one hundred times compared with that of a traditional server.
Storage Technology (Storage Technology)
In order to adapt to media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. Program information in the server instruction is mapped to specific hard disk space, and the media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 second. The optimized sector distribution greatly reduces the mechanical movement of hard disk head seeking; resource consumption is only 20% of that of an IP Internet system of the same grade, while concurrent traffic 3 times greater than that of a traditional hard disk array is generated, and overall efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of the video network completely eliminates the network security problem troubling the internet structurally by the modes of independent service permission control each time, complete isolation of equipment and user data and the like, generally does not need antivirus programs and firewalls, avoids the attack of hackers and viruses, and provides a structural carefree security network for users.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services and transmission: whether for a single user, a private network user or a network aggregate, there is only one automatic connection. The user terminal, set-top box or PC connects directly to the unified video platform to obtain a variety of multimedia video services in various forms. The unified video platform adopts a menu-style configuration table in place of traditional complex application programming, so that complex applications can be realized with very little code, enabling endless new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 6, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server is a node server of the access network part, namely the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 7, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
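The forwarding rule of the switching engine can be illustrated with a minimal sketch. The queue capacity and the `port.send_buffer_full()` / `port.send()` interface below are hypothetical placeholders; the sketch only mirrors the two conditions stated above (the port send buffer is not full and the queue packet counter is greater than zero), plus the drop-when-nearly-full behaviour of the packet buffer.

```python
from collections import deque


class PacketQueue:
    """One packet-buffer queue of the switching engine (the capacity value is an assumption)."""
    def __init__(self, capacity=1024):
        self.packets = deque()
        self.capacity = capacity

    def enqueue(self, packet):
        # A nearly full queue drops the incoming packet, as described above
        if len(self.packets) >= self.capacity:
            return False
        self.packets.append(packet)
        return True


def poll_and_forward(queues_and_ports):
    """One polling pass over all packet-buffer queues of the switching engine.

    `port` is a hypothetical object representing the outgoing port; a packet is
    forwarded only when 1) the port send buffer is not full and 2) the queue
    packet counter is greater than zero.
    """
    for queue, port in queues_and_ports:
        if port.send_buffer_full():
            continue
        if not queue.packets:
            continue
        port.send(queue.packets.popleft())
```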
The access switch:
as shown in fig. 8, the network interface module (downlink network interface module 301, uplink network interface module 302), switching engine module 303 and CPU module 304 are mainly included;
wherein, the packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), the Source Address (SA), the packet type, and the packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id) and enters the switching engine module 303, otherwise, discards the stream identifier; the packet (downstream data) coming from the upstream network interface module 302 enters the switching engine module 303; the incoming data packet of the CPU module 304 enters the switching engine module 303; the switching engine module 303 performs an operation of looking up the address table 306 on the incoming packet, thereby obtaining the direction information of the packet; if the packet entering the switching engine module 303 is from the downstream network interface to the upstream network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if the queue of the packet buffer 307 is nearly full, it is discarded; if the packet entering the switching engine module 303 is not from the downlink network interface to the uplink network interface, the data packet is stored in the queue of the corresponding packet buffer 307 according to the guiding information of the packet; if the queue of the packet buffer 307 is nearly full, it is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304, and generates tokens for packet buffer queues from all downstream network interfaces to upstream network interfaces at programmable intervals to control the rate of upstream forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
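The rate-control behaviour described for the downlink-to-uplink queues can be sketched as a simple token generator. The interval value and the class interface are assumptions; in the access switch this logic is configured by the CPU module and applied only to queues running from a downlink network interface to an uplink network interface, which may forward a packet only when they obtain a token.

```python
import time


class RateController:
    """Token generator configured by the CPU module (the default interval is an assumption)."""
    def __init__(self, interval_s=0.001):
        self.interval_s = interval_s
        self.tokens = 0
        self._last = time.monotonic()

    def _refill(self):
        # Generate one token per configured interval
        elapsed = time.monotonic() - self._last
        new_tokens = int(elapsed / self.interval_s)
        if new_tokens:
            self.tokens += new_tokens
            self._last += new_tokens * self.interval_s

    def try_consume(self):
        """A downlink-to-uplink queue forwards a packet only when this returns True."""
        self._refill()
        if self.tokens > 0:
            self.tokens -= 1
            return True
        return False
```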
Ethernet protocol conversion gateway
As shown in fig. 9, the system mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein, a data packet coming from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 detects whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type, and packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id); then the MAC deletion module 410 strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receiving buffer; otherwise, the packet is discarded;
The downlink network interface module 401 detects the sending buffer of the port; if there is a packet, it obtains the Ethernet MAC DA of the corresponding terminal according to the video networking destination address DA of the packet, adds the Ethernet MAC DA of the terminal, the MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and sends the packet.
The other modules in the ethernet protocol gateway function similarly to the access switch.
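A minimal sketch of the MAC deletion and MAC adding modules is shown below, assuming standard 14-byte Ethernet headers; the default length/frame-type value used when re-adding the header is a placeholder, since the text above only says "length or frame type".

```python
ETH_HEADER_LEN = 14  # 6-byte MAC DA + 6-byte MAC SA + 2-byte length/frame type


def strip_ethernet_header(frame: bytes) -> bytes:
    """MAC deletion module: remove the Ethernet MAC DA, MAC SA and length/frame type
    so that only the video-networking packet enters the receiving buffer."""
    return frame[ETH_HEADER_LEN:]


def add_ethernet_header(packet: bytes, terminal_mac: bytes, gateway_mac: bytes,
                        length_or_type: bytes = b'\x08\x00') -> bytes:
    """MAC adding module: prepend the terminal's MAC DA, the gateway's MAC SA and the
    length/frame-type field before sending on the downlink port (the default
    length_or_type value here is a placeholder, not taken from the patent)."""
    return terminal_mac + gateway_mac + length_or_type + packet
```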
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 Devices of the metropolitan area network part can be mainly classified into 3 types: node server, node switch, and metropolitan area server. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA SA Reserved Payload CRC
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload part has different lengths according to the type of datagram: it is 64 bytes for the various types of protocol packets, and 32+1024 = 1056 bytes for unicast data packets; of course, the length is not limited to these two cases;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be 2 or even more than 2 connections between two devices, i.e., there may be more than 2 connections between a node switch and a node server, between two node switches, or between a node switch and a metropolitan area server. However, the metropolitan area network address of a metro network device is unique, so in order to accurately describe the connection relationship between metro network devices, a parameter is introduced in the embodiment of the present invention: a label, to uniquely describe a metropolitan area network device.
In this specification, the definition of the label is similar to that of the label in MPLS (Multi-Protocol Label Switching). Assuming there are two connections between device A and device B, a packet from device A to device B has 2 labels, and a packet from device B to device A also has 2 labels. Labels are classified into incoming labels and outgoing labels; assuming the label (incoming label) of a packet entering device A is 0x0000, the label (outgoing label) of the packet when it leaves device A may become 0x0001. The network access process of the metro network is a network access process under centralized control, that is, both address allocation and label allocation of the metro network are dominated by the metropolitan area server, and the node switch and the node server execute them passively. This differs from label allocation in MPLS, where label allocation is the result of mutual negotiation between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA SA Reserved Label Payload CRC
Namely Destination Address (DA), Source Address (SA), Reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined with reference to the following: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used, and its position is between the reserved bytes and the payload of the packet.
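A companion sketch for the metropolitan-area-network packet simply inserts the 32-bit label (upper 16 bits reserved, lower 16 bits carrying the outgoing label) between the reserved bytes and the payload; as with the access-network sketch above, the field byte order is an assumption.

```python
import struct
import zlib


def build_metro_packet(dst_addr: bytes, src_addr: bytes, out_label: int, payload: bytes) -> bytes:
    """Metro-network packet: same layout as the access-network packet plus a 32-bit
    label placed between the reserved bytes and the payload."""
    assert len(dst_addr) == 8 and len(src_addr) == 8
    label = struct.pack('>I', out_label & 0xFFFF)  # upper 16 bits stay zero (reserved)
    header = dst_addr + src_addr + b'\x00\x00' + label
    crc = struct.pack('>I', zlib.crc32(header + payload) & 0xFFFFFFFF)
    return header + payload + crc
```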
Based on the characteristics of the video network, one of the core concepts of the embodiments of the present invention is provided: after receiving a panoramic live video stream, a virtual terminal acquires at least one frame of panoramic image from the stream, decomposes each frame of panoramic image into at least two frames of planar images, and sends the planar images to at least two video networking terminals in a live broadcast manner so that each terminal displays its respective planar image. In this way, at least two video networking terminals can watch the live broadcast from different angles, the mapping of the panoramic live broadcast of the video network onto multiple surfaces is realized, redundant data in the panoramic live video stream is removed at the terminal, and each frame of planar image in any direction of the panoramic live video stream can easily be scheduled to form a live video stream for that direction.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The panoramic video live broadcast method and device provided by the present invention have been introduced in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, for those skilled in the art, there may be changes in the specific implementation and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A panoramic video live broadcast method is applied to a video network, and comprises the following steps:
after receiving a panoramic live video stream, acquiring at least one frame of panoramic image in the panoramic live video stream;
decomposing each frame of the panoramic image into at least two frames of planar images;
and respectively sending the two frames of plane images to at least two video network terminals in a live broadcast mode so as to enable the two video network terminals to respectively display the plane images.
2. The method of claim 1, wherein the obtaining at least one frame of panoramic image in the panoramic live video stream comprises:
extracting a panoramic image stream from the panoramic live video stream;
and carrying out demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
3. The method of claim 1, wherein decomposing the panoramic image for each frame into at least two planar images comprises:
acquiring the number of rows and columns of the panoramic image of each frame;
combining the line number and the column number of each frame of panoramic image, and performing coordinate transformation processing on pixel points in each frame of panoramic image;
and responding to the coordinate transformation processing, and respectively obtaining at least two frames of plane images of each frame of panoramic image under at least two plane coordinate systems.
4. The method according to claim 1, wherein the sending the two frames of plane images to at least two video network terminals respectively in a live mode comprises:
encoding and packaging the at least two frames of plane images;
and sending the at least two frames of plane images subjected to the encoding and packaging processing to the at least two video network terminals in a live broadcast mode.
5. A panoramic video live broadcast device is characterized in that the panoramic video live broadcast device is applied to a video network, and the device comprises:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring at least one frame of panoramic image in a panoramic live video stream after receiving the panoramic live video stream;
the decomposition module is used for decomposing each frame of panoramic image into at least two frames of plane images;
and the sending module is used for respectively sending the two frames of plane images to at least two video network terminals in a live broadcast mode so as to enable the two video network terminals to respectively display the plane images.
6. The apparatus of claim 5, wherein the obtaining module comprises:
the extraction submodule is used for extracting a panoramic image stream from the panoramic live broadcast video stream;
and the first processing submodule is used for carrying out demultiplexing processing on the panoramic image stream to obtain at least one frame of panoramic image.
7. The apparatus of claim 5, wherein the decomposition module comprises:
the acquisition submodule is used for acquiring the number of rows and the number of columns of the panoramic image of each frame;
the second processing submodule is used for carrying out coordinate transformation processing on pixel points in each frame of panoramic image by combining the line number and the column number of each frame of panoramic image;
and the obtaining submodule is used for responding to the coordinate transformation processing and respectively obtaining at least two frames of plane images of each frame of panoramic image under at least two plane coordinate systems.
8. The apparatus of claim 5, wherein the sending module comprises:
the encoding submodule is used for encoding and packaging the at least two frames of plane images;
and the sending submodule is used for sending the at least two frames of plane images subjected to the coding and packaging processing to the at least two video network terminals in a live broadcast mode.
9. An electronic device, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the panoramic video live method of any of claims 1-4.
10. A computer-readable storage medium storing a computer program for causing a processor to execute the panoramic video live broadcasting method according to any one of claims 1 to 4.
CN202010075800.5A 2020-01-22 2020-01-22 Panoramic video live broadcast method and device Pending CN111246237A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010075800.5A CN111246237A (en) 2020-01-22 2020-01-22 Panoramic video live broadcast method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010075800.5A CN111246237A (en) 2020-01-22 2020-01-22 Panoramic video live broadcast method and device

Publications (1)

Publication Number Publication Date
CN111246237A true CN111246237A (en) 2020-06-05

Family

ID=70878116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010075800.5A Pending CN111246237A (en) 2020-01-22 2020-01-22 Panoramic video live broadcast method and device

Country Status (1)

Country Link
CN (1) CN111246237A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106550239A (en) * 2015-09-22 2017-03-29 北京同步科技有限公司 360 degree of panoramic video live broadcast systems and its implementation
US20170200315A1 (en) * 2016-01-07 2017-07-13 Brendan Lockhart Live stereoscopic panoramic virtual reality streaming system
CN107529064A (en) * 2017-09-04 2017-12-29 北京理工大学 A kind of self-adaptive encoding method based on VR terminals feedback
CN109698952A (en) * 2017-10-23 2019-04-30 腾讯科技(深圳)有限公司 Playback method, device, storage medium and the electronic device of full-view video image
CN108063946A (en) * 2017-11-16 2018-05-22 腾讯科技(成都)有限公司 Method for encoding images and device, storage medium and electronic device
CN108322727A (en) * 2018-02-28 2018-07-24 北京搜狐新媒体信息技术有限公司 A kind of panoramic video transmission method and device
CN108986117A (en) * 2018-07-18 2018-12-11 北京优酷科技有限公司 Video image segmentation method and device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113038262A (en) * 2021-01-08 2021-06-25 深圳市智胜科技信息有限公司 Panoramic live broadcast method and device
CN112866786A (en) * 2021-01-14 2021-05-28 视联动力信息技术股份有限公司 Video data processing method and device, terminal equipment and storage medium

Similar Documents

Publication Publication Date Title
CN108737768B (en) Monitoring method and monitoring device based on monitoring system
CN110166433B (en) Method and system for acquiring video data
CN111124333A (en) Method, device, equipment and storage medium for synchronizing display contents of electronic whiteboard
CN108574816B (en) Video networking terminal and communication method and device based on video networking terminal
CN110113564B (en) Data acquisition method and video networking system
CN109743284B (en) Video processing method and system based on video network
CN109714568B (en) Video monitoring data synchronization method and device
CN110769179B (en) Audio and video data stream processing method and system
CN110650147A (en) Data acquisition method and system
CN110557319A (en) Message processing method and device based on video network
CN108965783B (en) Video data processing method and video network recording and playing terminal
CN111246237A (en) Panoramic video live broadcast method and device
CN110769297A (en) Audio and video data processing method and system
CN110830826A (en) Video transcoding equipment scheduling method and system
CN110086773B (en) Audio and video data processing method and system
CN110022500B (en) Packet loss processing method and device
CN110138729B (en) Data acquisition method and video networking system
CN109474661B (en) Method and system for processing network request event
CN109768964B (en) Audio and video display method and device
CN110677315A (en) Method and system for monitoring state
CN110830762A (en) Audio and video data processing method and system
CN110139060B (en) Video conference method and device
CN110536148B (en) Live broadcasting method and equipment based on video networking
CN109859824B (en) Pathological image remote display method and device
CN110113563B (en) Data processing method based on video network and video network server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200605