CN107835435B - Event wide-view live broadcasting equipment and associated live broadcasting system and method - Google Patents

Event wide-view live broadcasting equipment and associated live broadcasting system and method

Info

Publication number
CN107835435B
CN107835435B (Application CN201710432037.5A)
Authority
CN
China
Prior art keywords
wide
view
live
video stream
media server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710432037.5A
Other languages
Chinese (zh)
Other versions
CN107835435A (en)
Inventor
支小牧
刘善红
王旭
倪建聪
段硕
李宝玉
陈龙
陈光辉
Original Assignee
Fblife Beijing Media Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fblife Beijing Media Technology Co ltd filed Critical Fblife Beijing Media Technology Co ltd
Priority to CN201710432037.5A
Publication of CN107835435A
Application granted
Publication of CN107835435B
Status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/233Processing of audio elementary streams
    • H04N21/2335Processing of audio elementary streams involving reformatting operations of audio signals, e.g. by converting from one coding standard to another
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • H04N21/6437Real-time Transport Protocol [RTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a live event broadcasting system comprising a wide-view-angle live broadcast device. The device comprises a wide-view-angle image acquisition unit for acquiring wide-view-angle images, and runs a computer instruction sequence to: obtain an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit; collect a sample wide-view image frame; derive, according to the sample wide-view image frame, the internal parameters, and the distortion coefficients, a pixel coordinate mapping relation matrix between the device's wide-view image frames and its panoramically expanded image frames; collect a plurality of live wide-view image frames; create a wide-view video stream from the captured live wide-view image frames; and transmit the live wide-view video stream and the pixel coordinate mapping relation matrix to at least one streaming media server.

Description

Event wide-view live broadcasting equipment and associated live broadcasting system and method
Technical Field
The present application relates to live devices and methods, and more particularly, to live event devices, systems, and methods.
Background
Live event broadcasting generally refers to transmitting the images and sounds of an event scene in real time to, for example, televisions, radios, computers, or mobile terminals over a cable television network, broadcast network, or the internet. With the development of internet technology, internet-based live broadcasting has emerged; streaming media technology, for example, compresses media data into segments and sends them over the network, transmitting video and audio in real time so that viewers can use dedicated playback software to watch and enjoy, in real time, the live picture from a particular shooting angle of a video capture device.
When sports racing events such as rallies are broadcast, a relatively wide view angle is usually required to capture the large scene, but prolonged playback of wide-view video can fatigue viewers and cause them to lose interest in the live broadcast. Conventional live broadcasting techniques usually employ time-sequential scene switching to present different angles, such as playing a local close-up segment at a first time and a panoramic segment at a second time. Although picture-in-picture (PiP) technology is available, it occupies extra resources in digital transmission and has therefore rarely been applied in digital television systems.
Accordingly, there is a need for an apparatus, system, and associated method that enables wide-view live broadcasting of events.
Disclosure of Invention
Some embodiments of the present application provide a live event system, comprising a wide-view live broadcast device, the wide-view live broadcast device comprising a wide-view image acquisition unit configured to acquire a wide-view image; a memory for storing a sequence of computer instructions; a processor executing the sequence of computer instructions to: obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit; collecting a sample wide-view-angle image frame; deriving a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients; collecting a plurality of live broadcast wide-view image frames; creating a wide-view video stream with the captured plurality of live wide-view image frames; and transmitting the live broadcast wide-view video stream and the pixel coordinate mapping relation matrix to at least one streaming media server.
Some embodiments of the present application provide a live event system comprising a wide-view live broadcast device, the wide-view live broadcast device comprising a wide-view image acquisition unit configured to acquire wide-view image frames; at least one sound acquisition unit for acquiring the sound field signal at the live broadcast device; a memory for storing a sequence of computer instructions; a processor executing the sequence of computer instructions to: obtain an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit; collect a sample wide-view image frame; preprocess the sample wide-view image frame; derive a pixel coordinate mapping relation matrix between the wide-view image frames of the wide-view live broadcast device and its panoramically expanded image frames according to the sample wide-view image frame, the internal parameters, and the distortion coefficients; acquire a plurality of live wide-view image frames; digitize the sound field signal into audio data; package the audio data and the plurality of wide-view image frames into a wide-view audio/video stream; transmit the pixel coordinate mapping relation matrix to the streaming media server; and transmit the wide-view audio/video stream to the streaming media server.
Some embodiments of the present application provide a live event system comprising a playback device comprising a processor and a memory; the memory stores a sequence of computer instructions that, when executed, cause the playback device to: receive, from at least one streaming media server, a pixel coordinate mapping relation matrix for at least one wide-view image acquisition device, together with a wide-view video stream or audio/video stream captured by that device; decode the wide-view video stream or audio/video stream to obtain a plurality of wide-view image frames; establish a hemispherical model; transform the corresponding hemispherical texture coordinates using the pixel coordinate mapping matrix; and texture-map each of the plurality of wide-view image frames onto the hemispherical model to render corrected image frames.
Some embodiments of the present application provide a live event system comprising at least one wide-view live broadcast device, each of the at least one wide-view live broadcast device comprising a wide-view image capture unit for acquiring wide-view image frames; a memory for storing a first sequence of computer instructions; a processor executing the first sequence of computer instructions to: obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit; collecting a sample wide-view-angle image frame; preprocessing the sample wide-view image frame; deriving a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients; acquiring a live broadcast wide-view video stream formed based on a plurality of live broadcast wide-view image frames; a streaming media server comprising a processor and a memory, the memory storing a second sequence of computer instructions which, when executed, cause the streaming media server to: receiving the live wide-view video stream and the pixel coordinate mapping relationship matrix from the at least one wide-view live device; and caching and scheduling the pixel coordinate mapping relation matrix and the wide-view video stream.
Some embodiments of the present application provide a live event system comprising at least one wide-view live broadcast device, each of the at least one wide-view live broadcast device comprising a wide-view image capture unit for acquiring wide-view image frames; a memory for storing a first sequence of computer instructions; a processor executing the first sequence of computer instructions to: obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit; acquiring a wide-view video stream formed based on a plurality of live wide-view image frames; transmitting the intrinsic parameter matrix and the distortion coefficient matrix to a streaming media server; transmitting the wide-view video stream to the streaming media server; a streaming media server in communication with at least one wide-view live device, comprising a processor and a memory, the memory storing a second sequence of computer instructions that, when executed, cause the streaming media server to: deducing a pixel coordinate mapping relation matrix of a wide-view image frame of the wide-view live broadcast equipment and a panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame received from the at least one wide-view live broadcast equipment and an internal parameter matrix and a distortion coefficient matrix of a wide-view image acquisition unit of the wide-view live broadcast equipment; locally storing the pixel coordinate mapping relation matrix of the wide-view-angle live broadcast equipment; receiving a wide-view video stream or an audio/video stream from the wide-view live broadcast equipment; caching the wide-view video stream; and scheduling the wide-view video stream and the pixel coordinate mapping relation matrix.
The wide-view live broadcast system and method are particularly suitable for scenarios such as live event broadcasting; they provide practical solutions for live broadcast of off-road events and the like, and in particular for applying wide-view live broadcast devices for events to streaming media live broadcast platforms.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 is a schematic diagram of a scenario according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a wide-view live broadcasting device according to an embodiment of the present application;
fig. 3A is a flow chart of a process of a wide-view live device according to an embodiment of the present application;
fig. 3B is another process flow diagram of a wide-view live device in accordance with an embodiment of the present application;
fig. 4 is a schematic diagram of a wide view live system in accordance with an embodiment of the present application;
FIG. 5 is a schematic diagram of a playback interface of a user device according to an embodiment of the present application;
FIG. 6 is a process flow diagram of a user equipment in accordance with an embodiment of the present application;
FIG. 7 is a schematic diagram of the pixel coordinate mapping relationship matrix generation function; and
fig. 8 is a schematic diagram of a pixel coordinate mapping relationship matrix update function.
Detailed Description
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The term "server" in this context refers to an intelligent electronic device that can perform predetermined processes such as numerical calculation and/or logic calculation by executing predetermined programs or instructions, and may include a processor and a memory, wherein the predetermined processes are performed by the processor executing program instructions prestored in the memory, or the predetermined processes are performed by hardware such as ASIC, FPGA, DSP, or a combination thereof.
The user equipment includes, but is not limited to, smart phones, PDAs, PCs, notebook computers, and the like. The network device includes, but is not limited to, a single network server, a server group consisting of multiple network servers, or a cloud computing based cloud consisting of a large number of computers or network servers; cloud computing is a form of distributed computing in which a super virtual computer is composed of a collection of loosely coupled computers. The computer device can operate alone to implement the invention, or can access a network and implement the invention through interoperation with other computer devices in the network. The network in which the computer device is located includes, but is not limited to, the internet, a wide area network, a metropolitan area network, a local area network, a VPN network, and the like.
It should be noted that the above user equipment, network devices, and networks are only examples; other existing or future computer devices or networks, where applicable to the present invention, are also intended to fall within its scope and are incorporated herein by reference.
The methodologies discussed hereinafter, some of which are illustrated by flow diagrams, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. The processor(s) may perform the necessary tasks.
Specific structural and functional details disclosed herein are merely representative and are provided for purposes of describing example embodiments of the present invention. The present invention may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element may be termed a second element, and, similarly, a second element may be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or the figures may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The live event system of one embodiment of the present invention may be adapted for live events such as vehicle racing, which may include, for example, track racing and cross-country racing, such as rallies. FIG. 1 illustrates a schematic view of a live system scenario for a vehicle racing event. As shown in FIG. 1, a typical vehicle racing event includes at least two vehicles, at least one of which has a live broadcast device mounted on it. The event venue may also have live broadcast devices oriented at different viewing angles. The live broadcast devices 10 may generate images (pictures), video streams, audio streams, or audio/video streams, which may be transmitted to a streaming media server. The streaming media server 20 may decode a received video stream or audio/video stream, restore it to image frames, and process each image frame. The processed image frames are then encoded, compressed, and transmitted through a live broadcast platform to user equipment on the client side, such as a smart phone, for playback. The live platform 30 may be a live server with a director system that controls the video, audio, or audio/video streams transmitted to the user devices and selects among the streams received by the live platform 30. The live platform 30 may store the received streams in different storage locations, and the user device 40 (playback device) may view a stream by accessing an address specified by the live platform.
Among the live broadcast devices shown in fig. 1, some may be wide-view live broadcast devices 10 having wide-view image capturing units. A wide-view image acquisition unit is, for example, a capture device with a view angle larger than 170 degrees. It may be, for example, an image or video capture module with a fisheye lens, such as a fisheye camera with a view angle of 220 degrees or more. Devices with such wide-angle lenses may be deployed, for example, inside a vehicle, such as on the dashboard or roof, or anywhere along the racing track.
In the illustration of fig. 1, some of the vehicles are equipped with a wide-view live broadcast device 10 having a fisheye camera, which may also be referred to as a fisheye live broadcast device. The fisheye live broadcast devices 10A, 10B, 10C, and 10D may be mounted on vehicles, for example, or may be mounted in the stadium, as shown by live broadcast devices 10E and 10F.
Fig. 2 shows a schematic diagram of a wide-view live broadcast device 10 with a fisheye camera in one embodiment. Referring to fig. 2, the live broadcast device includes a fisheye camera 11, a memory 12, a wired or wireless output unit 13, and a processor 14. The processor may communicate with the fisheye camera, memory, and output unit via bus 16. The image collected by the fisheye camera 11 is essentially a wide-view image with a view angle greater than 180 degrees, which may also be referred to as a fisheye image, and a computer program, which may be an instruction sequence, may be stored in the memory 12. As shown in figs. 3A and 3B, when the processor runs the program, it first acquires or receives a sample wide-view image S101 captured by the fisheye camera along with the camera's internal parameters and distortion parameters S102, and then preprocesses the acquired or received sample wide-view image S200. The internal parameter matrix and the distortion coefficient matrix of the fisheye camera, covering radial and tangential distortion, can be obtained by self-calibration methods such as an active-vision-based industrial camera self-calibration method, an industrial camera self-calibration method that directly solves the Kruppa equations, a layered stepwise calibration method, or a quadric-surface-based self-calibration method. They may also be obtained using, for example, a chessboard-based calibration method.
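To make the role of the internal parameter matrix and distortion coefficients concrete, the following numpy sketch projects 3D points through an equidistant fisheye model. The patent names no specific camera model, so this model, the four-coefficient radial polynomial (the convention of OpenCV's `cv2.fisheye` module), and the function name `fisheye_project` are illustrative assumptions only:

```python
import numpy as np

def fisheye_project(points_3d, K, D):
    """Project 3D camera-frame points through an equidistant fisheye model.

    K: 3x3 internal parameter matrix; D: four radial distortion
    coefficients (k1..k4). Illustrative sketch, not the patent's method.
    """
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    r = np.hypot(x, y)
    theta = np.arctan2(r, z)                     # angle from the optical axis
    k1, k2, k3, k4 = D
    theta_d = theta * (1 + k1*theta**2 + k2*theta**4
                         + k3*theta**6 + k4*theta**8)
    # distorted normalized coordinates have magnitude theta_d
    scale = np.where(r > 1e-12, theta_d / r, 1.0)
    xd, yd = x * scale, y * scale
    u = K[0, 0] * xd + K[0, 2]                   # apply focal length and
    v = K[1, 1] * yd + K[1, 2]                   # principal point
    return np.stack([u, v], axis=1)
```

A ray along the optical axis lands on the principal point, and a ray at 90 degrees lands at a radius of f*pi/2 pixels, which is why such a model can cover view angles well beyond 180 degrees.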
The preprocessing may include one or more of vignetting processing and wide-view image position correction. The vignetting processing may include, for example, cropping the wide-view image. The position correction may include, for example, alignment processing.
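A minimal sketch of the cropping side of this preprocessing, assuming the fisheye image circle's center and radius are known (in practice they would come from calibration; the name `crop_fisheye_circle` and the parameters are illustrative):

```python
import numpy as np

def crop_fisheye_circle(frame, cx, cy, radius):
    """Crop a square region around the fisheye image circle and zero out
    the vignetted corners outside it."""
    h, w = frame.shape[:2]
    x0, x1 = max(cx - radius, 0), min(cx + radius, w)
    y0, y1 = max(cy - radius, 0), min(cy + radius, h)
    crop = frame[y0:y1, x0:x1].copy()
    # mask of pixels inside the image circle
    yy, xx = np.mgrid[y0:y1, x0:x1]
    inside = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
    crop[~inside] = 0
    return crop
```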
In the embodiment depicted in fig. 3A, the wide-view live broadcast device may calculate, based on the internal parameters and distortion parameters, a pixel coordinate mapping relation matrix between a sample wide-view image of a given size, e.g., a fisheye image, and its unwrapped image, S300. The pixel coordinate mapping relation matrix can be obtained by acquiring one or more sample wide-view images and computing from the internal parameter matrix and the distortion coefficient matrix. Wide-view image frames are then acquired and a wide-view video stream is generated, which may be transmitted to a streaming media server along with the pixel coordinate mapping relation matrix. These wide-view images may be referred to as live wide-view images and the video stream as a wide-view video stream.
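One plausible way to derive such a mapping matrix is to trace every pixel of the unwrapped frame back through the camera model to a source pixel in the raw fisheye frame. The patent specifies neither the unwrapping geometry nor the camera model, so this sketch, which assumes an equidistant fisheye model and a simple longitude/latitude hemisphere unwrapping, is illustrative throughout:

```python
import numpy as np

def build_pixel_map(out_w, out_h, K, D):
    """Return an (out_h, out_w, 2) array giving, for each pixel of the
    panoramically unwrapped frame, the (u, v) source pixel in the raw
    fisheye frame -- a "pixel coordinate mapping relation matrix"."""
    # longitude spans the hemisphere's horizon, latitude its elevation
    lon = np.linspace(-np.pi / 2, np.pi / 2, out_w)
    lat = np.linspace(0, np.pi / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)
    # ray direction on the unit hemisphere in camera coordinates
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    # equidistant fisheye model (assumed, not from the patent)
    r = np.hypot(x, y)
    theta = np.arctan2(r, z)
    k1, k2, k3, k4 = D
    theta_d = theta * (1 + k1*theta**2 + k2*theta**4
                         + k3*theta**6 + k4*theta**8)
    scale = np.where(r > 1e-12, theta_d / r, 1.0)
    u = K[0, 0] * x * scale + K[0, 2]
    v = K[1, 1] * y * scale + K[1, 2]
    return np.stack([u, v], axis=-1)
```

Because the matrix depends only on the camera, it is computed once from the sample frame and parameters and then reused for every live frame, which is what makes sending it out-of-band alongside the stream economical.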
For example, the pixel coordinate mapping relation matrix and the wide-view images, with or without preprocessing, may be encoded and compressed into a video stream based on the RTMP protocol and sent to the streaming server; the pixel coordinate mapping relation matrix, for example, may be sent to the streaming server in a metadata packet of the RTMP protocol. A wide-view video stream may be encoded and transmitted as several frames of live wide-view images, or generated based on a group of pictures (GOP) format.
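Real RTMP metadata messages are AMF-encoded (e.g. `onMetaData`); the sketch below instead uses a deliberately simple, hypothetical length-prefixed binary framing purely to illustrate shipping the matrix alongside the stream in a metadata payload:

```python
import struct
import numpy as np

def pack_mapping_matrix(matrix):
    """Serialize the pixel coordinate mapping matrix as a metadata payload.
    Hypothetical framing: a big-endian (rows, cols, 2) header followed by
    float32 data; a real RTMP sender would AMF-encode instead."""
    m = np.ascontiguousarray(matrix, dtype=np.float32)
    header = struct.pack(">III", *m.shape)
    return header + m.tobytes()

def unpack_mapping_matrix(payload):
    """Inverse of pack_mapping_matrix (same-machine byte order assumed)."""
    rows, cols, depth = struct.unpack(">III", payload[:12])
    data = np.frombuffer(payload[12:], dtype=np.float32)
    return data.reshape(rows, cols, depth)
```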
The live wide-view video stream cached by the streaming media server may be transmitted through the live platform to the user equipment, or to a playback function of the playback device such as a panoramic player, and is presented by the panoramic player.
For example, as depicted in fig. 6, after the user device 40 receives the wide-view video stream and/or audio/video stream S601, it may decode it to restore wide-view image frames, for example restoring the intra-coded frames I of a group of pictures. The user device may also provide a panoramic player, for example by running a panoramic player application. During playback, a stereoscopic model may be established S602: for a fisheye image frame, for example, a hemispherical model is established, the pixel coordinate mapping matrix is used to transform the corresponding spherical texture coordinates S603, and a texture mapping technique is then applied to map the image frame onto the established hemispherical model S604, whereby a corrected image frame is displayed.
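The playback steps S602-S604 might be sketched as follows: build hemisphere vertices, then derive per-vertex texture coordinates from the pixel coordinate mapping matrix normalized to [0, 1] so a GPU can sample the raw fisheye frame directly. The mesh layout, the assumption that the vertex grid matches the grid the matrix was derived on, and the name `hemisphere_mesh` are all illustrative:

```python
import numpy as np

def hemisphere_mesh(pixel_map, tex_w, tex_h):
    """Return (vertices, texcoords) for the hemispherical model: unit-sphere
    positions plus texture coordinates looked up from the pixel coordinate
    mapping matrix and normalized by the fisheye frame size."""
    n_lat, n_lon = pixel_map.shape[:2]
    lat = np.linspace(0, np.pi / 2, n_lat)
    lon = np.linspace(-np.pi / 2, np.pi / 2, n_lon)
    lon, lat = np.meshgrid(lon, lat)
    verts = np.stack([np.cos(lat) * np.sin(lon),     # x
                      np.sin(lat),                   # y
                      np.cos(lat) * np.cos(lon)],    # z
                     axis=-1)
    tex = pixel_map / np.array([tex_w, tex_h], dtype=np.float64)
    return verts.reshape(-1, 3), tex.reshape(-1, 2)
```

In a real player these arrays would be uploaded once as vertex/UV buffers, and each decoded frame becomes the texture, so correction costs nothing per frame beyond the texture upload.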
In the embodiment depicted in fig. 3B, the image (or picture) acquired by the live broadcast device may instead be directly compressed and transmitted to the rich media server for storage as a file, or transmitted to the streaming media server as an encoded, compressed video stream; meanwhile, the measured internal parameters and distortion coefficients are transmitted to the streaming server through an RTMP metadata packet. A pixel coordinate mapping relation matrix calculation function in the streaming media server then calculates the pixel coordinate mapping relation matrix between a wide-view picture of a given size, such as a fisheye picture, and its expanded picture, and stores it locally on the streaming media server. The system may contain multiple streaming media servers, each of which may derive the pixel coordinate mapping relation matrix of a specific live broadcast device from the internal parameter and distortion coefficient matrices obtained from that device.
One, two or more streaming media servers can be arranged according to needs, and can cache and schedule video streams or audio and video streams from part or all of wide-view live broadcast equipment.
In another embodiment, as shown in fig. 7, the streaming media server 20 may have a pixel coordinate mapping relation matrix derivation function 21, which may be implemented by a program, for example a sequence of instructions; when executed by a processor of the streaming media server, the program calculates, based on the internal parameters and distortion coefficients received from a wide-view live broadcast device, the pixel coordinate mapping relation matrix between that device's wide-view (for example, fisheye) image frames of a given size and their panoramic expansion images. The streaming media server may also have a pixel coordinate mapping relation matrix storage function 22; if there are multiple wide-view live broadcast devices, the streaming media server locally stores a corresponding plurality of such matrices.
Generally, once the pixel coordinate mapping relationship matrix has been calculated, the streaming media server transmits it to the user equipment for as long as the specific wide-view live broadcast device remains connected to the streaming media server. The streaming media server may include a pixel coordinate mapping relationship matrix updating function 23: when the connection to the wide-view live broadcast device is broken S231, the streaming media server deletes the matrix. When the connection is re-established, the streaming media server again obtains the internal parameter and distortion coefficient matrices from the device S232, recalculates the pixel coordinate mapping relationship matrix S233, and transmits it to the user equipment S234, as shown in fig. 8.
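The lifecycle just described (keep the matrix while the device is connected, delete it on disconnect, recompute it on reconnect) might be sketched as a small per-device cache. This is purely illustrative; the class and method names are hypothetical:

```python
class MappingMatrixCache:
    """Illustrative sketch of the server-side matrix lifecycle: one
    matrix per connected device, deleted on disconnect (S231) and
    re-derived from freshly fetched parameters on reconnect (S232/S233)."""

    def __init__(self, derive_fn):
        self._derive = derive_fn   # callable(intrinsics, dist) -> matrix
        self._cache = {}

    def on_connect(self, device_id, intrinsics, dist):
        # S232/S233: obtain parameters again and recompute the matrix
        self._cache[device_id] = self._derive(intrinsics, dist)

    def on_disconnect(self, device_id):
        # S231: delete the stored matrix when the device drops
        self._cache.pop(device_id, None)

    def matrix_for(self, device_id):
        # S234: matrix delivered to the user device while connected
        return self._cache.get(device_id)
```

The design keeps the expensive derivation off the per-frame path: the matrix is recomputed only on (re)connection, never per video frame.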
Similarly, the rich media server may transmit the pre-processed wide-view image to the user device for display in response to a request from the user device 40. The wide-view live broadcast device may expand the wide-view image based on the internal parameter matrix and the distortion coefficient matrix to form an expanded wide-view image and transmit the expanded image to the rich media server. Alternatively, the wide-view live broadcast device may pre-process and expand the acquired wide-view images, send the expanded images to the rich media server for storage, and, upon receiving a request from the user equipment for a single expanded image, encode, compress, and transmit that image.
The user device 40 receives the pre-processed wide-view image, such as a panoramically expanded picture of a fisheye image, from the rich media server for presentation.
The user equipment 40 also receives the wide-view video stream, e.g., a fisheye-based wide-view video stream, and the pixel coordinate mapping relationship matrix from the streaming media server. The user equipment may have a restore function to restore the received wide-view video stream, e.g., to restore its intra-coded frames (I-frames) for a video stream transmitted in group-of-pictures format. The user device may further include a panoramic display function that creates a hemispherical model, transforms the corresponding spherical texture coordinates using the pixel coordinate mapping relationship matrix, and applies texture mapping to map each image frame onto the hemispherical model, thereby displaying the corrected image frame, as depicted in fig. 6.
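As an illustrative sketch of this display step, the following builds hemisphere vertices and derives per-vertex texture coordinates by sampling a pixel coordinate mapping relationship matrix. In a real player the mesh and UVs would be uploaded to a GPU for texture mapping; all names here are hypothetical:

```python
import numpy as np

def hemisphere_uv(mapping, stacks=8, slices=16):
    """For each vertex of a unit-hemisphere mesh, look up its source
    pixel in the fisheye frame via the pixel coordinate mapping
    relationship matrix `mapping` (shape (H, W, 2)) and normalise it
    to texture coordinates in [0, 1]."""
    h, w, _ = mapping.shape
    verts, uvs = [], []
    for i in range(stacks + 1):
        lat = (i / stacks) * (np.pi / 2)          # 0 .. 90 degrees
        for j in range(slices + 1):
            lon = (j / slices) * 2 * np.pi        # around the rim
            # vertex on the unit hemisphere (z is "up")
            x = np.cos(lat) * np.cos(lon)
            y = np.cos(lat) * np.sin(lon)
            z = np.sin(lat)
            verts.append((x, y, z))
            # index into the mapping matrix, then normalise to [0, 1]
            row = min(int(i / stacks * (h - 1)), h - 1)
            col = min(int(j / slices * (w - 1)), w - 1)
            u, v = mapping[row, col]
            uvs.append((u / w, v / h))
    return np.array(verts), np.array(uvs)
```

A virtual camera placed at the sphere's centre then renders the textured hemisphere; changing its viewing direction pans the panorama without any per-frame recomputation of the mapping.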
A panoramic video player may be provided at the user device to implement the aforementioned restore function and to align the initial focus point of the wide-view video stream with the geometric center of the panoramic image.
The streaming media server 20 may be geographically remote from the live platform 30; for example, the streaming media server 20 may be located at the center of the track, or on any vehicle.
The streaming media server 20 may store the received video stream segments in the storage location 31 of the live platform 30 for clients to retrieve, or may transmit the video stream segments directly to clients through the live platform based on the RTMP, HLS, or HTTP-FLV protocol.
The live broadcast device 10 may be provided with a sound collection unit 15 that collects sound field signals. The sound field signals may be digitized into audio data by a program run by the processor, and the audio data may be pre-processed, e.g., de-noised, at the live broadcast device. The pre-processed audio data and the wide-view image frames are synchronized, encoded, compressed, and encapsulated into an audio/video stream, which is pushed to the streaming media server 20 based on the RTMP protocol.
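The synchronization step can be illustrated by a simplified sketch that merges video frames and audio chunks into one stream ordered by presentation timestamp, as an encoder/muxer would before the RTMP push. Real pipelines handle this internally; the function below is purely illustrative:

```python
def interleave_av(video_frames, audio_chunks):
    """Merge video frames and audio chunks into a single timestamp-
    ordered sequence.  Each input item is a (pts_seconds, payload)
    tuple; the result is a list of (kind, pts, payload) tuples ready
    to be handed to a muxer."""
    tagged = [("video", pts, p) for pts, p in video_frames] + \
             [("audio", pts, p) for pts, p in audio_chunks]
    # stable sort by timestamp keeps same-pts video ahead of audio
    tagged.sort(key=lambda item: item[1])
    return tagged
```

In practice the timestamps come from a shared clock on the live broadcast device, so that audio captured while a frame was exposed carries a matching presentation time.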
Alternatively, the streaming media server 20 may periodically send the live broadcast device 10 a request for the video stream or audio/video stream, and the live broadcast device 10 may, in response, transmit the encoded and compressed video stream or audio/video stream to the streaming media server 20.
The streaming media server 20 buffers the obtained stream and simultaneously broadcasts it live, through the live platform 30, to the user devices 40 of the viewers.
The live platform 30 may also obtain a video stream from a normal-view live broadcast device 50. One or more normal-view live broadcast devices 50 may be arranged, on a vehicle or at any position on the playing field. A normal-view live broadcast device may have a structure similar to that of the wide-view live broadcast device; for example, it may likewise include an image acquisition unit, a processor, a memory, a bus, and a wired or wireless output unit. The normal-view live broadcast device 50 need only correct the acquired images to generate a video stream, which it pushes to the streaming media server.
Similarly, the normal-view live broadcast device 50 may also include a sound collection unit for acquiring sound field signals and digitizing them into audio data. The audio data may be encoded into an audio stream and pushed to the streaming media server, or encapsulated together with the video stream into an audio/video stream and pushed to the streaming media server.
The delivery of the expanded wide-view video stream or audio/video stream and the normal-view video stream or audio/video stream may be controlled via the director system 32 of the live platform. The live platform may be an existing cloud live platform.
The video stream, audio stream, or audio/video stream generated by the normal-view live broadcast device 50 may be pushed to the streaming media server 20 in parallel with the expanded wide-view video stream or audio/video stream generated by the wide-view live broadcast device. The streaming media server 20 may synchronize the wide-view stream with the normal-view stream and then transmit the synchronized streams to the live platform for storage, or transmit them through the live platform to a user device such as a smartphone.
The wide-view video stream or audio/video stream and the normal-view video stream or audio/video stream may also be transmitted to the playback device asynchronously.
A multi-angle playback interface may be provided at the playback device. For example, the interface shown in fig. 5 includes panorama player screens located in the middle, e.g., panorama play screens 1, 2, and 3, and normal-view screens 1 to 4, with a normal-view aspect ratio, on both sides of the unfolded wide-view screens. The video source played on a normal-view screen may be a video stream pushed by a normal-view live broadcast device arranged on the competition field or on a racing car.
For example, a video stream or audio/video stream may be played on one of the panorama play screens and normal-view screens, while a segment of a video stream is played repeatedly in a loop on another screen. When the user selects the looped segment, for example by a click or touch operation, the playback device pulls the cached video stream or audio/video stream of the corresponding live broadcast device from the storage of the live platform, or pulls in real time, from the streaming media server, the video stream or audio/video stream that the streaming media server has just obtained from the live broadcast device.
Upon receiving a user operation on the screen corresponding to a certain live broadcast device, the playback interface may also determine whether the screen needs to be switched from the expanded wide-view screen to the normal-view screen, or vice versa, and replace the playback interface if switching is determined to be needed.
Upon switching the playback screen, the streaming media server may be configured to recalculate the pixel coordinate mapping relationship matrix. For example, if panorama play screen 2 is selected and panorama play screen 1 is deselected, and panorama play screen 1 is later reselected, the streaming media server may recalculate the pixel coordinate mapping relationship matrix of the wide-view live broadcast device corresponding to panorama play screen 1 and transmit it to the user device for presenting the panoramic video.
The playback device 40 may include a touch control unit configured to receive a touch input that changes the viewing direction of the virtual camera unit.
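A minimal, hypothetical sketch of such a touch control: a drag of (dx, dy) pixels updates the virtual camera's yaw and pitch, with pitch clamped so the view cannot flip past the pole. The function name and the sensitivity constant are illustrative, not from the patent:

```python
import math

def apply_touch_delta(yaw, pitch, dx, dy, sensitivity=0.005):
    """Update the virtual camera orientation (radians) from a touch
    drag of (dx, dy) pixels.  Yaw wraps around the full circle; pitch
    is clamped to [-pi/2, pi/2] so the view cannot flip over the pole."""
    yaw = (yaw + dx * sensitivity) % (2 * math.pi)
    pitch = max(-math.pi / 2, min(math.pi / 2, pitch + dy * sensitivity))
    return yaw, pitch
```

The resulting (yaw, pitch) pair orients the virtual camera at the hemisphere's centre, so dragging pans the panoramic picture.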
Those skilled in the art should understand that the above-mentioned function options and their corresponding live event applications are merely examples for illustrating the present invention and should not be construed as limiting it in any way; other existing or future function options and their corresponding live event applications that are applicable to the present invention are also included within its scope.
It should be noted that the processing portions of the present invention may be implemented in software and/or a combination of software and hardware, for example, as an Application Specific Integrated Circuit (ASIC), a general purpose computer or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Also, the software programs (including associated data structures) of the present invention can be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Further, some of the steps or functions of the present invention may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, part of the present invention may be implemented as a computer program product, such as computer program instructions, which, when executed by a computer, may invoke or provide the method and/or technical solution according to the present invention through the operation of the computer. Program instructions which invoke the methods of the present invention may be stored on a fixed or removable recording medium and/or transmitted via a data stream on a broadcast or other signal-bearing medium and/or stored within a working memory of a computer device operating in accordance with the program instructions. An embodiment according to the invention herein comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or solution according to embodiments of the invention as described above.
Examples of the present application include:
1. a live event broadcasting system comprises a live broadcasting device with a wide view angle, wherein the live broadcasting device with the wide view angle comprises
The wide-view image acquisition unit is used for acquiring a wide-view image;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
collecting a plurality of live broadcast wide-view image frames;
creating a wide-view video stream with the captured plurality of live wide-view image frames;
and transmitting the live broadcast wide-view video stream and the pixel coordinate mapping relation matrix to at least one streaming media server.
2. According to the event live broadcasting system described in example 1, each frame of the live broadcast wide-view video stream is pre-processed and then transmitted to the streaming media server.
3. The live event system of example 1, wherein the live wide-view video stream is presented in the form of groups of pictures.
4. The live event broadcasting system of example 1, comprising: the live broadcast wide-view video stream is encoded and compressed into a video stream, and the video stream is pushed to a streaming media server based on an RTMP protocol.
5. The live event broadcasting system of example 4, wherein: the sample wide-view image frames and/or the live broadcast wide-view image frames are sample fisheye image frames and/or live broadcast fisheye image frames, and the wide-view live broadcast equipment is fisheye live broadcast equipment.
6. The live event broadcasting system of example 1, the wide-view live broadcasting device comprising
The wide-view image acquisition unit is used for acquiring a wide-view image;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit, and deducing a pixel coordinate mapping relation matrix of the wide-view image and the expanded image;
preprocessing the wide-view image to remove invalid data image parts;
expanding the wide visual angle image without the invalid data part based on the internal parameter matrix and the distortion coefficient matrix to form a wide visual angle expanded image;
and transmitting the wide view expanded image to a rich media server.
7. The live event broadcasting system of example 1, further comprising a sound collection unit configured to obtain a sound field signal of the live broadcast device on the racing car.
8. The live event system of example 7, the memory further storing computer instructions, the processor executing the computer instructions to implement
Digitizing the sound field signals into audio data;
the audio data and the plurality of wide-view image frames are packaged into wide-view audio and video streams; and
and transmitting the wide-view audio and video stream to the streaming media server.
9. A live event broadcasting system comprises a live broadcasting device with a wide view angle, wherein the live broadcasting device with the wide view angle comprises
The wide visual angle image acquisition unit is used for acquiring a wide visual angle image frame;
the system comprises at least one sound acquisition unit, a signal processing unit and a signal processing unit, wherein the sound acquisition unit is used for acquiring sound field signals of the live broadcast equipment of the sports racing car;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame;
preprocessing the sample wide-view image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
acquiring a plurality of live broadcast wide-view image frames;
digitizing the sound field signals into audio data;
packaging the audio data and the plurality of wide-view image frames into wide-view audio and video streams; and
transmitting the pixel coordinate mapping relation matrix to the streaming media server; and transmitting the wide-view audio and video stream to the streaming media server.
10. The event live broadcasting system according to example 9, wherein each frame of the live wide view video stream is pre-processed and then transmitted to the streaming media server.
11. The live event system of example 9, wherein the live wide-view video stream is presented in the form of groups of pictures.
12. The event live broadcasting system of example 9, wherein the live wide view video stream is encoded and compressed into a video stream to be pushed to a streaming media server based on an RTMP protocol.
13. The live event broadcasting system of example 12, wherein the sample wide view image frames and/or the live broadcast wide view image frames are sample fisheye image frames and/or live broadcast fisheye image frames, and the wide view live broadcast device is a fisheye live broadcast device.
14. The live event broadcasting system of example 9, the wide-view live broadcasting device comprising
The wide visual angle image acquisition unit is used for acquiring a wide visual angle image frame;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
preprocessing the wide view image to remove invalid data image portions;
expanding the wide visual angle image without the invalid data part based on the internal parameter matrix and the distortion coefficient matrix to form a wide visual angle expanded image;
and transmitting the wide view expanded image to a rich media server.
15. A live event system comprises
A playback device comprising a processor and a memory; the memory stores a sequence of computer instructions that, when executed, cause the playback device to:
receiving a pixel coordinate mapping relation matrix aiming at least one wide-view image acquisition device from at least one streaming media server and acquiring a wide-view video stream or an audio/video stream by the wide-view image acquisition device;
decoding the wide-view video stream or the audio/video stream to obtain a plurality of wide-view image frames;
establishing a hemispherical model;
transforming corresponding hemispherical texture coordinates by using the pixel coordinate mapping matrix;
texture mapping each frame of the plurality of wide-perspective image frames on the hemispherical model to render a corrected image frame.
16. The live event broadcasting system of example 15, further comprising a virtual camera unit located at the center of the sphere on which the hemisphere of the hemispherical model lies; the viewing direction of the virtual camera unit is adjustable.
17. The live event broadcasting system of example 16, the playback device comprising a tactile control unit configured to receive a tactile input to change a viewing direction of the virtual camera unit.
18. The event live broadcasting system according to example 15, wherein the wide view live broadcasting device pushes the wide view video stream to the streaming media server based on an RTMP protocol, or the wide view live broadcasting device transmits the expanded wide view video stream to the streaming media server based on a request of the streaming media server.
19. The live event broadcasting system of example 18, further comprising at least one normal view live device that generates a normal view video stream and transmits the normal view video stream to the streaming media server.
20. The live event system of example 16, the streaming media server to concurrently transmit the normal-view video stream received from the normal-view live broadcast device and the expanded wide-view video stream received from the wide-view live broadcast device to a live broadcast server.
21. A live event system comprises
At least one wide-view live device, each of the at least one wide-view live device comprising
The wide visual angle image acquisition unit is used for acquiring a wide visual angle image frame;
a memory for storing a first sequence of computer instructions;
a processor executing the first sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame;
preprocessing the sample wide-view image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
acquiring a live broadcast wide-view video stream formed based on a plurality of live broadcast wide-view image frames;
a streaming media server comprising a processor and a memory, the memory storing a second sequence of computer instructions which, when executed, cause the streaming media server to:
receiving the live wide-view video stream and the pixel coordinate mapping relationship matrix from the at least one wide-view live device;
and caching and scheduling the pixel coordinate mapping relation matrix and the wide-view video stream.
22. A live event system comprises
At least one wide-view live device, each of the at least one wide-view live device comprising
The wide visual angle image acquisition unit is used for acquiring a wide visual angle image frame;
a memory for storing a first sequence of computer instructions;
a processor executing the first sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
acquiring a wide-view video stream formed based on a plurality of live wide-view image frames;
transmitting the intrinsic parameter matrix and the distortion coefficient matrix to a streaming media server; and
transmitting the wide-view video stream to the streaming server;
a streaming media server in communication with at least one wide-view live device, comprising a processor and a memory, the memory storing a second sequence of computer instructions that, when executed, cause the streaming media server to:
deducing a pixel coordinate mapping relation matrix of a wide-view image frame of the wide-view live broadcast equipment and a panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame received from the at least one wide-view live broadcast equipment and an internal parameter matrix and a distortion coefficient matrix of a wide-view image acquisition unit of the wide-view live broadcast equipment;
locally storing the pixel coordinate mapping relation matrix of the wide-view-angle live broadcast equipment;
receiving a wide-view video stream or an audio/video stream from the wide-view live broadcast equipment;
caching the wide-view video stream; and
and scheduling the wide-view video stream and the pixel coordinate mapping relation matrix.
23. According to the live event broadcasting system of example 22, the scheduling the wide view video stream and the pixel coordinate mapping relationship matrix includes sending the wide view video stream or the audio/video stream together with the pixel coordinate mapping relationship matrix to a target playback device.
24. The event live system of example 22, the streaming media server to retain the pixel coordinate mapping relationship matrix until a connection of the streaming media server to the wide-view live device is broken.
25. The live event broadcasting system of example 23, wherein, when the streaming media server switches the user equipment's request for a video stream or audio/video stream to a different wide-view live broadcast device, the streaming media server derives a new pixel coordinate mapping relationship matrix from the most recently received sample wide-view image and replaces the stored matrix.
26. An event live broadcast method for controlling a wide-view live broadcast device comprising a wide-view image acquisition unit, the method comprising
Obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
collecting a plurality of live broadcast wide-view image frames;
creating a wide-view video stream with the captured plurality of live wide-view image frames;
and transmitting the live broadcast wide-view video stream and the pixel coordinate mapping relation matrix to at least one streaming media server.
27. An event live broadcast method for controlling a wide-view live broadcast device comprising a wide-view image acquisition unit, the method comprising obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame;
preprocessing the sample wide-view image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
acquiring a plurality of live broadcast wide-view image frames;
digitizing the sound field signals into audio data;
packaging the audio data and the plurality of wide-view image frames into wide-view audio and video streams; and
transmitting the pixel coordinate mapping relation matrix to the streaming media server; and transmitting the wide-view audio and video stream to the streaming media server.
28. A computer readable storage medium having stored thereon a sequence of computer instructions for carrying out the method of example 26 or 27.

Claims (28)

1. A live event broadcasting system comprises a live broadcasting device with a wide view angle, wherein the live broadcasting device with the wide view angle comprises
The wide-view image acquisition unit is used for acquiring a wide-view image;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
collecting a plurality of live broadcast wide-view image frames;
creating a wide-view video stream with the captured plurality of live wide-view image frames;
transmitting the live wide view video stream and the pixel coordinate mapping relationship matrix to at least one streaming media server, the streaming media server configured to: transmitting the wide-view video stream to a user device via a live platform; as long as the wide-view-angle live broadcast equipment is connected with the streaming media server, the streaming media server always transmits the pixel coordinate mapping relation matrix to the user equipment; when the connection between the streaming media server and the wide-view-angle live broadcast equipment is disconnected, the streaming media server deletes the pixel coordinate mapping relation matrix;
the user equipment is configured to receive and restore the wide-view video stream, wherein restoring the wide-view video stream comprises establishing a stereoscopic model, transforming stereoscopic texture coordinates corresponding to the stereoscopic model using the pixel coordinate mapping relation matrix, and then applying a texture mapping technology to map the live wide-view image frame onto the stereoscopic model to display a corrected image.
2. The live event broadcasting system of claim 1, wherein each frame of the live wide view video stream is pre-processed and then transmitted to the streaming media server.
3. A live event system as claimed in claim 1 wherein the live wide view video stream is presented in the form of groups of pictures.
4. A live event broadcasting system according to claim 1, wherein: the live broadcast wide-view video stream is encoded and compressed into a video stream, and the video stream is pushed to a streaming media server based on an RTMP protocol.
5. The live event broadcasting system of claim 4, wherein: the sample wide-view image frames and/or the live broadcast wide-view image frames are sample fisheye image frames and/or live broadcast fisheye image frames, and the wide-view live broadcast equipment is fisheye live broadcast equipment.
6. A live event broadcast system as claimed in claim 1, said wide view live broadcast device comprising
The wide-view image acquisition unit is used for acquiring a wide-view image;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit, and deducing a pixel coordinate mapping relation matrix of the wide-view image and the expanded image;
preprocessing the wide-view image to remove invalid data image parts;
expanding the wide visual angle image without the invalid data part based on the internal parameter matrix and the distortion coefficient matrix to form a wide visual angle expanded image;
and transmitting the wide view expanded image to a rich media server.
7. A live event broadcast system as claimed in claim 1 further comprising a sound capture unit for capturing sound field signals of the live broadcast device.
8. A live event system as in claim 7 wherein said memory further stores computer instructions, said processor executing said computer instructions to implement
Digitizing the sound field signals into audio data;
the audio data and the plurality of wide-view image frames are packaged into wide-view audio and video streams; and
and transmitting the wide-view audio and video stream to the streaming media server.
9. A live event broadcasting system comprises a live broadcasting device with a wide view angle, wherein the live broadcasting device with the wide view angle comprises
The wide visual angle image acquisition unit is used for acquiring a wide visual angle image frame;
at least one sound collection unit for acquiring sound field signals of the live broadcast device on the racing car;
a memory for storing a sequence of computer instructions;
a processor executing the sequence of computer instructions to:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view-angle image frame;
preprocessing the sample wide-view image frame; and
deducing a pixel coordinate mapping relation matrix of the wide-view image frame of the wide-view live broadcast equipment and the panoramic expansion image frame of the wide-view live broadcast equipment according to the sample wide-view image frame, the internal parameters and the distortion coefficients;
acquiring a plurality of live wide-view image frames;
digitizing the sound field signals into audio data;
packaging the audio data and the plurality of wide-view image frames into a wide-view audio/video stream; and
transmitting the pixel coordinate mapping matrix to a streaming media server, and transmitting the wide-view audio/video stream and the live wide-view image frames to the streaming media server;
wherein the streaming media server is configured to: transmit the wide-view video stream to a user device via a live platform; transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view live broadcast device remains connected to the streaming media server; and delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken; and
wherein the user device is configured to receive and restore the wide-view video stream, restoring the wide-view video stream comprising building a stereoscopic model, transforming the stereoscopic texture coordinates of the model using the pixel coordinate mapping matrix, and then applying texture mapping to map the live wide-view image frames onto the stereoscopic model to display a corrected image.
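The pixel coordinate mapping matrix of claim 9 pairs every pixel of the panoramic expanded frame with a source pixel in the wide-view (fisheye) frame, so it can be computed once from the intrinsics and reused for every live frame. The patent does not disclose the projection model; the sketch below assumes an equidistant fisheye model (r = f·θ), one common choice. The function name and toy intrinsics are illustrative.

```python
import math

def build_mapping(pano_w, pano_h, fx, fy, cx, cy, fov=math.pi):
    """For each pixel (u, v) of the panoramic expanded image, compute the
    source pixel (x, y) in the fisheye image under an equidistant
    projection r = f * theta. Returns a pano_h x pano_w grid of (x, y)."""
    mapping = []
    for v in range(pano_h):
        # latitude: angle theta in [0, fov/2] measured from the optical axis
        theta = (v / max(pano_h - 1, 1)) * (fov / 2)
        row = []
        for u in range(pano_w):
            # longitude: angle phi around the optical axis
            phi = (u / max(pano_w - 1, 1)) * 2 * math.pi
            x = cx + fx * theta * math.cos(phi)
            y = cy + fy * theta * math.sin(phi)
            row.append((x, y))
        mapping.append(row)
    return mapping

# Toy intrinsics for a 200x200 fisheye image, principal point at the center.
m = build_mapping(pano_w=8, pano_h=4, fx=60.0, fy=60.0, cx=100.0, cy=100.0)
```

Because the matrix depends only on calibration, not on image content, transmitting it once per connection (as the claim specifies) is enough for the user device to unwarp every subsequent frame.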
10. The live event broadcasting system of claim 9, wherein each frame of the live wide-view video stream is preprocessed before being transmitted to the streaming media server.
11. The live event broadcasting system of claim 9, wherein the live wide-view video stream is organized as groups of pictures (GOPs).
12. The live event broadcasting system of claim 9, wherein the live wide-view video stream is encoded and compressed into a video stream that is pushed to the streaming media server over the RTMP protocol.
13. The live event broadcasting system of claim 12, wherein the sample wide-view image frames and/or the live wide-view image frames are sample fisheye image frames and/or live fisheye image frames, and the wide-view live broadcast device is a fisheye live broadcast device.
14. The live event broadcasting system of claim 9, wherein the wide-view live broadcast device comprises:
a wide-view image acquisition unit for acquiring wide-view image frames;
a memory for storing a sequence of computer instructions; and
a processor executing the sequence of computer instructions to perform:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
preprocessing the wide-view image to remove invalid-data image portions;
expanding the wide-view image, with the invalid-data portion removed, based on the internal parameter matrix and the distortion coefficient matrix to form a wide-view expanded image; and
transmitting the wide-view expanded image to a rich media server.
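Claims 6 and 14 preprocess the wide-view image to remove invalid-data portions — for a fisheye lens these are typically the dark regions outside the circular image area. A minimal sketch of that masking step, assuming the image circle's center and radius are known from calibration (the patent does not specify how the invalid region is located):

```python
def crop_to_image_circle(img, cx, cy, r):
    """Zero out pixels outside the circular fisheye image area.
    img is a row-major list of lists of grayscale values."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # A pixel is valid only if it lies inside the image circle.
            inside = (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
            row.append(img[y][x] if inside else 0)
        out.append(row)
    return out

# 4x4 toy image, image circle of radius 1.5 centered in the middle.
img = [[9] * 4 for _ in range(4)]
cropped = crop_to_image_circle(img, cx=1.5, cy=1.5, r=1.5)
```

Discarding these pixels before expansion keeps invalid data out of the panoramic unwarp and slightly reduces the data to encode.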
15. A live event broadcasting system, comprising:
a playback device comprising a processor and a memory, the memory storing a sequence of computer instructions that, when executed, cause the playback device to perform operations comprising:
receiving, from at least one streaming media server and for at least one wide-view image acquisition device, a pixel coordinate mapping matrix between the wide-view image frames of the wide-view live broadcast device and the corresponding panoramic expanded image frames, and acquiring a wide-view video stream or audio/video stream originating from the wide-view image acquisition device;
decoding the wide-view video stream or audio/video stream to obtain a plurality of wide-view image frames;
building a hemispherical model;
transforming the corresponding hemispherical texture coordinates using the pixel coordinate mapping matrix; and
texture-mapping each of the plurality of wide-view image frames onto the hemispherical model to render corrected image frames;
wherein the streaming media server is configured to transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view acquisition device remains connected to the streaming media server, and to delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken.
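Claim 15's playback path builds a hemispherical model, attaches per-vertex texture coordinates, and maps each decoded frame onto the surface. The sketch below covers only the geometry side — a latitude/longitude tessellation producing (vertex, UV) pairs that a renderer such as OpenGL would consume. The tessellation scheme and the equidistant UV layout are assumptions; the patent does not specify them.

```python
import math

def hemisphere_mesh(stacks, slices, radius=1.0):
    """Tessellate a hemisphere into (vertex, uv) pairs.
    vertex: (x, y, z) on the hemisphere (z >= 0, viewer at the origin);
    uv: texture coordinates in [0, 1]^2, to be remapped through the
    pixel coordinate mapping matrix before texture lookup."""
    verts = []
    for i in range(stacks + 1):
        theta = (i / stacks) * (math.pi / 2)   # 0 at the pole, pi/2 at the rim
        for j in range(slices + 1):
            phi = (j / slices) * 2 * math.pi
            x = radius * math.sin(theta) * math.cos(phi)
            y = radius * math.sin(theta) * math.sin(phi)
            z = radius * math.cos(theta)
            # Equidistant UVs: distance from the texture center grows with theta.
            u = 0.5 + (theta / math.pi) * math.cos(phi)
            v = 0.5 + (theta / math.pi) * math.sin(phi)
            verts.append(((x, y, z), (u, v)))
    return verts

mesh = hemisphere_mesh(stacks=8, slices=16)
```

The virtual camera of claim 16 would sit at the sphere's center, so changing its viewing direction (claim 17) pans across the textured hemisphere without recomputing the mesh.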
16. The live event broadcasting system of claim 15, further comprising a virtual camera unit located at the center of the sphere on which the hemisphere of the hemispherical model lies, the viewing direction of the virtual camera unit being adjustable.
17. The live event broadcasting system of claim 16, wherein the playback device comprises a touch control unit configured to receive touch input that changes the viewing direction of the virtual camera unit.
18. The live event broadcasting system of claim 15, wherein the wide-view live broadcast device pushes the wide-view video stream to the streaming media server over the RTMP protocol, or transmits the wide-view video stream to the streaming media server upon the streaming media server's request.
19. The live event broadcasting system of claim 18, further comprising at least one normal-view live broadcast device that generates a normal-view video stream and transmits it to the streaming media server.
20. The live event broadcasting system of claim 19, wherein the streaming media server concurrently transmits the normal-view video stream received from the normal-view live broadcast device and the wide-view video stream received from the wide-view live broadcast device to a live broadcast server.
21. A live event broadcasting system, comprising:
at least one wide-view live broadcast device, each of the at least one wide-view live broadcast device comprising:
a wide-view image acquisition unit for acquiring wide-view image frames;
a memory for storing a first sequence of computer instructions; and
a processor executing the first sequence of computer instructions to perform:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view image frame;
preprocessing the sample wide-view image frame; and
deriving a pixel coordinate mapping matrix between the wide-view image frames of the wide-view live broadcast device and the panoramic expanded image frames of the wide-view live broadcast device from the sample wide-view image frame, the internal parameter matrix, and the distortion coefficient matrix;
acquiring a live wide-view video stream formed from a plurality of live wide-view image frames; and
a streaming media server comprising a processor and a memory, the memory storing a second sequence of computer instructions that, when executed, cause the streaming media server to perform operations comprising:
receiving the live wide-view video stream and the pixel coordinate mapping matrix from the at least one wide-view live broadcast device; and
caching and scheduling the pixel coordinate mapping matrix and the wide-view video stream;
wherein the streaming media server is configured to: transmit the wide-view video stream to a user device via a live platform; transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view live broadcast device remains connected to the streaming media server; and delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken; and
wherein the user device is configured to receive and restore the wide-view video stream, restoring the wide-view video stream comprising building a stereoscopic model, transforming the stereoscopic texture coordinates of the model using the pixel coordinate mapping matrix, and then applying texture mapping to map the live wide-view image frames onto the stereoscopic model to display a corrected image.
22. A live event broadcasting system, comprising:
at least one wide-view live broadcast device, each of the at least one wide-view live broadcast device comprising:
a wide-view image acquisition unit for acquiring wide-view image frames;
a memory for storing a first sequence of computer instructions; and
a processor executing the first sequence of computer instructions to perform:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
acquiring a wide-view video stream formed from a plurality of live wide-view image frames;
transmitting the internal parameter matrix and the distortion coefficient matrix to a streaming media server; and
transmitting the wide-view video stream to the streaming media server;
a streaming media server in communication with the at least one wide-view live broadcast device, comprising a processor and a memory, the memory storing a second sequence of computer instructions that, when executed, cause the streaming media server to perform operations comprising:
deriving a pixel coordinate mapping matrix between the wide-view image frames of the wide-view live broadcast device and the panoramic expanded image frames of the wide-view live broadcast device from the wide-view image frames received from the at least one wide-view live broadcast device and from the internal parameter matrix and distortion coefficient matrix of the device's wide-view image acquisition unit;
storing locally the pixel coordinate mapping matrix of the wide-view live broadcast device;
receiving a wide-view video stream or audio/video stream from the wide-view live broadcast device;
caching the wide-view video stream; and
scheduling the wide-view video stream and the pixel coordinate mapping matrix;
wherein the streaming media server is configured to: transmit the wide-view video stream to a user device via a live platform; transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view live broadcast device remains connected to the streaming media server; and delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken; and
wherein the user device is configured to receive and restore the wide-view video stream, restoring the wide-view video stream comprising building a stereoscopic model, transforming the stereoscopic texture coordinates of the model using the pixel coordinate mapping matrix, and then applying texture mapping to map the live wide-view image frames onto the stereoscopic model to display a corrected image.
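Claims 9, 21, 22, and 24 all repeat the same server-side lifecycle: the streaming media server keeps forwarding a device's pixel coordinate mapping matrix to user devices while the device stays connected, and discards it the moment the connection is broken. A dictionary keyed by device ID captures this contract; the class and method names below are illustrative, not from the patent.

```python
class StreamingMediaServer:
    """Caches one pixel coordinate mapping matrix per connected device
    and drops it as soon as the device disconnects (claims 9, 21, 22, 24)."""

    def __init__(self):
        self._matrices = {}  # device_id -> pixel coordinate mapping matrix

    def on_connect(self, device_id, mapping_matrix):
        # Received once from the device when the session is established.
        self._matrices[device_id] = mapping_matrix

    def matrix_for(self, device_id):
        # Forwarded to user devices alongside the stream while connected;
        # None once the device has disconnected.
        return self._matrices.get(device_id)

    def on_disconnect(self, device_id):
        # The matrix is deleted when the connection is broken.
        self._matrices.pop(device_id, None)

server = StreamingMediaServer()
server.on_connect("cam-1", [[0, 0], [1, 1]])
assert server.matrix_for("cam-1") is not None
server.on_disconnect("cam-1")
assert server.matrix_for("cam-1") is None
```

Keying the cache by device ID also matches claim 25's behavior: when a user device switches to a different wide-view device's stream, the server simply serves (or re-derives) the matrix for the new device ID.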
23. The live event broadcasting system of claim 22, wherein scheduling the wide-view video stream and the pixel coordinate mapping matrix comprises sending the wide-view video stream or audio/video stream, together with the pixel coordinate mapping matrix, to a target playback device.
24. The live event broadcasting system of claim 22, wherein the streaming media server retains the pixel coordinate mapping matrix until the streaming media server is disconnected from the wide-view live broadcast device.
25. The live event broadcasting system of claim 23, wherein, when a user device's request is switched between video or audio/video streams from different wide-view live broadcast devices, the streaming media server derives a new pixel coordinate mapping matrix from the most recently received sample wide-view image and replaces the stored pixel coordinate mapping matrix.
26. A live event broadcasting method for controlling a wide-view live broadcast device comprising a wide-view image acquisition unit, the method comprising:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view image frame; and
deriving a pixel coordinate mapping matrix between the wide-view image frames of the wide-view live broadcast device and the panoramic expanded image frames of the wide-view live broadcast device from the sample wide-view image frame, the internal parameter matrix, and the distortion coefficient matrix;
collecting a plurality of live wide-view image frames;
creating a wide-view video stream from the collected plurality of live wide-view image frames; and
transmitting the live wide-view video stream and the pixel coordinate mapping matrix to at least one streaming media server;
wherein the streaming media server is configured to: transmit the wide-view video stream to a user device via a live platform; transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view live broadcast device remains connected to the streaming media server; and delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken; and
wherein the user device is configured to receive and restore the wide-view video stream, restoring the wide-view video stream comprising building a stereoscopic model, transforming the stereoscopic texture coordinates of the model using the pixel coordinate mapping matrix, and then applying texture mapping to map the live wide-view image frames onto the stereoscopic model to display a corrected image.
27. A live event broadcasting method for controlling a wide-view live broadcast device comprising a wide-view image acquisition unit, the method comprising:
obtaining an internal parameter matrix and a distortion coefficient matrix of the wide-view image acquisition unit;
collecting a sample wide-view image frame;
preprocessing the sample wide-view image frame; and
deriving a pixel coordinate mapping matrix between the wide-view image frames of the wide-view live broadcast device and the panoramic expanded image frames of the wide-view live broadcast device from the sample wide-view image frame, the internal parameter matrix, and the distortion coefficient matrix;
acquiring a plurality of live broadcast wide-view image frames;
acquiring a sound field signal of the wide-view-angle live broadcast equipment;
digitizing the sound field signals into audio data;
packaging the audio data and the plurality of wide-view image frames into a wide-view audio/video stream; and
transmitting the pixel coordinate mapping matrix to a streaming media server, and transmitting the wide-view audio/video stream and the live wide-view image frames to the streaming media server;
wherein the streaming media server is configured to: transmit the wide-view video stream to a user device via a live platform; transmit the pixel coordinate mapping matrix to the user device for as long as the wide-view live broadcast device remains connected to the streaming media server; and delete the pixel coordinate mapping matrix when the connection between the streaming media server and the wide-view live broadcast device is broken; and
wherein the user device is configured to receive and restore the wide-view video stream, restoring the wide-view video stream comprising building a stereoscopic model, transforming the stereoscopic texture coordinates of the model using the pixel coordinate mapping matrix, and then applying texture mapping to map the live wide-view image frames onto the stereoscopic model to display a corrected image.
28. A computer-readable storage medium having stored thereon a sequence of computer instructions for carrying out the method of claim 26 or claim 27.
CN201710432037.5A 2017-06-09 2017-06-09 Event wide-view live broadcasting equipment and associated live broadcasting system and method Active CN107835435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710432037.5A CN107835435B (en) 2017-06-09 2017-06-09 Event wide-view live broadcasting equipment and associated live broadcasting system and method

Publications (2)

Publication Number Publication Date
CN107835435A CN107835435A (en) 2018-03-23
CN107835435B true CN107835435B (en) 2021-08-20

Family

ID=61643037

Country Status (1)

Country Link
CN (1) CN107835435B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020258047A1 (en) * 2019-06-25 2020-12-30 Beijing Xiaomi Mobile Software Co., Ltd. Omnidirectional media playback method and device and computer readable storage medium thereof
CN112689175A (en) * 2020-12-21 2021-04-20 四川一电航空技术有限公司 Scooter capable of being directly seeded
CN114866789A (en) * 2021-02-04 2022-08-05 华为技术有限公司 Vehicle-mounted live broadcast method and device
CN114630142B (en) * 2022-05-12 2022-07-29 北京汇智云科技有限公司 Large-scale sports meeting rebroadcast signal scheduling method and broadcasting production system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101726855A (en) * 2009-11-13 2010-06-09 河北工业大学 Correction method of fisheye image distortion on basis of cubic projection
CN102291527A (en) * 2011-08-11 2011-12-21 杭州海康威视软件有限公司 Panoramic video roaming method and device based on single fisheye lens
CN106060571A (en) * 2016-05-30 2016-10-26 湖南纽思曼导航定位科技有限公司 Automobile travelling data recorder and live video streaming method
CN106570938A (en) * 2016-10-21 2017-04-19 哈尔滨工业大学深圳研究生院 OPENGL based panoramic monitoring method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009139701A (en) * 2007-12-07 2009-06-25 Olympus Imaging Corp Zoom lens and imaging device using the same
CN103033913A (en) * 2011-10-06 2013-04-10 鸿富锦精密工业(深圳)有限公司 Zoom lens and imaging device

Similar Documents

Publication Publication Date Title
US11381801B2 (en) Methods and apparatus for receiving and/or using reduced resolution images
JP7045856B2 (en) Video transmission based on independent coded background update
CN107835435B (en) Event wide-view live broadcasting equipment and associated live broadcasting system and method
CN107211081B (en) Video transmission based on independently coded background updates
KR101528863B1 (en) Method of synchronizing tiled image in a streaming service providing system of panoramic image
WO2007061068A1 (en) Receiver and line video distributing device
JP7177034B2 (en) Method, apparatus and stream for formatting immersive video for legacy and immersive rendering devices
CN111542862A (en) Method and apparatus for processing and distributing live virtual reality content
CN114007059A (en) Video compression method, decompression method, device, electronic equipment and storage medium
JP5520146B2 (en) Video receiving apparatus and control method thereof
CN107835433B (en) Event wide-view live broadcasting system, associated equipment and live broadcasting method
CN107835434B (en) Event wide-view live broadcasting equipment and associated live broadcasting system and method
CN109479147B (en) Method and technical device for inter-temporal view prediction
CN112203101B (en) Remote video live broadcast method and device and electronic equipment
US20230222754A1 (en) Interactive video playback techniques to enable high fidelity magnification
CN113038262A (en) Panoramic live broadcast method and device
JP2022034941A (en) Moving image communication system
WO2019072861A1 (en) Selection of animated viewing angle in an immersive virtual environment
JP2003235058A (en) View point free type image display apparatus and method, storage medium, computer program, broadcast system, and attached information management apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211126

Address after: Room 501, gate 9, 9 / F, zhixueyuan, No. 29, Xierqi Road West, Haidian District, Beijing 100085

Patentee after: Zhi Xiaomu

Address before: Room 1513, building 3, No. 3, Xijing Road, Badachu high tech park, Shijingshan District, Beijing 100041

Patentee before: FBLIFE (BEIJING) MEDIA TECHNOLOGY CO.,LTD.