CN112533002A - Dynamic image fusion method and system for VR panoramic live broadcast


Info

Publication number
CN112533002A
CN112533002A (application CN202011284179.XA)
Authority
CN
China
Prior art keywords
live broadcast
live
dimensional
panoramic
module
Prior art date
Legal status
Pending
Application number
CN202011284179.XA
Other languages
Chinese (zh)
Inventor
张晖
李吉媛
赵海涛
孙雁飞
朱洪波
Current Assignee
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202011284179.XA
Publication of CN112533002A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/282Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a dynamic image fusion method and system for VR panoramic live broadcast, comprising the following steps: collecting multi-view live broadcast streams, compressing and encoding them, and pushing them to a streaming media server in real time; seamlessly stitching the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream; creating a virtual scene and a 3D sphere object, and building a mesh model in the virtual scene with the VR panorama as its material; creating a virtual perspective camera and specifying its initial position, direction and field of view; creating a 3D renderer that renders all two-dimensional live broadcast pictures within the virtual camera's field of view in the scene into three-dimensional images and displays them; and changing the position and direction of the virtual camera as the user's viewing angle changes, so as to generate and display live pictures from different viewing angles. The invention supports changes of the user's viewing angle, which not only expands the visible range of the live broadcast and eliminates blind zones, but also gives the user an immersive visual experience and improves the live broadcast effect.

Description

Dynamic image fusion method and system for VR panoramic live broadcast
Technical Field
The invention belongs to the field of 5G online live broadcast, and particularly relates to a dynamic image fusion method and system for VR panoramic live broadcast.
Background
The arrival of the 5G era has injected new vitality into the online live broadcast industry and broadened its range of applications. As technology develops, people continually pursue greater comfort and a better experience, so expectations for the live-viewing experience keep rising; live broadcast itself is bound to evolve into new forms with ever stronger functions. With 5G networks in use, immersive experience is becoming an important medium for learning, daily life, entertainment and aesthetics, and combining immersive experience with online live broadcast has become a focus of current research.
A traditional immersive experience requires purchasing and wearing professional VR glasses, which are expensive and raise the entry threshold; this hinders the popularization and spread of live broadcast and runs counter to the flexibility of watching live broadcast anytime and anywhere. In addition, a conventional live broadcast picture does not let the user change the observation viewing angle, and it has a small field of view, large blind areas and poor interactivity. How to give users a truly satisfying visual experience while facilitating the popularization and spread of live broadcast has become an urgent problem for those skilled in the art.
Disclosure of Invention
Purpose of the invention: the invention provides a dynamic image fusion method and system for VR panoramic live broadcast that make full use of the pixel information of images from different viewing angles and of consecutive frames, so that stitching is faster and smoother and live broadcast delay is reduced.
Summary of the invention: the invention provides a dynamic image fusion method for VR panoramic live broadcast, which specifically comprises the following steps:
(1) erecting cameras in different directions to collect multi-view live broadcast streams, and pushing them to a streaming media server in real time after compression and encoding;
(2) seamlessly stitching the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream;
(3) creating a virtual scene and a 3D sphere object, and then pasting the decoded VR panoramic live broadcast stream onto the inner surface of the sphere as the mesh material of the 3D sphere object;
(4) creating a virtual perspective camera in the virtual scene, and specifying the camera's initial position, direction and field of view;
(5) creating a 3D renderer, which renders all two-dimensional live broadcast pictures within the virtual camera's field of view in the virtual scene into three-dimensional images and displays them;
(6) changing the position and direction of the virtual camera according to the change of the user's viewing angle, so as to generate and display live pictures from different viewing angles.
Further, the step (2) is realized as follows:
(21) when stitching the images, for the overlapping area of the images, the factor of uneven brightness between live pictures from different viewing angles is taken into account, and a weighted average is used to smooth the overlapping area; the weighted-average expression for the images is:

I(x, y) = α₁·I₁(x, y) + α₂·I₂(x, y)

where I is the image stitched from I₁ and I₂, and α₁ and α₂ are the weights of I₁ and I₂, determined by the distance between a pixel in the overlapping area and the corresponding source image and taking values between 0 and 1 (a sketch of this blending is given after step (22));
(22) in the process of generating the VR panoramic live stream, not only the pixel information of the current image is considered: the pixel information of the next frame is also extracted, the pixels that change between the two consecutive frames are identified, the extracted pixel information is updated in time, and the current panoramic image of each frame is generated from the panoramic image of the previous frame.
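The following TypeScript sketch illustrates the weighted-average blending of step (21) over raw RGBA pixel buffers; the function name, the assumption that the two buffers are already aligned to the overlap region, and the linear distance-based weights are illustrative choices rather than details taken from the patent.

```typescript
// Weighted-average smoothing of the overlapping strip shared by two source
// images I1 and I2. Both buffers are RGBA and cover only the overlap region,
// already aligned to the same pixel grid. The weight of I1 falls off linearly
// with the distance across the strip, so a1 + a2 = 1 at every column.
function blendOverlap(
  i1: Uint8ClampedArray,
  i2: Uint8ClampedArray,
  width: number,
  height: number
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(i1.length);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const a1 = width > 1 ? 1 - x / (width - 1) : 0.5; // weight of I1
      const a2 = 1 - a1;                                // weight of I2
      const p = (y * width + x) * 4;
      for (let c = 0; c < 3; c++) {
        out[p + c] = a1 * i1[p + c] + a2 * i2[p + c];   // blend R, G, B
      }
      out[p + 3] = 255;                                 // opaque alpha
    }
  }
  return out;
}
```

In the embodiment described later both weights are simply set to 0.5, which corresponds to the midpoint of this linear fall-off.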
Further, the step (3) includes the steps of:
(31) creating a virtual Scene in a development environment;
(32) creating a 3D sphere object with a radius of 500 and 60 horizontal and 40 vertical segments;
(33) in the virtual scene, building a mesh model object Mesh from the 3D sphere object and the video texture material;
(34) obtaining the VR live broadcast stream from the SRS streaming media server over the HLS transmission protocol, decoding it with H.264, and pasting the VR live broadcast picture onto the inner surface of the sphere as the mesh model material.
Further, the initial position of the virtual perspective camera in step (4) is specified as (radius,0,0), the direction is the target object direction, and the field of view is 75 °.
The invention also provides a dynamic image fusion system for VR panoramic live broadcast, which comprises:
the acquisition module is used for acquiring the multi-view live broadcast stream and performing compression coding;
the dynamic image fusion module, which seamlessly stitches the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream and pushes the live broadcast stream to the VR player module;
the VR player module, which is used for the three-dimensional construction and playing of the VR panoramic live broadcast and comprises a VR module and a rendering module, wherein the VR module performs the three-dimensional reconstruction of the two-dimensional panoramic live broadcast view and restores the live broadcast scene as realistically as possible in three-dimensional space, and the rendering module performs the 3D rendering of the VR live broadcast and displays the three-dimensional panoramic live picture;
and the control module is used for controlling the operation of the virtual camera in the three-dimensional scene so as to generate and display the current three-dimensional live broadcast picture.
Further, the control module comprises a gravity sensing module and a gesture sliding module; the gravity sensing module senses the orientation of the screen held by the mobile terminal user and changes in that orientation, and changes the rotation angle of the virtual camera according to this sensing information so as to display the current three-dimensional live broadcast picture; the gesture sliding module acquires the direction and distance of the user's slide on the screen, and changes the position of the virtual camera according to this direction and distance information so as to display the current three-dimensional live broadcast picture.
Beneficial effects: compared with the prior art, the invention has the following beneficial effects: the dynamic image fusion algorithm provided by the method makes full use of the pixel information of images from different viewing angles, so that image stitching is more natural and smooth, the panoramic stitching time is shortened, and the live broadcast delay is reduced; the method adopts 3D drawing technology and performs the three-dimensional reconstruction of the panoramic live view based on OpenGL, restoring the live scene as realistically as possible in three-dimensional space. It not only lets the user change the observation viewing angle, enlarges the field of view and brings an immersive experience, but also requires no wearable equipment: a naked-eye 3D effect can be experienced simply by opening the live broadcast link anytime and anywhere, which favors the popularization and spread of live broadcast.
Drawings
FIG. 1 is a flow chart of a dynamic image fusion method for VR panorama live broadcast;
FIG. 2 is a flow chart of three-dimensional picture reconstruction for VR panoramic live view;
fig. 3 is a schematic structural diagram of a dynamic image fusion system for VR panorama live broadcasting.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
As shown in fig. 1, the dynamic image fusion method for VR panorama live broadcast provided by the present invention specifically includes the following steps:
step 1: the multi-view live broadcast stream is collected by erecting cameras in different directions, and is pushed to a streaming media server in real time after being compressed and coded.
The compression encoding used is H.264. The streaming media server is the open-source SRS streaming media server; the RTMP transmission protocol is used on the uplink, and the live stream is pushed to the SRS streaming media server in real time.
Step 2: and seamlessly splicing the live broadcast pictures at different visual angles by using a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream.
The dynamic image fusion algorithm works as follows: when images are stitched, the uneven brightness of pixels from different images is fully taken into account for the overlapping areas, and a weighted average is used to smooth them; in the process of generating the VR live stream, not only the pixel information of the current image is considered: the pixel information of the next frame that has changed relative to the previous frame is also extracted and updated in time, and the current panorama of each frame is generated from the panorama of the previous frame.
And step 3: and creating a virtual scene and a 3D sphere object, and then pasting the decoded VR panoramic live stream on the inner surface of the sphere as a grid material of the 3D sphere object.
As shown in fig. 2, the three-dimensional picture reconstruction of the VR panorama live view of the present invention mainly includes the following steps:
(1) a virtual Scene is created in the development environment.
The OpenGL-based 3D drawing engine Three.js is used to create the virtual Scene.
(2) Creating a 3D sphere object with a radius of 500 and 60 horizontal and 40 vertical segments; when the 3D sphere object is created with the engine, the sphere radius is set to 500 and the horizontal and vertical segment counts default to 60 and 40, respectively.
(3) In the virtual scene, building a mesh model object Mesh from the 3D sphere object and the video texture material; the video texture is created by the Three.js engine from the panoramic live stream, and with this texture used as the material, each panoramic frame of the video stream is pasted onto the inner surface of the 3D sphere as a map.
(4) Creating a virtual perspective camera in the scene and specifying its initial position, direction and field of view; a virtual perspective camera PerspectiveCamera is created in the scene with the Three.js engine, its initial position is set to (radius, 0, 0), its direction is the direction of the target object, and its field of view is 75 degrees. A sketch of steps (1)-(4) is given below.
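The following is a minimal TypeScript sketch of steps (1)-(4) using the Three.js engine named above; the video element id is an illustrative assumption, and the inward-facing sphere is obtained here by mirroring the geometry, one of several equivalent ways to texture the inner surface.

```typescript
import * as THREE from 'three';

// (1) Virtual scene.
const scene = new THREE.Scene();

// (2) Sphere with radius 500, 60 horizontal and 40 vertical segments.
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // mirror the faces so the texture is visible on the inner surface

// (3) Video texture built from the decoded panoramic live stream, used as the mesh material.
const video = document.getElementById('panorama-video') as HTMLVideoElement;
const texture = new THREE.VideoTexture(video);
const mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
scene.add(mesh);

// (4) Perspective camera: 75 degree field of view, initial position (radius, 0, 0),
// pointed at the centre of the sphere (the target object).
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 1100);
camera.position.set(500, 0, 0);
camera.lookAt(new THREE.Vector3(0, 0, 0));
```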
The position and direction of the virtual perspective camera can be changed by dragging the mouse on the PC, or by sliding gestures or gravity sensing on the mobile terminal. Mouse dragging on the PC and gesture sliding on the mobile terminal cover four directions: up, down, left and right. The coordinate position of the virtual camera in the three-dimensional world is changed according to the sliding distance of the PC mouse or the mobile gesture along the X and Y axes, and at different coordinate positions the camera captures live pictures from different viewing angles. Gravity sensing on the mobile terminal senses the rotation angles alpha, beta and gamma of the user's device around its three coordinate axes X, Y and Z, and these three angles are applied to change the coordinate position of the virtual camera, so that the user can see live pictures from different viewing angles according to his or her preference.
The invention also provides a dynamic image fusion system for VR panoramic live broadcasting, as shown in fig. 3, which is a structural block diagram of a VR live broadcasting mobile terminal model of the present embodiment, and according to the structural block diagram, the mobile terminal model includes an acquisition module, a dynamic image fusion module, a VR player module, and a control module, wherein:
the acquisition module is mainly used for acquiring the multi-view live broadcast stream and performing compression coding;
the dynamic image fusion module is mainly used to seamlessly stitch the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm, generate the VR panoramic live broadcast stream, and push the live broadcast stream to the VR player module;
the VR player module is used for the three-dimensional construction and playing of the VR panoramic live broadcast and comprises a VR module and a rendering module. The VR module performs the three-dimensional reconstruction of the two-dimensional panoramic live view and restores the live scene as realistically as possible in three-dimensional space; the rendering module performs the 3D rendering of the VR live broadcast and displays the three-dimensional panoramic live picture;
the control module is mainly used to control the operation of the virtual camera in the three-dimensional scene so as to generate and display the current three-dimensional live picture. The control module comprises a gravity sensing module and a gesture sliding module: the gravity sensing module senses the orientation of the screen held by the mobile terminal user and changes in that orientation, and changes the rotation angle of the virtual camera according to this sensing information so as to display the current three-dimensional live picture; the gesture sliding module acquires the direction and distance of the user's slide on the screen, and changes the position of the virtual camera according to this information so as to display the current three-dimensional live picture.
The following detailed description is provided in conjunction with the embodiments and with reference to the accompanying drawings in order to provide a further explanation of the technical features and advantages of the present invention.
At the live broadcast site, multi-view live broadcast streams are collected by erecting cameras in four different directions (east, west, south and north), and after H.264 compression encoding the live broadcast streams are transmitted to the SRS streaming media server over the RTMP transmission protocol. To reduce transmission delay and better preserve the real-time character of the live broadcast, a 5G network is used for transmission.
Images from different angles at the same moment are extracted from the live streams sent by the acquisition end, and the live pictures from different viewing angles are seamlessly stitched with the dynamic image fusion algorithm: the overlapping area is smoothed with a weighted average, and at the same time the pixel information that changes between consecutive frames is extracted and promptly applied to the current frame to generate the VR panoramic live stream. The weighted-average expression for the images is:

I(x, y) = α₁·I₁(x, y) + α₂·I₂(x, y)

where I is the image stitched from I₁ and I₂, and α₁ and α₂ are the weights of I₁ and I₂, determined by the distance between a pixel in the overlapping area and the corresponding source image and taking values between 0 and 1; here both weights are taken as 0.5. A sketch of the inter-frame differential update is given below.
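A minimal TypeScript sketch of the inter-frame differential update follows; the change threshold, the function name, and the simplification that the source frame has already been warped into panorama coordinates are illustrative assumptions, since the patent only states that the changed pixel information is extracted and applied to the current panorama.

```typescript
// Update the current panorama from the previous one by copying only the pixels
// that changed noticeably between the previous and the next source frame.
// All buffers are RGBA and share the same dimensions; threshold is the minimum
// per-channel difference (0-255) treated as a real change (assumed value).
function differentialUpdate(
  prevPanorama: Uint8ClampedArray,
  prevFrame: Uint8ClampedArray,
  nextFrame: Uint8ClampedArray,
  threshold = 12
): Uint8ClampedArray {
  const current = new Uint8ClampedArray(prevPanorama); // start from the previous panorama
  for (let p = 0; p < nextFrame.length; p += 4) {
    const changed =
      Math.abs(nextFrame[p] - prevFrame[p]) > threshold ||
      Math.abs(nextFrame[p + 1] - prevFrame[p + 1]) > threshold ||
      Math.abs(nextFrame[p + 2] - prevFrame[p + 2]) > threshold;
    if (changed) {
      current[p] = nextFrame[p];         // R
      current[p + 1] = nextFrame[p + 1]; // G
      current[p + 2] = nextFrame[p + 2]; // B
      current[p + 3] = 255;              // A
    }
  }
  return current;
}
```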
The VR panoramic live stream stored on the streaming media server is then sliced in live-stream order using the HLS transmission protocol, with each TS slice lasting 5 seconds, and pushed to the preset VR player module.
The VR module in the VR player receives the live-stream TS slices and uses them for the three-dimensional picture reconstruction of the VR panoramic live view: each short segment of live stream serves as the video texture, a mesh model Mesh is constructed, and the Mesh is added to the virtual scene created in the development environment. A sketch of loading the HLS slices into the video element that backs this texture is given below.
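Browsers other than Safari need a helper library to play HLS; the sketch below uses hls.js, which is an assumption on our part since the patent only names the HLS protocol, and the stream URL and element id are likewise illustrative.

```typescript
import Hls from 'hls.js';

// Attach the HLS live stream (5-second TS slices) to the <video> element that
// will later be wrapped in a THREE.VideoTexture.
const video = document.getElementById('panorama-video') as HTMLVideoElement;
const streamUrl = 'http://example-srs-server/live/panorama.m3u8'; // assumed URL

if (Hls.isSupported()) {
  const hls = new Hls();   // hls.js handles playback via Media Source Extensions
  hls.loadSource(streamUrl);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = streamUrl;   // Safari plays HLS natively
}
video.muted = true;        // allow autoplay without a user gesture
void video.play();
```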
In the development environment, a WebGLRenderer is created with the Three.js engine, a canvas container is created for it and the rendering size is set; the constructed three-dimensional VR panorama is displayed live inside this container and is continuously refreshed so that the live picture is updated in real time, as sketched below.
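A minimal sketch of the renderer and refresh loop, reusing the THREE import, scene and camera from the earlier sketch; the container id is an illustrative assumption.

```typescript
// Create the 3D renderer, attach its canvas to the page and set the rendering size.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.getElementById('player-container')?.appendChild(renderer.domElement);

// Continuously refresh: the VideoTexture picks up new live frames automatically,
// so re-rendering on every animation frame keeps the picture up to date.
function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```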
The controller module is mainly used to control the operation of the virtual camera in the three-dimensional scene. When gravity sensing is enabled, the motion state of the device is monitored in real time through the device orientation change event. Any rotation of the device can be described by the three angles alpha, beta and gamma; the changes of these three angles are applied to change the position and direction of the camera, and the live picture is then updated and rendered in real time, so that the user's observation viewing angle changes. A sketch of this listener is given below.
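A minimal sketch of the gravity-sensing control, assuming the standard browser deviceorientation event and a direct mapping of the three angles onto the camera rotation; the exact mapping used by the invention is not specified, so this one is illustrative, and it reuses the camera from the earlier sketch.

```typescript
// Map the device rotation angles (alpha, beta, gamma, in degrees) onto the
// virtual camera so the live picture follows how the phone is held.
window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
  const alpha = THREE.MathUtils.degToRad(event.alpha ?? 0); // rotation about the Z axis
  const beta = THREE.MathUtils.degToRad(event.beta ?? 0);   // rotation about the X axis
  const gamma = THREE.MathUtils.degToRad(event.gamma ?? 0); // rotation about the Y axis
  // Illustrative mapping: apply the three angles directly as camera Euler angles.
  camera.rotation.set(beta, alpha, -gamma);
});
```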
When the gesture sliding function of the mobile terminal is used, the user's sliding on the screen is monitored in real time; the sliding distances along the X and Y axes are described by longitude and latitude (lon and lat), and the new camera position after the movement is calculated from these two sliding angles. At the new position the live picture of a new viewing angle is seen, as in the sketch below.
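A minimal sketch of the gesture control, converting touch movement into lon/lat angles and a camera look direction; the sensitivity factor and the spherical-coordinate mapping are illustrative assumptions, and the THREE import and camera come from the earlier sketch.

```typescript
// Convert touch slides into longitude/latitude angles and point the camera accordingly.
let lon = 0;   // horizontal viewing angle, degrees
let lat = 0;   // vertical viewing angle, degrees
let startX = 0, startY = 0, startLon = 0, startLat = 0;

window.addEventListener('touchstart', (e: TouchEvent) => {
  startX = e.touches[0].clientX;
  startY = e.touches[0].clientY;
  startLon = lon;
  startLat = lat;
});

window.addEventListener('touchmove', (e: TouchEvent) => {
  // 0.2 degrees per pixel is an assumed sensitivity.
  lon = startLon + (startX - e.touches[0].clientX) * 0.2;
  lat = startLat + (e.touches[0].clientY - startY) * 0.2;
  lat = Math.max(-85, Math.min(85, lat)); // avoid flipping at the poles

  // Spherical coordinates of the point the camera should look at.
  const phi = THREE.MathUtils.degToRad(90 - lat);
  const theta = THREE.MathUtils.degToRad(lon);
  const target = new THREE.Vector3(
    500 * Math.sin(phi) * Math.cos(theta),
    500 * Math.cos(phi),
    500 * Math.sin(phi) * Math.sin(theta)
  );
  camera.lookAt(target);
});
```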

Claims (6)

1. A dynamic image fusion method for VR panoramic live broadcast is characterized by comprising the following steps:
(1) erecting cameras in different directions to collect multi-view live broadcast streams, and pushing them to a streaming media server in real time after compression and encoding;
(2) seamlessly stitching the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream;
(3) creating a virtual scene and a 3D sphere object, and then pasting the decoded VR panoramic live broadcast stream onto the inner surface of the sphere as the mesh material of the 3D sphere object;
(4) creating a virtual perspective camera in the virtual scene, and specifying the camera's initial position, direction and field of view;
(5) creating a 3D renderer, which renders all two-dimensional live broadcast pictures within the virtual camera's field of view in the virtual scene into three-dimensional images and displays them;
(6) changing the position and direction of the virtual camera according to the change of the user's viewing angle, so as to generate and display live pictures from different viewing angles.
2. The dynamic image fusion method for VR live panorama according to claim 1, wherein the step (2) is implemented as follows:
(21) when stitching the images, for the overlapping area of the images, the factor of uneven brightness between live pictures from different viewing angles is taken into account, and a weighted average is used to smooth the overlapping area; the weighted-average expression for the images is:

I(x, y) = α₁·I₁(x, y) + α₂·I₂(x, y)

where I is the image stitched from I₁ and I₂, and α₁ and α₂ are the weights of I₁ and I₂, determined by the distance between a pixel in the overlapping area and the corresponding source image and taking values between 0 and 1;
(22) in the process of generating the VR panoramic live stream, not only the pixel information of the current image is considered: the pixel information of the next frame is also extracted, the pixels that change between the two consecutive frames are identified, the extracted pixel information is updated in time, and the current panoramic image of each frame is generated from the panoramic image of the previous frame.
3. The dynamic image fusion method for VR live panorama according to claim 1, wherein the step (3) comprises the steps of:
(31) creating a virtual Scene in a development environment;
(32) creating a 3D sphere object with a radius of 500 and 60 horizontal and 40 vertical segments;
(33) in the virtual scene, building a mesh model object Mesh from the 3D sphere object and the video texture material;
(34) obtaining the VR live broadcast stream from the SRS streaming media server over the HLS transmission protocol, decoding it with H.264, and pasting the VR live broadcast picture onto the inner surface of the sphere as the mesh model material.
4. The dynamic image fusion method for VR panorama live broadcasting of claim 1, wherein the initial position of the virtual perspective camera of step (4) is specified as (radius, 0, 0), the direction is the direction of the target object, and the field of view is 75°.
5. A dynamic image fusion system for VR live panorama using the method of any of claims 1-4, the system comprising:
the acquisition module is used for acquiring the multi-view live broadcast stream and performing compression coding;
the dynamic image fusion module, which seamlessly stitches the live broadcast pictures from different viewing angles with a dynamic image fusion algorithm to generate a VR panoramic live broadcast stream and pushes the live broadcast stream to the VR player module;
the VR player module, which is used for the three-dimensional construction and playing of the VR panoramic live broadcast and comprises a VR module and a rendering module, wherein the VR module performs the three-dimensional reconstruction of the two-dimensional panoramic live broadcast view and restores the live broadcast scene as realistically as possible in three-dimensional space, and the rendering module performs the 3D rendering of the VR live broadcast and displays the three-dimensional panoramic live picture;
and the control module is used for controlling the operation of the virtual camera in the three-dimensional scene so as to generate and display the current three-dimensional live broadcast picture.
6. The dynamic image fusion system for VR live panorama of claim 5, wherein the control module comprises a gravity sensing module and a gesture sliding module; the gravity sensing module senses the orientation of the screen held by the mobile terminal user and changes in that orientation, and changes the rotation angle of the virtual camera according to this sensing information so as to display the current three-dimensional live broadcast picture; the gesture sliding module acquires the direction and distance of the user's slide on the screen, and changes the position of the virtual camera according to this direction and distance information so as to display the current three-dimensional live broadcast picture.
CN202011284179.XA 2020-11-17 2020-11-17 Dynamic image fusion method and system for VR panoramic live broadcast Pending CN112533002A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011284179.XA CN112533002A (en) 2020-11-17 2020-11-17 Dynamic image fusion method and system for VR panoramic live broadcast

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011284179.XA CN112533002A (en) 2020-11-17 2020-11-17 Dynamic image fusion method and system for VR panoramic live broadcast

Publications (1)

Publication Number Publication Date
CN112533002A true CN112533002A (en) 2021-03-19

Family

ID=74981045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011284179.XA Pending CN112533002A (en) 2020-11-17 2020-11-17 Dynamic image fusion method and system for VR panoramic live broadcast

Country Status (1)

Country Link
CN (1) CN112533002A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105898337A (en) * 2015-11-18 2016-08-24 乐视网信息技术(北京)股份有限公司 Panoramic video display method and device
WO2018014495A1 (en) * 2016-07-18 2018-01-25 范治江 Real-time panoramic live broadcast network camera and system and method
CN106604042A (en) * 2016-12-22 2017-04-26 Tcl集团股份有限公司 Panorama webcasting system and panorama webcasting method based on cloud server
CN106658212A (en) * 2017-01-20 2017-05-10 北京红马传媒文化发展有限公司 VR online playing method, system and player based on HTML5
CN111383204A (en) * 2019-12-19 2020-07-07 北京航天长征飞行器研究所 Video image fusion method, fusion device, panoramic monitoring system and storage medium

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113194326A (en) * 2021-04-28 2021-07-30 平安国际智慧城市科技股份有限公司 Panoramic live broadcast method and device, computer equipment and computer readable storage medium
CN113873285A (en) * 2021-10-14 2021-12-31 中国科学院软件研究所 Naked eye 3D live broadcast method, device and system based on Hongmon distributed capability
CN113992679A (en) * 2021-10-26 2022-01-28 广域铭岛数字科技有限公司 Automobile image display method, system and equipment
CN113992679B (en) * 2021-10-26 2023-10-31 广域铭岛数字科技有限公司 Automobile image display method, system and equipment
CN114007017A (en) * 2021-11-18 2022-02-01 浙江博采传媒有限公司 Video generation method and device and storage medium
CN114286121A (en) * 2021-12-22 2022-04-05 天翼视讯传媒有限公司 Method and system for realizing picture guide live broadcast based on panoramic camera
CN114401414A (en) * 2021-12-27 2022-04-26 北京达佳互联信息技术有限公司 Immersive live broadcast information display method and system and information push method
CN114401414B (en) * 2021-12-27 2024-01-23 北京达佳互联信息技术有限公司 Information display method and system for immersive live broadcast and information pushing method
CN114449169A (en) * 2022-01-27 2022-05-06 中影电影数字制作基地有限公司 Cutting method and system for displaying panoramic video in CAVE space
CN114449169B (en) * 2022-01-27 2023-11-17 中影电影数字制作基地有限公司 Clipping method and system for showing panoramic video in CAVE space
CN114518825A (en) * 2022-02-14 2022-05-20 广州塔普鱼网络科技有限公司 XR (X-ray diffraction) technology-based man-machine interaction method and system
CN115190321A (en) * 2022-05-13 2022-10-14 广州博冠信息科技有限公司 Switching method and device of live broadcast room and electronic equipment
CN115190321B (en) * 2022-05-13 2024-06-04 广州博冠信息科技有限公司 Live broadcast room switching method and device and electronic equipment
CN115225923A (en) * 2022-06-09 2022-10-21 广州博冠信息科技有限公司 Gift special effect rendering method and device, electronic equipment and live broadcast server
CN115225923B (en) * 2022-06-09 2024-03-22 广州博冠信息科技有限公司 Method and device for rendering gift special effects, electronic equipment and live broadcast server
CN115442658A (en) * 2022-08-04 2022-12-06 珠海普罗米修斯视觉技术有限公司 Live broadcast method and device, storage medium, electronic equipment and product
CN115442658B (en) * 2022-08-04 2024-02-09 珠海普罗米修斯视觉技术有限公司 Live broadcast method, live broadcast device, storage medium, electronic equipment and product
CN115665461B (en) * 2022-10-13 2024-03-22 聚好看科技股份有限公司 Video recording method and virtual reality device
CN115665461A (en) * 2022-10-13 2023-01-31 聚好看科技股份有限公司 Video recording method and virtual reality equipment
CN116320366A (en) * 2023-05-18 2023-06-23 中数元宇数字科技(上海)有限公司 Video stream data pushing method, device, equipment and storage medium
CN117111873B (en) * 2023-10-23 2024-01-09 南昌市一境信息技术有限公司 Immersion interaction system based on cave environment
CN117111873A (en) * 2023-10-23 2023-11-24 南昌市一境信息技术有限公司 Immersion interaction system based on cave environment

Similar Documents

Publication Publication Date Title
CN112533002A (en) Dynamic image fusion method and system for VR panoramic live broadcast
US11528468B2 (en) System and method for creating a navigable, three-dimensional virtual reality environment having ultra-wide field of view
CN106101741B (en) Method and system for watching panoramic video on network video live broadcast platform
CN108648257B (en) Panoramic picture acquisition method and device, storage medium and electronic device
CN113099204B (en) Remote live-action augmented reality method based on VR head-mounted display equipment
US20180146193A1 (en) Analytic Reprocessing for Data Stream System and Method
CN112702522B (en) Self-adaptive control playing method based on VR live broadcast system
US20080246759A1 (en) Automatic Scene Modeling for the 3D Camera and 3D Video
CN104219584A (en) Reality augmenting based panoramic video interaction method and system
CN106210856B (en) The method and system of 3D panoramic video are watched on internet video live broadcasting platform
US20190335166A1 (en) Deriving 3d volumetric level of interest data for 3d scenes from viewer consumption data
WO2012166593A2 (en) System and method for creating a navigable, panoramic three-dimensional virtual reality environment having ultra-wide field of view
WO2017128887A1 (en) Method and system for corrected 3d display of panoramic image and device
CN106453913A (en) Method and apparatus for previewing panoramic contents
KR101340598B1 (en) Method for generating a movie-based, multi-viewpoint virtual reality and panoramic viewer using 3d surface tile array texture mapping
EP3057316B1 (en) Generation of three-dimensional imagery to supplement existing content
CN111083368A (en) Simulation physics cloud platform panoramic video display system based on high in clouds
CN116325769A (en) Panoramic video streaming scenes from multiple viewpoints
JP7447266B2 (en) View encoding and decoding for volumetric image data
CN111031327A (en) Panoramic playing method
WO2022022548A1 (en) Free viewpoint video reconstruction and playing processing method, device, and storage medium
Zhou et al. Streaming Location-Based Panorama Videos into Augmented Virtual Environment
Li et al. Implementation and Application of Video Distribution Technology Based on OpenGL
CN114866760A (en) Virtual reality display method, equipment, system and readable storage medium
CN113676731A (en) Method for compressing VR video data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210319)