CN114079803A - Music live broadcast method and system based on virtual reality - Google Patents

Music live broadcast method and system based on virtual reality

Info

Publication number
CN114079803A
CN114079803A (application number CN202010851026.2A)
Authority
CN
China
Prior art keywords
data
virtual
user
stream data
virtual space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010851026.2A
Other languages
Chinese (zh)
Inventor
赵一阳
杨国栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Haohai Information Technology Co ltd
Original Assignee
Shanghai Haohai Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Haohai Information Technology Co ltd
Priority to CN202010851026.2A
Publication of CN114079803A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/233 Processing of audio elementary streams
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368 Multiplexing of audio and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439 Processing of audio elementary streams
    • H04N21/4394 Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention provides a music sharing system and method based on virtual reality. After the anchor end and the user end log in to the virtual scene server, the music content information input by the anchor end can be analyzed and processed into digital unit information, the different resources are loaded and synchronized with one another in the virtual scene, and the synthesized, synchronized instruction information is then fed back from the virtual scene server to the front end to present the effect of a virtual live broadcast.

Description

Music live broadcast method and system based on virtual reality
Technical Field
The invention relates to the application of virtual reality technology in the field of digital music, and in particular to a music sharing system and method based on virtual reality.
Background
Live broadcasting of online concerts exists precisely to solve the problem that many users cannot experience concerts offline, by letting them watch online, mainly on two-dimensional flat terminals such as televisions and computers. However, this solution by itself does not give the audience a good sense of presence. In addition, the production of the visual effects of a concert scene is itself limited by the venue environment, the size of the space, hardware and software technology and cost; its development is therefore constrained, and it does not necessarily bring the audience good audio-visual immersion.
Disclosure of Invention
The invention aims to provide a music sharing system and method based on virtual reality, offering a multi-user online virtual reality space and solving the problem of synchronized video playback on existing live concert platforms.
In order to achieve the above object, an aspect of the present invention provides a music sharing method based on virtual reality, including the following steps:
Step S1, creating a virtual space, synchronizing information stream data in the virtual space to a first processing module, and synchronizing video stream data in the virtual space to a second processing module;
Step S2, the first processing module performs serialization processing on the information stream data and sends the serialized data to the VR head display device at the user end;
Step S3, the second processing module performs virtualization processing on the video stream data and sends the virtualized video stream data to the two-dimensional display device at the user end in a point-to-point manner.
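As a minimal illustration of the three steps above, the sketch below models the virtual scene server handing information stream data to a first processing module for serialization and video stream data to a second processing module for block-wise "virtualization". All class and field names are hypothetical, and JSON is used purely for illustration; the disclosure does not prescribe a serialization format or block size.

```python
import json


class FirstProcessingModule:
    """Step S2 (illustrative): serialize information stream data for VR head display clients."""

    def serialize(self, scene_state: dict) -> bytes:
        # Turn the in-scene state into a byte stream a VR client can deserialize and replay.
        return json.dumps(scene_state).encode("utf-8")


class SecondProcessingModule:
    """Step S3 (illustrative): split video stream data into equal-size blocks for point-to-point delivery."""

    def __init__(self, block_size: int = 64 * 1024):
        self.block_size = block_size

    def virtualize(self, video_bytes: bytes) -> list:
        # Each block carries its own index information so peers can reassemble the stream.
        return [
            {"index": i, "payload": video_bytes[start:start + self.block_size]}
            for i, start in enumerate(range(0, len(video_bytes), self.block_size))
        ]


class VirtualSceneServer:
    """Step S1 (illustrative): owns the virtual space and fans data out to the two modules."""

    def __init__(self):
        self.first_module = FirstProcessingModule()
        self.second_module = SecondProcessingModule()

    def synchronize(self, scene_state: dict, video_bytes: bytes):
        serialized = self.first_module.serialize(scene_state)  # -> VR head display devices
        blocks = self.second_module.virtualize(video_bytes)    # -> 2D display devices, point-to-point
        return serialized, blocks
```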
Further, in step S1, the method further includes the following steps:
s101, selecting a virtual scene to be created and a special effect thereof, selecting a matched digital music file, and storing selected data;
s102, loading the selected file into a virtual space according to the stored data, performing informatization processing on the digital music file, enabling the virtual scene and the special effect to generate a linkage relation with the wave rate and the beat of the digital music file, and displaying the linkage relation in the virtual space in a visual mode;
and S103, updating the virtual space in real time and synchronously generating information stream data and video stream data.
Furthermore, the virtual scene, the special effect and the digital music file all have unique codes, and the codes are used for calling corresponding files to load a virtual space.
Further, in step S2, the method further includes the following steps:
s201, traversing network delay of each node user side and a first processing module, and setting a node with the lowest network delay as a relay side;
s202, traversing network delay of each node user side and a relay side, wherein the user side receives information flow data forwarded by the relay side with the lowest network delay;
and S203, repeating the steps until all the user sides receive the information stream data.
Further, in step S3, the method further includes the following steps:
s301, dividing video stream data into N data blocks with the same size in a virtualization mode, wherein the data blocks all contain index information of data;
s302, when the user side of each node receives the data block, the data block is distributed to other nodes for downloading;
s303, when the user side of each node receives the data blocks distributed by other nodes according to the index information of the data blocks;
and S304, repeating the steps until all the user terminals receive the complete video stream data.
On the other hand, the invention also provides a music sharing system based on virtual reality, which comprises:
the virtual scene server is used for creating a virtual space, synchronizing information stream data in the virtual space to the first processing module and synchronizing video stream data in the virtual space to the second processing module;
the first processing module is used for carrying out serialization processing on information stream data and sending the serialized data to a VR head display terminal of a user;
and the second processing module is used for performing virtualization processing on the video stream data and sending the virtualized video stream data to a two-dimensional display terminal of a user in a point-to-point mode.
Further, the system further comprises:
the anchor terminal is used for selecting a virtual scene to be created and a special effect thereof, selecting a matched digital music file at the same time, and storing the selected data;
the virtual scene server receives the data stored by the anchor terminal, loads the selected file into a virtual space, and carries out informationization processing on the digital music file, so that the virtual scene and the special effect are in linkage relation with the wave rate and the beat of the digital music file and are visually presented in the virtual space.
Furthermore, the virtual scene, the special effect and the digital music file all have unique codes, the anchor terminal uploads the codes to the virtual scene server, and the corresponding files are called to load the virtual space.
Further, the system further comprises a first network distribution module, wherein:
the first network distribution module traverses the network delay of each node user side and the first processing module, and sets the node with the lowest network delay as a relay side;
the first network distribution module traverses network delay of each node user terminal and the relay terminal, and the user terminal receives information flow data forwarded by the relay terminal with the lowest network delay.
Further, the system further comprises a second network distribution module, wherein:
the second network distribution module is used for virtually dividing video stream data into N data blocks with the same size, wherein the data blocks all contain index information of the data;
when a user side of each node receives the data block, the data block is distributed to other nodes for downloading;
and the user side of each node downloads the data blocks distributed by other nodes according to the index information contained in the data blocks.
The invention provides a music sharing system and method based on virtual reality. After the anchor end and the user end log in to the virtual scene server, the music content information input by the anchor end can be analyzed and processed into digital unit information, the different resources are loaded and synchronized with one another in the virtual scene, and the synthesized, synchronized instruction information is then fed back from the virtual scene server to the front end to present the effect of a virtual live broadcast.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a virtual reality-based music sharing method according to an embodiment of the present invention.
Fig. 2 is an architecture diagram of a virtual reality based music sharing system according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of data transmission of information stream data according to an embodiment of the present invention.
Fig. 4 is a data structure diagram of video stream data according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
The invention aims to solve the lack of audio-visual immersion on existing live concert platforms. A multi-user online virtual reality space is provided: when a user puts on a virtual reality helmet and enters the platform (the performer and the user being in the same virtual space is supported), the system can deliver an immersive musical and visual experience that is not just immersive audio-visual viewing but the sensory enjoyment of two art forms combined. The system also supports pushing virtual concert content as a live stream, so that users can watch on display terminals such as a VR head display, mobile phone, tablet or computer, achieving an impact that a real stage effect cannot provide.
Fig. 1 is a flowchart of a virtual reality-based music sharing method according to an embodiment of the present invention. As shown in fig. 1, a music sharing method based on virtual reality of the present invention includes the following steps:
Step S1, the virtual scenes and elements to be created are selected in the system; each virtual scene and element has a unique code, and the codes correspond one-to-one with the scene data stored in the background.
Specifically, after logging in to the virtual scene server, the anchor terminal selects virtual scenes and elements and performs digital art editing in an editable mode, including particle special effect editing, spontaneous dynamic light editing, RGB editing, saturation editing and the like. The virtual scenes and elements are combined with the digital effects and synchronized with the music part through the instructions that the computer program sends to the user side.
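The editable effect parameters mentioned above could be held in a simple data structure; the sketch below is only an assumed shape (field names such as particle_density and saturation are illustrative, not taken from the disclosure).

```python
from dataclasses import dataclass


@dataclass
class SceneEffectSettings:
    """Hypothetical container for the anchor terminal's digital art edits."""
    particle_density: float = 0.5          # particle special effect editing
    dynamic_light_intensity: float = 1.0   # spontaneous dynamic light editing
    rgb: tuple = (255, 255, 255)           # RGB editing
    saturation: float = 1.0                # saturation editing

    def as_instruction(self) -> dict:
        # Packaged as an instruction the program can push to the user side in sync with the music.
        return {
            "particle_density": self.particle_density,
            "dynamic_light_intensity": self.dynamic_light_intensity,
            "rgb": list(self.rgb),
            "saturation": self.saturation,
        }
```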
Step S2, the selected data is stored at the front end; the stored data is the unique code of the scene.
In step S3, the desired digital music files are selected in the system, each having a unique code.
Step S4: and storing the selected data in the front end, wherein the stored data is the only code of the scene.
For example, if the selected scene data is "Cyberpunk land of infection", the unique code of the scene is "lyoiwtnew98MjvabEe3EpIu4odhGfyO-A" and this code is recorded; if the selected digital music file data is "PRINSI", the unique code of the digital music file is "sU0fenhh6UDu1b3n7bY2WuzpJ8f5".
Step S5, after the selection is completed and determined in step S4, a new virtual space is created.
Step S6, the data saved in S2 and S4 is sent to the server, and the scene files, special effect files and digital music files corresponding to the codes in that data are requested from the server.
Step S7, load the data requested in S6 into the virtual space created in S5.
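Steps S6 and S7 amount to resolving the saved unique codes to stored files and attaching them to the new virtual space. A hedged sketch follows, reusing the example codes quoted above; the registry and file paths are invented for illustration.

```python
# Hypothetical server-side registry mapping unique codes to stored asset files.
ASSET_REGISTRY = {
    "lyoiwtnew98MjvabEe3EpIu4odhGfyO-A": "scenes/cyberpunk_land_of_infection.scene",
    "sU0fenhh6UDu1b3n7bY2WuzpJ8f5": "music/prinsi.wav",
}


def request_assets(codes: list) -> list:
    """Step S6 (illustrative): resolve codes saved at the front end to server-side files."""
    return [ASSET_REGISTRY[code] for code in codes if code in ASSET_REGISTRY]


def load_into_virtual_space(virtual_space: dict, asset_paths: list) -> dict:
    """Step S7 (illustrative): attach the resolved files to the newly created virtual space."""
    virtual_space.setdefault("assets", []).extend(asset_paths)
    return virtual_space
```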
Step S8, the digital music file loaded in the virtual scene is analyzed to obtain its wave frequency, sampling rate, sound channel and BPM information.
For example: "PRISNI" -sampling rate: 44100 Hz; sound channel: stereo sound; BPM: 128,
Step S9, the digital information analyzed in step S8 is bound to elements in the scene.
Specifically, after the digitized music information has been processed, the preset scene elements in the virtual space scene are called, a linkage relation is established between each frame and each wave rate and beat through information unitization, and the result is expressed as a visual presentation. Meanwhile, the performer is also able to change and synchronize the on-site special effects through the scene special effect digital editing module.
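A minimal sketch of the binding in step S9, assuming a constant tempo and a simple round-robin assignment of beats to preset scene elements; the element names are hypothetical.

```python
def beat_times(bpm: float, duration_s: float) -> list:
    """Beat timestamps for a constant-tempo track (illustrative)."""
    interval = 60.0 / bpm
    return [round(n * interval, 3) for n in range(int(duration_s / interval) + 1)]


def bind_beats_to_elements(bpm: float, duration_s: float, elements: list) -> dict:
    """Assign each beat to a preset scene element so every beat drives one visual effect."""
    return {t: elements[i % len(elements)]
            for i, t in enumerate(beat_times(bpm, duration_s))}


# Example: a 128 BPM excerpt driving three hypothetical effects.
binding = bind_beats_to_elements(128, 10.0, ["particle_burst", "dynamic_light", "rgb_pulse"])
```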
Step S11, the video stream information in the scene is sent to the second processing module of the server, which is mainly responsible for push-stream, pull-stream and forwarding requests for the video stream data.
Step S12, data changes in the virtual reality scene are synchronized to the first processing module of the server in real time, and the first processing module of the server mainly processes serialization, deserialization, acquisition and forwarding of data in the scene.
And step S13, the server responds to the pull request of the user terminal and pushes the video stream data to the user terminal.
Step S14: and the server carries out serialization processing on the scene data information and sends the serialized data to a user terminal using a VR head display.
In S13, the video stream information is shared using an efficient software distribution system and a point-to-point (peer-to-peer) technique in virtual reality.
Fig. 3 is a schematic diagram of data transmission of information stream data according to an embodiment of the present invention. As shown in fig. 3, the video stream data is virtually divided into N blocks of equal size, each carrying index information of the data. When user A acquires data block 1, it distributes that block for other users to download and, at the same time, downloads the remaining data blocks according to the index information contained in the file. Because the data is divided into small blocks that are uploaded while being downloaded, the user's upload bandwidth is fully utilized, the bandwidth each user needs is reduced and the loading speed is increased.
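The download-while-uploading behaviour described above can be sketched with in-memory peers; this is a simulation of the idea only (no real network code), and the Peer class and its methods are invented for illustration.

```python
class Peer:
    """Illustrative peer: holds some blocks, serves them, and fetches missing ones by index."""

    def __init__(self, name: str, total_blocks: int):
        self.name = name
        self.blocks = {}                    # index -> payload
        self.total_blocks = total_blocks    # known from the index information in each block

    def receive(self, index: int, payload: bytes):
        self.blocks[index] = payload        # from now on this block can also be uploaded to others

    def missing(self) -> set:
        return set(range(self.total_blocks)) - set(self.blocks)

    def exchange_with(self, other: "Peer"):
        # Download any block the other peer holds that we are missing, and vice versa,
        # so upload bandwidth is used while downloading.
        for index in sorted(self.missing() & set(other.blocks)):
            self.receive(index, other.blocks[index])
        for index in sorted(other.missing() & set(self.blocks)):
            other.receive(index, self.blocks[index])
```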
In S14, the delay problem for users in different zones is solved using a form of distributed distribution in virtual reality.
Fig. 4 is a data structure diagram of video stream data according to an embodiment of the present invention. As shown in fig. 4, assume the network delay between the anchor and each of user A and user B in area01 is 300 ms. User A and user B both acquire the serialized scene data information from the data uploaded by the anchor and, while acquiring it, upload the data onward as relays; at this point user A becomes relay A and user B becomes relay B. When user C in area02 joins, it traverses the network delays to the anchor and to all relays; assume user C's delay to the anchor is 1000 ms, its delay to user A is 300 ms and its delay to user B is 200 ms. The total delay for user C to obtain data via user B in area01 is 500 ms, which is the lowest-delay way for user C to obtain the data, so user C obtains the data from user B and forwards it in turn, becoming relay point C.
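The relay choice in this example reduces to picking the source with the smallest accumulated delay. A sketch using the figures quoted above (the helper function is hypothetical; delays in milliseconds):

```python
def choose_relay(delay_to_anchor_ms: int, relay_delays_ms: dict) -> tuple:
    """Pick the lowest-total-delay source: the anchor directly, or an existing relay."""
    # relay_delays_ms maps relay name -> (relay's delay to the anchor, our delay to the relay)
    best_source, best_total = "anchor", delay_to_anchor_ms
    for relay, (to_anchor, to_us) in relay_delays_ms.items():
        total = to_anchor + to_us
        if total < best_total:
            best_source, best_total = relay, total
    return best_source, best_total


# User C in area02: 1000 ms to the anchor, 300 ms to relay A, 200 ms to relay B (both 300 ms from the anchor).
source, total = choose_relay(1000, {"relay A": (300, 300), "relay B": (300, 200)})
# -> ("relay B", 500): user C pulls from relay B and then forwards the data as relay point C.
```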
Fig. 2 is an architecture diagram of a virtual reality based music sharing system according to an embodiment of the present invention. As shown in fig. 2, a music sharing system based on virtual reality of the present invention includes:
the virtual scene server 1 is used for creating a virtual space, synchronizing information stream data in the virtual space to a first processing module, and synchronizing video stream data in the virtual space to a second processing module;
the first processing module 1a is used for serializing the information stream data and sending the serialized data to a VR head display terminal of the user end 3;
and the second processing module 1b is used for virtualizing the video stream data, and sending the virtualized video stream data to the two-dimensional display terminal of the user end 3 in a point-to-point mode.
And the anchor terminal 4 is used for selecting the virtual scene to be created and the special effect thereof, selecting the matched digital music file at the same time, and storing the selected data.
Specifically, the virtual scene server receives data stored by the anchor terminal, loads the selected file into a virtual space, and performs informatization processing on the digital music file, so that the virtual scene and the special effect are in linkage relation with the wave rate and the beat of the digital music file, and are visually presented in the virtual space.
The virtual scene, the special effect and the digital music file are all provided with unique codes, the anchor terminal uploads the codes to the virtual scene server, and the corresponding files are called to load the virtual space.
The system also comprises a first network distribution module 1c and a second network distribution module 1d.
The first network distribution module 1c traverses the network delay between each node's user end 3 and the first processing module 1a, and sets the node with the lowest network delay as a relay end; it then traverses the network delay between each node's user end 3 and the relay end, and the user end 3 receives the information stream data forwarded by the relay end with the lowest network delay.
The system further comprises a second network distribution module 1d, wherein:
The second network distribution module 1d virtually divides the video stream data into N data blocks of equal size, each containing index information of the data; when the user end 3 of each node receives a data block, the block is distributed to other nodes for downloading; and the user end 3 of each node downloads the data blocks distributed by other nodes according to the index information contained in the data blocks.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A music sharing method based on virtual reality is characterized by comprising the following steps:
Step S1, creating a virtual space, synchronizing information stream data in the virtual space to a first processing module, and synchronizing video stream data in the virtual space to a second processing module;
Step S2, the first processing module performs serialization processing on the information stream data and sends the serialized data to a VR head display device at a user end;
Step S3, the second processing module performs virtualization processing on the video stream data and sends the virtualized video stream data to a two-dimensional display device at the user end in a point-to-point manner.
2. The virtual reality-based music sharing method according to claim 1, wherein in step S1, the method further comprises the following steps:
s101, selecting a virtual scene to be created and a special effect thereof, selecting a matched digital music file, and storing selected data;
s102, loading the selected file into a virtual space according to the stored data, performing informatization processing on the digital music file, enabling the virtual scene and the special effect to generate a linkage relation with the wave rate and the beat of the digital music file, and displaying the linkage relation in the virtual space in a visual mode;
and S103, updating the virtual space in real time and synchronously generating information stream data and video stream data.
3. A virtual reality based music sharing method as claimed in claim 2, wherein the virtual scene, special effects and digital music files all have unique codes, and the codes are used to call corresponding files to load the virtual space.
4. The virtual reality-based music sharing method according to any one of claims 1 or 2, further comprising, in step S2, the steps of:
s201, traversing network delay of each node user side and a first processing module, and setting a node with the lowest network delay as a relay side;
s202, traversing network delay of each node user side and a relay side, wherein the user side receives information flow data forwarded by the relay side with the lowest network delay;
and S203, repeating the steps until all the user sides receive the information stream data.
5. The virtual reality-based music sharing method according to any one of claims 1 or 2, further comprising, in step S3, the steps of:
s301, dividing video stream data into N data blocks with the same size in a virtualization mode, wherein the data blocks all contain index information of data;
s302, when the user side of each node receives the data block, the data block is distributed to other nodes for downloading;
s303, when the user side of each node receives the data blocks distributed by other nodes according to the index information of the data blocks;
and S304, repeating the steps until all the user terminals receive the complete video stream data.
6. A virtual reality-based music sharing system, comprising:
the virtual scene server is used for creating a virtual space, synchronizing information stream data in the virtual space to the first processing module and synchronizing video stream data in the virtual space to the second processing module;
the first processing module is used for carrying out serialization processing on information stream data and sending the serialized data to a VR head display terminal of a user;
and the second processing module is used for performing virtualization processing on the video stream data and sending the virtualized video stream data to a two-dimensional display terminal of a user in a point-to-point mode.
7. The virtual reality-based music sharing system of claim 6, further comprising:
the anchor terminal is used for selecting a virtual scene to be created and a special effect thereof, selecting a matched digital music file at the same time, and storing the selected data;
the virtual scene server receives the data stored by the anchor terminal, loads the selected file into a virtual space, and carries out informationization processing on the digital music file, so that the virtual scene and the special effect are in linkage relation with the wave rate and the beat of the digital music file and are visually presented in the virtual space.
8. The virtual reality-based music sharing system according to claim 7, wherein the virtual scene, the special effect and the digital music file all have unique codes, and the host uploads the codes to the virtual scene server and calls the corresponding file to load the virtual space.
9. The virtual reality-based music sharing system of any one of claims 6 or 7, further comprising a first network distribution module, wherein:
the first network distribution module traverses the network delay of each node user side and the first processing module, and sets the node with the lowest network delay as a relay side;
the first network distribution module traverses the network delay of each node user side and the relay side, and the user side receives the information stream data forwarded by the relay side with the lowest network delay.
10. The virtual reality-based music sharing system of any one of claims 6 or 7, further comprising a second network distribution module, wherein:
the second network distribution module is used for virtually dividing video stream data into N data blocks with the same size, wherein the data blocks all contain index information of the data;
when a user side of each node receives the data block, the data block is distributed to other nodes for downloading;
and the user side of each node downloads the data blocks distributed by other nodes according to the index information contained in the data blocks.
CN202010851026.2A 2020-08-21 2020-08-21 Music live broadcast method and system based on virtual reality Pending CN114079803A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010851026.2A CN114079803A (en) 2020-08-21 2020-08-21 Music live broadcast method and system based on virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010851026.2A CN114079803A (en) 2020-08-21 2020-08-21 Music live broadcast method and system based on virtual reality

Publications (1)

Publication Number Publication Date
CN114079803A true CN114079803A (en) 2022-02-22

Family

ID=80282599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010851026.2A Pending CN114079803A (en) 2020-08-21 2020-08-21 Music live broadcast method and system based on virtual reality

Country Status (1)

Country Link
CN (1) CN114079803A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010077433A (en) * 2000-02-02 2001-08-20 김경미 Virtual Sound Responsive Landscape System And Visual Display Method In That System
CN101478564A (en) * 2008-12-31 2009-07-08 西安交通大学 Adaptive hierarchical transmission structure design method for P2P stream media network
CN105939481A (en) * 2016-05-12 2016-09-14 深圳市望尘科技有限公司 Interactive three-dimensional virtual reality video program recorded broadcast and live broadcast method
CN107801083A (en) * 2016-09-06 2018-03-13 星播网(深圳)信息有限公司 A kind of network real-time interactive live broadcasting method and device based on three dimensional virtual technique
CN108513088A (en) * 2017-02-24 2018-09-07 腾讯科技(深圳)有限公司 The method and device of group's video session
US20190206129A1 (en) * 2018-01-03 2019-07-04 Verizon Patent And Licensing Inc. Methods and Systems for Presenting a Video Stream Within a Persistent Virtual Reality World
CN111541932A (en) * 2020-04-30 2020-08-14 广州华多网络科技有限公司 User image display method, device, equipment and storage medium for live broadcast room

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117880551A (en) * 2024-03-12 2024-04-12 深圳市云实互联网信息科技有限公司 Virtual live broadcast processing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US10930318B2 (en) Gapless video looping
US11317173B2 (en) Remote cloud-based video production system in an environment where there is network delay
CN107360440B (en) Deep interaction system and interaction method based on game process introduced into live stream
KR101470904B1 (en) Method and system for providing video
US8797357B2 (en) Terminal, system and method for providing augmented broadcasting service using augmented scene description data
WO2012121158A1 (en) Synchronized content broadcast distribution system
CN103947221A (en) User interface display method and device using same
CN113225577B (en) Live stream processing method, device and system, electronic equipment and storage medium
CN114554277B (en) Multimedia processing method, device, server and computer readable storage medium
CN106792244A (en) Net cast method and device
CN114125480B (en) Live chorus interaction method, system, device and computer equipment
CN115065829A (en) Multi-person wheat connecting method and related equipment
CN114461423A (en) Multimedia stream processing method, device, storage medium and program product
US20210227005A1 (en) Multi-user instant messaging method, system, apparatus, and electronic device
CN112383794A (en) Live broadcast method, live broadcast system, server and computer storage medium
CN115243063B (en) Video stream processing method, processing device and processing system
CN108696762A (en) A kind of synchronous broadcast method, device and system
CN114079803A (en) Music live broadcast method and system based on virtual reality
CN108124188B (en) Audio-video system operation method
CN114079799A (en) Music live broadcast system and method based on virtual reality
KR102051985B1 (en) Synchronization of Media Rendering in Heterogeneous Networking Environments
CN113411636A (en) Live wheat-connecting method and device, electronic equipment and computer-readable storage medium
CN113727177A (en) Screen-projecting resource playing method and device, equipment and medium thereof
KR102459197B1 (en) Method and apparatus for presentation customization and interactivity
KR20140088052A (en) Contents complex providing server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination