CN106878764A - A kind of live broadcasting method of virtual reality, system and application thereof - Google Patents
A kind of live broadcasting method of virtual reality, system and application thereof Download PDFInfo
- Publication number
- CN106878764A (application number CN201510867854.4A)
- Authority
- CN
- China
- Prior art keywords
- video
- live
- playback terminal
- video data
- angles
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The invention discloses a live broadcasting method for virtual reality, a system, and uses thereof. The method comprises the following steps: live virtual reality (VR) video data is sent to a playback terminal, the VR video data containing multiple streams of video data captured from different angles at the same moment, so that, according to a control instruction, the video data of one or more of those angles can be displayed. In the present invention, after receiving the VR video data, the playback terminal can display the video of one or more angles to the user according to the control instruction. This achieves good interaction between the live video and the user and gives the user an immersive, on-the-spot impression of the live scene. Compared with existing live broadcasting modes, it effectively enhances the user's sense of participation in the live scene and greatly improves the live broadcast effect.
Description
Technical field
The present invention relates to the field of live broadcasting technology, and in particular to a live broadcasting method for virtual reality, a system, and uses thereof.
Background technology
In the prior art, live broadcasting is the process of synchronously producing and transmitting media information as a live event occurs and unfolds. Its forms include on-the-spot broadcasts, studio broadcasts, text-and-picture broadcasts, audio and video broadcasts, and broadcasts of information provided by third-party sources. Live broadcasting gives the audience a real-time sense of participation and can effectively accelerate the spread of information.
Live broadcasting in the prior art includes traditional television broadcasts, network broadcasts, and so on; however, in every existing live broadcasting mode, the audience can only passively watch the live programme through a playback terminal such as a television set, a computer or a mobile phone.
Summary of the invention
In view of the above problems, the present invention is proposed in order to provide a live broadcasting method for virtual reality, a system, and uses thereof that overcome the above problems or at least partially solve them.
In a first aspect, the invention provides a live broadcasting method for virtual reality, comprising: sending live virtual reality (VR) video data to a playback terminal, the VR video data containing multiple streams of video data captured from different angles at the same moment, so that, according to a control instruction, the video data of one or more of those angles can be displayed.
Further, sending the live VR video data to the playback terminal comprises: sending the video data of all angles in the VR video data to the playback terminal, or sending the video data of some of the angles in the VR video data to the playback terminal.
Further, displaying the video data of one or more of the angles according to the control instruction comprises: displaying, according to the control instruction, the video data of one or more angles at the current moment.
Further, before the live VR video data is sent to the playback terminal, the method also comprises: collecting the live VR video data.
Further, collecting the live VR video data comprises: using multiple cameras at different angles to shoot a target scene from multiple angles simultaneously, obtaining multiple streams of video data of different angles shot at the same moment.
Further, after collecting the live VR video data, the method also comprises: encoding and storing the collected VR video data, the stored VR video data containing multiple streams of video data of different angles shot at the same moment.
Further, after collecting the live VR video data, the method also comprises: splicing the video data of the multiple angles shot at the same moment. Encoding the collected VR video data then comprises: encoding the spliced video data obtained from the multiple angles shot at the same moment.
Further, encoding the collected VR video data comprises: encoding the video data of the multiple angles shot at the same moment separately, and splicing the encoded information of the multiple angles.
Further, sending the multiple streams of video data of different angles at the same moment in the live VR video data to the playback terminal comprises: sending them to the playback terminal by way of radio broadcast or by way of a data stream.
Further, the live VR video data is stored in a cloud server.
Further, sending to the playback terminal comprises: according to the IP address of the playback terminal, and according to one or more of the distance to the playback terminal, the usable server bandwidth and the server load, determining at least one cloud storage location that provides the VR video data for the playback terminal; and reading the VR video data from the cloud storage location and sending it to the playback terminal.
Further, the control instruction is obtained in one or more of the following ways: by sensing a touch on the screen, keyboard or control buttons of the playback terminal; by sensing an instruction sent by a remote control device of the playback terminal; by sensing an instruction sent by a joystick of the playback terminal; or by sensing the angle change produced by the motion of a VR wearable device.
Further, the VR wearable device comprises a VR helmet or VR glasses.
Further, the encoding format is H.261, H.263, H.264 or H.265.
In a second aspect, an embodiment of the present invention also provides a live broadcast system for virtual reality, comprising: a virtual reality (VR) video capture device for collecting live VR video data; and a cloud server system for storing the VR video data, the stored VR video data containing multiple streams of video data of different angles shot at the same moment, and for sending the video data of the multiple angles at the same moment to a playback terminal.
Further, the cloud server system is specifically configured to send the video data of all angles in the VR video data to the playback terminal, or to send the video data of some of the angles in the VR video data to the playback terminal.
Further, the live broadcast system also comprises at least one playback terminal for displaying, according to a control instruction, the video data of one or more angles in the VR video data.
Further, the VR video capture device comprises multiple cameras at different angles, configured to shoot a target scene from multiple angles simultaneously with the multiple cameras and obtain multiple streams of video data of different angles shot at the same moment.
Further, the VR video capture device is also configured, after collecting the live VR video data, to splice the video data of the multiple angles shot at the same moment, encode the spliced video data, and transmit the encoded VR video data to the cloud server system for storage.
Further, the cloud server system is also configured to splice the video data of the multiple angles shot at the same moment collected by the VR video capture device, encode the spliced video data, and store the encoded VR video data.
Further, the cloud server system is also configured to encode separately the video data of the multiple angles shot at the same moment collected by the VR video capture device, splice the encoded video data of the multiple angles, and store the spliced VR video data.
Further, the VR video capture device or the cloud server system contains an encoding device for encoding the VR video data.
Further, the cloud server system is specifically configured to send the multiple streams of video data of different angles at the same moment to the playback terminal by way of radio broadcast or by way of a data stream.
Further, the cloud server system comprises multiple cloud storage servers and is configured, upon receiving a live VR playing request initiated by a playback terminal, to determine, according to the IP address of the playback terminal and according to one or more of the distance to the playback terminal, the usable server bandwidth and the server load, at least one cloud storage server that provides the VR video data for the playback terminal; and to read the VR video data from the determined cloud storage server and send it to the playback terminal.
Further, the control instruction is obtained in one or more of the following ways: by sensing a touch on the screen, keyboard or control buttons of the playback terminal; by sensing an instruction sent by a remote control device of the playback terminal; by sensing an instruction sent by a joystick of the playback terminal; or by sensing the angle change produced by the motion of a VR wearable device.
Further, the VR wearable device comprises a VR helmet or VR glasses.
Further, the encoding format is H.261, H.263, H.264 or H.265.
In a third aspect, an embodiment of the present invention also provides the use of the above live broadcasting method for virtual reality in one or more of the following: live broadcasting of sports events, live news, and live variety shows.
The above technical solution provided by the embodiments of the present invention has at least the following beneficial effects. In the live broadcasting method, system and uses thereof provided by the embodiments of the present invention, live VR video data is sent to a playback terminal. Because the VR video data contains multiple streams of video data of different angles at the same moment, after receiving the VR video data the playback terminal can, according to a control instruction, display the video of one or more of those angles to the user. This achieves good interaction between the live video and the user, gives the user an immersive impression of the live scene, effectively enhances the user's sense of participation compared with existing live broadcasting modes, and greatly improves the live broadcast effect.
Other features and advantages of the present invention will be set forth in the following description, and will partly become apparent from the description or be understood by implementing the present invention. The objects and other advantages of the invention can be realized and obtained by the structures particularly pointed out in the written specification, the claims and the accompanying drawings.
The technical solution of the present invention is described in further detail below through the drawings and embodiments.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification; together with the embodiments of the present invention they serve to explain the present invention and are not to be construed as limiting it. In the drawings:
Fig. 1 is a schematic structural diagram of a VR video camera provided by an embodiment of the present invention;

Fig. 2 is a schematic structural diagram of another VR video camera provided by an embodiment of the present invention;

Fig. 3 is a flow chart of the live broadcasting method for virtual reality provided by an embodiment of the present invention;

Fig. 4 is a flow chart of sending data to the playback terminal provided by an embodiment of the present invention;

Fig. 5 is an architecture diagram of a live broadcast system for virtual reality provided by an embodiment of the present invention;

Fig. 6 is another architecture diagram of the live broadcast system for virtual reality provided by an embodiment of the present invention.
Specific embodiment
Exemplary embodiments of the disclosure are described more fully below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the disclosure, it should be understood that the disclosure may be realized in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the disclosure will be thoroughly understood and its scope fully conveyed to those skilled in the art.
The live broadcasting method for virtual reality provided by an embodiment of the present invention is first described in detail.
Compared with existing live broadcasting methods, the live broadcasting method for virtual reality (Virtual Reality, VR) provided by an embodiment of the present invention differs in that live VR video data is sent to a playback terminal, the VR video data containing multiple streams of video data of different angles at the same moment, so that the video data of one or more of those angles can be displayed according to a control instruction.
In existing live broadcasting methods, even if the live video data formed after collection and editing has multiple video sources, it contains only the signal of a single video source at any given moment; the playback terminal can only passively play the collected and edited video signal, and the played scene is "flat". With the above VR live broadcasting method provided by the embodiment of the present invention, because the VR video data contains multiple streams of video data of different angles at the same moment, the playback terminal can, after receiving the VR video data, dynamically display the video of one or more angles to the user according to the control instruction. This achieves good interaction between the live video and the user, gives the user a stereoscopic, on-the-spot experience of the live scene, further enhances the user's sense of participation, and improves the live broadcast effect.
Further, in the above VR live broadcasting method provided by the embodiment of the present invention, the step of sending the live VR video data to the playback terminal may be implemented as: sending the video data of all angles in the VR video data to the playback terminal, or sending the video data of some of the angles in the VR video data to the playback terminal.
Either of the above two modes can be used to send live VR video data to the playback terminal. In the first mode, all of the VR video data is sent to the playback terminal, and the playback terminal displays the video data of one or more of the angles to the user according to the control instruction. For example, if the VR video data as a whole covers a 360° field of view and specifically contains the video data of 10 different angular ranges (each range spanning 36°), the first mode sends the video data of all 10 angular ranges to the playback terminal at the same time.

In the second mode, the VR video data is sent in parts, with only a portion sent to the playback terminal at a time. For example, initially the portion corresponding to a preset or default angle and its adjacent angles is delivered to the playback terminal; as the angle indicated by the control instruction received from the playback terminal changes, the portion corresponding to the changed angle is then forwarded to the user terminal. Taking the example above with 10 angular ranges, the video data of the 0°–36° range and its adjacent ranges can be sent to the playback terminal initially; when the control instruction indicates that the video of another angular range needs to be displayed, the video data of the corresponding angle is then sent to the playback terminal. The advantage of this mode is that it can reduce bandwidth usage to a certain extent, but it requires multiple interactions between the playback terminal and the VR video data sending end.
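As an illustrative sketch of the second (partial-delivery) mode, and not part of the patent itself, the mapping from a requested viewing angle to the 36° ranges worth sending could look as follows; the neighbour-selection policy and all names are assumptions:

```python
SECTOR = 36                      # each camera covers a 36-degree range
NUM_SECTORS = 360 // SECTOR      # 10 ranges covering the full circle

def sectors_to_send(view_angle_deg):
    """Return the index of the angular range containing the requested
    view angle, plus its two adjacent ranges, so the playback terminal
    can pan slightly without a new request."""
    centre = int(view_angle_deg % 360) // SECTOR
    return [(centre - 1) % NUM_SECTORS, centre, (centre + 1) % NUM_SECTORS]
```

A server following this sketch would stream only three of the ten ranges per viewer, trading bandwidth for the extra round trips the patent text notes.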
Correspondingly, in the above VR live broadcasting method provided by the embodiment of the present invention, displaying the video data of one or more of the angles at the playback terminal may be implemented as: displaying, according to the control instruction, the video data of one or more angles at the current moment.
Taking a specific scene as an example, suppose the current live scene is a float parade. The scenes in front of and on both sides of the moving vehicles can be displayed to the user first; following the user's control instructions, the scenes further to the sides and to the rear of the moving vehicles are then progressively displayed. At any one moment, the video data of multiple angles may be displayed simultaneously, producing the sensory effect of viewing a stereoscopic picture.
In one embodiment, in the live broadcasting method for virtual reality provided by an embodiment of the present invention, before the live VR video data is sent to the playback terminal, the following step can also be performed: collecting the live VR video data.
Specifically, collecting the live VR video data may be implemented as: using multiple cameras at different angles to shoot a target scene from multiple angles simultaneously, obtaining the video data of multiple angles shot at the same moment.
The live VR video data can be shot simultaneously by at least two cameras. Fig. 1 shows a VR video camera with two cameras, each of which captures video of a corresponding orientation in three-dimensional space. Fig. 2 shows a VR video camera with 10 cameras shooting simultaneously; each camera covers a different angular range, and the angular ranges of the cameras may or may not partly overlap.
Thus, in the VR video acquisition stage, shooting the target scene from multiple angles simultaneously with cameras at multiple different angles yields multiple streams of video data covering the different angular ranges shot at the same moment.
After all scenes have been shot in the manner described above, multi-orientation, full-angle VR video data can be collected.
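An evenly spaced circular rig like the one in Fig. 2 can be sketched as follows; this is a hypothetical illustration, and the overlap parameter is an assumption rather than anything specified in the patent:

```python
def rig_orientations(num_cameras, overlap_deg=0.0):
    """Compute a (yaw, field_of_view) pair for each camera in an
    evenly spaced circular rig. With overlap_deg == 0 the ranges
    tile the circle exactly; a positive value makes neighbouring
    cameras overlap, as the patent allows."""
    step = 360.0 / num_cameras
    fov = step + overlap_deg     # widen each camera to overlap neighbours
    return [(i * step, fov) for i in range(num_cameras)]
```

For the 10-camera rig of Fig. 2, each camera would sit 36° apart, matching the 36°-per-range example used earlier in the text.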
Further, after the live VR video data is collected, the collected VR video data also needs to be encoded and stored; the stored VR video data contains the video data of the multiple different angles shot at the same moment.
Taking the flow chart shown in Fig. 3 as an example, the live broadcasting method for virtual reality illustrated there may include the following steps:

Step S31: collect the live VR video data;

Step S32: encode and store the VR video data in real time; the stored VR video data includes the multiple streams of video data of different angles shot at the same moment, so that the video data of one or more of those angles can be displayed according to a control instruction;

Step S33: send the stored VR video data to the playback terminal in real time.
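The steps of Fig. 3 can be sketched as a simple pipeline; the function names and the callback-style decomposition are illustrative assumptions, not the patent's implementation:

```python
def vr_live_pipeline(capture, encode, store, send, playback_terminals):
    """Sketch of steps S31-S33: capture the multi-angle video data,
    encode and store it, then forward it to each playback terminal."""
    frames = capture()                  # S31: collect live VR video data
    encoded = encode(frames)            # S32: encode in real time...
    store(encoded)                      #      ...and store
    for terminal in playback_terminals:
        send(encoded, terminal)         # S33: deliver in real time
    return encoded
```

In a real system each stage would run continuously on a stream rather than once per call; the single pass here only shows the ordering of the three steps.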
Further, to facilitate transmission and playback, the multiple streams of video data of different angles shot at the same moment that are contained in the above VR video data can also be spliced. The splicing step can be implemented flexibly as needed, for example in the following ways.
In the first case, after the live VR video data is collected, the video data of the multiple angles shot at the same moment is spliced. After splicing is complete, the spliced video data of the multiple angles shot at the same moment is then encoded.
The video data collected by a camera is often in the form of an analog signal. After the collected video data is spliced, the merged video data needs to be further encoded and stored to facilitate transmission, and is then transmitted through different media so that the playback terminal obtains the live VR video data in real time.
In the second case, the multiple streams of video data are spliced after encoding is complete. In other words, encoding the VR video data in real time in the above step S32 may, in a specific implementation, include the following steps: encoding the video data of the multiple angles shot at the same moment separately; and splicing the encoded information of the multiple angles.
Encoding the multiple streams of video data of different angles shot at the same moment separately often includes, for the video data shot by each camera, first converting it from analog signal form to digital signal form, and then compressing and encoding it to produce a live code stream.
After the encoding process, the video data in each analog signal form has been converted into a live code stream in a specific video format; the code streams obtained by encoding the video data of the multiple different angles are then spliced to obtain the encoded VR video data.
There are various encoding formats, for example the Moving Picture Experts Group (MPEG) series of standards, and the H.261, H.263, H.264 and H.265 formats of the International Telecommunication Union.
Preferably, in embodiments of the present invention, the VR video can be encoded using the H.265 format.
The H.265 standard improves on the existing video encoding standard H.264 in several related technologies. H.265 uses advanced techniques to improve the trade-off between code stream, coding quality, delay and algorithm complexity so as to reach an optimal configuration. The improvements include higher compression efficiency, better robustness and error recovery, lower real-time delay, shorter channel acquisition and random access delay, and reduced complexity. Thanks to algorithm optimization, H.264 can transmit standard-definition digital images at rates below 1 Mbps, while H.265 can transmit ordinary 720p (1280x720) high-definition audio and video at transmission speeds of 1–2 Mbps.
The benefit of encoding in the H.265 format is that, for the same bandwidth, higher-quality video can be transmitted; or, for the same video quality, as little bandwidth as possible is occupied, which effectively reduces the delay of live VR video transmission and ensures the real-time effect of the VR live broadcast.
Because the spliced VR video data contains multiple streams of video data of different angles, to facilitate decoding a preset separator or identifier can be added during splicing between the video data of the different angles shot at the same moment by the different cameras, so that they can be distinguished effectively.
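A minimal sketch of this splice-with-separator idea follows, under the assumption of a fixed byte delimiter; the patent does not specify one, and a production separator would need escaping or length-prefixing so it cannot collide with encoded payload bytes:

```python
SEPARATOR = b"\x00VRSEP\x00"   # illustrative delimiter, not from the patent

def splice_streams(encoded_streams):
    """Concatenate the per-angle encoded code streams, inserting a
    delimiter between them so the player can split them again."""
    return SEPARATOR.join(encoded_streams)

def split_streams(spliced):
    """Recover the per-angle code streams on the playback side."""
    return spliced.split(SEPARATOR)
```

The round trip (splice, then split) is lossless as long as the delimiter never appears inside a stream, which is exactly the property a real container format would guarantee by construction.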
Further, in the above step S33, the multiple streams of video data of different angles at the same moment in the live VR video data can be sent to the playback terminal in real time over a variety of carriers, for example by way of radio broadcast (public air) or by way of a data stream.
Preferably, the live VR video data is stored in a cloud server.
Based on this, the above step of sending the VR video data to the playback terminal may, in a specific implementation, include the following steps, as shown in Fig. 4:
Step S41: when a live VR playing request initiated by a playback terminal is received, determine, according to the IP address of the playback terminal and according to one or more of the distance to the playback terminal, the usable server bandwidth and the server load, at least one cloud storage location that provides the VR video data for the playback terminal;

Step S42: read the VR video data from the cloud storage location and send it to the playback terminal.
In a specific implementation of the above step S41, factors such as the distance to the playback terminal, the effective bandwidth of the server and the server load can be considered together to select the cloud storage location best suited to serving that playback terminal; the VR video data saved at that storage location is then delivered to the playback terminal.
With cloud storage, identical live VR video data is redundantly backed up in multiple cloud storage locations, for example in different cloud storage servers. When users request access to the live VR video data, this effectively improves its transmission efficiency; especially when user traffic is heavy and widely dispersed, cloud storage can effectively spread the access pressure away from any single server, so that the live VR video data can be quickly transmitted to the requesting playback terminals, ensuring the real-time quality of the live broadcast.
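The server-selection of step S41 can be sketched as a weighted score over the stated factors; the weights, field names and linear scoring are purely illustrative assumptions, not the patent's method:

```python
def choose_storage_server(servers):
    """Pick the cloud storage server with the best combined score of
    distance (lower is better), spare bandwidth (higher is better)
    and load (lower is better). Weights are illustrative."""
    def score(s):
        return (-s["distance_km"] * 0.5
                + s["spare_bandwidth_mbps"] * 1.0
                - s["load_percent"] * 0.3)
    return max(servers, key=score)
```

Returning several top-scoring servers instead of one would match the patent's "at least one cloud storage location" wording, at the cost of a slightly more involved ranking step.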
The VR video data transmitted by way of a data stream can be carried over traditional fixed-network transmission, or over the mobile Internet realized by various mobile communication networks.
In embodiments of the present invention, the playback terminal includes but is not limited to: a traditional television (which can receive live VR video data by way of a radio broadcast signal), a smart television (or network television, which can obtain live VR video data over the Internet), various mobile terminals such as smartphones and tablet computers, and VR wearable devices (VR helmets and VR glasses).
A playback terminal is capable of playing VR video. The user can issue a control instruction for a specific orientation directly through the playback terminal itself, or indirectly through another control device; the control instruction tells the playback terminal which angle or angles of video to play.
For example, the above control instruction may be obtained in one or more of the following ways:
by sensing touches on the playback terminal's screen, keyboard or control buttons (for example, the user swiping left, right, up or down on a smartphone screen);
by sensing instructions sent from the playback terminal's remote control (for example, the user pressing the up, down, left or right direction keys);
by sensing instructions sent from a joystick connected to the playback terminal (for example, the user moving the joystick as needed);
by sensing the angle change produced by the motion of a VR wearable device (for example, the device tracking the change of angle as the user's head moves).
In the example above, the user can swipe up, down, left or right directly on the smartphone screen to issue an instruction to move in that direction; the smartphone then decodes the video for the corresponding angle range according to the control instruction and switches playback to it.
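The swipe-to-angle switching just described reduces to modular arithmetic over a ring of camera angles. A minimal sketch, in which the gesture names and the count of ten angles are assumptions for illustration:

```python
def next_angle(current, gesture, num_angles=10):
    """Return the index of the camera angle to decode next.
    'left'/'right' rotate horizontally around the camera ring;
    the gesture names stand in for whatever the touch layer reports."""
    if gesture == "left":
        return (current - 1) % num_angles
    if gesture == "right":
        return (current + 1) % num_angles
    return current  # up/down would select a different ring, if one exists

angle = 0
for g in ["right", "right", "left"]:
    angle = next_angle(angle, g)   # ends at angle index 1
```

The modulo makes the ring wrap around, so swiping left from angle 0 lands on angle 9 rather than failing at a boundary.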
Sending control instructions via a remote control typically applies when a television is the playback terminal. Suppose the television is currently displaying horizontally oriented video data. As the user presses the up key on the remote control, the television decodes and displays video data tilted upward from the horizontal, in keeping with the viewing habits of the human eye. For example, the television initially shows the user scenery near the sea horizon; as the user keeps pressing the up key, the displayed content gradually tilts up from the horizon to the seabirds circling between sea and sky, until the whole sky is shown.
The user can also issue control instructions for motion in any direction directly through a VR wearable device, such as a VR helmet or VR glasses. Once the user puts the device on, sensors fitted on the hardware perceive the posture and angle of the user's head and accurately track its angular movement (pitching up, pitching down, turning left, turning right, and so on). For example, when the user's head turns left, the sensors on the VR helmet or glasses detect the motion, and one or more video streams to the left of the current display angle are automatically read, decoded and played.
A joystick can be used together with a television, computer, smartphone or similar terminal. The user moves the joystick to issue a control instruction for a direction of motion; since the joystick is connected to the playback terminal, once the instruction reaches the television, computer or smartphone, that terminal reads the video of the corresponding angle, decodes it and plays it.
The embodiment of the present invention also provides the above virtual reality live broadcasting method for one or more of the following uses: live sports events, live news and live variety shows.
Based on the same inventive concept, an embodiment of the present invention also provides a virtual reality live broadcast system. Since the principle by which this system solves the problem is similar to the aforementioned virtual reality live broadcasting method, the implementation of the system may refer to the implementation of the method, and repeated parts are not described again.
As shown in Fig. 5, the virtual reality live broadcast system provided by the embodiment of the present invention includes:
a VR video capture device 51, for capturing live VR video data;
a cloud server system 52, for storing the VR video data, the stored VR video data including video data of multiple different angles shot at the same moment, and for sending the video data of the multiple angles at the same moment to a playback terminal.
Further, the above cloud server system 52 is specifically configured to send the video data of all angles in the VR video data to the playback terminal, or to send the video data of only some angles in the VR video data to the playback terminal.
Further, as shown in Fig. 5, the virtual reality live broadcast system provided by the embodiment of the present invention may also include:
at least one playback terminal 53, for displaying, according to a control instruction, the video data of one or more angles in the VR video data.
Further, as described in the aforementioned virtual reality live broadcasting method, the VR video capture device 51 may be a shooting device with multiple (at least two) cameras at different angles, used to shoot a target scene from multiple angles simultaneously so as to obtain multiple streams of video data of different angles shot at the same moment.
Taking the shooting device with 10 cameras shown in Fig. 2 as an example, the 10 cameras lie in the same horizontal plane, are evenly arranged in a ring at equal angular intervals, and all face outward, each responsible for shooting the video data of its corresponding angle; the dotted lines in Fig. 2 indicate the angular range covered by each camera.
The step of splicing the multiple streams of video data of different angles shot by the cameras at the same moment can be performed in the VR video capture device 51, the cloud server system 52 or the playback terminal 53.
When splicing is performed by the VR video capture device 51, the device splices the video data of the multiple angles shot at the same moment after capturing the live VR video data, encodes the spliced video data, and transmits the encoded VR video data to the cloud server system 52 for storage.
In this case, the splicing and encoding steps are completed by the video capture device 51 itself, which therefore needs strong capabilities for capturing, buffering, processing and outputting video data. Because live broadcasting demands high real-time performance, each link in the live chain is subject to tight timing requirements, placing correspondingly high demands on the software and hardware of the VR capture device.
When splicing is performed by the cloud server system 52, the system splices the video data of the multiple angles shot at the same moment as captured by the VR video capture device 51, encodes the spliced video data, and stores the encoded VR video data.
Alternatively, the cloud server system 52 first encodes, separately, the video data of the multiple angles shot at the same moment as captured by the VR video capture device 51, then splices the encoded video data of the multiple angles and stores the spliced VR video data.
In the above case, the splicing and encoding steps are completed by the cloud server system 52. Such servers usually have strong video processing capabilities of their own, so compared with the first case the splicing and encoding can be completed relatively quickly, without being limited by the software and hardware performance of the capture device.
Splicing before encoding and encoding before splicing each have their advantages. Because the amount of data obtained after encoding is greatly reduced relative to the original video data, splicing first means the splicing stage handles a relatively large amount of data while the encoding stage handles relatively little; encoding first means the encoding stage handles a relatively large amount of data while the splicing stage handles relatively little. The choice can be made according to the processing capability of the equipment.
When splicing is performed by the playback terminal 53, the VR video capture device 51 or the cloud server system 52 completes the encoding of the multiple streams of video data of different angles at the same moment in the VR video data; after the playback terminal 53 receives the encoded VR video data, it splices the encoded streams contained therein, and finally plays one or more videos of the corresponding angles according to the user's control instruction.
In a specific implementation, the above VR video capture device 51 or cloud server system 52 includes an encoding device used to encode the VR video data.
In another alternative embodiment, the encoding device may also be independent of both the VR video capture device and the cloud server. In this case, as shown in Fig. 6, the live VR network architecture may include: the VR video capture device 51, an encoding device 54, the cloud server system 52 and the playback terminal 53, where the encoding device 54 is connected to both the VR video capture device 51 and the cloud server system 52.
Further, the cloud server system 52 is specifically configured to send the multiple streams of video data of different angles at the same moment to the playback terminal by way of radio broadcast or by way of a data stream.
Further, the above cloud server system 52 includes multiple cloud storage servers, and is configured, when a VR live playing request initiated by a playback terminal is received, to determine, according to the IP address of the playback terminal 53 and one or more of the distance to the playback terminal 53, the available server bandwidth and the server load, at least one cloud storage server that provides the VR video data for the playback terminal 53, and to read the VR video data from the determined cloud storage server and send it to the playback terminal 53.
The cloud server system 52 can adopt various cloud storage architectures, such as a distributed server cluster.
In the above virtual reality live broadcast system, the playback terminal 53 may include: a television, various mobile terminals, a VR wearable device, and so on. Further, the above VR wearable devices include: VR helmets and VR glasses.
With the above virtual reality live broadcast system provided by the embodiment of the present invention, the user can operate the playback terminal directly, or indirectly through another control device, to switch between the video data of different angles according to the user's control instruction, so that the live video of whichever orientation the user wants to watch is presented continuously and in real time. Through this interaction, the system gives the user a strong sense of immersion in the live scene and enhances the live broadcast effect.
Further, the above control instruction is obtained in one or more of the following ways:
by sensing touches on the playback terminal's screen, keyboard or control buttons;
by sensing instructions sent from the playback terminal's remote control;
by sensing instructions sent from a joystick connected to the playback terminal;
by sensing the angle change produced by the motion of a VR wearable device.
In the above virtual reality live broadcast system, the encoding format can be an MPEG-series standard format, or an International Telecommunication Union (ITU-T) format such as H.261, H.263, H.264 or H.265; preferably, the H.265 format can be used.
In the virtual reality live broadcasting method, system and uses thereof provided by the embodiments of the present invention, live VR video data is sent to the playback terminal. Because the VR video data contains multiple streams of video data of different angles at the same moment, the playback terminal, after receiving the VR video data, can display the video of one or more of those angles to the user according to a control instruction. This achieves good interaction between the live video and the user and gives the user an immersive experience of the live scene; compared with existing live broadcasting, it effectively enhances the user's sense of participation and greatly improves the live broadcast effect.
Those skilled in the art should understand that embodiments of the invention may be provided as a method, a system or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be realized by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific way, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that realize the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to encompass them.
Claims (28)
1. A virtual reality live broadcasting method, characterised in that live virtual reality (VR) video data is sent to a playback terminal, the VR video data comprising multiple streams of video data of different angles at the same moment, so that the video data of one or more of the angles is displayed according to a control instruction.
2. The live broadcasting method according to claim 1, characterised in that sending the live VR video data to the playback terminal comprises:
sending the video data of all angles in the VR video data to the playback terminal, or sending the video data of some angles in the VR video data to the playback terminal.
3. The live broadcasting method according to claim 1, characterised in that displaying the video data of one or more of the angles according to an instruction comprises:
displaying, according to a control instruction, the video data of one or more angles at the current moment.
4. The live broadcasting method according to claim 1, characterised in that before the live VR video data is sent to the playback terminal, the method further comprises: capturing the live VR video data.
5. The live broadcasting method according to claim 4, characterised in that capturing the live VR video data comprises:
shooting a target scene from multiple angles simultaneously using cameras of multiple different angles, to obtain multiple streams of video data of the different angles shot at the same moment.
6. The live broadcasting method according to claim 4, characterised in that after capturing the live VR video data, the method further comprises:
encoding and storing the captured VR video data, the stored VR video data comprising multiple streams of video data of the different angles shot at the same moment.
7. The live broadcasting method according to claim 6, characterised in that after capturing the live VR video data, the method further comprises:
splicing the video data of the multiple angles shot at the same moment;
and encoding the captured VR video data comprises:
encoding the video data obtained by splicing the video data of the multiple angles shot at the same moment.
8. The live broadcasting method according to claim 6, characterised in that encoding the captured VR video data comprises:
encoding, separately, the video data of the multiple angles shot at the same moment;
splicing the information obtained after encoding the video data of the multiple angles.
9. The live broadcasting method according to claim 1, characterised in that sending the multiple streams of video data of different angles at the same moment in the live VR video data to the playback terminal comprises:
sending the multiple streams of video data of different angles at the same moment in the live VR video data to the playback terminal by way of radio broadcast or by way of a data stream.
10. The live broadcasting method according to claim 9, characterised in that the live VR video data is stored in a cloud server.
11. The live broadcasting method according to claim 10, characterised in that sending to the playback terminal comprises:
determining, according to the IP address of the playback terminal and one or more of the distance to the playback terminal, the available server bandwidth and the server load, at least one cloud storage location that provides the VR video data for the playback terminal;
reading the VR video data from the cloud storage location and sending it to the playback terminal.
12. The live broadcasting method according to any one of claims 1-11, characterised in that the control instruction is obtained in one or more of the following ways:
by sensing touches on the playback terminal's screen, keyboard or control buttons;
by sensing instructions sent from the playback terminal's remote control;
by sensing instructions sent from a joystick connected to the playback terminal;
by sensing the angle change produced by the motion of a VR wearable device.
13. The live broadcasting method according to claim 12, characterised in that the VR wearable device comprises: a VR helmet or VR glasses.
14. The live broadcasting method according to any one of claims 6-8, characterised in that the format of the encoding is H.261, H.263, H.264 or H.265.
15. A virtual reality live broadcast system, characterised by comprising:
a virtual reality (VR) video capture device, for capturing live VR video data;
a cloud server system, for storing the VR video data, the stored VR video data comprising video data of multiple different angles shot at the same moment, and for sending the video data of the multiple angles at the same moment to a playback terminal.
16. The live broadcast system according to claim 15, characterised in that the cloud server system is specifically configured to send the video data of all angles in the VR video data to the playback terminal, or to send the video data of some angles in the VR video data to the playback terminal.
17. The live broadcast system according to claim 15, characterised in that the live broadcast system further comprises:
at least one playback terminal, for displaying, according to a control instruction, the video data of one or more angles in the VR video data.
18. The live broadcast system according to claim 15, characterised in that the VR video capture device comprises cameras of multiple different angles, for shooting a target scene from multiple angles simultaneously using the cameras of the multiple different angles, to obtain multiple streams of video data of the different angles shot at the same moment.
19. The live broadcast system according to claim 18, characterised in that the VR video capture device is further configured, after capturing the live VR video data, to splice the video data of the multiple angles shot at the same moment, encode the spliced video data, and send the encoded VR video data to the cloud server system for storage.
20. The live broadcast system according to claim 18, characterised in that the cloud server system is further configured to splice the video data of the multiple angles shot at the same moment as captured by the VR video capture device, encode the spliced video data, and store the encoded VR video data.
21. The live broadcast system according to claim 18, characterised in that the cloud server system is further configured to encode, separately, the video data of the multiple angles shot at the same moment as captured by the VR video capture device, splice the encoded video data of the multiple angles, and store the spliced VR video data.
22. The live broadcast system according to any one of claims 18-20, characterised in that the VR video capture device or the cloud server system includes an encoding device for encoding the VR video data.
23. The live broadcast system according to claim 15, characterised in that the cloud server system is specifically configured to send the multiple streams of video data of different angles at the same moment to the playback terminal by way of radio broadcast or by way of a data stream.
24. The live broadcast system according to claim 15, characterised in that the cloud server system includes multiple cloud storage servers and is configured, when a VR live playing request initiated by a playback terminal is received, to determine, according to the IP address of the playback terminal and one or more of the distance to the playback terminal, the available server bandwidth and the server load, at least one cloud storage server that provides the VR video data for the playback terminal, and to read the VR video data from the determined cloud storage server and send it to the playback terminal.
25. The live broadcast system according to any one of claims 15-24, characterised in that the control instruction is obtained in one or more of the following ways:
by sensing touches on the playback terminal's screen, keyboard or control buttons;
by sensing instructions sent from the playback terminal's remote control;
by sensing instructions sent from a joystick connected to the playback terminal;
by sensing the angle change produced by the motion of a VR wearable device.
26. The live broadcast system according to claim 25, characterised in that the VR wearable device comprises: a VR helmet or VR glasses.
27. The live broadcast system according to any one of claims 19-21, characterised in that the format of the encoding is H.261, H.263, H.264 or H.265.
28. Use of the method according to any one of claims 1-14 for one or more of the following: live sports events, live news and live variety shows.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510867854.4A CN106878764A (en) | 2015-12-01 | 2015-12-01 | A kind of live broadcasting method of virtual reality, system and application thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106878764A true CN106878764A (en) | 2017-06-20 |
Family
ID=59236339
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510867854.4A Pending CN106878764A (en) | 2015-12-01 | 2015-12-01 | A kind of live broadcasting method of virtual reality, system and application thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106878764A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102523420A (en) * | 2011-11-30 | 2012-06-27 | 江苏奇异点网络有限公司 | Online remote teaching method capable of virtualizing real environment |
CN104144335A (en) * | 2014-07-09 | 2014-11-12 | 青岛歌尔声学科技有限公司 | Head-wearing type visual device and video system |
CN104506826A (en) * | 2015-01-13 | 2015-04-08 | 中南大学 | Fixed-point directional video real-time mosaic method without valid overlapping variable structure |
CN104580986A (en) * | 2015-02-15 | 2015-04-29 | 王生安 | Video communication system combining virtual reality glasses |
CN104660995A (en) * | 2015-02-11 | 2015-05-27 | 尼森科技(湖北)有限公司 | Disaster relief visual system |
2015-12-01: Application CN201510867854.4A filed (CN); status: Pending
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109104613A (en) * | 2017-06-21 | 2018-12-28 | 苏宁云商集团股份有限公司 | A kind of VR live broadcasting method and system for realizing the switching of multimachine position |
CN107481324A (en) * | 2017-07-05 | 2017-12-15 | 微幻科技(北京)有限公司 | A kind of method and device of virtual roaming |
CN107481324B (en) * | 2017-07-05 | 2021-02-09 | 微幻科技(北京)有限公司 | Virtual roaming method and device |
CN107454434A (en) * | 2017-08-14 | 2017-12-08 | 姜汉龙 | Virtual reality net cast method and video playing terminal |
CN108156467A (en) * | 2017-11-16 | 2018-06-12 | 腾讯科技(成都)有限公司 | Data transmission method and device, storage medium and electronic device |
WO2019096064A1 (en) * | 2017-11-16 | 2019-05-23 | 腾讯科技(深圳)有限公司 | Data transmission method and device, storage medium, and electronic device |
CN108156467B (en) * | 2017-11-16 | 2021-05-11 | 腾讯科技(成都)有限公司 | Data transmission method and device, storage medium and electronic device |
CN107896333A (en) * | 2017-11-29 | 2018-04-10 | 北京未来媒体科技股份有限公司 | The method and device that a kind of remote control panoramic video based on intelligent terminal plays |
CN108418832A (en) * | 2018-03-26 | 2018-08-17 | 深圳市酷开网络科技有限公司 | A kind of virtual reality shopping guide method, system and storage medium |
CN108833892A (en) * | 2018-05-28 | 2018-11-16 | 徐州昇科源信息技术有限公司 | A kind of VR live broadcast system |
CN111614968A (en) * | 2020-05-11 | 2020-09-01 | 厦门潭宏信息科技有限公司 | Live broadcast method, equipment and storage medium |
CN111614968B (en) * | 2020-05-11 | 2021-12-17 | 厦门潭宏信息科技有限公司 | Live broadcast method, equipment and storage medium |
CN111447462A (en) * | 2020-05-20 | 2020-07-24 | 上海科技大学 | Live video streaming method, system, storage medium and terminal based on viewing-angle switching |
CN111447462B (en) * | 2020-05-20 | 2022-07-05 | 上海科技大学 | Live video streaming method, system, storage medium and terminal based on viewing-angle switching |
CN112203128A (en) * | 2020-10-12 | 2021-01-08 | 广州欢网科技有限责任公司 | Method, device and system for playing VR panoramic video of smart television terminal |
CN112383784A (en) * | 2020-11-16 | 2021-02-19 | 浙江传媒学院 | Video playing method, video transmission method and VR cluster playing system |
CN114615528A (en) * | 2020-12-03 | 2022-06-10 | 中移(成都)信息通信科技有限公司 | VR video playing method, system, device and medium |
CN114615528B (en) * | 2020-12-03 | 2024-04-19 | 中移(成都)信息通信科技有限公司 | VR video playing method, system, equipment and medium |
CN116266868A (en) * | 2021-12-17 | 2023-06-20 | 聚好看科技股份有限公司 | Display device and viewing-angle switching method |
CN114268840A (en) * | 2021-12-20 | 2022-04-01 | 中国电信股份有限公司 | Video pushing method and device, storage medium and electronic equipment |
CN116320366A (en) * | 2023-05-18 | 2023-06-23 | 中数元宇数字科技(上海)有限公司 | Video stream data pushing method, device, equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106878764A (en) | A kind of live broadcasting method of virtual reality, system and application thereof | |
CN106789991B (en) | Multi-person interactive network live broadcast method and system based on virtual scene | |
CN106792246B (en) | Method and system for interaction of fusion type virtual scene | |
US10645369B2 (en) | Stereo viewing | |
CN107615338B (en) | Methods and apparatus for generating and using reduced resolution images and/or transmitting such images to playback or content distribution devices | |
CN108337497B (en) | Virtual reality video/image format and shooting, processing and playing methods and devices | |
CN106658011A (en) | Panoramic video coding and decoding methods and devices | |
CN107529064A (en) | Adaptive encoding method based on VR terminal feedback |
CN204350168U (en) | Three-dimensional conference system based on holographic projection technology |
CN104065951B (en) | Video capture method, video broadcasting method and intelligent glasses | |
CN104335243B (en) | Method and device for processing panoramic images |
CN102934449A (en) | Three-dimensional recording and display system using near- and distal-focused images | |
CN105516639A (en) | Headset device, three-dimensional video call system and three-dimensional video call implementing method | |
CN108307197A (en) | Transmission and playback methods, device, and system for virtual reality video data |
CN106657719A (en) | Intelligent virtual studio system | |
CN107835435B (en) | Event wide-view live broadcasting equipment and associated live broadcasting system and method | |
CN101047872B (en) | Stereo audio-video device for TV |
US11010923B2 (en) | Image encoding method and technical equipment for the same | |
CN108184078A (en) | Video processing system and method |
CN107547889B (en) | Method and device for three-dimensional video based on instant messaging |
KR101433082B1 (en) | Video conversion and reproduction method for two-dimensional and three-dimensional video |
WO2017220851A1 (en) | Image compression method and technical equipment for the same | |
Inoue et al. | Multiple-angle 3D video technology for distant live concerts | |
TW202318880A (en) | Method and system for real-time streaming multi-angle video | |
CN115174942A (en) | Free viewing-angle switching method and interactive free viewing-angle playback system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20170620 |
|