CN116264640A - Viewing angle switching method, device and system for free viewing angle video - Google Patents


Info

Publication number
CN116264640A
CN116264640A (application CN202111508547.9A)
Authority
CN
China
Prior art keywords
video
barrage
terminal
time
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111508547.9A
Other languages
Chinese (zh)
Inventor
徐伟兴
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202111508547.9A
Priority to PCT/CN2022/135954 (published as WO2023103875A1)
Publication of CN116264640A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/268 Signal distribution or switching
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

A viewing angle switching method, device, and system for free viewing angle video, relating to the field of Internet technology, can automatically switch the playing viewing angle of a free viewing angle video to the viewing angle corresponding to a first barrage after a terminal detects that a user has operated the first barrage, thereby improving the viewing experience of free viewing angle video. The method includes: the terminal displays a first interface, where the first interface includes a video picture of a first free viewing angle video at a first viewing angle and a first barrage, and the first barrage corresponds to a second viewing angle of the first free viewing angle video; when a first operation of the user on the first barrage is detected, the terminal displays a second interface, where the second interface includes a video picture of the first free viewing angle video at the second viewing angle.

Description

Viewing angle switching method, device and system for free viewing angle video
Technical Field
The application relates to the technical field of internet, in particular to a viewing angle switching method, device and system for free viewing angle video.
Background
A free viewing angle video (free viewpoint video, FVV) is a video file in a special format, generated by jointly shooting a target object with a plurality of cameras surrounding the target object and applying specific video encoding to the jointly shot video streams. The generated free viewing angle video is uploaded to a media server for storage and is then distributed to clients on the terminal side through a content delivery network (content distribution network, CDN). When a user watches a free viewing angle video on a terminal client, the viewing angle can be switched freely by dragging with a finger. Compared with conventional video, in which the viewing angle is determined by the broadcast director, the user can freely choose the viewing angle, so the free viewing angle video provides the user with an immersive, interactive playing experience.
However, when the shooting scene of the target object is large while the viewing angle range the user can watch on the client is limited, blind areas exist: the user may miss highlight clips currently playing at other viewing angles outside the visible range. The viewing experience of the user therefore needs to be improved.
Disclosure of Invention
According to the viewing angle switching method, device, and system for free viewing angle video provided by this application, after the terminal detects that a user has operated a first barrage, the playing viewing angle of the free viewing angle video is automatically switched to the viewing angle corresponding to the first barrage, improving the viewing experience of free viewing angle video.
In order to achieve the above purpose, the embodiment of the present application provides the following technical solutions:
In a first aspect, a viewing angle switching method for free viewing angle video is provided, the method including: the terminal displays a first interface, where the first interface includes a video picture of a first free viewing angle video at a first viewing angle and a first barrage, and the first barrage corresponds to a second viewing angle of the first free viewing angle video; when a first operation of the user on the first barrage is detected, the terminal displays a second interface, where the second interface includes a video picture of the first free viewing angle video at the second viewing angle.
Therefore, when watching the first free viewing angle video, the user can quickly switch to the viewing angle corresponding to the first barrage by operating the first barrage, making it convenient to see the highlights of that viewing angle. It can be understood that when the shooting scene of the first free viewing angle video is large, the viewing angle range the user can watch on the terminal is limited and blind areas exist, but the user can learn from the barrage content displayed on the video picture which viewing angles have highlight clips. Highlight clips may be video clips well received by most users, video clips of interest to the viewer, and so on. By operating a specific barrage, the user can quickly switch to the viewing angle corresponding to that barrage and watch its highlights, improving the user's viewing experience.
In a possible implementation manner, the terminal displays a first interface, including: the terminal displays a first interface at a first playing moment; when the first operation of the user on the first barrage is detected, the terminal displays a second interface, which comprises the following steps: when the first operation of the user on the first barrage is detected, the terminal displays a second interface at a second playing time, wherein the second playing time is related to the first playing time.
In other words, the first interface includes a video picture of the first free viewing angle video at the first playing time at the first viewing angle, and the second interface includes a video picture of the first free viewing angle video at the second playing time at the second viewing angle.
In a possible implementation manner, the second playing time is related to the first playing time, including: the second playing time is equal to the first playing time, or the second playing time is equal to the first playing time minus a first preset time offset value; or, the second playing time is equal to the appearance time of the first barrage, or the second playing time is equal to the appearance time of the first barrage minus a second preset time offset value.
That is, in some examples, after the user performs the first operation on the first barrage in the first interface, not only is the playing viewing angle switched to the viewing angle corresponding to the first barrage, but the playing time is also switched to the playing time of the first interface when the first operation was performed, that is, the first playing time; or to a time slightly before the first playing time, that is, the second playing time equals the first playing time minus the first preset time offset value. In still other examples, after the user performs the first operation on the first barrage, the playing time is switched to the appearance time of the first barrage, or to a time slightly before it, that is, the appearance time of the first barrage minus the second preset time offset value.
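As a purely illustrative sketch (not part of the application; the function name, strategy labels, and offset values are assumptions), the play-time selection rules described above can be written as:

```python
def target_play_time(first_play_time: float,
                     bullet_appear_time: float,
                     strategy: str = "current",
                     offset: float = 0.0) -> float:
    """Return the second playing time for the viewing angle switch.

    strategy:
      "current"        -> equal to the first playing time
      "current_offset" -> first playing time minus a preset offset
      "appear"         -> equal to the first barrage's appearance time
      "appear_offset"  -> appearance time minus a preset offset
    """
    if strategy == "current":
        return first_play_time
    if strategy == "current_offset":
        return max(0.0, first_play_time - offset)  # never seek before t=0
    if strategy == "appear":
        return bullet_appear_time
    if strategy == "appear_offset":
        return max(0.0, bullet_appear_time - offset)
    raise ValueError(f"unknown strategy: {strategy}")
```

The offset variants let playback start slightly before the moment the barrage appeared, so the user sees the run-up to the highlight rather than its middle.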
In one possible implementation, the first barrage further includes the second viewing angle. That is, the first barrage also displays the viewing angle to which it corresponds.
In a possible implementation manner, when detecting a first operation of a user on a first barrage, the terminal displays a second interface at a second playing time, including: when detecting a first operation of a user on a first barrage, the terminal sends a switching request to a media server; the terminal receives a first video file returned by the media server, wherein the first video file is a video file corresponding to a second playing moment of a video of a first free view angle under a second view angle; and the terminal displays the video picture of the first free view video at the second playing time under the second view according to the first video file.
In a possible implementation manner, before the terminal displays the first interface at the first playing moment, the method further includes: the terminal requests bullet screen data corresponding to the video of the first free view angle at the first playing moment to the bullet screen server, wherein the bullet screen data corresponding to the first playing moment comprises data of a first bullet screen, and the data of the first bullet screen comprises a second view angle.
In a possible implementation manner, after the terminal displays the second interface at the second playing time, the method further includes: the terminal receives a second operation of sending the second barrage, the terminal sends data of the second barrage to the barrage server, the data of the second barrage comprise content of the second barrage and a third visual angle, and the third visual angle is a visual angle corresponding to a video of the first free visual angle when the second barrage is sent.
That is, when the terminal receives the second barrage input by the user on the video picture at the third viewing angle, the terminal associates the second barrage with the third viewing angle and sends the content of the second barrage together with the third viewing angle information to the barrage server.
In a possible implementation manner, the data of the second barrage further includes a second barrage appearance time, where the second barrage appearance time is equal to the second playing time, or the second barrage appearance time is equal to the second playing time minus a third preset time offset value.
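To illustrate the barrage data described in the preceding implementations, the following sketch shows one possible record a terminal might upload when sending a barrage. The field names and JSON encoding are assumptions for illustration only; the application does not prescribe a wire format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class BarrageRecord:
    video_id: str       # identifier of the free viewing angle video
    content: str        # text entered by the user
    view_angle: int     # viewing angle being played when the barrage was sent
    appear_time: float  # playing time, optionally minus a preset offset
    duration: float = 5.0  # how long the barrage stays on screen (assumed)

def build_payload(video_id: str, content: str, view_angle: int,
                  play_time: float, offset: float = 0.0) -> str:
    """Serialize a barrage, backdating its appearance time by `offset`."""
    record = BarrageRecord(video_id, content, view_angle,
                           appear_time=max(0.0, play_time - offset))
    return json.dumps(asdict(record))
```

Because the record carries the viewing angle, the barrage server can later return it to other viewers as a switchable barrage and aggregate it into per-angle counts.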
In one possible implementation, the second interface further includes the first barrage. That is, the second playing time after the switch still falls within the display duration of the first barrage, counted from the moment the user performed the first operation on it, so the first barrage continues to be displayed.
In a possible implementation manner, after the terminal displays the second interface at the second playing time, the method further includes: the terminal displays a third interface at a third playing time, wherein the third playing time is after the displaying time of the first barrage, the third interface comprises a video picture of the video of the first free view angle under the second view angle, and the third interface does not comprise the first barrage.
In a possible implementation manner, the method further includes: the terminal displays a fourth interface at a fourth playing moment, wherein the fourth interface comprises a video picture of the video of the first free view angle under the first view angle and view angle recommendation, and the view angle recommendation comprises a fourth view angle; when detecting a third operation of selecting the fourth visual angle by the user, the terminal displays a video picture of the first free visual angle video at a fifth playing moment under the fourth visual angle, wherein the fifth playing moment is related to the fourth playing moment.
In a possible implementation manner, the fifth playing time is related to the fourth playing time, including: the fifth playing time is equal to the fourth playing time, or the fifth playing time is equal to the fourth playing time minus a fourth preset time offset value.
In a possible implementation manner, before the terminal displays the fourth interface at the fourth playing time, the method further includes: the terminal requests a second video file corresponding to a fourth playing time of the first free view video under the first view to the media server, and requests view angle recommendation corresponding to the fourth playing time of the first free view video to the barrage server, wherein the view angle recommendation corresponding to the fourth playing time is determined by the barrage server according to the barrage number corresponding to the fourth playing time of the first free view video.
That is, the user can quickly switch the playing viewing angle of the first free viewing angle video through the viewing angle recommendation displayed by the terminal, improving the user's video watching experience.
In some embodiments, the barrage server may also recommend the viewing angles of highlight clips to the user, based on the collected barrage data, for the user to choose from. For example, while a free viewing angle video is playing, the number of barrages corresponding to a video segment reflects the popularity and excitement of that segment. When the number of barrages corresponding to a certain video segment exceeds a preset threshold, the segment can be considered a highlight clip. By operating the recommended viewing angle, the user can quickly switch to the segment of interest.
In a second aspect, a barrage switching method is provided, applied to a barrage server, the method including: receiving data of a first barrage sent by a first terminal, where the data of the first barrage includes an identifier of a first free viewing angle video, a first viewing angle, and the content of the first barrage, the first barrage being sent by the first terminal while a video picture of the first free viewing angle video at the first viewing angle is played; and storing the data of the first barrage.
In a possible implementation manner, the data of the first barrage further includes an appearance time of the first barrage.
In a possible implementation manner, the method further includes: receiving a first request sent by a second terminal, where the first request is used to request barrage data for a first playing time of the first free viewing angle video, and the first playing time is earlier than or equal to the appearance time of the first barrage; and, in response to the first request, returning the data of the first barrage to the second terminal.
In a possible implementation manner, after the receiving the first request sent by the second terminal, the method further includes: and sending a view angle recommendation list corresponding to the first playing time of the first free view angle video to the second terminal.
In a possible implementation manner, before the sending, to the second terminal, the view recommendation list corresponding to the first play time of the first freeview video, the method further includes: respectively counting the number of the barrages at different viewing angles corresponding to the first playing time of the video at the first free viewing angle; and determining a view angle recommendation list corresponding to the first playing time of the video of the first free view angle according to the number of the barrages of different views corresponding to the first playing time of the video of the first free view angle.
In a possible implementation manner, determining, according to the number of the barrages at different views corresponding to the first play time of the first freeview video, a view recommendation list corresponding to the first play time of the first freeview video includes: when the number of the barrages of the second view angles corresponding to the first playing time of the video of the first free view angle is determined to be larger than a preset threshold, determining that the view angle recommendation list corresponding to the first playing time of the video of the first free view angle comprises the second view angle.
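The counting-and-threshold logic above can be sketched as follows. This is an illustrative assumption about one way the barrage server could implement it; the function name, the time window, and the threshold value are not specified by the application.

```python
from collections import Counter

def recommend_views(bullets, play_time, window=10.0, threshold=3):
    """Build a viewing angle recommendation list for one playing time.

    bullets: iterable of (view_angle, appear_time) tuples collected by
             the barrage server.
    Counts barrages per viewing angle within +/- `window` seconds of
    `play_time`; angles whose count exceeds `threshold` are recommended,
    most-commented first.
    """
    counts = Counter(view for view, t in bullets
                     if abs(t - play_time) <= window)
    return [view for view, n in counts.most_common() if n > threshold]
```

A terminal requesting barrage data for a playing time could receive this list alongside the barrages themselves and render it as the viewing angle recommendation in the fourth interface.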
In one possible implementation, the number of second viewing angles is one or more.
In a third aspect, a terminal is provided, including: a processor, a memory and a touch screen, the memory, the touch screen being coupled to the processor, the memory being for storing computer program code, the computer program code comprising computer instructions which, when read from the memory by the processor, cause the terminal to perform the method as in the first aspect described above, and any one of the possible implementations thereof.
In a fourth aspect, there is provided a barrage server comprising: a processor, a memory and a communication interface, the memory, the communication interface being coupled to the processor, the memory being for storing computer program code, the computer program code comprising computer instructions which, when read from the memory by the processor, cause the barrage server to perform the method as in the second aspect described above, and any one of the possible implementations thereof.
A fifth aspect provides a computer readable storage medium comprising computer instructions which, when run on a terminal, cause the terminal to perform the method of the first or second aspect as described above and any one of the possible implementations of the aspects described above.
A sixth aspect provides a computer program product for causing a terminal to perform the method of the first or second aspect and any one of the possible implementations of the first or second aspect when the computer program product is run on a computer.
The technical effects achieved by the terminal, the bullet screen server, the computer readable storage medium, and the computer program product provided in the third aspect to the sixth aspect may refer to the description of the technical effects in the first aspect and the possible implementation manners in the first aspect, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of a play system of a video with a free view angle according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a client/application program of a freeview video according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 5A is a flow chart of a method for sending a video bullet screen with a free viewing angle according to an embodiment of the present application;
fig. 5B is a flow chart of a view angle switching method of a free view angle video according to an embodiment of the present application;
fig. 6 is a schematic diagram of some terminal interfaces provided in embodiments of the present application;
fig. 7 is a schematic diagram of still other terminal interfaces provided in embodiments of the present application;
fig. 8 is a schematic diagram of still other terminal interfaces provided in embodiments of the present application;
fig. 9 is a schematic diagram of still other terminal interfaces provided in embodiments of the present application;
fig. 10 is a schematic diagram of still other terminal interfaces provided in an embodiment of the present application;
fig. 11 is a flowchart of another view angle switching method of a free view angle video according to an embodiment of the present application;
fig. 12 is a schematic diagram of still other terminal interfaces provided in embodiments of the present application;
fig. 13 is a flowchart illustrating another view angle switching method of a free view angle video according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
In the description of the embodiments of this application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. The terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of this application, unless otherwise indicated, "a plurality" means two or more. In the embodiments of this application, words such as "exemplary" or "for example" are used to serve as examples, illustrations, or explanations. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion.
In order to facilitate understanding of the technical solution provided in the embodiments of the present application, technical terms related to the embodiments of the present application are first described.
Free viewing angle video: a video file in a special format, generated by jointly shooting a target object with a plurality of cameras surrounding the target object and applying specific video encoding to the jointly shot video streams. The generated free viewing angle video is uploaded to a media server for storage and then distributed to clients on the terminal side through a content delivery network (content distribution network, CDN). When a user watches a free viewing angle video on the terminal client, the user can manually control and freely select the viewing angle.
Machine position: the position of each camera when a plurality of cameras surrounding a target object shoot the target object. Generally, one machine position corresponds to one shooting angle, and one shooting angle can capture images of the target object within a certain angular range.
Viewing angle: also called watching angle, playing angle, etc.; the shooting angle from which the user views the free viewing angle video. That is, the image of the target object seen from one viewing angle can be regarded as the image captured by a camera shooting from the position (i.e., the machine position) of that viewing angle.
Video playing: after a video file (including a free viewing angle video file) is decoded, rendered, and otherwise processed, a group of images is displayed in a display window in time order, together with the corresponding audio.
Barrage (bullet screen): comments or annotations displayed over the video in a certain pattern at particular time points or periods of a video (including a free viewing angle video), adding interest and an element of surprise to video viewing.
As shown in fig. 1, a schematic system architecture is provided in an embodiment of the present application, where the system architecture includes a camera array 500, a media processing server 200, a media server 300, a content delivery network (content distribution network, CDN) 400, a terminal 100, and a bullet screen server 600.
The camera array 500 includes a plurality (e.g., 18 to 36) of cameras, or other devices with cameras, surrounding a target object (i.e., a shot subject/scene); they correspond to machine position 0, machine position 1, ..., machine position n, respectively. The camera array 500 shoots the target object from multiple angles and uploads the shot multi-angle video streams to the media processing server 200.
The media processing server 200 is configured to transcode the received video streams with multiple angles to generate a freeview video, and then the media processing server 200 uploads the generated freeview video to the media server 300 for storage.
Illustratively, the processing that the media processing server 200 performs on the multi-angle video streams includes, but is not limited to, focus alignment, frame-alignment synchronous encoding, rotation auxiliary stream generation, virtual viewing angle synthesis, and the like. Focus alignment may, for example, crop the images to ensure a smooth transition of the subject's picture while the viewing angle is being switched. Frame-alignment synchronous encoding aligns video frames shot at the same moment by multiple cameras. Virtual viewing angle synthesis inserts virtual video streams between two real video streams, reducing the number of cameras needed for shooting, or smoothing the transition of the subject's picture during viewing angle switching. The rotation auxiliary stream is used for rotating display and helps reduce overall bit-rate occupancy.
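The application does not specify how frame alignment is performed; a minimal sketch, assuming each camera stream exposes sorted frame timestamps, is to pick the nearest frame from every camera for each reference instant so all streams advance in lockstep:

```python
import bisect

def align_frames(streams, instants):
    """streams: dict camera_id -> sorted list of frame timestamps.
    Returns, for each reference instant, a dict mapping camera_id to the
    timestamp of that camera's nearest frame.
    """
    aligned = []
    for t in instants:
        picked = {}
        for cam, times in streams.items():
            i = bisect.bisect_left(times, t)
            # The nearest frame is one of the two neighbours of position i.
            candidates = times[max(0, i - 1):i + 1] or [times[-1]]
            picked[cam] = min(candidates, key=lambda x: abs(x - t))
        aligned.append(picked)
    return aligned
```

Real frame-alignment synchronous encoding would operate on decoded frames and codec state rather than bare timestamps; this only illustrates the matching step.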
It should be noted that, the processing procedure of the media processing server 200 is merely an example, and the embodiment of the application is not limited to a specific technical scheme adopted in the process of generating the freeview video according to the video stream uploaded by the camera array 500.
It should be further noted that, the media processing server 200 may be at least one of a stand-alone physical server, a plurality of stand-alone physical servers, a cloud server providing cloud computing, a cloud computing platform, and a virtualization center.
The media server 300 stores and maintains a large number of freeview videos. When the media server 300 receives a request sent by the terminal 100, requesting the content of a certain freeview video, the media server 300 quickly returns the content of the corresponding freeview video to the terminal 100 through the CDN 400.
It should be noted that, the media server 300 may be at least one of a single independent physical server, a plurality of independent physical servers, a cloud server providing cloud computing, a cloud computing platform, and a virtualization center.
CDN 400 relies on edge servers deployed in various locations; through the central platform's load balancing, content distribution, and scheduling modules, it enables terminal users (such as terminal 100) to obtain required content, such as free viewing angle video, from nearby servers, reducing network congestion and improving user access response speed and hit rate.
The terminal 100 may request the content of a certain free viewing angle video from the media server 300 through the free viewing angle video client installed on it, and play the video according to the content returned by the media server 300 through the CDN 400. If the terminal 100 also has the barrage function enabled while playing the free viewing angle video, the terminal 100 further requests from the barrage server 600 the barrage information corresponding to the current playing time or period. In other words, the picture played by the terminal 100 includes both the picture of the free viewing angle video and the corresponding barrage information. In addition, while playing the free viewing angle video, the terminal 100 may receive barrage content input by the user and upload that content, together with other barrage information, to the barrage server 600 for storage.
For example, the terminal 100 in the embodiment of the present application may be, for example, a mobile phone, a tablet computer, a personal computer (personal computer, PC), a personal digital assistant (personal digital assistant, PDA), a netbook, a wearable device, an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, a vehicle-mounted device, a smart screen, a smart car, or the like, and the specific form of the terminal 100 is not limited in this application.
The bullet screen server 600 is configured to receive bullet screen information sent from each terminal (including the terminal 100, for example), where the bullet screen information includes, but is not limited to, an identification of a video of a free viewing angle, an appearance time of a bullet screen, a duration time of the bullet screen, content of the bullet screen, and a viewing angle at the time of sending the bullet screen, and the meaning of each parameter will be described in detail later, and will not be described here.
Note that, here, the barrage server 600 may be at least one of a single independent physical server, a plurality of independent physical servers, a cloud server providing cloud computing, a cloud computing platform, and a virtualization center.
It should be understood that the system structure illustrated in this embodiment does not constitute a specific limitation on the system architecture to which the technical solution is applied. In other embodiments of the present application, the applicable system architecture may include more or fewer devices than illustrated, or some of the devices may be combined, or some of the devices may be split.
As shown in fig. 2, an exemplary structure diagram of a terminal 100 according to an embodiment of the present application is provided. The terminal 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal 100. In other embodiments of the present application, the terminal 100 may include more or fewer components than illustrated, or certain components may be combined, or certain components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
Video codecs are used to compress or decompress digital video. The terminal 100 may support one or more video codecs. In this way, the terminal 100 may play or record video in a variety of encoding formats, such as moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc. In the embodiment of the present application, the video codec decodes the freeview video that the terminal 100 downloads from the media server 300, so that the video can be played on the terminal 100.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program required for at least one function (such as a play function of a freeview video, etc.), and the like. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the terminal 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the terminal 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
In some embodiments of the present application, the internal memory 121 stores the program code required by the client/application program of the freeview video, where the client/application program of the freeview video may implement the functions of downloading the freeview video from the media server 300, playing the freeview video, receiving the barrage content input by the user, uploading the barrage content input by the user and other barrage information to the barrage server 600, receiving the operation of switching the viewing angle of the currently played freeview video, and so on.
For example, as shown in fig. 3, a schematic structure diagram of a client/application program of a freeview video according to an embodiment of the present application is provided. The client/application of the freeview video includes: the system comprises a user interface module, a user control module, a barrage module and a play module.
Wherein the user interface module is used for providing a graphical user interface (graphical user interface, GUI) for the client/application program to interact with the user, displaying video pictures of the free view video, and the like.
The user control module is used for receiving the user's view-switching operation (such as a sliding operation of the user's finger on the display screen, or the user pressing a key of a remote controller), calculating the switching direction of the viewing angle (for example, switching the viewing angle leftward or rightward) and the switching angle amplitude (for example, 10 degrees, 20 degrees, etc.) according to that operation, and sending the calculated switching direction and switching angle amplitude to the view angle control module in the playing module, so as to switch the viewing angle of the currently played freeview video.
In some examples, the display of the terminal 100 is a touch screen, and the user control module includes a display screen sliding control module. The display screen sliding control module is used for capturing display screen events, where a display screen event includes a detected finger sliding event, and a sliding event includes the sliding direction and distance of the finger; the switching direction and switching angle amplitude of the viewing angle are determined according to the display screen event. In other examples, the terminal 100 is configured with a remote control, and the user control module includes a remote control key control module. The remote control key control module is used for receiving key events of the remote control, where a key event includes the identification of a key, the number of times the key is pressed, the pressing duration, and the like; the switching direction and switching angle amplitude of the viewing angle are determined according to the key event. For example, the key operated by the user, such as a left key or a right key, is identified based on the identification of the key, thereby determining the switching direction; the switching angle amplitude is identified according to the number of presses or pressing duration of the key.
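As a minimal sketch of how the display screen sliding control module might map a horizontal finger slide to a switching direction and angle amplitude; the sensitivity constant and all names are illustrative assumptions, not taken from the patent:

```python
DEGREES_PER_PIXEL = 0.1  # assumed sensitivity: sliding 100 px switches the view by 10 degrees

def slide_to_switch(start_x: float, end_x: float) -> tuple:
    """Return (direction, angle amplitude in degrees) for a horizontal slide."""
    distance = end_x - start_x
    direction = "right" if distance > 0 else "left"
    amplitude = abs(distance) * DEGREES_PER_PIXEL
    return direction, amplitude
```

A key-event variant for the remote control case could be built the same way, mapping the key identification to the direction and the press count to the amplitude.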
The playing module comprises a view angle control module, a data unpacking module, a video rendering module, a data downloading module, an audio and video decoding module, and an audio and video synchronization module.
The view angle control module is used for receiving the user-indicated switching direction and switching angle amplitude sent by the user control module, calculating the switched target view angle based on the current view angle of the freeview video, and notifying the data downloading module to download the video file of the target view angle. The data downloading module requests the video file corresponding to the target view angle from the media server 300 by calling the relevant software and hardware interfaces of the terminal 100. Then, the audio and video data are unpacked from the video file returned by the media server 300 by the data unpacking module of the client/application program of the terminal 100, decoded by the audio and video decoding module, the video pictures are rendered by the video rendering module, and the video and audio in the media data are synchronized by the audio and video synchronization module; the video corresponding to the target view angle is then played, that is, the view angle switching of the freeview video is realized.
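The target-view calculation the view angle control module performs can be sketched as applying the signed switching amplitude to the current view angle. Wrap-around at 360 degrees is an assumption here; a real system might instead clamp to the range of camera positions available for the video:

```python
def target_view(current: float, direction: str, amplitude: float) -> float:
    """Compute the switched target view angle in degrees.

    `current` is the current view angle; `direction` and `amplitude` come
    from the user control module. Wrap-around at 360 degrees is assumed.
    """
    delta = amplitude if direction == "right" else -amplitude
    return (current + delta) % 360
```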
The barrage module comprises a barrage input module, a barrage display module and a barrage control module. The barrage input module is configured to receive barrage content (which may also include a format of barrage content selected by the user, such as color, font, bold, slant, etc.) input by the user, and upload the barrage content and other barrage information to the barrage server 600. Optionally, the barrage input module may further automatically fill in a viewing angle corresponding to the free viewing angle video when the user inputs barrage content. The barrage display module is used for displaying barrage information corresponding to the current playing time of the free view video. The barrage control module is used for receiving the operation of switching the visual angles of the user (for example, the user clicks a control corresponding to the target barrage or the user selects an option of a specific visual angle in the visual angle recommendation list), analyzing the switched target visual angles according to the operation of the user, and sending the target visual angles to the visual angle control module of the playing module, so that the visual angle control module switches the played free visual angle video to the target visual angle.
The wireless communication function of the terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The mobile communication module 150 may provide a solution of wireless communication including 2G/3G/4G/5G/6G/next generation wireless communication technology, etc., applied to the terminal 100. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied on the terminal 100.
Terminal 100 implements display functions via a GPU, display 194, and application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The terminal 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, files such as music and video are stored in the external memory card.
The terminal 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the terminal 100. The charging management module 140 may also supply power to the terminal through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
Referring to fig. 4, fig. 4 shows a schematic diagram of a bullet screen server 600, where the bullet screen server 600 includes one or more processors 610, one or more memories 620, and one or more communication interfaces 630.
The processor 610, the memory 620, and the communication interface 630 are connected by a bus. The processor 610 may include a general purpose central processing unit (central processing unit, CPU) (e.g., CPU0 and CPU1), a microprocessor, an application-specific integrated circuit (application-specific integrated circuit, ASIC), a graphics processor (graphics processing unit, GPU), a neural-network processor (neural-network processing unit, NPU), or an integrated circuit for controlling program execution in the present application, etc.
In general, memory 620 may be used to store computer-executable program code that includes instructions. The memory 620 may include a stored program area and a stored data area. The storage program area may store an operating system, application program codes, and the like. In some examples, the storage data area stores bullet screen data for each freeview video uploaded by the user. In other examples, the stored data area also stores a view list of highlight segments of the respective freeview video, or the like, as determined from the user uploaded barrage data. In addition, the memory 620 may include high-speed random access memory, and may also include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash memory (universal flash storage, UFS), and the like.
The processor 610 performs various functional applications and data processing of the barrage server 600 by executing instructions stored in the memory 620. In one example, the processor 610 may also include multiple CPUs, and the processor 610 may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, or processing cores for processing data (e.g., computer program instructions).
Communication interface 630 may be used to communicate with other devices or communication networks such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Note that the media processing server 200, the media server 300, and the like shown in fig. 1 may refer to the structure of the barrage server 600 shown in fig. 4. The media processing server 200 and the media server 300 may comprise more or fewer components than the barrage server 600, or may combine certain components, or split certain components, or arrange the components differently. The embodiments of the present application are not limited in this regard.
The following describes the technical solution provided in the embodiments of the present application in detail.
As shown in fig. 5A, a flow chart of a method for sending a video bullet screen with a free view angle according to an embodiment of the present application is shown, where the flow chart includes:
S500, the first terminal requests a video file a of the first freeview video under the view A from the media server.
S501, the media server returns the video file a to the first terminal.
S502, the first terminal plays a video picture of the first free view video under the view angle A according to the video file a.
Illustratively, in steps S500-S502, a first terminal (which may be any of the terminals 100 in fig. 1 described above) is installed with a client/application for playing a freeview video, which may be referred to as a client/application for a freeview video. The user may launch the client/application and select to play the first freeview video in the display interface of the client/application. In some embodiments, the first terminal defaults to playing the video frame of the first freeview video under view a, and then when detecting that the user selects to play the first freeview video, the first terminal requests the video file a of the first freeview video under view a from the media server via the CDN. Correspondingly, the media server returns a video file a to the first terminal through the CDN, and the first terminal displays a video picture of the first free view video under the view angle A according to the video file a. In other embodiments, the first terminal first plays the video frame of the first freeview video at the other view than the view a. When the first terminal detects that the user instructs to switch to the view angle A, the first terminal requests a video file a of the first free view angle video under the view angle A from the media server through the CDN. Correspondingly, the media server returns a video file a to the first terminal through the CDN, and the first terminal displays a video picture of the first free view video under the view angle A according to the video file a.
For example, the first terminal is a mobile phone. Video applications installed on the cell phone support playing free view video. Then, the user may launch a video application on the cell phone and select to play the first freeview video in the interface of the video application. For example, as shown in fig. 6 (1), a playing interface 601 for playing a first freeview video is displayed for a mobile phone. A video frame 602 of the first freeview video at a certain viewing angle and a bullet screen (including a bullet screen 603, for example) appearing on the video frame 602 with a certain rule are included in the play interface 601. Some functionality controls may also be included in the play interface 601, such as: a play progress bar 604, a play control 609, a next control 610, a bullet screen switch control 605, a bullet screen input box 606, a bullet screen send control 607, a view control 608, and the like.
The playing progress bar 604 is used for displaying the playing progress of the current video, and is used for adjusting the playing progress of the current video by a user in a dragging manner. A play control 609 is used to pause or continue playing the current video. A next control 610 for switching to the next video. A barrage switch control 605 is used to turn on or off the barrage function. A barrage input box 606 for receiving the content of the barrage entered by the user. The bullet screen send control 607 is used to send the content of the bullet screen input by the user or the default bullet screen content.
The view control 608 is configured to display a view angle (for example, 60 degrees) corresponding to the video frame 602 currently played by the mobile phone, and call up a view angle scale for switching the view angle. When it is detected that the user operates the view angle control 608, the mobile phone displays a play interface 611 as shown in fig. 6 (2), and a view angle scale 612 is displayed in the play interface 611. The user can switch the viewing angle of the free viewing angle video currently played by the mobile phone by sliding the finger left and right on the viewing angle scale 612. For example, when an operation of sliding the finger of the user rightward on the scale is detected, the cellular phone displays a playback interface 613 as shown in fig. 7 (1). The playback interface 613 is a video frame of the first freeview video at a view a (e.g., 70 degrees).
It should be noted that, here, taking the example that the user operates the view control 608 to call up the view scale 612, in other examples, the mobile phone may also directly display the view scale 612 in the play interface 601 of the freeview video. The interface for displaying the viewing angle scale 612 is not limited in the embodiments of the present application.
In some examples, after the handset does not receive the user operation view angle scale within the preset duration 1, the view angle scale may be hidden, such as the play interface 614 shown in fig. 7 (2). In other examples, when detecting the operation of clicking the display screen on the playing interface 614 by the user, the mobile phone displays the playing interface 615 as shown in (3) in fig. 7, and the above functional controls, such as the playing progress bar 604, the playing control 609, the next control 610, the bullet screen switch control 605, the bullet screen input box 606, the bullet screen send control 607, the view angle control 608, and the like, are redisplayed in the playing interface 615.
S503, when the first terminal plays the video picture of the first freeview video under the view angle A, it receives an operation 1 in which the user inputs a first barrage, and the first terminal generates the data of the first barrage. The data of the first barrage includes the view angle A and a playing time T1.
In some embodiments, when the first terminal plays the video frame of the first freeview video at the view a, the barrage function is further turned on, and then the first terminal further needs to request the barrage server for barrage data corresponding to the first freeview video at the corresponding playing time according to the playing progress of the first freeview video (i.e. step S503' in fig. 5A). Accordingly, after receiving the bullet screen data returned by the bullet screen server, the first terminal displays the corresponding bullet screen on the displayed video frame (i.e. step S503 "in fig. 5A). In some examples, when watching the first freeview video, the user may further input the content of the first barrage through the first terminal, and send the data of the first barrage to the barrage server.
For example, as shown in fig. 7 (3), when the mobile phone plays the video frame of the first free view video at the view angle a, the mobile phone detects that the user inputs the content of the first barrage in the barrage input box 606, and operates the barrage sending control 607, and the mobile phone generates the data of the first barrage. In this example, operation 1 includes inputting the content of the first bullet screen (e.g., "too wonderful") in the bullet screen input box 606, and operating the bullet screen send control 607. In this embodiment, the data of the first barrage generated by the mobile phone includes a viewing angle a (e.g., 70 degrees). In addition, the data of the first bullet screen may further include one or more of an identification of the first free view video, an identification of the first bullet screen, a content of the first bullet screen, a playing time T1 (i.e., a time of appearance of the first bullet screen, or referred to as a position of appearance of the first bullet screen, e.g., 20:10), a duration Δt of the first bullet screen, a format of the first bullet screen, a sending position of the first bullet screen (i.e., a playing time when the first bullet screen is sent), and the like.
As shown in Table 1 below, an example of the data format employed for the data of the first barrage:

Table 1

| Film identification | Barrage identification | Barrage content | Appearance time | Duration | Font/color | View angle |
| film 1 | barrage 1 | "too wonderful" | 20:10 | 15 seconds | user-set or default | 70 degrees |
Wherein the value of the film identification field is an identification of the first freeview video, such as film 1. The value of the bullet screen identification field is the identification of the first bullet screen, such as bullet screen 1. The value of the barrage content field is the content of the first barrage entered by the user, such as "too wonderful".
The value of the appearance time field is the playing time T1, that is, the playing time at which the first barrage starts to be displayed on the display screen. In some embodiments, the playing time T1 is equal to the sending time of the first bullet screen. For example, if the user of the first terminal sends the first bullet screen when the freeview video plays to 20:10, the appearance time of the first bullet screen, that is, the playing time T1, may be 20:10. It will be appreciated that in other embodiments, the appearance time of the first barrage differs from its sending time. For example, in some scenes where a highlight in the video has a longer duration, or where the user is accustomed to sending a bullet screen at the beginning of the highlight, the sending time of the first bullet screen is determined to be the playing time T1, so as to remind other users to notice the highlight in the video. In other embodiments, the playing time T1 is equal to the sending time of the first bullet screen minus a preset time offset value 1. The preset time offset value 1 may be predefined or set by a user, which is not limited in the embodiments of the present application. The other preset time offset values (e.g., preset time offset value 2 to preset time offset value 5) are obtained in a similar manner, may be the same as or different from the preset time offset value 1, and are not described further. For example, if the sending time of the first bullet screen is 20:10 and the preset time offset value 1 is 5 seconds, the appearance time of the first bullet screen, that is, the playing time T1, is 20:05.
It will be appreciated that in other scenarios, where a highlight in the video has a shorter duration, or where the user is accustomed to sending a bullet screen at the end of the highlight, determining the sending time of the first bullet screen minus the preset time offset value 1 as the playing time T1 facilitates alerting other users to notice the highlight in the video in advance. In still other embodiments, the playing time T1 is equal to the sending time of the first bullet screen plus a preset time offset value 2. For example, if the sending time of the first bullet screen is 20:10 and the preset time offset value 2 is 3 seconds, the appearance time of the first bullet screen, that is, the playing time T1, is 20:13. The preset time offset value 1 and the preset time offset value 2 may be the same or different. It can be appreciated that in still other scenarios, the user is accustomed to commenting before the highlight, and the sending time of the first bullet screen plus the preset time offset value 2 is then determined as the playing time T1, so as to remind other users to notice the highlight in the video in time.
The duration field has a value of the duration Δt for the first bullet screen to appear on the display screen (e.g., 15 seconds). In some examples, the first bullet screen appears from the rightmost side of the terminal display screen, and then the first bullet screen moves from the rightmost side of the display screen to the left side until the first bullet screen disappears from the display screen. Wherein the whole process from the display of the first bullet screen starting from the rightmost side of the display screen to the disappearance of the first bullet screen at the leftmost side of the display screen takes a time Δt, for example 15 seconds. The duration Δt may be a preset value or set by the user. For example, the user may set a bullet screen mode, with different bullet screen modes corresponding to different durations. The field values of the font, the color, etc. are set by the user when inputting the content of the first bullet screen or are set by default for the first terminal. The value of the view field is view a.
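A minimal sketch of a barrage record holding the fields described above, together with the appearance-time computation; all names are illustrative assumptions, and times are expressed in seconds of playback for simplicity:

```python
from dataclasses import dataclass

@dataclass
class BarrageRecord:
    film_id: str          # identification of the freeview video, e.g. "film 1"
    barrage_id: str       # identification of the barrage, e.g. "barrage 1"
    content: str          # barrage content entered by the user
    appearance_time: int  # playing time T1, seconds into the video
    duration: int         # delta-t: seconds the barrage stays on screen
    view: float           # view angle when the barrage was sent, e.g. 70.0

def compute_appearance_time(send_time: int, offset: int = 0) -> int:
    """Playing time T1 derived from the sending time: a negative offset
    (preset time offset value 1) shows the barrage earlier, a positive
    offset (preset time offset value 2) shows it later, and an offset of
    0 makes T1 equal to the sending time."""
    return send_time + offset
```

For instance, a sending time of 20:10 (1210 s) with a 5-second preset offset value 1 yields an appearance time of 20:05 (1205 s), matching the example above.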
S504, the first terminal sends data of the first barrage to the barrage server.
S505, the barrage server stores the data of the first barrage sent by the first terminal.
In some embodiments, after receiving the data of the first barrage sent by the first terminal, the barrage server performs an audit on the data of the first barrage, for example, content audit, format audit, and the like, and after the audit passes, stores the data of the first barrage.
Then, when other users use other terminals to watch the first freeview video with the barrage function enabled, and the first freeview video is played to the playing time T1, the played video picture includes the content of the first barrage.
The following description will be given by taking, as an example, the case where another user views the first freeview video using the second terminal (which may be any one of the terminals 100 in fig. 1, or specifically, the first terminal).
As shown in fig. 5B, a flow chart of a method for switching a view angle of a free view angle video according to an embodiment of the present application is shown, where the flow chart includes:
S506, the second terminal requests from the media server the video file B of the first freeview video at the playing time T2 under the view B (shown in step S506a in fig. 5B), where the view B is different from the view A. The second terminal also requests from the bullet screen server the bullet screen data corresponding to the playing time T2 of the first freeview video (shown in step S506b in fig. 5B). The barrage data corresponding to the playing time T2 includes the data of the first barrage.
It should be noted that, step S506b may be performed before, after, or simultaneously with step S506a, and the embodiment of the present application does not limit the execution sequence of step S506a and step S506 b.
S507, the media server returns the video file B to the second terminal (shown in step S507a in fig. 5B). The barrage server returns barrage data corresponding to the playing time T2 of the first freeview video to the second terminal (shown in step S507B in fig. 5B).
It should be noted that, step S507b may be performed before, after, or simultaneously with step S507a, and the embodiment of the present application does not limit the execution sequence of step S507a and step S507 b.
S508, the second terminal displays, according to the video file B, the video picture of the first freeview video at the playing time T2 under the view B together with the first barrage. Optionally, the second terminal further displays the viewing angle information of the first barrage, that is, the view angle A.
The playing time T2 is greater than or equal to the playing time T1, and less than or equal to the playing time T1 plus the duration Δt of the first bullet screen.
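This display condition can be sketched as a filter over stored barrage records (the dictionary keys here are illustrative, not from the patent):

```python
def visible_barrages(records, t2):
    """Select barrages whose display interval [T1, T1 + duration] covers
    the playing time t2, i.e. T1 <= t2 <= T1 + duration."""
    return [r for r in records
            if r["appearance_time"] <= t2 <= r["appearance_time"] + r["duration"]]
```

The barrage server (or the terminal's barrage display module) would apply such a filter to decide which barrages belong to the requested playing time or time period.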
In steps S506-S508, the second terminal is installed with a client/application program for playing the freeview video. The client/application installed on the second terminal may be the same as or different from the client/application installed on the first terminal. The user can instruct the second terminal to play the video picture of the first freeview video under any view through operating the client/application program of the freeview video on the second terminal. Optionally, after the user opens the barrage function, the second terminal further requests barrage data of corresponding playing time to the barrage server according to the playing progress of the video of the first free view angle, and displays the barrage of corresponding time. The related operation of the user on the second terminal may refer to the related operation on the first terminal in step S500-step S502, which will not be described here.
For example, the second terminal is a mobile phone. As shown in fig. 8 (1), the mobile phone plays the video frame of the first freeview video at the view B before the play time T1 (e.g., at play time T0, such as 20:05). That is, the second terminal requests the video file of the first freeview video under the view B (including the video file B at the playing time T2 of the first freeview video under the view B) from the media server via the CDN, and plays the video picture of the first freeview video under the view B according to the video file returned by the media server via the CDN. Optionally, if the user further turns on the barrage function while the second terminal plays the first freeview video, the second terminal further requests from the barrage server the barrage data of the first freeview video at the corresponding playing times (including the barrage data corresponding to the playing time T2 of the first freeview video), and displays the corresponding barrages according to the barrage data of the first freeview video returned by the barrage server.
It should be noted that, in this example, the second terminal may request the video file of the first freeview video at the view B from the media server, and then request the barrage data of the first freeview video at the corresponding playing time from the barrage server, that is, first execute step S506a, and then execute step S506B. Optionally, the second terminal may request the barrage server for barrage data of the first freeview video at the corresponding playing time, and then request the media server for a video file of the first freeview video at the view B, that is, execute step S506B first, and then execute step S506a.
As shown in fig. 8 (2), when the playing time reaches the playing time T1 (e.g., 20:10), the second terminal starts to display the first barrage. In some examples, the first barrage displayed by the second terminal may move from one side of the display screen to the other at a certain speed until it disappears, and the duration from when the second terminal starts displaying the first barrage to when it disappears is equal to the duration Δt (e.g., 15 seconds) of the first barrage. In other words, starting from the playing time T1, the second terminal displays the first barrage during the period from the playing time T1 to the playing time T1 + the duration Δt (e.g., 20:10-20:25). In other examples, the second terminal also displays the viewing angle corresponding to the first barrage, i.e., viewing angle A. That is, when viewing the first barrage on the second terminal, the user can also learn that the first barrage corresponds to the viewing angle A.
In this example, the playing time T2 is any time in the period from the playing time T1 to the playing time T1 + the duration Δt of the first barrage. For example, the playing time T2 may be any playing time between 20:10 and 20:25.
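The constraint on the playing time T2 can be sketched as a simple visibility check. This is a minimal illustration, not part of the embodiment: the function name and the seconds-based representation of the video timeline are assumptions.

```python
def barrage_visible(t1: int, delta_t: int, t2: int) -> bool:
    """Return True when play time t2 falls inside the display window of a
    barrage that appears at play time t1 and lasts delta_t seconds,
    i.e. t1 <= t2 <= t1 + delta_t."""
    return t1 <= t2 <= t1 + delta_t

# The first barrage appears at 20:10 (1210 s on the video timeline)
# and lasts 15 s, so it is visible from 20:10 through 20:25.
T1, DELTA_T = 20 * 60 + 10, 15
print(barrage_visible(T1, DELTA_T, 20 * 60 + 15))  # 20:15 -> True
print(barrage_visible(T1, DELTA_T, 20 * 60 + 26))  # 20:26 -> False
```

Any playing time for which this check returns True is a valid choice of T2 in this example.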
The second terminal is still exemplified as a mobile phone. As shown in fig. 9 (1), the second terminal plays the video frame of the first freeview video at the view E (a view other than view B; view E may be view A or another view) before the play time T1 (e.g., at play time T0, such as 20:05). As shown in fig. 9 (2), when the playing time is the playing time T1 (e.g., 20:10), the second terminal starts to display the first barrage. It should be noted that, at this time, the second terminal displays the video frame of the first freeview video at the view E. For the display manner of the first barrage, refer to the description of fig. 8, which is not repeated here.
Then, as shown in fig. 9 (3), when the first freeview video is played to the video frame at a playing time T3 (e.g., 20:15), where the playing time T3 is after the playing time T1 and before T1 + Δt, the second terminal detects an operation of the user switching to the view B, requests the video file B of the first freeview video at the view B from the media server via the CDN, and plays the video frame of the first freeview video at the view B according to the video file B returned by the media server via the CDN. It should be noted that at this time the second terminal still displays the content of the first barrage. That is, the second terminal displays both the video frame of the first freeview video at the view B and the first barrage.
In this example, the playing time T2 is any time within the period from the playing time T3 (i.e., after the first freeview video is switched to the view B) to the playing time T1 + the duration Δt of the first barrage. For example, the playing time T2 may be any playing time between 20:15 and 20:25.
It may be noted that, in this example, the second terminal requests the barrage data of the first freeview video at the corresponding playing time from the barrage server, and then requests the video file of the first freeview video at the view B from the media server, that is, step S506b is performed first, and then step S506a.
S509, when the second terminal plays the video frame of the first freeview video at the playing time T2 under the view B, the second terminal detects that the user performs the operation 2 on the first barrage, and the second terminal sends a view switching request to the media server, for requesting the video file c of the first freeview video at the playing time T4 under the view A.
The operation 2 is, for example, that a user's finger operates a virtual control corresponding to the first barrage on a display screen (which is a touch screen) of the second terminal, or that the user selects a control corresponding to the first barrage displayed on the second terminal by manipulating a button of the remote controller.
The playing time T4 is related to the playing time T1. For example, playing time T4 = playing time T1, or playing time T4 = playing time T1 - preset time offset value 3 (e.g., 5 seconds or 10 seconds). Alternatively, the playing time T4 is related to the playing time T2. For example, playing time T4 = playing time T2, or playing time T4 = playing time T2 - preset time offset value 4.
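The alternatives above can be summarized in one small helper. This is a sketch: the strategy keyword, the function name, and the seconds-based time representation are illustrative assumptions, not part of the embodiment.

```python
def switched_play_time(t1: int, t2: int, base: str = "barrage",
                       offset: int = 0) -> int:
    """Compute the playing time T4 after switching to the barrage's view.

    base="barrage": anchor T4 to the barrage appearance time T1.
    base="current": anchor T4 to the playing time T2 at which the user
                    operated the barrage.
    offset: preset time offset (seconds) subtracted so that playback
            starts slightly before the anchor (e.g. before a highlight).
    """
    anchor = t1 if base == "barrage" else t2
    return max(anchor - offset, 0)  # never before the start of the video

# T1 = 20:10 (1210 s), T2 = 20:15 (1215 s)
print(switched_play_time(1210, 1215))                    # T4 = T1 -> 1210
print(switched_play_time(1210, 1215, offset=5))          # T4 = T1 - 5 -> 1205
print(switched_play_time(1210, 1215, base="current"))    # T4 = T2 -> 1215
```

The `offset` parameter corresponds to the preset time offset values 3 and 4 mentioned above.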
For example, the second terminal is a mobile phone. As shown in fig. 10 (1), the mobile phone displays a playing interface 1001, where the playing interface 1001 includes the video frame of the first freeview video at the view B (e.g., 80 degrees) and the corresponding barrages (including a control 1002 corresponding to the first barrage). For example, the playing interface 1001 plays the video frame at the time 20:15 at the 80-degree viewing angle. When the mobile phone detects a user operation on the control 1002 (i.e., operation 2, e.g., clicking, double clicking, long pressing, or pressure pressing), the mobile phone determines the first barrage according to the control 1002, and determines the viewing angle of the first barrage (i.e., viewing angle A, e.g., 70 degrees) as the switched-to viewing angle.
In addition, the mobile phone also determines the playing time after the viewing angle is switched. In some examples, the mobile phone determines the playing time after the switch, i.e., the playing time T4, according to the appearance time of the first barrage (i.e., the playing time T1). For example, the playing time T4 is equal to the playing time T1. As shown in fig. 10 (1), at the playing time T2 (e.g., 20:15), the mobile phone detects that the user clicks the control 1002 corresponding to the first barrage on the playing interface 1001, switches from the viewing angle B to the viewing angle A, and displays the playing interface 1003 shown in fig. 10 (2). The appearance time of the first barrage (i.e., the playing time T1) is 20:10, so the playing time T4 is 20:10. As can be seen from the description of the playing time T1 in step S503, if the playing time T1 has already been adjusted to the starting time of the highlight (e.g., playing time T1 = sending time of the first barrage - preset time offset value 1), then playing time T4 = playing time T1 means that the mobile phone starts playing from the beginning of the highlight after switching to the viewing angle A. For another example, the playing time T4 is equal to the playing time T1 - preset time offset value 3. As can be seen from the description of the playing time T1 in step S503, if the playing time T1 has not been adjusted to the starting time of the highlight (e.g., playing time T1 = sending time of the first barrage), the user typically sends the barrage at a time after the highlight has started; in this case, by adjusting the playing time T4 (i.e., playing time T4 = playing time T1 - preset time offset value 3), the mobile phone can still start playing from the actual beginning of the highlight after the viewing angle is switched.
Alternatively, even when the playing time T1 has been adjusted to the starting time of the highlight (e.g., playing time T1 = sending time of the first barrage - preset time offset value 1), the mobile phone may still start playing from a time earlier than the highlight, further ensuring that the user does not miss it. For example, the playing time T1 is 20:10 and the playing time T4 is 20:08. For another example, the playing time T4 may also be after the playing time T1. For example, the playing time T1 is 20:10 and the playing time T4 is 20:15.
In other examples, the mobile phone determines the playing time after the switch, i.e., the playing time T4, according to the playing time at which the viewing angle is switched (i.e., the playing time T2). For example, the playing time T4 is equal to the playing time T2. As shown in fig. 10 (1), at the playing time T2 (e.g., 20:15), the mobile phone detects that the user clicks the control 1002 corresponding to the first barrage on the playing interface 1001, switches from the viewing angle B to the viewing angle A, and displays the playing interface 1004 shown in fig. 10 (3). Here the playing time T4 is the same as the playing time T2 (e.g., the playing time T4 is 20:15). For another example, the playing time T4 is equal to the playing time T2 - preset time offset value 4. For example, the playing time T2 is 20:15 and the playing time T4 is 20:08. Of course, in other examples, the playing time T4 may also be after the playing time T2. For example, the playing time T2 is 20:15 and the playing time T4 is 20:18.
After the mobile phone determines the switched view angle and the playing time after the view angle is switched, the mobile phone sends a view angle switching request to the media server to request the video file c of the first free view angle video at the playing time T4 under the view angle A.
The view switching request includes the identifier of the first freeview video and the switched-to viewing angle information (i.e., viewing angle A). Optionally, the view switching request further includes the playing time after the switch, i.e., the playing time T4.
In other embodiments, the mobile phone may not determine the playing time T4; instead, the media server determines it. For example, the request sent by the mobile phone to the media server further includes the playing time T1 or the playing time T2. After receiving the switching request, the media server determines the playing time T4 according to the playing time T1 or the playing time T2, and returns the video file c at the playing time T4 to the second terminal.
S510, the media server returns the video file c to the second terminal.
S511, the second terminal plays the video frame of the first freeview video at the playing time T4 under the view A according to the video file c.
Thereby, the second terminal switches from playing the video picture of the first freeview video under the view B to playing the video picture of the first freeview video under the view A. It may be noted that, in the view switching scheme provided in the embodiments of the present application, before the switch the second terminal requests the video file of the first freeview video under view B from the media server via the CDN, and after the switch it requests the video file of the first freeview video under view A from the media server via the CDN. That is, during the view switch, the second terminal does not request from the media server the video files of the first freeview video at any views other than view A and view B.
In the prior art, in a scheme where the user switches the viewing angle by sliding a finger on the display screen or operating a remote controller, the terminal requests from the media server the video files of all views between view B and view A of the first freeview video. For example, if the user instructs to switch from 60 degrees to 120 degrees, the terminal requests the video files of the first freeview video at all angles between 60 and 120 degrees (e.g., 70 degrees, 80 degrees, 90 degrees, 100 degrees, 110 degrees, etc.), and switches through the video pictures of the corresponding views angle by angle. In the embodiment of the present application, by contrast, the terminal requests from the media server only the video file of the first freeview video at 120 degrees, and displays the 120-degree video frame.
Compared with the prior art, in the view switching scheme provided by the embodiment of the application, the second terminal does not request from the media server the video files of the first freeview video at views other than view A and view B, which saves traffic. In addition, the terminal only needs to receive and process the video files of the first freeview video at views A and B, which reduces the video processing work of the terminal, improves its processing efficiency, accelerates its response speed, and reduces its power consumption.
In addition, with the prior-art view switching scheme, the second terminal needs to display and switch through the video pictures at all views between view B and view A during the switch from view B to view A. When frequent view switching is superimposed on a moving video picture, the user is prone to dizziness and the viewing experience is poor. In the view switching scheme provided by the embodiment of the application, no frequent view switching is performed, so dizziness is avoided and the viewing experience of the video is improved.
In other embodiments, some transitional pictures may be added during the switch from the video picture at view B to the video picture at view A of the first freeview video. For example, at the time of switching, transition image frames are generated from the last N image frames at view B displayed by the terminal (N is a natural number greater than or equal to 1) and the first M image frames at view A to be displayed by the terminal (M is a natural number greater than or equal to 1), and are inserted between the last image frame at view B and the first image frame at view A, so that the view switch of the first freeview video appears more natural and realistic. For another example, during the switch, image frames at some intermediate views between view B and view A are selected and inserted between the last image frame at view B and the first image frame at view A, likewise making the view switch more natural and realistic. The transition pictures used in the view switching process are not particularly limited here.
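The first transition strategy above can be sketched as a crossfade between the two views' boundary frames. This is a toy illustration under stated assumptions: frames are modeled as flat lists of pixel values, and the per-pixel linear blend is one possible way to generate transition frames, not the method mandated by the embodiment.

```python
def make_transition_frames(last_b_frames: list, first_a_frames: list,
                           steps: int = 4) -> list:
    """Generate `steps` transition frames by linearly blending the last
    frame at view B with the first frame at view A. The result is meant
    to be inserted between the view-B frame sequence and the view-A
    frame sequence at switch time."""
    b, a = last_b_frames[-1], first_a_frames[0]
    frames = []
    for i in range(1, steps + 1):
        w = i / (steps + 1)  # blend weight moves gradually from B toward A
        frames.append([(1 - w) * pb + w * pa for pb, pa in zip(b, a)])
    return frames

# Toy 1-D "frames": view B ends dark (0.0), view A begins bright (1.0).
transition = make_transition_frames([[0.0, 0.0]], [[1.0, 1.0]], steps=1)
print(transition)  # [[0.5, 0.5]] — the single midpoint frame
```

A real implementation would blend decoded image buffers (and might instead use view interpolation), but the insertion position of the generated frames is the same as described above.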
It may also be noted that both the first terminal (which sends the first barrage) and the second terminal (which switches the viewing angle according to the first barrage) interact with the barrage server separately. There is therefore no limitation on whether the first terminal and the second terminal are the same type of terminal, run the same operating system, run the same freeview video playing client/application, and so on. In other words, the technical scheme provided by the embodiment of the application can realize cross-terminal, cross-platform, and cross-application view switching, and has a wide application range.
In summary, in the embodiment of the present application, when a user sends the first barrage through the first terminal, the first terminal automatically establishes a correspondence between the first barrage and the viewing angle of the video the user is currently watching. Subsequently, when another user watches the same freeview video through the second terminal and sees the first barrage, that user can operate the first barrage to quickly switch to the viewing angle corresponding to it. Thus, when the shooting scene of the target object is large, although the user can only view a limited viewing-angle range on the client/application (leaving blind areas), the user can learn from the barrage content displayed on the video screen which viewing angles contain highlight clips. The highlight clips may be video clips that most users find exciting, video clips of interest to viewers, and so on. Further, by operating a specific barrage, the user can quickly switch to the viewing angle corresponding to that barrage to watch its highlight.
In other embodiments, the media server or the second terminal may automatically identify the content of the first freeview video, for example, dividing the first freeview video into different segments according to one or more of the scenario, scene, shooting technique (e.g., a long shot), persons, etc. of the video. For example, the playing time range of segment 1 is 15:00 to 25:00.
In response to the user's operation on the first barrage at the playing time T2 (20:15), the second terminal switches to the video picture at the view A corresponding to the first barrage, starting from the playing time T4, where the playing time T4 may be the starting time of segment 1 (15:00), the appearance time T1 of the first barrage (20:10), or the current time T2 (20:15).
In some embodiments, the playing time T4 may be the starting time (15:00) of segment 1, whose playing time range is 15:00-25:00. When the second terminal finishes playing the segment at view A, i.e., at the segment's ending time 25:00, it may automatically switch back to the original viewing angle (e.g., view B) and continue playing from 25:00. Of course, in other examples, the second terminal may instead ask the user whether to switch back to the original viewing angle to continue playing after 25:00; upon receiving the user's consent, it switches back to the original viewing angle and continues playing from 25:00. In still other embodiments, instead of switching back to the original view B, playback of the portion after 25:00 may continue at view A.
Of course, in still other embodiments, in the scene shown in fig. 10 (2) or fig. 10 (3), after the second terminal plays the video clip corresponding to the first barrage (the first barrage appears at 20:10 and lasts 15 seconds, so the clip spans 20:10-20:25), the second terminal may, at 20:25, continue playing the portion after 20:25 at the 70-degree viewing angle instead of switching back to the original 80-degree viewing angle.
Of course, in still other embodiments, in the scenario shown in fig. 10 (2) or fig. 10 (3), after the display of the first barrage ends, the terminal may, at 20:25, automatically switch back to the original viewing angle and continue playing the portion after 20:25.
In the above embodiments, the user selects the viewing angle of interest according to the content of the barrages. In other embodiments, the barrage server may also recommend the viewing angles where highlights are located according to the collected barrage data, for the user to choose from. For example, during playback of the freeview video, the number of barrages corresponding to a video segment reflects the popularity and excitement of that segment. When the number of barrages corresponding to a video segment is greater than a preset threshold, the video segment can be considered a highlight. The scheme provided in this embodiment is described in detail below.
In this embodiment, each user across the whole network can choose to view the video picture of the first freeview video at an arbitrary viewing angle, and can also switch the viewing angle of the first freeview video arbitrarily. During playback of the first freeview video, when a user considers a certain video clip very exciting, the user can input related barrage content, and the terminal sends the barrage content input by the user, together with other barrage data, to the barrage server. It should be noted that the barrage data sent by the terminal to the barrage server includes information such as the content of the barrage, the identifier of the first freeview video, the viewing angle of the barrage, and the appearance time of the barrage. Of course, the barrage data may also include other content; refer to Table 1 above. In addition, for the process of the terminal sending barrage data to the server, refer to the related content in steps S500-S505, which is not repeated here.
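The barrage data enumerated above could be represented roughly as follows. The field names and types are assumptions for illustration; the embodiment only lists which pieces of information the record carries.

```python
from dataclasses import dataclass

@dataclass
class BarrageRecord:
    """One barrage as uploaded by a terminal to the barrage server."""
    content: str      # text the user typed
    video_id: str     # identifier of the freeview video
    view: str         # viewing angle at which the barrage was sent
    appear_time: int  # appearance (play) time, seconds on the video timeline

# A barrage sent at view A while watching the 20:10 (1210 s) frame.
rec = BarrageRecord("what a goal!", "first_freeview_video", "A", 1210)
print(rec.view, rec.appear_time)
```

Records in this shape are what the server later aggregates per time period and per viewing angle when computing recommended viewing angles.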
Hereinafter, the procedure in which the barrage server calculates a recommended viewing angle list from the collected barrage data, and other users select a viewing angle to switch to from that list, is described in detail.
As shown in fig. 11, a flow chart of a view angle switching method of a free view angle video according to an embodiment of the present application is as follows:
S1101, the barrage server determines the recommended viewing angle corresponding to each playing time according to the barrage data of the first freeview video.
In some embodiments, the barrage server periodically (e.g., every 1 minute) calculates the recommended viewing angle corresponding to each playing time in the first freeview video. In one specific implementation, the playing time of the first freeview video is divided into a plurality of time periods (e.g., each 10-15 seconds long). Then, taking one time period as a unit, statistics are made per viewing angle to calculate the recommended viewing angle(s) for each time period. For example, according to the appearance time of each barrage in the barrage data, the time period to which the barrage corresponds is determined. Then, for each time period, the number of barrages corresponding to each viewing angle is counted according to the viewing angle of each barrage in the barrage data. A viewing angle (or several viewing angles) whose barrage count in the time period is greater than or equal to a preset threshold is determined as a recommended viewing angle. In one example, when multiple viewing angles in a certain time period have barrage counts greater than or equal to the preset threshold, the first R viewing angles in descending order of barrage count may be taken as the recommended viewing angles, where R is an integer greater than 0. For example, R = 1, or R = 5, or the user may manually configure the value of R. It should be noted that this calculation is merely one specific implementation and does not limit the method of calculating the recommended viewing angles for each time period.
For example, Table 2 shows examples of some of the barrage data stored by the barrage server. From the appearance time of each barrage, it can be determined that barrages 1 to 4 correspond to time period 1 (10:00-10:15). Further, for time period 1, the number of barrages corresponding to viewing angle 1 is 3, and the number corresponding to viewing angle 2 is 1. Since even the number of barrages corresponding to viewing angle 1 is smaller than the preset threshold, no recommended viewing angle exists in time period 1.
Similarly, barrages 5 to 8 correspond to time period 2 (10:16-10:30). Further, for time period 2, the number of barrages corresponding to viewing angle 1 is 2, the number corresponding to viewing angle 2 is 1, and the number corresponding to viewing angle 3 is 1. The numbers of barrages corresponding to viewing angles 1, 2, and 3 are all smaller than the preset threshold, so no recommended viewing angle exists in time period 2.
Similarly, barrages 100 to 201 correspond to time period 3 (20:16-20:30). Further, for time period 3, the number of barrages corresponding to viewing angle 3 is 200, and the number corresponding to viewing angle 1 is 1. Since the number of barrages corresponding to viewing angle 3 is greater than or equal to the preset threshold (e.g., 150), the recommended viewing angle corresponding to time period 3 is determined to be viewing angle 3.
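The counting procedure worked through above can be sketched end to end. The tuple-based record layout, 15-second buckets, and threshold of 150 are illustrative assumptions; the embodiment leaves period length, threshold, and R configurable.

```python
from collections import Counter, defaultdict

def recommended_views(barrages, period_len: int = 15,
                      threshold: int = 150, top_r: int = 1) -> dict:
    """barrages: iterable of (appear_time_seconds, view) pairs.

    Returns {period_index: [views]} keeping, per time period, up to
    top_r viewing angles whose barrage count reaches the threshold,
    ordered by descending barrage count."""
    per_period = defaultdict(Counter)
    for t, view in barrages:                 # bucket each barrage by its
        per_period[t // period_len][view] += 1  # appearance-time period
    result = {}
    for period, counts in per_period.items():
        hot = [v for v, n in counts.most_common() if n >= threshold]
        if hot:
            result[period] = hot[:top_r]
    return result

# Mirroring Table 2, period 3: 200 barrages at viewing angle 3 (>= 150)
# and 1 barrage at viewing angle 1, all appearing around 20:20 (1220 s).
data = [(1220, "view3")] * 200 + [(1220, "view1")]
print(recommended_views(data))  # only view3 is recommended for its period
```

With only a handful of barrages per period, as in time periods 1 and 2 of Table 2, no viewing angle reaches the threshold and the result is empty for those periods.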
Table 2
(The table is rendered as an image in the original publication; it lists example barrage records — barrage number, appearance time, and corresponding viewing angle — for the barrages discussed above.)
In other embodiments, the barrage server may also determine the recommended viewing angle by combining the specific content of the barrages with their number. For example, if the content of some barrages indicates that the video picture at a viewing angle is a highlight (e.g., barrage content such as "so exciting" or "recommended"), and the number of barrages at that viewing angle satisfies a certain condition (e.g., is greater than a preset threshold), the viewing angle is determined to be a recommended viewing angle. For another example, if the content of some barrages indicates that the video picture at a viewing angle is not a highlight (e.g., barrage content such as "so boring" or "skip it"), the viewing angle may be excluded from the recommended viewing angles. In summary, the method for determining the recommended viewing angle is not particularly limited in the embodiments of the present application.
S1102, the barrage server stores recommended viewing angles corresponding to each playing time.
S1103, the second terminal requests the video file d of the first freeview video at the playing time T5 under the view C from the media server.
The playing time T5 may be any playing time of the first freeview video.
S1104, the media server returns the video file d to the second terminal.
S1105, the second terminal requests, from the barrage server, the recommended viewing angle of the first freeview video at the playing time T5.
S1106, the barrage server returns the recommended viewing angle of the first freeview video at the playing time T5 to the second terminal. The recommended viewing angles of the first freeview video at the playing time T5 include a view D.
It should be noted that, step S1105 may be performed before, after, or simultaneously with step S1103, and the embodiment of the present application does not limit the execution sequence of step S1105 and step S1103.
It should be noted that, in some embodiments, when the second terminal turns on the barrage function, the second terminal requests from the barrage server not only the barrage data corresponding to each playing time according to the playing progress of the first freeview video, but also the recommended viewing angle corresponding to each playing time. In other embodiments, the second terminal does not turn on the barrage function but turns on the viewing-angle recommendation function, and then requests from the barrage server the recommended viewing angle corresponding to each playing time according to the playing progress of the first freeview video. In still other embodiments, when the second terminal starts to play the first freeview video, it requests from the barrage server the recommended viewing angles corresponding to all playing times. Subsequently, when the barrage server calculates that a recommended viewing angle has changed, it can notify the second terminal to acquire the latest recommended viewing angle; when the recommended viewing angles calculated by the barrage server are unchanged or change little, the barrage server may not notify the second terminal. In this way, the second terminal can subsequently display the recommended viewing angle for each playing time according to the recommended viewing angles for all playing times requested at the start of playback, any updated recommended viewing angles obtained later, and the playing progress of the first freeview video.
S1107, the second terminal plays the video picture of the first freeview video at the playing time T5 under the view C according to the video file d, and also displays the recommended viewing angle at the playing time T5.
The second terminal is exemplified as a mobile phone. As shown in fig. 12 (1), the mobile phone displays a playing interface 1201 of the first freeview video, where the playing interface 1201 includes a recommended viewing angle list 1202 containing one or more recommended viewing angles. In some examples, when the barrage server's statistics yield multiple recommended viewing angles for the playing time, the recommended viewing angle list 1202 may include all of them. Optionally, the recommended viewing angles may be ranked in descending order of the number of barrages corresponding to each, so that viewing angles with more barrages are recommended to the user preferentially, or ranked in front. In other examples, the recommended viewing angle list 1202 also includes a recommendation index characterizing the priority order of the recommended viewing angles; the priority order may be related to the number of barrages corresponding to each recommended viewing angle. In still other examples, the recommended viewing angle list 1202 also includes the number of barrages corresponding to each recommended viewing angle, and the like.
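The ordering and recommendation index described above can be sketched as follows. The entry layout and field names are assumptions; the embodiment only says the list is ordered by barrage count and may carry an index and count per entry.

```python
def build_recommendation_list(view_counts: dict) -> list:
    """view_counts: {viewing_angle: barrage_count}.

    Returns list entries ordered by descending barrage count, each
    carrying a recommendation index (1 = highest priority) and the
    barrage count, mirroring the recommended viewing angle list 1202."""
    ranked = sorted(view_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [{"view": v, "index": i + 1, "barrages": n}
            for i, (v, n) in enumerate(ranked)]

# Two recommended viewing angles for the current playing time.
lst = build_recommendation_list({"60deg": 180, "90deg": 155})
print(lst)  # 60deg ranks first because it has more barrages
```

The view with the most barrages receives index 1 and is shown to the user first.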
S1108, the second terminal detects the operation of selecting the view angle D by the user, and requests the video file e of the playing time T6 of the first freeview video under the view angle D from the media server.
In some embodiments, the playing time T6 is related to the playing time T5. For example, playing time T6 = playing time T5, or playing time T6 = playing time T5 minus preset time offset value 5 (for example, the preset time offset value 5 is 5 seconds).
The second terminal is still exemplified as a mobile phone. As shown in (1) of fig. 12, the mobile phone displays a video frame at the viewing angle C (e.g., 70 degrees) at the playing time T5 (e.g., 20:16). When the mobile phone detects an operation of selecting the view D (e.g., 60 degrees) in the list 1202 of recommended viewing angles on the playing interface 1201 shown in (1) of fig. 12, the mobile phone requests, from the media server, the video file e of the first free view video at the view D at the playing time T6 (e.g., 20:16 or 20:11).
In other embodiments, the playing time T6 is related to the clip corresponding to the playing time T5. Specifically, the media server or the second terminal may automatically identify the content of the first free view video, for example, dividing the first free view video into different clips according to one or more of the plot, scene, shooting technique (such as a long shot), persons, and the like of the video. The playing time T6 is then the starting playing time of the clip corresponding to the playing time T5. For example, the playing time T5 is 20:16, and the playing time range of clip 2 is 20:00-25:00; the playing time T5 thus corresponds to clip 2, and the playing time T6 may be 20:00.
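The two ways of deriving the target playing time T6 from T5 (the offset rule of the previous embodiment and the clip-start rule of this one) can be sketched as follows. The 5-second offset and the clip table mirror the examples in the text; the function names and the times-in-seconds representation are assumptions.

```python
# Hypothetical sketch of resolving the playing time T6 used after a
# viewing angle switch. Times are in seconds (20:16 -> 1216).

PRESET_OFFSET = 5  # seconds; the "preset time offset value 5" example

def t6_from_offset(t5, use_offset=True):
    # Either T6 == T5, or T6 == T5 - offset (replay the last few seconds
    # under the newly selected viewing angle).
    return t5 - PRESET_OFFSET if use_offset else t5

def t6_from_segment(t5, segments):
    """segments: list of (start, end) clip boundaries in seconds.
    T6 is the start of the clip containing T5, so the whole clip
    replays under the new viewing angle."""
    for start, end in segments:
        if start <= t5 < end:
            return start
    return t5  # no clip matches: fall back to T5 unchanged

# Clip 2 in the text spans 20:00-25:00; T5 = 20:16 falls inside it.
segments = [(900, 1200), (1200, 1500)]  # 15:00-20:00 and 20:00-25:00
```

With T5 = 1216 (20:16), the offset rule yields 1211 (20:11) and the clip rule yields 1200 (20:00), matching the two example outcomes in the text.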
S1109, the media server returns the video file e to the second terminal.
S1110, the second terminal plays the video picture of the playing time T6 of the first free view video under the view D according to the video file e.
Illustratively, as shown in (2) of fig. 12, the mobile phone displays a video frame at a viewing angle D.
Thereby, the second terminal switches from playing the video picture of the first free view video under the view angle C to playing the video picture of the first free view video under the view angle D. It may be noted that, in the view switching scheme provided in the embodiments of the present application, before the view is switched, the second terminal requests, via the CDN, the video file of the first free view video under the view C from the media server; after the view is switched, the second terminal requests, via the CDN, the video file of the first free view video under the view D from the media server. That is, during view switching, the second terminal does not request from the media server the video file of the first free view video under any view other than the view C and the view D.
In addition, when the second terminal does not open the barrage function or does not satisfy the condition for displaying the barrage, the second terminal can still display the list of recommended viewing angles, and the user can select a recommended viewing angle from the list to switch the viewing angle.
For other contents of this embodiment, please refer to the foregoing description, and no further description is given here.
Fig. 13 is a flowchart of a viewing angle switching method of a free viewing angle video according to an embodiment of the present application. The method is as follows:
S1310, the terminal displays a first interface, wherein the first interface comprises a video picture of a first free view angle video at a first view angle and a first barrage, and the first barrage corresponds to a second view angle of the first free view angle video.
Optionally, the first bullet screen in the first interface further includes a second viewing angle.
In some examples, the terminal displays the first interface at the first playback time. In other words, the first interface includes a video frame of the first free view video at the first playing time under the first view.
In other examples, before the terminal displays the first interface at the first playing time, the terminal requests, from the barrage server, barrage data corresponding to the first free view video at the first playing time, where the barrage data corresponding to the first playing time includes data of a first barrage, and the data of the first barrage includes the second view.
S1320, when the first operation of the user on the first barrage is detected, the terminal displays a second interface, wherein the second interface comprises a video picture of the first free view video at the second view.
In some examples, when detecting the first operation of the user on the first barrage, the terminal displays the second interface at a second playing time, where the second playing time is related to the first playing time. For example, the second playing time is equal to the first playing time, or the second playing time is equal to the first playing time minus a first preset time offset value. For another example, the first playing time is related to the appearance time of the first barrage, so the second playing time is also related to the appearance time of the first barrage: the second playing time is equal to the appearance time of the first barrage, or the second playing time is equal to the appearance time of the first barrage minus a second preset time offset value.
In other examples, when detecting the first operation of the user on the first barrage, the terminal sends a switching request to the media server. The media server returns a first video file to the terminal, where the first video file is the video file corresponding to the second playing time of the first free view video under the second view. The terminal then displays, according to the first video file, the video picture of the first free view video at the second playing time under the second view.
Optionally, the second interface comprises a first bullet screen.
Optionally, the terminal displays a third interface at a third playing time, where the third playing time is after the display time of the first barrage, the third interface includes a video picture of the first free view video under the second view, and the third interface does not include the first barrage.
In still other examples, the terminal displays a fourth interface at a fourth playing time, the fourth interface including a video picture of the first free view video at the first view and a viewing angle recommendation, where the viewing angle recommendation includes a fourth view. When detecting a third operation of the user selecting the fourth view, the terminal displays a video picture of the first free view video at a fifth playing time under the fourth view, where the fifth playing time is related to the fourth playing time. For example, the fifth playing time is equal to the fourth playing time, or the fifth playing time is equal to the fourth playing time minus a fourth preset time offset value.
Optionally, before the terminal displays the fourth interface at the fourth playing time, the terminal requests, from the media server, the second video file corresponding to the fourth playing time of the first free view video at the first view, and requests, from the barrage server, the viewing angle recommendation corresponding to the fourth playing time of the first free view video, where the viewing angle recommendation corresponding to the fourth playing time is determined by the barrage server according to the number of barrages corresponding to the fourth playing time of the first free view video.
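How the barrage server might determine a viewing angle recommendation from per-moment barrage counts, as described above, can be sketched as follows; the data layout, the one-second bucketing, and the top-N cutoff are illustrative assumptions rather than details from the patent.

```python
# Hypothetical sketch: aggregate stored barrages into per-moment,
# per-viewing-angle counts, then recommend the most-discussed angles.
from collections import Counter, defaultdict

def views_per_moment(barrages, bucket=1):
    """barrages: iterable of (appearance_time_s, view_angle) pairs.
    Returns {bucket_start: Counter(angle -> barrage count)}."""
    stats = defaultdict(Counter)
    for t, angle in barrages:
        stats[(t // bucket) * bucket][angle] += 1
    return stats

def recommend(stats, moment, top_n=3):
    # The recommendation for a playing moment is the angle(s) with the
    # most barrages at that moment, best first.
    return [angle for angle, _ in stats.get(moment, Counter()).most_common(top_n)]

stats = views_per_moment([(10, 60), (10, 60), (10, 70), (11, 60)])
```

A terminal requesting the recommendation for a playing moment would then receive the ordered angle list, which it can render as the list 1202 of recommended viewing angles.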
Embodiments of the present application also provide a chip system, as shown in fig. 14, which includes at least one processor 1301 and at least one interface circuit 1302. The processor 1301 and the interface circuit 1302 may be interconnected by wires. For example, interface circuit 1302 may be used to receive signals from other devices, such as a memory of terminal 100. For another example, interface circuit 1302 may be used to send signals to other devices (e.g., processor 1301). Illustratively, the interface circuit 1302 may read instructions stored in the memory and send the instructions to the processor 1301. The instructions, when executed by processor 1301, may cause the terminal to perform the various steps performed by terminal 100 (e.g., a cell phone) in the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
The embodiment of the application also provides a device which is contained in the terminal and has the function of realizing the terminal behavior in any one of the methods of the above embodiments. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes at least one module or unit corresponding to the above function, for example, a detection module or unit, a display module or unit, a determination module or unit, a calculation module or unit, and the like.
Embodiments of the present application also provide a computer storage medium comprising computer instructions which, when run on a terminal, cause the terminal to perform a method as in any of the embodiments above.
Embodiments of the present application also provide a computer program product for causing a computer to perform any of the methods of the embodiments described above when the computer program product is run on the computer.
Embodiments of the present application also provide a graphical user interface on a terminal having a display screen, a camera, a memory, and one or more processors for executing one or more computer programs stored in the memory, the graphical user interface comprising a graphical user interface displayed by the terminal when performing any of the methods of the embodiments described above.
It will be appreciated that the above-described terminal, etc. may comprise hardware structures and/or software modules that perform the respective functions in order to achieve the above-described functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present invention.
The embodiment of the present application may divide the functional modules of the terminal and the like according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present invention, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic disk, optical disk, and the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A method for switching the viewing angle of a free viewing angle video, the method comprising:
the terminal displays a first interface, wherein the first interface comprises a video picture of a first free view angle video at a first view angle and a first barrage, and the first barrage corresponds to a second view angle of the first free view angle video;
when the first operation of the user on the first barrage is detected, the terminal displays a second interface, wherein the second interface comprises a video picture of the first free view video at the second view.
2. The method of claim 1, wherein the terminal displays a first interface comprising:
the terminal displays the first interface at a first playing moment;
when the first operation of the user on the first barrage is detected, the terminal displays a second interface, which comprises:
when the first operation of the user on the first barrage is detected, the terminal displays the second interface at a second playing time, wherein the second playing time is related to the first playing time.
3. The method of claim 2, wherein the second playback time is associated with the first playback time, comprising:
the second playing time is equal to the first playing time, or the second playing time is equal to the first playing time minus a first preset time offset value; or, the second playing time is equal to the appearance time of the first barrage, or the second playing time is equal to the appearance time of the first barrage minus a second preset time offset value.
4. A method according to any one of claims 1-3, wherein the first barrage further comprises the second viewing angle.
5. The method according to any one of claims 2-4, wherein the terminal displaying the second interface at the second playing time when the first operation of the user on the first bullet screen is detected, includes:
when detecting a first operation of a user on the first barrage, the terminal sends a switching request to a media server; the terminal receives a first video file returned by the media server, wherein the first video file is a video file corresponding to the second playing time of the video of the first free view angle under the second view angle;
and the terminal displays the video picture of the first free view video at the second playing moment under the second view according to the first video file.
6. The method according to any one of claims 2-5, wherein before the terminal displays the first interface at the first playing time, the method further comprises:
the terminal requests bullet screen data corresponding to the video of the first free view angle at the first playing time to a bullet screen server, wherein the bullet screen data corresponding to the first playing time comprises the data of the first bullet screen, and the data of the first bullet screen comprises the second view angle.
7. The method according to any one of claims 2-6, wherein after the terminal displays the second interface at the second playing time, the method further comprises:
the terminal receives a second operation of sending a second barrage, the terminal sends data of the second barrage to the barrage server, the data of the second barrage comprise content of the second barrage and a third visual angle, and the third visual angle is a visual angle corresponding to the first free visual angle video when the second barrage is sent.
8. The method of claim 7, wherein the data of the second bullet screen further comprises an appearance time of the second bullet screen, wherein the appearance time of the second bullet screen is equal to the second play time or the appearance time of the second bullet screen is equal to the second play time minus a third preset time offset value.
9. The method according to any one of claims 1 to 8, wherein,
the second interface also includes the first barrage.
10. The method according to any one of claims 2-9, wherein after the terminal displays the second interface at the second playing time, the method further comprises:
the terminal displays a third interface at a third playing time, wherein the third playing time is after the displaying time of the first barrage, the third interface comprises video pictures of the first free view video at the second view, and the third interface does not comprise the first barrage.
11. The method according to any one of claims 1-10, further comprising:
the terminal displays a fourth interface at a fourth playing time, wherein the fourth interface comprises a video picture of the first free view video at a first view angle and view angle recommendation, and the view angle recommendation comprises a fourth view angle;
when detecting a third operation of selecting the fourth view angle by the user, the terminal displays a video picture of the first free view angle video at a fifth playing time under the fourth view angle, wherein the fifth playing time is related to the fourth playing time.
12. The method of claim 11, wherein the fifth playback time is associated with the fourth playback time, comprising:
the fifth playing time is equal to the fourth playing time, or the fifth playing time is equal to the fourth playing time minus a fourth preset time offset value.
13. The method according to claim 11 or 12, wherein before the terminal displays the fourth interface at the fourth playing time, the method further comprises:
the terminal requests a second video file corresponding to the fourth playing time of the first free view video under the first view to a media server, and requests view angle recommendation corresponding to the fourth playing time of the first free view video to a barrage server, wherein the view angle recommendation corresponding to the fourth playing time is determined by the barrage server according to the barrage number corresponding to the fourth playing time of the first free view video.
14. A terminal, comprising: a processor, a memory, and a touch screen, the memory, the touch screen being coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the terminal to perform the view switching method of the freeview video of any of claims 1-13.
15. A computer readable storage medium comprising computer instructions which, when run on a terminal, cause the terminal to perform the method of view switching of freeview video according to any of claims 1-13.
CN202111508547.9A 2021-12-10 2021-12-10 Viewing angle switching method, device and system for free viewing angle video Pending CN116264640A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111508547.9A CN116264640A (en) 2021-12-10 2021-12-10 Viewing angle switching method, device and system for free viewing angle video
PCT/CN2022/135954 WO2023103875A1 (en) 2021-12-10 2022-12-01 Viewpoint switching method, apparatus and system for free viewpoint video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111508547.9A CN116264640A (en) 2021-12-10 2021-12-10 Viewing angle switching method, device and system for free viewing angle video

Publications (1)

Publication Number Publication Date
CN116264640A true CN116264640A (en) 2023-06-16

Family

ID=86721704

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111508547.9A Pending CN116264640A (en) 2021-12-10 2021-12-10 Viewing angle switching method, device and system for free viewing angle video

Country Status (2)

Country Link
CN (1) CN116264640A (en)
WO (1) WO2023103875A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106358036B (en) * 2016-08-31 2018-05-08 杭州当虹科技有限公司 A kind of method that virtual reality video is watched with default visual angle
CN108810600B (en) * 2017-04-28 2020-12-22 华为技术有限公司 Video scene switching method, client and server
CN111641871A (en) * 2020-05-29 2020-09-08 广州华多网络科技有限公司 Live video display method and device, terminal and readable storage medium
CN113014943A (en) * 2021-03-03 2021-06-22 上海七牛信息技术有限公司 Video playing method, video player and video live broadcasting system

Also Published As

Publication number Publication date
WO2023103875A1 (en) 2023-06-15

Similar Documents

Publication Publication Date Title
EP3606082B1 (en) Panoramic video playback method and client terminal
US10681342B2 (en) Behavioral directional encoding of three-dimensional video
US11671712B2 (en) Apparatus and methods for image encoding using spatially weighted encoding quality parameters
CN110213616B (en) Video providing method, video obtaining method, video providing device, video obtaining device and video providing equipment
CN104012106B (en) It is directed at the video of expression different points of view
KR20220130197A (en) Filming method, apparatus, electronic equipment and storage medium
KR102511407B1 (en) Video distribution device, video distribution system, video distribution method and video distribution program
KR20170023885A (en) Compositing and transmitting contextual information during an audio or video call
US20160381341A1 (en) View interpolation for visual storytelling
CN113141514B (en) Media stream transmission method, system, device, equipment and storage medium
CN113141523B (en) Resource transmission method, device, terminal and storage medium
WO2023169297A1 (en) Animation special effect generation method and apparatus, device, and medium
CN113747240B (en) Video processing method, apparatus and storage medium
US20200372933A1 (en) Image acquisition system and method
US11622099B2 (en) Information-processing apparatus, method of processing information, and program
JP2020524450A (en) Transmission system for multi-channel video, control method thereof, multi-channel video reproduction method and device thereof
CN113141541B (en) Code rate switching method, device, equipment and storage medium
CN116264640A (en) Viewing angle switching method, device and system for free viewing angle video
CN111698262B (en) Bandwidth determination method, device, terminal and storage medium
CN109429055B (en) Image display method and device, video file processing method and device
CN116260986A (en) Bullet screen display method, device and system of free view video
US20240129636A1 (en) Apparatus and methods for image encoding using spatially weighted encoding quality parameters
KR20180013243A (en) Method and Apparatus for Providing and Storing Streaming Contents
CN114745597A (en) Video processing method and apparatus, electronic device, and computer-readable storage medium
JP2015049641A (en) Information processing device, program, and communication system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination