WO2022062903A1 - Bullet screen playing method, related device, and storage medium - Google Patents

Bullet screen playing method, related device, and storage medium

Info

Publication number
WO2022062903A1
WO2022062903A1 (PCT/CN2021/117205; CN2021117205W)
Authority
WO
WIPO (PCT)
Prior art keywords
bullet screen
time
real
layer
barrage
Prior art date
Application number
PCT/CN2021/117205
Other languages
English (en)
French (fr)
Inventor
黄炳洁
李帅
李龙华
郭实
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21871275.0A (EP4210337A4)
Publication of WO2022062903A1
Priority to US18/187,440 (US20230232077A1)

Classifications

    • H04N21/4788 — Supplemental services, e.g. displaying phone caller identification, shopping application; communicating with other users, e.g. chatting
    • H04N21/4126 — The peripheral being portable, e.g. PDAs or mobile phones
    • G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • H04N21/42204 — User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
    • H04N21/4312 — Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/47217 — End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N21/4858 — End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
    • H04N21/4884 — Data services, e.g. news ticker, for displaying subtitles
    • H04N21/8146 — Monomedia components involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/816 — Monomedia components involving special video data, e.g. 3D video

Definitions

  • the present application relates to the technical field of virtual reality, and in particular, to a barrage playing method, related equipment and storage medium.
  • the embodiments of the present application disclose a method for playing a bullet screen, a related device and a storage medium, which can solve the problem of poor viewing effect of video content due to the existing presentation form of the bullet screen.
  • a first aspect of the present application discloses a method for playing a bullet screen.
  • the method includes: receiving a bullet screen opening instruction for a target video, where the bullet screen opening instruction is used to instruct the bullet screen to be played in a virtual reality (VR) bullet screen mode;
  • in response to the bullet screen opening instruction, a ring-shaped transparent bullet screen layer is drawn at a preset orientation of the video layer of the target video, wherein the first height of the video layer is smaller than the second height of the bullet screen layer; the bullet screen information of the real-time bullet screen of the target video is obtained; according to the bullet screen information, the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer are calculated; and according to the three-dimensional coordinates, the real-time bullet screen is refreshed and played on the bullet screen layer.
  • a ring-shaped transparent bullet screen layer is added at the preset orientation of the video layer, and the height of the bullet screen layer is greater than that of the video layer, so that vertical expansion and diversion of the bullet screen in space is realized. The bullet screen layer is freed from the limitation of the screen of the electronic device, and the proportion of the bullet screen can be greater than or equal to 100%, so that the bullet screen can be diverted vertically, reducing the peak pressure of the bullet screen and improving the viewing effect of the video content.
  • the method further includes: dividing the bullet screen layer into a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area, wherein each real-time bullet screen traverses the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area at a constant speed; and initializing the bullet screen set in the new bullet screen area. If the bullet screen set in the new bullet screen area has been initialized, the step of acquiring the bullet screen information of the real-time bullet screen of the target video is executed.
  • a bullet screen enters from the main screen entrance of the main screen bullet screen area and is displayed in the main screen bullet screen area.
  • the initializing of the bullet screen set in the new bullet screen area includes: calculating an initialization time domain; acquiring the bullet screen set of the new bullet screen area within the initialization time domain; calculating the initialization coordinates of each bullet screen in the bullet screen set; and displaying each bullet screen in the bullet screen set in the new bullet screen area according to the initialization coordinates.
  • each bullet screen displayed in the entire new bullet screen area is static and does not move, and each bullet screen is displayed in a fixed position in the new bullet screen area according to the initialization coordinates.
  • the initialization time domain is [0, Δt₁], and the initialization coordinates are computed from it, where v is the speed of the bullet screen, R is the radius of the bullet screen layer, α is the angle between the entrance of the main screen bullet screen area and the horizontal direction, β is the angle between the position where the bullet screen is generated and the horizontal direction, and t is a given moment.
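The patent's own expressions for Δt₁ and the initialization coordinates are not reproduced in this text, but the variables above suggest their shape. The sketch below is one plausible reconstruction, assuming each bullet screen moves along the ring arc at constant linear speed v, so the travel time from the generation angle β to the main screen entrance angle α is R(β − α)/v, and a bullet initialized at offset t sits at angle β − vt/R; the function names and this geometry are assumptions, not the patent's formulas.

```python
import math

def init_time_domain(v, R, alpha, beta):
    """Time domain [0, dt1] a bullet spends crossing the new area.

    Assumes the arc from the generation angle beta to the main screen
    entrance angle alpha is traversed at constant linear speed v
    (reconstruction; the patent's own formula is not in this text).
    """
    dt1 = R * (beta - alpha) / v
    return (0.0, dt1)

def init_coordinates(t, v, R, beta):
    """Static horizontal position of a bullet initialized at offset t."""
    theta = beta - v * t / R  # angle swept after time t on the ring
    return (R * math.cos(theta), R * math.sin(theta))
```

With R = 5, v = 0.5, α = π/6, β = π/2, the newborn arc spans π/3 radians, so Δt₁ = 5·(π/3)/0.5 and a bullet at offset t = 0 sits at angle β on the ring.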
  • the bullet screen information includes the bullet screen movement speed, the real-time playback time, and the bullet screen height; calculating the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer according to the bullet screen information includes: at the real-time playback time, calculating the horizontal coordinates of the real-time bullet screen on the horizontal plane of the bullet screen layer according to the bullet screen movement speed; and at the real-time playback time, calculating the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the bullet screen height.
  • calculating, at the real-time playback time, the horizontal coordinates of the real-time bullet screen on the horizontal plane of the bullet screen layer according to the bullet screen movement speed includes: according to the bullet screen movement speed, calculating the first time domain corresponding to the new bullet screen area, the second time domain corresponding to the main screen bullet screen area, and the third time domain corresponding to the historical bullet screen area; if the real-time playback time is in the first time domain, calculating the first horizontal coordinate of the real-time bullet screen in the new bullet screen area according to the first horizontal formula; or if the real-time playback time is in the second time domain, calculating the second horizontal coordinate of the real-time bullet screen in the main screen bullet screen area according to the second horizontal formula; or if the real-time playback time is in the third time domain, calculating the third horizontal coordinate of the real-time bullet screen in the historical bullet screen area according to the third horizontal formula.
  • the bullet screen layer also expands a new bullet screen area in the horizontal direction.
  • when the user turns his head to the right, he can preview newly generated bullet screens in the new bullet screen area, giving the user a sense of anticipation; when the user turns his head to the left, he can review historical bullet screens in the historical bullet screen area, giving the user a sense of nostalgia. This enriches the rendering form of the bullet screen and improves the user experience.
  • the first time domain is [0, Δt₁]; the first horizontal formula, the second time domain, the second horizontal formula, the third time domain, and the third horizontal formula are defined correspondingly, where t is the real-time playback time, v is the movement speed of the bullet screen, R is the radius of the bullet screen layer, α is the angle between the entrance of the main screen bullet screen area and the horizontal direction, β is the angle between the position where the real-time bullet screen is generated and the horizontal direction, W is the length of the main screen of the main screen bullet screen area, and (x, y) is the first, second, or third horizontal coordinate.
  • calculating the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer includes: counting the number of real-time bullet screens arriving at the real-time playback time; calculating a first spacing according to that number, the bullet screen height of each real-time bullet screen, and the first height of the video layer; calculating a second spacing according to that number, the bullet screen height of each real-time bullet screen, and the second height of the bullet screen layer; if the preset spacing is less than or equal to the first spacing, calculating the first ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the first vertical formula, wherein the preset spacing is the preset minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video; or if the preset spacing is greater than or equal to the first spacing and less than or equal to the second spacing, calculating the second ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the second vertical formula; or if the preset spacing is greater than the second spacing, calculating the third ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the third vertical formula.
  • a new bullet screen area is vertically expanded; the ordinate is calculated according to the bullet screens that arrive at the same time, and the bullet screens are diverted vertically, realizing the vertical dispersion of peak bullet screens so that users can enjoy the content of the video clip itself while experiencing the peak barrage.
  • the first spacing, the second spacing, and the first, second, and third vertical formulas are defined in terms of the following quantities:
  • N is the number of real-time bullet screens that arrive at the real-time playback time
  • h_i is the bullet screen height of the i-th real-time bullet screen
  • h_j is the bullet screen height of the j-th real-time bullet screen
  • H_0 is the first height
  • H_max is the second height
  • d is the spacing between two adjacent real-time bullet screens
  • d_min is the preset spacing
  • z_i is the first ordinate, the second ordinate, or the third ordinate.
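The spacing formulas themselves are not reproduced here, but the described decision can be sketched. Assuming the N same-time bullets of heights h_i are stacked with equal gaps, the largest gap that fits them in the video layer of height H_0 is (H_0 − Σh_i)/(N − 1), and likewise for the taller layer of height H_max; this even-stacking assumption and the function names are illustrative, not the patent's formulas.

```python
def spacings(heights, H0, Hmax):
    """First and second spacing for N same-time bullets, assuming they
    are stacked with equal gaps (illustrative reconstruction)."""
    n, total = len(heights), sum(heights)
    d1 = (H0 - total) / (n - 1)    # max gap inside the video layer
    d2 = (Hmax - total) / (n - 1)  # max gap inside the full layer
    return d1, d2

def vertical_formula(d_min, d1, d2):
    """Select which vertical formula applies for preset spacing d_min."""
    if d_min <= d1:
        return "first"   # sparse: all bullets fit over the video layer
    if d_min <= d2:
        return "second"  # peak: spill into the layer's extra height
    return "third"       # ultra-dense: even the full layer is too small
```

For example, five bullets of height 0.2 over a video layer of height 2.0 leave 1.0 to share over 4 gaps, so the first spacing is 0.25; a preset spacing of 0.3 would then push the bullets into the layer's expanded height.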
  • in response to the bullet screen opening instruction, drawing a ring-shaped transparent bullet screen layer at a preset orientation of the video layer of the target video includes: in response to the bullet screen opening instruction, outputting a bullet screen layer editing page; receiving the bullet screen layer drawing parameters input on the bullet screen layer editing page; and drawing a ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video according to the bullet screen layer drawing parameters.
  • the bullet screen layer drawing parameters may include, but are not limited to, the height of the bullet screen layer, the radius of the bullet screen layer, and the radian of the bullet screen layer. These drawing parameters are adjustable; by adjusting them, the user can customize a bullet screen shape that suits them.
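As a concrete illustration, the adjustable drawing parameters could be carried in a small record and validated before the layer is drawn; the class and field names here are assumptions for illustration, not identifiers from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class BarrageLayerParams:
    """User-adjustable parameters for the ring-shaped layer (illustrative)."""
    height: float  # second height H_max; must exceed the video layer height
    radius: float  # ring radius R
    radian: float  # arc angle of the ring, in (0, 2*pi]

    def validate(self, video_height: float) -> None:
        # The text requires the layer to be taller than the video layer.
        if self.height <= video_height:
            raise ValueError("layer height must exceed the video layer height")
        if self.radius <= 0:
            raise ValueError("radius must be positive")
        if not 0 < self.radian <= 2 * math.pi:
            raise ValueError("radian must lie in (0, 2*pi]")
```

A layer of height 3.0 around a video layer of height 2.0 validates; equal heights would be rejected, matching the constraint that the first height is smaller than the second.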
  • a second aspect of the present application discloses an electronic device, including a processor and a memory; the memory is used to store instructions; the processor is used to call the instructions in the memory, so that the electronic device executes the barrage playback method.
  • a third aspect of the present application discloses a bullet screen playback system, the bullet screen playback system includes an electronic device and a portable device, wherein the electronic device is used to execute the bullet screen playback method.
  • a fourth aspect of the present application discloses a computer-readable storage medium, where the computer-readable storage medium stores at least one instruction, and when the at least one instruction is executed by a processor, implements the bullet screen playing method.
  • FIG. 1 is a schematic diagram of a framework of a bullet screen playback system disclosed in an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a method for playing a bullet screen disclosed in an embodiment of the present application.
  • FIG. 2A is a schematic diagram of a barrage layer drawing parameter disclosed in an embodiment of the present application.
  • FIG. 2B is a schematic diagram of coordinates of a barrage moving on a horizontal plane disclosed in an embodiment of the present application.
  • FIG. 2C is a schematic diagram showing the coordinates of a barrage arriving at the same time moving in the longitudinal direction disclosed in an embodiment of the present application.
  • FIG. 2D is a schematic diagram of a bullet screen distribution of a sparse bullet screen scene disclosed in an embodiment of the present application.
  • FIG. 2E is a schematic diagram of a bullet screen distribution of a peak bullet screen scene disclosed in an embodiment of the present application.
  • FIG. 2F is a schematic diagram of a bullet screen distribution of an ultra-dense bullet screen scene disclosed in an embodiment of the present application.
  • FIG. 2G is a schematic diagram of the distribution of a bullet screen after a ring-shaped bullet screen layer is extended according to an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application.
  • FIG. 1 is a schematic diagram of a framework of a bullet screen playback system disclosed in an embodiment of the present application.
  • the bullet screen playback system includes an electronic device and a portable device, and FIG. 1 also includes a bullet screen layer and a video layer rendered in a VR virtual space.
  • the electronic devices may include, but are not limited to, smart phones, tablet computers, and personal digital assistants (PDAs).
  • a virtual reality (Virtual Reality, VR) application (Application, APP) is installed on the electronic device, and a VR SDK (Software Development Kit) is integrated in the VR APP. The VR SDK integrates the ability to draw the VR bullet screen layer and the ability to calculate and refresh the position of each bullet screen in real time. After the user clicks to launch the VR APP, the electronic device can draw the video layer and play the video in the VR virtual space.
  • the electronic device can use the VR SDK to obtain the bullet screen layer drawing parameters set by the user (such as the height of the bullet screen layer, the radius of the bullet screen layer, and the radian of the bullet screen layer), and draw a ring-shaped transparent bullet screen layer above the video layer according to these parameters, where the height of the bullet screen layer is greater than or equal to the height of the video layer.
  • the bullet screen information of each real-time bullet screen (such as the bullet screen content, the bullet screen height, and the bullet screen movement speed) is passed to the VR SDK, which calculates the three-dimensional coordinates of each bullet screen in real time; finally, in the VR virtual space, the real-time bullet screen is refreshed and played on the drawn bullet screen layer according to the three-dimensional coordinates.
  • the bullet screen layer drawn in the VR virtual space can be divided into multiple areas, such as the new bullet screen area, the main screen bullet screen area and the historical bullet screen area.
  • Each newly generated real-time bullet screen appears in the new bullet screen area, moves at a constant speed into the main screen bullet screen area, and then moves into the historical bullet screen area, where it disappears.
  • the portable device may be VR glasses. After wearing the VR glasses, the user can see the real-time barrage rendered on the VR virtual space.
  • the bullet screen area is free from the limitation of the screen of the electronic device, and the height of the bullet screen layer is greater than the height of the video layer, wherein the area of the video layer is the area of the screen.
  • the bullet screen layer described in Figure 1 expands the new bullet screen area vertically, which can be used to realize the vertical dispersion of the bullet screen during the peak period, so that users can experience the peak bullet screen while experiencing the peak bullet screen. At the same time, enjoy the content of the high-energy video clip itself.
  • the bullet screen layer also expands a new bullet screen area in the horizontal direction.
  • the bullet screen form shown in Figure 1 can not only divert the bullet screen vertically, reducing the peak pressure of the bullet screen and improving the viewing effect of the video content, but also enrich the rendering form of the bullet screen and improve the user experience.
  • FIG. 2 is a schematic flowchart of a method for playing a bullet screen disclosed in an embodiment of the present application.
  • the bullet screen playing method shown in FIG. 2 is applied to the electronic device shown in FIG. 1 above, and the method includes the following steps:
  • the electronic device receives a bullet screen opening instruction for the target video.
  • the electronic device may receive a bullet screen opening instruction for the target video sent by other devices (such as a handle).
  • the bullet screen opening instruction is used to instruct to play the bullet screen in a virtual reality VR bullet screen mode.
  • the electronic device can draw a video layer on the VR virtual space and play the target video, and the user can watch the target video rendered on the video layer by wearing a portable device (such as VR glasses).
  • the electronic device outputs a barrage layer editing page in response to the barrage opening instruction.
  • the electronic device can output the bullet screen layer editing page in the VR virtual space, and the user can set the bullet screen layer drawing parameters on the bullet screen layer editing page; the bullet screen layer drawing parameters can include, but are not limited to, the height of the bullet screen layer, the radius of the bullet screen layer, and the radian of the bullet screen layer.
  • These bullet screen layer drawing parameters are adjustable. By adjusting the bullet screen layer drawing parameters, users can customize the shape of the bullet screen that suits them.
  • the electronic device receives the bullet screen layer drawing parameters input on the bullet screen layer editing page.
  • the electronic device draws a ring-shaped transparent bullet screen layer at a preset orientation of the video layer of the target video according to the bullet screen layer drawing parameters.
  • the bullet screen layer is drawn according to the drawing parameters (height, radian and radius) of the bullet screen layer.
  • the bullet screen layer is a ring structure with areas extended beyond the video layer in both the vertical and horizontal directions, wherein the preset orientation is, for example, above the video layer, and the first height of the video layer is smaller than the second height of the bullet screen layer. The area of the entire bullet screen layer is larger than that of the video layer.
  • the electronic device divides the bullet screen layer into a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area.
  • each newly generated barrage will traverse the new barrage area, the main screen barrage area and the historical barrage area at a constant speed.
  • the electronic device initializes the barrage set in the newly created barrage area.
  • FIG. 2B is a schematic diagram of coordinates of a barrage moving on a horizontal plane disclosed in an embodiment of the present application.
  • v is the speed of the bullet screen
  • R is the radius of the bullet screen layer
  • α is the angle between the entrance of the main screen bullet screen area and the horizontal direction
  • β is the angle between the position where the bullet screen is generated and the horizontal direction
  • t is a given moment
  • α and β are the radians in the bullet screen layer drawing parameters, which can be adjusted according to the user's requirements.
  • Each bullet screen starts to move from the position where it is generated.
  • the bullet screen set in the entire new bullet screen area is initialized.
  • the real-time bullet screen i is displayed in the new bullet screen area.
  • the real-time bullet screen i moves to the main screen entrance, enters the main screen bullet screen area from the entrance, and then moves into the historical bullet screen area, until it finally disappears at the bullet screen demise point.
  • the initializing of the bullet screen set in the new bullet screen area includes:
  • each barrage in the barrage set is displayed in the newly created barrage area.
  • the initialization time domain is [0, Δt₁], and the initialization coordinates are computed from it.
  • each bullet screen displayed in the entire new bullet screen area is static and does not move, and each bullet screen is displayed in a fixed position in the new bullet screen area according to the initialization coordinates.
  • the bullet screen in the new bullet screen area starts to move in real time, and the corresponding position coordinates will also change accordingly.
  • in step S27, the electronic device determines whether the bullet screen set in the new bullet screen area has been initialized. If the bullet screen set in the new bullet screen area has been initialized, step S28 is executed; if it has not been initialized, step S27 is repeated.
  • the electronic device acquires the bullet screen information of the real-time bullet screen of the target video.
  • the bullet screen information may include, but is not limited to, the bullet screen content, the bullet screen movement speed, the real-time playback time, the bullet screen height, and the bullet screen original position.
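For illustration, the bullet screen information listed above could be grouped into a single record handed to the coordinate calculation; the field names below are assumptions based on the list, not identifiers from the patent.

```python
from typing import NamedTuple

class BulletInfo(NamedTuple):
    """Per-bullet metadata for the VR SDK (illustrative field names)."""
    content: str          # text of the bullet screen
    speed: float          # bullet screen movement speed v
    play_time: float      # real-time playback time t
    height: float         # rendered height of the bullet screen
    origin_angle: float   # angle beta of the generation position

# A bullet screen arriving at t = 0 with speed 0.5 and height 0.2:
b = BulletInfo(content="nice shot!", speed=0.5, play_time=0.0,
               height=0.2, origin_angle=1.57)
```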
  • the electronic device calculates the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer according to the bullet screen information.
  • the bullet screen information includes the bullet screen movement speed, real-time playback time and the bullet screen height
  • the calculation of the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer according to the bullet screen information includes:
  • the three-dimensional coordinates refer to the three-dimensional coordinates of the bullet screen on the bullet screen layer of the VR virtual space at the real-time playback time t, including the horizontal coordinates on the horizontal plane and the vertical coordinates on the vertical plane.
  • step 11): calculating, at the real-time playback time, the horizontal coordinates of the real-time bullet screen on the horizontal plane of the bullet screen layer according to the bullet screen movement speed includes: according to the bullet screen movement speed, calculating the first time domain corresponding to the new bullet screen area, the second time domain corresponding to the main screen bullet screen area, and the third time domain corresponding to the historical bullet screen area; if the real-time playback time is in the first time domain, calculating the first horizontal coordinate of the real-time bullet screen in the new bullet screen area according to the first horizontal formula; if the real-time playback time is in the second time domain, calculating the second horizontal coordinate of the real-time bullet screen in the main screen bullet screen area according to the second horizontal formula; or if the real-time playback time is in the third time domain, calculating the third horizontal coordinate of the real-time bullet screen in the historical bullet screen area according to the third horizontal formula.
  • the first time domain is the time range in which the real-time bullet screen moves in the new bullet screen area; the second time domain is the time range in which the real-time bullet screen moves in the main screen bullet screen area; and the third time domain is the time range in which the real-time bullet screen moves in the historical bullet screen area.
  • each time domain and formula in this optional implementation can be understood in conjunction with FIG. 2B. From the bullet screen layer drawing parameters (α, β, R, v) in FIG. 2B, the following can be calculated for the real-time bullet screen i: the time Δt₁ from the position where the bullet screen is generated to the main screen entrance, the time at which the real-time bullet screen i leaves the main screen bullet screen area and enters the historical bullet screen area, and the time at which the real-time bullet screen i reaches the bullet screen demise point; from these, the horizontal position coordinates (x, y) of the real-time bullet screen i at each real-time playback time can be calculated.
  • the first time domain is [0, Δt₁]; the first horizontal formula, the second time domain, the second horizontal formula, the third time domain, and the third horizontal formula are defined correspondingly, where:
  • t is the real-time playback time
  • v is the movement speed of the bullet screen
  • R is the radius of the bullet screen layer
  • α is the angle between the entrance of the main screen bullet screen area and the horizontal direction
  • β is the angle between the position where the real-time bullet screen is generated and the horizontal direction
  • W is the length of the main screen of the main screen bullet screen area
  • (x, y) is the first horizontal coordinate, the second horizontal coordinate, or the third horizontal coordinate.
  • step 12) at the real-time playback time, according to the height of the bullet screen, calculating the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer includes: :
  • if the preset spacing is less than or equal to the first spacing, calculate the first ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the first longitudinal formula, where the preset spacing is a preset minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video; or
  • if the preset spacing is greater than or equal to the first spacing and less than or equal to the second spacing, calculate the second ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the second longitudinal formula; or
  • if the preset spacing is greater than the second spacing, calculate the third ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer according to the third longitudinal formula.
  • FIG. 2C is a schematic coordinate diagram, disclosed in an embodiment of the present application, of bullet screens arriving at the same moment moving in the vertical direction.
  • the coordinate system is drawn with the center of the video layer as the coordinate origin.
  • N is the number of real-time bullet screens arriving at the real-time playback time (that is, at the same moment)
  • h i is the bullet screen height of the i-th real-time bullet screen
  • H 0 is the height of the video layer (that is, the first height)
  • H max is the height of the bullet screen layer (that is, the second height)
  • d is the spacing between two adjacent real-time bullet screens.
  • a preset spacing d min can be set in advance through multiple tests; d min is the minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video. d min enables the user to obtain the best combined viewing experience of bullet screens and video, so that the interval between bullet screens does not become so small and dense that it interferes with viewing the video content.
  • the maximum distance (ie, the first distance) between two adjacent bullet screens allowed by the region of the video layer can be calculated:
  • the maximum distance (ie, the second distance) between two adjacent bullet screens allowed by the area of the bullet screen layer can be calculated:
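The two maximum spacings can be sketched as follows. The original formulas are images in the publication, so the gap-count convention (N − 1 gaps between N vertically stacked bullet screens) is an assumption; `max_spacing` is a hypothetical helper name.

```python
def max_spacing(heights, band_height):
    # Largest equal gap d that lets N bullet screens of the given heights
    # stack vertically inside a band of the given height, assuming N - 1
    # gaps (one between each pair of adjacent bullet screens).
    n = len(heights)
    if n < 2:
        return float("inf")  # a single bullet screen imposes no gap limit
    return (band_height - sum(heights)) / (n - 1)

# first spacing: the band is the video layer height H0
# second spacing: the band is the bullet screen layer height Hmax
```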
  • according to the relationship among the first spacing, the second spacing, and the preset spacing, the bullet screens arriving at a certain moment can be divided into three scenarios: a sparse bullet screen scene, a peak bullet screen scene, and an ultra-dense bullet screen scene.
  • the first scenario: the sparse bullet screen scene
  • in this scenario, the first ordinate is calculated according to the first longitudinal formula;
  • the optimal vertical interval of the bullet screens can be adjusted within the interval between the preset spacing and the first spacing; after d is determined, the first ordinate z i corresponding to bullet screen i can be calculated.
  • the second scenario: the peak bullet screen scene
  • in this scenario, the second ordinate is calculated according to the second longitudinal formula;
  • the optimal vertical interval of the bullet screens can be adjusted within the interval between the preset spacing and the second spacing; after d is determined, the second ordinate z i corresponding to bullet screen i can be calculated.
  • the third scenario: the ultra-dense bullet screen scene
  • the bullet screens arriving at the same moment are so dense that, even when they are diverted to the bullet screen layer, the height of the bullet screen layer still cannot vertically disperse all the bullet screens to the greatest extent.
  • in this scenario, the third ordinate is calculated according to the third longitudinal formula;
  • the optimal vertical interval of the bullet screens can only take the maximum interval allowed by the bullet screen layer.
  • the bullet screens can only fill the entire height of the bullet screen layer at the expense of the interval between them, as shown in FIG. 2F, where the slanted dashed area in the middle is the real-time coverage of the bullet screens.
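Choosing among the three scenarios reduces to comparing the preset spacing d min with the two maximum spacings. A minimal sketch, assuming the same N − 1 gap convention as above; the returned interval is the range within which the vertical interval d may be adjusted (`spacing_scenario` is a hypothetical name, since the patent's closed-form ordinate formulas are images in the original).

```python
def spacing_scenario(heights, H0, Hmax, d_min):
    # Classify the bullet screens arriving at one moment and return the
    # scenario name plus the allowed range (lo, hi) for the gap d.
    n = len(heights)
    def max_gap(band):
        return float("inf") if n < 2 else (band - sum(heights)) / (n - 1)
    d1, d2 = max_gap(H0), max_gap(Hmax)
    if d_min <= d1:
        return "sparse", (d_min, d1)        # fits inside the video layer
    if d_min <= d2:
        return "peak", (d_min, d2)          # diverted onto the bullet layer
    return "ultra-dense", (d2, d2)          # forced to the layer maximum
```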
  • FIG. 2G is a schematic diagram of the distribution of the bullet screen after the annular bullet screen layer is extended according to an embodiment of the present application.
  • the dashed range surrounded by the boundary is the real-time coverage of the bullet screen in each area (the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area).
  • the real-time bullet screen appears from the new bullet screen area, then moves to the main screen bullet screen area, and then moves to the historical bullet screen area and disappears.
  • features such as the number of likes and the height of each bullet screen can further be used for diversion.
  • bullet screens with more likes and smaller heights are concentrated at the vertical center of the screen, while bullet screens with fewer likes and larger heights are diverted to both sides of the bullet screen layer, so that less valuable bullet screens are split to the sides and more valuable bullet screens are concentrated at the center of the video layer, optimizing the user experience.
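This like-and-height-based diversion can be sketched as a center-out assignment. The ranking key (more likes first, then smaller height) matches the features named above, but the exact weighting is not given in the original, so it is an assumption; `center_out_slots` is a hypothetical name.

```python
def center_out_slots(bullets):
    # bullets: list of (bullet_id, likes, height) tuples.
    # Rank by likes (descending), then height (ascending), and assign
    # vertical slot offsets 0, +1, -1, +2, -2, ... from the screen center,
    # so lower-value bullet screens drift toward the layer's two sides.
    ranked = sorted(bullets, key=lambda b: (-b[1], b[2]))
    slots = {}
    for rank, (bullet_id, _likes, _height) in enumerate(ranked):
        side = 1 if rank % 2 else -1
        slots[bullet_id] = side * ((rank + 1) // 2)
    return slots
```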
  • the electronic device refreshes and plays the real-time bullet screen on the bullet screen layer according to the three-dimensional coordinates.
  • a ring-shaped transparent bullet screen layer is added at the preset orientation of the video layer, so that the area of the bullet screen layer is larger than that of the video layer; this realizes vertical expansion and diversion of the bullet screens in space as well as horizontal expansion and extension, gets rid of the limitation of the screen of the electronic device, and allows the proportion of the bullet screens to be greater than or equal to 100%.
  • a new bullet screen area is expanded vertically, which can be used to disperse bullet screens vertically during peak periods, so that users can enjoy the high-energy video clip itself while experiencing the peak bullet screens.
  • the bullet screen layer also expands new bullet screen areas in the horizontal direction.
  • when the user turns his head to the right, he can preview newly generated bullet screens in the new bullet screen area, giving the user a sense of anticipation; when the user turns his head to the left, he can review historical bullet screens in the historical bullet screen area, giving the user a sense of nostalgia. In this way, bullet screens can not only be diverted vertically, reducing peak bullet screen pressure and improving the viewing effect of the video content, but the rendering forms of bullet screens are also enriched, improving user experience.
  • FIG. 3 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application.
  • the electronic device 3 includes a memory 31 , at least one processor 32 , a computer program 33 stored in the memory 31 and executable on the at least one processor 32 , and at least one communication bus 34 .
  • FIG. 3 is only an example of the electronic device 3 and does not constitute a limitation on the electronic device 3; the electronic device may include more or fewer components than shown, or combine some components, or use different components. For example, the electronic device 3 may also include input/output devices, network access devices, and the like.
  • the at least one processor 32 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the processor 32 may be a microprocessor, or the processor 32 may be any conventional processor.
  • the processor 32 is the control center of the electronic device 3 and uses various interfaces and lines to connect the various parts of the entire electronic device 3.
  • the memory 31 may be configured to store the computer program 33 and/or modules/units; the processor 32 implements the various functions of the electronic device 3 by running or executing the computer programs and/or modules/units stored in the memory 31 and invoking the data stored in the memory 31.
  • the memory 31 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound play function or an image play function), and the like; the data storage area may store data (such as audio data) created based on the use of the electronic device 3.
  • the memory 31 may include non-volatile and volatile memory, such as a hard disk, internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one disk storage device, a flash memory device, or another storage device.
  • the electronic device introduced in the embodiment of the present application may be used to implement some or all of the processes in the method embodiment introduced in FIG. 2 of the present application, and reference may be made to the related description in the embodiment described in FIG. 2 above, which will not be repeated here.
  • Embodiments of the present application further provide a computer-readable storage medium, where instructions are stored in the computer-readable storage medium; when the instructions are run on a processor, the method procedure shown in FIG. 2 is implemented.
  • An embodiment of the present application further provides a computer program product; when the computer program product runs on a processor, the method procedure shown in FIG. 2 is implemented.
  • the steps of the method or algorithm described in conjunction with the disclosure of the embodiments of this application may be implemented in a hardware manner, or may be implemented in a manner in which a processor executes software instructions.
  • Software instructions can be composed of corresponding software modules, and software modules can be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor, such that the processor can read information from, and write information to, the storage medium.
  • the storage medium can also be an integral part of the processor.
  • the processor and storage medium may reside in an ASIC.
  • the ASIC may be located in an electronic device.
  • the processor and storage medium may also exist in the electronic device as discrete components.
  • the aforementioned storage medium includes various media that can store program codes, such as ROM, RAM, magnetic disk, or optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A bullet screen play method includes: receiving a bullet screen enabling instruction for a target video, where the bullet screen enabling instruction instructs that bullet screens be played in a virtual reality (VR) bullet screen mode; in response to the bullet screen enabling instruction, drawing a ring-shaped transparent bullet screen layer at a preset orientation of a video layer of the target video, where a first height of the video layer is less than a second height of the bullet screen layer; obtaining bullet screen information of a real-time bullet screen of the target video; calculating, according to the bullet screen information, three-dimensional coordinates of the real-time bullet screen on the bullet screen layer; and refreshing and playing the real-time bullet screen on the bullet screen layer according to the three-dimensional coordinates. This application further provides an electronic device, a bullet screen play system, and a storage medium. This application can divert bullet screens, reduce peak bullet screen pressure, and improve the viewing effect of video content.

Description

Bullet screen play method, related device, and storage medium
This application claims priority to Chinese Patent Application No. 202011003984.0, filed with the China National Intellectual Property Administration on September 22, 2020 and entitled "Bullet screen play method, related device, and storage medium", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of virtual reality technologies, and in particular, to a bullet screen play method, a related device, and a storage medium.
Background
With the development of video technologies, bullet screen culture can be seen everywhere in video playback. While watching a video, a user can discuss and communicate with other users by sending bullet screens, which increases entertainment interaction among multiple users.
In practice, however, it has been found that while bullet screen culture brings people entertainment, it also brings some problems. For example, in some clips of a video, the whole screen is often filled with bullet screens, so that users cannot comfortably experience the content of the clip itself while enjoying the fun of the bullet screens.
It can be seen that this current form of bullet screens results in a poor viewing effect of video content.
Summary
Embodiments of this application disclose a bullet screen play method, a related device, and a storage medium, which can solve the problem that the existing presentation form of bullet screens results in a poor viewing effect of video content.
A first aspect of this application discloses a bullet screen play method, including: receiving a bullet screen enabling instruction for a target video, where the bullet screen enabling instruction instructs that bullet screens be played in a virtual reality VR bullet screen mode; in response to the bullet screen enabling instruction, drawing a ring-shaped transparent bullet screen layer at a preset orientation of a video layer of the target video, where a first height of the video layer is less than a second height of the bullet screen layer; obtaining bullet screen information of a real-time bullet screen of the target video; calculating, according to the bullet screen information, three-dimensional coordinates of the real-time bullet screen on the bullet screen layer; and refreshing and playing the real-time bullet screen on the bullet screen layer according to the three-dimensional coordinates.
In this application, by making full use of the VR virtual space, a ring-shaped transparent bullet screen layer whose height is greater than that of the video layer is added at a preset orientation of the video layer. This realizes vertical expansion and diversion of bullet screens in space, gets rid of the limitation of the screen of the electronic device, and allows the bullet screen proportion to be greater than or equal to 100%, so that bullet screens can be diverted vertically, peak bullet screen pressure is reduced, and the viewing effect of video content is improved.
In some optional implementations, after the drawing of the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video, the method further includes: dividing the bullet screen layer into a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area, where each real-time bullet screen crosses the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area at a constant speed; initializing a bullet screen set in the new bullet screen area; determining whether initialization of the bullet screen set in the new bullet screen area is complete; and if the initialization is complete, performing the obtaining of the bullet screen information of the real-time bullet screen of the target video.
The initialization process ensures that at the moment t = 0 there are bullet screens entering at the main screen entrance of the main screen bullet screen area and being displayed in the main screen bullet screen area.
In some optional implementations, the initializing of the bullet screen set in the new bullet screen area includes: calculating an initialization time domain; obtaining the bullet screen set in the new bullet screen area within the initialization time domain; calculating initialization coordinates of each bullet screen in the bullet screen set; and displaying each bullet screen in the bullet screen set in the new bullet screen area according to the initialization coordinates.
During the initialization phase, every bullet screen displayed in the entire new bullet screen area is static and does not move; each bullet screen is displayed at a fixed position in the new bullet screen area according to its initialization coordinates.
在一些可选的实施方式中,所述初始化时间域为:[0,Δt 1],其中,
Figure PCTCN2021117205-appb-000001
所述初始化坐标为:
Figure PCTCN2021117205-appb-000002
其中,
Figure PCTCN2021117205-appb-000003
其中,v为弹幕运动速度,R为所述弹幕层的半径,α为所述主屏弹幕区的入口处与水平方向的夹角,β为弹幕产生的位置与水平方向的夹角,t为某一时刻。
In some optional implementations, the bullet screen information includes a bullet screen movement speed, a real-time play time, and a bullet screen height, and the calculating, according to the bullet screen information, of the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer includes: calculating, at the real-time play time and according to the bullet screen movement speed, a horizontal coordinate of the real-time bullet screen on the horizontal plane of the bullet screen layer; and calculating, at the real-time play time and according to the bullet screen height, an ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer.
In some optional implementations, the calculating, at the real-time play time and according to the bullet screen movement speed, of the horizontal coordinate of the real-time bullet screen on the horizontal plane of the bullet screen layer includes: calculating, according to the bullet screen movement speed, a first time domain corresponding to the new bullet screen area, a second time domain corresponding to the main screen bullet screen area, and a third time domain corresponding to the historical bullet screen area; if the real-time play time is within the first time domain, calculating, according to a first horizontal formula, a first horizontal coordinate of the real-time bullet screen in the new bullet screen area; or if the real-time play time is within the second time domain, calculating, according to a second horizontal formula, a second horizontal coordinate of the real-time bullet screen in the main screen bullet screen area; or if the real-time play time is within the third time domain, calculating, according to a third horizontal formula, a third horizontal coordinate of the real-time bullet screen in the historical bullet screen area.
The bullet screen layer also expands new bullet screen areas in the horizontal direction. When the user turns his head to the right, he can preview newly generated bullet screens in the new bullet screen area, giving the user a sense of anticipation; when the user turns his head to the left, he can review historical bullet screens in the historical bullet screen area, giving the user a sense of nostalgia. This enriches the rendering forms of bullet screens and improves user experience.
In some optional implementations, the first time domain is [0, Δt1], where
Figure PCTCN2021117205-appb-000004
The first horizontal formula is:
Figure PCTCN2021117205-appb-000005
where
Figure PCTCN2021117205-appb-000006
The second time domain is:
Figure PCTCN2021117205-appb-000007
The second horizontal formula is:
Figure PCTCN2021117205-appb-000008
The third time domain is:
Figure PCTCN2021117205-appb-000009
The third horizontal formula is:
Figure PCTCN2021117205-appb-000010
Figure PCTCN2021117205-appb-000011
where t is the real-time play time, v is the bullet screen movement speed, R is the radius of the bullet screen layer, α is the angle between the entrance of the main screen bullet screen area and the horizontal direction, β is the angle between the position where the real-time bullet screen is generated and the horizontal direction, W is the length of the main screen of the main screen bullet screen area, and
Figure PCTCN2021117205-appb-000012
is the first horizontal coordinate, the second horizontal coordinate, or the third horizontal coordinate.
In some optional implementations, the calculating, at the real-time play time and according to the bullet screen height, of the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer includes: counting the number of real-time bullet screens within the real-time play time; calculating a first spacing according to the number, the bullet screen height of each real-time bullet screen, and the first height of the video layer; calculating a second spacing according to the number, the bullet screen height of each real-time bullet screen, and the second height of the bullet screen layer; if a preset spacing is less than or equal to the first spacing, calculating, according to a first longitudinal formula, a first ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer, where the preset spacing is a preset minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video; or if the preset spacing is greater than or equal to the first spacing and less than or equal to the second spacing, calculating, according to a second longitudinal formula, a second ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer; or if the preset spacing is greater than the second spacing, calculating, according to a third longitudinal formula, a third ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer.
Compared with the original bullet screen region, a new bullet screen area is expanded vertically. The ordinates are calculated according to the bullet screens arriving at the same moment, and the bullet screens are diverted vertically, realizing vertical dispersion of peak-period bullet screens so that users can enjoy the content of the video clip itself while experiencing peak bullet screens.
In some optional implementations, the first spacing is:
Figure PCTCN2021117205-appb-000013
The second spacing is:
Figure PCTCN2021117205-appb-000014
The first longitudinal formula is:
Figure PCTCN2021117205-appb-000015
where
Figure PCTCN2021117205-appb-000016
The second longitudinal formula is:
Figure PCTCN2021117205-appb-000017
where
Figure PCTCN2021117205-appb-000018
The third longitudinal formula is:
Figure PCTCN2021117205-appb-000019
where
Figure PCTCN2021117205-appb-000020
where N is the number of real-time bullet screens arriving at the real-time play time, h i is the bullet screen height of the i-th real-time bullet screen, h j is the bullet screen height of the j-th real-time bullet screen, H 0 is the first height, H max is the second height, d is the spacing between two adjacent real-time bullet screens, d min is the preset spacing, and z i is the first ordinate, the second ordinate, or the third ordinate.
In some optional implementations, the drawing, in response to the bullet screen enabling instruction, of the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video includes: outputting a bullet screen layer edit page in response to the bullet screen enabling instruction; receiving bullet screen layer drawing parameters entered on the bullet screen layer edit page; and drawing the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video according to the bullet screen layer drawing parameters.
The bullet screen layer drawing parameters may include, but are not limited to, the height of the bullet screen layer, the radius of the bullet screen layer, and the arc angles of the bullet screen layer. All of these drawing parameters are adjustable, and by adjusting them the user can customize a bullet screen shape that suits him or her.
A second aspect of this application discloses an electronic device, including a processor and a memory, where the memory is configured to store instructions, and the processor is configured to invoke the instructions in the memory so that the electronic device performs the bullet screen play method.
A third aspect of this application discloses a bullet screen play system, including an electronic device and a portable device, where the electronic device is configured to perform the bullet screen play method.
A fourth aspect of this application discloses a computer-readable storage medium storing at least one instruction, where when the at least one instruction is executed by a processor, the bullet screen play method is implemented.
Brief Description of Drawings
FIG. 1 is a schematic framework diagram of a bullet screen play system disclosed in an embodiment of this application.
FIG. 2 is a schematic flowchart of a bullet screen play method disclosed in an embodiment of this application.
FIG. 2A is a schematic diagram of bullet screen layer drawing parameters disclosed in an embodiment of this application.
FIG. 2B is a schematic coordinate diagram of a bullet screen moving on the horizontal plane disclosed in an embodiment of this application.
FIG. 2C is a schematic coordinate diagram of bullet screens arriving at the same moment moving in the vertical direction disclosed in an embodiment of this application.
FIG. 2D is a schematic diagram of bullet screen distribution in a sparse bullet screen scene disclosed in an embodiment of this application.
FIG. 2E is a schematic diagram of bullet screen distribution in a peak bullet screen scene disclosed in an embodiment of this application.
FIG. 2F is a schematic diagram of bullet screen distribution in an ultra-dense bullet screen scene disclosed in an embodiment of this application.
FIG. 2G is a schematic diagram of bullet screen distribution after the annular bullet screen layer is unrolled, disclosed in an embodiment of this application.
FIG. 3 is a schematic structural diagram of an electronic device disclosed in an embodiment of this application.
Detailed Description
The following describes the embodiments of this application with reference to the accompanying drawings.
For a better understanding of the bullet screen play method, related device, and storage medium disclosed in the embodiments of this application, the network architecture to which the embodiments apply is described first.
Referring to FIG. 1, FIG. 1 is a schematic framework diagram of a bullet screen play system disclosed in an embodiment of this application. As shown in FIG. 1, the bullet screen play system includes an electronic device and a portable device; FIG. 1 also shows the bullet screen layer and the video layer rendered in the VR virtual space.
The electronic device may include, but is not limited to, a smartphone, a tablet computer, or a personal digital assistant (PDA). A virtual reality (VR) application (APP) is installed on the electronic device, and a VR SDK (Software Development Kit) is integrated into the VR APP. The VR SDK integrates the capability to draw the VR bullet screen layer and the capability to calculate and refresh bullet screen positions in real time. After the user taps to start the VR APP, the electronic device can draw the video layer in the VR virtual space and play the video. If the user chooses to play bullet screens in the VR bullet screen mode, the electronic device can use the VR SDK to obtain the bullet screen layer drawing parameters set by the user (such as the height, radius, and arc angles of the bullet screen layer) and, according to those parameters, draw a ring-shaped transparent bullet screen layer above the video layer, where the height of the bullet screen layer is greater than or equal to the height of the video layer. When bullet screens need to be played, the bullet screen information of each real-time bullet screen (such as the bullet screen content, height, and movement speed) is passed to the VR SDK, which calculates the three-dimensional coordinates of each bullet screen in real time. Finally, in the VR virtual space, the real-time bullet screens can be refreshed and played on the drawn bullet screen layer according to the three-dimensional coordinates.
The bullet screen layer drawn in the VR virtual space can be divided into multiple areas, such as a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area. Each newly generated real-time bullet screen appears in the new bullet screen area, moves at a constant speed to the main screen bullet screen area, and then moves to the historical bullet screen area where it disappears.
The portable device may be VR glasses; after putting on the VR glasses, the user can see the real-time bullet screens rendered in the VR virtual space.
In the bullet screen play system described in FIG. 1, the bullet screen region gets rid of the limitation of the screen of the electronic device, and the height of the bullet screen layer is greater than the height of the video layer, where the area of the video layer is the area of the screen. Compared with the original bullet screen region, the bullet screen layer in FIG. 1 expands a new bullet screen area vertically, which can be used to vertically disperse peak-period bullet screens so that users can enjoy the content of high-energy video clips while experiencing peak bullet screens. At the same time, the bullet screen layer also expands new bullet screen areas horizontally: when the user turns his head to the right, he can preview newly generated bullet screens in the new bullet screen area, giving a sense of anticipation; when the user turns his head to the left, he can review historical bullet screens in the historical bullet screen area, giving a sense of nostalgia. Therefore, the bullet screen form shown in FIG. 1 can not only divert bullet screens vertically, reduce peak bullet screen pressure, and improve the viewing effect of video content, but also enrich the rendering forms of bullet screens and improve user experience.
Based on the foregoing embodiments, the following describes the bullet screen play method in the embodiments of this application.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of a bullet screen play method disclosed in an embodiment of this application. The bullet screen play method shown in FIG. 2 is applied to the electronic device shown in FIG. 1 and includes the following steps:
S21: The electronic device receives a bullet screen enabling instruction for a target video.
In this embodiment of this application, the electronic device may receive a bullet screen enabling instruction for the target video sent by another device (such as a handle controller). The bullet screen enabling instruction instructs that bullet screens be played in the virtual reality VR bullet screen mode.
Before step S21, the electronic device may have drawn the video layer in the VR virtual space and started playing the target video; by wearing a portable device (such as VR glasses), the user can watch the target video rendered on the video layer.
S22: In response to the bullet screen enabling instruction, the electronic device outputs a bullet screen layer edit page.
The electronic device may output the bullet screen layer edit page in the VR virtual space, and the user can set bullet screen layer drawing parameters on that page. The drawing parameters may include, but are not limited to, the height, radius, and arc angles of the bullet screen layer; all of them are adjustable, and by adjusting them the user can customize a bullet screen shape that suits him or her.
S23: The electronic device receives the bullet screen layer drawing parameters entered on the bullet screen layer edit page.
S24: The electronic device draws a ring-shaped transparent bullet screen layer at a preset orientation of the video layer of the target video according to the bullet screen layer drawing parameters.
As shown in FIG. 2A, the bullet screen layer drawn according to the drawing parameters (height, arc angles, and radius) has a ring-shaped structure and extends beyond the video layer both vertically and horizontally. The preset orientation is, for example, above the video layer, and the first height of the video layer is less than the second height of the bullet screen layer. The area of the entire bullet screen layer is larger than the area of the video layer.
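Step S24's adjustable drawing parameters can be captured in a small configuration type. A minimal sketch; the field names and the validation rule are illustrative assumptions (the original only names the height, radius, and arc angles as parameters), and `BulletLayerParams` is a hypothetical name.

```python
import math
from dataclasses import dataclass

@dataclass
class BulletLayerParams:
    # User-adjustable parameters entered on the bullet screen layer
    # edit page (field names are illustrative).
    height: float   # H_max, the second height of the bullet screen layer
    radius: float   # R, the radius of the annular layer
    alpha: float    # entrance angle of the main screen area, in radians
    beta: float     # angle at which bullet screens are generated, radians

    def is_valid(self, video_height: float) -> bool:
        # The bullet screen layer must be at least as tall as the video
        # layer, and the generation angle must lie beyond the entrance.
        return (self.height >= video_height
                and self.radius > 0
                and 0 <= self.alpha < self.beta <= math.pi)
```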
S25: The electronic device divides the bullet screen layer into a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area.
Each newly generated bullet screen crosses the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area at a constant speed.
S26: The electronic device initializes the bullet screen set in the new bullet screen area.
In this embodiment of this application, before the target video starts playing, the bullet screen set of the entire new bullet screen area needs to be initialized and displayed, so as to ensure that at the moment t = 0 there are bullet screens entering at the main screen entrance of the main screen bullet screen area and being displayed in the main screen bullet screen area.
Referring to FIG. 2B, FIG. 2B is a schematic coordinate diagram of a bullet screen moving on the horizontal plane disclosed in an embodiment of this application, where v is the bullet screen movement speed, R is the radius of the bullet screen layer, α is the angle between the entrance of the main screen bullet screen area and the horizontal direction, β is the angle between the position where a bullet screen is generated and the horizontal direction, and t is a given moment. α and β are the arc angles among the bullet screen drawing parameters and can be adjusted according to the user's requirements.
Each bullet screen starts moving from the place where it is generated. Before the target video starts playing, the bullet screen set in the entire new bullet screen area is initialized. When the target video is played, the real-time bullet screen i is displayed in the new bullet screen area, moves to the main screen entrance, enters the main screen bullet screen area through the entrance, then moves into the historical bullet screen area, and finally disappears at the bullet screen vanishing point.
Specifically, the initializing of the bullet screen set in the new bullet screen area includes:
calculating an initialization time domain;
obtaining the bullet screen set in the new bullet screen area within the initialization time domain;
calculating initialization coordinates of each bullet screen in the bullet screen set; and
displaying each bullet screen in the bullet screen set in the new bullet screen area according to the initialization coordinates.
The initialization time domain is [0, Δt1], where
Figure PCTCN2021117205-appb-000021
The initialization coordinates are:
Figure PCTCN2021117205-appb-000022
where
Figure PCTCN2021117205-appb-000023
It should be noted that during the initialization phase, every bullet screen displayed in the entire new bullet screen area is static and does not move; each bullet screen is displayed at a fixed position in the new bullet screen area according to its initialization coordinates. After initialization is complete and the target video starts playing, the bullet screens in the new bullet screen area begin to move in real time, and their position coordinates change accordingly.
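The initialization described above can be sketched as statically pre-placing bullet screens along the arc of the new bullet screen area. Since the initialization formulas are rendered as images in the original publication, the even angular spacing and the polar-to-Cartesian mapping below are illustrative assumptions; `init_new_area` is a hypothetical name.

```python
import math

def init_new_area(alpha, beta, R, v, n):
    # Statically place n bullet screens along the arc [alpha, beta] so
    # that at t = 0 the leading bullet screen already sits at the main
    # screen entrance. Returns the initialization time-domain bound dt1
    # and the static (x, y) coordinates of each pre-placed bullet screen.
    dt1 = R * (beta - alpha) / v
    step = (beta - alpha) / max(n - 1, 1)
    angles = [alpha + i * step for i in range(n)]
    return dt1, [(R * math.cos(a), R * math.sin(a)) for a in angles]
```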
S27: The electronic device determines whether initialization of the bullet screen set in the new bullet screen area is complete. If it is complete, step S28 is performed; if it is not complete, step S27 is repeated.
In this embodiment of this application, the bullet screen set in the new bullet screen area must be fully initialized before it can be ensured that at t = 0 bullet screens enter the main screen bullet screen area through the main screen entrance and are displayed there. If initialization is not complete, it must be repeated in a loop until it is complete.
S28: The electronic device obtains bullet screen information of the real-time bullet screen of the target video.
The bullet screen information may include, but is not limited to, the bullet screen content, movement speed, real-time play time, original position, and height.
S29: The electronic device calculates, according to the bullet screen information, the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer.
The bullet screen information includes the bullet screen movement speed, the real-time play time, and the bullet screen height, and the calculating, according to the bullet screen information, of the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer includes:
11) calculating, at the real-time play time and according to the bullet screen movement speed, the horizontal coordinate of the real-time bullet screen on the horizontal plane of the bullet screen layer; and
12) calculating, at the real-time play time and according to the bullet screen height, the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer.
The three-dimensional coordinates are the coordinates of the bullet screen on the bullet screen layer in the VR virtual space at the real-time play time t, including the horizontal coordinate on the horizontal plane and the ordinate on the longitudinal plane.
As an optional implementation, in step 11), the calculating, at the real-time play time and according to the bullet screen movement speed, of the horizontal coordinate of the real-time bullet screen on the horizontal plane of the bullet screen layer includes:
calculating, according to the bullet screen movement speed, the first time domain corresponding to the new bullet screen area, the second time domain corresponding to the main screen bullet screen area, and the third time domain corresponding to the historical bullet screen area;
if the real-time play time is within the first time domain, calculating, according to the first horizontal formula, the first horizontal coordinate of the real-time bullet screen in the new bullet screen area; or
if the real-time play time is within the second time domain, calculating, according to the second horizontal formula, the second horizontal coordinate of the real-time bullet screen in the main screen bullet screen area; or
if the real-time play time is within the third time domain, calculating, according to the third horizontal formula, the third horizontal coordinate of the real-time bullet screen in the historical bullet screen area.
The first time domain is the time range in which the real-time bullet screen moves across the new bullet screen area, the second time domain is the time range in which it moves across the main screen bullet screen area, and the third time domain is the time range in which it moves across the historical bullet screen area.
Each time domain and formula in this optional implementation can be understood with reference to FIG. 2B. From the bullet screen layer drawing parameters (α, β, R, v) in FIG. 2B, the following can be calculated: the time Δt1 for the real-time bullet screen i to move from where it is generated to the main screen entrance, the time at which the real-time bullet screen i leaves the main screen bullet screen area and enters the historical bullet screen area
Figure PCTCN2021117205-appb-000024
and the time at which the real-time bullet screen i reaches the bullet screen vanishing point
Figure PCTCN2021117205-appb-000025
as well as the horizontal position coordinates (x, y) of the real-time bullet screen i at each real-time play time.
The time domains and horizontal-coordinate formulas of the real-time bullet screen i in each bullet screen area are as follows:
(1) New bullet screen area:
The first time domain is [0, Δt1], where
Figure PCTCN2021117205-appb-000026
The first horizontal formula is:
Figure PCTCN2021117205-appb-000027
where
Figure PCTCN2021117205-appb-000028
(2) Main screen bullet screen area:
The second time domain is:
Figure PCTCN2021117205-appb-000029
The second horizontal formula is:
Figure PCTCN2021117205-appb-000030
(3) Historical bullet screen area:
The third time domain is:
Figure PCTCN2021117205-appb-000031
The third horizontal formula is:
Figure PCTCN2021117205-appb-000032
where t is the real-time play time, v is the bullet screen movement speed, R is the radius of the bullet screen layer, α is the angle between the entrance of the main screen bullet screen area and the horizontal direction, β is the angle between the position where the real-time bullet screen is generated and the horizontal direction, W is the length of the main screen of the main screen bullet screen area, and
Figure PCTCN2021117205-appb-000033
is the first horizontal coordinate, the second horizontal coordinate, or the third horizontal coordinate.
As an optional implementation, in step 12), the calculating, at the real-time play time and according to the bullet screen height, of the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer includes:
counting the number of real-time bullet screens within the real-time play time;
calculating the first spacing according to the number, the bullet screen height of each real-time bullet screen, and the first height of the video layer;
calculating the second spacing according to the number, the bullet screen height of each real-time bullet screen, and the second height of the bullet screen layer, where the second height is greater than or equal to the first height;
if the preset spacing is less than or equal to the first spacing, calculating, according to the first longitudinal formula, the first ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer, where the preset spacing is a preset minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video; or
if the preset spacing is greater than or equal to the first spacing and less than or equal to the second spacing, calculating, according to the second longitudinal formula, the second ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer; or
if the preset spacing is greater than the second spacing, calculating, according to the third longitudinal formula, the third ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer.
For a better understanding of this optional implementation, refer to FIG. 2C, which is a schematic coordinate diagram of bullet screens arriving at the same moment moving in the vertical direction, disclosed in an embodiment of this application. As shown in FIG. 2C, the coordinate system is drawn with the center of the video layer as the origin, where N is the number of real-time bullet screens arriving at the real-time play time (that is, at the same moment), h i is the bullet screen height of the i-th real-time bullet screen, H 0 is the height of the video layer (that is, the first height), H max is the height of the bullet screen layer (that is, the second height), and d is the spacing between two adjacent real-time bullet screens. When the bullet screens arriving at the same moment are sparse, the spacing d is large; when they are dense, the spacing d is small.
A preset spacing d min can be set in advance through multiple tests; d min is the minimum spacing between two adjacent bullet screens that does not affect the viewing effect of the target video. d min enables the user to obtain the best combined viewing experience of bullet screens and video, so that the interval between bullet screens does not become so small and dense that it interferes with viewing the video content.
From the parameters in FIG. 2C, the maximum spacing between two adjacent bullet screens allowed by the region of the video layer (that is, the first spacing) can be calculated:
Figure PCTCN2021117205-appb-000034
At the same time, the maximum spacing between two adjacent bullet screens allowed by the region of the bullet screen layer (that is, the second spacing) can be calculated:
Figure PCTCN2021117205-appb-000035
According to the relationship among the first spacing, the second spacing, and the preset spacing, the bullet screens arriving at a certain moment can be divided into three scenarios: a sparse bullet screen scene, a peak bullet screen scene, and an ultra-dense bullet screen scene.
Scenario 1: the sparse bullet screen scene
If
Figure PCTCN2021117205-appb-000036
the height of the video layer is sufficient to cover these bullet screens while satisfying the user experience metric d min; the bullet screens arriving at this moment are relatively sparse. From the parameters in FIG. 2C, the following first longitudinal formula can be used to calculate the first ordinate:
The first longitudinal formula is:
Figure PCTCN2021117205-appb-000037
where
Figure PCTCN2021117205-appb-000038
The optimal vertical interval of the bullet screens can be adjusted within the interval
Figure PCTCN2021117205-appb-000039
After d is determined, the first ordinate z i corresponding to bullet screen i can be calculated.
In the sparse bullet screen scene, there is no need to divert bullet screens to the two sides of the newly expanded bullet screen layer; all bullet screens only need to be aligned with the top of the video layer and distributed downward, as shown in FIG. 2D, where the slanted dashed area in the middle is the real-time coverage of the bullet screens.
Scenario 2: the peak bullet screen scene
In the case of
Figure PCTCN2021117205-appb-000040
the bullet screens arriving at the same moment are relatively dense. If all of them were distributed on the video layer, they would be too dense and the user experience would be poor; however, if the height of the bullet screen layer is sufficient to satisfy the user experience metric d min, diverting the bullet screens to the bullet screen layer can reduce the density and improve the user experience. From the parameters in FIG. 2C, the following second longitudinal formula can be used to calculate the second ordinate:
The second longitudinal formula is:
Figure PCTCN2021117205-appb-000041
where
Figure PCTCN2021117205-appb-000042
The optimal vertical interval of the bullet screens can be adjusted within the interval
Figure PCTCN2021117205-appb-000043
After d is determined, the second ordinate z i corresponding to bullet screen i can be calculated.
In the peak bullet screen scene, all bullet screens can be distributed from the exact center of the screen toward both sides in the vertical direction, until they reach the bullet screen layer on both sides, as shown in FIG. 2E, where the slanted dashed area in the middle is the real-time coverage of the bullet screens.
Scenario 3: the ultra-dense bullet screen scene
In the case of
Figure PCTCN2021117205-appb-000044
the bullet screens arriving at the same moment are so dense that even when they are diverted to the bullet screen layer, the height of the bullet screen layer still cannot disperse all of them vertically to the greatest extent. From the parameters in FIG. 2C, the following third longitudinal formula can be used to calculate the third ordinate:
The third longitudinal formula is:
Figure PCTCN2021117205-appb-000045
where
Figure PCTCN2021117205-appb-000046
The optimal vertical interval of the bullet screens can only take the maximum interval allowed by the bullet screen layer
Figure PCTCN2021117205-appb-000047
In the ultra-dense bullet screen scene, the bullet screens can only fill the entire height of the bullet screen layer at the expense of the interval between them, as shown in FIG. 2F, where the slanted dashed area in the middle is the real-time coverage of the bullet screens.
Referring to FIG. 2G, FIG. 2G is a schematic diagram of bullet screen distribution after the annular bullet screen layer is unrolled, disclosed in an embodiment of this application. As shown in FIG. 2G, the slanted dashed areas surrounded by the boundaries are the real-time coverage of the bullet screens in each area (the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area). A real-time bullet screen appears in the new bullet screen area, then moves to the main screen bullet screen area, and then moves to the historical bullet screen area where it disappears. When the bullet screens at a given moment are sparse, the "sparse bullet screens aligned with the top of the video" distribution is used; when they are dense, the "peak bullet screens expanding from the screen center to both sides" distribution is used.
As an optional implementation, when diverting real-time bullet screens vertically, features such as the number of likes and the height of each bullet screen can further be used for diversion. For example, bullet screens with more likes and smaller heights are concentrated at the center of the screen in the vertical direction, while bullet screens with fewer likes and larger heights are diverted to the two sides of the bullet screen layer. In this way, less valuable bullet screens are diverted to the sides and more valuable bullet screens are concentrated at the center of the video layer, optimizing the user experience.
S210: The electronic device refreshes and plays the real-time bullet screen on the bullet screen layer according to the three-dimensional coordinates.
In the method described in FIG. 2, by making full use of the VR virtual space, a ring-shaped transparent bullet screen layer is added at the preset orientation of the video layer so that the area of the bullet screen layer is larger than that of the video layer, realizing vertical expansion and diversion as well as horizontal expansion and extension of bullet screens in space, getting rid of the limitation of the screen of the electronic device, and allowing the bullet screen proportion to be greater than or equal to 100%. In addition, compared with the original bullet screen region, a new bullet screen area is expanded vertically, which can be used to vertically disperse peak-period bullet screens so that users can enjoy the content of high-energy video clips while experiencing peak bullet screens. At the same time, the bullet screen layer also expands new bullet screen areas horizontally: when the user turns his head to the right, he can preview newly generated bullet screens in the new bullet screen area, giving a sense of anticipation; when the user turns his head to the left, he can review historical bullet screens in the historical bullet screen area, giving a sense of nostalgia. Thus bullet screens can not only be diverted vertically, reducing peak bullet screen pressure and improving the viewing effect of video content, but the rendering forms of bullet screens are also enriched, improving user experience.
The foregoing are merely specific implementations of this application, but the protection scope of this application is not limited thereto. A person of ordinary skill in the art may make improvements without departing from the inventive concept of this application, and all such improvements fall within the protection scope of this application.
Referring to FIG. 3, FIG. 3 is a schematic structural diagram of an electronic device disclosed in an embodiment of this application. The electronic device 3 includes a memory 31, at least one processor 32, a computer program 33 stored in the memory 31 and runnable on the at least one processor 32, and at least one communication bus 34.
A person skilled in the art can understand that FIG. 3 is merely an example of the electronic device 3 and does not constitute a limitation on the electronic device 3; the electronic device may include more or fewer components than shown, or combine some components, or use different components. For example, the electronic device 3 may further include input/output devices, network access devices, and the like.
The at least one processor 32 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 32 may be a microprocessor or any conventional processor. The processor 32 is the control center of the electronic device 3 and uses various interfaces and lines to connect the various parts of the entire electronic device 3.
The memory 31 may be configured to store the computer program 33 and/or modules/units. The processor 32 implements the various functions of the electronic device 3 by running or executing the computer programs and/or modules/units stored in the memory 31 and invoking the data stored in the memory 31. The memory 31 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound play function or an image play function), and the like, and the data storage area may store data (such as audio data) created based on the use of the electronic device 3. In addition, the memory 31 may include non-volatile and volatile memory, such as a hard disk, internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one disk storage device, a flash memory device, or another storage device.
The electronic device described in this embodiment of this application can be used to implement some or all of the procedures in the method embodiment described with reference to FIG. 2 of this application; refer to the related descriptions in the embodiment of FIG. 2, which are not repeated here.
An embodiment of this application further provides a computer-readable storage medium storing instructions that, when run on a processor, implement the method procedure shown in FIG. 2.
An embodiment of this application further provides a computer program product that, when run on a processor, implements the method procedure shown in FIG. 2.
The steps of the method or algorithm described in connection with the disclosure of the embodiments of this application may be implemented in hardware or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from and write information to the storage medium. Certainly, the storage medium may also be an integral part of the processor. The processor and the storage medium may be located in an ASIC. In addition, the ASIC may be located in an electronic device. Certainly, the processor and the storage medium may also exist in the electronic device as discrete components.
A person of ordinary skill in the art can understand that all or some of the procedures in the methods of the foregoing embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the procedures of the foregoing method embodiments may be included. The foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (13)

  1. A bullet screen play method, wherein the method comprises:
    receiving a bullet screen enabling instruction for a target video, wherein the bullet screen enabling instruction instructs that bullet screens be played in a virtual reality VR bullet screen mode;
    in response to the bullet screen enabling instruction, drawing a ring-shaped transparent bullet screen layer at a preset orientation of a video layer of the target video, wherein a first height of the video layer is less than a second height of the bullet screen layer;
    obtaining bullet screen information of a real-time bullet screen of the target video;
    calculating, according to the bullet screen information, three-dimensional coordinates of the real-time bullet screen on the bullet screen layer; and
    refreshing and playing the real-time bullet screen on the bullet screen layer according to the three-dimensional coordinates.
  2. The bullet screen play method according to claim 1, wherein after the drawing of the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video, the method further comprises:
    dividing the bullet screen layer into a new bullet screen area, a main screen bullet screen area, and a historical bullet screen area, wherein each real-time bullet screen crosses the new bullet screen area, the main screen bullet screen area, and the historical bullet screen area at a constant speed;
    initializing a bullet screen set in the new bullet screen area;
    determining whether initialization of the bullet screen set in the new bullet screen area is complete; and
    if the initialization of the bullet screen set in the new bullet screen area is complete, performing the obtaining of the bullet screen information of the real-time bullet screen of the target video.
  3. The bullet screen play method according to claim 2, wherein the initializing of the bullet screen set in the new bullet screen area comprises:
    calculating an initialization time domain;
    obtaining the bullet screen set in the new bullet screen area within the initialization time domain;
    calculating initialization coordinates of each bullet screen in the bullet screen set; and
    displaying each bullet screen in the bullet screen set in the new bullet screen area according to the initialization coordinates.
  4. The bullet screen play method according to claim 3, wherein
    the initialization time domain is [0, Δt1], wherein
    Figure PCTCN2021117205-appb-100001
    the initialization coordinates are:
    Figure PCTCN2021117205-appb-100002
    wherein
    Figure PCTCN2021117205-appb-100003
    wherein v is a bullet screen movement speed, R is a radius of the bullet screen layer, α is an angle between an entrance of the main screen bullet screen area and a horizontal direction, β is an angle between a position where a bullet screen is generated and the horizontal direction, and t is a given moment.
  5. The bullet screen play method according to claim 2, wherein the bullet screen information comprises a bullet screen movement speed, a real-time play time, and a bullet screen height, and the calculating, according to the bullet screen information, of the three-dimensional coordinates of the real-time bullet screen on the bullet screen layer comprises:
    calculating, at the real-time play time and according to the bullet screen movement speed, a horizontal coordinate of the real-time bullet screen on a horizontal plane of the bullet screen layer; and
    calculating, at the real-time play time and according to the bullet screen height, an ordinate of the real-time bullet screen on a longitudinal plane of the bullet screen layer.
  6. The bullet screen play method according to claim 5, wherein the calculating, at the real-time play time and according to the bullet screen movement speed, of the horizontal coordinate of the real-time bullet screen on the horizontal plane of the bullet screen layer comprises:
    calculating, according to the bullet screen movement speed, a first time domain corresponding to the new bullet screen area, a second time domain corresponding to the main screen bullet screen area, and a third time domain corresponding to the historical bullet screen area;
    if the real-time play time is within the first time domain, calculating, according to a first horizontal formula, a first horizontal coordinate of the real-time bullet screen in the new bullet screen area; or
    if the real-time play time is within the second time domain, calculating, according to a second horizontal formula, a second horizontal coordinate of the real-time bullet screen in the main screen bullet screen area; or
    if the real-time play time is within the third time domain, calculating, according to a third horizontal formula, a third horizontal coordinate of the real-time bullet screen in the historical bullet screen area.
  7. The bullet screen play method according to claim 6, wherein
    the first time domain is [0, Δt1], wherein
    Figure PCTCN2021117205-appb-100004
    the first horizontal formula is:
    Figure PCTCN2021117205-appb-100005
    wherein
    Figure PCTCN2021117205-appb-100006
    the second time domain is:
    Figure PCTCN2021117205-appb-100007
    the second horizontal formula is:
    Figure PCTCN2021117205-appb-100008
    the third time domain is:
    Figure PCTCN2021117205-appb-100009
    the third horizontal formula is:
    Figure PCTCN2021117205-appb-100010
    wherein t is the real-time play time, v is the bullet screen movement speed, R is a radius of the bullet screen layer, α is an angle between an entrance of the main screen bullet screen area and a horizontal direction, β is an angle between a position where the real-time bullet screen is generated and the horizontal direction, W is a length of a main screen of the main screen bullet screen area, and
    Figure PCTCN2021117205-appb-100011
    is the first horizontal coordinate, the second horizontal coordinate, or the third horizontal coordinate.
  8. The bullet screen play method according to claim 5, wherein the calculating, at the real-time play time and according to the bullet screen height, of the ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer comprises:
    counting the number of real-time bullet screens within the real-time play time;
    calculating a first spacing according to the number, the bullet screen height of each real-time bullet screen, and the first height of the video layer;
    calculating a second spacing according to the number, the bullet screen height of each real-time bullet screen, and the second height of the bullet screen layer;
    if a preset spacing is less than or equal to the first spacing, calculating, according to a first longitudinal formula, a first ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer, wherein the preset spacing is a preset minimum spacing between two adjacent bullet screens that does not affect a viewing effect of the target video; or
    if the preset spacing is greater than or equal to the first spacing and less than or equal to the second spacing, calculating, according to a second longitudinal formula, a second ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer; or
    if the preset spacing is greater than the second spacing, calculating, according to a third longitudinal formula, a third ordinate of the real-time bullet screen on the longitudinal plane of the bullet screen layer.
  9. The bullet screen play method according to claim 8, wherein
    the first spacing is:
    Figure PCTCN2021117205-appb-100012
    the second spacing is:
    Figure PCTCN2021117205-appb-100013
    the first longitudinal formula is:
    Figure PCTCN2021117205-appb-100014
    wherein
    Figure PCTCN2021117205-appb-100015
    the second longitudinal formula is:
    Figure PCTCN2021117205-appb-100016
    wherein
    Figure PCTCN2021117205-appb-100017
    the third longitudinal formula is:
    Figure PCTCN2021117205-appb-100018
    wherein
    Figure PCTCN2021117205-appb-100019
    wherein N is the number of real-time bullet screens arriving at the real-time play time, h i is the bullet screen height of the i-th real-time bullet screen, h j is the bullet screen height of the j-th real-time bullet screen, H 0 is the first height, H max is the second height, d is the spacing between two adjacent real-time bullet screens, d min is the preset spacing, and z i is the first ordinate, the second ordinate, or the third ordinate.
  10. The bullet screen play method according to any one of claims 1 to 9, wherein the drawing, in response to the bullet screen enabling instruction, of the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video comprises:
    outputting a bullet screen layer edit page in response to the bullet screen enabling instruction;
    receiving bullet screen layer drawing parameters entered on the bullet screen layer edit page; and
    drawing the ring-shaped transparent bullet screen layer at the preset orientation of the video layer of the target video according to the bullet screen layer drawing parameters.
  11. An electronic device, comprising a processor and a memory, wherein the memory is configured to store instructions, and the processor is configured to invoke the instructions in the memory so that the electronic device performs the bullet screen play method according to any one of claims 1 to 10.
  12. A bullet screen play system, comprising an electronic device and a portable device, wherein the electronic device is configured to perform the bullet screen play method according to any one of claims 1 to 10.
  13. A computer-readable storage medium, wherein the computer-readable storage medium stores at least one instruction, and when the at least one instruction is executed by a processor, the bullet screen play method according to any one of claims 1 to 10 is implemented.
PCT/CN2021/117205 2020-09-22 2021-09-08 Bullet screen play method, related device, and storage medium WO2022062903A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21871275.0A EP4210337A4 (en) 2020-09-22 2021-09-08 METHOD FOR READING COMMENTS ON SCREEN, ASSOCIATED DEVICE AND STORAGE MEDIUM
US18/187,440 US20230232077A1 (en) 2020-09-22 2023-03-21 Bullet screen play method, related device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011003984.0 2020-09-22
CN202011003984.0A CN114257849B (zh) 2020-09-22 2020-09-22 Bullet screen play method, related device, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/187,440 Continuation US20230232077A1 (en) 2020-09-22 2023-03-21 Bullet screen play method, related device, and storage medium

Publications (1)

Publication Number Publication Date
WO2022062903A1 true WO2022062903A1 (zh) 2022-03-31

Family

ID=80788486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/117205 WO2022062903A1 (zh) 2020-09-22 2021-09-08 弹幕播放方法、相关设备及存储介质

Country Status (4)

Country Link
US (1) US20230232077A1 (zh)
EP (1) EP4210337A4 (zh)
CN (1) CN114257849B (zh)
WO (1) WO2022062903A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115022726A (zh) * 2022-05-09 2022-09-06 北京爱奇艺科技有限公司 Surround information generation and bullet screen display method, apparatus, device, and storage medium
WO2023193796A1 (zh) * 2022-04-08 2023-10-12 北京字跳网络技术有限公司 Virtual-reality-based bullet screen information display method and apparatus, and electronic device

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117197400A (zh) * 2022-05-31 2023-12-08 北京字跳网络技术有限公司 Information interaction method and apparatus, electronic device, and storage medium
CN117201855A (zh) * 2022-05-31 2023-12-08 北京字跳网络技术有限公司 Information display method and apparatus, terminal, and storage medium
CN114911396A (zh) * 2022-06-07 2022-08-16 深圳市天趣星空科技有限公司 Information processing method and system for smart glasses
WO2024088375A1 (zh) * 2022-10-28 2024-05-02 北京字跳网络技术有限公司 Bullet screen presentation method, apparatus, device, and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205038432U (zh) * 2015-08-21 2016-02-17 广州弹幕网络科技有限公司 Bullet screen cinema
CN105828164A (zh) * 2016-04-28 2016-08-03 武汉斗鱼网络科技有限公司 Method and system for avoiding overlapping display of bullet screens
CN106210861A (zh) * 2016-08-23 2016-12-07 上海幻电信息科技有限公司 Method and system for displaying bullet screens
CN107147941A (zh) * 2017-05-27 2017-09-08 努比亚技术有限公司 Bullet screen display method and apparatus for video playback, and computer-readable storage medium
CN107360459A (zh) * 2017-07-07 2017-11-17 腾讯科技(深圳)有限公司 Bullet screen processing method, apparatus, and storage medium
WO2018207045A1 (en) * 2017-05-10 2018-11-15 Within Unlimited, Inc. Placement and dynamic rendering of caption information in virtual reality video
CN110572704A (zh) * 2019-09-16 2019-12-13 腾讯科技(深圳)有限公司 Method, apparatus, device, and medium for controlling bullet screen play speed
CN110662099A (zh) * 2018-06-28 2020-01-07 北京京东尚科信息技术有限公司 Method and apparatus for displaying bullet screens
CN111586426A (zh) * 2020-04-30 2020-08-25 广州华多网络科技有限公司 Information display method, apparatus, device, and storage medium for panoramic live streaming

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303746A (zh) * 2016-08-17 2017-01-04 乐视控股(北京)有限公司 Bullet screen data processing method and apparatus
CN106454387B (zh) * 2016-10-31 2020-02-07 北京小米移动软件有限公司 Panoramic video bullet screen display method and apparatus
CN109429087B (zh) * 2017-06-26 2021-03-02 上海优土视真文化传媒有限公司 Display method, medium, and system for virtual reality video bullet screens
CN108696767B (зh) * 2018-05-15 2021-05-25 北京字节跳动网络技术有限公司 Bullet screen playback method, apparatus, computer-readable storage medium, and terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4210337A4

Also Published As

Publication number Publication date
US20230232077A1 (en) 2023-07-20
EP4210337A4 (en) 2024-02-21
CN114257849B (zh) 2023-06-02
EP4210337A1 (en) 2023-07-12
CN114257849A (zh) 2022-03-29

Similar Documents

Publication Publication Date Title
WO2022062903A1 (zh) Bullet screen playback method, related device, and storage medium
WO2021143564A1 (zh) Live streaming control method and apparatus, electronic device, live streaming system, and storage medium
US11012727B2 (en) Predictive content delivery for video streaming services
JP6558587B2 (ja) Information processing device, display device, information processing method, program, and information processing system
CN106331877B (zh) Bullet screen playback method and apparatus
AU2015231761B2 (en) Object tracking in zoomed video
CN112237012B (zh) Apparatus and method for controlling audio in multi-viewpoint omnidirectional content
US20170171274A1 (en) Method and electronic device for synchronously playing multiple-cameras video
US20120069143A1 (en) Object tracking and highlighting in stereoscopic images
US10271105B2 (en) Method for playing video, client, and computer storage medium
WO2023104102A1 (зh) Live streaming comment display method, apparatus, device, program product, and medium
US20240040211A1 (en) Methods, Systems, and Media For Presenting Interactive Elements Within Video Content
US20130147904A1 (en) Processing media streams during a multi-user video conference
US10261749B1 (en) Audio output for panoramic images
CN110505406A (зh) Background blurring method, apparatus, storage medium, and terminal
US20240196025A1 (en) Computer program, server device, terminal device, and method
CN110069230A (зh) Extended content display method, apparatus, and storage medium
WO2023226814A1 (зh) Video processing method, apparatus, electronic device, and storage medium
US20200220907A1 (en) Method, system, and non-transitory computer readable record medium for enhancing video quality of video call
WO2020125009A1 (зh) Video processing method and television
WO2020206647A1 (зh) Method and apparatus for controlling video content playback according to user movement
WO2017185645A1 (зh) Vertical full-screen playback method and apparatus, and mobile playback terminal
KR102593043B1 (ко) Augmented-reality-based display method, device, storage medium, and program product
US20130300823A1 (en) Stereo effect enhancement systems and methods
WO2021088973A1 (зh) Live stream display method, apparatus, electronic device, and readable storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 21871275
Country of ref document: EP
Kind code of ref document: A1
ENP Entry into the national phase
Ref document number: 2021871275
Country of ref document: EP
Effective date: 20230406
NENP Non-entry into the national phase
Ref country code: DE