CN109429087B - Virtual reality video barrage display method, medium, and system - Google Patents

Virtual reality video barrage display method, medium, and system

Info

Publication number
CN109429087B
CN109429087B (Application CN201710493899.9A)
Authority
CN
China
Prior art keywords
virtual reality
screen
reality video
bullet screen
pan
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710493899.9A
Other languages
Chinese (zh)
Other versions
CN109429087A (en)
Inventor
李艳萍 (Li Yanping)
许海林 (Xu Hailin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Youtushizhen Culture Media Co ltd
Original Assignee
Shanghai Youtushizhen Culture Media Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Youtushizhen Culture Media Co ltd filed Critical Shanghai Youtushizhen Culture Media Co ltd
Priority to CN201710493899.9A priority Critical patent/CN109429087B/en
Publication of CN109429087A publication Critical patent/CN109429087A/en
Application granted granted Critical
Publication of CN109429087B publication Critical patent/CN109429087B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a display method, a computer-readable medium, and a system for a virtual reality video bullet screen, which address the problems and defects that arise when the presentation form of the traditional video bullet screen is applied to virtual reality (VR) video. The technical scheme is as follows: the bullet screen is associated with time and space information, and both the position where the VR bullet screen appears on the screen and the position where it disappears are controlled based on the associated information.

Description

Virtual reality video barrage display method, medium, and system
Technical Field
The present invention relates to virtual reality (VR) technology, and in particular to a bullet-screen (barrage) comment display method and system for VR video.
Background
Bullet-screen ("barrage") commenting took off in China in 2007, represented by the video websites AcFun and Bilibili. The bullet screen is a novel model that creates simple interactivity with viewers: a viewer posts, in text, their impression of the video at a given moment, and that text then floats across the screen. The emergence of the bullet screen marks the video industry's entry into a brand-new stage and is a prototype of interactive video.
In recent years, virtual reality (VR) technology has developed vigorously and is widely used in the video field. VR video, also called VR panoramic video, is still video; the difference is that each frame of a VR video is a panoramic picture spanning 360 degrees horizontally and 180 degrees vertically. When watching, the user is no longer limited to a fixed viewing-angle range but can turn up, down, left, and right, choosing to watch content at any angle in the whole space. Watching with VR glasses further strengthens the realism and immersion of the experience. VR video subverts the viewing experience of ordinary video and injects vitality into the further development of interactive video.
While watching VR video, users likewise want to communicate with each other through text bullet screens. However, the traditional video bullet-screen presentation has the following problems:
1) A traditional video bullet screen carries only timeline information. Applied directly to VR video, it cannot handle the coexistence of time and spatial information.
2) The scene of conventional video is fixed relative to the screen, so the scene content corresponding to bullet-screen text sent at a given time is also fixed. The scene of a VR video, however, is 360 degrees, and different users are likely to see different scene content at the same moment. If a user posts bullet-screen text about what they are seeing, users who are not looking in that direction cannot know what the text specifically refers to, and communication through the bullet screen cannot achieve its best effect.
3) A conventional video bullet screen typically appears from one end of the screen and disappears from the other. Applied to VR video, the text can be cut off when viewing in split-screen mode with VR glasses, and ghosting (double images) can occur when the contents of the left and right screens are inconsistent.
Disclosure of Invention
The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
The invention provides a display method, a computer readable medium and a system of a virtual reality video barrage, and aims to solve the problems and defects generated when the expression form of the traditional video barrage is applied to a VR video.
The technical scheme of the invention is as follows. The invention discloses a display method for a virtual reality video bullet screen, comprising the following steps:
Step 1: when the bullet screen is sent, record the current virtual reality video playing time point and the Pan and Tilt values of the virtual reality video picture frame at the current screen center point, where the Pan value represents the frame's horizontal viewing-angle range and the Tilt value represents the frame's vertical viewing-angle range;
Step 2: control the bullet screen to appear at the position of the current screen center point;
Step 3: control the bullet screen to disappear at the position given by the Pan and Tilt values of the virtual reality video picture frame at the screen center point when the bullet screen was sent.
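The three steps above can be sketched in Python. This is a minimal illustration with hypothetical names; the patent specifies no data structures or APIs, only that the comment is associated with a playback time and the Pan/Tilt values at the screen center when it was sent:

```python
from dataclasses import dataclass

@dataclass
class BarrageRecord:
    """Time and space information recorded when a bullet-screen comment is sent."""
    text: str
    video_time: float  # VR video playing time point at send time (seconds)
    pan: float         # Pan of the frame at the sender's screen center (degrees)
    tilt: float        # Tilt of the frame at the sender's screen center (degrees)

def send_barrage(text: str, video_time: float, cur_pan: float, cur_tilt: float) -> BarrageRecord:
    """Step 1: associate the comment with the current playback time and the
    Pan/Tilt of the picture frame at the current screen center point."""
    return BarrageRecord(text, video_time, cur_pan, cur_tilt)

def barrage_positions(record: BarrageRecord, viewer_pan: float, viewer_tilt: float):
    """Steps 2 and 3: the comment appears at the viewer's current screen center
    and disappears at the Pan/Tilt position recorded when it was sent."""
    appear_at = (viewer_pan, viewer_tilt)    # current screen center point
    disappear_at = (record.pan, record.tilt) # send-time position on the panorama
    return appear_at, disappear_at
```

A viewer looking at Pan 90, Tilt 0 would thus see the comment appear at their own screen center and drift toward the sender's recorded direction.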
According to an embodiment of the display method of the virtual reality video bullet screen, the bullet screen is entered by keyboard typing or is automatically converted into text after voice input.
According to an embodiment of the display method of the virtual reality video bullet screen, in step 1 the Pan and Tilt values of the virtual reality video frame at the current screen center point are obtained by finger dragging.
According to an embodiment of the display method of the virtual reality video bullet screen, in step 1 the Pan and Tilt values of the virtual reality video frame at the current screen center point are obtained by gyroscope sensing.
The present invention also discloses a computer readable medium storing a computer readable program for performing the aforementioned steps.
The invention also discloses a computer system, which comprises a processor and a computer readable medium, wherein the computer readable medium stores a computer readable program, and the computer readable program runs in the processor to execute the steps.
The invention also discloses a display system of the virtual reality video barrage, which comprises:
the association module is used for recording the playing time point of the current virtual reality video when the barrage is sent and the Pan value and Tilt value of the virtual reality video frame at the center point of the current screen, wherein the Pan value represents the visual angle range of the frame in the horizontal direction, and the Tilt value represents the visual angle range of the frame in the vertical direction;
the bullet screen appearance control module is used for controlling the bullet screen to appear at the position of the current screen center point;
and the bullet screen disappearance control module is used for controlling the bullet screen to disappear at the positions of the Pan value and Tilt value of the virtual reality video picture frame at the central point of the current screen when the bullet screen is sent.
According to an embodiment of the display system of the virtual reality video barrage of the present invention, the system further includes:
and the bullet screen input module is used for automatically converting the bullet screen into characters after keyboard typing input or voice input.
According to an embodiment of the display system of the virtual reality video bullet screen, the association module obtains the Pan and Tilt values of the virtual reality video picture frame at the current screen center point by finger dragging.
According to an embodiment of the display system of the virtual reality video bullet screen, the association module obtains the Pan and Tilt values of the virtual reality video frame at the current screen center point by gyroscope sensing.
Compared with the prior art, the invention has the following beneficial effects: the invention associates the bullet screen with time and space information and, based on the associated information, controls both the position where the VR bullet screen appears on the screen and the position where it disappears. The traditional bullet-screen display method for non-VR video cannot be applied to VR video bullet-screen display, and no bullet-screen display for VR video exists in the prior art; the invention's pioneering design in this field allows the bullet screen to be displayed properly in VR video.
Drawings
The above features and advantages of the present disclosure will be better understood upon reading the detailed description of embodiments of the disclosure in conjunction with the following drawings. In the drawings, components are not necessarily drawn to scale, and components having similar relative characteristics or features may have the same or similar reference numerals.
Fig. 1 shows a flowchart of an embodiment of a display method of a virtual reality video bullet screen of the present invention.
FIG. 2 illustrates a schematic diagram of an embodiment of a virtual reality video bullet screen display system of the present invention.
Fig. 3 shows a schematic diagram of obtaining the Pan and Tilt values of the picture frame at the screen center point by finger dragging.
Fig. 4 and 5 show schematic views of controlling the position where the bullet screen appears and disappears on the screen.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. It is noted that the aspects described below in connection with the figures and the specific embodiments are only exemplary and should not be construed as imposing any limitation on the scope of the present invention.
Embodiment of display method of virtual reality video barrage
Fig. 1 shows a flow of an embodiment of a display method of a virtual reality video bullet screen of the present invention. Referring to fig. 1, the following is a detailed description of specific implementation steps of the display method of the present embodiment.
Step S1: and the VR barrage is associated with the time and space information.
A VR bullet screen generally refers to text bullet-screen information, typically entered through the user's keyboard or automatically converted into text after the user's voice input.
The association operation here refers to recording, when the bullet screen is sent, the current VR video playing time point and the Pan and Tilt values of the VR video frame at the current screen center point, where the Pan value represents the frame's horizontal viewing-angle range and the Tilt value represents the frame's vertical viewing-angle range.
There are many ways to achieve the above association; the invention gives two examples. The first is to obtain the Pan and Tilt values of the VR video picture frame at the current screen center point by finger dragging.
As shown in fig. 3, when a VR panorama is played (the panorama being a picture spanning 360 degrees horizontally and 180 degrees vertically), the Pan and Tilt values at the center of the playing window correspond one-to-one to Pan and Tilt values on the panorama. Denote the current Pan and Tilt values at the center of the playing window as CurPan and CurTilt. When a finger is pressed on the screen and moved by distances dx, dy, the corresponding movement dPan, dTilt on the panorama is known, and the new Pan and Tilt values at the center of the playing window, denoted NewPan and NewTilt, are:
NewPan=CurPan+dPan
NewTilt=CurTilt+dTilt
At the same time, the current Pan and Tilt values at the center of the playing window are updated:
CurPan=NewPan
CurTilt=NewTilt
A simple formula for calculating dPan and dTilt from dx and dy is:
dPan=dx*coefficient
dTilt=dy*coefficient
where coefficient is an empirical coefficient related to the pixel density of the phone screen, with an example value of 0.75.
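The drag-to-angle update above can be sketched in Python. The formulas come directly from the patent; the function name and the default coefficient of 0.75 (the patent's example value) are illustrative, and no wrap-around of Pan past 360 degrees is modeled because the patent does not specify it:

```python
COEFFICIENT = 0.75  # example value from the patent; depends on screen pixel density

def drag_to_pan_tilt(cur_pan: float, cur_tilt: float,
                     dx: float, dy: float,
                     coefficient: float = COEFFICIENT):
    """Convert a finger drag of (dx, dy) screen pixels into updated Pan/Tilt
    values for the center of the playing window:
        dPan  = dx * coefficient
        dTilt = dy * coefficient
        NewPan  = CurPan  + dPan
        NewTilt = CurTilt + dTilt
    """
    d_pan = dx * coefficient
    d_tilt = dy * coefficient
    return cur_pan + d_pan, cur_tilt + d_tilt
```

For example, a drag of 40 pixels right and 20 pixels up from Pan 100, Tilt 0 yields Pan 130, Tilt -15 with the 0.75 coefficient.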
A second example is to change the Pan and Tilt values of the picture frame at the screen center point by gyroscope sensing. A gyroscope, also called an angular-velocity sensor, differs from an accelerometer (G-sensor): the physical quantity it measures is the angular velocity of rotation during deflection or tilting. With only an accelerometer, a mobile phone cannot measure or reconstruct complete 3D motion; the G-sensor cannot measure rotational motion and can only detect axial linear motion. A gyroscope, however, measures rotation and deflection well, so the user's actual motion can be accurately analyzed and judged, and the phone can then be operated accordingly.
The gyroscope in a mobile phone is a micro-electromechanical gyroscope, also called a three-axis gyroscope; it simultaneously measures position, movement track, and acceleration in six directions.
When the mobile phone rotates, a corresponding matrix M is obtained according to the measured information of the mobile phone gyroscope:
M = [ m00  m01  m02 ]
    [ m10  m11  m12 ]
    [ m20  m21  m22 ]
The Euler angles corresponding to matrix M are calculated from the conversion relation between a rotation matrix and Euler angles:
Yaw=atan2(-m20,m00)
Pitch=asin(m10)
Roll=atan2(-m12,m11)
The Euler angles Yaw and Pitch correspond to the Pan and Tilt values on the panorama.
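The Euler-angle extraction above can be written as a short Python sketch. The zero-based m[row][col] indexing convention is assumed from the index names in the formulas (m20, m00, m10, m12, m11); the function name is hypothetical:

```python
import math

def euler_from_matrix(m):
    """Extract (yaw, pitch, roll) in radians from a 3x3 rotation matrix m,
    indexed m[row][col], using the formulas:
        Yaw   = atan2(-m20, m00)
        Pitch = asin(m10)
        Roll  = atan2(-m12, m11)
    Yaw and Pitch then map to Pan and Tilt on the panorama."""
    yaw = math.atan2(-m[2][0], m[0][0])
    pitch = math.asin(m[1][0])
    roll = math.atan2(-m[1][2], m[1][1])
    return yaw, pitch, roll
```

The identity matrix (no rotation) gives yaw = pitch = roll = 0, i.e. the screen center stays at its current Pan/Tilt.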
Step S2: and controlling the bullet screen to appear at the position of the center point of the current screen.
Step S3: and controlling the bullet screen to disappear at the positions of the Pan value and Tilt value of the virtual reality video picture frame at the center point of the current screen when the bullet screen is sent.
As shown in figs. 4 and 5, the bullet screen appears at the screen center point (point B in fig. 4, i.e., the center point of the screen through which the user is currently watching the VR video), drifts to the position where the text was sent (point A in fig. 4), and disappears after pausing for a few seconds. While the bullet screen drifts, because the VR video content is 360-degree and multi-angle, the user's gaze also moves from the screen center along the drift direction. The position where the bullet screen stops drifting is the position where it was sent, and the bullet-screen content matches the content of the picture frame there. Through this display mode, a viewer can trace the content a comment refers to, satisfying the needs of accurately acquiring information and of exchanging information among users with the same interests.
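The appear-drift-pause-disappear behavior can be sketched as a position function of time. The linear interpolation, the drift_time and hold_time values, and all names below are illustrative assumptions; the patent states only that the comment drifts from the screen center to the send position and disappears after a pause of a few seconds:

```python
def barrage_drift_position(appear, target, t, drift_time=3.0, hold_time=2.0):
    """Pan/Tilt position of the comment at time t seconds after it appears.
    It drifts linearly from the screen-center point `appear` (point B) to the
    send position `target` (point A), pauses there, then disappears (None).
    drift_time and hold_time are assumed values, not from the patent."""
    pan0, tilt0 = appear
    pan1, tilt1 = target
    if t < drift_time:
        f = t / drift_time  # fraction of the drift completed
        return (pan0 + f * (pan1 - pan0), tilt0 + f * (tilt1 - tilt0))
    if t < drift_time + hold_time:
        return (pan1, tilt1)  # "disappears after a few seconds of pause"
    return None  # the comment has disappeared at its send position
```

Halfway through a 3-second drift from (0, 0) toward (90, 30), the comment sits at (45, 15); after the hold it returns None, meaning it is no longer rendered.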
Another benefit of the bullet screen emerging from the screen center point: when viewing in split-screen mode with VR glasses, the same bullet screen appears from the center of the left and right screens at the same time, moves in the same direction, and disappears at the same time. Its movement stays within each screen's area, so there is no ghosting under binocular viewing.
Embodiments of a display system for a virtual reality video bullet screen
Fig. 2 illustrates the principles of an embodiment of the virtual reality video bullet screen display system of the present invention. Referring to fig. 2, the display system of the present embodiment includes: the system comprises a correlation module 1, a bullet screen appearance control module 2 and a bullet screen disappearance control module 3.
The association module 1 associates the VR bullet screen with time and space information; that is, when the bullet screen is sent, it records the current virtual reality video playing time point and the Pan and Tilt values of the virtual reality video frame at the current screen center point, where the Pan value represents the frame's horizontal viewing-angle range and the Tilt value represents the frame's vertical viewing-angle range.
In addition, the system includes a bullet-screen input module 4. A VR bullet screen generally refers to text bullet-screen information, typically entered through the user's keyboard or automatically converted into text after the user's voice input.
There are many ways to achieve the above association; the invention gives two examples. The first is to obtain the Pan and Tilt values of the VR video picture frame at the current screen center point by finger dragging.
As shown in fig. 3, when a VR panorama is played (the panorama being a picture spanning 360 degrees horizontally and 180 degrees vertically), the Pan and Tilt values at the center of the playing window correspond one-to-one to Pan and Tilt values on the panorama. Denote the current Pan and Tilt values at the center of the playing window as CurPan and CurTilt. When a finger is pressed on the screen and moved by distances dx, dy, the corresponding movement dPan, dTilt on the panorama is known, and the new Pan and Tilt values at the center of the playing window, denoted NewPan and NewTilt, are:
NewPan=CurPan+dPan
NewTilt=CurTilt+dTilt
At the same time, the current Pan and Tilt values at the center of the playing window are updated:
CurPan=NewPan
CurTilt=NewTilt
A simple formula for calculating dPan and dTilt from dx and dy is:
dPan=dx*coefficient
dTilt=dy*coefficient
where coefficient is an empirical coefficient related to the pixel density of the phone screen, with an example value of 0.75.
A second example is to change the Pan and Tilt values of the picture frame at the screen center point by gyroscope sensing. A gyroscope, also called an angular-velocity sensor, differs from an accelerometer (G-sensor): the physical quantity it measures is the angular velocity of rotation during deflection or tilting. With only an accelerometer, a mobile phone cannot measure or reconstruct complete 3D motion; the G-sensor cannot measure rotational motion and can only detect axial linear motion. A gyroscope, however, measures rotation and deflection well, so the user's actual motion can be accurately analyzed and judged, and the phone can then be operated accordingly.
The gyroscope in a mobile phone is a micro-electromechanical gyroscope, also called a three-axis gyroscope; it simultaneously measures position, movement track, and acceleration in six directions.
When the mobile phone rotates, a corresponding matrix M is obtained according to the measured information of the mobile phone gyroscope:
M = [ m00  m01  m02 ]
    [ m10  m11  m12 ]
    [ m20  m21  m22 ]
The Euler angles corresponding to matrix M are calculated from the conversion relation between a rotation matrix and Euler angles:
Yaw=atan2(-m20,m00)
Pitch=asin(m10)
Roll=atan2(-m12,m11)
The Euler angles Yaw and Pitch correspond to the Pan and Tilt values on the panorama.
The bullet-screen appearance control module 2 controls the bullet screen to appear at the position of the current screen center point.
The bullet-screen disappearance control module 3 controls the bullet screen to disappear at the position given by the Pan and Tilt values of the virtual reality video picture frame at the screen center point when the bullet screen was sent.
As shown in figs. 4 and 5, the bullet screen appears at the screen center point (point B in fig. 4, i.e., the center point of the screen through which the user is currently watching the VR video), drifts to the position where the text was sent (point A in fig. 4), and disappears after pausing for a few seconds. While the bullet screen drifts, because the VR video content is 360-degree and multi-angle, the user's gaze also moves from the screen center along the drift direction. The position where the bullet screen stops drifting is the position where it was sent, and the bullet-screen content matches the content of the picture frame there. Through this display mode, a viewer can trace the content a comment refers to, satisfying the needs of accurately acquiring information and of exchanging information among users with the same interests.
Another benefit of the bullet screen emerging from the screen center point: when viewing in split-screen mode with VR glasses, the same bullet screen appears from the center of the left and right screens at the same time, moves in the same direction, and disappears at the same time. Its movement stays within each screen's area, so there is no ghosting under binocular viewing.
Furthermore, the invention also discloses a computer-readable medium, on which a computer-readable program is stored, which program is adapted to carry out the method steps as described in the foregoing.
The invention also discloses a computer system, which comprises a processor and a computer readable medium, wherein the computer readable medium is stored with a computer readable program, the processor reads the computer readable program on the medium, and the program is operated to execute the steps of the method.
While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of acts, it is to be understood and appreciated that the methodologies are not limited by the order of acts, as some acts may, in accordance with one or more embodiments, occur in different orders and/or concurrently with other acts from that shown and described herein or not shown and described herein, as would be understood by one skilled in the art.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk (disk) and disc (disc), as used herein, includes Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks (disks) usually reproduce data magnetically, while discs (discs) reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for displaying a virtual reality video bullet screen, comprising the following steps:
Step 1: when the bullet screen is sent, recording the current playing time point of the virtual reality video and the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point, wherein the Pan value represents the viewing angle range of the picture frame in the horizontal direction and the Tilt value represents the viewing angle range of the picture frame in the vertical direction;
Step 2: controlling the bullet screen to appear at the center point of the screen at which the user is currently viewing the virtual reality video; and
Step 3: controlling the bullet screen to disappear at the position indicated by the Pan value and Tilt value of the virtual reality video picture frame recorded at the screen center point when the bullet screen was sent.
2. The method for displaying a virtual reality video bullet screen according to claim 1, wherein the bullet screen is automatically converted into text after keyboard typing input or voice input.
3. The method for displaying a virtual reality video bullet screen according to claim 1, wherein in step 1, the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point are obtained by means of a finger-drag gesture.
4. The method for displaying a virtual reality video bullet screen according to claim 1, wherein in step 1, the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point are obtained by means of gyroscope sensing.
5. A computer-readable medium, characterized in that it stores a computer-readable program adapted to carry out the steps of the method according to any one of claims 1 to 4.
6. A computer system, comprising a processor and a computer-readable medium storing a computer-readable program, the computer-readable program running on the processor to perform the steps of the method according to any one of claims 1 to 4.
7. A display system for a virtual reality video bullet screen, comprising:
an association module, for recording, when the bullet screen is sent, the current playing time point of the virtual reality video and the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point, wherein the Pan value represents the viewing angle range of the picture frame in the horizontal direction and the Tilt value represents the viewing angle range of the picture frame in the vertical direction;
a bullet screen appearance control module, for controlling the bullet screen to appear at the center point of the screen at which the user is currently viewing the virtual reality video; and
a bullet screen disappearance control module, for controlling the bullet screen to disappear at the position indicated by the Pan value and Tilt value of the virtual reality video picture frame recorded at the screen center point when the bullet screen was sent.
8. The display system for a virtual reality video bullet screen according to claim 7, further comprising:
a bullet screen input module, for automatically converting the bullet screen into text after keyboard typing input or voice input.
9. The display system for a virtual reality video bullet screen according to claim 7, wherein in the association module, the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point are obtained by means of a finger-drag gesture.
10. The display system for a virtual reality video bullet screen according to claim 7, wherein in the association module, the Pan value and Tilt value of the virtual reality video picture frame at the current screen center point are obtained by means of gyroscope sensing.
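As an informal illustration (not part of the claims), the three claimed steps can be sketched in Python: record the playing time point and the Pan/Tilt of the screen center when the bullet screen is sent, show the bullet screen at the viewer's current screen center, and make it disappear at the recorded Pan/Tilt position. All names here (`Barrage`, `barrage_screen_position`, the `lifetime` and `fov` parameters, and the linear drift from center to recorded position) are assumptions made for this sketch, not details taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Barrage:
    """A bullet-screen comment tied to a video time point and a view direction."""
    text: str
    send_time: float  # virtual reality video playing time point (seconds)
    pan: float        # horizontal view angle of the screen center when sent (degrees)
    tilt: float       # vertical view angle of the screen center when sent (degrees)

def record_barrage(text, play_time, view_pan, view_tilt):
    """Step 1: record the playing time point and the Pan/Tilt values
    of the picture frame at the current screen center."""
    return Barrage(text, play_time, view_pan, view_tilt)

def barrage_screen_position(barrage, play_time, viewer_pan, viewer_tilt,
                            lifetime=4.0, fov=90.0):
    """Steps 2-3: the bullet screen appears at the viewer's current screen
    center and drifts toward the recorded Pan/Tilt position, where it
    disappears.  Returns (x, y) in normalized screen coordinates relative to
    the screen center, or None once the bullet screen is gone.
    (The lifetime, fov, and linear interpolation are hypothetical choices.)"""
    age = play_time - barrage.send_time
    if age < 0 or age > lifetime:
        return None
    t = age / lifetime  # 0 -> screen center, 1 -> recorded position
    # Angular offset of the recorded direction from the viewer's current center
    # (wrap-around at 360 degrees is ignored in this simplified sketch).
    dpan = barrage.pan - viewer_pan
    dtilt = barrage.tilt - viewer_tilt
    # Map the angular offset into normalized screen coordinates via the field of view.
    x = t * dpan / fov
    y = t * dtilt / fov
    return (x, y)
```

For example, a bullet screen sent at playing time 10.0 s while the sender looked at Pan 30, Tilt 5 first appears at another viewer's screen center and, by the end of its lifetime, sits at the offset corresponding to Pan 30, Tilt 5 before vanishing.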
CN201710493899.9A 2017-06-26 2017-06-26 Virtual reality video barrage display method, medium, and system Active CN109429087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710493899.9A CN109429087B (en) 2017-06-26 2017-06-26 Virtual reality video barrage display method, medium, and system


Publications (2)

Publication Number Publication Date
CN109429087A CN109429087A (en) 2019-03-05
CN109429087B true CN109429087B (en) 2021-03-02

Family

ID=65498817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710493899.9A Active CN109429087B (en) 2017-06-26 2017-06-26 Virtual reality video barrage display method, medium, and system

Country Status (1)

Country Link
CN (1) CN109429087B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113542846B (en) * 2020-04-21 2022-12-23 上海哔哩哔哩科技有限公司 AR barrage display method and device
CN114201039B (en) * 2020-09-18 2023-08-29 聚好看科技股份有限公司 Display device for realizing virtual reality
CN114257849B (en) * 2020-09-22 2023-06-02 华为技术有限公司 Barrage playing method, related equipment and storage medium
CN114374882A (en) * 2021-12-23 2022-04-19 咪咕文化科技有限公司 Barrage information processing method and device, terminal and computer-readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104575553A (en) * 2014-12-19 2015-04-29 百度时代网络技术(北京)有限公司 Method and device for generating barrage overlaid on playing object
CN105916001A (en) * 2016-05-12 2016-08-31 乐视控股(北京)有限公司 Video barrage display method and device
KR20170042258A (en) * 2016-11-10 2017-04-18 김영덕 System for providing shopping information based on augmented reality and control method thereof
CN106658146A (en) * 2016-12-28 2017-05-10 上海翌创网络科技股份有限公司 Bullet screen method based on virtual reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6486908B1 (en) * 1998-05-27 2002-11-26 Industrial Technology Research Institute Image-based method and system for building spherical panoramas


Also Published As

Publication number Publication date
CN109429087A (en) 2019-03-05

Similar Documents

Publication Publication Date Title
CN109429087B (en) Virtual reality video barrage display method, medium, and system
CN106331732B (en) Generate, show the method and device of panorama content
US9407964B2 (en) Method and system for navigating video to an instant time
EP2870771B1 (en) Augmentation of multimedia consumption
US10015527B1 (en) Panoramic video distribution and viewing
US20180018944A1 (en) Automated object selection and placement for augmented reality
CN111010510B (en) Shooting control method and device and electronic equipment
KR102575230B1 (en) Remote controlling apparatus, and method for operating the same
US11107195B1 (en) Motion blur and depth of field for immersive content production systems
US20180160194A1 (en) Methods, systems, and media for enhancing two-dimensional video content items with spherical video content
US20170201808A1 (en) System and method of broadcast ar layer
US20130260360A1 (en) Method and system of providing interactive information
US9854229B2 (en) Three-dimensional image processing apparatus and method for adjusting location of sweet spot for displaying multi-view image
CN110463195A (en) Method and apparatus for rendering timing text and figure in virtual reality video
CN102763061A (en) Systems and methods for navigating a three-dimensional media guidance application
US11094105B2 (en) Display apparatus and control method thereof
CN102780893A (en) Image processing apparatus and control method thereof
CN102945563A (en) Showing and interacting system and method for panoramic videos
US20180005440A1 (en) Universal application programming interface for augmented reality
WO2017032336A1 (en) System and method for capturing and displaying images
JP6147966B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
US11119567B2 (en) Method and apparatus for providing immersive reality content
CN203607077U (en) Virtual traveling machine
CN104737526A (en) Method and apparatus for recording video sequences
JP5660573B2 (en) Display control apparatus, display control method, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant