WO2024114461A1 - Video playback method and device therefor - Google Patents

Video playback method and device therefor

Info

Publication number
WO2024114461A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
information
barrage
bullet
display
Prior art date
Application number
PCT/CN2023/133105
Other languages
English (en)
French (fr)
Inventor
孙鑫
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2024114461A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Definitions

  • the present application belongs to the technical field of electronic equipment, and specifically relates to a video playback method and device thereof.
  • in the related art, when a video application is switched to background playback, it can be displayed in the form of a small window.
  • however, because the display size is limited, the displayed content is incomplete, resulting in low efficiency of video information transmission.
  • the purpose of the embodiments of the present application is to provide a video playback method and device thereof, which can solve the problem of low efficiency of video information transmission.
  • an embodiment of the present application provides a video playback method, the method comprising: displaying a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video; and, when the target video is switched to background playback, displaying, on the upper layer of the current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
  • an embodiment of the present application provides a video playback device, the device comprising:
  • a first display module used to display a video playback interface for playing a target video; the video playback interface includes bullet screen information of the target video;
  • the second display module is used to display a video playback window for playing the target video and a barrage display window for displaying barrage information of the target video on the upper layer of the current foreground running interface when the target video is switched to background playback.
  • an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the method described in the first aspect.
  • an embodiment of the present application provides a readable storage medium, on which a program or instruction is stored, and when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
  • an embodiment of the present application provides a chip, the chip comprising a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the method described in the first aspect.
  • an embodiment of the present application provides a computer program product, which is stored in a storage medium and is executed by at least one processor to implement the method described in the first aspect.
  • in the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. The target video is thus played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
  • FIG. 1 is a flow chart of a video playback method provided in an embodiment of the present application;
  • FIG. 2 is the first application scenario diagram of the video playback method provided in an embodiment of the present application;
  • FIG. 3 is the second application scenario diagram of the video playback method provided in an embodiment of the present application;
  • FIG. 4 is the third application scenario diagram of the video playback method provided in an embodiment of the present application;
  • FIG. 5 is the fourth application scenario diagram of the video playback method provided in an embodiment of the present application;
  • FIG. 6 is a structural diagram of a video playback device provided in an embodiment of the present application;
  • FIG. 7 is a structural diagram of an electronic device provided in an embodiment of the present application;
  • FIG. 8 is a hardware structure diagram of an electronic device provided in an embodiment of the present application.
  • the terms "first", "second", etc. in the specification and claims of this application are used to distinguish similar objects rather than to describe a specific order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in an order other than the one illustrated or described here. Objects distinguished by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object can be one or more.
  • "and/or" in the specification and claims represents at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
  • an embodiment of the present application provides a video playback method. Please refer to FIG. 1, which is a flow chart of the video playback method provided by an embodiment of the present application.
  • the video playback method provided by the present application includes the following steps:
  • the video playback method provided in this embodiment can be applied to an electronic device, and a video playback interface is displayed on the display interface of the electronic device, wherein the video playback interface is used to play the target video and display the barrage information of the target video.
  • optionally, the above-mentioned bullet screen information can be displayed on the video playback interface in the form of subtitles.
  • optionally, the bullet screen information can be displayed on the video playback interface in a scrolling manner.
  • optionally, the video playback interface includes a window for displaying the bullet screen information. This step does not limit the specific display manner of the bullet screen information.
  • when the target video is switched to background playback, a video playback window for playing the target video and a bullet screen display window for displaying bullet screen information of the target video are displayed on the upper layer of the current foreground running interface.
  • when the user switches from the video playback interface to another interface, the switched-to interface is determined as the foreground running interface, and the target video is switched to background playback. In this case, the video playback window and the bullet screen display window are displayed on the upper layer of the foreground running interface.
  • the above-mentioned video playback window is used to play the target video
  • the above-mentioned barrage display window is used to display the barrage information of the target video.
  • the label D1 in Figure 2 is used to represent the video playback window, and the label D2 is used to represent the barrage display window.
  • the video playback window D1 plays the target video
  • the barrage display window D2 displays the barrage information related to the target video, namely "Bullet screen 1", "Bullet screen 2", "Bullet screen 3" and "Bullet screen 4" in Figure 2.
  • in the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. The target video is thus played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
  • optionally, when the target video is switched to background playback, the step of displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video includes: receiving a first input to the video playback window; and, in response to the first input, displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video.
  • in this embodiment, when the target video is switched to background playback, a video playback window is displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video.
  • when a first input to the video playback window is received, in response to the first input, a bullet screen display window is displayed on the upper layer of the current foreground running interface, wherein the bullet screen display window is used to display the bullet screen information of the target video.
  • the pointing object of the above-mentioned first input is the video playback window.
  • the above-mentioned first input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • optionally, when the target video is switched to background playback, the step of displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video includes: determining the second display parameter of the bullet screen display window according to the first display parameter of the video playback window; and displaying, according to the second display parameter, the bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
  • in this embodiment, when the target video is switched to background playback, the first display parameter of the video playback window displayed on the foreground running interface is obtained, the second display parameter of the bullet screen display window is determined based on the first display parameter, and the bullet screen display window is then displayed accordingly.
  • the first display parameter includes at least one of the following: a first display size, a first display position; and the second display parameter includes at least one of the following: a second display size, a second display position.
  • An optional implementation is that, when the display parameters include display size and display position, the first display size is determined as the second display size, that is, the display size of the video playback window is the same as the display size of the bullet screen display window.
  • the second display position is determined based on the first display position, and the video playback window and the bullet screen display window are set as adjacent windows.
  • Another optional implementation is that, when the display parameters include display size and display position, the product of the first display size and 0.8 is determined as the second display size, that is, the video playback window is reduced by 20% as a whole, and the display size of the video playback window after the reduction by 20% is determined as the second display size.
  • the second display position is determined based on the first display position, and the video playback window and the bullet screen display window are set as adjacent windows.
  • the display size and display position of the bullet screen display window may also be determined through other implementations, which are not specifically limited here.
  • the second display parameter of the bullet screen display window is determined by the first display parameter of the video playback window, so as to adjust the display size and/or display position of the bullet screen display window.
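  • As an illustration of the two optional implementations above, the following Kotlin sketch derives the bullet screen window's display parameters from the video playback window's parameters; the type and function names (DisplayParams, deriveBulletScreenParams) and the placement directly below the video window are assumptions for illustration, not part of the disclosed embodiment.
```kotlin
// Illustrative sketch only: derives the bullet screen window's display parameters
// from the video playback window's parameters, as described above.

data class DisplayParams(val width: Int, val height: Int, val x: Int, val y: Int)

// scale = 1.0 reproduces the first implementation (same size);
// scale = 0.8 reproduces the second one (window reduced by 20% as a whole).
fun deriveBulletScreenParams(video: DisplayParams, scale: Double = 0.8): DisplayParams {
    val width = (video.width * scale).toInt()
    val height = (video.height * scale).toInt()
    // Place the bullet screen window directly below the video window so the
    // two windows are adjacent (one possible reading of "adjacent windows").
    return DisplayParams(width, height, x = video.x, y = video.y + video.height)
}

fun main() {
    val videoWindow = DisplayParams(width = 960, height = 540, x = 40, y = 200)
    println(deriveBulletScreenParams(videoWindow))        // 20% smaller, below the video
    println(deriveBulletScreenParams(videoWindow, 1.0))   // same size, below the video
}
```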
  • optionally, the method further comprises: receiving a second input to the bullet screen display window; in response to the second input, determining a target transparency; and updating, according to the target transparency, the display of the bullet screen information of the target video in the bullet screen display window.
  • the pointing object of the above-mentioned second input is the bullet screen display window.
  • the above-mentioned second input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • the target transparency is determined based on the second input to the bullet screen display window, and the transparency of the bullet screen information displayed in the bullet screen display window is adjusted to the target transparency, so that the transparency of the bullet screen information can be flexibly adjusted based on the second input.
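  • A minimal Kotlin sketch of the transparency adjustment described above, assuming the second input yields a value that is clamped to [0, 1] and applied to every bullet screen item currently shown; the BulletScreenItem type and function name are hypothetical.
```kotlin
// Illustrative sketch: applies a target transparency, derived from the second input,
// to the bullet screen items shown in the bullet screen display window.

data class BulletScreenItem(val text: String, val alpha: Float = 1.0f)

fun applyTargetTransparency(items: List<BulletScreenItem>, targetAlpha: Float): List<BulletScreenItem> {
    val clamped = targetAlpha.coerceIn(0f, 1f)   // keep the value in a valid range
    return items.map { it.copy(alpha = clamped) }
}

fun main() {
    val shown = listOf(BulletScreenItem("Bullet screen 1"), BulletScreenItem("Bullet screen 2"))
    // e.g. the second input is a slide that maps to 40% opacity
    println(applyTargetTransparency(shown, 0.4f))
}
```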
  • optionally, the method further comprises: receiving a third input to the bullet screen display window; in response to the third input, determining a target bullet screen progress; and displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the target bullet screen progress.
  • the pointing object of the above-mentioned third input is the bullet screen display window.
  • the above-mentioned third input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • the target bullet screen progress is determined based on the third input to the bullet screen display window, the bullet screen information is located according to the target bullet screen progress, and the bullet screen information is displayed in the bullet screen display window.
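  • The following Kotlin sketch illustrates one way to locate bullet screen information from a target bullet screen progress: each comment carries a timestamp, and the window shows the comments that fall within a short interval starting at the target position. The data model and the 10-second interval are illustrative assumptions.
```kotlin
// Illustrative sketch: selects the bullet screen information to show for a target
// bullet screen progress chosen via the third input (e.g. dragging a progress bar).

data class TimedBulletScreen(val atSecond: Int, val text: String)

fun bulletScreensAt(all: List<TimedBulletScreen>, targetSecond: Int, windowSeconds: Int = 10): List<TimedBulletScreen> =
    all.filter { it.atSecond in targetSecond until targetSecond + windowSeconds }
        .sortedBy { it.atSecond }

fun main() {
    val all = listOf(
        TimedBulletScreen(5, "Bullet screen 1"),
        TimedBulletScreen(62, "Bullet screen 2"),
        TimedBulletScreen(65, "Bullet screen 3")
    )
    // The third input seeks the bullet screen progress to 01:00; show comments from 60s to 70s.
    println(bulletScreensAt(all, targetSecond = 60))
}
```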
  • optionally, the method further comprises: receiving a fourth input to the bullet screen display window; and, in response to the fourth input, canceling the display of the bullet screen display window.
  • the fourth input points to the bullet screen display window.
  • the fourth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • in this embodiment, when a fourth input to the bullet screen display window is received, the display of the bullet screen display window is canceled, and only the video playback window remains displayed on the upper layer of the foreground running interface.
  • optionally, the method further comprises: receiving a fifth input to the bullet screen display window; in response to the fifth input, displaying a bullet screen editing area; receiving a sixth input in the bullet screen editing area; in response to the sixth input, determining input information; and displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
  • the pointing object of the fifth input is the bullet screen display window.
  • the fifth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • in this embodiment, when the fifth input to the bullet screen display window is received, the bullet screen editing area is displayed.
  • An optional implementation is to display the bullet screen editing area in the bullet screen display window; another optional implementation is to display the bullet screen editing area outside the bullet screen display window.
  • the sixth input points to the bullet screen editing area.
  • the sixth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • in this embodiment, when the sixth input in the bullet screen editing area is received, the input information corresponding to the sixth input is determined, and the bullet screen information corresponding to the input information is displayed in the bullet screen display window.
  • for ease of understanding, please refer to FIG. 3. The application scenario shown in FIG. 3 includes a video playback window and a bullet screen display window, and a bullet screen editing area D21 is displayed in the bullet screen display window. Based on the user's sixth input in the bullet screen editing area D21, the input information is determined, and the bullet screen information corresponding to the input information, namely "Haha", is then displayed in the bullet screen display window.
  • in this embodiment, the user displays the bullet screen editing area through the fifth input, and then displays bullet screen information in the bullet screen display window through the sixth input in the bullet screen editing area, thereby sending bullet screen information of the target video in this manner and adding a way for the user to interact with the target video.
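  • A small Kotlin sketch of the fifth-input/sixth-input flow above: the fifth input opens the editing area, the sixth input supplies the input information, and the corresponding bullet screen information is appended to what the bullet screen display window shows; all names are illustrative, not from the disclosure.
```kotlin
// Illustrative sketch of the editing-area flow: open the editing area on the fifth
// input, take the sixth input's text as the input information, and display the
// corresponding bullet screen information in the bullet screen display window.

class BulletScreenWindow {
    val shown = mutableListOf<String>()
    var editingAreaVisible = false
        private set

    fun onFifthInput() {                           // e.g. tapping the window
        editingAreaVisible = true
    }

    fun onSixthInput(inputInformation: String) {   // e.g. confirming text in the editing area
        require(editingAreaVisible) { "editing area is not open" }
        shown += inputInformation                  // the new comment, e.g. "Haha"
        editingAreaVisible = false
    }
}

fun main() {
    val window = BulletScreenWindow()
    window.onFifthInput()
    window.onSixthInput("Haha")
    println(window.shown)   // [Haha]
}
```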
  • optionally, the receiving of a fifth input to the bullet screen display window includes: receiving a fifth input to first bullet screen information in the bullet screen display window;
  • the step of displaying the bullet screen information of the target video corresponding to the input information in the bullet screen display window includes:
  • the second barrage information corresponding to the input information is displayed in the barrage display window; the second barrage information includes the input information and the barrage information identifier corresponding to the first barrage information.
  • the barrage display window displays at least one barrage information, and each barrage information includes a corresponding barrage information identifier.
  • the barrage information identifier is used to represent the publishing object of the corresponding barrage information.
  • in this embodiment, when a fifth input to a piece of bullet screen information in the bullet screen display window is received, that bullet screen information is determined as the first bullet screen information, and the bullet screen input area is then displayed. The input information is determined based on the user's sixth input in the bullet screen input area, wherein the input information includes, but is not limited to, text information, picture information or other types of information. Further, the second bullet screen information corresponding to the input information is displayed in the bullet screen display window, wherein the second bullet screen information includes the input information and the bullet screen information identifier corresponding to the first bullet screen information.
  • for ease of understanding, please refer to FIG. 4. In the application scenario shown in FIG. 4, based on the user's fifth input in the bullet screen display window, the first bullet screen information is determined to be "Bullet screen 1", and the publishing object "User 1" corresponding to "Bullet screen 1" is determined as the bullet screen information identifier. Based on the user's sixth input in the bullet screen editing area, the input information is determined, and the second bullet screen information is then displayed in the bullet screen display window; its content includes "Reply to User 1: You are right".
  • the user determines the barrage information to be replied by executing the fifth input on the first barrage information, and then generates the second barrage information displayed in the barrage display area based on the sixth input to the barrage editing area, wherein the second barrage information is the user's reply content to the first barrage information, thereby realizing the reply to the barrage information and increasing the interaction between the user and the barrage information.
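  • The reply behaviour described above can be sketched in Kotlin as composing the second bullet screen information from the input information plus the identifier of the first bullet screen information; the "Reply to X: Y" format mirrors the FIG. 4 example, and the types and names are assumptions.
```kotlin
// Illustrative sketch: building the second bullet screen information from the
// input information and the identifier (publishing object) of the first one.

data class BulletScreenInfo(val publisher: String, val content: String)

fun buildReply(first: BulletScreenInfo, inputInformation: String): BulletScreenInfo =
    // The identifier carried over is the publishing object of the replied-to comment.
    BulletScreenInfo(publisher = "me", content = "Reply to ${first.publisher}: $inputInformation")

fun main() {
    val first = BulletScreenInfo(publisher = "User 1", content = "Bullet screen 1")
    println(buildReply(first, "You are right"))
    // BulletScreenInfo(publisher=me, content=Reply to User 1: You are right)
}
```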
  • optionally, the method further includes: receiving a seventh input to third bullet screen information in the bullet screen display window; in response to the seventh input, displaying, in the bullet screen display window, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information; and displaying, in the first display area, the bullet screen information published by the first bullet screen publishing object.
  • the object pointed to by the seventh input is the barrage information in the barrage display window, and the barrage information is determined as the third barrage information.
  • the seventh input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • in this embodiment, based on the user's seventh input to the third bullet screen information in the bullet screen display window, the first display area is displayed in the bullet screen display window.
  • the first display area is associated with the first bullet screen publishing object corresponding to the third bullet screen information and is used to display the bullet screen information published by the first bullet screen publishing object.
  • the barrage information published by the publishing object corresponding to the third barrage information is displayed in the first display area, so that all the barrage information published by the publishing object can be conveniently viewed.
  • optionally, the method further comprises: receiving an eighth input to fourth bullet screen information in the bullet screen display window; and, in response to the eighth input, displaying, in the first display area, the bullet screen information published by the first bullet screen publishing object and the bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
  • the object pointed to by the above-mentioned eighth input is the barrage information in the barrage display window, and the above-mentioned barrage information is determined as the fourth barrage information.
  • the above-mentioned eighth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • in this embodiment, when the first display area is displaying the bullet screen information published by the first bullet screen publishing object, based on the eighth input to the fourth bullet screen information in the bullet screen display window, the bullet screen information published by the second bullet screen publishing object, i.e., the publishing object corresponding to the fourth bullet screen information, is also displayed in the first display area. In other words, based on the eighth input, the first display area displays the bullet screen information published by both the first and the second bullet screen publishing objects.
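  • The seventh and eighth inputs above both populate the first display area with everything published by one or more selected publishing objects; the Kotlin sketch below models this as filtering the window's comments by a set of publishers (one publisher after the seventh input, two after the eighth). The names are illustrative.
```kotlin
// Illustrative sketch: the first display area shows all comments whose publishing
// object is in the selected set (one publisher after the seventh input, two after
// the eighth input adds another).

data class PublishedBulletScreen(val publisher: String, val content: String)

fun firstDisplayArea(all: List<PublishedBulletScreen>, selectedPublishers: Set<String>): List<PublishedBulletScreen> =
    all.filter { it.publisher in selectedPublishers }

fun main() {
    val all = listOf(
        PublishedBulletScreen("User 1", "Bullet screen 1"),
        PublishedBulletScreen("User 2", "Bullet screen 2"),
        PublishedBulletScreen("User 1", "Bullet screen 3")
    )
    println(firstDisplayArea(all, setOf("User 1")))              // after the seventh input
    println(firstDisplayArea(all, setOf("User 1", "User 2")))    // after the eighth input
}
```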
  • the target video is a live video
  • the method further comprises: receiving a ninth input to fifth bullet screen information in the bullet screen display window; and, in response to the ninth input, displaying a temporary conversation window, wherein the temporary conversation window is used to conduct a conversation with a third bullet screen publishing object corresponding to the fifth bullet screen information.
  • the object pointed to by the ninth input is the barrage information in the barrage display window.
  • the barrage information can be determined as the fifth barrage information.
  • the ninth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • a session is established between the user and the publishing object corresponding to the fifth barrage information, that is, a temporary session window is displayed, and the user can communicate with the publishing object corresponding to the fifth barrage information, that is, the third barrage publishing object, through the temporary session window.
  • for ease of understanding, please refer to FIG. 5. The label D3 in FIG. 5 represents the temporary conversation window; the temporary conversation window D3 overlaps neither the video playback window nor the bullet screen display window.
  • the temporary session window can be displayed in the bullet screen display window.
  • a session is established between the user and the publishing object corresponding to the barrage information, so that the user can quickly communicate with the publishing object.
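  • For the live-video case, the ninth input opens a temporary conversation with the publishing object of the selected comment. The Kotlin sketch below is a minimal model, assuming sessions are keyed by the publishing object and created on demand; the SessionManager and related names are hypothetical.
```kotlin
// Illustrative sketch: the ninth input on a comment opens (or reuses) a temporary
// conversation window with that comment's publishing object.

data class TemporarySession(val withPublisher: String, val messages: MutableList<String> = mutableListOf())

class SessionManager {
    private val sessions = mutableMapOf<String, TemporarySession>()

    fun onNinthInput(commentPublisher: String): TemporarySession =
        sessions.getOrPut(commentPublisher) { TemporarySession(commentPublisher) }
}

fun main() {
    val manager = SessionManager()
    val session = manager.onNinthInput("User 3")   // publisher of the fifth bullet screen information
    session.messages += "Hi, saw your comment on the live stream"
    println(session)
}
```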
  • optionally, the method further comprises: receiving a tenth input from the user to the bullet screen display window and the video playback window; and, in response to the tenth input, canceling the bullet screen display window and the video playback window and displaying the video playback interface.
  • the tenth input points to the bullet screen display window and the video playback window.
  • the tenth input includes but is not limited to touch input, sliding input, gesture input, voice input or other types of input, which are not specifically limited here.
  • the bullet screen display window and the video playback window are cancelled and the video playback interface is displayed.
  • for example, the tenth input is a sliding input: when the user drags the bullet screen display window onto the video playback window, the bullet screen display window and the video playback window are canceled, and the video playback interface is displayed in full screen.
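  • One way to read the sliding tenth input above is as a hit test: if the dragged bullet screen display window ends up overlapping the video playback window, both windows are dismissed and the full-screen video playback interface is restored. The Kotlin rectangle model below is an illustrative assumption, not the disclosed implementation.
```kotlin
// Illustrative sketch: when the bullet screen display window is dragged onto the
// video playback window (rectangles overlap), dismiss both and restore full screen.

data class Rect(val x: Int, val y: Int, val width: Int, val height: Int) {
    fun overlaps(other: Rect): Boolean =
        x < other.x + other.width && other.x < x + width &&
        y < other.y + other.height && other.y < y + height
}

fun onDragEnd(bulletScreenWindow: Rect, videoWindow: Rect): String =
    if (bulletScreenWindow.overlaps(videoWindow)) "restore full-screen video playback interface"
    else "keep both floating windows"

fun main() {
    val videoWindow = Rect(40, 200, 960, 540)
    val dragged = Rect(500, 400, 960, 540)   // dropped on top of the video window
    println(onDragEnd(dragged, videoWindow))
}
```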
  • the video playback device 600 includes:
  • the first display module 601 is used to display a video playback interface for playing a target video; the video playback interface includes bullet screen information of the target video;
  • the second display module 602 is used to display a video playback window for playing the target video and a barrage display window for displaying barrage information of the target video on the upper layer of the current foreground running interface when the target video is switched to background playback.
  • optionally, the second display module 602 is specifically used to: receive a first input to the video playback window when the target video is switched to background playback; and, in response to the first input, display, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video.
  • optionally, the second display module 602 is further specifically used to: when the target video is switched to background playback, determine the second display parameter of the bullet screen display window according to the first display parameter of the video playback window, wherein the first display parameter includes at least one of a first display size and a first display position, and the second display parameter includes at least one of a second display size and a second display position; and display, according to the second display parameter, a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
  • the video playback device 600 further includes:
  • a first receiving module used for receiving a second input to the bullet screen display window
  • a first determination module configured to determine target transparency in response to the second input
  • the first updating module is used to update the bullet screen information of the target video in the bullet screen display window according to the target transparency.
  • the video playback device 600 further includes:
  • a second receiving module used for receiving a third input to the bullet screen display window
  • a second determination module configured to determine a target bullet screen progress in response to the third input
  • the third display module is used to display the barrage information of the target video corresponding to the target barrage progress in the barrage display window.
  • the video playback device 600 further includes:
  • a third receiving module used to receive a fourth input to the bullet screen display window
  • a processing module is used to cancel the display of the bullet screen display window in response to the fourth input.
  • the video playback device 600 further includes:
  • a fourth receiving module used for receiving a fifth input to the bullet screen display window
  • a fourth display module configured to display a bullet screen editing area in response to the fifth input
  • a fifth receiving module used for receiving a sixth input in the bullet comment editing area
  • a third determining module configured to determine input information in response to the sixth input
  • the fifth display module is used to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
  • optionally, the fourth receiving module is specifically configured to: receive a fifth input to first bullet screen information in the bullet screen display window;
  • the fifth display module is specifically used for:
  • the second barrage information corresponding to the input information is displayed in the barrage display window; the second barrage information includes the input information and the barrage information identifier corresponding to the first barrage information.
  • the video playback device 600 further includes:
  • a sixth receiving module used for receiving a seventh input of the third bullet screen information in the bullet screen display window
  • a sixth display module configured to display, in response to the seventh input, a first display area associated with a first barrage publishing object corresponding to the third barrage information in the barrage display window;
  • the seventh display module is used to display the barrage information released by the first barrage release object in the first display area.
  • the video playback device 600 further includes:
  • a seventh receiving module used for receiving an eighth input of the fourth bullet screen information in the bullet screen display window
  • An eighth display module is used to display the barrage information released by the first barrage release object and the barrage information released by the second barrage release object corresponding to the fourth barrage information in the first display area in response to the eighth input.
  • the target video is a live video
  • the video playback device 600 further includes:
  • An eighth receiving module configured to receive a ninth input of the fifth bullet-screen information in the bullet-screen display window
  • the ninth display module is used to display a temporary conversation window in response to the ninth input; the temporary conversation window is used to conduct a conversation with the third bullet screen publishing object corresponding to the fifth bullet screen information.
  • the video playback device 600 further includes:
  • a ninth receiving module used for receiving a tenth input from a user on the bullet screen display window and the video playback window;
  • the tenth display module is used to cancel the bullet screen display window and the video playback window and display the video playback interface in response to the tenth input.
  • in the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. The target video is thus played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
  • the video playback device in the embodiment of the present application can be an electronic device, or a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device can be a terminal, or a device other than a terminal.
  • for example, the electronic device can be a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted electronic device, a mobile Internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), etc.; it can also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine or a self-service machine, etc., which is not specifically limited in the embodiments of the present application.
  • the video playback device in the embodiment of the present application may be a device having an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiment of the present application.
  • the embodiment of the present application provides a video playback device that can implement each process implemented by the method embodiment of Figure 1. To avoid repetition, they are not described here.
  • optionally, as shown in FIG. 7, an embodiment of the present application also provides an electronic device 700, including a processor 701, a memory 702, and a program or instruction stored in the memory 702 and executable on the processor 701. When the program or instruction is executed by the processor 701, each process of the above video playback method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, it is not described again here.
  • the electronic devices in the embodiments of the present application include the mobile electronic devices and non-mobile electronic devices mentioned above.
  • FIG8 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, and a processor 810 and other components.
  • the electronic device 800 may also include a power source (such as a battery) for supplying power to each component, and the power source may be logically connected to the processor 810 through a power management system, so that functions such as charging management, discharging management and power consumption management are implemented through the power management system.
  • the electronic device structure shown in FIG8 does not constitute a limitation on the electronic device, and the electronic device may include more or fewer components than shown, or combine certain components, or arrange components differently, which will not be described in detail here.
  • the display unit 806 is further used to display a video playback interface for playing the target video, and to display, when the target video is switched to background playback, a video playback window for playing the target video and a bullet screen display window for displaying bullet screen information of the target video on the upper layer of the current foreground running interface.
  • the user input unit 807 is further configured to receive a first input to the video playback window when the target video is switched to background playback;
  • the display unit 806 is also used to display a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface in response to the first input.
  • the processor 810 is further configured to determine, when the target video is switched to background playback, the second display parameter of the bullet screen display window according to the first display parameter of the video playback window;
  • the display unit 806 is also used to display a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface according to the second display parameter.
  • the user input unit 807 is further used to receive a second input to the bullet screen display window
  • the processor 810 is further configured to determine target transparency in response to the second input
  • the display unit 806 is also used to update the bullet screen information of the target video in the bullet screen display window according to the target transparency.
  • the user input unit 807 is further configured to receive a third input to the bullet screen display window
  • the processor 810 is further configured to determine the target bullet screen progress in response to the third input;
  • the display unit 806 is further used to display the barrage information of the target video corresponding to the target barrage progress in the barrage display window.
  • the user input unit 807 is further used to receive a fourth input to the bullet screen display window
  • the display unit 806 is further configured to cancel display of the bullet screen display window in response to the fourth input.
  • the user input unit 807 is further configured to receive a fifth input to the bullet screen display window
  • the display unit 806 is further configured to display a bullet comment editing area in response to the fifth input;
  • the user input unit 807 is further used to receive a sixth input in the bullet comment editing area
  • the processor 810 is further configured to determine input information in response to the sixth input;
  • the display unit 806 is further used to display the bullet screen information of the target video corresponding to the input information in the bullet screen display window.
  • the user input unit 807 is further used to receive a fifth input of the first bullet screen information in the bullet screen display window;
  • the display unit 806 is further configured to display second bullet screen information corresponding to the input information in the bullet screen display window.
  • the display unit 806 is further configured to display, in response to the seventh input, a first display area associated with the first bullet screen publishing object corresponding to the third bullet screen information in the bullet screen display window, and to display, in the first display area, the bullet screen information published by the first bullet screen publishing object.
  • the user input unit 807 is further configured to receive an eighth input of the fourth bullet-screen information in the bullet-screen display window;
  • the display unit 806 is further used to display the barrage information released by the first barrage release object and the barrage information released by the second barrage release object corresponding to the fourth barrage information in the first display area in response to the eighth input.
  • the user input unit 807 is further configured to receive a ninth input of the fifth bullet-screen information in the bullet-screen display window;
  • the display unit 806 is further configured to display a temporary conversation window in response to the ninth input.
  • the user input unit 807 is further used to receive a tenth input from the user to the bullet screen display window and the video playback window;
  • the display unit 806 is further used to cancel the bullet screen display window and the video playback window in response to the tenth input, and display the video playback interface.
  • in the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. The target video is thus played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
  • the input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042, and the graphics processor 8041 processes the image data of the static picture or video obtained by the image capture device (such as a camera) in the video capture mode or the image capture mode.
  • the display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a liquid crystal display, an organic light emitting diode, etc.
  • the user input unit 807 includes at least one of the display panel 8061 and other input devices 8062.
  • the display panel 8061 is also called a touch screen.
  • the display panel 8061 may include two parts: a touch detection device and a touch controller.
  • Other input devices 8062 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, etc.), a trackball, a mouse, and a joystick, which will not be repeated here.
  • the memory 809 can be used to store software programs and various data.
  • the memory 809 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
  • the memory 809 may include a volatile memory or a non-volatile memory, or the memory 809 may include both volatile and non-volatile memories.
  • the non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
  • the volatile memory may be a random access memory (RAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDRSDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchronous link dynamic random access memory (SLDRAM) and a direct memory bus random access memory (DRRAM).
  • the memory 809 in the embodiment of the present application includes but is not limited to these and any other suitable types of memory.
  • the processor 810 may include one or more processing units; optionally, the processor 810 integrates an application processor and a modem processor, wherein the application processor mainly processes operations related to an operating system, a user interface, and application programs, and the modem processor mainly processes wireless communication signals, such as a baseband processor. It is understandable that the modem processor may not be integrated into the processor 810.
  • An embodiment of the present application also provides a readable storage medium, on which a program or instruction is stored. When the program or instruction is executed by a processor, each process of the above video playback method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, it is not described again here.
  • the processor is the processor in the electronic device described in the above embodiment.
  • the readable storage medium includes a computer readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
  • the present application also provides a chip, which includes a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run a program or instruction to implement each process of the above-mentioned video playback method embodiment and achieve the same technical effect; to avoid repetition, it is not described again here.
  • the chip mentioned in the embodiments of the present application can also be called a system-level chip, a system chip, a chip system or a system-on-chip chip, etc.
  • An embodiment of the present application provides a computer program product, which is stored in a storage medium.
  • the program product is executed by at least one processor to implement the various processes of the above-mentioned video playback method embodiment and can achieve the same technical effect. To avoid repetition, it will not be repeated here.
  • the technical solution of the present application can be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, a disk, or an optical disk), and includes a number of instructions for a terminal (which can be a mobile phone, a computer, a server, or a network device, etc.) to execute the methods described in each embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a video playback method and a device therefor, belonging to the technical field of electronic equipment. The video playback method includes: displaying a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video; and, when the target video is switched to background playback, displaying, on the upper layer of the current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.

Description

Video playback method and device therefor
Cross-Reference to Related Application
This application claims priority to Chinese Patent Application No. 202211511206.1, filed in China on November 29, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The present application belongs to the technical field of electronic equipment, and specifically relates to a video playback method and a device therefor.
Background
In the related art, when a video application is switched to background playback, the video application can be displayed in the form of a small window. However, because the display size is limited, the displayed content is incomplete, resulting in low efficiency of video information transmission.
Summary
The purpose of the embodiments of the present application is to provide a video playback method and a device therefor, which can solve the problem of low efficiency of video information transmission.
In a first aspect, an embodiment of the present application provides a video playback method, the method comprising:
displaying a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video; and
when the target video is switched to background playback, displaying, on the upper layer of the current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
In a second aspect, an embodiment of the present application provides a video playback device, the device comprising:
a first display module, used to display a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video; and
a second display module, used to display, when the target video is switched to background playback, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the method described in the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium, on which a program or instruction is stored, and when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
In a fifth aspect, an embodiment of the present application provides a chip, the chip comprising a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the method described in the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product, which is stored in a storage medium and is executed by at least one processor to implement the method described in the first aspect.
In the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. In this way, when the target video is switched to background playback, the target video is played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
Brief Description of the Drawings
FIG. 1 is a flow chart of a video playback method provided in an embodiment of the present application;
FIG. 2 is the first application scenario diagram of the video playback method provided in an embodiment of the present application;
FIG. 3 is the second application scenario diagram of the video playback method provided in an embodiment of the present application;
FIG. 4 is the third application scenario diagram of the video playback method provided in an embodiment of the present application;
FIG. 5 is the fourth application scenario diagram of the video playback method provided in an embodiment of the present application;
FIG. 6 is a structural diagram of a video playback device provided in an embodiment of the present application;
FIG. 7 is a structural diagram of an electronic device provided in an embodiment of the present application;
FIG. 8 is a hardware structure diagram of an electronic device provided in an embodiment of the present application.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present application will be clearly described below with reference to the accompanying drawings of the embodiments of the present application. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application fall within the scope of protection of the present application.
The terms "first", "second", etc. in the specification and claims of the present application are used to distinguish similar objects rather than to describe a specific order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in an order other than the one illustrated or described here. Objects distinguished by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object can be one or more. In addition, "and/or" in the specification and claims represents at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
An embodiment of the present application provides a video playback method. Please refer to FIG. 1, which is a flow chart of the video playback method provided by the embodiment of the present application. The video playback method provided by the embodiment of the present application includes the following steps:
S101: Display a video playback interface for playing a target video.
The video playback method provided in this embodiment can be applied to an electronic device. A video playback interface is displayed on the display interface of the electronic device, wherein the video playback interface is used to play the target video and to display the bullet screen information of the target video.
Optionally, the bullet screen information can be displayed on the video playback interface in the form of subtitles. Optionally, the bullet screen information can be displayed on the video playback interface in a scrolling manner. Optionally, the video playback interface includes a window for displaying the bullet screen information. This step does not limit the specific display manner of the bullet screen information.
S102: When the target video is switched to background playback, display, on the upper layer of the current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
When the user switches from the video playback interface to another interface, the switched-to interface is determined as the foreground running interface, and the target video is switched to background playback. In this case, the video playback window and the bullet screen display window are displayed on the upper layer of the foreground running interface.
The video playback window is used to play the target video, and the bullet screen display window is used to display the bullet screen information of the target video.
For ease of understanding, please refer to FIG. 2. The label D1 in FIG. 2 represents the video playback window, and the label D2 represents the bullet screen display window. The video playback window D1 plays the target video, and the bullet screen display window D2 displays the bullet screen information related to the target video, namely "Bullet screen 1", "Bullet screen 2", "Bullet screen 3" and "Bullet screen 4" in FIG. 2.
In the embodiment of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. In this way, the target video is played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
Optionally, when the target video is switched to background playback, the step of displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video includes:
receiving a first input to the video playback window when the target video is switched to background playback; and
in response to the first input, displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video.
In this embodiment, when the target video is switched to background playback, a video playback window is displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video.
When a first input to the video playback window is received, in response to the first input, a bullet screen display window is displayed on the upper layer of the current foreground running interface, wherein the bullet screen display window is used to display the bullet screen information of the target video.
It should be understood that the first input is directed at the video playback window, and the first input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
Optionally, when the target video is switched to background playback, the step of displaying, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video includes:
when the target video is switched to background playback, determining the second display parameter of the bullet screen display window according to the first display parameter of the video playback window; and
displaying, according to the second display parameter, a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
In this embodiment, when the target video is switched to background playback, the first display parameter of the video playback window displayed on the foreground running interface is obtained, the second display parameter of the bullet screen display window is determined based on the first display parameter, and the bullet screen display window is then displayed accordingly.
It should be understood that the first display parameter includes at least one of the following: a first display size and a first display position; and the second display parameter includes at least one of the following: a second display size and a second display position.
An optional implementation is that, when the display parameters include a display size and a display position, the first display size is determined as the second display size, that is, the display size of the video playback window is the same as the display size of the bullet screen display window; and the second display position is determined based on the first display position, with the video playback window and the bullet screen display window set as adjacent windows.
Another optional implementation is that, when the display parameters include a display size and a display position, the product of the first display size and 0.8 is determined as the second display size, that is, the video playback window is reduced by 20% as a whole and the display size of the reduced window is determined as the second display size; the second display position is determined based on the first display position, with the video playback window and the bullet screen display window set as adjacent windows.
It should be understood that the display size and display position of the bullet screen display window may also be determined through other implementations, which are not specifically limited here.
In this embodiment, the second display parameter of the bullet screen display window is determined from the first display parameter of the video playback window, so as to adjust the display size and/or display position of the bullet screen display window.
Optionally, the method further comprises:
receiving a second input to the bullet screen display window;
in response to the second input, determining a target transparency; and
updating, according to the target transparency, the display of the bullet screen information of the target video in the bullet screen display window.
The second input is directed at the bullet screen display window, and the second input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, the target transparency is determined based on the second input to the bullet screen display window, and the transparency of the bullet screen information displayed in the bullet screen display window is adjusted to the target transparency, so that the transparency of the bullet screen information can be adjusted flexibly based on the second input.
Optionally, the method further comprises:
receiving a third input to the bullet screen display window;
in response to the third input, determining a target bullet screen progress; and
displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the target bullet screen progress.
The third input is directed at the bullet screen display window, and the third input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, the target bullet screen progress is determined based on the third input to the bullet screen display window, the bullet screen information is located according to the target bullet screen progress, and that bullet screen information is displayed in the bullet screen display window.
Optionally, the method further comprises:
receiving a fourth input to the bullet screen display window; and
in response to the fourth input, canceling the display of the bullet screen display window.
The fourth input is directed at the bullet screen display window, and the fourth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the fourth input to the bullet screen display window is received, the display of the bullet screen display window is canceled, and only the video playback window is displayed on the upper layer of the foreground running interface.
Optionally, the method further comprises:
receiving a fifth input to the bullet screen display window;
in response to the fifth input, displaying a bullet screen editing area;
receiving a sixth input in the bullet screen editing area;
in response to the sixth input, determining input information; and
displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
The fifth input is directed at the bullet screen display window, and the fifth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the fifth input to the bullet screen display window is received, the bullet screen editing area is displayed. An optional implementation is to display the bullet screen editing area in the bullet screen display window; another optional implementation is to display the bullet screen editing area outside the bullet screen display window.
The sixth input is directed at the bullet screen editing area, and the sixth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the sixth input in the bullet screen editing area is received, the input information corresponding to the sixth input is determined, and the bullet screen information corresponding to the input information is displayed in the bullet screen display window.
For ease of understanding, please refer to FIG. 3. The application scenario shown in FIG. 3 includes a video playback window and a bullet screen display window, and a bullet screen editing area D21 is displayed in the bullet screen display window. Based on the user's sixth input in the bullet screen editing area D21, the input information is determined, and the bullet screen information corresponding to the input information, namely the bullet screen information "Haha", is then displayed in the bullet screen display window.
In this embodiment, the user displays the bullet screen editing area through the fifth input, and then displays bullet screen information in the bullet screen display window through the sixth input in the bullet screen editing area, thereby sending bullet screen information of the target video in this manner and adding a way for the user to interact with the target video.
Optionally, the receiving of a fifth input to the bullet screen display window includes:
receiving a fifth input to first bullet screen information in the bullet screen display window; and
the displaying, in the bullet screen display window, of the bullet screen information of the target video corresponding to the input information includes:
displaying, in the bullet screen display window, second bullet screen information corresponding to the input information, wherein the second bullet screen information includes the input information and the bullet screen information identifier corresponding to the first bullet screen information.
It should be noted that the bullet screen display window displays at least one piece of bullet screen information, and each piece of bullet screen information includes a corresponding bullet screen information identifier. Optionally, the bullet screen information identifier is used to represent the publishing object of the corresponding bullet screen information.
In this embodiment, when a fifth input to a piece of bullet screen information in the bullet screen display window is received, that bullet screen information is determined as the first bullet screen information, and the bullet screen input area is then displayed. The input information is determined based on the user's sixth input in the bullet screen input area, wherein the input information includes, but is not limited to, text information, picture information or other types of information. Further, the second bullet screen information corresponding to the input information is displayed in the bullet screen display window, wherein the second bullet screen information includes the input information and the bullet screen information identifier corresponding to the first bullet screen information.
For ease of understanding, please refer to FIG. 4. In the application scenario shown in FIG. 4, based on the user's fifth input in the bullet screen display window, the first bullet screen information is determined to be "Bullet screen 1", and the publishing object "User 1" corresponding to "Bullet screen 1" is determined as the bullet screen information identifier. Based on the user's sixth input in the bullet screen editing area, the input information is determined, and the second bullet screen information is then displayed in the bullet screen display window; the content of the second bullet screen information includes "Reply to User 1: You are right".
In this embodiment, the user determines the bullet screen information to be replied to by performing the fifth input on the first bullet screen information, and then, based on the sixth input in the bullet screen editing area, generates the second bullet screen information displayed in the bullet screen display area, wherein the second bullet screen information is the user's reply to the first bullet screen information. Replies to bullet screen information are thereby realized, increasing the interaction between the user and the bullet screen information.
Optionally, the method further includes: receiving a seventh input to third bullet screen information in the bullet screen display window;
in response to the seventh input, displaying, in the bullet screen display window, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information; and
displaying, in the first display area, the bullet screen information published by the first bullet screen publishing object.
The seventh input is directed at a piece of bullet screen information in the bullet screen display window, which is determined as the third bullet screen information, and the seventh input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, based on the user's seventh input to the third bullet screen information in the bullet screen display window, the first display area is displayed in the bullet screen display window. The first display area is associated with the first bullet screen publishing object corresponding to the third bullet screen information and is used to display the bullet screen information published by the first bullet screen publishing object.
In this embodiment, through the seventh input to the third bullet screen information in the bullet screen display window, the bullet screen information published by the publishing object corresponding to the third bullet screen information is displayed in the first display area, so that all the bullet screen information published by that publishing object can be viewed conveniently.
Optionally, the method further comprises:
receiving an eighth input to fourth bullet screen information in the bullet screen display window; and
in response to the eighth input, displaying, in the first display area, the bullet screen information published by the first bullet screen publishing object and the bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
The eighth input is directed at a piece of bullet screen information in the bullet screen display window, which is determined as the fourth bullet screen information, and the eighth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the first display area is displaying the bullet screen information published by the first bullet screen publishing object, based on the eighth input to the fourth bullet screen information in the bullet screen display window, the bullet screen information published by the second bullet screen publishing object, i.e., the publishing object corresponding to the fourth bullet screen information, is also displayed in the first display area. In other words, based on the eighth input, the first display area displays the bullet screen information published by the first bullet screen publishing object as well as the bullet screen information published by the second bullet screen publishing object.
In this embodiment, through the seventh input to the third bullet screen information and the eighth input to the fourth bullet screen information in the bullet screen display window, all the bullet screen information published by the publishing objects corresponding to these two pieces of bullet screen information is displayed in the first display area at the same time.
Optionally, the target video is a live video;
the method further comprises:
receiving a ninth input to fifth bullet screen information in the bullet screen display window; and
in response to the ninth input, displaying a temporary conversation window, wherein the temporary conversation window is used to conduct a conversation with a third bullet screen publishing object corresponding to the fifth bullet screen information.
The ninth input is directed at a piece of bullet screen information in the bullet screen display window, which can be determined as the fifth bullet screen information, and the ninth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the user's ninth input to the fifth bullet screen information in the bullet screen display window is received, a conversation is established between the user and the publishing object corresponding to the fifth bullet screen information, that is, a temporary conversation window is displayed, through which the user can communicate with the publishing object corresponding to the fifth bullet screen information, i.e., the third bullet screen publishing object.
For ease of understanding, please refer to FIG. 5. The label D3 in FIG. 5 represents the temporary conversation window; the temporary conversation window D3 overlaps neither the video playback window nor the bullet screen display window. In other implementations, the temporary conversation window can be displayed inside the bullet screen display window.
In this embodiment, based on the user's ninth input to a piece of bullet screen information in the bullet screen display window, a conversation is established between the user and the publishing object corresponding to that bullet screen information, so that the user can quickly communicate with that publishing object.
Optionally, the method further comprises:
receiving a tenth input from the user to the bullet screen display window and the video playback window; and
in response to the tenth input, canceling the bullet screen display window and the video playback window, and displaying the video playback interface.
The tenth input is directed at the bullet screen display window and the video playback window, and the tenth input includes, but is not limited to, touch input, sliding input, gesture input, voice input or other types of input, which is not specifically limited here.
In this embodiment, when the user's tenth input to the bullet screen display window and the video playback window is received, the display of the bullet screen display window and the video playback window is canceled, and the video playback interface is displayed.
For example, the tenth input is a sliding input: when the user drags the bullet screen display window onto the video playback window, the display of the bullet screen display window and the video playback window is canceled, and the video playback interface is displayed in full screen.
The video playback device provided in the embodiments of the present application is described in detail below through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
As shown in FIG. 6, the video playback device 600 includes:
a first display module 601, used to display a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video; and
a second display module 602, used to display, when the target video is switched to background playback, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
Optionally, the second display module 602 is specifically used to:
receive a first input to the video playback window when the target video is switched to background playback; and
in response to the first input, display, on the upper layer of the current foreground running interface, a bullet screen display window for displaying the bullet screen information of the target video.
Optionally, the second display module 602 is further specifically used to:
when the target video is switched to background playback, determine the second display parameter of the bullet screen display window according to the first display parameter of the video playback window, wherein the first display parameter includes at least one of the following: a first display size and a first display position, and the second display parameter includes at least one of the following: a second display size and a second display position; and
display, according to the second display parameter, a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
Optionally, the video playback device 600 further includes:
a first receiving module, used to receive a second input to the bullet screen display window;
a first determination module, used to determine a target transparency in response to the second input; and
a first updating module, used to update, according to the target transparency, the display of the bullet screen information of the target video in the bullet screen display window.
Optionally, the video playback device 600 further includes:
a second receiving module, used to receive a third input to the bullet screen display window;
a second determination module, used to determine a target bullet screen progress in response to the third input; and
a third display module, used to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the target bullet screen progress.
Optionally, the video playback device 600 further includes:
a third receiving module, used to receive a fourth input to the bullet screen display window; and
a processing module, used to cancel the display of the bullet screen display window in response to the fourth input.
Optionally, the video playback device 600 further includes:
a fourth receiving module, used to receive a fifth input to the bullet screen display window;
a fourth display module, used to display a bullet screen editing area in response to the fifth input;
a fifth receiving module, used to receive a sixth input in the bullet screen editing area;
a third determination module, used to determine input information in response to the sixth input; and
a fifth display module, used to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
Optionally, the fourth receiving module is specifically used to:
receive a fifth input to first bullet screen information in the bullet screen display window; and
the fifth display module is specifically used to:
display, in the bullet screen display window, second bullet screen information corresponding to the input information, wherein the second bullet screen information includes the input information and the bullet screen information identifier corresponding to the first bullet screen information.
Optionally, the video playback device 600 further includes:
a sixth receiving module, used to receive a seventh input to third bullet screen information in the bullet screen display window;
a sixth display module, used to display, in response to the seventh input, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information in the bullet screen display window; and
a seventh display module, used to display, in the first display area, the bullet screen information published by the first bullet screen publishing object.
Optionally, the video playback device 600 further includes:
a seventh receiving module, used to receive an eighth input to fourth bullet screen information in the bullet screen display window; and
an eighth display module, used to display, in response to the eighth input and in the first display area, the bullet screen information published by the first bullet screen publishing object and the bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
Optionally, the target video is a live video;
the video playback device 600 further includes:
an eighth receiving module, used to receive a ninth input to fifth bullet screen information in the bullet screen display window; and
a ninth display module, used to display a temporary conversation window in response to the ninth input, wherein the temporary conversation window is used to conduct a conversation with a third bullet screen publishing object corresponding to the fifth bullet screen information.
Optionally, the video playback device 600 further includes:
a ninth receiving module, used to receive a tenth input from the user to the bullet screen display window and the video playback window; and
a tenth display module, used to cancel the bullet screen display window and the video playback window and display the video playback interface in response to the tenth input.
In the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. In this way, the target video is played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
The video playback device in the embodiment of the present application may be an electronic device, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted electronic device, a mobile Internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), etc.; it may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine or a self-service machine, etc., which is not specifically limited in the embodiments of the present application.
The video playback device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The video playback device provided in the embodiments of the present application can implement each process implemented by the method embodiment of FIG. 1; to avoid repetition, it is not described again here.
Optionally, as shown in FIG. 7, an embodiment of the present application also provides an electronic device 700, including a processor 701, a memory 702, and a program or instruction stored in the memory 702 and executable on the processor 701. When the program or instruction is executed by the processor 701, each process of the above video playback method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, it is not described again here.
It should be noted that the electronic device in the embodiments of the present application includes the mobile electronic devices and non-mobile electronic devices mentioned above.
FIG. 8 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 800 includes, but is not limited to, a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810 and other components.
Those skilled in the art will understand that the electronic device 800 may also include a power source (such as a battery) for supplying power to each component, and the power source may be logically connected to the processor 810 through a power management system, so that functions such as charging management, discharging management and power consumption management are implemented through the power management system. The electronic device structure shown in FIG. 8 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange components differently, which is not described in detail here.
The display unit 806 is used to display a video playback interface for playing a target video; and
when the target video is switched to background playback, to display, on the upper layer of the current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
The user input unit 807 is further used to receive a first input to the video playback window when the target video is switched to background playback; and
the display unit 806 is further used to display, in response to the first input, a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
The processor 810 is further used to determine, when the target video is switched to background playback, the second display parameter of the bullet screen display window according to the first display parameter of the video playback window; and
the display unit 806 is further used to display, according to the second display parameter, a bullet screen display window for displaying the bullet screen information of the target video on the upper layer of the current foreground running interface.
The user input unit 807 is further used to receive a second input to the bullet screen display window;
the processor 810 is further used to determine a target transparency in response to the second input; and
the display unit 806 is further used to update, according to the target transparency, the display of the bullet screen information of the target video in the bullet screen display window.
The user input unit 807 is further used to receive a third input to the bullet screen display window;
the processor 810 is further used to determine a target bullet screen progress in response to the third input; and
the display unit 806 is further used to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the target bullet screen progress.
The user input unit 807 is further used to receive a fourth input to the bullet screen display window; and
the display unit 806 is further used to cancel the display of the bullet screen display window in response to the fourth input.
The user input unit 807 is further used to receive a fifth input to the bullet screen display window;
the display unit 806 is further used to display a bullet screen editing area in response to the fifth input;
the user input unit 807 is further used to receive a sixth input in the bullet screen editing area;
the processor 810 is further used to determine input information in response to the sixth input; and
the display unit 806 is further used to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
The user input unit 807 is further used to receive a fifth input to first bullet screen information in the bullet screen display window; and
the display unit 806 is further used to display, in the bullet screen display window, second bullet screen information corresponding to the input information.
The display unit 806 is further used to display, in response to the seventh input, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information in the bullet screen display window; and
to display, in the first display area, the bullet screen information published by the first bullet screen publishing object.
The user input unit 807 is further used to receive an eighth input to fourth bullet screen information in the bullet screen display window; and
the display unit 806 is further used to display, in response to the eighth input and in the first display area, the bullet screen information published by the first bullet screen publishing object and the bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
The user input unit 807 is further used to receive a ninth input to fifth bullet screen information in the bullet screen display window; and
the display unit 806 is further used to display a temporary conversation window in response to the ninth input.
The user input unit 807 is further used to receive a tenth input from the user to the bullet screen display window and the video playback window; and
the display unit 806 is further used to cancel the bullet screen display window and the video playback window and display the video playback interface in response to the tenth input.
In the embodiments of the present application, when the target video is switched to background playback, a video playback window and a bullet screen display window are displayed on the upper layer of the current foreground running interface, wherein the video playback window is used to play the target video and the bullet screen display window is used to display the bullet screen information of the target video. In this way, the target video is played through the video playback window and its bullet screen information is displayed through the bullet screen display window, thereby improving the information transmission efficiency of the video.
应理解的是,本申请实施例中,输入单元804可以包括图形处理器(Graphics Processing Unit,GPU)8041和麦克风8042,图形处理器8041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。显示单元806可包括显示面板8061,可以采用液晶显示器、有机发光二极管等形式来配置显示面板8061。用户输入单元806包括显示面板8061以及其他输入设备8062中的至少一种。显示面板8061,也称为触摸屏。显示面板8061可包括触摸检测装置和触摸控制器两个部分。其他输入设备8062可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
存储器809可用于存储软件程序以及各种数据。存储器809可主要包括存储程序或指令的第一存储区和存储数据的第二存储区,其中,第一存储区可存储操作系统、至少一个功能所需的应用程序或指令(比如声音播放功能、图像播放功能等)等。此外,存储器809可以包括易失性存储器或非易失性存储器,或者,存储器809可以包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(Read-Only Memory,ROM)、可编程只读存储器(Programmable ROM,PROM)、可擦除可编程只读存储器(Erasable PROM,EPROM)、电可擦除可编程只读存储器(Electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(Random Access Memory,RAM),静态随机存取存储器(Static RAM,SRAM)、动态随机存取存储器(Dynamic RAM,DRAM)、同步动态随机存取存储器(Synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(Double Data Rate SDRAM,DDRSDRAM)、增强型同步动态随机存取存储器(Enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(Synch link DRAM,SLDRAM)和直接内存总线随机存取存储器(Direct Rambus RAM,DRRAM)。本申请实施例中的存储器809包括但不限于这些和任意其它适合类型的存储器。
处理器810可包括一个或多个处理单元;可选的,处理器810集成应用处理器和调制解调处理器,其中,应用处理器主要处理涉及操作系统、用户界面和应用程序等的操作,调制解调处理器主要处理无线通信信号,如基带处理器。可以理解的是,上述调制解调处理器也可以不集成到处理器810中。
An embodiment of the present application further provides a readable storage medium storing a program or instruction. When the program or instruction is executed by a processor, each process of the foregoing video playback method embodiments is implemented and the same technical effect can be achieved. To avoid repetition, details are not described herein again.
The processor is the processor in the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement each process of the foregoing video playback method embodiments, and the same technical effect can be achieved. To avoid repetition, details are not described herein again.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, a system-on-chip, or the like.
An embodiment of the present application provides a computer program product. The program product is stored in a storage medium, and the program product is executed by at least one processor to implement each process of the foregoing video playback method embodiments, and the same technical effect can be achieved. To avoid repetition, details are not described herein again.
It should be noted that, in this specification, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or device. Without further restrictions, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element. In addition, it should be noted that the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed, and may also include performing functions in a substantially simultaneous manner or in a reverse order according to the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Moreover, features described with reference to certain examples may be combined in other examples.
Through the description of the foregoing implementations, those skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present application essentially, or the part contributing to the prior art, can be embodied in the form of a computer software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the foregoing specific implementations. The foregoing specific implementations are merely illustrative rather than restrictive. Inspired by the present application, those of ordinary skill in the art can devise many other forms without departing from the purpose of the present application and the scope protected by the claims, all of which fall within the protection of the present application.

Claims (20)

  1. A video playback method, comprising:
    displaying a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video;
    in a case that the target video is switched to background playback, displaying, on an upper layer of a current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
  2. The method according to claim 1, wherein the step of displaying, on the upper layer of the current foreground running interface, the bullet screen display window for displaying the bullet screen information of the target video in a case that the target video is switched to background playback comprises:
    in a case that the target video is switched to background playback, receiving a first input on the video playback window;
    in response to the first input, displaying, on the upper layer of the current foreground running interface, the bullet screen display window for displaying the bullet screen information of the target video.
  3. The method according to claim 1, wherein the step of displaying, on the upper layer of the current foreground running interface, the bullet screen display window for displaying the bullet screen information of the target video in a case that the target video is switched to background playback comprises:
    in a case that the target video is switched to background playback, determining a second display parameter of the bullet screen display window according to a first display parameter of the video playback window, wherein the first display parameter includes at least one of the following: a first display size and a first display position, and the second display parameter includes at least one of the following: a second display size and a second display position;
    displaying, according to the second display parameter, on the upper layer of the current foreground running interface, the bullet screen display window for displaying the bullet screen information of the target video.
  4. The method according to claim 1, further comprising:
    receiving a second input on the bullet screen display window;
    in response to the second input, determining a target transparency;
    updating, according to the target transparency, the display of the bullet screen information of the target video in the bullet screen display window.
  5. The method according to claim 1, further comprising:
    receiving a third input on the bullet screen display window;
    in response to the third input, determining a target bullet screen progress;
    displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the target bullet screen progress.
  6. The method according to claim 1, further comprising:
    receiving a fourth input on the bullet screen display window;
    in response to the fourth input, canceling the display of the bullet screen display window.
  7. The method according to claim 1, further comprising:
    receiving a fifth input on the bullet screen display window;
    in response to the fifth input, displaying a bullet screen editing area;
    receiving a sixth input in the bullet screen editing area;
    in response to the sixth input, determining input information;
    displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
  8. The method according to claim 7, wherein the receiving a fifth input on the bullet screen display window comprises:
    receiving a fifth input on first bullet screen information in the bullet screen display window;
    and the displaying, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information comprises:
    displaying, in the bullet screen display window, second bullet screen information corresponding to the input information, wherein the second bullet screen information includes the input information and a bullet screen information identifier corresponding to the first bullet screen information.
  9. The method according to claim 1, further comprising:
    receiving a seventh input on third bullet screen information in the bullet screen display window;
    in response to the seventh input, displaying, in the bullet screen display window, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information;
    displaying, in the first display area, bullet screen information published by the first bullet screen publishing object.
  10. The method according to claim 9, further comprising:
    receiving an eighth input on fourth bullet screen information in the bullet screen display window;
    in response to the eighth input, displaying, in the first display area, the bullet screen information published by the first bullet screen publishing object and bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
  11. The method according to claim 1, wherein the target video is a live video;
    the method further comprises:
    receiving a ninth input on fifth bullet screen information in the bullet screen display window;
    in response to the ninth input, displaying a temporary conversation window, wherein the temporary conversation window is used to conduct a conversation with a third bullet screen publishing object corresponding to the fifth bullet screen information.
  12. The method according to claim 1, further comprising:
    receiving a tenth input by a user on the bullet screen display window and the video playback window;
    in response to the tenth input, canceling the bullet screen display window and the video playback window, and displaying the video playback interface.
  13. A video playback device, comprising:
    a first display module, configured to display a video playback interface for playing a target video, wherein the video playback interface includes bullet screen information of the target video;
    a second display module, configured to display, in a case that the target video is switched to background playback, on an upper layer of a current foreground running interface, a video playback window for playing the target video and a bullet screen display window for displaying the bullet screen information of the target video.
  14. The device according to claim 13, further comprising:
    a fourth receiving module, configured to receive a fifth input on the bullet screen display window;
    a fourth display module, configured to display a bullet screen editing area in response to the fifth input;
    a fifth receiving module, configured to receive a sixth input in the bullet screen editing area;
    a third determining module, configured to determine input information in response to the sixth input;
    a fifth display module, configured to display, in the bullet screen display window, the bullet screen information of the target video corresponding to the input information.
  15. The device according to claim 14, wherein the fourth receiving module is specifically configured to:
    receive a fifth input on first bullet screen information in the bullet screen display window;
    and the fifth display module is specifically configured to:
    display, in the bullet screen display window, second bullet screen information corresponding to the input information, wherein the second bullet screen information includes the input information and a bullet screen information identifier corresponding to the first bullet screen information.
  16. The device according to claim 13, further comprising:
    a sixth receiving module, configured to receive a seventh input on third bullet screen information in the bullet screen display window;
    a sixth display module, configured to display, in response to the seventh input, in the bullet screen display window, a first display area associated with a first bullet screen publishing object corresponding to the third bullet screen information;
    a seventh display module, configured to display, in the first display area, bullet screen information published by the first bullet screen publishing object.
  17. The device according to claim 16, further comprising:
    a seventh receiving module, configured to receive an eighth input on fourth bullet screen information in the bullet screen display window;
    an eighth display module, configured to display, in response to the eighth input, in the first display area, the bullet screen information published by the first bullet screen publishing object and bullet screen information published by a second bullet screen publishing object corresponding to the fourth bullet screen information.
  18. The device according to claim 13, wherein the target video is a live video;
    the device further comprises:
    an eighth receiving module, configured to receive a ninth input on fifth bullet screen information in the bullet screen display window;
    a ninth display module, configured to display a temporary conversation window in response to the ninth input, wherein the temporary conversation window is used to conduct a conversation with a third bullet screen publishing object corresponding to the fifth bullet screen information.
  19. An electronic device, comprising a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein when the program or instruction is executed by the processor, the steps of the video playback method according to any one of claims 1 to 12 are implemented.
  20. A readable storage medium, wherein a program or instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the video playback method according to any one of claims 1 to 12 are implemented.
PCT/CN2023/133105 2022-11-29 2023-11-22 视频播放方法及其装置 WO2024114461A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211511206.1A CN115866314A (zh) 2022-11-29 2022-11-29 视频播放方法及其装置
CN202211511206.1 2022-11-29

Publications (1)

Publication Number Publication Date
WO2024114461A1 true WO2024114461A1 (zh) 2024-06-06

Family

ID=85667749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/133105 WO2024114461A1 (zh) 2022-11-29 2023-11-22 视频播放方法及其装置

Country Status (2)

Country Link
CN (1) CN115866314A (zh)
WO (1) WO2024114461A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115866314A (zh) * 2022-11-29 2023-03-28 维沃移动通信有限公司 视频播放方法及其装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130330056A1 (en) * 2012-03-26 2013-12-12 Customplay Llc Identifying A Cinematic Technique Within A Video
CN107920270A (zh) * 2017-10-27 2018-04-17 努比亚技术有限公司 视频分屏播放控制方法、终端及计算机可读存储介质
CN110278475A (zh) * 2018-03-16 2019-09-24 优酷网络技术(北京)有限公司 弹幕信息的显示方法及装置
CN111124571A (zh) * 2019-12-13 2020-05-08 维沃移动通信有限公司 界面显示方法及电子设备
CN113495664A (zh) * 2020-04-02 2021-10-12 腾讯科技(深圳)有限公司 基于媒体信息流的信息展示方法、装置、设备及存储介质
CN115866314A (zh) * 2022-11-29 2023-03-28 维沃移动通信有限公司 视频播放方法及其装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4446728B2 (ja) * 2002-12-17 2010-04-07 株式会社リコー 複数のマルチメディア文書に格納された情報の表示法
CN103986962B (zh) * 2014-06-03 2016-03-02 合一网络技术(北京)有限公司 显示悬浮式播放窗口的方法和系统
CN110891198B (zh) * 2019-11-29 2021-06-15 腾讯科技(深圳)有限公司 视频播放提示、多媒体播放提示、弹幕处理方法和装置
CN113453028A (zh) * 2020-03-27 2021-09-28 阿里巴巴集团控股有限公司 信息显示方法、装置及电子设备
CN115065874A (zh) * 2022-06-20 2022-09-16 维沃移动通信有限公司 视频播放方法、装置、电子设备和可读存储介质

Also Published As

Publication number Publication date
CN115866314A (zh) 2023-03-28

Similar Documents

Publication Publication Date Title
WO2024114461A1 (zh) 视频播放方法及其装置
CN113163050A (zh) 会话界面显示方法及装置
CN112099705A (zh) 投屏方法、装置及电子设备
CN112835485A (zh) 应用界面处理方法、装置、电子设备及可读存储介质
CN113285866B (zh) 信息发送方法、装置和电子设备
WO2023155877A1 (zh) 应用图标管理方法、装置和电子设备
WO2024037415A1 (zh) 投屏内容显示方法、装置、设备及存储介质
CN113282424A (zh) 信息引用方法、装置和电子设备
WO2022242515A1 (zh) 界面显示方法及装置
CN114385049A (zh) 消息处理方法、装置、设备和存储介质
CN112148167A (zh) 控件设置方法、装置和电子设备
WO2024114571A1 (zh) 信息显示方法、装置、电子设备和存储介质
CN115357158A (zh) 消息处理方法、装置、电子设备及存储介质
WO2024114530A1 (zh) 组件的显示方法、装置、电子设备及介质
WO2024153027A1 (zh) 应用显示方法和应用显示装置
WO2024109635A1 (zh) 界面显示方法及其装置
WO2024114529A1 (zh) 信息分享方法、装置、电子设备及可读存储介质
WO2024109731A1 (zh) 内容分享方法、装置、电子设备和可读存储介质
CN114374663A (zh) 消息处理方法和消息处理装置
WO2024078552A1 (zh) 后台应用的管理方法、装置、电子设备及介质
CN114327088A (zh) 消息发送方法、装置、电子设备及介质
WO2023151659A1 (zh) 应用图标显示方法、装置
WO2023030292A1 (zh) 多媒体文件的播放方法和装置
CN114827737A (zh) 图像生成方法、装置和电子设备
CN114866835A (zh) 弹幕显示方法、弹幕显示装置和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23896612

Country of ref document: EP

Kind code of ref document: A1