WO2019072096A1 - Interaction method, apparatus and system in live video streaming, and computer-readable storage medium - Google Patents


Info

Publication number
WO2019072096A1
Authority
WO
WIPO (PCT)
Prior art keywords: live video, interactive, type, video, group
Application number
PCT/CN2018/108167
Other languages
English (en)
French (fr)
Inventor
贺蕾
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2019072096A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/47: End-user applications
    • H04N21/475: End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4756: End-user interface for inputting end-user data for rating content, e.g. scoring a recommended movie
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788: Supplemental services communicating with other users, e.g. chatting

Definitions

  • The present application relates to the field of live video streaming technology, and in particular to an interaction method, apparatus and system in live video streaming, and a computer-readable storage medium.
  • Live broadcasts can be divided into text-and-picture live broadcasts and video live broadcasts.
  • Text-and-picture live broadcasts take place on forums such as Tianya and Douban.
  • Video live broadcasts cover, for example, live sports events and news. Through live video, users can watch football matches, sports events, major events and news online in real time.
  • Users can interact with the broadcaster or with other users while watching a live broadcast. For example, users can express their attitude toward the live event by clicking a like control.
  • An example of the present application provides an interaction method in live video streaming, including: acquiring information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded; acquiring, according to the image-set interface data, the image set corresponding to the type and using it as the target image set; and sending the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • An example of the present application further provides an interaction method in live video streaming, including: receiving a target image set sent by a third-party live broadcast management platform, where the target image set is obtained by the third-party live broadcast management platform according to information of the current live video, the information including the type of the current live video and the image-set interface data to be loaded, and the target image set is the image set corresponding to the type of the current live video; and rendering the display interface of the current live video according to the target image set, so that the user can interact based on the rendered display interface.
  • An example of the present application further provides an interaction apparatus in live video streaming, comprising: an information acquiring module, configured to acquire information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded; an image-set acquiring module, configured to acquire, according to the image-set interface data, the image set corresponding to the type and use it as the target image set; and an image-set sending module, configured to send the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • An example of the present application further provides an interaction apparatus in live video streaming, comprising: a target image-set receiving module, configured to receive a target image set sent by a third-party live broadcast management platform, where the target image set is obtained by the third-party live broadcast management platform according to information of the current live video, the information including the type of the current live video and the image-set interface data to be loaded, and the target image set corresponds to the type of the current live video; and a rendering module, configured to render the display interface of the current live video according to the target image set, so that the user interacts based on the rendered display interface.
  • An example of the present application further provides an interaction apparatus in live video streaming, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: acquire information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded; acquire, according to the image-set interface data, the image set corresponding to the type and use it as the target image set; and send the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • An example of the present application further provides an interaction apparatus in live video streaming, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: receive a target image set sent by a third-party live broadcast management platform, where the target image set is obtained by the third-party live broadcast management platform according to information of the current live video, the information including the type of the current live video and the image-set interface data to be loaded, and the target image set is the image set corresponding to the type of the current live video; and render the display interface of the current live video according to the target image set, so that the user interacts based on the rendered display interface.
  • An example of the present application further provides a third-party live broadcast management platform, including an interaction apparatus in live video streaming according to an example of the present application, the apparatus including: an information acquiring module, configured to acquire information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded; an image-set acquiring module, configured to acquire, according to the image-set interface data, the image set corresponding to the type and use it as the target image set; and an image-set sending module, configured to send the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • An example of the present application further provides a terminal device, including an interaction apparatus in live video streaming according to an example of the present application, the apparatus including: a target image-set receiving module, configured to receive a target image set sent by a third-party live broadcast management platform, where the target image set is obtained by the third-party live broadcast management platform according to information of the current live video, the information including the type of the current live video and the image-set interface data to be loaded, and the target image set corresponds to the type of the current live video; and a rendering module, configured to render the display interface of the current live video according to the target image set, so that the user interacts based on the rendered display interface.
  • An example of the present application provides an interaction system in live video streaming, including a third-party live broadcast management platform provided by an example of the present application and a terminal device provided by an example of the present application.
  • An example of the present application further provides a non-transitory computer-readable storage medium storing one or more programs that, when executed by a device, cause the device to perform an interaction method in live video streaming according to an example of the present application.
  • An example of the present application further provides a computer program product; when instructions in the computer program product are executed by a processor, an interaction method in live video streaming is performed, the method comprising: acquiring information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded; acquiring, according to the image-set interface data, the image set corresponding to the type and using it as the target image set; and sending the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • An example of the present application further provides a computer program product; when instructions in the computer program product are executed by a processor, an interaction method in live video streaming is performed, the method comprising: receiving a target image set sent by a third-party live broadcast management platform, where the target image set is obtained by the third-party live broadcast management platform according to information of the current live video, the information including the type of the current live video and the image-set interface data to be loaded, and the target image set is the image set corresponding to the type of the current live video; and rendering the display interface of the current live video according to the target image set, so that the user interacts based on the rendered display interface.
  • FIG. 1A is a schematic diagram of a system architecture involved in some examples of the present application.
  • FIG. 1B is a schematic flowchart of an interaction method in a live video broadcast according to some examples of the present application.
  • FIG. 2 is a schematic diagram showing the effect of displaying an interface in an example of the present application.
  • FIG. 3 is a schematic diagram of an interactive control in the related art.
  • FIG. 4 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 5 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 6 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 7 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 8 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 9 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 10 is a schematic flowchart of an interaction method in a live video broadcast proposed by other examples of the present application.
  • FIG. 11 is a flowchart of resource delivery by a third-party live broadcast management platform in an example of the present application.
  • FIG. 12 is a schematic structural diagram of an interaction device in a live video broadcast according to some examples of the present application.
  • FIG. 13 is a schematic structural diagram of an interaction device in a live video broadcast according to another example of the present application.
  • FIG. 14 is a schematic structural diagram of an interaction apparatus in a live video broadcast proposed by other examples of the present application.
  • FIG. 15 is a schematic structural diagram of an interaction apparatus in a live video broadcast proposed by other examples of the present application.
  • FIG. 16 is a structural block diagram of an interaction apparatus in a live video broadcast according to another example of the present application.
  • The present application proposes an interaction method, apparatus and system for live video streaming, which can be applied to the system architecture shown in FIG. 1A.
  • The background server 110 provides social network services (e.g., user registration, messaging, video generation, message transmission, video transmission, chat session generation, online publishing, and other online social interactions) to a plurality of users through the third-party live broadcast management platform 102, wherein the plurality of users operate their respective terminal devices 104 (e.g., terminal devices 104a-c).
  • the terminal device 104, the third-party live broadcast management platform 102, and the background server 110 interact through one or more networks 106.
  • the third-party live broadcast management platform 102 and the background server 110 may be integrated or may be separately configured.
  • The third-party live broadcast management platform 102 can be implemented by the background server 110, in which case each user connects to the background server 110 via an application client 108 (e.g., application clients 108a-c) executing on the terminal device 104, thereby interacting with other users.
  • the backend server 110 identifies users in the network by their respective user identities, such as usernames, nicknames, or account identities.
  • the application client 108 can be a live application client.
  • a user may trigger a particular service by interacting with a user interface provided by the application client 108.
  • a user can open a live program, record a live video, or watch a live video.
  • FIG. 1B is a schematic flowchart of an interaction method in a live video broadcast proposed by some examples of the present application.
  • The third-party live broadcast management platform may be, for example, a background server device, which is not limited here.
  • In this example, the method is performed by the third-party live broadcast management platform, which can manage related data of a live video application on the terminal device; the live application can be, for example, a Tencent Video application, without restriction.
  • FIG. 2 is a schematic diagram of display-interface effects in an example of the present application, including: a first display interface 21 of the live video in small-screen mode, a second display interface 22 of the live video in full-screen mode, and a main exhibition hall 23.
  • Different display interfaces may be used to display videos broadcast by different broadcasters, for example, broadcaster A in the first display interface 21 and broadcaster B in the second display interface 22.
  • The video broadcast by broadcaster A may be displayed in the first display interface 21 in real time while, simultaneously, the video broadcast by broadcaster B is displayed in the second display interface 22; the number of first display interfaces 21 is not limited to one.
  • the method includes:
  • S101: Acquire information of the current live video, where the information includes the type of the current live video and the image-set interface data to be loaded.
  • The background server device of the terminal device dynamically updates the current live video, such as a news video, and pushes the updated news video in real time for display in the terminal device's live video application.
  • The information of the current live video may be read by the third-party live broadcast management platform from the background server of the terminal device, where the information includes the type of the current live video and the image-set interface data to be loaded.
  • The third-party live broadcast management platform may read the content and/or type tag of the current live video from the background server and parse them to obtain the type of the current live video.
  • The third-party live broadcast management platform may also directly read the content and/or type tag of the current live video from the live video application of the terminal device, where the content may be, for example, speech or text obtained by parsing the video frames, without limitation.
  • Different types of live video broadcast different content.
  • The type of the current live video can be classified as, for example, a general type, a disaster type, or a celebration type; given the diversity of live video, the types are not limited to these.
  • The image set may include, for example, background pictures, illustrations, or special-effect pictures used when displaying the broadcast video in the live video application, without limitation.
  • The image set to be loaded is the image set corresponding to the type of the current live video. For example, if the current live video is of the disaster type, the background image in the corresponding image set is a black background; if the current live video is of the celebration type, the background image in the corresponding image set is a red background. The image sets corresponding to different types of video may be different or the same, without limitation.
  • The image set to be loaded may be downloaded from the Internet by the third-party live broadcast management platform according to the type of the current live video, or may be read by the platform from an image library in local storage based on a storage path, without limitation.
  • When the image set is downloaded from the Internet, the image-set interface data to be loaded may be the corresponding Uniform Resource Locator (URL); when it is read from the image library in the platform's local storage based on a storage path, the image-set interface data to be loaded is the storage path of the image library.
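As an illustration of how the image-set interface data might be resolved, the sketch below (in Python; the `LiveVideoInfo` structure and the file names are assumptions for illustration, not part of this application) treats the interface data uniformly as either a URL prefix or a local storage path:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LiveVideoInfo:
    video_type: str      # e.g. "general", "disaster", "celebration"
    interface_data: str  # a URL or the storage path of the local image library

def resolve_image_set(info: LiveVideoInfo) -> List[str]:
    """Build the locations of the image set matching the video type.

    If the interface data is a URL, the set would be downloaded from the
    Internet; otherwise it is read from the image library in local storage.
    """
    base = info.interface_data.rstrip("/")
    return [f"{base}/{info.video_type}/background.png",
            f"{base}/{info.video_type}/effect.png"]

info = LiveVideoInfo("disaster", "/data/image_library")
print(resolve_image_set(info)[0])
```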
  • FIG. 3 is a schematic diagram of an interactive control in the related art.
  • In FIG. 3, the icon is rendered as a thumbs-up gesture, and the user can like the live video by operating the interactive control; the interaction effect is relatively monotonous.
  • In the example of the present application, the information of the current live video may be obtained, where the information includes the type of the current live video, and the image set to be displayed can then be determined based on that type. Because different types of video correspond to different image sets, the display and interaction effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • S102: Acquire, according to the image-set interface data, the image set corresponding to the type, and use it as the target image set.
  • The image set corresponding to the type may be obtained from the image library according to the image-set interface data and used as the target image set.
  • The target image set may include background pictures, illustrations, or special-effect pictures corresponding to the type of the current live video, without limitation; specifically, the target image set matches the emotion conveyed by that type.
  • An image library can be configured in advance and stored in local storage.
  • The image-set interface data can be the storage path of the image library, and the third-party live broadcast management platform can directly access local storage based on that storage path to read the image set corresponding to the type of the current live video. That is, the image library stores multiple types and an image set corresponding to each type, where the multiple types may be obtained by statistically classifying videos; the image library may be established statistically in advance, and the specific establishment process of the image library and the multiple types is described in the following examples.
  • In this way, the timeliness of obtaining the corresponding image set is guaranteed, the sensory discomfort that dynamic switching of image sets brings to the user is avoided, and the user experience is improved.
  • Because the image library is established statistically in advance, the image set read from it matches the video type well; this improves the timeliness of obtaining the corresponding image set and ensures the quality of the image set.
  • S103: Send the target image set to the terminal device, so that the terminal device renders the display interface of the current live video according to the target image set.
  • The third-party live broadcast management platform sends the target image set to the terminal device in real time, and the terminal device loads it and performs subsequent processing.
  • For the subsequent processing procedure of the terminal device, refer to the following examples, which are not described here.
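Steps S101-S103 on the third-party live broadcast management platform can be sketched as follows; the `IMAGE_LIBRARY` contents and the `send_to_terminal` stub are illustrative assumptions, not part of this application:

```python
# Sketch of S101-S103: acquire the current video's type, look up the
# matching image set in a pre-built image library, and push it to the
# terminal device for rendering.
IMAGE_LIBRARY = {
    "general":     {"background": "blue_bg.png",  "effects": []},
    "disaster":    {"background": "black_bg.png", "effects": ["candle.png"]},
    "celebration": {"background": "red_bg.png",   "effects": ["fireworks.png"]},
}

def send_to_terminal(payload: dict) -> dict:
    """Stand-in for the real-time push channel to the terminal device."""
    return payload

def handle_current_video(video_type: str) -> dict:
    target_set = IMAGE_LIBRARY[video_type]   # S102: type -> target image set
    return send_to_terminal(target_set)      # S103: send for rendering

print(handle_current_video("celebration")["background"])
```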
  • the method may further include:
  • S401: Obtain the icon of the interactive control corresponding to the type from the image library, and send the icon of the interactive control to the terminal device, where the interactive control is used to enable the user to interact with the current live video; different types of live video correspond to different interactive controls, and different interactive controls have different icons.
  • That is, not only is the corresponding image set determined based on the type of the current live video, but the icon of the corresponding interactive control is further determined based on the type, where the interactive control enables the user to interact with the current live video.
  • The interactive control may be, for example, a control that lets the user perform a like operation, or a control that lets the user trigger a comment operation, without limitation.
  • The interactive controls corresponding to different types of live video are different. For example, if the current live video is of the disaster type, the corresponding interactive control is a control that lets the user perform a blessing operation; if the current live video is of the celebration type, the corresponding interactive control is a control that lets the user perform a flower-sending operation. The interactive controls corresponding to different types of video may be different or the same, without limitation.
  • S402: Obtain the interactive control identifier corresponding to the type of the current live video from the control identifier library, and send the interactive control identifier to the terminal device.
  • The corresponding interactive control identifier enables the terminal device to load and arrange the corresponding interactive control in the live display interface of the video.
  • This identifier can be used to uniquely tag the corresponding interactive control.
  • The third-party live broadcast management platform may obtain the interactive control identifier corresponding to the type of the current live video from the control identifier library, or may download it from the Internet. That is, the control identifier library stores multiple types and the interactive control identifier corresponding to each type; the control identifier library may be established statistically in advance, and its specific establishment process is described in the following example.
  • Because the library is established in advance, the video type and the corresponding interactive control identifier match well; besides obtaining the identifier in a timely manner, the interactive control it designates matches the user's emotional response to the currently broadcast video.
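A minimal sketch of the type-to-control lookup in S401-S402 follows; the identifiers and icon file names are illustrative assumptions, none of which are specified by this application:

```python
# Sketch of S401-S402: each video type maps to an interactive control
# identifier (used by the terminal to load the control) and an icon.
CONTROL_ID_LIBRARY = {
    "general":     {"control_id": "ctl_like",  "icon": "thumbs_up.png"},
    "disaster":    {"control_id": "ctl_bless", "icon": "hands_folded.png"},
    "celebration": {"control_id": "ctl_gift",  "icon": "gift.png"},
}

def control_for(video_type: str) -> dict:
    """Look up the interactive control identifier and icon for a video type."""
    return CONTROL_ID_LIBRARY[video_type]

print(control_for("disaster")["control_id"])
```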
  • the method may further include:
  • The preset time can be configured in advance.
  • The preset time may be preset by the factory program of the third-party live broadcast management platform, or may be set by the user of the third-party live broadcast management platform according to their own needs, without limitation.
  • The user can be the user to which the terminal device belongs, and/or other users on the Internet who watch the current live video.
  • In this way, not only the interaction information of the user to which the terminal device belongs, but also the interaction information of other Internet users watching the current live video can be counted, so that the user obtains richer interaction information, enhancing user interaction from another dimension.
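Counting interaction information within the preset time could look like the following sketch, where the event format and the window length are assumptions for illustration:

```python
# Sketch: aggregate interaction events (from the local user and other
# Internet viewers) that occurred within the preset time window.
import time

def count_interactions(events, window_seconds=60, now=None):
    """Count interactions per kind that occurred within the preset window."""
    now = now if now is not None else time.time()
    counts = {}
    for kind, timestamp in events:
        if now - timestamp <= window_seconds:
            counts[kind] = counts.get(kind, 0) + 1
    return counts

now = 1_000_000.0
events = [("like", now - 10), ("bless", now - 20), ("like", now - 300)]
print(count_interactions(events, window_seconds=60, now=now))
```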
  • In this example, the information of the current live video is obtained, where the information includes the type of the current live video and the image-set interface data to be loaded; the image set corresponding to the type is obtained according to the image-set interface data and used as the target image set; and the target image set is sent to the terminal device so that the terminal device renders the display interface of the current live video according to the target image set. Because the corresponding image set is determined and displayed according to the type of the current live video, and different types of video correspond to different image sets, the display and interaction effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • FIG. 6 is a schematic flowchart diagram of an interaction method in a live video broadcast proposed by other examples of the present application.
  • the method includes:
  • S601: Statistically classify the types of videos to obtain multiple types.
  • Massive video data currently broadcast on the Internet can be crawled and analyzed, and multiple possible video types can be derived from the emotions conveyed by the video content. For example, if the video content is a documentary, a weather forecast, archaeology, or the like, the video can be determined to be of the general type based on the emotion it conveys; if the video content is an earthquake or other disaster video, it can be determined to be of the disaster type; if the video content is commemorative, celebratory, or award-related, it can be determined to be of the celebration type, without limitation.
  • S602 Determine a group map corresponding to each type and a corresponding interactive control, and determine an icon of the interactive control, and obtain a plurality of group maps, a plurality of interactive controls, and icons of the plurality of interactive controls.
  • the group diagram and the corresponding interactive control corresponding to each type may be determined by manual labeling, or may be based on the machine learning method, based on the existing group map and the crawled from the Internet.
  • the icons of the interactive controls determine the most appropriate group map and corresponding interactive controls for each type.
  • the background image in the corresponding group image may be marked as a blue background, and the corresponding interactive control is determined as an interactive control that enables the user to perform the like operation, and the icon of the interactive control
  • the type of the video is a disaster type
  • the background image in the corresponding group image may be marked as a black background or a gray background, and the corresponding interactive control is determined to enable the user to perform the blessing operation.
  • the interactive control, the icon of the interactive control can be, for example, "hands folded"; if the type of the video is a celebration type, the background image in the corresponding group picture can be marked as a red background, and the corresponding interactive control is determined to be the user.
  • An interactive control capable of performing a gift operation, and the icon of the interactive control may be, for example, a “gift-shaped icon”, which is not limited thereto.
• The video types determined in the examples of the present application number at least two, the corresponding group maps number at least two, and the corresponding interactive controls and icons number at least two, so that the display effects and interactive effects of the live video broadcast are diversified.
• S603: Create a group library based on each type, its corresponding group map, and the icon of its corresponding interactive control, and use the storage path of the group library as the group map interface data.
• In some examples, the correspondence between each type, its group map, and the icon of its interactive control may be generated in advance, and the group library is established based on that correspondence. Pre-establishing the group library allows the group map corresponding to the type to be obtained directly from the group library later and used as the target group map, ensuring acquisition efficiency.
• S604: Establish a control identifier library based on each type and the identifier of its corresponding interactive control.
• In some examples, the correspondence between each type and the identifier of its interactive control may be generated in advance, and the control identifier library is established based on that correspondence. The control identifier library is configured to provide, in real time, the identifier of the interactive control that matches the type of the current live video, so that the terminal device can load the corresponding interactive control in real time. This avoids the sensory discomfort that dynamic switching of interactive controls would bring to the user and improves the user experience. Pre-establishing the control identifier library allows the identifier of the interactive control corresponding to the type to be obtained directly from the library later, ensuring acquisition efficiency.
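S603 and S604 can be sketched together: persist a group library keyed by type (its storage path becomes the group map interface data) and build a control identifier library in parallel. File layout and identifier format are illustrative assumptions.

```python
import json
import os

def build_libraries(type_assets, directory):
    """Pre-build the group library (S603) and control identifier library (S604).

    Returns the group library's storage path (used as group map interface
    data) and an in-memory type -> control identifier mapping.
    """
    group_library_path = os.path.join(directory, "group_library.json")
    group_library = {t: {"group_map": a["background"], "icon": a["icon"]}
                     for t, a in type_assets.items()}
    with open(group_library_path, "w") as f:
        json.dump(group_library, f)
    # Hypothetical "ctrl-" prefix for control identifiers.
    control_id_library = {t: f"ctrl-{a['control']}" for t, a in type_assets.items()}
    return group_library_path, control_id_library
```

Because both libraries are built ahead of time, later per-video lookups are a single key access rather than a crawl or a model inference.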
• S605: Acquire information about the current live video, where the information includes the type of the current live video and the group map interface data to be loaded.
• In some examples, the background server device of the terminal device dynamically updates the current live video, such as a news video, and pushes the updated news video in real time for display in the live video application of the terminal device.
• The information about the current live video may be read by the third-party live broadcast management platform from the background server of the terminal device, where the information includes the type of the current live video and the group map interface data to be loaded. The third-party live broadcast management platform may read the content and/or type tag of the current live video from the background server and parse the content and/or type tag to obtain the type of the current live video. Alternatively, the third-party live broadcast management platform may read the content and/or type tag of the current live video directly from the live video application of the terminal device, where the content may be, for example, a voice message or text news obtained by parsing video frames, which is not limited thereto.
  • the content of the video broadcasted by different types of live video is different.
  • the type of the current live video can be classified into a common type, a disaster type, or a celebration type.
• The types are not limited thereto, given the diversity of live video. The group map may be, for example, a background picture, an illustration, or a special-effect picture involved in displaying the broadcast video in the live video application, and is not limited thereto.
• The group map to be loaded is the group map corresponding to the type of the current live video. For example, if the current live video is of the disaster type, the background image in the corresponding group map is a black background; if the current live video is of the celebration type, the background image in the corresponding group map is a red background. The group maps corresponding to different types of video may be different or the same, which is not limited.
• The group map to be loaded may be downloaded from the Internet by the third-party live broadcast management platform according to the type of the current live video, or may be read by the platform from the group library in local storage based on a storage path, which is not limited.
• If the group map to be loaded is downloaded from the Internet, the group map interface data to be loaded may be a corresponding Uniform Resource Locator (URL); if it is read from the group library in the local storage of the third-party live broadcast management platform based on a storage path, the group map interface data to be loaded is the storage path of the group library.
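The "information of the current live video" read in S605 can be sketched as a small record carrying the type and the group map interface data, which is either a URL or a storage path. The dictionary shape and key names are assumptions for illustration.

```python
def read_live_video_info(backend):
    """Sketch of S605: read the type tag and group map interface data.

    `backend` stands in for a record read from the background server or the
    live video application; the key names are hypothetical.
    """
    return {
        "type": backend["type_tag"],
        # Interface data is a storage path when the group library is local,
        # otherwise a URL for an Internet download.
        "group_map_interface": backend.get("group_library_path")
                               or backend.get("group_map_url"),
    }
```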
• S606: Obtain the group map corresponding to the type from the group library according to the group map interface data, and use it as the target group map.
• The target group map may include a background image, an illustration, or a special-effect picture corresponding to the type of the current live video, and is not limited thereto. Specifically, the target group map corresponds to the emotion conveyed by the type.
• This guarantees the timeliness of obtaining the corresponding group map, avoids the sensory discomfort that dynamic switching of group maps would bring to the user, and improves the user experience. Because the group library is established statistically in advance, obtaining the group map corresponding to the type from the group library as the target group map gives a better match between the video type and its group map, improves the timeliness of obtaining the corresponding group map, and ensures the quality of the group map.
  • S607 Send the target group image to the terminal device, so that the terminal device renders the display interface of the current live video according to the target group image.
• In some examples, the third-party live broadcast management platform sends the target group map to the terminal device in real time, and the terminal device loads it and performs subsequent processing.
• For the subsequent processing procedure of the terminal device, refer to the description of the foregoing examples; details are not repeated herein.
• Pre-establishing the group library allows the group map corresponding to the type to be obtained directly from the group library and used as the target group map, ensuring acquisition efficiency. The control identifier library provides, in real time, the identifier of the interactive control matching the type of the current live video, so that the terminal device can load the corresponding interactive control in real time, avoiding the sensory discomfort that dynamic switching of interactive controls would bring to the user and improving the user experience. Pre-establishing the control identifier library likewise allows the identifier of the interactive control corresponding to the type to be obtained directly from that library, ensuring acquisition efficiency; the match between the video type and its group map is better, the timeliness of obtaining the corresponding group map is improved, and the quality of the group map is ensured.
• Because the group map is determined based on the type of the current live video, and the group maps corresponding to different types of video are different, the display effects and interactive effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • FIG. 7 is a schematic flowchart diagram of an interaction method in a live video broadcast proposed by other examples of the present application.
• The terminal device may be, for example, a personal computer, a mobile phone, a tablet computer, or other hardware device running any of various operating systems. The executor of this embodiment is the terminal device, which is configured with a live video application; the live video application may be, for example, a Tencent video application, which is not limited.
  • the method includes:
• S701: Receive a target group map sent by a third-party live broadcast management platform, where the target group map is obtained by the third-party live broadcast management platform according to the information about the current live video, the information including the type of the current live video and the group map interface data to be loaded, and the target group map is the group map corresponding to the type of the current live video.
  • the third-party live broadcast management platform may be, for example, a background server device, which is not limited thereto.
  • the content of the video broadcasted by different types of live video is different.
  • the type of the current live video can be classified into a common type, a disaster type, or a celebration type.
• The types are not limited thereto, given the diversity of live video.
  • the group map may be, for example, a background image, an illustration, or a special effect picture involved in displaying the broadcasted video in a live video application, and is not limited thereto.
• The group map to be loaded is the group map corresponding to the type of the current live video. For example, if the current live video is of the disaster type, the background image in the corresponding group map is a black background; if the current live video is of the celebration type, the background image in the corresponding group map is a red background. The group maps corresponding to different types of video may be different or the same, which is not limited.
• If the group map to be loaded is downloaded from the Internet, the group map interface data to be loaded may be a corresponding Uniform Resource Locator (URL); if it is read from the group library in the local storage of the third-party live broadcast management platform based on a storage path, the group map interface data to be loaded is the storage path of the group library.
• The target group map may include a background image, an illustration, or a special-effect picture corresponding to the type of the current live video, and is not limited thereto. Specifically, the target group map corresponds to the emotion conveyed by the type.
  • S702 Render a display interface of the current live video according to the target group image, so that the user interacts based on the rendered display interface.
• For example, the background image of the display interface of the current live video may be replaced with a black or gray background in real time, which is not limited.
• The display interface of the current live video may include a first display interface of the live video in small-screen mode, the number of first display interfaces being at least one, and a second display interface of the live video in full-screen mode, as shown in FIG. 2.
• Specifically, S702, rendering the display interface of the current live video according to the target group map so that the user can interact based on the rendered display interface, may include:
  • S801 Render the first display interface and the second display interface according to the target group map.
• For example, the first display interface 21 of the live video in small-screen mode and the second display interface 22 of the live video in full-screen mode shown in FIG. 2 are rendered respectively. If, based on the information about the current live video, the type of the current live video is determined to be the disaster type, the background images of the first display interface 21 and the second display interface 22 of the current live video may be replaced with a black or gray background in real time; other types of video are handled by analogy, which is not limited.
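The step above, rendering both the small-screen and full-screen interfaces from one target group map, can be sketched as follows. The interfaces are plain dictionaries here; a real client would drive its UI toolkit instead of mutating records.

```python
def render_interfaces(interfaces, target_group_map):
    """Sketch of S801: apply the target group map's background to every
    display interface (small-screen and full-screen alike) in real time."""
    for interface in interfaces:
        interface["background"] = target_group_map["background"]
    return interfaces
```

Rendering every interface from the same target group map is what keeps the small-screen and full-screen views visually consistent when the video type changes.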
  • the method may further include:
• The terminal device receives an interactive control identifier and an icon of the interactive control sent by the third-party live broadcast management platform, where the interactive control is used to enable the user to interact with the current live video, the interactive controls corresponding to different types of live video are different, and the icons of different interactive controls are different.
  • S702 may further include:
  • S902 Obtain a corresponding interactive control according to the identifier of the interactive control.
  • S904 The arranged interactive control is rendered according to the icon of the interactive control, so that the user interacts based on the rendered interactive control.
• In the example shown in FIG. 9, based on the interactive control identifier and the icon of the interactive control sent by the third-party live broadcast management platform, an interactive control corresponding to the type of the current live video is displayed in the display interface of the current live video, for example in the lower-right corner of the interface. Compared with the related art, in which only an interactive control for the like operation is arranged in the interface, this diversifies the interactive effect of the live video broadcast.
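The obtain/arrange/render sequence above can be sketched in a few lines. The control registry, the "ctrl-" identifiers, and the lower-right position are illustrative assumptions.

```python
# Hypothetical identifier -> control registry on the terminal device.
CONTROL_REGISTRY = {"ctrl-like": "like", "ctrl-blessing": "blessing", "ctrl-gift": "gift"}

def arrange_control(display_interface, control_id, icon):
    """Obtain the control by identifier (S902), arrange it in the interface,
    and render it with its icon (S904)."""
    control = CONTROL_REGISTRY[control_id]
    display_interface["controls"] = [{
        "name": control,
        "position": "lower-right",  # example placement from the text
        "icon": icon,
    }]
    return display_interface
```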
  • the method may further include:
• S1001: Receive the statistical result sent by the third-party live broadcast platform, the statistical result being obtained by the third-party live broadcast platform collecting interaction information generated by users operating the interactive control within a preset time. The statistical result includes a first statistical result and a second statistical result, where the first statistical result is obtained by the third-party live broadcast platform collecting interaction information generated by the user operating the interactive control within the preset time, and the second statistical result is obtained by collecting interaction information generated by users other than the user operating the interactive control within the preset time.
• The preset time may be set in advance, for example by the factory program of the third-party live broadcast management platform, or by a user of the third-party live broadcast management platform according to their own needs, which is not limited.
• The user may be the user to whom the terminal device belongs and/or other users on the Internet watching the current live video. Not only can the interaction information of the user to whom the terminal device belongs be counted, but also the interaction information of other Internet users watching the current live video, so that the user can obtain varied interaction information, enhancing user interaction from another dimension.
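The split into a first (own user) and second (other users) statistical result over the preset time can be sketched as below. The event shape, a `(user, timestamp)` pair, is an assumption for illustration.

```python
def count_interactions(events, own_user, window_start, window_end):
    """Split interaction events into the first (own user) and second
    (other users) statistical results, limited to the preset time window."""
    first = second = 0
    for user, timestamp in events:
        if not (window_start <= timestamp <= window_end):
            continue  # outside the preset time, not counted
        if user == own_user:
            first += 1
        else:
            second += 1
    return first, second
```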
• Displaying the statistical result may include: receiving from the user an acquisition instruction for the first statistical result and/or the second statistical result, and displaying the statistical result corresponding to the acquisition instruction.
• The statistical result is displayed as follows: when the user operates the interactive control, an interaction effect corresponding to the type of the video to which the interactive control belongs is determined, and the statistical result is displayed with that interaction effect. The interaction effects of different types of live video are different; that is, different types of live video use different interaction effects to display the statistical information.
• The corresponding interaction effect is an upward-floating Bezier curve. If the statistical result is obtained by counting the interaction information of other users, the corresponding interaction effect may also be expressed by setting transparency on the floating Bezier curve and making it rotate as it floats up; if the statistical result is the interaction information of the user to whom the terminal device belongs, the corresponding interaction effect may be expressed by configuring a fireworks-style Bezier curve and displaying "+1" with a "zoom" animation.
• The corresponding interaction effect is a floating Bezier curve. The corresponding interaction effect may also be expressed by setting transparency on the upward-floating Bezier curve; if the statistical result is the interaction information of the user to whom the terminal device belongs, the corresponding interaction effect may be expressed by displaying "+1".
• The corresponding interaction effect is a floating Bezier curve. The corresponding interaction effect may also be expressed by setting transparency on the floating Bezier curve and making it rotate as it floats up; if the statistical result is the interaction information of the user to whom the terminal device belongs, the corresponding interaction effect may be expressed by displaying a fireworks-style Bezier curve and showing "+1" with a "zoom" animation.
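A possible reading of the examples above is that the effect is chosen from the video type and from whose statistics are being shown. The sketch below encodes that choice; the effect descriptors are illustrative strings under that assumption, not a rendering API.

```python
def interaction_effect(video_type, own_user):
    """Pick an interaction effect following the Bezier-curve examples above.

    `own_user=True` means the statistic is the terminal user's own
    interaction information; otherwise it counts other users.
    """
    effect = {"shape": "bezier-curve", "motion": "float-up"}
    if own_user:
        if video_type == "celebration":
            effect.update(style="fireworks", labels=["+1", "zoom"])
        else:
            effect.update(labels=["+1"])
    else:
        # Other users' statistics float up with transparency and rotation.
        effect.update(transparency=True, motion="rotate-up")
    return effect
```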
  • the statistical result may be displayed at the preset position of the anchor room 23 of the current live video with the determined interaction effect, which is not limited thereto.
• FIG. 11 is a flowchart of processing resources sent by a third-party live broadcast management platform in an example of the present application, where the resources include a target group map determined by the third-party live broadcast management platform based on the type of the current live video.
• The target group map sent by the third-party live broadcast management platform is received, where the target group map is obtained by the platform according to the information about the current live video, the information including the type of the current live video and the group map interface data to be loaded, and the target group map is the group map corresponding to the type of the current live video. The display interface of the current live video is rendered according to the target group map so that the user can interact based on the rendered display interface. Because the group map to be displayed is determined by the type of the current live video, and the group maps corresponding to different types of video are different, the display effects and interactive effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • FIG. 12 is a schematic structural diagram of an interaction apparatus in a live video broadcast according to an example of the present application.
  • the device 120 includes: an information obtaining module 1201, a group map obtaining module 1202, and a group map sending module 1203, where
  • the information obtaining module 1201 is configured to obtain information about a current live video, where the information includes: a type of the current live video, and a group interface data to be loaded.
  • the information obtaining module 1201 is specifically configured to:
• the group map obtaining module 1202 is configured to acquire the group map corresponding to the type according to the group map interface data and use it as the target group map.
  • the group map obtaining module 1202 is specifically configured to:
  • the group map corresponding to the type is obtained from the group library and used as the target group map.
  • the group sending module 1203 is configured to send the target group image to the terminal device, so that the terminal device renders the display interface of the current live video according to the target group image.
  • FIG. 13 is a schematic structural diagram of an interaction apparatus in a live video broadcast according to some examples of the present application.
  • the device 130 includes: an information acquiring module 1301, a group map acquiring module 1302, and a group map sending module 1303, where
  • the information obtaining module 1301 is configured to obtain information about a current live video, where the information includes: a type of the current live video, and a group interface data to be loaded.
  • the information obtaining module 1301 is specifically configured to:
• the group map obtaining module 1302 is configured to acquire the group map corresponding to the type according to the group map interface data and use it as the target group map.
  • the group map obtaining module 1302 is specifically configured to:
  • the group map corresponding to the type is obtained from the group library and used as the target group map.
  • the group sending module 1303 is configured to send the target group image to the terminal device, so that the terminal device renders the display interface of the current live video according to the target group image.
  • the apparatus 130 further includes:
• The icon obtaining module 1304 is configured to obtain the icon of the interactive control corresponding to the type from the group library and send the icon of the interactive control to the terminal device, where the interactive control is used to enable the user to interact with the current live video, different types of live video correspond to different interactive controls, and the icons of different interactive controls are different.
  • the identifier obtaining module 1305 is configured to obtain an interaction control identifier corresponding to a type of the current live video from the control identifier library, and send the interaction control identifier to the terminal device.
  • the interactive information statistics module 1306 is configured to collect interaction information of the user to operate the interactive control within a preset time, obtain a statistical result, and send the statistical result to the terminal device.
• The establishing module 1307 is configured to: statistically categorize video types to obtain multiple types; determine the group map and interactive control corresponding to each type and determine the icon of each interactive control, obtaining multiple group maps, multiple interactive controls, and icons of the multiple interactive controls; create the group library based on each type, its corresponding group map, and the icon of its corresponding interactive control, and use the storage path of the group library as the group map interface data; and establish the control identifier library based on each type and the identifier of its corresponding interactive control.
• The information about the current live video is obtained, where the information includes the type of the current live video and the group map interface data to be loaded; the group map corresponding to the type is obtained according to the group map interface data and used as the target group map; and the target group map is sent to the terminal device so that the terminal device renders the display interface of the current live video according to the target group map. Because the group map to be displayed is determined by the type of the current live video, and the group maps corresponding to different types of video are different, the display effects and interactive effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • FIG. 14 is a schematic structural diagram of an interaction apparatus in a live video broadcast proposed by other examples of the present application.
  • the apparatus 140 includes: a target group map receiving module 1401 and a rendering module 1402, where
• The target group map receiving module 1401 is configured to receive a target group map sent by a third-party live broadcast management platform, where the target group map is obtained by the platform according to the information about the current live video, the information including the type of the current live video and the group map interface data to be loaded, and the target group map is the group map corresponding to the type of the current live video.
  • the rendering module 1402 is configured to render the display interface of the current live video according to the target group image, so that the user interacts based on the rendered display interface.
• The display interface of the current live video includes a first display interface of the live video in small-screen mode, the number of first display interfaces being at least one, and a second display interface of the live video in full-screen mode; the rendering module 1402 is specifically configured to render the first display interface and the second display interface respectively according to the target group map.
  • FIG. 15 is a schematic structural diagram of an interaction apparatus in a live video broadcast according to an example of the present application.
  • the device 150 includes: a target group map receiving module 1501 and a rendering module 1502, where
• The target group map receiving module 1501 is configured to receive a target group map sent by a third-party live broadcast management platform, where the target group map is obtained by the platform according to the information about the current live video, the information including the type of the current live video and the group map interface data to be loaded, and the target group map is the group map corresponding to the type of the current live video.
  • the rendering module 1502 is configured to render the display interface of the current live video according to the target group image, so that the user interacts based on the rendered display interface.
• The display interface of the current live video includes a first display interface of the live video in small-screen mode, the number of first display interfaces being at least one, and a second display interface of the live video in full-screen mode; the rendering module 1502 is specifically configured to render the first display interface and the second display interface respectively according to the target group map.
  • the apparatus 150 further includes:
• The identifier and icon obtaining module 1503 is configured to receive the interactive control identifier and the icon of the interactive control sent by the third-party live broadcast management platform, where the interactive control is used to enable the user to interact with the current live video, the interactive controls corresponding to different types of live video are different, and the icons of different interactive controls are different.
  • the rendering module 1502 includes:
  • the obtaining sub-module 15021 is configured to obtain a corresponding interactive control according to the interaction control identifier.
• the arranging sub-module 15022 is configured to arrange the corresponding interactive control in the display interface.
  • the rendering sub-module 15023 is configured to render the arranged interactive control according to the icon of the interactive control, so that the user interacts based on the rendered interactive control.
  • the statistical result receiving module 1504 is configured to receive the statistical result sent by the third-party live broadcast platform, and the statistical result is obtained by the third-party live broadcast platform collecting the interactive information that the user operates the interactive control within a preset time.
  • the statistical result display module 1505 is configured to display statistical results.
  • the statistical result display module 1505 includes:
  • the determining sub-module 15051 is configured to determine an interaction effect corresponding to the type of the video to which the interactive control belongs when the user operates the interactive control.
  • the presentation sub-module 15052 is configured to display statistical results based on the corresponding interaction effects.
• The target group map sent by the third-party live broadcast management platform is received, where the target group map is obtained by the platform according to the information about the current live video, the information including the type of the current live video and the group map interface data to be loaded, and the target group map is the group map corresponding to the type of the current live video. The display interface of the current live video is rendered according to the target group map so that the user can interact based on the rendered display interface. Because the group map to be displayed is determined by the type of the current live video, and the group maps corresponding to different types of video are different, the display effects and interactive effects of the live video broadcast are diversified, the emotional atmosphere of the broadcast video is expressed more appropriately, and the user experience is improved.
  • FIG. 16 is a structural block diagram of an interaction apparatus in a live video broadcast according to another example of the present application.
  • apparatus 1600 can include one or more of the following components: processing component 1602, memory 1604, power component 1606, multimedia component 1608, audio component 1610, input/output (I/O) interface 1612, sensor component 1614, And a communication component 1616.
  • Processing component 1602 typically controls the overall operation of device 1600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 1602 can include one or more processors 1620 to execute instructions to perform all or part of the steps described above.
• In some examples, processing component 1602 can include one or more modules to facilitate interaction between the processing component 1602 and other components; for example, the processing component 1602 can include a multimedia module to facilitate interaction between the multimedia component 1608 and the processing component 1602.
• Memory 1604 is configured to store various types of data to support operation at device 1600. Examples of such data include instructions for any application or method operating on device 1600, contact data, phone book data, messages, pictures, videos, and the like. Memory 1604 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power component 1606 provides power to various components of device 1600.
  • Power component 1606 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 1600.
  • the multimedia component 1608 includes a touch display screen that provides an output interface between the device 1600 and the user.
  • the touch display screen can include a liquid crystal display (LCD) and a touch panel (TP).
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel.
  • the touch sensor can sense not only the boundaries of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • multimedia component 1608 includes a front camera and/or a rear camera. When the device 1600 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data.
  • Each of the front and rear cameras can be a fixed optical lens system or have focusing and optical zoom capability.
  • the audio component 1610 is configured to output and/or input an audio signal.
  • audio component 1610 includes a microphone (MIC) that is configured to receive an external audio signal when device 1600 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 1604 or transmitted via communication component 1616.
  • audio component 1610 also includes a speaker for outputting an audio signal.
  • the I/O interface 1612 provides an interface between the processing component 1602 and a peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 1614 includes one or more sensors for providing state assessment of various aspects to device 1600.
  • sensor assembly 1614 can detect the open/closed state of device 1600 and the relative positioning of components, such as the display and keypad of device 1600; sensor component 1614 can also detect a change in position of device 1600 or one of its components, the presence or absence of user contact with device 1600, the orientation or acceleration/deceleration of device 1600, and temperature changes of device 1600.
  • Sensor assembly 1614 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 1614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 1614 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 1616 is configured to facilitate wired or wireless communication between device 1600 and other devices.
  • the device 1600 can access a wireless network based on a communication standard, such as Wi-Fi, 2G or 3G, or a combination thereof.
  • communication component 1616 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • communication component 1616 also includes a near field communication (NFC) module to facilitate short range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • device 1600 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the interactive method in the live video broadcast described above.
  • the present application also proposes a non-transitory computer readable storage medium storing one or more programs that, when executed by a device, cause the device to perform the interactive method in the live video broadcast of the present application.
  • portions of the application can be implemented in hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, the steps can be implemented by any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and so on.
  • each functional unit in each example of the present application may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as separate products, may also be stored in a computer readable storage medium.
  • the readable storage medium mentioned above may be a read only memory, a magnetic disk or an optical disk or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present application proposes an interactive method, apparatus, and system for live video broadcast, and a computer-readable storage medium. The interactive method includes: obtaining information of the current live video, the information including the type of the current live video and the image-group interface data to be loaded; obtaining, according to the image-group interface data, the image group corresponding to the type as a target image group; and sending the target image group to a terminal device, so that the terminal device renders the display interface of the current live video according to the target image group.

Description

视频直播中的互动方法、装置、系统及计算机可读存储介质
本申请要求于2017年10月10日提交中国专利局、申请号为201710934904.5、发明名称为“视频直播中的互动方法、装置及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及视频直播技术领域,尤其涉及一种视频直播中的互动方法、装置、系统以及计算机可读存储介质。
背景
直播可以分为文字图片直播和视频直播,其中,文字图片直播例如为天涯、豆瓣等论坛的直播帖,视频直播例如为直播体育赛事、新闻等。通过视频直播,用户可以在线实时收看球赛、体育赛事、重大活动和新闻等,此外,用户在观看直播或进行直播时,还可以与主播或其他用户进行互动,例如可以通过点赞来表达用户对直播事件的态度。
技术内容
本申请实例提供了一种视频直播中的互动方法,包括:获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
本申请实例还提供了一种视频直播中的互动方法,包括:接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应组图;根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
本申请实例还提供了一种视频直播中的互动装置,包括:信息获取模块,用于获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;组图获取模块,用于根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;组图发送模块,用于将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
本申请实例还提供了一种视频直播中的互动装置,包括:目标组图接收模块,用于接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应的组图;渲染模块,用于根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
本申请实例还提供了一种视频直播中的互动装置,包括:处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为:获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
本申请实例还提供了一种视频直播中的互动装置,包括:处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为:接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应组图;根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
本申请实例还提供了一种第三方直播管理平台,包括:本申请实例提出的视频直播中的互动装置,所述装置包括:信息获取模块,用于获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;组图获取模块,用于根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;组图发送模块,用于将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
本申请实例提供了一种终端设备,包括:本申请实例提出的视频直播中的互动装置,所述装置包括:目标组图接收模块,用于接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应的组图;渲染模块,用于根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
本申请实例还提供了一种视频直播中的互动系统,包括:本申请实例提供的一种第三方直播管理平台,和本申请实例提供的一种终端设备。
本申请实例还提供了一种非临时性计算机可读存储介质,所述计算机可读存储介质存储有一个或者多个程序,当所述一个或者多个程序被一个设备执行时,使得所述设备执行本申请实例的视频直播中的互动方法。
本申请实例还提供了一种计算机程序产品,当所述计算机程序产品中的指令由处理器执行时,执行一种视频直播中的互动方法,所述方法包括:获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
本申请实例还提供了一种计算机程序产品,当所述计算机程序产品中的指令由处理器执行时,执行一种视频直播中的互动方法,所述方法包括:接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应组图;根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
本申请附加的方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图简要说明
本申请上述的和/或附加的方面和优点从下面结合附图对实例的描述中将变得明显和容易理解,其中:
图1A是本申请一些实例涉及的一种系统构架示意图;
图1B是本申请一些实例提出的视频直播中的互动方法的流程示意图;
图2为本申请实例中展示界面的效果示意图;
图3为相关技术中的互动控件示意图;
图4是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图5是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图6是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图7是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图8是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图9是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图10是本申请另一些实例提出的视频直播中的互动方法的流程示意图;
图11为本申请实例中第三方直播管理平台下发资源的流程图;
图12是本申请一些实例提出的视频直播中的互动装置的结构示意图;
图13是本申请另一些实例提出的视频直播中的互动装置的结构示意图;
图14是本申请另一些实例提出的视频直播中的互动装置的结构示意图;
图15是本申请另一些实例提出的视频直播中的互动装置的结构示意图;
图16为本申请另一些实例提出的视频直播中的互动装置的结构框图。
实施方式
下面详细描述本申请的实例,所述实例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实例是示例性的,仅用于解释本申请,而不能理解为对本申请的限制。相反,本申请的实例包括落入所附加权利要求书的精神和内涵范围内的所有变化、修改和等同物。
本申请提出了一种视频直播中的互动方法、装置及系统,该方法可应用于图1A所示的系统构架中。
如图1A所示,后台服务器110通过第三方直播管理平台102,向多个用户提供社交网络服务(例如,用户注册、消息、视频的生成、消息的传输、视频的传输、聊天会话的生成、在线发布和其他在线社交互动),其中所述多个用户分别操作他们各自的终端设备104(例如,终端设备104a-c)。其中,终端设备104、第三方直播管理平台102、后台服务器110通过一个或多个网络106进行交互。第三方直播管理平台102和后台服务器110可以集成于一体,也可以分开设置。
在一些实施例中,第三方直播管理平台102可以由后台服务器110实现,此时,每个用户通过在终端设备104上执行的应用客户端108(例如,应用客户端108a-c)连接至后台服务器110,从而与另一用户交互。后台服务器110通过用户各自的用户标识(诸如,用户名、昵称或账户标识)来识别网络中的用户。在一些实例中,应用客户端108可以为直播应用客户端。
在一些实例中,用户可以通过与应用客户端108提供的用户界面交互来触发一个特定的服务。例如,用户可以打开直播程序,录制直播视频或观看直播视频。
图1B是本申请一些实例提出的视频直播中的互动方法的流程示意图。
本实例以该视频直播中的互动方法应用在第三方直播管理平台中来举例说明,其中,该第三方直播管理平台可以例如是后台服务器设备,对此不作限制。
本实例的执行主体为第三方直播管理平台,该第三方直播管理平台可以用于对终端设备中视频直播类应用中的相关数据进行管理,该视频直播类应用可以例如为腾讯视频应用,对此不作限制。
该视频直播中的互动方法可以具体应用在视频直播类应用对视频进行直播的过程中,可以理解的是,视频直播类应用对视频进行直播的过程中,会将当前直播视频展示在一个界面中,该界面可以被称为展示界面,参见图2,图2为本申请实例中展示界面的效果示意图,其包括:小屏模式下直播视频的第一展示界面21,和全屏模式下直播视频的第二展示界面22、主播厅23,不同的展示界面可以用于展示多个播报员所播报的视频,例如,第一展示界面21中的播报员A和第二展示界面22中的播报员B进行消息连线过程中,则可以实时地,并且同时地,在第一展示界面21中展示播报员A所播报的视频,在第二展示界面22中展示播报员B所播报的视频,其中,第一展示界面21的个数不限于一个,对此不作限制。
参见图1B,该方法包括:
S101:获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据。
可以理解的是,根据视频直播类应用的工作原理,终端设备的后台服务器设备会对当前直播视频,例如新闻视频,进行动态更新,并实时将更新后的新闻视频推送至终端设备的视频直播类应用中进行展示。
因而,在本申请的实例中,可以由第三方直播管理平台从终端设备的后台服务器中读取当前直播视频的信息,其中的信息包括:当前直播视频的类型、待加载的组图接口数据。
一些实例中,第三方直播管理平台可以从后台服务器中读取当前直播视频的内容和/或类型标签,并对内容和/或类型标签进行解析,以得到当前直播视频的类型。
或者,第三方直播管理平台也可以直接从终端设备的视频直播类应用中读取当前直播视频的内容和/或类型标签,其中的内容可以例如为,对视频帧进行解析得到的语音消息或者文本消息,对此不作限制。
其中,不同类型的直播视频所播报的视频的情感内容不同,例如,当前直播视频的类型可以划分为,普通类型、灾难类型,或者庆祝类型,类型不限于此,依据直播视频的多样性,组图可以例如为,视频直播类应用中展示所播报视频所涉及到的背景图片、插画,或者特效图片等,对此不作限制。
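上述按情感内容确定直播视频类型的过程,可以用如下示意性的Python草案表达:根据类型标签或解析得到的内容文本确定当前直播视频的类型;其中的函数名、类型取值与关键词列表均为说明用的假设,并非本申请的限定:

```python
# Illustrative sketch only: names, type values, and keyword lists are
# assumptions for demonstration, not part of the filing.
DISASTER_WORDS = {"地震", "灾害", "earthquake", "disaster"}
CELEBRATE_WORDS = {"庆祝", "获奖", "celebration", "award"}

def classify_live_video(type_label=None, content_text=""):
    # An explicit type label, if present, takes precedence.
    if type_label in ("normal", "disaster", "celebration"):
        return type_label
    # Otherwise fall back to keywords in the parsed voice/text content.
    text = content_text.lower()
    if any(w.lower() in text for w in DISASTER_WORDS):
        return "disaster"
    if any(w.lower() in text for w in CELEBRATE_WORDS):
        return "celebration"
    return "normal"  # default: ordinary broadcast
```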
待加载的组图为与当前直播视频的类型对应的组图,例如,若当前直播视频的类型为灾难类型,则对应的组图中的背景图片为黑色背景,若当前直播视频的类型为庆祝类型,则对应的组图中的背景图片为红色背景,不同类型的视频对应的组图不同或者相同,对此不作限制。
待加载的组图可以由第三方直播管理平台根据当前直播视频的类型从互联网上下载得到,或者,也可以由第三方直播管理平台从本地存储器中基于一个存储路径从组图库中读取得到,对此不作限制。
若待加载的组图由第三方直播管理平台根据当前直播视频的类型从互联网上下载得到,则待加载的组图接口数据可以为对应的统一资源定位符(Uniform Resource Locator,URL),若从第三方直播管理平台的本地存储器中基于一个存储路径从组图库中读取得到,则该待加载的组图接口数据为该组图库的存储路径。
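上述两种组图接口数据(URL或本地组图库存储路径)的区分,可用如下示意性代码表达;函数名与返回形式为说明用的假设:

```python
import os
from urllib.parse import urlparse

def resolve_group_interface(interface_data):
    """Decide how to fetch the image group: download when the interface
    data is a URL, otherwise treat it as the storage path of the local
    group library. A sketch of the two cases described above."""
    scheme = urlparse(interface_data).scheme
    if scheme in ("http", "https"):
        return ("download", interface_data)
    return ("local", os.path.normpath(interface_data))
```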
相关技术中,在视频直播类应用中,通过展示单一形态的图片,单一形态的互动效果播报直播视频,互动效果的类型单一,不能恰当传达视频直播中的氛围,组图配置的灵活性较差。例如,通常在当前直播视频的展示界面中展示一个用于使用户基于当前直播视频的内容进行点赞操作的互动控件,参见图3,图3为相关技术中的互动控件示意图,该互动控件的图标渲染为一个点赞的手势,用户可以通过操作该互动控件对直播视频进行点赞操作,互动效果较为单一。
而在本申请的实例中,可以获取到当前直播视频的信息,其中,信息包括:当前直播视频的类型,而后,使得后续可以基于该类型确定对应的组图进行展示,由于不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
S102:根据组图接口数据,获取与类型对应的组图并作为目标组图。
一些实例中,可以根据组图接口数据,从组图库中获取与类型对应的组图并作为目标组图。
该目标组图中可以包括,与当前直播视频的类型对应的背景图片、插画,或者特效图片等,对此不作限制,具体地,该目标组图与类型中所传达的情感相对应。
例如,可以预先配置一个组图库,并将该组图库存储在本地存储中,此时,组图接口数据可以为组图库的存储路径,第三方直播管理平台可以基于该存储路径直接从本地存储中读取与当前直播视频的类型对应的组图,即,组图库中存储有多种类型,以及与每种类型对应的组图,其中,所述多种类型可以是通过对视频的类型进行统计规律得到的,该组图库可以是预先采用统计的方式建立的,组图库的具体建立过程以及所述多种类型可以参见下述实例。
通过直接根据组图接口数据,获取与类型对应的组图并作为目标组图,能够保障获取对应组图的时效性,避免组图的动态切换给用户带来的感官不适,提升用户使用体验度。通过从组图库中获取与类型对应的组图并作为目标组图,由于组图库是预先采用统计的方式建立的,因而,其所包含的视频类型与对应的组图的匹配性更佳,在提升获取对应组图时效性的同时,保障组图的展示质量。
S103:将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染。
其中,由第三方直播管理平台将目标组图实时发送至终端设备,由终端设备进行加载并进行后续处理,终端设备的后续处理流程可以参见下述实例,在此不作赘述。
进一步地,参见图4,该方法还可以包括:
S401:从组图库中获取与类型对应的互动控件的图标,并将互动控件的图标发送至终端设备,其中,互动控件用于使用户与当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
本申请的实例中,为了实现视频直播的互动效果的多样化,满足用户的多样化情感表达需求,提升用户使用粘性,不仅仅基于当前直播视频的类型确定对应的组图,还进一步基于该类型确定对应的互动控件的图标,互动控件用于使用户与当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
其中,互动控件可以例如为使用户进行点赞操作的控件,或者,使用户触发进行评论操作的控件,对此不作限制。
不同类型的直播视频对应的互动控件不相同,可以例如,若当前直播视频的类型为灾难类型,则对应的互动控件为用于使用户进行祈福操作的控件,若当前直播视频的类型为庆祝类型,则对应的互动控件为用于使用户进行撒花操作的控件,不同类型的视频对应的互动控件不同或者相同,对此不作限制。
S402:从控件标识库中获取与当前直播视频的类型对应的互动控件标识,并将互动控件标识发送至终端设备。
其中,该对应的互动控件标识用于使终端设备加载并在视频直播的展示界面中布置对应的互动控件。
该标识可以用于唯一标记该对应的互动控件。
具体地,可以由第三方直播管理平台从控件标识库中获取与当前直播视频的类型对应的互动控件标识,或者,也可以从互联网上下载得到,即,控件标识库中存储有多种类型,以及与每种类型对应的互动控件标识,该控件标识库可以是预先采用统计的方式建立的,控件标识库的具体建立过程可以参见下述实例。
其中,S102、S401、以及S402之间无时序限定关系。
通过从控件标识库中获取与类型对应的互动控件标识,由于控件标识库是预先采用统计的方式建立的,因而,其所包含的视频类型与对应的互动控件标识的匹配性更佳,在提升获取对应的互动控件标识时效性的同时,保障互动控件标识对应的互动控件能够契合用户对当前所播报视频的情感需求。
进一步地,参见图5,该方法还可以包括:
S501:统计用户在预设时间内对互动控件进行操作的互动信息,得到统计结果,并将统计结果发送至终端设备。
其中的预设时间可以是预先设定的。
该预设时间可以由第三方直播管理平台的出厂程序预先设定,或者,也可以由第三方直播管理平台的用户根据自身需求进行设定,对此不作限制。
其中的用户可以为终端设备所属的用户,和/或,观看当前直播视频的互联网上的其他用户,对此不作限制。
在本申请的实例中,不仅仅可以统计终端设备所属的用户对互动控件进行操作的互动信息,还能够统计观看当前直播视频的互联网上的其他用户的互动信息,使得用户可以获得多方面的互动信息,从另一个维度增强用户的互动效果。
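对预设时间内互动信息的统计(区分终端设备所属用户与其他用户)可以用如下草案示意;数据结构与字段名为说明用的假设:

```python
def summarize_interactions(events, now, window, self_uid):
    """events: list of (uid, timestamp) interactions on the control.
    Returns counts within the preset time window, split into the
    terminal's own user versus other viewers."""
    recent = [uid for uid, ts in events if now - window <= ts <= now]
    own = sum(1 for uid in recent if uid == self_uid)
    return {"self": own, "others": len(recent) - own}
```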
本实例中,通过获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,根据组图接口数据,获取与类型对应的组图并作为目标组图,以及将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染,由于是基于当前直播视频的类型确定对应的组图进行展示,由于不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
图6是本申请另一些实例提出的视频直播中的互动方法的流程示意图。
本实例以该视频直播中的互动方法应用在第三方直播管理平台中来举例说明,参见图6,该方法包括:
S601:对视频的类型进行统计归类,得到多种类型。
作为一种示例,可以采用统计的方式,对当前互联网上所播报的海量视频数据进行爬取并统计,基于视频内容所传达的情感划分出多个可能的视频类型,例如,若视频内容为纪实类、天气预报类、考古类,则可以基于其所传达的情感确定普通类型的视频,若视频内容为地震类、灾害类视频,则可以基于其所传达的情感确定灾害类型的视频,而若视频内容为纪念类、庆祝类、获奖类,则可以基于其所传达的情感确定庆祝类型的视频,对此不作限制。
S602:确定与每种类型对应的组图和对应的互动控件,以及确定互动控件的图标,得到多种组图、多个互动控件以及多个互动控件的图标。
作为一种示例,可以采用人工标注的方式,确定与每种类型对应的组图和对应的互动控件,或者,也可以基于机器学习的方式,依据从互联网中爬取的已有的组图和互动控件的图标,确定与每种类型最契合的组图和对应的互动控件。
例如,若视频的类型为普通类型,则可以将对应的组图中的背景图片标注为蓝色背景,将对应的互动控件确定为使用户能够进行点赞操作的互动控件,该互动控件的图标可以例如为“竖起大拇指”;若视频的类型为灾难类型,则可以将对应的组图中的背景图片标注为黑色背景或者灰色背景,将对应的互动控件确定为使用户能够进行祈福操作的互动控件,该互动控件的图标可以例如为“双手合十”;若视频的类型为庆祝类型,则可以将对应的组图中的背景图片标注为红色背景,将对应的互动控件确定为使用户能够进行送礼物操作的互动控件,该互动控件的图标可以例如为“礼物形状的图标”,对此不作限制。
本申请实例中所确定的视频的类型为至少两种,对应的组图为至少两种,且,对应的互动控件及图标为至少两个,因此,实现视频直播的展示效果和互动效果的多样化。
S603:基于每种类型和对应的组图,以及对应的互动控件的图标建立组图库,并将组图库的存储路径作为组图接口数据。
其中,可以预先生成每种类型,与与其对应的组图,以及对应的互动控件的图标之间的对应关系,并基于该对应关系建立组图库。
通过预先建立组图库,支撑后续直接从组图库中获取与类型对应的组图并作为目标组图,保障获取效率。
S604:基于每种类型和对应的互动控件的标识建立控件标识库。
其中,可以预先生成每种类型,与与其对应的互动控件的标识之间的对应关系,并基于该对应关系建立控件标识库。
该控件标识库用于实时地提供与当前直播视频的类型匹配的互动控件的标识,使得终端设备能够实时地加载与所提供的互动控件的标识对应的互动控件,避免互动控件的动态切换给用户带来的感官不适,提升用户使用体验度。通过预先建立控件标识库,支撑后续直接从控件标识库中获取与类型对应的互动控件的标识,保障获取效率。
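S601至S604所述的建库过程可以用如下示意性代码表达;其中的类型、背景颜色、图标名与控件标识均为举例假设,并非本申请的限定:

```python
# Assumed correspondences between video types, backgrounds, control icons
# and control identifiers, mirroring the examples in the text above.
TYPE_TABLE = {
    "normal":      {"background": "blue",  "icon": "thumbs_up",     "control_id": "like"},
    "disaster":    {"background": "black", "icon": "praying_hands", "control_id": "pray"},
    "celebration": {"background": "red",   "icon": "gift",          "control_id": "send_gift"},
}

def build_group_library(table):
    """Group library (S603): type -> (image group, control icon)."""
    return {t: {"images": [v["background"] + "_bg.png"], "icon": v["icon"]}
            for t, v in table.items()}

def build_control_id_library(table):
    """Control identifier library (S604): type -> interactive control id."""
    return {t: v["control_id"] for t, v in table.items()}
```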
S605:获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据。
可以理解的是,根据视频直播类应用的工作原理,终端设备的后台服务器设备会对当前直播视频,例如新闻视频,进行动态更新,并实时将更新后的新闻视频推送至终端设备的视频直播类应用中进行展示。
因而,在本申请的实例中,可以由第三方直播管理平台从终端设备的后台服务器中读取当前直播视频的信息,其中的信息包括:当前直播视频的类型、待加载的组图接口数据。
一些实例中,第三方直播管理平台可以从后台服务器中读取当前直播视频的内容和/或类型标签,并对内容和/或类型标签进行解析,以得到当前直播视频的类型。
或者,第三方直播管理平台也可以直接从终端设备的视频直播类应用中读取当前直播视频的内容和/或类型标签,其中的内容可以例如为,对视频帧进行解析得到的语音消息或者文本消息,对此不作限制。
其中,不同类型的直播视频所播报的视频的情感内容不同,例如,当前直播视频的类型可以划分为,普通类型、灾难类型,或者庆祝类型,类型不限于此,依据直播视频的多样性,组图可以例如为,视频直播类应用中展示所播报视频所涉及到的背景图片、插画,或者特效图片等,对此不作限制。
待加载的组图为与当前直播视频的类型对应的组图,例如,若当前直播视频的类型为灾难类型,则对应的组图中的背景图片为黑色背景,若当前直播视频的类型为庆祝类型,则对应的组图中的背景图片为红色背景,不同类型的视频对应的组图不同或者相同,对此不作限制。
待加载的组图可以由第三方直播管理平台根据当前直播视频的类型从互联网上下载得到,或者,也可以由第三方直播管理平台从本地存储器中基于一个存储路径从组图库中读取得到,对此不作限制。
若待加载的组图由第三方直播管理平台根据当前直播视频的类型从互联网上下载得到,则待加载的组图接口数据可以为对应的统一资源定位符(Uniform Resource Locator,URL),若从第三方直播管理平台的本地存储器中基于一个存储路径从组图库中读取得到,则该待加载的组图接口数据为该组图库的存储路径。
S606:根据组图接口数据,从组图库中获取与类型对应的组图并作为目标组图。
该目标组图中可以包括,与当前直播视频的类型对应的背景图片、插画,或者特效图片等,对此不作限制,具体地,该目标组图与类型中所传达的情感相对应。
通过直接根据组图接口数据,获取与类型对应的组图并作为目标组图,能够保障获取对应组图的时效性,避免组图的动态切换给用户带来的感官不适,提升用户使用体验度。通过从组图库中获取与类型对应的组图并作为目标组图,由于组图库是预先采用统计的方式建立的,因而,其所包含的视频类型与对应的组图的匹配性更佳,在提升获取对应组图时效性的同时,保障组图的展示质量。
S607:将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染。
其中,由第三方直播管理平台将目标组图实时发送至终端设备,由终端设备进行加载并进行后续处理,终端设备的后续处理流程可以参见上述实例描述,在此不作赘述。
本实例中,通过预先建立组图库,支撑后续直接从组图库中获取与类型对应的组图并作为目标组图,保障获取效率。该控件标识库用于实时地提供与当前直播视频的类型匹配的互动控件的标识,使得终端设备能够实时地加载与所提供的互动控件的标识对应的互动控件,避免互动控件的动态切换给用户带来的感官不适,提升用户使用体验度。通过预先建立控件标识库,支撑后续直接从控件标识库中获取与类型对应的互动控件的标识,保障获取效率。通过从组图库中获取与类型对应的组图并作为目标组图,由于组图库是预先采用统计的方式建立的,因而,其所包含的视频类型与对应的组图的匹配性更佳,在提升获取对应组图时效性的同时,保障组图的展示质量。通过是基于当前直播视频的类型确定对应的组图进行展示,由于不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
图7是本申请另一些实例提出的视频直播中的互动方法的流程示意图。
本实例以该视频直播中的互动方法应用在终端设备中来举例说明,其中,终端设备可以例如是个人计算机、手机、平板电脑等具有各种操作系统的硬件设备。
本实例的执行主体为终端设备,该终端设备中配置有视频直播类应用的客户端,该视频直播类应用可以例如为腾讯视频应用,对此不作限制。
参见图7,该方法包括:
S701:接收第三方直播管理平台发送的目标组图,其中,目标组图是第三方直播管理平台根据当前直播视频的信息获取的,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,目标组图为与当前直播视频的类型对应的组图。
其中,该第三方直播管理平台可以例如是后台服务器设备,对此不作限制。
其中,不同类型的直播视频所播报的视频的情感内容不同,例如,当前直播视频的类型可以划分为,普通类型、灾难类型,或者庆祝类型,依据直播视频的多样性,类型不限于此。
组图可以例如为,视频直播类应用中展示所播报视频所涉及到的背景图片、插画,或者特效图片等,对此不作限制。
待加载的组图为与当前直播视频的类型对应的组图,例如,若当前直播视频的类型为灾难类型,则对应的组图中的背景图片为黑色背景,若当前直播视频的类型为庆祝类型,则对应的组图中的背景图片为红色背景,不同类型的视频对应的组图不同或者相同,对此不作限制。
若待加载的组图由第三方直播管理平台根据当前直播视频的类型从互联网上下载得到,则待加载的组图接口数据可以为对应的统一资源定位符(Uniform Resource Locator,URL),若从第三方直播管理平台的本地存储器中基于一个存储路径从组图库中读取得到,则该待加载的组图接口数据为该组图库的存储路径。
该目标组图中可以包括,与当前直播视频的类型对应的背景图片、插画,或者特效图片等,对此不作限制,具体地,该目标组图与类型中所传达的情感相对应。
S702:根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动。
例如,若基于当前直播视频的信息,确定当前直播视频的类型为灾难类型,则可以将当前直播视频的展示界面的背景图片实时地替换为黑色背景或者灰色背景,对此不作限制。
一些实例中,当前直播视频的展示界面可以包括:小屏模式下直播视频的第一展示界面,第一展示界面的个数为至少一个,和全屏模式下直播视频的第二展示界面,如图2所示。
具体的,参见图8,S702中根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动,可以包括S801。
S801:根据目标组图分别对第一展示界面和第二展示界面进行渲染。
即,根据目标组图分别对图2所示的小屏模式下直播视频的第一展示界面21以及全屏模式下直播视频的第二展示界面22进行渲染,例如,若基于当前直播视频的信息,确定当前直播视频的类型为灾难类型,则可以分别将当前直播视频的第一展示界面21以及第二展示界面22的背景图片实时地替换为黑色背景或者灰色背景,对于其它类型的视频,依此类推,对此不作限制。
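S801对第一展示界面与第二展示界面的渲染可以用如下草案示意;界面与组图的数据结构为说明用的假设:

```python
def render_interfaces(interfaces, target_group):
    """Apply the target image group's background to every display
    interface (the small-screen first interface(s) and the full-screen
    second interface), as in S801. 'interfaces' maps an interface name
    to its mutable state dict."""
    for state in interfaces.values():
        state["background"] = target_group["images"][0]
    return interfaces
```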
一些实例中,参见图9,该方法还可以包括:
S901:终端设备接收第三方直播管理平台发送的互动控件标识和互动控件的图标,其中,互动控件用于使用户与当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
此时,S702还可以包括:
S902:根据互动控件标识获取对应的互动控件。
S903:在展示界面中布置对应的互动控件。
S904:根据互动控件的图标对所布置的互动控件进行渲染,以使用户基于渲染后的互动控件进行互动。
其中,图9所示实例中,相对应地基于第三方直播管理平台发送的互动控件标识和互动控件的图标进行后续处理,即,在当前直播视频的展示界面中,例如,可以为界面的右下角处,展示与当前直播视频的类型对应的互动控件,相比较于相关技术中,仅仅在界面中布置一个用于使用户点赞的互动控件,能够实现视频直播的互动效果的多样化。
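S902至S904的处理流程可以用如下示意性代码表达;控件注册表的内容与“右下角”的布置位置仅为举例假设:

```python
def attach_control(interface, control_registry, control_id, icon):
    """Sketch of S902-S904: look up the interactive control by its
    identifier, place it on the display interface (bottom-right here,
    an assumed convention), and render it with the icon sent by the
    third-party live management platform."""
    control = dict(control_registry[control_id])   # S902: fetch by id
    control["position"] = "bottom_right"           # S903: place on interface
    control["icon"] = icon                         # S904: render its icon
    interface["controls"] = interface.get("controls", []) + [control]
    return interface
```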
一些实例中,参见图10,该方法还可以包括:
S1001:接收第三方直播平台发送的统计结果,统计结果为第三方直播平台统计用户在预设时间内对互动控件进行操作的互动信息所得到的。
具体的,所述统计结果包括第一统计结果和第二统计结果,所述第一统计结果为所述第三方直播平台统计所述用户在预设时间内对所述互动控件进行操作的互动信息所得到的,所述第二统计结果为所述第三方直播平台统计除所述用户之外的用户在所述预设时间内对互动控件进行操作的互动信息所得到的。
其中的预设时间可以是预先设定的。
该预设时间可以由第三方直播管理平台的出厂程序预先设定,或者,也可以由第三方直播管理平台的用户根据自身需求进行设定,对此不作限制。
其中的用户可以为终端设备所属的用户,和/或,观看当前直播视频的互联网上的其他用户,对此不作限制。
在本申请的实例中,不仅仅可以统计终端设备所属的用户对互动控件进行操作的互动信息,还能够统计观看当前直播视频的互联网上的其他用户的互动信息,使得用户可以获得多方面的互动信息,从另一个维度增强用户的互动效果。
S1002:展示统计结果。
在一些实例中,所述展示统计结果可以是:接收所述用户对所述第一统计结果和/或第二统计结果的获取指令,并展示与所述获取指令对应的统计结果。
在一些实例中,展示统计结果,包括:在用户对互动控件进行操作时,确定与互动控件所属视频的类型对应的互动效果;基于对应的互动效果展示统计结果。
其中,不同类型的直播视频对应的互动效果不相同,即不同类型的直播视频,采用不同类型的互动效果展示统计信息,可以例如,若当前直播视频的类型为普通类型,则对应的互动效果为上浮贝塞尔曲线,此时,若统计结果为对其他用户的互动信息进行统计得到的,则对应的互动效果还可以通过对上浮贝塞尔曲线设定透明度,并且将其设置为旋转上浮的形式来表现,若统计结果为终端设备所属用户的互动信息,则对应的互动效果可以通过配置礼花样式的贝塞尔曲线,并展示“+1”“缩放”的形式来表现。
若当前直播视频的类型为灾难类型,则对应的互动效果为上浮贝塞尔曲线,此时,若统计结果为对其他用户的互动信息进行统计得到的,则对应的互动效果还可以通过对上浮贝塞尔曲线设定透明度的形式来表现,若统计结果为终端设备所属用户的互动信息,则对应的互动效果可以通过展示“+1”的形式来表现。
而若当前直播视频的类型为庆祝类型,则对应的互动效果为上浮贝塞尔曲线,此时,若统计结果为对其他用户的互动信息进行统计得到的,则对应的互动效果还可以通过对上浮贝塞尔曲线设定透明度,并且将其设置为旋转上浮的形式来表现,若统计结果为终端设备所属用户的互动信息,则对应的互动效果可以通过展示礼花样式的贝塞尔曲线,并展示“+1”“缩放”的形式来表现。
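上述按类型与互动来源选择互动效果的逻辑,可以用如下草案示意;透明度数值等具体参数为举例假设:

```python
def pick_effect(video_type, from_self):
    """Select the interaction effect per the rules above: all types use a
    floating Bezier curve; effects for other viewers add transparency
    (and rotation except for the disaster type), while the local user's
    own actions show "+1" (with firework styling and scaling except for
    the disaster type). The 0.5 alpha is an assumed value."""
    effect = {"curve": "bezier_float"}
    if from_self:
        effect["label"] = "+1"
        if video_type in ("normal", "celebration"):
            effect.update(style="firework", scale=True)
    else:
        effect["alpha"] = 0.5
        if video_type in ("normal", "celebration"):
            effect["rotate"] = True
    return effect
```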
例如,参见图2,可以在当前直播视频的主播厅23的预设位置处以所确定的互动效果展示该统计结果,对此不作限制。
参见图11,图11为本申请实例中第三方直播管理平台下发资源的流程图,其中的资源包括本申请实例中,第三方直播管理平台基于当前直播视频的类型所确定的目标组图、对应的互动控件的图标、标识、统计结果等,其中,假设当前直播视频的展示界面包括:小屏模式下直播视频的第一展示界面,和全屏模式下直播视频的第二展示界面,对此不作限制。
本实例中,通过接收第三方直播管理平台发送的目标组图,其中,目标组图是第三方直播管理平台根据当前直播视频的信息获取的,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,目标组图为与当前直播视频的类型对应的组图,根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动,由于是基于当前直播视频的类型确定对应的组图进行展示,且不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
图12是本申请一实例提出的视频直播中的互动装置的结构示意图。
参见图12,该装置120包括:信息获取模块1201、组图获取模块1202,以及组图发送模块1203,其中,
信息获取模块1201,用于获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据。
一些实例中,信息获取模块1201,具体用于:
对当前直播视频的内容和/或类型标签进行解析,以获取当前直播视频的类型。
组图获取模块1202,用于根据组图接口数据,获取与类型对应的组图并作为目标组图。
一些实例中,组图获取模块1202,具体用于:
根据组图接口数据,从组图库中获取与类型对应的组图并作为目标组图。
组图发送模块1203,用于将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染。
需要说明的是,上述对图1B-图6中所示视频直播中的互动方法实例的解释说明也适用于该实例的视频直播中的互动装置120,在此不再赘述。
图13是本申请一些实例提出的视频直播中的互动装置的结构示意图。
参见图13,该装置130包括:信息获取模块1301、组图获取模块1302,以及组图发送模块1303,其中,
信息获取模块1301,用于获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据。
一些实例中,信息获取模块1301,具体用于:
对当前直播视频的内容和/或类型标签进行解析,以获取当前直播视频的类型。
组图获取模块1302,用于根据组图接口数据,获取与类型对应的组图并作为目标组图。
一些实例中,组图获取模块1302,具体用于:
根据组图接口数据,从组图库中获取与类型对应的组图并作为目标组图。
组图发送模块1303,用于将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染。
一些实例中,该装置130还包括:
图标获取模块1304,用于从组图库中获取与类型对应的互动控件的图标,并将互动控件的图标发送至终端设备,其中,互动控件用于使用户与当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
标识获取模块1305,用于从控件标识库中获取与当前直播视频的类型对应的互动控件标识,并将互动控件标识发送至终端设备。
互动信息统计模块1306,用于统计用户在预设时间内对互动控件进行操作的互动信息,得到统计结果,并将统计结果发送至终端设备。
建立模块1307,用于对视频的类型进行统计归类,得到多种类型,确定与每种类型对应的组图和对应的互动控件,以及确定互动控件的图标,得到多种组图、多个互动控件以及多个互动控件的图标,并基于每种类型和对应的组图,以及对应的互动控件的图标建立组图库,并将组图库的存储路径作为组图接口数据,基于每种类型和对应的互动控件的标识建立控件标识库。
需要说明的是,上述对图1B-图6中所示视频直播中的互动方法实例的解释说明也适用于该实例的视频直播中的互动装置130,在此不再赘述。
本实例中,通过获取当前直播视频的信息,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,根据组图接口数据,获取与类型对应的组图并作为目标组图,以及将目标组图发送至终端设备,以使终端设备根据目标组图对当前直播视频的展示界面进行渲染,由于是基于当前直播视频的类型确定对应的组图进行展示,由于不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
图14是本申请另一些实例提出的视频直播中的互动装置的结构示意图。
参见图14,该装置140包括:目标组图接收模块1401和渲染模块1402,其中,
目标组图接收模块1401,用于接收第三方直播管理平台发送的目标组图,其中,目标组图是第三方直播管理平台根据当前直播视频的信息获取的,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,目标组图为与当前直播视频的类型对应的组图。
渲染模块1402,用于根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动。
一些实例中,当前直播视频的展示界面包括:小屏模式下直播视频的第一展示界面,第一展示界面的个数为至少一个,和全屏模式下直播视频的第二展示界面,渲染模块1402,具体用于:
根据目标组图分别对第一展示界面和第二展示界面进行渲染。
需要说明的是,上述对图7-图11中所示视频直播中的互动方法实例的解释说明也适用于该实例的视频直播中的互动装置140,在此不再赘述。
图15是本申请实例提出的视频直播中的互动装置的结构示意图。
参见图15,该装置150包括:目标组图接收模块1501和渲染模块1502,其中,
目标组图接收模块1501,用于接收第三方直播管理平台发送的目标组图,其中,目标组图是第三方直播管理平台根据当前直播视频的信息获取的,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,目标组图为与当前直播视频的类型对应的组图。
渲染模块1502,用于根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动。
一些实例中,当前直播视频的展示界面包括:小屏模式下直播视频的第一展示界面,第一展示界面的个数为至少一个,和全屏模式下直播视频的第二展示界面,渲染模块1502,具体用于:
根据目标组图分别对第一展示界面和第二展示界面进行渲染。
一些实例中,该装置150还包括:
标识图标获取模块1503,用于接收第三方直播管理平台发送的互动控件标识和互动控件的图标,其中,互动控件用于使用户与当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
渲染模块1502,包括:
获取子模块15021,用于根据互动控件标识获取对应的互动控件。
布置子模块15022,用于在展示界面中布置对应的互动控件。
渲染子模块15023,用于根据互动控件的图标对所布置的互动控件进行渲染,以使用户基于渲染后的互动控件进行互动。
统计结果接收模块1504,用于接收第三方直播平台发送的统计结果,统计结果为第三方直播平台统计用户在预设时间内对互动控件进行操作的互动信息所得到的。
统计结果展示模块1505,用于展示统计结果。
统计结果展示模块1505,包括:
确定子模块15051,用于在用户对互动控件进行操作时,确定与互动控件所属视频的类型对应的互动效果。
展示子模块15052,用于基于对应的互动效果展示统计结果。
需要说明的是,上述对图7-图11中所示视频直播中的互动方法实例的解释说明也适用于该实例的视频直播中的互动装置150,在此不再赘述。
本实例中,通过接收第三方直播管理平台发送的目标组图,其中,目标组图是第三方直播管理平台根据当前直播视频的信息获取的,其中,信息包括:当前直播视频的类型、待加载的组图接口数据,目标组图为与当前直播视频的类型对应的组图,根据目标组图对当前直播视频的展示界面进行渲染,以使用户基于渲染后的展示界面进行互动,由于是基于当前直播视频的类型确定对应的组图进行展示,由于不同类型的视频所对应的组图不相同,因而,实现视频直播的展示效果和互动效果的多样化,更恰当地表达所播视频的情感氛围,提升用户使用体验度。
图16为本申请另一些实例提出的视频直播中的互动装置的结构框图。
参照图16,装置1600可以包括以下一个或多个组件:处理组件1602,存储器1604,电源组件1606,多媒体组件1608,音频组件1610,输入/输出(I/O)的接口1612,传感器组件1614,以及通信组件1616。
处理组件1602通常控制装置1600的整体操作,诸如与显示,电话呼叫,数据通信,相机操作和记录操作相关联的操作。处理组件1602可以包括一个或多个处理器1620来执行指令,以完成上述的方法的全部或部分步骤。此外,处理组件1602可以包括一个或多个模块,便于处理组件1602和其他组件之间的交互。例如,处理组件1602可以包括多媒体模块,以方便多媒体组件1608和处理组件1602之间的交互。
存储器1604被配置为存储各种类型的数据以支持在装置1600的操作。这些数据的示例包括用于在装置1600上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器1604可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
电源组件1606为装置1600的各种组件提供电力。电源组件1606可以包括电源管理系统,一个或多个电源,及其他与为装置1600生成、管理和分配电力相关联的组件。
多媒体组件1608包括在装置1600和用户之间的提供一个输出接口的触控显示屏。在一些实例中,触控显示屏可以包括液晶显示器(LCD)和触摸面板(TP)。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与触摸或滑动操作相关的持续时间和压力。在一些实例中,多媒体组件1608包括一个前置摄像头和/或后置摄像头。当装置1600处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜系统或具有焦距和光学变焦能力。
音频组件1610被配置为输出和/或输入音频信号。例如,音频组件1610包括一个麦克风(MIC),当装置1600处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器1604或经由通信组件1616发送。在一些实例中,音频组件1610还包括一个扬声器,用于输出音频信号。
I/O接口1612为处理组件1602和外围接口模块之间提供接口,上述外围接口模块可以是键盘,点击轮,按钮等。这些按钮可包括但不限于:主页按钮、音量按钮、启动按钮和锁定按钮。
传感器组件1614包括一个或多个传感器,用于为装置1600提供各个方面的状态评估。例如,传感器组件1614可以检测到装置1600的打开/关闭状态,组件的相对定位,例如组件为装置1600的显示器和小键盘,传感器组件1614还可以检测装置1600或装置1600中一个组件的位置改变,用户与装置1600接触的存在或不存在,装置1600方位或加速/减速和装置1600的温度变化。传感器组件1614可以包括接近传感器,被配置用来在没有任何的物理接触时检测附近物体的存在。传感器组件1614还可以包括光传感器,如CMOS或CCD图像传感器,用于在成像应用中使用。在一些实例中,该传感器组件1614还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器。
通信组件1616被配置为便于装置1600和其他设备之间有线或无线方式的通信。装置1600可以接入基于通信标准的无线网络,如Wi-Fi,2G或3G,或它们的组合。在一个示例性实例中,通信组件1616经由广播信道接收来自外部广播管理系统的广播信号或广播相关信息。在一个示例性实例中,通信组件1616还包括近场通信(NFC)模块,以促进短程通信。例如,在NFC模块可基于射频识别(RFID)技术,红外数据协会(IrDA)技术,超宽带(UWB)技术,蓝牙(BT)技术和其他技术来实现。
在示例性实例中,装置1600可以被一个或多个应用专用集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理设备(DSPD)、可编程逻辑器件(PLD)、现场可编程门阵列(FPGA)、控制器、微控制器、微处理器或其他电子元件实现,用于执行上述视频直播中的互动方法。
为了实现上述实例,本申请还提出一种非临时性计算机可读存储介质,计算机可读存储介质存储有一个或者多个程序,当一个或者多个程序被一个设备执行时,使得设备执行本申请实例的视频直播中的互动方法。
需要说明的是,在本申请的描述中,术语“第一”、“第二”等仅用于描述目的,而不能理解为指示或暗示相对重要性。此外,在本申请的描述中,除非另有说明,“多个”的含义是两个或两个以上。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实例所属技术领域的技术人员所理解。
应当理解,本申请的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实例的步骤之一或其组合。
此外,在本申请各个实例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读存储介质中。
上述提到的可读存储介质可以是只读存储器,磁盘或光盘等。
在本说明书的描述中,参考术语“一个实例”、“一些实例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实例或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实例或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实例或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实例或示例中以合适的方式结合。
尽管上面已经示出和描述了本申请的实例,可以理解的是,上述实例是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实例进行变化、修改、替换和变型。

Claims (22)

  1. 一种视频直播中的互动方法,由第三方直播管理平台执行,包括以下步骤:
    获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;
    根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;
    将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
  2. 如权利要求1所述的视频直播中的互动方法,其中,所述根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图,包括:
    根据所述组图接口数据,从组图库中获取与所述类型对应的组图并作为目标组图。
  3. 如权利要求2所述的视频直播中的互动方法,其中,还包括:
    从所述组图库中获取与所述类型对应的互动控件的图标,并将所述互动控件的图标发送至所述终端设备,其中,所述互动控件用于使所述用户与所述当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
  4. 如权利要求2或3所述的视频直播中的互动方法,其中,还包括:
    从控件标识库中获取与所述当前直播视频的类型对应的互动控件标识,并将所述互动控件标识发送至所述终端设备。
  5. 如权利要求3所述的视频直播中的互动方法,其中,还包括:
    统计所述用户在预设时间内对所述互动控件进行操作的互动信息,得到统计结果,并将所述统计结果发送至所述终端设备。
  6. 如权利要求4所述的视频直播中的互动方法,其中,在所述获取当前直播视频的信息之前,还包括:
    对视频的类型进行统计归类,得到多种类型;
    确定与每种类型对应的组图和对应的互动控件,以及确定所述互动控件的图标,得到多种组图、多个互动控件以及多个互动控件的图标;
    基于所述每种类型和对应的组图,以及对应的互动控件的图标建立所述组图库,并将所述组图库的存储路径作为所述组图接口数据;
    基于所述每种类型和对应的互动控件的标识建立所述控件标识库。
  7. 如权利要求1-6任一项所述的视频直播中的互动方法,其中,所述获取当前直播视频的信息,包括:
    对所述当前直播视频的内容和/或类型标签进行解析,以获取所述当前直播视频的类型。
  8. 一种视频直播中的互动方法,由终端设备执行,包括以下步骤:
    接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应的组图;
    根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
  9. 如权利要求8所述的视频直播中的互动方法,其中,所述当前直播视频的展示界面包括:小屏模式下直播视频的第一展示界面,所述第一展示界面的个数为至少一个,和全屏模式下直播视频的第二展示界面,所述根据所述目标组图对所述当前直播视频的展示界面进行渲染,包括:
    根据所述目标组图分别对所述第一展示界面和所述第二展示界面进行渲染。
  10. 如权利要求8所述的视频直播中的互动方法,其中,还包括:
    接收所述第三方直播管理平台发送的互动控件标识和互动控件的图标,其中,所述互动控件用于使所述用户与所述当前直播视频进行互动操作,不同类型的直播视频对应的互动控件不相同,不同互动控件的图标不相同。
  11. 如权利要求10所述的视频直播中的互动方法,其中,所述根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动,包括:
    根据所述互动控件标识获取对应的互动控件;
    在所述展示界面中布置所述对应的互动控件;
    根据所述互动控件的图标对所布置的互动控件进行渲染,以使用户基于渲染后的互动控件进行互动。
  12. 如权利要求10所述的视频直播中的互动方法,其中,还包括:
    接收所述第三方直播平台发送的统计结果,所述统计结果包括第一统计结果和第二统计结果,所述第一统计结果为所述第三方直播平台统计所述用户在预设时间内对所述互动控件进行操作的互动信息所得到的,所述第二统计结果为所述第三方直播平台统计除所述用户之外的用户在所述预设时间内对互动控件进行操作的互动信息所得到的。
  13. 如权利要求12所述的视频直播中的互动方法,还包括:
    接收所述用户对所述第一统计结果和/或第二统计结果的获取指令,并展示与所述获取指令对应的统计结果。
  14. 如权利要求13所述的视频直播中的互动方法,其中,所述展示与所述获取指令对应的统计结果,包括:
    在所述用户对所述互动控件进行操作时,确定与所述互动控件所属视频的类型对应的互动效果;
    基于所述对应的互动效果展示与所述获取指令对应的统计结果。
  15. 一种视频直播中的互动装置,其中,包括:
    信息获取模块,用于获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;
    组图获取模块,用于根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;
    组图发送模块,用于将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
  16. 一种视频直播中的互动装置,其中,包括:
    目标组图接收模块,用于接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应的组图;
    渲染模块,用于根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
  17. 一种视频直播中的互动装置,其中,包括处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为:
    获取当前直播视频的信息,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据;
    根据所述组图接口数据,获取与所述类型对应的组图并作为目标组图;
    将所述目标组图发送至终端设备,以使所述终端设备根据所述目标组图对所述当前直播视频的展示界面进行渲染。
  18. 一种视频直播中的互动装置,其中,包括处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为:
    接收第三方直播管理平台发送的目标组图,其中,所述目标组图是所述第三方直播管理平台根据当前直播视频的信息获取的,其中,所述信息包括:所述当前直播视频的类型、待加载的组图接口数据,所述目标组图为与所述当前直播视频的类型对应的组图;
    根据所述目标组图对所述当前直播视频的展示界面进行渲染,以使用户基于所述渲染后的展示界面进行互动。
  19. 一种第三方直播管理平台,其中,包括:
    如权利要求15或17所述的视频直播中的互动装置。
  20. 一种终端设备,其中,包括:
    如权利要求16或18所述的视频直播中的互动装置。
  21. 一种视频直播中的互动系统,包括:
    如权利要求19所述的第三方直播管理平台,和如权利要求20所述的终端设备。
  22. 一种计算机可读存储介质,该计算机可读存储介质上存储有计算机可读指令,该计算机可读指令被处理器执行时,使得处理器执行如权利要求1至14任一项所述的方法。
PCT/CN2018/108167 2017-10-10 2018-09-28 视频直播中的互动方法、装置、系统及计算机可读存储介质 WO2019072096A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710934904.5 2017-10-10
CN201710934904.5A CN109660853B (zh) 2017-10-10 2017-10-10 视频直播中的互动方法、装置及系统

Publications (1)

Publication Number Publication Date
WO2019072096A1 true WO2019072096A1 (zh) 2019-04-18

Family

ID=66101262

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/108167 WO2019072096A1 (zh) 2017-10-10 2018-09-28 视频直播中的互动方法、装置、系统及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN109660853B (zh)
WO (1) WO2019072096A1 (zh)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111258693A (zh) * 2020-01-13 2020-06-09 奇安信科技集团股份有限公司 远程显示方法及装置
CN111935489A (zh) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 网络直播方法、信息显示方法、装置、直播服务器及终端设备
CN113253901A (zh) * 2021-03-15 2021-08-13 北京字跳网络技术有限公司 一种直播间内的交互方法、装置、设备及存储介质
CN113490005A (zh) * 2021-06-30 2021-10-08 北京达佳互联信息技术有限公司 直播间的信息交互方法、装置、电子设备及存储介质
CN114298891A (zh) * 2021-12-29 2022-04-08 广州久邦世纪科技有限公司 一种基于OpenGL ES架构的图片处理方法
CN114489905A (zh) * 2022-01-27 2022-05-13 广州方硅信息技术有限公司 直播间活动数据处理方法及其装置、设备、介质、产品
CN114745558A (zh) * 2021-01-07 2022-07-12 北京字节跳动网络技术有限公司 直播监控方法、装置、系统、设备及介质
CN114793295A (zh) * 2021-01-25 2022-07-26 腾讯科技(深圳)有限公司 视频的处理方法、装置、电子设备及计算机可读存储介质
CN115225928A (zh) * 2022-05-11 2022-10-21 北京广播电视台 一种多类型音视频混播系统及方法

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069310B (zh) * 2019-04-23 2022-04-22 北京小米移动软件有限公司 切换桌面壁纸的方法、装置及存储介质
CN113253880B (zh) * 2020-02-11 2024-03-08 腾讯科技(深圳)有限公司 互动场景的页面的处理方法、装置及存储介质
CN111277853B (zh) * 2020-02-28 2023-09-08 腾讯科技(深圳)有限公司 直播信息的处理方法及装置
CN111741362A (zh) * 2020-08-11 2020-10-02 恒大新能源汽车投资控股集团有限公司 与视频用户互动的方法和装置
CN112333518B (zh) * 2020-09-22 2022-12-27 北京达佳互联信息技术有限公司 用于视频的功能配置方法、装置及电子设备
CN114765692B (zh) * 2021-01-13 2024-01-09 北京字节跳动网络技术有限公司 一种直播数据处理方法、装置、设备及介质
CN112464031A (zh) * 2021-02-02 2021-03-09 北京达佳互联信息技术有限公司 交互方法、装置、电子设备以及存储介质
CN115225916B (zh) * 2021-04-15 2024-04-23 北京字节跳动网络技术有限公司 视频处理方法、装置及设备
CN113766296B (zh) * 2021-05-10 2023-10-13 腾讯科技(深圳)有限公司 直播画面的展示方法和装置
CN113660504B (zh) * 2021-08-18 2024-04-16 北京百度网讯科技有限公司 消息展示方法、装置、电子设备以及存储介质
CN115767198A (zh) * 2022-11-10 2023-03-07 北京字跳网络技术有限公司 针对直播的组件触发方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102457780A (zh) * 2010-10-19 2012-05-16 腾讯科技(北京)有限公司 为网络视频提供即时数据的方法及系统
US20140043426A1 (en) * 2012-08-11 2014-02-13 Nikola Bicanic Successive real-time interactive video sessions
CN103997691A (zh) * 2014-06-02 2014-08-20 合一网络技术(北京)有限公司 视频交互的方法和系统
CN106487781A (zh) * 2016-09-13 2017-03-08 腾讯科技(深圳)有限公司 基于直播的资源数据处理方法、装置和系统
CN106791936A (zh) * 2016-11-28 2017-05-31 广州华多网络科技有限公司 一种虚拟礼物的展示方法及装置

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100701856B1 (ko) * 2005-08-12 2007-04-02 삼성전자주식회사 Method for providing background effects for messages in a mobile communication terminal
WO2008113947A2 (fr) * 2007-02-28 2008-09-25 France Telecom Method for transmitting information for the collective rendering of emotional information
CN101271528B (zh) * 2008-04-11 2012-06-27 北京中星微电子有限公司 Method and apparatus for outputting images
CN102455906B (zh) * 2010-11-01 2014-12-10 腾讯科技(深圳)有限公司 Method and system for changing player skins
CN102637433B (zh) * 2011-02-09 2015-11-25 富士通株式会社 Method and system for recognizing emotional states carried in speech signals
US8937620B1 (en) * 2011-04-07 2015-01-20 Google Inc. System and methods for generation and control of story animation
US20130151970A1 (en) * 2011-06-03 2013-06-13 Maha Achour System and Methods for Distributed Multimedia Production
US8719277B2 (en) * 2011-08-08 2014-05-06 Google Inc. Sentimental information associated with an object within a media
US20130232516A1 (en) * 2012-03-01 2013-09-05 David S. PAULL Method And Apparatus for Collection and Analysis of Real-Time Audience Feedback
CN106162221A (zh) * 2015-03-23 2016-11-23 阿里巴巴集团控股有限公司 Method, apparatus and system for synthesizing live video
GB2546468A (en) * 2015-11-13 2017-07-26 Velapp Ltd Video system
CN106101810B (zh) * 2016-08-15 2019-09-20 青岛海信电器股份有限公司 Interface theme changing method and apparatus for a smart TV, and smart TV
CN106454532A (zh) * 2016-10-21 2017-02-22 合网络技术(北京)有限公司 Video player and interactive display method thereof
CN106791893B (zh) * 2016-11-14 2020-09-11 北京小米移动软件有限公司 Live video broadcast method and apparatus
CN106713967A (zh) * 2016-12-09 2017-05-24 武汉斗鱼网络科技有限公司 Virtual gift display method based on React Native
CN106792229B (zh) * 2016-12-19 2020-08-21 广州虎牙信息科技有限公司 Voting interaction method and apparatus based on bullet-screen comments in a live room video stream
CN106959795A (zh) * 2017-02-23 2017-07-18 北京潘达互娱科技有限公司 Icon display method and apparatus in a live streaming application


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111935489A (zh) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 Network live broadcast method, information display method, apparatus, live broadcast server and terminal device
CN111935489B (zh) * 2019-05-13 2023-08-04 阿里巴巴集团控股有限公司 Network live broadcast method, information display method, apparatus, live broadcast server and terminal device
CN111258693B (zh) * 2020-01-13 2024-04-09 奇安信科技集团股份有限公司 Remote display method and apparatus
CN111258693A (zh) * 2020-01-13 2020-06-09 奇安信科技集团股份有限公司 Remote display method and apparatus
CN114745558A (zh) * 2021-01-07 2022-07-12 北京字节跳动网络技术有限公司 Live broadcast monitoring method, apparatus, system, device and medium
CN114745558B (zh) * 2021-01-07 2024-04-09 北京字节跳动网络技术有限公司 Live broadcast monitoring method, apparatus, system, device and medium
CN114793295B (zh) * 2021-01-25 2023-07-07 腾讯科技(深圳)有限公司 Video processing method, apparatus, electronic device and computer-readable storage medium
CN114793295A (zh) * 2021-01-25 2022-07-26 腾讯科技(深圳)有限公司 Video processing method, apparatus, electronic device and computer-readable storage medium
CN113253901A (zh) * 2021-03-15 2021-08-13 北京字跳网络技术有限公司 Interaction method, apparatus, device and storage medium in a live room
CN113490005B (zh) * 2021-06-30 2023-08-18 北京达佳互联信息技术有限公司 Information interaction method, apparatus, electronic device and storage medium for a live room
CN113490005A (zh) * 2021-06-30 2021-10-08 北京达佳互联信息技术有限公司 Information interaction method, apparatus, electronic device and storage medium for a live room
CN114298891A (zh) * 2021-12-29 2022-04-08 广州久邦世纪科技有限公司 Image processing method based on the OpenGL ES architecture
CN114489905A (zh) * 2022-01-27 2022-05-13 广州方硅信息技术有限公司 Live room activity data processing method, apparatus, device, medium and product
CN115225928A (zh) * 2022-05-11 2022-10-21 北京广播电视台 Multi-type audio and video mixed broadcasting system and method
CN115225928B (zh) * 2022-05-11 2023-07-25 北京广播电视台 Multi-type audio and video mixed broadcasting system and method

Also Published As

Publication number Publication date
CN109660853A (zh) 2019-04-19
CN109660853B (zh) 2022-12-30

Similar Documents

Publication Publication Date Title
WO2019072096A1 (zh) Interaction method, apparatus, system and computer-readable storage medium in live video streaming
CN106028166B (zh) Method and apparatus for switching live rooms during a live broadcast
US10235305B2 (en) Method and system for sharing content, device and computer-readable recording medium for performing the method
WO2018072741A1 (zh) Task management based on instant messaging messages
US10334282B2 (en) Methods and devices for live broadcasting based on live broadcasting application
CN111343476A (zh) Video sharing method, apparatus, electronic device and storage medium
CN107526591B (zh) Method and apparatus for switching live room type
EP2985980B1 (en) Method and device for playing stream media data
WO2016011742A1 (zh) Call method, apparatus and system
JP2018502408A (ja) Method, apparatus, device and system for pushing information
CN109451341B (zh) Video playback method, video playback apparatus, electronic device and storage medium
US20210389856A1 (en) Method and electronic device for displaying interactive content
CN113365153B (zh) Data sharing method, apparatus, storage medium and electronic device
WO2017101345A1 (zh) Video playback method and apparatus
CN114025180A (zh) Game operation synchronization system, method, apparatus, device and storage medium
US9794415B2 (en) Calling methods and devices
CN112291631A (zh) Information acquisition method, apparatus, terminal and storage medium
CN112256180A (zh) Message display method, message comment processing method, apparatus and electronic device
CN110636318A (zh) Message display method, apparatus, client device, server and storage medium
CN110620956A (zh) Notification method, apparatus, electronic device and storage medium for live broadcast virtual resources
CN112616053B (zh) Transcoding method and apparatus for live video, and electronic device
CN113988021A (zh) Content interaction method, apparatus, electronic device and storage medium
CN110913276B (zh) Data processing method, apparatus, server, terminal and storage medium
CN109831538A (zh) Message processing method, apparatus, server, terminal and medium
US11496787B2 (en) Information processing method and device, electronic device, and storage medium

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18866501

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry into the European phase

Ref document number: 18866501

Country of ref document: EP

Kind code of ref document: A1