WO2024099450A1 - Live broadcast room page display method, apparatus, electronic device and storage medium


Info

Publication number
WO2024099450A1
Authority
WO
WIPO (PCT)
Prior art keywords
live broadcast
component
live
area
target
Prior art date
Application number
PCT/CN2023/131113
Other languages
English (en)
French (fr)
Inventor
李林兴
Original Assignee
北京字跳网络技术有限公司
Priority date
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司
Publication of WO2024099450A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application

Definitions

  • the embodiments of the present disclosure relate to the field of Internet technology, and in particular, to a method, device, electronic device and storage medium for displaying a live broadcast room page.
  • in live broadcast applications for competitive events, users can enter the event live broadcast room and watch the live video of the game through the live broadcast application client running on the terminal device.
  • in addition to the window area for playing the live video, the live broadcast application client also has a related function area, and users trigger the interactive components in the function area to realize corresponding interactive functions, such as sending messages and liking.
  • however, the interactive components are fixedly arranged in the function area, which leads to low operating efficiency and a poor interactive effect.
  • the embodiments of the present disclosure provide a method, device, electronic device and storage medium for displaying a live broadcast room page, so as to overcome the problems of low operating efficiency and poor interactive effect of interactive components in the live broadcast room page.
  • an embodiment of the present disclosure provides a live broadcast room page display method, including:
  • a live broadcast room page is displayed, wherein the live broadcast room page includes a first area and a second area, wherein the first area is used to display a live video; based on the content of the live video displayed in the first area, a target interactive component is displayed in the second area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • a live broadcast room page display device including:
  • a display module used to display a live broadcast room page, wherein the live broadcast room page includes a first area and a second area, wherein the first area is used to display a live broadcast video;
  • a processing module is used to display a target interactive component in the second area according to the content of the live video displayed in the first area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • an electronic device including:
  • a processor and a memory communicatively connected to the processor
  • the memory stores computer-executable instructions
  • the processor executes the computer-executable instructions stored in the memory to implement the live broadcast room page display method described in the first aspect and various possible designs of the first aspect.
  • an embodiment of the present disclosure provides a computer-readable storage medium in which computer-executable instructions are stored.
  • when a processor executes the computer-executable instructions, the live broadcast room page display method described in the first aspect and its various possible designs is implemented.
  • an embodiment of the present disclosure provides a computer program product, including a computer program, which, when executed by a processor, implements the live broadcast room page display method described in the first aspect and various possible designs of the first aspect.
  • the live broadcast room page display method, device, electronic device and storage medium provided in this embodiment display the live broadcast room page, wherein the live broadcast room page includes a first area and a second area, wherein the first area is used to display the live broadcast video; according to the content of the live broadcast video displayed in the first area, the target interactive component is displayed in the second area, and the target interactive component is used to execute the component function corresponding to the live broadcast video after being triggered.
  • the corresponding target interactive component is determined by the content of the live broadcast video, and the corresponding target interactive component is displayed when different live broadcast videos are played, achieving dynamic display of interactive components in the live broadcast room page, improving the interaction efficiency of users when triggering corresponding interactive components at different stages of the live broadcast, and increasing the concentration of interactive components triggered by users in the live broadcast room, thereby improving the interaction effect of users in the live broadcast room.
  • FIG1 is a diagram of an application scenario of a live broadcast room page display method provided by an embodiment of the present disclosure
  • FIG2 is a flow chart of a method for displaying a live broadcast room page according to an embodiment of the present disclosure
  • FIG3 is a schematic diagram of a live broadcast room page provided by an embodiment of the present disclosure.
  • FIG4 is a schematic diagram of another live broadcast room page provided in an embodiment of the present disclosure.
  • FIG5 is a schematic diagram showing a second area provided by an embodiment of the present disclosure.
  • FIG6 is a flowchart of a specific implementation method of step S102 in the embodiment shown in FIG2;
  • FIG7 is a schematic diagram of a process of displaying a target interaction component provided by an embodiment of the present disclosure
  • FIG8 is a second flow chart of a method for displaying a live broadcast room page according to an embodiment of the present disclosure
  • FIG9 is a flowchart of a specific implementation method of step S204 in the embodiment shown in FIG8;
  • FIG10 is a schematic diagram of determining the position of a target interactive component based on live event information provided by an embodiment of the present disclosure
  • FIG11 is a structural block diagram of a live broadcast room page display device provided in an embodiment of the present disclosure.
  • FIG12 is a schematic diagram of the structure of an electronic device provided by an embodiment of the present disclosure.
  • FIG13 is a schematic diagram of the hardware structure of an electronic device provided in an embodiment of the present disclosure.
  • FIG1 is an application scenario diagram of the live broadcast room page display method provided by the embodiment of the present disclosure.
  • the live broadcast room page display method provided by the embodiment of the present disclosure can be applied to the application scenario of live broadcast of events.
  • the method provided by the embodiment of the present disclosure can be applied to a terminal device.
  • a client of a live broadcast application runs in the terminal device.
  • the terminal device communicates with a server running a live broadcast service end.
  • the server sends live broadcast data to the terminal device.
  • the terminal device obtains the live broadcast data and plays it in the client page, thereby realizing the video broadcast of competitive events such as football matches, basketball matches, and e-sports matches.
  • the live broadcast room page of the live broadcast application client is provided with interactive components for realizing interaction between users, such as the "chat" component and the "voting" component shown in the figure.
  • when a user triggers an interactive component, the live broadcast application client will execute the corresponding target function, such as displaying the function page corresponding to the "chat" component, or displaying the function page corresponding to the "voting" component.
  • in the prior art, interactive components are usually fixed; that is, before the live broadcast room is opened, the live broadcast room is configured through the server, and one or more interactive components are set for fixed display.
  • the user enters the live broadcast room page through the live broadcast application client and executes the corresponding fixed target functions based on the several fixed interactive components provided by the live broadcast room.
  • the live broadcast process of the live broadcast room for the event will be divided into at least several live broadcast stages, such as before the start of the game, during the game, and after the game. In different live broadcast stages, the interactive components required by users are different.
  • for example, in the live broadcast stage before the game starts, the users in the live broadcast room need to use the "match prediction" interactive component, while in the live broadcast stage during the game, they need to use the "chat" interactive component.
  • users can only trigger the corresponding required interactive components by manually selecting from all the supported interactive components in the live broadcast room page at different live broadcast stages.
  • most interactive components will be in a folded state, which further reduces the operation efficiency in the selection process of interactive components, resulting in low operation efficiency and poor interaction effect.
  • the disclosed embodiment provides a live broadcast room page display method to solve the above-mentioned problem.
  • FIG. 2 is a flow chart of a method for displaying a live broadcast room page provided in an embodiment of the present disclosure.
  • the method of this embodiment can be applied in a terminal device or a server. Taking a terminal device as an example, the method includes:
  • Step S101 Display the live broadcast room page, which includes a first area and a second area, wherein the first area is used to display the live video.
  • the execution subject of the method in this embodiment is a terminal device, such as a smartphone.
  • a client of a live broadcast application is running in the terminal device.
  • the live broadcast room page is displayed in the client of the terminal device.
  • FIG3 is a schematic diagram of a live broadcast room page provided in an embodiment of the present disclosure.
  • the live broadcast room page includes a first area and a second area.
  • the first area is located at the upper part of the live broadcast room page, and the second area is located at the lower part of the live broadcast room page.
  • the first area is used to display live videos, such as live broadcast videos of football games, basketball games, etc.
  • the second area is used to display content related to the live video, such as interactive components: public chat channel components, game lineup display components, etc.
  • FIG 4 is a schematic diagram of another live broadcast room page provided by an embodiment of the present disclosure.
  • while FIG3 shows the device in a vertical screen (portrait) state, FIG4 exemplarily shows the device in a horizontal screen (landscape) state, in which the first area and the second area are arranged side by side: the first area for playing the live video is located on the left side of the live broadcast room page, and the second area for displaying the interactive components is located on the right side.
  • in this way, the layout better adapts to the picture size of the live video (game video), and the interactive functions of the live broadcast room page remain available while the visual experience is improved.
  • Step S102 Display a target interactive component in the second area according to the content of the live video displayed in the first area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • an interactive component matching the content of the live video, i.e., a target interactive component, is displayed in the second area, and the component content it displays corresponds to the component function for the live video.
  • the second area may include one or more target interactive components.
  • when the second area includes one target interactive component, executing the component function corresponding to the live video exemplarily includes: displaying the component name and component content of the target interactive component in the second area; when the second area includes multiple target interactive components, it exemplarily includes: displaying, in the second area, the component names of the multiple target interactive components corresponding to the content of the live video, as well as the component content of the currently triggered target interactive component.
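As a rough sketch of the single-component versus multi-component display logic described above (the class, function, and example names are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class InteractiveComponent:
    name: str     # component name shown in the component list area
    content: str  # component content shown when this component is triggered

def render_second_area(components, triggered_index=0):
    """Return the component names for the list area and the content of the
    currently triggered component for the content area."""
    names = [c.name for c in components]
    content = components[triggered_index].content if components else ""
    return names, content

# Multiple target interactive components: all names are listed, but only
# the currently triggered component's content is displayed.
components = [
    InteractiveComponent("Lineup Introduction", "Team_1 lineup: ..."),
    InteractiveComponent("Win/Loss Prediction", "Prediction panel"),
    InteractiveComponent("Channel Chat", "Live-room messages"),
]
names, content = render_second_area(components, triggered_index=0)
```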
  • FIG5 is a schematic diagram of a second area display provided by an embodiment of the present disclosure.
  • the second area includes a component list area and a component content area. Taking a football match as an example, FIG5 shows the first stage, before the match starts.
  • the content of the live video displayed in the first area is an introduction to the players and lineups of both teams.
  • the corresponding target interactive components, including "lineup introduction", "win/loss prediction" and "channel chat", are displayed in the component list area of the second area.
  • in the component list area, the currently triggered target interactive component is "Lineup Introduction", and the corresponding component content area displays, for example, the team lineup of Team_1.
  • when another target interactive component is triggered, the component content area displays the corresponding component content. For example, when "Channel Chat" is clicked, the component content area displays the component content corresponding to the "Channel Chat" interactive component, such as the messages of users in the live broadcast room (not shown in the figure).
  • the content of the live video displayed in the first area is the specific match situation of the football game.
  • the corresponding target interactive components, including "channel chat" and "match data", are displayed in the component list area in the second area.
  • the currently triggered target interactive component is "channel chat", and the component content corresponding to the "channel chat" interactive component is displayed in the component content area.
  • when another target interactive component is triggered, the component content area displays the corresponding component content, which will not be repeated here.
  • when no trigger operation is received, the client of the terminal device determines a default target interactive component as the currently triggered target interactive component.
  • for example, before the start of the game, the default triggered target interactive component is determined to be interactive component A (such as the "lineup introduction" component); after the start of the game, it is determined to be interactive component B (such as the "channel chat" component), so that users in the live broadcast room use the same interactive component in the default state, thereby improving the concentration of interactive components used in the live broadcast room and improving the interaction effect.
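A minimal sketch of this default-selection rule, assuming the client knows a simple stage label (the stage and component identifiers are hypothetical):

```python
# Hypothetical mapping from live broadcast stage to the target interactive
# component that is triggered by default when no user trigger is received.
DEFAULT_COMPONENT_BY_STAGE = {
    "before_match": "lineup_introduction",  # interactive component A
    "in_match": "channel_chat",             # interactive component B
}

def default_triggered_component(stage: str) -> str:
    # Fall back to channel chat for stages without an explicit default.
    return DEFAULT_COMPONENT_BY_STAGE.get(stage, "channel_chat")
```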
  • step S102 includes:
  • Step S1021 Receive the live broadcast information sent by the server, where the live broadcast information represents the content of the live video currently displayed on the live broadcast room page.
  • Step S1022 Determine at least one target interactive component based on the live broadcast information.
  • Step S1023 Display at least one target interactive component in the second area.
  • the live broadcast data of the target live broadcast room sent by the server includes live broadcast information for representing the content of the live video currently displayed on the live broadcast room page.
  • the live broadcast information includes live broadcast event information, and the live broadcast event information represents the live broadcast event occurring in the live broadcast video.
  • the live broadcast event information is, for example, an identifier of the live broadcast event; the live broadcast event information is triggered based on the content of the live broadcast video and is used to represent events at specific content segments.
  • the live broadcast events corresponding to the content of the live broadcast video include, for example, "goal", "substitution", "foul", "start of the first/second half of the game", "end of the first/second half of the game", etc.
  • the live broadcast event is triggered based on the content of the live broadcast video, which means that when the content of the live broadcast video meets the requirements (for example, a foul or a goal occurs), the live broadcast information sent by the server to the client on the terminal device side will include live broadcast event information, and the live broadcast event information represents the identifier of a specific live broadcast event.
  • after the live broadcast event information is received, the corresponding target live event is determined, and based on a preset mapping relationship, the target interactive components corresponding to the target live event are obtained. For example, when live event information info_1 is received in the live information, the corresponding target live event is determined to be "match start", and the target interactive components obtained are "channel chat", "team information" and "real-time match data"; when live event information info_2 is received in the live information, the corresponding target live event is determined to be "match end", and the target interactive components obtained are "channel chat" and "final match data".
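The preset mapping relationship above can be sketched as a simple lookup table; the event identifiers and component names follow the example but are otherwise assumptions:

```python
# Hypothetical preset mapping between live event identifiers and the
# target interactive components to display for that event.
EVENT_TO_COMPONENTS = {
    "info_1": ["channel_chat", "team_information", "real_time_match_data"],  # match start
    "info_2": ["channel_chat", "final_match_data"],                          # match end
}

def target_components_for(live_event_info: str) -> list:
    # Fall back to an empty list when no mapping exists for the event.
    return EVENT_TO_COMPONENTS.get(live_event_info, [])
```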
  • the live broadcast information may also include other information, such as live broadcast room information that characterizes the number of viewers in the live broadcast room, so as to further determine a more accurate target live broadcast event.
  • the specific implementation method is described in detail in the subsequent embodiments and will not be repeated here.
  • a highly real-time and diverse target interactive component determination method is achieved, the rationality of the target interactive component displayed in real time is improved, the concentration of interactive components used by users in the live broadcast room is improved, and the interactive effect is improved.
  • the target interactive component is displayed in the second area in an application scenario in which the terminal device is in a vertical screen state.
  • the target interactive component in the second area is not triggered by default, that is, the component content of the target interactive component is not displayed in the second area before the terminal device receives a trigger instruction.
  • the specific implementation of step S102 includes:
  • Step S1023 According to the content of the live video, a corresponding preview component is displayed in the second area, and the component appearance of the preview component is used to represent the content of at least one interactive function.
  • Step S1024 In response to a trigger operation on the preview component, display at least one target interactive component in the second area.
  • FIG7 is a schematic diagram of a process of displaying a target interactive component provided by an embodiment of the present disclosure.
  • when the device is in a horizontal screen (landscape) state, the live broadcast room page better matches the picture size of the live video.
  • when the screen size of the terminal device is the same as the picture size of the live video, full-screen playback without borders can be achieved, thereby achieving a better video viewing experience.
  • the second area partially overlaps with the first area, that is, the second area is a sub-area of the first area.
  • in this case, a preview component floating above the live video is first displayed in the second area (within the first area), thereby reducing the interference of the interactive components with the user's viewing of the live video. The component appearance of the preview component is used to characterize the content of at least one interactive function, for example, characters representing the component names of the interactive components (in the figure, the component appearance of the preview component includes the characters "chat" and "data").
  • the terminal device responds to the trigger operation on the preview component and displays the target interactive component in the second area.
  • the first area and the second area no longer overlap, and the second area corresponds to an independent area, so as to better display the component content of the target interactive component.
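The preview-to-expanded flow in steps S1023 and S1024 can be sketched as a small state object; the class and method names are illustrative assumptions:

```python
class SecondArea:
    """Sketch of the landscape-mode flow: a preview component floats over
    the live video until triggered, after which the second area expands
    into an independent region showing the target interactive components."""

    def __init__(self, preview_labels):
        self.preview_labels = preview_labels  # e.g. the characters "chat", "data"
        self.expanded = False                 # preview overlays the first area
        self.components = []

    def on_preview_triggered(self, target_components):
        # The second area no longer overlaps the first area and displays
        # the target interactive components instead of the preview.
        self.expanded = True
        self.components = target_components
        return self.components

area = SecondArea(["chat", "data"])
shown = area.on_preview_triggered(["channel_chat", "match_data"])
```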
  • the specific implementation method of determining the target interactive component according to the content of the live video in the horizontal screen state and displaying the target interactive component in the second area is similar to the implementation method of displaying the target interactive component in the second area in the vertical screen state, and will not be repeated here.
  • the live broadcast room page includes a first area and a second area, wherein the first area is used to display the live broadcast video; according to the content of the live broadcast video displayed in the first area, the target interactive component is displayed in the second area, and the target interactive component is used to execute the component function corresponding to the live broadcast video after being triggered.
  • the corresponding target interactive component is determined by the content of the live broadcast video, and the corresponding target interactive component is displayed when different live broadcast videos are played, achieving dynamic display of interactive components in the live broadcast room page, improving the interaction efficiency of users when triggering corresponding interactive components at different stages of the live broadcast, and increasing the concentration of interactive components triggered by users in the live broadcast room, thereby improving the interaction effect of users in the live broadcast room.
  • FIG8 is a second flow chart of a method for displaying a live broadcast room page provided by an embodiment of the present disclosure. This embodiment is based on the embodiment shown in FIG2.
  • the method for determining and displaying the target interactive component in step S102 is further refined.
  • the method for displaying a live broadcast room page includes:
  • Step S201 Display the live broadcast room page, which includes a first area and a second area, wherein the first area is used to display the live video.
  • Step S202 Receive the live broadcast information sent by the server, where the live broadcast information includes live broadcast event information and/or live broadcast room information; the live broadcast event information represents the live broadcast event occurring in the live video, and the live broadcast room information represents the number of times a target interactive event is triggered in the live broadcast room.
  • Step S203 Determine the corresponding target interactive component according to the live broadcast information.
  • the content of the live video is determined by the live information sent by the server, and then the corresponding target interactive component is determined.
  • the live information includes live event information and live room information, wherein the live event information can be an identifier representing a live event such as "game start" or "game end".
  • the specific implementation method of determining the corresponding target interactive component based on the live event information has been described in detail in the embodiment shown in FIG. 2, and will not be repeated here.
  • the live broadcast room information represents the number of times a target interactive event is triggered in the live broadcast room that plays the live video, wherein a target interactive event is an event triggered by users in the live broadcast room, such as "like", "follow", or "support the selected team".
  • a target interactive event can be implemented through a corresponding interactive component, and when the number of times a target interactive event is triggered reaches a threshold, a corresponding target interactive component is triggered accordingly. More specifically, for example, when the number of "likes" (a target interactive event) reaches 30,000, the corresponding target interactive component is determined to be interactive component A for "giving virtual gifts"; when the number of "supports" reaches 100,000, the corresponding target interactive component is determined to be interactive component B for participating in a "physical lottery".
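This threshold rule can be sketched as follows; the event names and thresholds follow the example, while the function and component identifiers are assumptions:

```python
# Hypothetical thresholds: once a target interactive event has been
# triggered enough times in the live room, an extra component is shown.
THRESHOLDS = [
    ("like", 30_000, "virtual_gift"),          # interactive component A
    ("support", 100_000, "physical_lottery"),  # interactive component B
]

def components_from_room_info(trigger_counts: dict) -> list:
    """trigger_counts maps an event name to the number of times it has
    been triggered in the live room."""
    return [component for event, minimum, component in THRESHOLDS
            if trigger_counts.get(event, 0) >= minimum]
```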
  • the live event information can be further combined with the live room information to determine the target interactive components that are ultimately displayed in the second area.
  • for example, based on the live broadcast room information, target interactive component A is determined; based on the live broadcast event information, target interactive component B is determined; and the set of target interactive components A and B is what is ultimately displayed in the second area.
  • alternatively, the target interactive components displayed in the second area can be determined independently based on the live broadcast event information, or independently based on the live broadcast room information.
  • the specific implementation method is similar and will not be repeated here.
  • the corresponding target interactive components are determined by live event information and/or live room information, so that the target interactive components displayed in the second area can be dynamically adjusted according to the progress of the game and the user participation in the live room, thereby increasing the interest of users in the live room to participate in the interaction and improving the interaction effect.
  • Step S204 Determine the position of each target interactive component in the second area according to the live broadcast information.
  • the display position of the target interaction component in the second area will affect the user's attention to each target interaction component, thereby affecting the triggering probability of the target interaction component.
  • step S204 includes:
  • Step S2041 Determine the priority of each target interactive component according to the live broadcast information.
  • Step S2042 based on the priority of each target interactive component, determine the display information of each target interactive component.
  • Step S2043 Determine the position of each target interactive component in the second area based on the display information.
  • FIG10 is a schematic diagram of determining the position of a target interactive component based on live broadcast event information provided by an embodiment of the present disclosure.
  • when the live broadcast event information is #1, the represented live broadcast event is "the start of the first half of the game"; when the live broadcast event information is #2, the represented live broadcast event is "the start of the second half of the game".
  • the target interactive components corresponding to the live broadcast event information #1 are interactive component A for displaying "AI win/loss prediction", interactive component B for displaying "channel chat", and interactive component C for displaying "team support rate".
  • the corresponding display order is determined to be [interactive component A, interactive component B, interactive component C], and the position of each interactive component in the second area can be seen in FIG10.
  • the high-priority interactive component A is arranged at the far left of the component list area of the second area, and interactive components B and C are arranged after it in sequence.
  • the target interactive component with the highest priority (e.g., the far left in the component list area) is triggered by default, and its component content is displayed by default in the component content area of the second area, thereby guiding the focus of users in the live broadcast room.
  • when the priorities of interactive components B and C are the same, their order can be set randomly, or further determined based on the live broadcast information. Specifically, for example, the relative priorities of interactive components B and C are determined according to the live broadcast room information in the live broadcast information, thereby determining the position ordering of interactive components B and C in the component list area.
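Steps S2041 to S2043 can be sketched as a priority sort over the component list area; the tie-breaking key derived from live room information is an assumption for illustration (lower priority number means further left):

```python
def order_components(components, priority, room_popularity):
    """Sort so that a lower priority number appears further left; ties are
    broken by how often the component is used in the live room."""
    return sorted(
        components,
        key=lambda c: (priority.get(c, 99), -room_popularity.get(c, 0)),
    )

ordered = order_components(
    ["B", "C", "A"],
    priority={"A": 1, "B": 2, "C": 2},      # A has the highest priority
    room_popularity={"B": 500, "C": 800},   # breaks the B/C tie
)
```

With these inputs, A is placed first on priority, and C precedes B because it is more popular in the room.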
  • the target interactive components corresponding to the live event information #2 are interactive component A for displaying "AI win/loss prediction", interactive component B for displaying "channel chat", and interactive component C for displaying "team support rate" (the same as for live event information #1).
  • the high-priority interactive component B is arranged at the far left of the component list area in the second area, while interactive component C and interactive component A are arranged after it in sequence.
  • the priority corresponding to each target interactive component is determined through the live broadcast information, so as to realize the dynamic arrangement of each target interactive component in the second area, better meet the needs of users for corresponding interactions at different stages of the game, and improve the interaction efficiency and interaction effect of the live broadcast room.
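The priority-driven arrangement described above can be sketched as follows. This is a minimal illustrative sketch, assuming numeric priority levels (level 1 = highest) and hypothetical component names; it is not the implementation of the disclosed method.

```python
# Minimal sketch of priority-based component ordering (illustrative only).
def arrange_components(components, priorities):
    """Sort components for the component list area: the lower the level,
    the higher the priority, and the further left the position. Python's
    sort is stable, so equal-priority components keep their given order."""
    ordered = sorted(components, key=lambda c: priorities[c])
    # The far-left (highest-priority) component is triggered by default,
    # and its content is shown in the component content area.
    default_triggered = ordered[0]
    return ordered, default_triggered

# Live event #1 ("start of the first half"): component A ranks highest.
order, default = arrange_components(["B", "C", "A"], {"A": 1, "B": 2, "C": 2})
```

With these assumed levels, component A lands at the far left and is triggered by default, while B and C (equal priority) retain their relative order, matching the tie-handling described above.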
  • the display information is also used to characterize the time sequence of displaying the target interactive components in the target time period.
  • the method further includes: determining the time sequence of displaying each target interactive component in the target time period based on the display information.
  • the target time period is determined by the display information, for example, the time period between the start of the game and the end of the game, wherein, according to the display information, the interactive component A is displayed in the second area from the start of the game to the halftime break (the first time period); and the interactive component B is displayed in the second area from the halftime break to the end of the game (the second time period).
  • the display order of the target interactive components in the second area is determined by the display information, so as to realize the dynamic display of the interactive components based on the time feature, and further improve the interaction efficiency of the interactive components.
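A hedged sketch of the time-sequenced display: here the display information is assumed to reduce to a list of (start, end, component) sub-periods of the target time period, with times given as match minutes; the schedule values are illustrative assumptions, not the format defined by the disclosure.

```python
def component_for_time(minute, schedule):
    """Return the component to display in the second area at a given match
    minute; `schedule` is a list of (start, end, component) sub-periods."""
    for start, end, component in schedule:
        if start <= minute < end:
            return component
    return None  # outside the target time period: nothing scheduled

# First time period (kickoff to halftime): component A;
# second time period (halftime to full time): component B.
schedule = [(0, 45, "A"), (45, 90, "B")]
```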
  • the live broadcast information includes the broadcast duration, which is used to determine the priority of each target interactive component.
  • the broadcast duration may refer to the duration after the live broadcast room is opened, or the duration calculated from the target live broadcast event corresponding to the live video as the starting point, such as the duration from "the start of the game" to the current time. For example, after "the start of the game", in the 10th minute of the game, among the corresponding target interactive components, the priority of interactive component A is level 2 and the priority of interactive component B is level 1; in the 30th minute of the game, the priority of interactive component A is level 1 and the priority of interactive component B is level 2.
  • the priority of each target interactive component is further refined through the broadcast duration in the live broadcast information, so as to achieve the purpose of dynamically adjusting the sorting position of the target interactive component as the game duration changes, further improve the dynamic setting accuracy of the target interactive component, better meet the needs of users for corresponding interactions at different stages of the game, and improve the interaction efficiency and interaction effect of the live broadcast room.
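The duration-dependent priorities in the example above can be sketched as a simple lookup; the 30-minute threshold and the level values mirror the example, and are otherwise assumptions for illustration.

```python
def priorities_at(minute):
    """Priority levels (level 1 = highest) for components A and B as the
    broadcast duration grows, per the example above: B leads early in the
    match, and A leads from the 30th minute on."""
    if minute < 30:
        return {"A": 2, "B": 1}
    return {"A": 1, "B": 2}
```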
  • Step S205: Determine the display mode and/or fixed mode of the target interactive component according to the live broadcast information, wherein the display mode is embedded display or floating display, and the fixed mode is a permanent state or an adjustable state.
  • Step S206: Display each target interactive component in the second area based on the display mode and/or fixed mode of each target interactive component.
  • the target interactive component has corresponding attributes: a display mode and/or a fixed mode, wherein the display mode refers to the style in which the target interactive component is displayed in the second area, including embedded display and floating display.
  • embedded display means that the target interactive component is embedded in the component list area of the second display area, and the component list area can correspond to a component box or component panel for carrying multiple components.
  • the target interactive component is fixedly set in the component box or component panel in an embedded manner. Floating display means that the target interactive component is not fixed in the component list area of the second area; for example, it is set outside the component box corresponding to the component list area and can cover an interactive component in the component list area (making that component invisible), so as to achieve a more prominent display effect that guides the user to trigger it.
  • when the live broadcast information sent by the server includes live broadcast event information representing a target live broadcast event, the display mode of a target interactive component is configured as floating display. More specifically, when the target live broadcast event corresponding to the live broadcast information is, for example, "goal", the target interactive component A is configured as floating display, and the target interactive component B is configured as embedded display.
  • the fixed mode is another attribute of the target interactive component, which refers to the way in which the target interactive component responds to a component switching operation. Based on the preset mapping relationship and the live broadcast information, the fixed mode corresponding to the target interactive component can be determined.
  • the specific implementation is similar to the method for determining the display mode, and will not be repeated here.
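The preset mapping relationship mentioned above might be sketched like this; the (event, component) table entries and the default fallback are purely illustrative assumptions, not the mapping defined by the disclosure.

```python
# Hypothetical preset mapping from (live event, component) to attributes.
DISPLAY_MODE_MAP = {
    ("goal", "A"): "floating",  # floating display to prominently guide the user
    ("goal", "B"): "embedded",  # embedded in the component box
}
FIXED_MODE_MAP = {
    ("goal", "A"): "adjustable",
    ("goal", "B"): "permanent",
}

def component_attributes(event, component):
    """Look up the display mode and fixed mode for a component, defaulting
    to an embedded, permanent component when no mapping entry exists."""
    display = DISPLAY_MODE_MAP.get((event, component), "embedded")
    fixed = FIXED_MODE_MAP.get((event, component), "permanent")
    return display, fixed
```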
  • Step S207: In response to a component moving operation, adjust the position of the target interactive component in the adjustable state within the second area.
  • the component list area of the second area includes at least two target interactive components, which are displayed in an embedded manner, that is, at least two target interactive components are fixedly set in the component box of the component list area in an embedded manner, wherein, exemplarily, the fixed mode of target interactive component A is a permanent state, and the fixed mode of target interactive component B is an adjustable state.
  • target interactive component B, being in the adjustable state, can be moved within the second area in response to a drag gesture (a component moving operation); more specifically, for example, the target interactive component is moved from the second position from the left to the first position from the left.
  • the fixed mode of the target interactive component is determined according to the live broadcast information, and the position of the target interactive component in the adjustable state is adjusted in combination with the component moving operation, so as to meet the user's personalized operation needs.
  • Since the position of the target interactive component in the permanent state cannot be moved, the key components related to the user and the content of the live video are guaranteed to be displayed on the live broadcast room page, thereby ensuring the concentration of the interactive components used by users in the live broadcast room, improving user participation, and improving the user interaction effect in event live broadcast scenarios.
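Step S207 can be sketched as follows: a component moving operation only repositions components whose fixed mode is adjustable, while permanent components stay in place. Component names, positions, and the layout representation are illustrative assumptions.

```python
def move_component(layout, fixed_modes, component, new_index):
    """Move `component` to `new_index` in the component list area if its
    fixed mode is adjustable; otherwise return the layout unchanged."""
    if fixed_modes.get(component) != "adjustable":
        return layout  # components in the permanent state cannot be moved
    layout = [c for c in layout if c != component]
    layout.insert(new_index, component)
    return layout

# Drag component B from the second position on the left to the first.
layout = move_component(["A", "B", "C"],
                        {"A": "permanent", "B": "adjustable"}, "B", 0)
```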
  • The implementation of step S210 is the same as that of step S101 in the embodiment shown in FIG. 2 of the present disclosure, and will not be described in detail here.
  • FIG11 is a structural block diagram of a live broadcast room page display device provided by an embodiment of the present disclosure. For the convenience of explanation, only the parts related to the embodiment of the present disclosure are shown.
  • the live broadcast room page display device includes:
  • a display module 31 is used to display a live broadcast room page, which includes a first area and a second area, wherein the first area is used to display the live broadcast video;
  • the processing module 32 is used to display a target interactive component in the second area according to the content of the live video displayed in the first area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • the processing module 32 is specifically used to: obtain live broadcast information, the live broadcast information representing the content of the live video currently displayed on the live broadcast room page; determine at least one target interactive component based on the live broadcast information; and display the at least one target interactive component in the second area.
  • the live broadcast information includes live broadcast event information, and the live broadcast event information represents a live broadcast event occurring in the live video; when the processing module 32 determines at least one target interactive component according to the live broadcast information, it is specifically used to: determine the corresponding target interactive component according to the target live broadcast event corresponding to the live broadcast event information in the live broadcast information.
  • the live broadcast information includes live broadcast room information, and the live broadcast room information represents the number of times a target interactive event is triggered in the live broadcast room; when the processing module 32 determines the target interactive component according to the target live broadcast event corresponding to the live broadcast event information in the live broadcast information, it is specifically used to: determine the target number of times the target interactive event is triggered in the live broadcast room corresponding to the live broadcast room page according to the live broadcast room information in the live broadcast information; and determine the target interactive component according to the target number of times the target interactive event is triggered and the target live broadcast event corresponding to the live broadcast event information.
  • the processing module 32 is also used to: determine the priority of each target interactive component based on the live broadcast information; determine the display order of each target interactive component based on the priority of each target interactive component; and determine the position of each target interactive component in the second area based on the display order.
  • the live broadcast information includes the broadcast duration of the live video; when the processing module 32 determines the priority of each target interactive component based on the live broadcast information, it is specifically used to: determine the priority of each target interactive component based on the broadcast duration.
  • the processing module 32 is further used to: determine a display mode of the target interactive component according to the live broadcast information, wherein the display mode is embedded display or floating display;
  • when the processing module 32 displays at least one target interactive component in the second area, it is specifically configured to: display each target interactive component in the second area based on the display mode of each target interactive component.
  • the processing module 32 is further used to: determine a fixed mode of the target interactive component based on the live broadcast information, wherein the fixed mode is a permanent state or an adjustable state; and adjust the position of the target interactive component in the adjustable state within the second area in response to a component moving operation.
  • the processing module 32 is specifically used to: display a corresponding preview component in the second area according to the content of the live video, the component appearance of the preview component being used to represent the content of at least one interactive function; and, in response to a trigger operation on the preview component, display at least one target interactive component in the second area.
  • the display module 31 is connected to the processing module 32.
  • the live broadcast room page display device 3 provided in this embodiment can implement the technical solution of the above method embodiment, and its implementation principle and technical effect are similar. This embodiment will not be described in detail here.
  • FIG. 12 is a schematic diagram of the structure of an electronic device provided by an embodiment of the present disclosure. As shown in FIG. 12, the electronic device 4 includes: a processor 41, and a memory 42 communicatively connected to the processor 41;
  • the memory 42 stores computer-executable instructions;
  • the processor 41 executes the computer-executable instructions stored in the memory 42 to implement the live broadcast room page display method in the embodiments shown in FIG. 2 to FIG. 10.
  • processor 41 and the memory 42 are connected via a bus 43 .
  • Referring to FIG. 13, it shows a schematic diagram of the structure of an electronic device 900 suitable for implementing an embodiment of the present disclosure.
  • the electronic device 900 may be a terminal device or a server.
  • the terminal device may include but is not limited to mobile terminals such as mobile phones, laptop computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (Portable Android Devices, PADs), portable multimedia players (Portable Media Players, PMPs), vehicle terminals (such as vehicle navigation terminals), etc., and fixed terminals such as digital TVs, desktop computers, etc.
  • the electronic device shown in FIG13 is only an example and should not bring any limitation to the functions and scope of use of the embodiment of the present disclosure.
  • the electronic device 900 may include a processing device (e.g., a central processing unit, a graphics processing unit, etc.) 901, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage device 908 to a random access memory (RAM) 903.
  • Various programs and data required for the operation of the electronic device 900 are also stored in the RAM 903.
  • the processing device 901, the ROM 902, and the RAM 903 are connected to each other via a bus 904.
  • An input/output (I/O) interface 905 is also connected to the bus 904.
  • the following devices may be connected to the I/O interface 905: an input device 906 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 907 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 908 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 909.
  • the communication device 909 may allow the electronic device 900 to communicate with other devices wirelessly or by wire to exchange information.
  • Although FIG. 13 shows an electronic device 900 having various devices, it should be understood that it is not required to implement or possess all of the devices shown; more or fewer devices may alternatively be implemented or possessed.
  • an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a computer-readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program can be downloaded and installed from the network through the communication device 909, or installed from the storage device 908, or installed from the ROM 902.
  • When the computer program is executed by the processing device 901, the above-mentioned functions defined in the method of the embodiment of the present disclosure are executed.
  • the computer-readable medium disclosed above may be a computer-readable signal medium or a computer-readable storage medium or any combination of the above two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
  • Computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium containing or storing a program that may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which a computer-readable program code is carried.
  • This propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above.
  • the computer readable signal medium may also be any computer readable medium other than a computer readable storage medium, which may send, propagate or transmit a program for use by or in conjunction with an instruction execution system, apparatus or device.
  • the program code contained on the computer readable medium may be transmitted using any suitable medium, including but not limited to: wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
  • the computer-readable medium may be included in the electronic device, or may exist independently without being installed in the electronic device.
  • the computer-readable medium carries one or more programs; when the one or more programs are executed by the electronic device, the electronic device executes the method shown in the above embodiments.
  • Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, C++, and conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may be executed entirely on the user's computer, partially on the user's computer, as a separate software package, partially on the user's computer and partially on a remote computer, or entirely on a remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (e.g., via the Internet using an Internet service provider).
  • each block in the flowchart or block diagram may represent a module, a program segment, or a part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical function.
  • the functions marked in the blocks may also occur in an order different from that marked in the accompanying drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, may be implemented by a dedicated hardware-based system that performs the specified function or operation, or by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments described in the present disclosure may be implemented by software or hardware.
  • the name of a unit does not, in some cases, limit the unit itself. For example, the first acquisition unit may also be described as "a unit for acquiring at least two Internet Protocol addresses".
  • exemplary types of hardware logic components include: field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), and the like.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in conjunction with an instruction execution system, device, or equipment.
  • a machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • a machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or equipment, or any suitable combination of the foregoing.
  • a more specific example of a machine-readable storage medium may include an electrical connection based on one or more lines, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a method for displaying a live broadcast room page comprising:
  • a live broadcast room page is displayed, wherein the live broadcast room page includes a first area and a second area, wherein the first area is used to display a live video; based on the content of the live video displayed in the first area, a target interactive component is displayed in the second area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • displaying a target interactive component in the second area according to the content of the live video displayed in the first area includes: acquiring live broadcast information, the live broadcast information representing the content of the live video currently displayed on the live broadcast room page; determining at least one of the target interactive components according to the live broadcast information; and displaying at least one of the target interactive components in the second area.
  • the live broadcast information includes live broadcast event information, and the live broadcast event information represents the live broadcast event occurring in the live broadcast video; based on the live broadcast information, at least one of the target interactive components is determined, including: based on the target live broadcast event corresponding to the live broadcast event information in the live broadcast information, the corresponding target interactive component is determined.
  • the live broadcast information includes live broadcast room information, and the live broadcast room information represents the number of times a target interactive event is triggered in the live broadcast room; based on the live broadcast information, at least one of the target interactive components is determined, including: based on the live broadcast room information in the live broadcast information, determining the target number of times the target interactive event is triggered in the live broadcast room corresponding to the live broadcast room page; determining the target interactive component based on the target number of times the target interactive event is triggered and the target live broadcast event corresponding to the live broadcast event information.
  • the method further includes: determining the priority of each of the target interactive components according to the live broadcast information; determining the display order of each of the target interactive components based on the priority of each of the target interactive components; and determining the position of each of the target interactive components in the second area based on the display order.
  • the live broadcast information includes the broadcast duration of the live broadcast video; determining the priority of each target interactive component based on the live broadcast information includes: determining the priority of each target interactive component based on the broadcast duration.
  • the target interaction components after determining at least one of the target interaction components according to the live broadcast information, it also includes: determining a display mode of the target interaction component according to the live broadcast information, wherein the display mode is an embedded display or a floating display; displaying at least one of the target interaction components in the second area, including: displaying each of the target interaction components in the second area based on the display mode of each of the target interaction components.
  • the target interactive components after determining at least one of the target interactive components according to the live broadcast information, it also includes: determining a fixing mode of the target interactive component according to the live broadcast information, wherein the fixing mode is a permanent state or an adjustable state; and adjusting the position of the target interactive component in the adjustable state within the second area in response to a component movement operation.
  • a target interactive component is displayed in the second area according to the content of the live video, including: displaying a corresponding preview component in the second area according to the content of the live video, the component appearance of the preview component being used to represent the content of at least one interactive function; and displaying at least one of the target interactive components in the second area in response to a trigger operation on the preview component.
  • a live broadcast room page display device including:
  • a display module used to display a live broadcast room page, wherein the live broadcast room page includes a first area and a second area, wherein the first area is used to display a live broadcast video;
  • a processing module is used to display a target interactive component in the second area according to the content of the live video displayed in the first area, and the target interactive component is used to execute a component function corresponding to the live video after being triggered.
  • the processing module is specifically used to: obtain live broadcast information, wherein the live broadcast information represents the content of the live video currently displayed on the live broadcast room page; determine at least one of the target interactive components according to the live broadcast information; and display at least one of the target interactive components in the second area.
  • the live broadcast information includes live broadcast event information, and the live broadcast event information represents the live broadcast event occurring in the live broadcast video; when the processing module determines at least one of the target interactive components according to the live broadcast information, it is specifically used to: determine the corresponding target interactive component according to the target live broadcast event corresponding to the live broadcast event information in the live broadcast information.
  • the live broadcast information includes live broadcast room information, and the live broadcast room information represents the number of times a target interactive event is triggered in the live broadcast room; when the processing module determines at least one of the target interactive components according to the live broadcast information, it is specifically used to: determine the target number of times the target interactive event is triggered in the live broadcast room corresponding to the live broadcast room page according to the live broadcast room information in the live broadcast information; determine the target interactive component according to the target number of times the target interactive event is triggered and the target live broadcast event corresponding to the live broadcast event information.
  • the processing module is further used to: determine the priority of each of the target interactive components according to the live broadcast information; determine the display order of each of the target interactive components based on the priority of each of the target interactive components; and determine the position of each of the target interactive components in the second area based on the display order.
  • the live broadcast information includes the broadcast duration of the live broadcast video; when the processing module determines the priority of each target interactive component based on the live broadcast information, it is specifically used to: determine the priority of each target interactive component based on the broadcast duration.
  • the processing module is further used to: determine a display mode of the target interactive component according to the live broadcast information, wherein the display mode is an embedded display or a floating display; when the processing module displays at least one of the target interactive components in the second area, the processing module is specifically used to: display each of the target interactive components in the second area based on the display mode of each of the target interactive components.
  • the processing module is further used to: determine a fixing mode of the target interactive component according to the live broadcast information, wherein the fixing mode is a permanent state or an adjustable state; and adjust the position of the target interactive component in the adjustable state within the second area in response to a component moving operation.
  • the processing module is specifically used to: display a corresponding preview component in the second area according to the content of the live video, and the component appearance of the preview component is used to represent the content of at least one interactive function; in response to a trigger operation on the preview component, display at least one of the target interactive components in the second area.
  • an electronic device comprising: a processor, and a memory communicatively connected to the processor;
  • the memory stores computer-executable instructions
  • the processor executes the computer-executable instructions stored in the memory to implement the live broadcast room page display method described in the first aspect and various possible designs of the first aspect.
  • a computer-readable storage medium stores computer-executable instructions; when a processor executes the computer-executable instructions, the live broadcast room page display method described in the first aspect and various possible designs of the first aspect is implemented.
  • an embodiment of the present disclosure provides a computer program product, including a computer program, which, when executed by a processor, implements the live broadcast room page display method described in the first aspect and various possible designs of the first aspect.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

本公开实施例提供一种直播间页面显示方法、装置、电子设备及存储介质,通过显示直播间页面,直播间页面内包括第一区域和第二区域,其中,第一区域用于显示直播视频;根据第一区域内显示的直播视频的内容,在第二区域显示目标交互组件,目标交互组件用于在被触发后,执行与直播视频对应的组件功能。由于在直播的不同阶段,用户所需要的交互组件相应不同,通过直播视频的内容确定对应的目标交互组件,实现了在播放不同的直播视频时,显示对应的目标交互组件,实现交互组件的动态显示,提高用户在直播的不同阶段触发相应的交互组件时的交互效率,并使直播间内用户所触发的交互组件的集中度更高,提高直播间内用户之间的互动效果。

Description

直播间页面显示方法、装置、电子设备及存储介质
本申请要求2022年11月10日递交的、标题为“直播间页面显示方法、装置、电子设备及存储介质”、申请号为2022114072404的中国发明专利申请的优先权,该申请的全部内容通过引用结合在本申请中。
技术领域
本公开实施例涉及互联网技术领域,尤其涉及一种直播间页面显示方法、装置、电子设备及存储介质。
背景技术
当前,在竞技赛事类的直播应用中,用户通过终端设备上运行的直播应用客户端可以进入赛事直播间,观看比赛直播视频,在该过程中,直播应用客户端内除了设置有用于播放直播视频的窗口区域外,还设置有相关的功能区,用户通过对功能区内的交互组件进行触发,来实现例如发言、点赞等对应的交互功能。
然而,现有技术中的方案,功能区内固定设置交互组件,存在操作效率低、互动效果差的问题。
发明内容
本公开实施例提供一种直播间页面显示方法、装置、电子设备及存储介质,以克服直播间页面内的交互组件存在的操作效率低、互动效果差的问题。
第一方面,本公开实施例提供一种直播间页面显示方法,包括:
显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
第二方面,本公开实施例提供一种直播间页面显示装置,包括:
显示模块,用于显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;
处理模块,用于根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
第三方面,本公开实施例提供一种电子设备,包括:
处理器,以及与所述处理器通信连接的存储器;
所述存储器存储计算机执行指令;
所述处理器执行所述存储器存储的计算机执行指令,以实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
第四方面,本公开实施例提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
第五方面,本公开实施例提供一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
本实施例提供的直播间页面显示方法、装置、电子设备及存储介质,通过显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。由于在直播的不同阶段,用户所需要的交互组件相应不同,通过直播视频的内容确定对应的目标交互组件,实现了在播放不同的直播视频时,显示对应的目标交互组件,达到了直播间页面内交互组件的动态显示的目的,提高用户在直播的不同阶段触发相应的交互组件时的交互效率,并使直播间内用户所触发的交互组件的集中度更高,提高直播间内用户之间的互动效果。
附图说明
为了更清楚地说明本公开实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本公开实施例提供的直播间页面显示方法的一种应用场景图;
图2为本公开实施例提供的直播间页面显示方法的流程示意图一;
图3为本公开实施例提供的一种直播间页面的示意图;
图4为本公开实施例提供的另一种直播间页面的示意图;
图5为本公开实施例提供的一种显示第二区域的示意图;
图6为图2所示实施例中步骤S102的一种具体实现方式流程图;
图7为本公开实施例提供的一种显示目标交互组件的过程示意图;
图8为本公开实施例提供的直播间页面显示方法的流程示意图二;
图9为图8所示实施例中步骤S204的一种具体实现方式流程图;
图10为本公开实施例提供的一种基于直播事件信息确定目标交互组件的位置的示意图;
图11为本公开实施例提供的直播间页面显示装置的结构框图;
图12为本公开实施例提供的一种电子设备的结构示意图;
图13为本公开实施例提供的电子设备的硬件结构示意图。
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
下面对本公开实施例的应用场景进行解释:
图1为本公开实施例提供的直播间页面显示方法的一种应用场景图,本公开实施例提供的直播间页面显示方法,可以应用于赛事直播的应用场景。如图1所示,本公开实施例提供的方法,可以应用于终端设备,终端设备内运行有直播应用的客户端,终端设备与运行直播服务端的服务器通信,服务器向终端设备发送直播数据,终端设备获得直播数据并在客户端页面内播放,实现足球比赛、篮球比赛、电子竞技比赛等竞技赛事的视频直播。其中,直播应用客户端的直播间页面内,设置有用于实现用户间互动的交互组件,例如图中所示的“聊天”组件、“投票”组件,当用户通过操作终端设备,触发对应的交互组件后,直播应用客户端会执行对应的目标功能,例如显示“聊天”组件对应的功能页面,或者,显示“投票”组件对应的功能页面。
现有技术中,互动组件通常是固定设置,即在开启直播间之前,通过服务端对直播间进行配置,设置固定显示的一种或多种交互组件,在直播间开播后,用户通过直播应用客户端进入直播间页面后,基于直播间提供的固定的几种交互组件,来执行对应的几种固定的目标功能。然而,在赛事直播的应用场景下,直播间对赛事的直播过程,会至少分为几个直播阶段,例如比赛开始前、比赛进行中和比赛结束后,而在不同的直播阶段中,用户所需要的交互组件是不同的,例如在比赛开始前的直播阶段,直播间内的用户需要使用的为“比赛预测”的交互组件;而在比赛进行中的直播阶段,直播间内的用户需要使用的为“聊天”的交互组件。此种情况下,用户在不同直播阶段,只能从直播间页面内所有支持的各类交互组件中,通过手动选取的方式,来触发相应所需要的交互组件,同时受限于直播间页面内有限的显示空间,大多数交互组件会处于折叠状态,这进一步地降低了交互组件选取过程中的操作效率,造成了操作效率低、互动效果差的问题。
本公开实施例提供一种直播间页面显示方法以解决上述问题。
参考图2,图2为本公开实施例提供的直播间页面显示方法的流程示意图一。本实施例的方法可以应用在终端设备或服务器中,以终端设备为例,该方法包括:
步骤S101:显示直播间页面,直播间页面内包括第一区域和第二区域,其中,第一区域用于显示直播视频。
示例性地,本实施例方法的执行主体为终端设备,例如智能手机,参考图1所示应用场景示意图,终端设备内运行有直播应用的客户端,用户通过客户端进入相应的直播间后,在终端设备的客户端内显示直播间页面。图3为本公开实施例提供的一种直播间页面的示意图,如图3所示,在直播间页面内包括第一区域和第二区域,示例性地,第一区域位于直播间页面上部,第二区域位于直播间页面的下部。其中,第一区域用于显示直播视频,例如足球比赛、篮球比赛等比赛直播视频;而第二区域用于显示与直播视频相关的交互组件,例如公共聊天频道组件、比赛阵容展示组件等。
进一步地,根据终端设备的机身姿态,第一区域和第二区域的位置关系随之改变。图4为本公开实施例提供的另一种直播间页面的示意图,参考图3所示的机身姿态为竖屏状态的实施例情况,如图4所示,示例性地,在机身姿态为横屏状态下,第一区域和第二区域左右设置,用于播放直播视频的第一区域位于直播间页面的左侧,用于显示交互组件的第二区域位于直播间页面右侧。在赛事直播的应用场景下,通过在横屏状态下左右设置第一区域和第二区域,能够更好的适配直播视频(比赛视频)的画面尺寸,在提高视觉观感的基础上,保证直播间页面内的交互功能正常实现。
步骤S102:根据第一区域内显示的直播视频的内容,在第二区域显示目标交互组件,目标交互组件用于在被触发后,执行与直播视频对应的组件功能。
示例性地,在第一区域显示直播视频的内容的同时,在第二区域内,对应显示与直播视频的内容匹配的交互组件,即目标交互组件,目标交互组件被触发后,所显示的组件内容,即针对直播视频的组件功能。其中,第二区域内可以包括一个或多个目标交互组件,在第二区域内仅包括一个直播视频的内容对应的目标交互组件的情况下,执行与直播视频对应的组件功能,示例性地,具体包括:在第二区域显示该目标交互组件的组件名称以及组件内容;在第二区域内包括多个目标交互组件的情况下,执行与直播视频对应的组件功能,示例性地,具体包括:第二区域内显示多个与直播视频的内容对应的目标交互组件的组件名称,以及当前被触发的目标交互组件的组件内容。
例如,在直播视频的内容为#1时,在第二区域内显示交互组件A、交互组件B和交互组件C,其中交互组件A被触发,显示交互组件A的组件内容;在直播视频的内容为#2时,在第二区域内显示交互组件A和交互组件D,交互组件D被触发,显示交互组件D的组件内容。图5为本公开实施例提供的一种显示第二区域的示意图,如图5所示,在第二区域内包括组件列表区和组件内容区,以足球比赛为例,在比赛开始前,图5中示为第一阶段,第一区域内显示的直播视频的内容,是对双方球队球员和阵容的介绍,相应的,在第二区域内的组件列表区,显示对应的目标交互组件包括“阵容介绍”、“胜负预测”、“频道聊天”,在组件内容区,显示当前触发的目标交互组件对应的组件内容,如图5所示,当前触发的目标交互组件为“阵容介绍”,在对应的组件内容区,例如显示队伍Team_1的球队阵容。而当用户点击组件列表区内的其他目标交互组件时,会触发组件内容区显示对应的组件内容,例如点击“频道聊天”时,在组件内容区,显示该“频道聊天”的交互组件对应的组件内容,例如直播间内用户的发言(图中未示出)。
而在比赛开始后,图5中示为第二阶段,第一区域内显示的直播视频的内容,是足球比赛的具体比赛情况,相应的,在第二区域内的组件列表区,显示对应的目标交互组件包括“频道聊天”、“比赛数据”,其中,当前触发的目标交互组件为“频道聊天”,在对应的组件内容区,显示该“频道聊天”的交互组件对应的组件内容。同样的,当用户点击组件列表区内的其他目标交互组件时,会触发组件内容区显示对应的组件内容,不再赘述。
其中,需要说明的是,上述图5中示出的“阵容介绍”的交互组件的组件内容的具体实现形式,仅为示例性地,上述“阵容介绍”、“胜负预测”、“频道聊天”、“比赛数据”等交互组件的组件内容的具体实现形式,可以根据需要灵活设置,此处不进行具体限定,也不再一一赘述。
进一步地,在一种可能的实现方式中,在组件列表区内包括多个目标交互组件的情况下,根据直播视频的内容,终端设备的客户端会确定一个默认的目标交互组件作为当前被触发的目标交互组件,例如,在比赛开始前,确定的默认被触发的目标交互组件为交互组件A(例如“阵容介绍”交互组件)、在比赛开始后,确定的默认被触发的目标交互组件为交互组件B(例如“频道聊天”交互组件),从而使直播间内的用户在默认状态下,使用相同的交互组件进行互动,提高直播间内用户所使用的交互组件的集中度,提高互动效果。
进一步地,在一种可能的实现方式中,如图6所示,步骤S102的具体实现方式包括:
步骤S1021:接收服务端发送的直播信息,直播信息表征直播间页面当前显示的直播视频的内容。
步骤S1022:根据直播信息,确定至少一个目标交互组件。
步骤S1023:在第二区域显示至少一个目标交互组件。
示例性地,服务端(运行直播应用服务端的服务器)向客户端(运行直播应用客户端的终端设备)发送的直播数据中,包含有用于表征直播间页面当前显示的直播视频的内容的直播信息,其中,在比赛直播的应用场景下,示例性地,直播信息中包括直播事件信息,直播事件信息表征直播视频中发生的直播事件,直播事件信息例如为直播事件的标识;直播事件信息是基于直播视频的内容触发的,用于表征具体的内容环节的事件,例如,目标直播间内播放的直播视频为“足球比赛”,则该直播视频的内容对应的直播事件例如包括“进球”、“换人”、“犯规”、“上/下半场比赛开始”、“上/下半场比赛结束”等。直播事件基于直播视频的内容触发是指:当直播视频的内容满足要求(例如出现犯规、进球)时,服务端向终端设备一侧的客户端发送的直播信息中,会包含直播事件信息,直播事件信息表征特定的直播事件的标识。
进一步地,基于直播信息中的直播事件信息,确定对应的目标直播事件,并基于预设的映射关系,获得目标直播事件对应的目标交互组件。例如,接收到直播信息中包括直播事件信息info_1时,确定对应的目标直播事件为“比赛开始”,此时,获取该目标直播事件对应的目标交互组件,分别为“频道聊天”、“队伍信息”和“实时比赛数据”;接收到直播信息中包括直播事件信息info_2时,确定对应的目标直播事件为“比赛结束”,此时,获取该目标直播事件对应的目标交互组件,分别为“频道聊天”和“终场比赛数据”。
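为便于理解上述“基于预设的映射关系,由直播事件信息获得目标交互组件”的过程,以下给出一个最小化的Python示意代码。其中,事件标识(info_1、info_2)与组件名称沿用上文示例,映射表的数据结构与函数名均为本示意所作的假设,并非本方案的限定实现。

```python
# 预设的映射关系:直播事件信息(事件标识)-> 目标直播事件及其对应的目标交互组件
# (事件标识与组件名称沿用上文示例,数据结构为示意性假设)
EVENT_COMPONENT_MAP = {
    "info_1": {"event": "比赛开始", "components": ["频道聊天", "队伍信息", "实时比赛数据"]},
    "info_2": {"event": "比赛结束", "components": ["频道聊天", "终场比赛数据"]},
}

def resolve_target_components(live_event_info: str) -> list:
    """根据直播信息中携带的直播事件信息,查表得到目标交互组件列表;未知事件返回空列表。"""
    entry = EVENT_COMPONENT_MAP.get(live_event_info)
    return entry["components"] if entry else []
```

客户端收到直播信息后,调用 `resolve_target_components(直播事件信息)` 即可得到需要在第二区域显示的目标交互组件集合。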
当然,在其他可能的实施例中,直播信息中除了包括直播事件信息外,还可以包括其他信息,例如表征直播间观看人数的直播间信息等,从而进一步确定更加准确的目标直播事件,具体实现方式在后续实施例中详细介绍,此处不再赘述。本实施例中,通过获取直播信息,确定与直播视频的内容对应的目标交互组件并进行显示,实现了高实时性的和多样性的目标交互组件确定方式,提高实时显示的目标交互组件的合理性,提高直播间内用户所使用的交互组件的集中度,提高互动效果。
上述实施例中,介绍的是在终端设备的机身姿态为竖屏状态的应用场景下,在第二区域显示目标交互组件的实现方式。在如图4所示实施例中横屏播放直播视频的应用场景中,示例性地,第二区域中的目标交互组件不默认触发,即在终端设备接收到触发指令之前,不在第二区域内显示目标交互组件的组件内容。示例性地,在横屏状态下,步骤S102的具体实现方式包括:
步骤S1024:根据直播视频的内容,在第二区域显示对应的预览组件,预览组件的组件外观用于表征至少一种交互功能的内容。
步骤S1025:响应于针对该预览组件的触发操作,在第二区域显示至少一个目标交互组件。
图7为本公开实施例提供的一种显示目标交互组件的过程示意图,由于机身姿态为横屏状态时,与直播视频的画面尺寸更加匹配,例如当终端设备的屏幕尺寸与直播视频的画面尺寸相同时,能够实现无边框的全屏播放,因此能够实现更好的视频观感。如图7所示,示例性地,在横屏状态下,第二区域与第一区域的部分重合,即第二区域为第一区域的一个子区域,在通过第一区域播放直播视频时,当根据直播视频的内容(接收到目标直播事件对应的目标直播信息),需要显示目标交互组件时,在第二区域(第一区域内)首先显示一个悬浮在直播视频上方的预览组件,从而减少交互组件对用户观看直播视频的干扰,其中,预览组件的组件外观用于表征至少一种交互功能的内容,例如包括表征交互组件的组件名称的字符(图中预览组件的组件外观中包括“聊天”、“数据”字符)。之后,在用户(根据需要)对预览组件进行触发后,终端设备响应于针对该预览组件的触发操作,再于第二区域显示目标交互组件,此时,第一区域与第二区域不再重叠,第二区域对应一个独立的区域,从而更好的显示目标交互组件的组件内容。其中,在横屏状态下根据直播视频的内容确定目标交互组件,以及将目标交互组件显示在第二区域内的具体实现方式,与上述竖屏状态下在第二区域内显示目标交互组件的实现方式类似,此处不再赘述。
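上述“先显示悬浮预览组件、被触发后再展开目标交互组件”的横屏交互流程,可用如下Python示意代码表达。其中的类名、状态名("preview"、"expanded")均为本示意所作的假设,仅用于说明状态切换逻辑。

```python
# 示意:横屏状态下目标交互组件不默认展开——第二区域先显示悬浮预览组件
# (组件外观上带有"聊天""数据"等字符),收到触发操作后再展开显示目标交互组件
class LandscapePanel:
    def __init__(self, target_components):
        self.target_components = target_components
        self.state = "preview"  # 初始状态:仅显示悬浮在直播视频上方的预览组件

    def on_preview_tapped(self):
        """响应针对预览组件的触发操作:第二区域独立展开,返回应显示的目标交互组件。"""
        self.state = "expanded"
        return self.target_components

panel = LandscapePanel(["频道聊天", "比赛数据"])
shown = panel.on_preview_tapped()  # 展开后,第二区域显示目标交互组件
```

在展开("expanded")状态下,第一区域与第二区域不再重叠,第二区域对应一个独立区域以显示组件内容。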
在本实施例中,通过显示直播间页面,直播间页面内包括第一区域和第二区域,其中,第一区域用于显示直播视频;根据第一区域内显示的直播视频的内容,在第二区域显示目标交互组件,目标交互组件用于在被触发后,执行与直播视频对应的组件功能。由于在直播的不同阶段,用户所需要的交互组件相应不同,通过直播视频的内容确定对应的目标交互组件,实现了在播放不同的直播视频时,显示对应的目标交互组件,达到了直播间页面内交互组件的动态显示的目的,提高用户在直播的不同阶段触发相应的交互组件时的交互效率,并使直播间内用户所触发的交互组件的集中度更高,提高直播间内用户之间的互动效果。
参考图8,图8为本公开实施例提供的直播间页面显示方法的流程示意图二。本实施例在图2所示实施例的基础上,进一步对步骤S102中确定目标交互组件并进行显示的方法进行细化,示例性地,该直播间页面显示方法包括:
步骤S201:显示直播间页面,直播间页面内包括第一区域和第二区域,其中,第一区域用于显示直播视频。
步骤S202:接收服务端发送的直播信息,直播信息中包括直播事件信息和/或直播间信息,直播事件信息表征直播视频中发生的直播事件,直播间信息表征直播间内触发目标互动事件的次数。
步骤S203:根据直播信息,确定对应的目标交互组件。
示例性地,本实施例步骤中,通过服务端发送的直播信息确定直播视频的内容,进而确定对应的目标交互组件。其中,直播信息中包括直播事件信息和直播间信息,其中,直播事件信息可以是表征“比赛开始”、“比赛结束”等直播事件的标识,基于直播事件信息确定对应的目标交互组件的具体实现方式在图2所示实施例中已进行详细介绍,此处不再赘述。
对于直播间信息,直播间信息表征播放直播视频的直播间内触发目标互动事件的次数,其中,目标互动事件是由直播间内的用户触发的事件,例如“点赞”、“关注”、“支持所选择的球队”等等。目标互动事件可以由相应的互动组件来实现,根据触发目标互动事件的次数,来相应地触发对应的目标交互组件,更具体地,例如,当“点赞”(目标互动事件)的次数到达三万时,确定对应的目标交互组件为用于参加“赠送虚拟礼物”的交互组件A;当“支持”的次数到达十万时,确定对应的目标交互组件为用于参加“实物抽奖”的交互组件B。
在此基础上,可以进一步结合直播事件信息,来确定对应的目标交互组件,从而确定最终显示在第二区域内的目标交互组件。例如,基于直播间信息,确定目标交互组件A;基于直播事件信息,确定目标交互组件B,目标交互组件A和目标交互组件B的集合,即最终显示在第二区域内的目标交互组件。当然,可以理解的是,除了本实施例中,结合直播事件信息和直播间信息,共同确定显示在第二区域内的目标交互组件的方式外,还可以分别独立地基于直播事件信息确定显示在第二区域内的目标交互组件,或者,独立地基于直播间信息确定显示在第二区域内的目标交互组件,具体实现方式类似,此处不再赘述。
本实施例中,通过直播事件信息和/或直播间信息,来确定对应的目标交互组件,使第二区域内所显示的目标交互组件,可以根据比赛进程,以及直播间内的用户参与度而动态调整,从而提高直播间内用户参与互动的兴趣,提高互动效果。
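上述“按互动事件次数达到阈值解锁组件,并与基于直播事件信息确定的组件取并集”的逻辑,可用如下Python示意代码表达。其中的阈值、事件名与组件名沿用上文示例,数据结构为本示意所作的假设。

```python
# 示意:根据直播间信息(目标互动事件的触发次数)确定目标交互组件,
# 并与基于直播事件信息确定的组件合并,得到最终显示在第二区域内的集合
INTERACTION_THRESHOLDS = [
    ("点赞", 30_000, "赠送虚拟礼物"),   # 点赞次数到达三万 -> 交互组件A
    ("支持", 100_000, "实物抽奖"),      # 支持次数到达十万 -> 交互组件B
]

def components_from_room_info(counts: dict) -> list:
    """目标互动事件次数达到阈值时,解锁对应的目标交互组件。"""
    return [comp for event, threshold, comp in INTERACTION_THRESHOLDS
            if counts.get(event, 0) >= threshold]

def merge_components(event_based: list, room_based: list) -> list:
    """合并两路确定结果,保持次序并去重。"""
    merged = []
    for comp in event_based + room_based:
        if comp not in merged:
            merged.append(comp)
    return merged
```

两路确定方式也可各自独立使用,与上文描述一致;合并仅是其中一种组合方式。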
步骤S204:根据直播信息,确定各目标交互组件在第二区域中的位置。
进一步地,在确定目标交互组件后,当目标交互组件包括多个的情况下,目标交互组件在第二区域中的显示位置,会影响用户对各目标交互组件的关注度,从而影响目标交互组件的触发概率。
示例性地,如图9所示,步骤S204的具体实现方式包括:
步骤S2041:根据直播信息,确定各目标交互组件的优先级。
步骤S2042:基于各目标交互组件的优先级,确定各目标交互组件的展示信息。
步骤S2043:基于展示信息确定各目标交互组件在第二区域中的位置。
示例性地,根据直播信息,例如直播事件信息,在确定各目标交互组件的同时,确定各目标交互组件的优先级,进而,基于各目标交互组件的优先级,将高优先级的目标交互组件排列在靠前位置;将低优先级的目标交互组件排列在靠后位置,从而实现对用户的关注点引导,使直播间内的用户所参与的互动事件的集中度更高,提高互动效果。图10为本公开实施例提供的一种基于直播事件信息确定目标交互组件的位置的示意图,如图10所示,基于直播信息中的直播事件信息,当直播事件信息为#1时,表征的直播事件为“上半场比赛开始”、直播事件信息为#2时,表征的直播事件为“下半场比赛开始”。相应的,直播事件信息#1对应的目标交互组件分别为用于展示“AI胜负预测”的交互组件A、用于展示“频道聊天”的交互组件B和用于展示“球队支持率”的交互组件C。其中,直播事件信息#1对应的各目标交互组件的优先级分别为:交互组件A的优先级=3;交互组件B的优先级=2;交互组件C的优先级=2;其中,交互组件A的优先级最高,交互组件B和交互组件C的优先级相同且较低,根据各目标交互组件的优先级,确定对应的展示次序为[交互组件A,交互组件B,交互组件C],各交互组件在第二区域内的位置可参考图10所示,高优先级的交互组件A排在第二区域的组件列表区内的最左侧,交互组件B和交互组件C的位置依次靠后设置。进一步地,在一种可能的实现方式中,将最高优先级的目标交互组件(例如组件列表区内的最左侧),进行默认触发,将其组件内容默认显示在第二区域的组件内容区,从而实现对直播间内的用户的关注点的引导。可选地,在交互组件B和交互组件C的优先级相同的情况下,可以随机设置,也可以进一步基于直播信息进行确定,具体地,例如根据直播信息中的直播间信息,确定交互组件B和交互组件C的相对优先级,从而确定交互组件B和交互组件C在组件列表区的位置排序。
类似地,直播事件信息#2对应的目标交互组件分别为用于展示“AI胜负预测”的交互组件A、用于展示“频道聊天”的交互组件B和用于展示“球队支持率”的交互组件C(与直播事件信息#1相同)。其中,直播事件信息#2对应的各目标交互组件的优先级分别为:交互组件A的优先级=1;交互组件B的优先级=3;交互组件C的优先级=2;相应地,根据各目标交互组件的优先级,确定对应的展示次序为[交互组件B,交互组件C,交互组件A],如图10所示,此时,高优先级的交互组件B排在第二区域的组件列表区内的最左侧,而交互组件C和交互组件A的位置依次靠后设置。本实施例步骤中,通过直播信息,确定各目标交互组件对应的优先级,从而实现各目标交互组件在第二区域的动态排列,更好的满足用户在不同的比赛阶段进行相应互动的需求,提高直播间交互效率和互动效果。
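上述按优先级排列并默认触发最高优先级组件的过程,可用如下Python示意代码表达。组件名与优先级数值沿用上文两组示例;函数名与“同优先级保持传入次序”的细节为本示意所作的假设(同优先级时,实际实现也可如上文所述再按直播间信息细分)。

```python
# 示意:按优先级对目标交互组件排序(优先级高者排在组件列表区前列),
# 并默认触发最高优先级的组件,将其组件内容显示在组件内容区
def arrange_components(priorities: dict) -> tuple:
    """返回 (展示次序, 默认触发的组件);同优先级时保持传入次序(稳定排序)。"""
    order = sorted(priorities, key=lambda c: -priorities[c])
    return order, (order[0] if order else None)

# 直播事件信息#1:A=3,B=2,C=2 -> 展示次序[A, B, C],默认触发A
order_1, default_1 = arrange_components({"交互组件A": 3, "交互组件B": 2, "交互组件C": 2})
# 直播事件信息#2:A=1,B=3,C=2 -> 展示次序[B, C, A],默认触发B
order_2, default_2 = arrange_components({"交互组件A": 1, "交互组件B": 3, "交互组件C": 2})
```

这里利用了Python内置排序的稳定性:优先级相同的交互组件B、C按传入次序先后排列。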
进一步地,所述展示信息还用于表征所述目标交互组件在目标时间段内显示的时间次序。在步骤S2043之前或之后,还包括:基于展示信息确定各目标交互组件在目标时间段内显示的时间次序。
示例性地,目标时间段由展示信息确定,例如为比赛开始至比赛结束之间的时间段,其中,根据展示信息,在比赛开始后至中场休息(第一时间段),在第二区域显示交互组件A;在中场休息至比赛结束(第二时间段),在第二区域显示交互组件B。本实施例中,通过展示信息确定第二区域内的目标交互组件的展示次序,实现基于时间特征的交互组件动态显示,进一步提高交互组件的交互效率。
进一步地,一种可能的实现方式中,直播信息中包括开播时长,根据开播时长,确定各目标交互组件的优先级。其中,开播时长可以是指直播间开启后的持续时长,也可以是基于直播视频对应的目标直播事件为起点,计算的持续时长,例如自“比赛开始”后至当前的持续时长。例如,在“比赛开始”后,在比赛的第10分钟,对应的目标交互组件中,交互组件A的优先级为2级,交互组件B的优先级为1级,在比赛的第30分钟,对应的目标交互组件中,交互组件A的优先级为1级,交互组件B的优先级为2级。本实施例步骤中,通过直播信息中的开播时长,进一步细化各目标交互组件的优先级,从而实现随比赛时长变化而动态调整目标交互组件的排序位置的目的,进一步提高对目标交互组件的动态设置精度,更好的满足用户在不同的比赛阶段进行相应互动的需求,提高直播间交互效率和互动效果。
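“优先级随开播时长动态变化”的做法可以用如下Python示意代码表达。其中的分段边界(以第30分钟为界)与优先级取值沿用上文示例,函数名与分段方式均为本示意所作的假设。

```python
# 示意:以"比赛开始"为起点的开播时长(分钟)决定各目标交互组件的优先级;
# 优先级取值沿用上文示例(第10分钟:A=2,B=1;第30分钟:A=1,B=2)
def priority_by_elapsed(component: str, elapsed_min: int) -> int:
    """按开播时长所处的阶段,返回组件对应的优先级(未知组件返回0)。"""
    if elapsed_min < 30:  # 比赛前段,如第10分钟
        return {"交互组件A": 2, "交互组件B": 1}.get(component, 0)
    # 第30分钟及之后
    return {"交互组件A": 1, "交互组件B": 2}.get(component, 0)
```

由该优先级再做排序,即可实现目标交互组件排序位置随比赛时长动态调整。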
步骤S205:根据直播信息,确定目标交互组件的显示方式和/或固定方式,其中,显示方式为嵌入式显示或悬浮式显示,固定方式为常驻状态或可调整状态。
步骤S206:基于各目标交互组件的显示方式和/或固定方式,在第二区域显示各目标交互组件。
示例性地,对于目标交互组件,具有对应属性:显示方式和/或固定方式,其中,显示方式是指目标交互组件在第二区域进行显示的样式,包括嵌入式显示、悬浮式显示。具体的,嵌入式显示是指目标交互组件嵌入在第二显示区域的组件列表区,组件列表区可以对应一个用于承载多个组件的组件框或组件面板。目标交互组件以嵌入式的方式固定设置在组件框或组件面板内;悬浮式显示是指目标交互组件以非固定的方式,设置在第二区域的组件列表区内,例如设置在组件列表区对应的组件框外,并可以覆盖组件列表区内的交互组件(使其不可见),从而实现更加突出的显示效果,以引导用户进行触发。根据直播信息,当接收到服务端发送的直播信息中包括表征目标直播事件的直播事件信息后,将一个目标交互组件的显示方式配置为悬浮式显示,更具体地,直播信息对应的目标直播事件例如“进球”,则将目标交互组件A配置为悬浮式显示;将目标交互组件B配置为嵌入式显示。
进一步地,固定方式是目标交互组件的另一属性,指目标交互组件对组件移动操作进行响应的方式。可以基于预设的映射关系,根据直播信息,确定目标交互组件对应的固定方式,具体实现方式与显示方式的确定方式类似,此处不再赘述。
步骤S207:响应于组件移动操作,调整处于可调整状态的目标交互组件在第二区域内的位置。
示例性地,第二区域的组件列表区内至少包括两个目标交互组件,其显示方式为嵌入式显示,即至少两个目标交互组件以嵌入式的方式固定设置在组件列表区的组件框内,其中,示例性地,目标交互组件A的固定方式为常驻状态,目标交互组件B的固定方式为可调整状态。当终端设备接收到针对组件框内的目标交互组件的长按手势后,处于可调整状态的目标交互组件B可以响应于拖拽手势(组件移动操作),移动其在第二区域内的位置,更具体地,例如,将目标交互组件B由左侧第二位移动至左侧第一位。
本实施例中,通过根据直播信息确定目标交互组件的固定方式,并结合组件移动操作,调整处于可调整状态的目标交互组件的位置,从而满足用户的个性化操作需求,同时,由于处于常驻状态的目标交互组件的位置不可移动,保证了用户与直播视频的内容相关的关键组件在直播间页面内的展示,从而保证直播间内用户所使用的交互组件的集中度,提高用户参与度,提高赛事直播场景下的用户互动效果。
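上述固定方式的作用——常驻状态的组件不可被拖动、可调整状态的组件可响应组件移动操作——可用如下Python示意代码表达。其中的列表结构、状态取值字符串与函数名均为本示意所作的假设。

```python
# 示意:常驻状态("常驻")的组件忽略移动操作;可调整状态("可调整")的组件
# 响应组件移动操作,调整其在第二区域组件列表区内的位置
def move_component(components: list, fixed_modes: dict, name: str, new_index: int) -> list:
    """返回移动后的组件次序;被移动组件处于常驻状态时保持原次序不变。"""
    if fixed_modes.get(name) != "可调整":
        return list(components)  # 常驻状态:忽略该次组件移动操作
    result = [c for c in components if c != name]
    result.insert(new_index, name)
    return result

# 目标交互组件A为常驻状态,B为可调整状态:B可由左侧第二位移动至左侧第一位
layout = move_component(["交互组件A", "交互组件B"],
                        {"交互组件A": "常驻", "交互组件B": "可调整"},
                        "交互组件B", 0)
```

注意此处的假设:常驻仅指该组件自身不可被拖动,其位次仍可能因其他可调整组件的移动而顺延。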
本实施例中,步骤S201的实现方式与本公开图2所示实施例中的步骤S101的实现方式相同,在此不再一一赘述。
对应于上文实施例的直播间页面显示方法,图11为本公开实施例提供的直播间页面显示装置的结构框图。为了便于说明,仅示出了与本公开实施例相关的部分。参照图11,直播间页面显示装置,包括:
显示模块31,用于显示直播间页面,直播间页面内包括第一区域和第二区域,其中,第一区域用于显示直播视频;
处理模块32,用于根据第一区域内显示的直播视频的内容,在第二区域显示目标交互组件,目标交互组件用于在被触发后,执行与直播视频对应的组件功能。
在本公开的一个实施例中,处理模块32,具体用于:获取直播信息,直播信息表征直播间页面当前显示的直播视频的内容;根据直播信息,确定至少一个目标交互组件;在第二区域显示至少一个目标交互组件。
在本公开的一个实施例中,直播信息中包括直播事件信息,直播事件信息表征直播视频中发生的直播事件;处理模块32在根据直播信息,确定至少一个目标交互组件时,具体用于:根据直播信息中的直播事件信息对应的目标直播事件,确定对应的目标交互组件。
在本公开的一个实施例中,直播信息中包括直播间信息,直播间信息表征直播间内触发目标互动事件的次数;处理模块32在根据所述直播信息中的直播事件信息对应的目标直播事件,确定所述目标交互组件时,具体用于:根据直播信息中的直播间信息,确定直播间页面对应的直播间内触发目标互动事件的目标次数;根据触发所述目标互动事件的目标次数和所述直播事件信息对应的目标直播事件,确定所述目标交互组件。
在本公开的一个实施例中,处理模块32,还用于:根据直播信息,确定各目标交互组件的优先级;基于各目标交互组件的优先级,确定各目标交互组件的展示次序;基于展示次序确定各目标交互组件在第二区域中的位置。
在本公开的一个实施例中,直播信息中包括直播视频的开播时长;处理模块32在根据直播信息,确定各目标交互组件的优先级时,具体用于:根据开播时长,确定各目标交互组件的优先级。
在本公开的一个实施例中,处理模块32在根据直播信息,确定至少一个目标交互组件之后,还用于:根据直播信息,确定目标交互组件的显示方式,其中,显示方式为嵌入式显示或悬浮式显示;
处理模块32在第二区域显示至少一个目标交互组件时,具体用于:基于各目标交互组件的显示方式,在第二区域显示各目标交互组件。
在本公开的一个实施例中,处理模块32在根据直播信息,确定至少一个目标交互组件之后,还用于:根据直播信息,确定目标交互组件的固定方式,其中,固定方式为常驻状态或可调整状态;响应于组件移动操作,调整处于可调整状态的目标交互组件在第二区域内的位置。
在本公开的一个实施例中,处理模块32,具体用于:根据直播视频的内容,在第二区域显示对应的预览组件,预览组件的组件外观用于表征至少一种交互功能的内容;响应于针对该预览组件的触发操作,在第二区域显示至少一个目标交互组件。
其中,显示模块31和处理模块32连接。本实施例提供的直播间页面显示装置3可以执行上述方法实施例的技术方案,其实现原理和技术效果类似,本实施例此处不再赘述。
图12为本公开实施例提供的一种电子设备的结构示意图,如图12所示,该电子设备4包括:
处理器41,以及与处理器41通信连接的存储器42;
存储器42存储计算机执行指令;
处理器41执行存储器42存储的计算机执行指令,以实现如图2-图10所示实施例中的直播间页面显示方法。
其中,可选地,处理器41和存储器42通过总线43连接。
相关说明可以对应参见图2-图10所对应的实施例中的步骤所对应的相关描述和效果进行理解,此处不做过多赘述。
参考图13,其示出了适于用来实现本公开实施例的电子设备900的结构示意图,该电子设备900可以为终端设备或服务器。其中,终端设备可以包括但不限于诸如移动电话、笔记本电脑、数字广播接收器、个人数字助理(Personal Digital Assistant,简称PDA)、平板电脑(Portable Android Device,简称PAD)、便携式多媒体播放器(Portable Media Player,简称PMP)、车载终端(例如车载导航终端)等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。图13示出的电子设备仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图13所示,电子设备900可以包括处理装置(例如中央处理器、图形处理器等)901,其可以根据存储在只读存储器(Read Only Memory,简称ROM)902中的程序或者从存储装置908加载到随机访问存储器(Random Access Memory,简称RAM)903中的程序而执行各种适当的动作和处理。在RAM 903中,还存储有电子设备900操作所需的各种程序和数据。处理装置901、ROM 902以及RAM 903通过总线904彼此相连。输入/输出(I/O)接口905也连接至总线904。
通常,以下装置可以连接至I/O接口905:包括例如触摸屏、触摸板、键盘、鼠标、摄像头、麦克风、加速度计、陀螺仪等的输入装置906;包括例如液晶显示器(Liquid Crystal Display,简称LCD)、扬声器、振动器等的输出装置907;包括例如磁带、硬盘等的存储装置908;以及通信装置909。通信装置909可以允许电子设备900与其他设备进行无线或有线通信以交换数据。虽然图13示出了具有各种装置的电子设备900,但是应理解的是,并不要求实施或具备所有示出的装置。可以替代地实施或具备更多或更少的装置。
特别地,根据本公开的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信装置909从网络上被下载和安装,或者从存储装置908被安装,或者从ROM 902被安装。在该计算机程序被处理装置901执行时,执行本公开实施例的方法中限定的上述功能。
需要说明的是,本公开上述的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读信号介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:电线、光缆、RF(射频)等等,或者上述的任意合适的组合。
上述计算机可读介质可以是上述电子设备中所包含的;也可以是单独存在,而未装配入该电子设备中。
上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被该电子设备执行时,使得该电子设备执行上述实施例所示的方法。
可以以一种或多种程序设计语言或其组合来编写用于执行本公开的操作的计算机程序代码,上述程序设计语言包括面向对象的程序设计语言-诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言-诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络——包括局域网(Local Area Network,简称LAN)或广域网(Wide Area Network,简称WAN)-连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
附图中的流程图和框图,图示了按照本公开各种实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段、或代码的一部分,该模块、程序段、或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
描述于本公开实施例中所涉及到的单元可以通过软件的方式实现,也可以通过硬件的方式来实现。其中,单元的名称在某种情况下并不构成对该单元本身的限定,例如,第一获取单元还可以被描述为“获取至少两个网际协议地址的单元”。
本文中以上描述的功能可以至少部分地由一个或多个硬件逻辑部件来执行。例如,非限制性地,可以使用的示范类型的硬件逻辑部件包括:现场可编程门阵列(FPGA)、专用集成电路(ASIC)、专用标准产品(ASSP)、片上系统(SOC)、复杂可编程逻辑设备(CPLD)等等。
在本公开的上下文中,机器可读介质可以是有形的介质,其可以包含或存储以供指令执行系统、装置或设备使用或与指令执行系统、装置或设备结合地使用的程序。机器可读介质可以是机器可读信号介质或机器可读储存介质。机器可读介质可以包括但不限于电子的、磁性的、光学的、电磁的、红外的、或半导体系统、装置或设备,或者上述内容的任何合适组合。机器可读存储介质的更具体示例会包括基于一个或多个线的电气连接、便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或快闪存储器)、光纤、便捷式紧凑盘只读存储器(CD-ROM)、光学储存设备、磁储存设备、或上述内容的任何合适组合。
第一方面,根据本公开的一个或多个实施例,提供了一种直播间页面显示方法,包括:
显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
根据本公开的一个或多个实施例,根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,包括:获取直播信息,所述直播信息表征所述直播间页面当前显示的直播视频的内容;根据所述直播信息,确定至少一个所述目标交互组件;在所述第二区域显示至少一个所述目标交互组件。
根据本公开的一个或多个实施例,所述直播信息中包括直播事件信息,所述直播事件信息表征所述直播视频中发生的直播事件;根据所述直播信息,确定至少一个所述目标交互组件,包括:根据所述直播信息中的直播事件信息对应的目标直播事件,确定对应的目标交互组件。
根据本公开的一个或多个实施例,所述直播信息中包括直播间信息,所述直播间信息表征直播间内触发目标互动事件的次数;根据所述直播信息,确定至少一个所述目标交互组件,包括:根据所述直播信息中的直播间信息,确定所述直播间页面对应的直播间内触发所述目标互动事件的目标次数;根据触发所述目标互动事件的目标次数和所述直播事件信息对应的目标直播事件,确定所述目标交互组件。
根据本公开的一个或多个实施例,所述方法还包括:根据所述直播信息,确定各所述目标交互组件的优先级;基于各所述目标交互组件的优先级,确定各所述目标交互组件的展示次序;基于所述展示次序确定各所述目标交互组件在所述第二区域中的位置。
根据本公开的一个或多个实施例,所述直播信息中包括所述直播视频的开播时长;所述根据所述直播信息,确定各所述目标交互组件的优先级,包括:根据所述开播时长,确定各所述目标交互组件的优先级。
根据本公开的一个或多个实施例,在根据所述直播信息,确定至少一个所述目标交互组件之后,还包括:根据所述直播信息,确定所述目标交互组件的显示方式,其中,所述显示方式为嵌入式显示或悬浮式显示;在所述第二区域显示至少一个所述目标交互组件,包括:基于各所述目标交互组件的显示方式,在所述第二区域显示各所述目标交互组件。
根据本公开的一个或多个实施例,在根据所述直播信息,确定至少一个所述目标交互组件之后,还包括:根据所述直播信息,确定所述目标交互组件的固定方式,其中,所述固定方式为常驻状态或可调整状态;响应于组件移动操作,调整处于所述可调整状态的目标交互组件在所述第二区域内的位置。
根据本公开的一个或多个实施例,根据所述直播视频的内容,在所述第二区域显示目标交互组件,包括:根据所述直播视频的内容,在所述第二区域显示对应的预览组件,所述预览组件的组件外观用于表征至少一种交互功能的内容;响应于针对该预览组件的触发操作,在第二区域显示至少一个所述目标交互组件。
第二方面,根据本公开的一个或多个实施例,提供了一种直播间页面显示装置,包括:
显示模块,用于显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;
处理模块,用于根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
根据本公开的一个或多个实施例,所述处理模块,具体用于:获取直播信息,所述直播信息表征所述直播间页面当前显示的直播视频的内容;根据所述直播信息,确定至少一个所述目标交互组件;在所述第二区域显示至少一个所述目标交互组件。
根据本公开的一个或多个实施例,所述直播信息中包括直播事件信息,所述直播事件信息表征所述直播视频中发生的直播事件;所述处理模块在根据所述直播信息,确定至少一个所述目标交互组件时,具体用于:根据所述直播信息中的直播事件信息对应的目标直播事件,确定对应的目标交互组件。
根据本公开的一个或多个实施例,所述直播信息中包括直播间信息,所述直播间信息表征直播间内触发目标互动事件的次数;所述处理模块在根据所述直播信息,确定至少一个所述目标交互组件时,具体用于:根据所述直播信息中的直播间信息,确定所述直播间页面对应的直播间内触发所述目标互动事件的目标次数;根据触发所述目标互动事件的目标次数和所述直播事件信息对应的目标直播事件,确定所述目标交互组件。
根据本公开的一个或多个实施例,所述处理模块,还用于:根据所述直播信息,确定各所述目标交互组件的优先级;基于各所述目标交互组件的优先级,确定各所述目标交互组件的展示次序;基于所述展示次序确定各所述目标交互组件在所述第二区域中的位置。
根据本公开的一个或多个实施例,所述直播信息中包括所述直播视频的开播时长;所述处理模块在根据所述直播信息,确定各所述目标交互组件的优先级时,具体用于:根据所述开播时长,确定各所述目标交互组件的优先级。
根据本公开的一个或多个实施例,所述处理模块在根据所述直播信息,确定至少一个所述目标交互组件之后,还用于:根据所述直播信息,确定所述目标交互组件的显示方式,其中,所述显示方式为嵌入式显示或悬浮式显示;所述处理模块在所述第二区域显示至少一个所述目标交互组件时,具体用于:基于各所述目标交互组件的显示方式,在所述第二区域显示各所述目标交互组件。
根据本公开的一个或多个实施例,所述处理模块在根据所述直播信息,确定至少一个所述目标交互组件之后,还用于:根据所述直播信息,确定所述目标交互组件的固定方式,其中,所述固定方式为常驻状态或可调整状态;响应于组件移动操作,调整处于所述可调整状态的目标交互组件在所述第二区域内的位置。
根据本公开的一个或多个实施例,所述处理模块,具体用于:根据所述直播视频的内容,在所述第二区域显示对应的预览组件,所述预览组件的组件外观用于表征至少一种交互功能的内容;响应于针对该预览组件的触发操作,在第二区域显示至少一个所述目标交互组件。
第三方面,根据本公开的一个或多个实施例,提供了一种电子设备,包括:处理器,以及与所述处理器通信连接的存储器;
所述存储器存储计算机执行指令;
所述处理器执行所述存储器存储的计算机执行指令,以实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
第四方面,根据本公开的一个或多个实施例,提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
第五方面,本公开实施例提供一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现如上第一方面以及第一方面各种可能的设计所述的直播间页面显示方法。
以上描述仅为本公开的较佳实施例以及对所运用技术原理的说明。本领域技术人员应当理解,本公开中所涉及的公开范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离上述公开构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本公开中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。
此外,虽然采用特定次序描绘了各操作,但是这不应当理解为要求这些操作以所示出的特定次序或以顺序次序来执行。在一定环境下,多任务和并行处理可能是有利的。同样地,虽然在上面论述中包含了若干具体实现细节,但是这些不应当被解释为对本公开的范围的限制。在单独的实施例的上下文中描述的某些特征还可以组合地实现在单个实施例中。相反地,在单个实施例的上下文中描述的各种特征也可以单独地或以任何合适的子组合的方式实现在多个实施例中。
尽管已经采用特定于结构特征和/或方法逻辑动作的语言描述了本主题,但是应当理解所附权利要求书中所限定的主题未必局限于上面描述的特定特征或动作。相反,上面所描述的特定特征和动作仅仅是实现权利要求书的示例形式。

Claims (13)

  1. 一种直播间页面显示方法,其特征在于,包括:
    显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;
    根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
  2. 根据权利要求1所述的方法,其特征在于,根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,包括:
    获取直播信息,所述直播信息表征所述直播间页面当前显示的直播视频的内容;
    根据所述直播信息,确定至少一个所述目标交互组件;
    在所述第二区域显示至少一个所述目标交互组件。
  3. 根据权利要求2所述的方法,其特征在于,所述直播信息中包括直播事件信息,所述直播事件信息表征所述直播视频中发生的直播事件;根据所述直播信息,确定至少一个所述目标交互组件,包括:
    根据所述直播信息中的直播事件信息对应的目标直播事件,确定所述目标交互组件。
  4. 根据权利要求3所述的方法,其特征在于,所述直播信息中还包括直播间信息,所述直播间信息表征直播间内触发目标互动事件的次数;根据所述直播信息中的直播事件信息对应的目标直播事件,确定所述目标交互组件,包括:
    根据所述直播信息中的直播间信息,确定所述直播间页面对应的直播间内触发所述目标互动事件的目标次数;
    根据触发所述目标互动事件的目标次数和所述直播事件信息对应的目标直播事件,确定所述目标交互组件。
  5. 根据权利要求2所述的方法,其特征在于,所述方法还包括:
    根据所述直播信息,确定各所述目标交互组件的优先级;
    基于各所述目标交互组件的优先级,确定各所述目标交互组件的展示信息,所述展示信息表征所述目标交互组件在所述第二区域中的显示位置,和/或,所述目标交互组件在目标时间段内显示的时间次序。
  6. 根据权利要求5所述的方法,其特征在于,所述直播信息中包括所述直播视频的开播时长;所述根据所述直播信息,确定各所述目标交互组件的优先级,包括:
    根据所述开播时长,确定各所述目标交互组件的优先级。
  7. 根据权利要求2所述的方法,其特征在于,在根据所述直播信息,确定至少一个所述目标交互组件之后,还包括:
    根据所述直播信息,确定所述目标交互组件的显示方式,其中,所述显示方式为嵌入式显示或悬浮式显示;
    在所述第二区域显示至少一个所述目标交互组件,包括:
    基于各所述目标交互组件的显示方式,在所述第二区域显示各所述目标交互组件。
  8. 根据权利要求2所述的方法,其特征在于,在根据所述直播信息,确定至少一个所述目标交互组件之后,还包括:
    根据所述直播信息,确定所述目标交互组件的固定方式,其中,所述固定方式为常驻状态或可调整状态;
    响应于组件移动操作,调整处于所述可调整状态的目标交互组件在所述第二区域内的位置。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,根据所述直播视频的内容,在所述第二区域显示目标交互组件,包括:
    根据所述直播视频的内容,在所述第二区域显示预览组件;
    响应于针对该预览组件的触发操作,在第二区域显示至少一个所述目标交互组件。
  10. 一种直播间页面显示装置,其特征在于,包括:
    显示模块,用于显示直播间页面,所述直播间页面内包括第一区域和第二区域,其中,所述第一区域用于显示直播视频;
    处理模块,用于根据所述第一区域内显示的直播视频的内容,在所述第二区域显示目标交互组件,所述目标交互组件用于在被触发后,执行与所述直播视频对应的组件功能。
  11. 一种电子设备,其特征在于,包括:处理器,以及与所述处理器通信连接的存储器;
    所述存储器存储计算机执行指令;
    所述处理器执行所述存储器存储的计算机执行指令,以实现如权利要求1至9中任一项所述的直播间页面显示方法。
  12. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如权利要求1至9任一项所述的直播间页面显示方法。
  13. 一种计算机程序产品,其特征在于,包括计算机程序,该计算机程序被处理器执行时实现权利要求1至9中任一项所述的直播间页面显示方法。
PCT/CN2023/131113 2022-11-10 2023-11-10 直播间页面显示方法、装置、电子设备及存储介质 WO2024099450A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211407240.4A CN115604500A (zh) 2022-11-10 2022-11-10 直播间页面显示方法、装置、电子设备及存储介质
CN202211407240.4 2022-11-10

Publications (1)

Publication Number Publication Date
WO2024099450A1 true WO2024099450A1 (zh) 2024-05-16

Family

ID=84853776

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/131113 WO2024099450A1 (zh) 2022-11-10 2023-11-10 直播间页面显示方法、装置、电子设备及存储介质

Country Status (2)

Country Link
CN (1) CN115604500A (zh)
WO (1) WO2024099450A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115604500A (zh) * 2022-11-10 2023-01-13 北京字跳网络技术有限公司(Cn) 直播间页面显示方法、装置、电子设备及存储介质
CN117596418A (zh) * 2023-10-11 2024-02-23 书行科技(北京)有限公司 直播间ui展示控制方法、装置、电子设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140156752A1 (en) * 2012-05-11 2014-06-05 iConnectUS LLC Software applications for interacting with live events and methods of use thereof and creating custom content streams
CN111263175A (zh) * 2020-01-16 2020-06-09 网易(杭州)网络有限公司 直播平台的交互控制方法及装置、存储介质及电子设备
CN114385297A (zh) * 2022-01-11 2022-04-22 北京字跳网络技术有限公司 页面的显示方法、装置、电子设备、存储介质和程序产品
CN115022653A (zh) * 2022-04-27 2022-09-06 北京达佳互联信息技术有限公司 信息展示方法、装置、电子设备及存储介质
CN115065838A (zh) * 2022-05-31 2022-09-16 广州方硅信息技术有限公司 直播间封面交互方法、系统、装置及电子设备
CN115604500A (zh) * 2022-11-10 2023-01-13 北京字跳网络技术有限公司(Cn) 直播间页面显示方法、装置、电子设备及存储介质


Also Published As

Publication number Publication date
CN115604500A (zh) 2023-01-13

Similar Documents

Publication Publication Date Title
US11178448B2 (en) Method, apparatus for processing video, electronic device and computer-readable storage medium
WO2024099450A1 (zh) 直播间页面显示方法、装置、电子设备及存储介质
WO2021254186A1 (zh) 一种活动入口的展示方法、装置、电子设备及存储介质
US9285945B2 (en) Method and apparatus for displaying multi-task interface
JP7407340B2 (ja) ホットスポットリストの表示方法、装置、電子機器および記憶媒体
US20240094875A1 (en) Video interaction method and apparatus, electronic device, and storage medium
WO2023104102A1 (zh) 一种直播评论展示方法、装置、设备、程序产品及介质
US11930252B2 (en) Video recommendation method and apparatus, electronic device, and storage medium
WO2022199426A1 (zh) 视频的展示方法、装置、电子设备和存储介质
US11997356B2 (en) Video page display method and apparatus, electronic device and computer-readable medium
US11652763B2 (en) Information display method and apparatus, and electronic device
US20220253492A1 (en) Method, an apparatus, an electronic device and a storage medium for multimedia information processing
US20230328330A1 (en) Live streaming interface display method, and device
US20230133163A1 (en) Video processing method and apparatus, device, storage medium and computer program product
WO2024094130A1 (zh) 内容分享方法、装置、设备、计算机可读存储介质及产品
WO2023134559A1 (zh) 评论提示方法、装置、电子设备、存储介质和程序产品
WO2024078486A1 (zh) 内容展示方法、装置、设备及存储介质
US11886484B2 (en) Music playing method and apparatus based on user interaction, and device and storage medium
WO2023131104A1 (zh) 直播过程中的界面显示方法、装置、设备、介质及产品
WO2021135684A1 (zh) 直播间互动方法、装置、可读介质及电子设备
WO2023143299A1 (zh) 消息展示方法、装置、设备及存储介质
WO2023134610A1 (zh) 一种视频展示与交互方法、装置、电子设备及存储介质
WO2024099455A1 (zh) 直播互动方法、装置、电子设备及存储介质
WO2024165022A1 (en) Method, device and medium for enabling moving comments to a video
WO2024169923A1 (zh) 基于虚拟对象的互动方法、装置、设备、存储介质及产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23888135

Country of ref document: EP

Kind code of ref document: A1