CN113542782A - Method and device for guiding live broadcast interaction, electronic equipment and storage medium - Google Patents

Method and device for guiding live broadcast interaction, electronic equipment and storage medium

Info

Publication number
CN113542782A
CN113542782A
Authority
CN
China
Prior art keywords
timing
live
interactive
space
foreground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110760392.1A
Other languages
Chinese (zh)
Other versions
CN113542782B (en)
Inventor
许圣霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Music Entertainment Technology Shenzhen Co Ltd
Original Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Music Entertainment Technology Shenzhen Co Ltd filed Critical Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority to CN202110760392.1A priority Critical patent/CN113542782B/en
Publication of CN113542782A publication Critical patent/CN113542782A/en
Application granted granted Critical
Publication of CN113542782B publication Critical patent/CN113542782B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method for guiding live broadcast interaction, which comprises the following steps: triggering an interaction guidance timing in response to a user trigger operation associated with a live space of an application program, wherein the interaction guidance timing runs independently of the live space; when the interaction guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application program; and if the live space is detected to be located in the foreground of the application program, triggering a prompt of an interaction guidance message. The application also discloses a corresponding device, electronic equipment, and storage medium for guiding live broadcast interaction.

Description

Method and device for guiding live broadcast interaction, electronic equipment and storage medium
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for guiding live broadcast interaction. The application also relates to a related electronic device and a storage medium.
Background
With the development of internet technology, network live broadcast services have become widely popular: a network anchor user can play a live audio and video stream to multiple audience users who enter a live space.
For example, in a karaoke live space (also referred to as a song room), an anchor user can capture a live audio stream through the microphone of the anchor-side terminal and send it to a live server through the anchor client; the server then distributes the live audio stream to the audience clients of the same song room through broadcast or multicast technology, so that it is played on each audience-side terminal.
Users of internet live broadcasts, including anchors and audiences, have steadily growing expectations for its social aspects; an important point is that audiences and anchors should be able to interact well. In this regard, it is desirable to improve viewer interaction with the anchor by various means, while preventing interactive alerts from degrading the user experience, particularly for viewer users.
This background description is for the purpose of facilitating understanding of relevant art in the field and is not to be construed as an admission of the prior art.
Disclosure of Invention
Therefore, embodiments of the present invention aim to provide a method and an apparatus for guiding live broadcast interaction, together with a related electronic device and storage medium, which can improve the interaction between a viewer in a live space, such as a studio, and an anchor, while preventing the interaction prompts from degrading the user experience, especially for viewer users.
In a first aspect, there is provided a method of guiding live interaction, comprising:
triggering an interaction guidance timing in response to a user trigger operation associated with a live space of an application program, wherein the interaction guidance timing is performed independently of the live space;
when the interaction guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application program;
and if the live space is detected to be located in the foreground of the application program, triggering a prompt of an interaction guidance message.
In a second aspect, there is provided a device for guiding live interaction, comprising:
a timer unit configured to trigger an interaction guidance timing in response to a user trigger operation associated with a live space of an application program, wherein the timer unit is independent of the live space;
a detection unit configured to detect, when the interaction guidance timing reaches a preset duration threshold, whether the live space is located in the foreground of the application program;
and a triggering unit configured to trigger a prompt of an interaction guidance message if the live space is detected to be located in the foreground of the application program.
In a third aspect, an electronic device is provided, comprising: a processor and a memory storing a computer program, the processor being configured to perform the method of any of the embodiments of the invention when the computer program is run.
In a fourth aspect, there is provided a storage medium storing a computer program configured when executed to perform the method of any of the embodiments of the present invention.
In embodiments of the invention, good viewer interaction with the anchor may be facilitated by providing periodic guidance to the user, particularly the viewer user, for example to encourage viewers to reward the anchor, while avoiding undue annoyance that would harm the user's experience in the live space. In addition, considering that a user often switches among different modules of an application program, and possibly among different application programs, when using an electronic device, embodiments of the present invention can still effectively guide the user toward live broadcast interaction when the user switches within an application or between applications.
Additional optional features and technical effects of embodiments of the invention are set forth, in part, in the description which follows and, in part, will be apparent from the description.
Drawings
Embodiments of the invention will hereinafter be described in detail with reference to the accompanying drawings, in which the elements shown are not to scale and like or similar reference numerals denote like or similar elements:
FIG. 1 illustrates an exemplary architecture diagram of an application according to an embodiment of the present invention, showing device modules according to an embodiment of the present invention;
FIG. 2 shows a first schematic flow chart of a method according to an embodiment of the invention;
FIG. 3 shows a second schematic flow chart of a method according to an embodiment of the invention;
FIG. 4 shows a third schematic flow chart of a method according to an embodiment of the invention;
FIG. 5 shows a fourth schematic flow chart of a method according to an embodiment of the invention;
FIG. 6 shows an exemplary diagram implementing method steps according to an embodiment of the invention;
FIGS. 7A and 7B illustrate exemplary diagrams implementing method steps according to embodiments of the invention;
FIG. 8 shows an exemplary diagram implementing method steps according to an embodiment of the invention;
FIG. 9 illustrates an exemplary process diagram of a method in accordance with one particular embodiment of the invention;
FIG. 10 shows a schematic structural diagram of an apparatus according to an embodiment of the invention;
fig. 11 shows a hardware configuration diagram of an electronic device according to an embodiment of the invention;
FIG. 12A shows an operating system diagram of an electronic device, in accordance with embodiments of the present invention;
FIG. 12B shows an operating system diagram of an electronic device, according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following detailed description and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
In the embodiments of the present invention, a live Application (APP) is to be broadly interpreted: it covers APPs whose main function is live broadcasting as well as other types of APPs with a live function or module. For example, in some embodiments of the present invention, the Applications (APPs) may be different types of APPs with live functionality or modules, including but not limited to: music APPs, social APPs, shopping APPs, video APPs, and the like.
In embodiments of the present invention, a "live space" may be broadly interpreted as an independent space provided in a live application, in which a live audio and video stream is provided by an anchor user to audience users. In the embodiments of the invention, "live space" covers both a "live room" and a "song room".
In the embodiments of the present invention, "foreground" and "background" have their meaning known in the art, referring to whether something is displayed at a given display level of the electronic device; the terms cover the foreground or background of an application (interface) as well as of the operating system (interface).
The embodiment of the invention provides a method and a device for guiding live broadcast interaction. Furthermore, the embodiment of the invention also relates to an electronic device implementing the method and a storage medium storing a program capable of realizing the method when the program is executed. In some embodiments, an apparatus, component, unit or model may be implemented by software, hardware or a combination of software and hardware.
Referring now to fig. 1, an exemplary architecture diagram of an application according to an embodiment of the present invention is shown, illustrating a device/module according to an embodiment of the present invention.
As previously mentioned, the live Applications (APPs) of various embodiments of the present invention may encompass APPs whose main function is live broadcasting, as well as other types of APPs having a live function or module. Fig. 1 shows an architecture diagram of a music-type APP with a live function, such as a karaoke APP, according to an embodiment of the present invention.
With continued reference to fig. 1, in the architecture of the live Application (APP), such as the karaoke APP, one or more live space modules may be provided. In the particular embodiment shown in fig. 1, the live space modules include a live room module 101 and a song room module 102. With the live room module 101, the live APP may provide one or more live rooms in which an anchor user may, for example, sing songs or stream live video to the audience users in the live room. With the song room module 102, the live APP may provide one or more song rooms in which one or more singers may sing solo or karaoke, and audience users may enter the song room to listen. In other embodiments of the invention, only one of the live room and the song room may be provided, or other types of live spaces may be provided.
In the embodiments of the invention, users in a live space such as a live room or a song room can interact with one another, for example audience users with the anchor user, which improves the social aspect of network live broadcasting. In the embodiment shown in fig. 1, the interaction may include rewarding virtual items in the live space, such as an audience user gifting a virtual item to the anchor user. Those skilled in the art will appreciate that in alternative embodiments, the interaction may also include sending a message, sending a bullet-screen comment (barrage), or sending a special effect, etc.
In some embodiments, playback (e.g., the live stream) and rewarding (e.g., gifting) may be provided as separate parts of the system architecture. For example, in the architecture of a live APP such as the karaoke APP exemplarily shown in fig. 1, a reward view module (RewardView) 103 and a virtual-item reward unit (also referred to as a gift-sending processing unit, giftsendacessor) 104 may further be provided; both may be independent of the live room module 101 and the song room module 102 that are mainly used for constructing a live room or a song room.
With continued reference to fig. 1, the architecture of a live APP such as a karaoke APP also provides a timing-based device/module for guiding live interaction that is independent of a live space such as a live room or a song room. In the exemplary embodiment shown in fig. 1, this device/module may also be referred to as an interaction stickiness promotion device (e.g., StickyTimerProcessor) 105.
Implementations of various embodiments of this device/module and of the related methods are described further below with reference to the accompanying drawings.
Referring to fig. 2, a method of guiding live interaction according to an embodiment of the present invention is shown, which may be implemented by the live interaction device/module, for example.
With continued reference to fig. 2, the method may include:
s201: and triggering the interactive guide timing in response to the user trigger operation of the application program live broadcast space.
In an embodiment of the present invention, as described above with reference to fig. 1, the interactive guide timing is performed independently of the live space. The independent interactive guide timing is performed independently of whether a user is in the live broadcast space or whether the live broadcast space is in the foreground of an application program.
In an embodiment of the present invention, the user-triggered operation includes at least one of a user entering a live space or a user (e.g., a viewer user) performing an interactive operation in the live space. As in the embodiment shown in fig. 1, the user interaction includes a user (e.g., audience user) enjoying a virtual item in the live space.
In some embodiments of the present invention, the step S201 further includes:
A1: in response to the user trigger operation, recording an identification number (ID) of the live space associated with the operation, and associating the identification number with the interaction guidance timing.
In order to confirm, when the interaction guidance is due, that the same live space the timing was started for is located in, or has returned to, the foreground of the application, the ID of the live space may be recorded in response to the user's trigger operation; the ID may be cached, for example, in the memory of the application's process. For example, an identification number buffer may be provided, alone or combined with other elements, to record the ID of the live space, as schematically shown in fig. 9.
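Step S201/A1 can be sketched as follows. The class name echoes the StickyTimerProcessor of fig. 1, but every field and method name here is an illustrative assumption, not the patent's actual implementation:

```python
import time

# Illustrative sketch of step S201/A1: start the interaction guidance timing
# and cache the ID of the live space it was started for. All names here are
# assumptions; the patent does not specify an implementation.
class StickyTimerProcessor:
    def __init__(self, threshold_seconds: float):
        self.threshold = threshold_seconds
        self.room_id = None        # identification-number (ID) buffer
        self.started_at = None     # start of the guidance timing

    def on_user_trigger(self, room_id: str) -> None:
        """Called when a user enters a live space or interacts (e.g. rewards) in it."""
        self.room_id = room_id               # A1: record the live-space ID
        self.started_at = time.monotonic()   # timing runs independently of the room

processor = StickyTimerProcessor(threshold_seconds=300)
processor.on_user_trigger("000800")   # song room ID from the fig. 6 example
```

Because the timer state lives in the processor rather than in the live-room or song-room module, the timing survives the user leaving the room, which is the key property the patent relies on.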
S202: when the interaction guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application program.
In some embodiments of the invention, the detection may be triggered through an event mechanism. For example, when the interaction guidance timing reaches the preset duration threshold, an event indicating this may be generated, and the generated event triggers the detection. In some embodiments, events may be generated, cached, consumed, or destroyed as the case may be, as described further below.
In some embodiments of the present invention, a corresponding event pool may be set up to record such events. For example, the events may be recorded in a guidance interaction event pool, also referred to as an interaction stickiness event pool. In some embodiments, an integral buffer may be provided in the memory of the application process, integrating the event pool, the aforementioned identification number buffer, and/or the timing buffer (described below).
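A minimal sketch of such an event pool follows; the event shape (a dict) and the method names are illustrative assumptions:

```python
# Minimal sketch of the guidance-interaction event pool described above.
# Events are recorded (cached) and later consumed, which also destroys them.
class GuidanceEventPool:
    def __init__(self):
        self._events = []

    def record(self, event: dict) -> None:
        """Cache an event, e.g. 'timing reached the threshold'."""
        self._events.append(event)

    def take(self):
        """Consume (and thereby destroy) the oldest cached event, if any."""
        return self._events.pop(0) if self._events else None

pool = GuidanceEventPool()
pool.record({"type": "threshold_reached", "room_id": "000800"})
```

Consuming via `take()` models the patent's consume-and-destroy semantics: once a prompt has been triggered for an event, the event no longer exists in the pool.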
In some embodiments of the present invention, the preset duration threshold may be determined empirically. In other embodiments, it may be determined based on historical statistics, for example based on the historical statistics of the individual user, or based on historical statistical averages across multiple users of the application.
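The patent leaves open how the historical statistics are used. The following is one hypothetical way the threshold could be derived; averaging past interaction intervals is purely an illustrative assumption:

```python
# Hypothetical derivation of the preset duration threshold from historical
# statistics. The averaging rule below is an assumption for illustration,
# not something the patent specifies.
def threshold_from_history(interaction_intervals_s):
    """Average interval, in seconds, between a user's past interactions."""
    return sum(interaction_intervals_s) / len(interaction_intervals_s)

threshold = threshold_from_history([240.0, 300.0, 360.0])
```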
In some embodiments, detecting whether the live space is in the foreground of the application may include:
A2: reading the recorded identification number of the live space;
A3: determining, based on the read identification number, whether the live space is in the foreground.
The step A3 may include:
A4: checking whether the read identification number matches the identification number of the live space currently located in the application foreground, so as to confirm whether that live space is the one associated with the user trigger operation.
S203: if the live space is located in the foreground of the application program, triggering a prompt of an interaction guidance message.
In some embodiments of the invention, the previously generated event is consumed and destroyed when the prompt of the interaction guidance message is triggered.
In some embodiments of the invention, the method may further comprise:
B1: if the live space is detected not to be located in the foreground of the application program, caching the event that the interaction guidance timing has reached the preset duration threshold, without triggering the prompt of the interaction guidance message.
B2: when the live space is switched back to the foreground of the application program, reading the cached event and, if a cached event is read, triggering the prompt of the interaction guidance message.
In this case, the events recorded (cached) in the event pool are read, for example, when the live space is switched back from the background of the application, or when the application itself is switched back to the foreground directly from the system background. If such an event exists, it is consumed, the prompt is triggered, and the event can then be destroyed.
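The prompt-or-cache behavior of S203 together with B1/B2 can be sketched as a small controller; all class, field, and method names are illustrative assumptions, and `_prompt` merely stands in for displaying the guidance message:

```python
# Sketch of the interlocked prompt logic (S203 plus B1/B2): when the timing
# event fires, prompt immediately if the timed live space is in the
# foreground; otherwise cache the event and replay it when the space returns.
class GuidanceController:
    def __init__(self, room_id: str):
        self.room_id = room_id       # ID recorded at step A1
        self.pending = []            # event pool for cached, unconsumed events
        self.prompts_shown = 0

    def on_threshold_reached(self, foreground_room_id):
        event = {"room_id": self.room_id}
        if foreground_room_id == self.room_id:
            self._prompt(event)            # S203: prompt, consuming the event
        else:
            self.pending.append(event)     # B1: cache, do not prompt

    def on_room_foregrounded(self, room_id):
        # B2: on switching back, read the cached event and prompt if present
        if self.pending and self.pending[0]["room_id"] == room_id:
            self._prompt(self.pending.pop(0))

    def _prompt(self, event):
        self.prompts_shown += 1            # stands in for showing the message

ctrl = GuidanceController("000800")
ctrl.on_threshold_reached(foreground_room_id=None)  # room backgrounded: cached
ctrl.on_room_foregrounded("000800")                 # replayed on return
```

Note that the event is consumed exactly once in either path, matching the consume-and-destroy behavior described above.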
Here, the aforementioned ID-checking method may also be used to detect whether the live space has switched back to the foreground of the application; this is not repeated here.
In these embodiments of the invention, the detection and the event recording may be interlocked.
Thus, on the one hand, in response to the recording of an event, it is detected whether the live space, e.g. the live room/song room, is located in the foreground of the application (e.g., whether the user is in the live space); when it is, the prompt of the interaction guidance message is triggered, and the recorded event is optionally removed. On the other hand, when it is detected that the live space switches back to the foreground of the application (e.g., restored from the system background or the application background), it is checked whether the event pool contains a recorded but unconsumed event; if so, the event is consumed, i.e., the prompt of the interaction guidance message is triggered, and the recorded event is optionally removed.
In the embodiments of the invention, by providing an interaction guidance timing independent of the live space such as a live room/song room, the timing can continue against a reasonable duration threshold even when the user has already exited the current live space or the live space is not in the foreground of the application. Moreover, by recording the event that the timing has reached the preset duration threshold and detecting whether the live space (live room/song room) is located in the foreground of the application, the interaction prompt message is presented to the user only when the live space is actually in the foreground. Therefore, even when the live space leaves the application foreground (for example, the user exits the live space or minimizes it within the application interface), the user can still be prompted to interact (e.g., to send a reward) at reasonable intervals, improving the interactivity of the network live broadcast without flooding the user with so many messages that the experience of using the live application suffers.
In a further embodiment, after the timing has started and before it reaches the preset duration threshold, if the live space is no longer in the foreground of the application (e.g., the user exits the live space or minimizes it on the application interface) but the application is still in the foreground of the system interface, the timing continues, without depending on the live space or the live/song room modules described with reference to fig. 1.
For example, fig. 3 shows a second schematic flow chart of a method according to an embodiment of the present invention, covering the case where the live space is not in the foreground of the application while the timing is running. In the embodiment shown in fig. 3, the method may further include:
S301: after the interaction guidance timing is triggered and before it reaches the preset duration threshold, if the live space leaves the foreground of the application but the application remains in the system foreground, continuing the interaction guidance timing.
S302: when the interaction guidance timing reaches the preset duration threshold, if the live space is not located in the foreground of the application and the application is in the system foreground, caching the event that the timing has reached the threshold, without triggering the prompt of the interaction guidance message.
In some embodiments, as previously described, in response to a generated event, it is detected whether the live space is in the foreground of the application; if the live space has left the foreground while the application remains in the system foreground, the prompt of the interaction guidance message is not triggered, and the event is neither consumed nor destroyed but cached.
S303: when the live space is switched back (from the application background) to the foreground of the application, reading the recorded event and, if it is read, triggering the prompt of the interaction guidance message.
An implementation of the relevant method steps according to an embodiment of the invention is described in conjunction with figs. 2, 3, 6 and 9, where fig. 6 shows an exemplary diagram of the method steps. In the embodiment shown in fig. 6, the electronic device 600 is, for example, a mobile terminal on which a live APP can be installed; in the illustrated embodiment this is a karaoke APP with a live function, which may have, for example, the architecture shown in fig. 1. Fig. 6 shows the live APP in the foreground of the electronic device.
Although not shown in the figure, when the user enters a live space such as the song room 610, an interaction guidance timer (e.g., a reward stickiness promotion timer) is triggered. In some embodiments, the identification number (ID) of the live space is recorded when the user enters it, such as the song room ID shown in fig. 6: 000800. As previously mentioned, the interaction guidance timing may also be triggered in response to other user operations in the live space, for example an interactive operation such as a reward operation. The timing is implemented, for example, by a timer 630 that is independent of the live space.
With reference to figs. 6 and 3, after the interaction guidance timing is triggered and before it reaches the preset duration threshold, if the live space is not in the foreground of the application but the application is in the foreground of the system, in other words when the user switches within the application, the interaction guidance timing continues. As in the example of fig. 6, the timer keeps running as the user switches within the application. In this situation, a floating window or "bubble" 620 of the live space, such as the song room 610, may be displayed; it can serve as a quick link back to the live space. Here, the user may not have truly exited the live space: for example, the user may still be listening to its live audio stream through a speaker or headphones paired with the electronic device. In other embodiments, the live space being absent from the application foreground may also cover the situation where the user has exited the live space. In some embodiments of the present invention, the timer may be re-triggered (reset) when the user exits the current live space and enters a new one.
Referring to figs. 2, 3, 6 and 8 in combination, when the timer reaches the preset duration threshold, an event indicating that the interaction guidance timing has reached the threshold may be generated, for example in the event pool 860. At this point, it can be detected whether the live space, such as the song room 810 (ID: 000800), is in the foreground; if so, a prompt of the interaction guidance information can be triggered, such as the reward guidance message 840 shown in fig. 8. Upon triggering the prompt, the event is consumed and may be destroyed.
With reference to figs. 2, 3 and 6, if the timer reaches the preset duration threshold but the live space is detected not to be in the application foreground (for example, it is in the application background or the system background), the event that the timing has reached the threshold may be recorded (cached) in the event pool without triggering the prompt. The live space module, such as the live/song room module, can then detect in real time, e.g. by listening, when the live space, such as the song room 820 (ID: 000800), switches back to the foreground of the application. As shown in fig. 8, when the song room is switched back, the prompt of the interaction guidance message 840 is triggered.
According to a further embodiment, a solution when an application switches to the system background may also be provided.
Fig. 4 shows a solution for switching to the background of the system according to an embodiment of the present invention.
Referring to fig. 4, a third schematic flow chart of a method according to an embodiment of the invention is shown. On the basis of the method embodiments as shown in fig. 2 and/or fig. 3, the method in the embodiment shown in fig. 4 may further include:
S401: after the interactive guidance timing is triggered and before it reaches the preset duration threshold, when the application program is switched to the system background, suspend the interactive guidance timing and record the timing duration of the interactive guidance timing.
S402: when the application program is switched back from the system background to the system foreground, read the recorded timing duration and continue timing from that duration.
Reference is made in combination to fig. 2, 3, 4, 6, 7A, 7B and 9. As previously mentioned and as exemplarily shown in fig. 9, the timing is kept separate from the live room/song room.
As exemplarily shown in fig. 7A and with reference to fig. 9, when the live APP of the electronic device 700 is switched to the system background, for example because the application is suspended, the timer 730 pauses the interactive guidance timing and records the timing duration at the moment of pausing, for example in the timing buffer 750.
As exemplarily shown in fig. 7B and with reference to fig. 9, when the live APP of the electronic device 700 is switched back to the system foreground, the live space may or may not be in the application foreground; the live space shown in fig. 7B, such as the song room (ID: 000800), is not in the application foreground but is presented as a floating window. The timer 730 may then read the recorded pause-time duration from the timing buffer 750 and continue timing from that duration.
In the embodiment shown in fig. 4, the time the application spends in the system background does not count toward the guidance duration.
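Steps S401 and S402 amount to a pausable timer whose background time is excluded from the count. The sketch below (names and the fake-clock style are ours, for illustration only) records the duration in a "timing buffer" field on pause and resumes from it on return to the foreground:

```python
class PausableTimer:
    """Illustrative sketch of S401/S402: time spent in the system
    background does not count toward the guidance duration."""

    def __init__(self, clock):
        self._clock = clock
        self._start = clock()
        self._saved = None    # timing buffer: duration recorded at pause

    def on_app_backgrounded(self):
        # S401: suspend timing and record the duration counted so far.
        self._saved = self._clock() - self._start

    def on_app_foregrounded(self):
        # S402: read the recorded duration and continue timing from it,
        # by shifting the start point so background time is excluded.
        self._start = self._clock() - self._saved
        self._saved = None

    def elapsed(self):
        if self._saved is not None:   # currently paused
            return self._saved
        return self._clock() - self._start
```

For example, three seconds in the foreground, seven seconds in the background and two more seconds in the foreground yield an elapsed guidance duration of five seconds, not twelve.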
Fig. 5 shows another solution for switching to the system background according to an embodiment of the present invention.
Referring to fig. 5, a fourth schematic flow chart of a method according to an embodiment of the invention is shown. On the basis of the method embodiments as shown in fig. 2 and/or fig. 3, the method in the embodiment shown in fig. 5 may further include:
S501: after the interactive guidance timing is triggered and before it reaches the preset duration threshold, when the application program is switched to the system background, suspend the interactive guidance timing and record the trigger time point of the interactive guidance timing.
S502: when the application program is switched back from the system background to the system foreground, read the trigger time point and determine the duration elapsed since that time point.
S503: if the elapsed duration is greater than or equal to the preset duration threshold, deem that the guidance interaction timing has reached the preset duration threshold.
S504: if the elapsed duration is less than the preset duration threshold, continue timing from the elapsed duration.
The embodiment shown in fig. 5 differs from the embodiment shown in fig. 4 in that, for example, the time point at which the timing was triggered may be recorded in the timing buffer, and when the application returns to the system foreground, whether the preset duration threshold has been reached is determined from the absolute time of that point. For example, when t1 - t0 ≥ Δt, the preset duration threshold is deemed reached, so that, for example, the aforementioned detection is performed and the prompt of the interactive guidance message is triggered or not according to the detection result; here t1 is the time point at which the application switches from the system background back to the system foreground, t0 is the trigger time point, and Δt is the preset duration threshold. In some embodiments, this determination may include generating the aforementioned event. Thus, in the embodiment shown in fig. 5, the time the application spends in the system background is counted toward the guidance duration, and the event is recorded when the preset duration threshold is reached.
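The absolute-time variant of fig. 5 reduces, on return to the system foreground, to the single comparison t1 - t0 ≥ Δt. A minimal illustrative sketch follows; the function name and return values are our own notation, not the patent's:

```python
def resume_or_fire(t0, t1, threshold):
    """Illustrative sketch of S501-S504: t0 is the recorded trigger time
    point, t1 the time point at which the application returns to the
    system foreground, and threshold the preset duration threshold (Δt)."""
    elapsed = t1 - t0
    if elapsed >= threshold:
        # S503: t1 - t0 >= Δt, so the threshold is deemed reached and
        # the corresponding guidance event can be generated.
        return ("reached", None)
    # S504: threshold not yet reached; continue timing from `elapsed`.
    return ("continue", elapsed)
```

Unlike the fig. 4 variant, the time spent in the system background is included in `elapsed` here, since only wall-clock time points are compared.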
In some embodiments of the present invention, when the application program stays in the system background for a long time (far beyond the preset duration threshold), the timer may be automatically cleared and/or the buffer or the event pool may be reclaimed or released, based on a memory-reclamation mechanism configured by the system (e.g. Android or iOS) or a specially configured memory-reclamation mechanism. In that case the guidance interaction timing mechanism (and its events) is reset, and subsequently entering the same live space triggers a new guidance interaction timing, which is not described again here.
In some embodiments of the present invention, as shown in fig. 10, there is further provided a device 1000 for guiding live interaction, which may be, for example, an interactive viscosity increasing device as shown in fig. 1. The device 1000 may include a timer unit 1001, a detection unit 1002, and a trigger unit 1003.
In some embodiments, the timer unit 1001 may be configured to trigger the interactive guidance timing in response to a user trigger operation of the application live space. In the embodiment of the present invention, the timer unit 1001 is independent from the live space (not shown).
In some embodiments, the detecting unit 1002 is configured to detect whether the live space is located in the foreground of the application program when the guiding interaction timing reaches a preset duration threshold.
In some embodiments, the triggering unit 1003 may be configured to trigger the prompt of the interactive guidance message if it is detected that the live broadcast space is located in the foreground of the application program.
In some optional embodiments, the device may further include a guidance interaction event pool, which may be configured to detect that the live space has left the application foreground while the application is in the system foreground, and to cache an event indicating that the guidance interaction timing has reached the preset duration threshold. In some embodiments, the triggering unit 1003 may be further configured to read the cached event when the live space is switched back to the application foreground and, if the cached event is read, trigger the prompt of the interactive guidance message.
In some optional embodiments, the apparatus 1000 for guiding live interaction may further include a timing buffer unit configured to pause the interactive guidance timing when the application is switched to the system background (after the interactive guidance timing is triggered and before it reaches the preset duration threshold) and to record the timing duration of the interactive guidance timing in a timing buffer. In this embodiment, the timer unit 1001 may be further configured to read the recorded timing duration from the timing buffer and continue timing from that duration when the application switches from the system background back to the system foreground.
It will be clear to a person skilled in the art that without causing a contradiction, the device of the present embodiment may incorporate method features described in other embodiments, and vice versa.
An embodiment of the invention further provides an electronic device. In a preferred embodiment of the present invention, the electronic device may be a mobile terminal, preferably a mobile phone or a tablet. By way of exemplary implementation only, fig. 11 illustrates a hardware architecture diagram of a particular embodiment, such as mobile terminal 1100; and fig. 12A and 12B are schematic diagrams illustrating operating-system configurations of embodiments of the mobile terminal.
In the illustrated embodiment, mobile terminal 1100 may include a processor 1101, an external memory interface 1112, an internal memory 1110, a universal serial bus (USB) interface 1113, a charge management module 1114, a power management module 1115, a battery 1116, a mobile communication module 1140, a wireless communication module 1142, antennas 1139 and 1141, an audio module 1134, a speaker 1135, a receiver 1136, a microphone 1137, a headset interface 1138, keys 1109, a motor 11011, an indicator 1107, a subscriber identity module (SIM) card interface 1111, a display 1105, a camera 1106, a sensor module 1120, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the mobile terminal 1100. In other embodiments of the present application, mobile terminal 1100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In some embodiments, processor 1101 may include one or more processing units. In some embodiments, the processor 1101 may include one or a combination of at least two of the following: an Application Processor (AP), a modem processor, a baseband processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, a neural Network Processor (NPU), and so forth. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural center and a command center of the mobile terminal 1100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor for storing instructions and data. In some embodiments, the memory in the processor is a cache. The memory may hold instructions or data that the processor has just used or uses cyclically. If the processor needs that instruction or data again, it can be fetched directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 1101, thereby improving system efficiency.
The NPU is a Neural Network (NN) computational processor that processes input information quickly by referencing a biological neural network structure, such as by referencing transfer patterns between human brain neurons, and may also be continuously self-learning.
The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor may include one or more GPUs that execute program instructions to generate or alter display information.
The digital signal processor (DSP) is used to process digital signals; in addition to digital image signals, it can also process other digital signals.
In some embodiments, processor 1101 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, and so forth.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not constitute a limitation to the structure of the mobile terminal. In other embodiments of the present application, the mobile terminal may also adopt different interface connection manners or a combination of multiple interface connection manners in the foregoing embodiments.
The wireless communication function of the mobile terminal 1100 may be implemented by the antennas 1139 and 1141, the mobile communication module 1140, the wireless communication module 1142, a modem processor, a baseband processor, or the like.
The mobile terminal 1100 may implement audio functions, such as music playing and sound recording, through the audio module 1134, the speaker 1135, the receiver 1136, the microphone 1137, the headset interface 1138, the application processor, and so on.
Audio module 1134 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The microphone 1137 is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 1137 to input the sound signal into it.
The sensor module 1120 may include one or more of the following sensors:
the pressure sensor 1123 is configured to sense a pressure signal and convert the pressure signal into an electrical signal.
The air pressure sensor 1124 is used to measure air pressure.
The magnetic sensor 1125 includes a hall sensor.
Gyroscopic sensor 1127 may be used to determine the motion pose of mobile terminal 1100.
Acceleration sensor 1128 may detect the magnitude of acceleration of mobile terminal 1100 in various directions.
The distance sensor 1129 may be configured to measure distance.
The proximity light sensor 1121 may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 1122 is used to sense ambient light level.
The fingerprint sensor 1131 may be configured to capture a fingerprint.
The touch sensor 1132 may be disposed on the display screen; the touch sensor and the display screen form a touchscreen, also called a "touch panel". The touch sensor is used to detect a touch operation applied on or near it.
The bone conduction sensor 1133 may acquire a vibration signal.
A software operating system of an electronic device (computer) such as a mobile terminal may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
The embodiments illustrated herein exemplify the software structure of a mobile terminal, taking the iOS and android operating system platforms, respectively, as a layered architecture. It is contemplated that embodiments herein may be implemented in different software operating systems.
In the embodiment shown in fig. 12A, the solution of the embodiment of the present invention may employ the iOS operating system. The iOS operating system adopts a four-layer architecture comprising, from top to bottom, a Cocoa Touch layer 1210, a Media layer 1220, a Core Services layer 1230 and a Core OS layer 1240. The Cocoa Touch layer 1210 provides various frameworks commonly used in application development, most of them interface-related, and is responsible for the user's touch interactions on the iOS device. The Media layer provides the audio-visual technology used in applications, such as graphics and images, audio technology, video, and audio/video streaming frameworks. The Core Services layer provides the underlying system services required by applications. The Core OS layer contains most of the low-level, hardware-related functionality.
In an embodiment of the present invention, UIKit is the user interface framework of the Cocoa Touch layer 1210.
Fig. 12B is a schematic structural diagram of an Android (Android) operating system, which may be adopted in the scheme of the embodiment of the present invention. The layered architecture divides the software into several layers, which communicate via software interfaces. In some embodiments, the android system is divided into four layers, from top to bottom, an application layer 1210 ', an application framework layer 1220', an android Runtime (Runtime) and system libraries 1230 ', and a kernel layer 1240', respectively.
The application layer 1210' may include a series of application packages.
The application framework layer 1220' provides an Application Programming Interface (API) and a programming framework for applications of the application layer. The application framework layer includes a number of predefined functions.
The window manager is used for managing window programs.
The content provider is used to store and retrieve data so that it can be accessed by the application.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide a communication function of the mobile terminal.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without requiring user interaction.
The Android Runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine.
The system library may include a plurality of functional modules. For example, the surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like.
The kernel layer 1240' is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio interface, a sensor driver, power management, and a GPS interface. In some embodiments of the present invention, the display of the frame animation may invoke a display driver.
The systems, devices, modules or units described in the above or below embodiments of the present invention may be implemented by a computer or its associated components. The computer may be, for example, a mobile terminal, a smart phone, a Personal Computer (PC), a laptop, a vehicle-mounted human interaction device, a personal digital assistant, a media player, a navigation device, a game console, a tablet, a wearable device, a smart television, an internet of things system, a smart home, an industrial computer, a server, or a combination thereof, as the case may be.
In some embodiments of the present invention, a storage medium may also be provided. In some embodiments, the storage medium stores a computer program configured to perform the method of any of the embodiments of the present invention when executed.
Storage media in embodiments of the invention include permanent and non-permanent, removable and non-removable articles of manufacture in which information storage may be accomplished by any method or technology. Examples of storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
The methods, programs, systems, apparatuses, etc., in embodiments of the present invention may be performed or implemented in a single or multiple networked computers, or may be practiced in distributed computing environments. In the described embodiments, tasks may be performed by remote processing devices that are linked through a communications network in such distributed computing environments.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Thus, it will be apparent to one skilled in the art that the implementation of the functional modules/units or controllers and the associated method steps set forth in the above embodiments may be implemented in software, hardware, and a combination of software and hardware.
Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with an embodiment of the present invention need not be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel/combined processing of the steps may also be possible or may be advantageous.
In this document, "first" and "second" are used to distinguish different elements in the same embodiment, and do not denote any order or relative importance.
While various embodiments of the invention have been described herein, the description of the various embodiments is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and features and components that are the same or similar to one another may be omitted for clarity and conciseness. As used herein, "one embodiment," "some embodiments," "examples," "specific examples," or "some examples" are intended to apply to at least one embodiment or example, but not to all embodiments, in accordance with the present invention. The above terms are not necessarily meant to refer to the same embodiment or example. Various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Exemplary systems and methods of the present invention have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the systems and methods described herein may be made in practicing the systems and/or methods without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. A method for directing live interaction, comprising:
triggering an interactive guide timing in response to a user trigger operation associated with a live space of an application program, wherein the interactive guide timing is performed independently of the live space;
when the guide interaction timing reaches a preset duration threshold, detecting whether the live broadcast space is located in the foreground of the application program;
and if the live broadcast space is detected to be positioned in the foreground of the application program, triggering the prompt of the interactive guidance message.
2. The method of claim 1, further comprising:
if the live broadcast space is detected not to be located in the foreground of the application program, caching an event which guides the interactive timing to reach a preset time length threshold value and not triggering the prompt of an interactive guide message;
and when the live broadcast space is switched back to the foreground of the application program, reading the cached event, and if the cached event is read, triggering the prompt of an interactive guidance message.
3. The method of claim 1, further comprising:
after triggering the interactive guide timing and before the guide interactive timing reaches a preset time threshold, if the live broadcast space leaves the foreground of the application program and the application program is located in the system foreground, the live broadcast interactive guide timing is continued.
4. The method of any of claims 1 to 3, further comprising:
after triggering the interactive guide timing and before the guide interactive timing reaches a preset time threshold, when the application program is switched to a system background, suspending the interactive guide timing and recording the timing time of the interactive guide timing;
and when the application program is switched back to the system foreground from the system background, reading the recorded timing duration and continuing timing by using the timing duration.
5. The method of any of claims 1 to 3, further comprising:
after triggering the interactive guide timing and before the guide interactive timing reaches a preset time threshold, when the application program is switched to a system background, suspending the interactive guide timing and recording a trigger time point of the interactive guide timing;
when the application program is switched back to a system foreground from a system background, reading the trigger time point, and determining the duration elapsed from the trigger time point;
if the elapsed time is greater than or equal to the preset time threshold, defining that the guiding interaction timing reaches the preset time threshold;
and if the elapsed time length is less than the preset time length threshold value, continuing timing by using the elapsed time length.
6. The method of any one of claims 1 to 3, wherein triggering an interactive guided timing in response to a user-triggered operation of an application live space comprises:
responding to the user trigger operation, recording an identification number of the live broadcast space associated with the user trigger operation, and associating the identification number with the interactive guide timing;
the detecting whether the live broadcast space is located in a foreground of the application program includes:
reading the identification number of the recorded live broadcast space;
determining whether the live space is in a foreground of the application program based on the read identification number.
7. The method of claim 6, wherein the determining whether the live space is in the foreground based on the identification number read comprises:
and checking whether the read identification number is consistent with the identification number of the live broadcast space positioned on the application program foreground or not so as to confirm whether the live broadcast space positioned on the application program foreground is the live broadcast space associated with the user trigger operation or not.
8. The method of any of claims 1-3, wherein the user-triggered operation comprises at least one of the user entering the live space and the user interacting with a host in the live space.
9. An apparatus for directing live interaction, comprising:
a timer unit configured to trigger interactive guided timing in response to a user trigger operation of an application live space, wherein the timer unit is independent of the live space;
the detection unit is configured to detect whether the live broadcast space is located in the foreground of the application program when the guiding interaction timing reaches a preset duration threshold;
and the triggering unit is configured to trigger the prompt of the interactive guidance message if the live broadcast space is detected to be positioned in the foreground of the application program.
10. The apparatus for directing live interaction of claim 9, further comprising:
the time cache unit is configured to pause the interactive guidance timing when the application program is switched to a system background after the interactive guidance timing is triggered and before the guidance interactive timing reaches a preset time threshold, and record the timing time of the interactive guidance timing in a timing cache region;
wherein the timer unit is further configured to read the recorded timing duration from the timing buffer and continue timing with the timing duration when the application is switched from the system background to the system foreground.
11. An electronic device, comprising: a processor and a memory storing a computer program, the processor being configured to implement the method of any of claims 1 to 8 when running the computer program.
12. A storage medium, characterized in that the storage medium stores a computer program configured to implement the method of any one of claims 1 to 8 when executed.
CN202110760392.1A 2021-07-06 2021-07-06 Method and device for guiding live interaction, electronic equipment and storage medium Active CN113542782B (en)


Publications (2)

Publication Number Publication Date
CN113542782A true CN113542782A (en) 2021-10-22
CN113542782B CN113542782B (en) 2023-11-03


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114257833A (en) * 2021-12-29 2022-03-29 广州方硅信息技术有限公司 Live broadcast room recommending and entering method, system, device, equipment and storage medium
CN114615515A (en) * 2022-03-15 2022-06-10 广州歌神信息科技有限公司 On-line singing hall space scheduling method and device, equipment, medium and product

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110252423A1 (en) * 2010-04-07 2011-10-13 Apple Inc. Opportunistic Multitasking
CN107734390A (en) * 2017-10-27 2018-02-23 广州酷狗计算机科技有限公司 Live broadcasting method, device and storage medium
CN108521607A (en) * 2018-04-04 2018-09-11 Oppo广东移动通信有限公司 The processing method of advertisement, device, storage medium and intelligent terminal in video
CN109121012A (en) * 2018-07-24 2019-01-01 北京潘达互娱科技有限公司 A kind of response method, device, electronic equipment and storage medium
CN110362266A (en) * 2019-07-19 2019-10-22 北京字节跳动网络技术有限公司 Prompt information display methods, system, electronic equipment and computer-readable medium
CN110830844A (en) * 2019-11-20 2020-02-21 四川长虹电器股份有限公司 Intelligent pushing method for television terminal
CN110858926A (en) * 2018-08-24 2020-03-03 武汉斗鱼网络科技有限公司 Sharing method and device for live broadcast room, terminal and storage medium
CN111683263A (en) * 2020-06-08 2020-09-18 腾讯科技(深圳)有限公司 Live broadcast guiding method, device, equipment and computer readable storage medium
CN112243157A (en) * 2020-10-14 2021-01-19 北京字节跳动网络技术有限公司 Live broadcast control method and device, electronic equipment and computer readable medium
CN112770128A (en) * 2020-12-31 2021-05-07 百果园技术(新加坡)有限公司 Live gift playing system, method, device and server
CN112995695A (en) * 2021-04-20 2021-06-18 北京映客芝士网络科技有限公司 Live broadcast interaction method, device, equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114257833A (en) * 2021-12-29 2022-03-29 广州方硅信息技术有限公司 Live broadcast room recommending and entering method, system, device, equipment and storage medium
CN114615515A (en) * 2022-03-15 2022-06-10 广州歌神信息科技有限公司 On-line singing hall space scheduling method and device, equipment, medium and product
CN114615515B (en) * 2022-03-15 2024-04-16 广州歌神信息科技有限公司 Online singing hall space scheduling method and device, equipment, medium and product

Also Published As

Publication number Publication date
CN113542782B (en) 2023-11-03

Similar Documents

Publication Publication Date Title
EP3902278B1 (en) Music playing method, device, terminal and storage medium
CN109600678B (en) Information display method, device and system, server, terminal and storage medium
CN113542782B (en) Method and device for guiding live interaction, electronic equipment and storage medium
CN112584224B (en) Information display and processing method, device, equipment and medium
CN110267067A (en) Live broadcast room recommendation method, apparatus, device and storage medium
CN111966275B (en) Program trial method, system, device, equipment and medium
CN112511850B (en) Co-hosting (mic-linking) method, live broadcast display apparatus, device and storage medium
CN110290392B (en) Live broadcast information display method, device, equipment and storage medium
CN108419113A (en) Caption presentation method and device
CN109275013B (en) Method, device and equipment for displaying virtual article and storage medium
CN113082721B (en) Resource management method and device for application program of integrated game module, electronic equipment and storage medium
EP4184412A1 (en) Method and apparatus for presenting resources
CN113230655B (en) Virtual object control method, device, equipment, system and readable storage medium
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN115002359B (en) Video processing method, device, electronic equipment and storage medium
CN112165628A (en) Live broadcast interaction method, device, equipment and storage medium
CN112004134B (en) Multimedia data display method, device, equipment and storage medium
CN112188268B (en) Virtual scene display method, virtual scene introduction video generation method and device
CN114339308A (en) Video stream loading method, electronic equipment and storage medium
CN112464019B (en) Audio playing method, device, terminal and storage medium
US20220254082A1 (en) Method of character animation based on extraction of triggers from an av stream
CN112433696A (en) Wallpaper display method, device, equipment and medium
KR20230022588A (en) Method and apparatus for assisting watching video contents
CN113448532A (en) Multimedia data playing method and device, electronic equipment and storage medium
CN110808985A (en) Song on-demand method, device, terminal, server and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant