CN113542782B - Method and device for guiding live interaction, electronic equipment and storage medium


Info

Publication number
CN113542782B
Authority
CN
China
Prior art keywords
live
timing
space
interactive
foreground
Prior art date
Legal status
Active
Application number
CN202110760392.1A
Other languages
Chinese (zh)
Other versions
CN113542782A (en)
Inventor
许圣霖
Current Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Original Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority to CN202110760392.1A
Publication of CN113542782A
Application granted
Publication of CN113542782B

Classifications

    • H04N 21/2187: Selective content distribution; servers; source of audio or video content; live feed
    • H04N 21/435: Selective content distribution; client devices; processing of additional data
    • H04N 21/472: Selective content distribution; client devices; end-user interface for requesting content, additional data or services
    • H04N 21/4788: Selective content distribution; client devices; supplemental services communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a method for guiding live interaction, comprising the following steps: triggering interactive guidance timing in response to a user-triggered operation associated with a live space of an application, wherein the interactive guidance timing is performed independently of the live space; when the interactive guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application; and if the live space is detected to be located in the foreground of the application, triggering a prompt of an interactive guidance message. The application also discloses an apparatus for guiding live interaction, an electronic device, and a storage medium.

Description

Method and device for guiding live interaction, electronic equipment and storage medium
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for guiding live interaction. The application also relates to a related electronic device and a storage medium.
Background
With the development of internet technology, network live broadcast services have become widely popular, and a network anchor user can stream live audio and video to multiple audience users who enter a live space.
For example, in a karaoke (K song) live space (also referred to as a song room), the anchor user's live audio stream may be captured through the microphone of the anchor-side terminal and sent to a live server by the anchor client; the server then distributes the live audio stream to the viewer clients in the same song room by broadcast or multicast, so that the stream is played on each viewer-side terminal.
Users of internet live broadcasts, including anchors and audiences, place increasing demands on its social aspects, an important one of which is achieving good interaction between the audience and the anchor. It is therefore desirable to employ various means to enhance the audience's interaction with the anchor, while preventing interaction prompts from degrading the experience of users, particularly audience users.
The description of the background art is only for the purpose of facilitating an understanding of the relevant art and is not to be taken as an admission of prior art.
Disclosure of Invention
Therefore, embodiments of the present invention provide a method and apparatus for guiding live interaction, and a related electronic device and storage medium, which can improve the interaction between the audience and the anchor in a live space such as a song room, while preventing the interaction prompts from degrading the experience of users, particularly audience users.
In a first aspect, a method of guiding live interaction is provided, comprising:
triggering interactive guidance timing in response to a user-triggered operation associated with a live space of an application, wherein the interactive guidance timing is performed independently of the live space;
when the interactive guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application;
and if the live space is detected to be located in the foreground of the application, triggering a prompt of an interactive guidance message.
In a second aspect, an apparatus for guiding live interaction is provided, comprising:
a timer unit configured to trigger interactive guidance timing in response to a user-triggered operation associated with a live space of an application, wherein the timer unit is independent of the live space;
a detection unit configured to detect whether the live space is located in the foreground of the application when the interactive guidance timing reaches a preset duration threshold;
and a triggering unit configured to trigger a prompt of an interactive guidance message if the live space is detected to be located in the foreground of the application.
In a third aspect, there is provided an electronic device comprising: a processor and a memory storing a computer program, the processor being configured to perform the method of any one of the embodiments of the invention when the computer program is run.
In a fourth aspect, there is provided a storage medium storing a computer program configured to, when executed, perform a method of any one of the embodiments of the invention.
In embodiments of the present invention, good interaction between the audience and the anchor, such as the audience rewarding the anchor, can be promoted by providing periodic guidance to the user, particularly the audience user, while avoiding unduly annoying the user and degrading the user's experience in the live space. In addition, considering that a user often switches between different modules of an application, and possibly between different applications, when using an electronic device, the embodiments of the invention can still effectively guide the user to perform live interaction when the user switches within an application or between applications.
Additional optional features and technical effects of embodiments of the invention are described in part below and in part will be apparent from reading the disclosure herein.
Drawings
Embodiments of the present invention will hereinafter be described in conjunction with the appended drawings, wherein like or similar reference numerals denote like or similar elements, and wherein:
FIG. 1 illustrates an exemplary architecture diagram of an application program according to an embodiment of the invention, showing device modules according to an embodiment of the invention;
FIG. 2 shows a first schematic flow chart of a method according to an embodiment of the invention;
FIG. 3 shows a second schematic flow chart of a method according to an embodiment of the invention;
FIG. 4 shows a third schematic flow chart of a method according to an embodiment of the invention;
FIG. 5 shows a fourth schematic flow chart diagram of a method according to an embodiment of the invention;
FIG. 6 shows an exemplary diagram of implementation of method steps according to an embodiment of the invention;
FIGS. 7A and 7B show exemplary diagrams of method steps implemented according to an embodiment of the invention;
FIG. 8 shows an exemplary diagram of implementing method steps according to an embodiment of the invention;
FIG. 9 illustrates an exemplary process diagram of a method according to one particular embodiment of the invention;
FIG. 10 shows a schematic structural view of an apparatus according to an embodiment of the present invention;
fig. 11 shows a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention;
FIG. 12A illustrates an operating system diagram of an electronic device in accordance with an embodiment of the invention;
fig. 12B shows a schematic diagram of an operating system of an electronic device according to an embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent. The exemplary embodiments of the present invention and the descriptions thereof are used herein to explain the present invention, but are not intended to limit the invention.
In embodiments of the invention, a live application (APP) is to be interpreted broadly, covering APPs whose main function is live broadcasting as well as other types of APPs that provide a live broadcast function or module. For example, in some embodiments of the present invention, the application (APP) may be any type of APP with a live function or module, including but not limited to: a music APP, a social APP, a shopping APP, a video APP, etc.
In embodiments of the present invention, a "live space" is to be interpreted broadly as an independent space set up in a live application, in which a live audio and video stream is provided by an anchor user to audience users. In embodiments of the invention, the live space covers both a live room and a song room.
In embodiments of the present invention, "foreground" and "background" have their meaning known in the art, referring to whether content is displayed at the respective display levels of the electronic device, and cover the foreground or background of an application (interface) as well as the foreground or background of an operating system (interface).
The embodiment of the invention provides a method and a device for guiding live interaction. In addition, the embodiments of the present invention also relate to an electronic device implementing the above method and a storage medium storing a program that can implement the above method when executed. In some embodiments, an apparatus, component, unit, or model may be implemented in software, hardware, or a combination of software and hardware.
Referring specifically to FIG. 1, an exemplary architecture diagram of an application according to an embodiment of the present invention is shown, illustrating an apparatus/module according to an embodiment of the present invention.
As previously mentioned, the live applications (APPs) of different embodiments of the present invention may encompass live-centric APPs as well as other types of APPs having a live function or module. Fig. 1 shows a framework diagram of a music-type APP with live functionality, e.g. a K song APP, according to one embodiment of the invention.
With continued reference to fig. 1, in the architecture of the live application (APP), such as the K song APP, one or more live space modules may be provided. In the specific embodiment shown in fig. 1, the live space modules include a live room module 101 and a song room module 102. By means of the live room module 101, the live APP may provide one or more live rooms in which the anchor user may provide a performance, for example singing a song or live video, that is played to the audience users in the live room. By means of the song room module 102, the live APP may provide one or more song rooms in which one or more singers may sing solo or in karaoke (K song) fashion, and audience users may enter the song room to listen. In other embodiments of the invention, only one of the live room and the song room may be provided, or other types of live space may be provided.
In embodiments of the invention, users in a live space such as a live room or a song room, for example audience users and anchor users, can interact with each other to enhance the social aspect of network live broadcasting. In the embodiment shown in FIG. 1, the interaction may include gifting a virtual item in the live space, for example an audience user gifting a virtual item to the anchor user. Those skilled in the art will appreciate that in other embodiments, the interaction may also include sending a message, sending a bullet-screen comment, sending a special effect, etc.
In some embodiments, playback (e.g., the live stream) and gifting (e.g., rewarding) may be provided as separate parts of the system architecture. For example, in the overall architecture of the live APP such as the K song APP exemplarily shown in fig. 1, a gifting module 103 and a virtual item gifting processing unit (also referred to as a gift sending processing unit (GiftSendProcessor)) 104 may be further provided, where the gifting module 103 and the virtual item gifting processing unit 104 may be independent of the live room module 101 and the song room module 102, which are mainly used to construct the live room or the song room.
With continued reference to fig. 1, a timing-based guided live interaction device/module that is independent of the live space, such as the live room or song room, is also provided in the overall architecture of the live APP such as the K song APP as shown. In the exemplary embodiment shown in fig. 1, the guided live interaction device/module may also be referred to as an interaction stickiness enhancing device (e.g., stickyTimerProcessor) 105.
Hereinafter, implementations of various embodiments of the guided live interaction device/module and associated methods will be described with further reference to the accompanying drawings.
Referring to fig. 2, a method of guiding live interaction according to an embodiment of the present invention is shown, which may be implemented, for example, by the guided live interaction device/module.
With continued reference to fig. 2, the method may include:
s201: and responding to the user triggering operation of the live space of the application program, and triggering the interactive guiding timing.
In embodiments of the present invention, as described above with reference to fig. 1, the interactive guidance timing is performed independently of the live space. Performing the interactive guidance timing independently means that the timing runs regardless of whether the user is located in the live space or whether the live space is located in the foreground of the application.
In embodiments of the present invention, the user-triggered operation includes at least one of the user entering the live space or the user (e.g., an audience user) performing an interactive operation in the live space. As in the embodiment shown in FIG. 1, the interactive operation may comprise the user (e.g., an audience user) gifting a virtual item in the live space.
In some embodiments of the present invention, the step S201 further includes:
A1: in response to the user-triggered operation, an identification number (ID) of the live space associated with the user-triggered operation is recorded, and the identification number is associated with the interactive guidance timing.
In order to reliably confirm, when the interaction is guided, that the same live space for which the timing was started is located in or has returned to the foreground of the application, the ID of the live space may be recorded in response to the user's trigger operation; the ID may, for example, be cached in the memory of the application's process. For example, an identification number buffer, alone or combined with other units, may be provided to record the ID of the live space, as schematically shown in fig. 9.
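As a purely illustrative sketch of how such an identification number buffer might look (the names GuidanceCache, GuidanceTimerTrigger and onUserTrigger are assumptions, not taken from the patent), the ID could be kept in process memory and bound to the timing when the user-triggered operation occurs:

```kotlin
// Illustrative sketch only: an in-process cache recording the ID of the live space
// that the interactive guidance timing is bound to. All names are assumed.
object GuidanceCache {
    @Volatile var liveSpaceId: String? = null   // ID of the live space associated with the timing
}

class GuidanceTimerTrigger(private val startTiming: () -> Unit) {
    // Called on a user-triggered operation, e.g. entering a song room or gifting a virtual item.
    fun onUserTrigger(liveSpaceId: String) {
        GuidanceCache.liveSpaceId = liveSpaceId // record the ID and associate it with the timing
        startTiming()                           // the timing itself runs independently of the room
    }
}
```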
S202: when the interactive guidance timing reaches a preset duration threshold, detecting whether the live space is located in the foreground of the application.
In some embodiments of the invention, the detection may be triggered through an event mechanism. For example, when the interactive guidance timing reaches the preset duration threshold, an event indicating that the timing has reached the threshold may be generated, and the detection is triggered by the generated event. In some embodiments, the event may be generated, cached, consumed, or destroyed as the case may be, as described further below.
In some embodiments of the present invention, a corresponding event pool may be set up to record the event. For example, in some embodiments of the invention, the events may be recorded in a guidance interaction event pool or an interaction stickiness event pool. In some embodiments, an integral cache may be provided in the memory of the application's process to integrate the event pool, the aforementioned identification number buffer, and/or the timing buffer (described below).
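A minimal sketch of such an event pool, assuming a single pending threshold-reached event per timing (the names are illustrative, not the patent's implementation):

```kotlin
// Illustrative sketch of a guidance interaction event pool kept in process memory.
data class ThresholdReachedEvent(val liveSpaceId: String, val createdAtMillis: Long)

class GuidanceEventPool {
    private var pending: ThresholdReachedEvent? = null   // at most one cached event per timing

    @Synchronized
    fun record(event: ThresholdReachedEvent) { pending = event }

    // Consume-and-destroy semantics: the event is returned once and then cleared.
    @Synchronized
    fun consume(): ThresholdReachedEvent? {
        val event = pending
        pending = null
        return event
    }
}
```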
In some embodiments of the present invention, the preset duration threshold may be determined empirically. In other embodiments, the preset duration threshold may be determined from historical statistics. In some embodiments, the preset duration threshold may be determined from the historical statistics of the user, or from historical statistical averages over the many users of the application.
In some embodiments, the detecting whether the live space is located in the foreground of the application may include:
A2: reading the recorded identification number of the live space;
A3: determining, based on the read identification number, whether the live space is located in the foreground.
Step A3 may include:
A4: checking whether the read identification number is consistent with the identification number of the live space currently located in the application foreground, so as to confirm whether the live space in the application foreground is the live space associated with the user-triggered operation.
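Steps A2 to A4 could be sketched as follows; foregroundLiveSpaceId is a hypothetical accessor for whatever room the application currently shows and is not part of the patent:

```kotlin
// Illustrative sketch of steps A2-A4: compare the recorded ID with the ID of the live space
// currently in the application foreground.
class ForegroundChecker(private val foregroundLiveSpaceId: () -> String?) {
    fun isTimedLiveSpaceInForeground(recordedId: String?): Boolean {
        val currentId = foregroundLiveSpaceId() ?: return false // no live space in the foreground
        return recordedId != null && recordedId == currentId    // same room as the one being timed
    }
}
```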
S203: if the live space is detected to be located in the foreground of the application, triggering the prompt of the interactive guidance message.
In some embodiments of the invention, the aforementioned generated events are consumed and destroyed when the prompt for the interactive guidance message is triggered.
In some embodiments of the present invention, the method may further comprise:
B1: if the live space is detected not to be located in the foreground of the application, caching an event that the interactive guidance timing has reached the preset duration threshold, and not triggering the prompt of the interactive guidance message.
B2: when the live space is switched back to the foreground of the application, reading the cached event, and triggering the prompt of the interactive guidance message if the cached event is read.
Here, for example, when the live space is switched back from the background of the application, or the application is switched back from the system background directly to the system foreground, the event recorded (cached) in the event pool is read. If such an event exists, it is consumed, a prompt is triggered, and the event can then be destroyed.
The aforementioned ID verification method may also be used when detecting whether the live space has been switched back to the foreground of the application, and is not repeated here.
In these embodiments of the invention, the detection and the event record may be interlocked.
Thus, on the one hand, in response to the recording of an event, it is detected (e.g., monitored) whether the live space, such as the live room/song room, is located in the foreground of the application (e.g., whether the user is located in the live space); when the live room/song room is detected to be in the foreground of the application, a prompt of the interactive guidance message is triggered and the recorded event is optionally removed. On the other hand, when it is detected that the live space, such as the live room/song room, is switched back to the foreground of the application (e.g., restored from the system background or the application background), it is checked whether there is a recorded but not yet consumed event in the event pool; if there is such an event, it is consumed, that is, a prompt of the interactive guidance message is triggered, and the recorded event is optionally removed.
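The interlock described above could be sketched roughly as follows, reusing the GuidanceEventPool, ThresholdReachedEvent and ForegroundChecker types from the earlier sketches; the names and structure are assumptions, not the patent's implementation:

```kotlin
// Illustrative sketch: prompt immediately if the timed room is in the foreground when the
// timing expires; otherwise cache the event and consume it when the room returns.
class GuidancePromptCoordinator(
    private val pool: GuidanceEventPool,
    private val checker: ForegroundChecker,
    private val showPrompt: (String) -> Unit            // e.g. show the gifting guidance message
) {
    fun onTimingReachedThreshold(recordedId: String) {
        if (checker.isTimedLiveSpaceInForeground(recordedId)) {
            showPrompt(recordedId)                       // consumed immediately, nothing cached
        } else {
            pool.record(ThresholdReachedEvent(recordedId, System.currentTimeMillis()))
        }
    }

    fun onLiveSpaceReturnedToForeground(currentId: String) {
        val event = pool.consume() ?: return             // nothing pending
        if (event.liveSpaceId == currentId) showPrompt(currentId)
    }
}
```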
In embodiments of the invention, by setting up interactive guidance timing that is independent of the live space such as the live room/song room, the timing can still proceed according to a reasonable duration threshold even when the user exits the current live space or the live space is not in the foreground of the application. Moreover, by recording an event when the guidance timing reaches the preset duration threshold and detecting whether the live space (live room/song room) is located in the foreground of the application, the interaction prompt message is presented to the user only when the live space is in the foreground of the application. Therefore, even when the live space leaves the foreground of the application (for example, the user exits the live space or the live space is minimized in the application interface), the user's interaction (such as rewarding) can still be promoted at reasonable time intervals, which enhances the interactivity of the network live broadcast without degrading the user's experience with excessive messages.
In a further embodiment, after the timing has started and before it reaches the preset duration threshold, if the live space is no longer in the foreground of the application (e.g., the user exits the live space or minimizes it in the application interface) but the application is still in the foreground of the system interface, the timing continues, independently of the live space or of the live room/song room module described with reference to fig. 1.
Referring to FIG. 3, a second schematic flow chart of a method according to an embodiment of the invention is shown, which illustrates the processing performed when the live space is not in the foreground of the application during the timing. In the embodiment shown in fig. 3, the method may further comprise:
s301: after triggering the interactive guidance timing and before the guidance interactive timing reaches a preset duration threshold, if the live space leaves the foreground of the application program and the application program is positioned at the system foreground, continuing the live interaction guidance timing.
S302: and when the guide interaction timing reaches a preset duration threshold, if the live broadcast space is not located in the foreground of the application program and the application program is located in the system foreground, caching an event that the guide interaction timing reaches the preset duration threshold and not triggering the prompt of the interaction guide message.
In some embodiments, as previously described, in response to the generated event, it will be detected whether the live space is located in the foreground of the application; when the live space leaves the foreground of the application and the application is located in the system foreground, the prompt of the interactive guide message will not be triggered, and at the same time, the event will not be consumed and destroyed, but buffered.
S303: when the live space is switched back to the foreground of the application (from the background of the application), reading the recorded event, and if the recorded event is read, triggering the prompt of the interactive guidance message.
Implementation of the relevant method steps according to an embodiment of the invention is described in connection with fig. 2, 3, 6 and 9, wherein fig. 6 shows an exemplary diagram of method steps according to an embodiment of the invention. In the embodiment shown in fig. 6, the electronic device 600 is, for example, a mobile terminal in which a live APP may be installed, in the illustrated embodiment for example a K song APP with live functionality. The K song APP in the illustrated embodiment may have, for example, the architecture shown in fig. 1. Fig. 6 shows the live APP located in the foreground of the electronic device.
Although not shown, when a user enters a live space such as the song room 610, the interactive guidance timing (e.g., an interaction stickiness boosting timing) is triggered. In some embodiments, the identification number (ID) of the live space is recorded when the user enters the live space such as the song room; as shown in fig. 6, the song room ID is 000800. As previously described, the interactive guidance timing may also be triggered in response to other operations of the user in the live space, for example in response to the user's interactions in the live space such as a reward operation. The timing is implemented, for example, by a timer 630 that is independent of the live space.
Referring to fig. 6 and fig. 3 in combination, after the interactive guidance timing has been triggered and before it reaches the preset duration threshold, if the live space is not in the foreground of the application but the application is in the system foreground, in other words when the user switches within the application, the live interaction guidance timing can continue. As in the example of fig. 6, the timer keeps running when the user switches within the application. In the embodiment shown in fig. 6, where the live space is not in the foreground of the application but the application is in the system foreground, a floating window or "bubble" 620 of the live space such as the song room 610 may be presented as a quick-link icon for returning to the live space such as the song room 610. Here, the user may not have "exited" the live space; for example, the user may still listen to the live audio stream played in the live space such as the song room through a speaker or headphones paired with the electronic device. In other embodiments, the live space not being in the foreground of the application may also cover situations in which the user has exited the live space. Here, in some embodiments of the present invention, it may be arranged that the timer is re-triggered (reset) when the user exits the current live space and enters a new live space.
Referring to fig. 2, 3, 6, and 8 in combination, when the count of the timer reaches the preset duration threshold, an event that the interactive guidance timing has reached the preset duration threshold may be generated, for example in the event pool 860. At this time, it may be detected whether the live space such as the song room 810 (ID: 000800) is located in the foreground; if so, a prompt of the interactive guidance message may be triggered, for example the prompt of interaction guidance information 840 as shown in fig. 8. After the prompt is triggered, the event is consumed and can be destroyed.
Referring to fig. 2, 3 and 6 in combination, if, when the timer reaches the preset duration threshold, the live space is detected not to be in the application foreground (for example, it is in the application background or the system background), an event that the guidance timing has reached the preset duration threshold may be recorded (cached) in the event pool, and the prompt of the interactive guidance information is not triggered at this time. The live space module, such as the live room/song room module, may then detect (e.g., listen for) the live space, such as the song room 820 (ID: 000800), switching back to the foreground of the application. As shown in fig. 8, when the switch back to the song room is detected, the prompt of the interactive guidance information 840 is triggered.
According to a further embodiment, a solution may also be provided when an application is switched to the system background.
Fig. 4 shows a solution for switching to the system background according to an embodiment of the invention.
Referring to fig. 4, a third schematic flow chart of a method according to an embodiment of the invention is shown. On the basis of the method embodiments as shown in fig. 2 and/or fig. 3, the method in the embodiment shown in fig. 4 may further comprise:
s401: after triggering the interactive guidance timing and before the guidance interactive timing reaches a preset duration threshold, suspending the interactive guidance timing and recording the timing duration of the interactive guidance timing when the application program is switched to a system background.
S402: and when the application program is switched back to the system foreground from the system background, reading the recorded timing duration, and continuing to time according to the timing duration.
Reference is made in combination to fig. 2, 3, 4, 6, 7A and 7B and 9. As previously described and as exemplarily shown in fig. 9, the timing is separate from the live room/song room.
As exemplarily shown in fig. 7A, and referring to fig. 9 in combination, when the live APP of the electronic device 700 is switched to the system background, for example because the application is suspended, the timer 730 suspends the interactive guidance timing and records the timing duration at the time of suspension, for example in the timing buffer 750.
As exemplarily shown in fig. 7B, and referring to fig. 9 in combination, when the live APP of the electronic device 700 is switched back to the system foreground (at this time the live space may or may not be located in the foreground of the application; the live space shown in fig. 7B, such as the song room (ID: 000800), is not in the foreground of the application but is presented as a floating window), the timer 730 may read the duration recorded at suspension from the timing buffer 750 and continue timing from the recorded duration.
In the embodiment shown in fig. 4, the time that the application spends in the system background does not count toward the guidance duration.
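A minimal sketch of this FIG. 4 scheme might look as follows (assumed names; a real timer would also schedule the threshold callback, which is elided here):

```kotlin
// Illustrative sketch: background time is excluded by pausing the timing and recording the
// elapsed duration when the app enters the system background, then resuming from it.
class PausableGuidanceTimer(private val thresholdMillis: Long) {
    private var elapsedMillis = 0L        // the timing duration kept in the timing buffer
    private var startedAtMillis = 0L
    private var running = false

    fun start() { elapsedMillis = 0L; resume() }

    fun onAppEnteredBackground() {        // suspend and record the elapsed timing duration
        if (!running) return
        elapsedMillis += System.currentTimeMillis() - startedAtMillis
        running = false
    }

    fun onAppReturnedToForeground() = resume()   // continue timing from the recorded duration

    fun remainingMillis(): Long {
        val current = if (running) System.currentTimeMillis() - startedAtMillis else 0L
        return (thresholdMillis - elapsedMillis - current).coerceAtLeast(0L)
    }

    private fun resume() {
        startedAtMillis = System.currentTimeMillis()
        running = true
    }
}
```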
Another solution for switching to the system background according to an embodiment of the invention is shown in fig. 5.
A fourth schematic flow chart of a method according to an embodiment of the invention is shown with reference to fig. 5. On the basis of the method embodiments as shown in fig. 2 and/or fig. 3, the method in the embodiment shown in fig. 5 may further comprise:
s501: after triggering the interactive guidance timing and before the guidance interactive timing reaches a preset duration threshold, suspending the interactive guidance timing when the application program is switched to a system background, and recording the triggering time point of the interactive guidance timing.
S502: and when the application program is switched back to the system foreground from the system background, the triggering time point is read, and the time length which is passed from the triggering time point is determined.
S503: and if the duration of the experience is greater than or equal to the preset duration threshold, defining that the guide interaction timing reaches the preset duration threshold.
S504: and if the duration of the experience is smaller than the preset duration threshold, continuing to time the duration of the experience.
The embodiment shown in fig. 5 differs from the embodiment shown in fig. 4 in that, for example, the point in time at which the timer is triggered can be recorded in a timer buffer and, when the application program is restored to the system foreground, it is determined whether the preset duration threshold has been reached with the absolute time of said point in time. For example, when t1-t0 is equal to or greater than Δt, defining that the preset duration threshold is reached, for example, performing the foregoing detection, and triggering or not triggering the prompt of the interactive guiding message according to the detection result; wherein t1 is the event point of the application program switching from the system background to the system foreground, t0 is the trigger event point, and Δt is the preset duration threshold. In some embodiments, the defining may include generating the aforementioned event. Thus, in the embodiment shown in FIG. 5, the time that the application is in the background of the system counts the boot duration and records events when a preset duration threshold is reached.
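A minimal sketch of this FIG. 5 scheme, under the same naming assumptions as the sketches above:

```kotlin
// Illustrative sketch: record the absolute trigger time t0; on return to the system
// foreground at t1, define the threshold as reached when t1 - t0 >= Δt.
class TriggerTimeGuidanceTimer(private val thresholdMillis: Long) {
    private var triggerAtMillis = 0L      // t0, recorded when the guidance timing is triggered

    fun onTriggered() { triggerAtMillis = System.currentTimeMillis() }

    // Returns true if the guidance timing is defined to have reached the threshold.
    fun onAppReturnedToForeground(): Boolean {
        val elapsed = System.currentTimeMillis() - triggerAtMillis   // t1 - t0
        return elapsed >= thresholdMillis
        // if false, timing would continue from the elapsed duration for the remaining time
    }
}
```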
In some embodiments of the present invention, when the application stays in the system background for a long time (far exceeding the preset duration threshold), the aforementioned timer may be automatically cleared and/or the aforementioned buffer or event pool may be reclaimed or released based on the memory reclamation mechanism configured by the system (such as Android or iOS), or a specially configured memory reclamation mechanism may be used. In that case the guidance interaction timing (and event) mechanism is reset, and subsequently entering the same live space triggers new guidance interaction timing, which is not repeated here.
In some embodiments of the present invention, as shown in fig. 10, a device 1000 for guiding live interaction is also provided, which may be, for example, an interactive viscosity enhancing device as shown in fig. 1. The device 1000 may comprise a timer unit 1001, a detection unit 1002, a trigger unit 1003.
In some embodiments, the timer unit 1001 may be configured to trigger the interactive guidance timer in response to a user-triggered operation of the application live space. In an embodiment of the present invention, the timer unit 1001 is independent of the live space (not shown).
In some embodiments, the detecting unit 1002 is configured to detect whether the live space is located in the foreground of the application program when the pilot interaction timing reaches a preset duration threshold.
In some embodiments, the triggering unit 1003 may be configured to trigger prompting of an interactive guidance message if the live space is detected to be located in the foreground of the application.
In some alternative embodiments, the apparatus may further comprise a guidance interaction event pool, which may be configured to cache an event that the guidance interaction timing has reached the preset duration threshold when the live space is detected to have left the foreground of the application while the application is located in the system foreground. In some embodiments, the triggering unit 1003 may be further configured to read a cached event when the live space switches back to the foreground of the application, and to trigger a prompt of the interactive guidance message if a cached event is read.
In some optional embodiments, the apparatus 1000 for guiding live interaction may further comprise a timing buffer unit configured to suspend the interactive guidance timing when the application is switched to the system background, after the interactive guidance timing has been triggered and before it reaches the preset duration threshold, and to record the elapsed timing duration in a timing buffer. In this embodiment, the timer unit 1001 may be further configured to read the recorded timing duration from the timing buffer when the application switches back from the system background to the system foreground, and to continue the timing from that duration.
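Purely as an illustration of how the units of the apparatus 1000 might be wired together (this reuses types from the earlier sketches and is an assumption, not the patent's implementation):

```kotlin
// Illustrative composition of timer unit, detection unit and triggering unit.
class GuidedInteractionApparatus(
    private val timerUnit: PausableGuidanceTimer,     // timing, independent of the live space
    private val detectionUnit: ForegroundChecker,     // foreground detection by live-space ID
    private val triggerUnit: (String) -> Unit         // prompts the interactive guidance message
) {
    fun onUserTriggeredOperation(liveSpaceId: String) {
        GuidanceCache.liveSpaceId = liveSpaceId       // record and associate the ID
        timerUnit.start()                             // trigger the interactive guidance timing
    }

    fun onTimingReachedThreshold() {
        val id = GuidanceCache.liveSpaceId ?: return
        if (detectionUnit.isTimedLiveSpaceInForeground(id)) triggerUnit(id)
    }
}
```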
Those skilled in the art will appreciate that the apparatus of this embodiment may incorporate the method features described in other embodiments and vice versa without causing contradiction.
An electronic device is also provided in an embodiment of the invention. In a preferred embodiment of the present invention, the electronic device may be a mobile terminal, preferably a mobile phone or a tablet. Fig. 11 shows, by way of example only, a schematic diagram of the hardware architecture of a particular embodiment, such as mobile terminal 1100; and FIGS. 12A and 12B illustrate operating system architecture diagrams of one particular embodiment of a mobile terminal.
In the illustrated embodiment, mobile terminal 1100 may include a processor 1101, an external memory interface 1112, an internal memory 1110, a Universal Serial Bus (USB) interface 1113, a charge management module 1114, a power management module 1115, a battery 1116, a mobile communication module 1140, a wireless communication module 1142, antennas 1139 and 1141, an audio module 1134, a speaker 1135, a receiver 1136, a microphone 1137, an ear-headphone interface 1138, keys 1109, a motor 11011, an indicator 1107, a Subscriber Identity Module (SIM) card interface 1111, a display screen 1105, an imaging device 1106, and a sensor module 1120, among others.
It should be understood that the illustrated structure of an embodiment of the present application does not constitute a particular limitation of the mobile terminal 1100. In other embodiments of the application, the mobile terminal 1100 may include more or fewer components than those shown, may combine certain components, may split certain components, or may have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In some embodiments, the processor 1101 may include one or more processing units. In some embodiments, the processor 1101 may include one or a combination of at least two of the following: an Application Processor (AP), a modem processor, a baseband processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, a neural Network Processor (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural and command center of the mobile terminal 1100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor for storing instructions and data. In some embodiments, the memory in the processor is a cache memory. The memory may hold instructions or data that the processor has just used or recycled. If the processor needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 1101 is reduced, thus improving the efficiency of the system.
The NPU is a neural network (NN) computing processor; by drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it rapidly processes input information and can also learn continuously.
The GPU is a microprocessor for image processing and is connected with the display screen and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor may include one or more GPUs that execute program instructions to generate or change display information.
A digital signal processor (DSP) is used to process digital signals; in addition to digital image signals, it can process other digital signals.
In some embodiments, the processor 1101 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative, and does not limit the structure of the mobile terminal. In other embodiments of the present application, the mobile terminal may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The wireless communication functions of mobile terminal 1100 can be implemented by antennas 1139 and 1141, mobile communication module 1140, wireless communication module 1142, a modem processor, a baseband processor, or the like.
The mobile terminal 1100 may perform audio functions, such as music playing, recording, etc., through an audio module 1134, a speaker 1135, a receiver 1136, a microphone 1137, an earphone interface 1138, an application processor, etc.
The audio module 1134 is used to convert digital audio information to an analog audio signal output and also to convert an analog audio input to a digital audio signal.
The microphone 1137 is used to convert sound signals into electrical signals. When making a call or transmitting voice information, a user can sound near the microphone through the mouth, inputting a sound signal to the microphone.
The sensor module 1120 may include one or more of the following sensors:
The pressure sensor 1123 is configured to sense a pressure signal, and convert the pressure signal into an electrical signal.
The air pressure sensor 1124 is configured to measure air pressure.
The magnetic sensor 1125 includes a hall sensor.
The gyro sensor 1127 may be used to determine a motion gesture of the mobile terminal 1100.
The acceleration sensor 1128 may detect the magnitude of acceleration of the mobile terminal 1100 in various directions.
The distance sensor 1129 may be configured to measure a distance.
The proximity light sensor 1121 may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 1122 is used to sense ambient light levels.
The fingerprint sensor 1131 may be configured to collect a fingerprint.
The touch sensor 1132 may be disposed on a display screen, and the touch sensor and the display screen form a touch screen, which is also called a "touch screen". The touch sensor is used to detect a touch operation acting on or near it.
The bone conduction sensor 1133 may acquire a vibration signal.
The software operating system of an electronic device (computer) such as a mobile terminal may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
The embodiments shown herein exemplify the software architecture of a mobile terminal with a hierarchical architecture, taking iOS and android operating system platforms, respectively. It is contemplated that embodiments herein may be implemented in different software operating systems.
In the embodiment shown in fig. 12A, the iOS operating system may be adopted in the scheme of the embodiment of the present invention. The iOS operating system adopts a four-layer architecture, from top to bottom, a touchable layer (Cocoa Touch layer) 1210, a Media layer (Media layer) 1220, a Core service layer (Core Services layer) 1230, and a Core operating system layer (Core OS layer) 1240. Touch layer 1210 provides various commonly used frameworks for application development and most of the frameworks are related to interfaces that are responsible for touch interactions by users on iOS devices. The media layer provides audiovisual technology in applications such as graphics images, sound technology, video and audio-video transmission related frameworks, etc. The core service layer provides the basic system services required by the application. The core operating system layer contains most of the low-level near hardware functionality.
In an embodiment of the present invention, UIKit is the user interface framework of touchable layer 1210.
Fig. 12B is a schematic diagram of an Android (Android) operating system, and the solution of the embodiment of the present invention may use an Android operating system. The layered architecture divides the software into several layers, with the layers communicating through software interfaces. In some embodiments, the android system is divided into four layers, from top to bottom, application layer 1210', application framework layer 1220', android Runtime (run time) and system library 1230', and kernel layer 1240', respectively.
The application layer 1210' may include a series of application packages.
The application framework layer 1220' provides an Application Programming Interface (API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
The window manager is used for managing window programs.
The content provider is used to store and retrieve data that is made accessible to the application.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide communication functions of the mobile terminal.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction.
The Android Runtime comprises a core library and a virtual machine and is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the framework layer run in the virtual machine.
The system library may include a plurality of functional modules. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like.
Kernel layer 1240' is a layer between hardware and software. The kernel layer may include display drivers, camera drivers, audio interfaces, sensor drivers, power management, and GPS interfaces. In some embodiments of the invention, the display of the frame animation may invoke a display driver.
The system, apparatus, module or unit described in the above or below embodiments of the present invention may be implemented by a computer or its associated components. According to specific circumstances, the computer may be, for example, a mobile terminal, a smart phone, a Personal Computer (PC), a laptop computer, a vehicle-mounted human-computer interaction device, a personal digital assistant, a media player, a navigation device, a game console, a tablet computer, a wearable device, a smart television, an internet of things system, a smart home, an industrial computer, a server, or a combination thereof.
In some embodiments of the present invention, a storage medium may also be provided. In some embodiments, the storage medium stores a computer program configured to perform the method of any of the embodiments of the present invention when executed.
Storage media in embodiments of the invention include both permanent and non-permanent, removable and non-removable items that may be used to implement information storage by any method or technology. Examples of storage media include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, read only compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device.
Methods, programs, systems, apparatus, etc. in accordance with embodiments of the invention may be implemented or realized in single or multiple networked computers, or in distributed computing environments. In the present description embodiments, tasks may be performed by remote processing devices that are linked through a communications network in such a distributed computing environment.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Thus, it will be apparent to those skilled in the art that the functional modules/units or controllers and associated method steps set forth in the above embodiments may be implemented in software, hardware, and a combination of software/hardware.
The acts of the methods, procedures, or steps described in accordance with the embodiments of the present invention do not have to be performed in a specific order and still achieve desirable results unless explicitly stated. In some embodiments, multitasking and parallel/merge processing of the steps is also possible or may be advantageous.
In this document, the terms "first," "second," and the like are used to distinguish between different elements in the same embodiment and do not denote sequential or relative importance.
Various embodiments of the invention are described herein, but for brevity, description of each embodiment is not exhaustive and features or parts of the same or similar between each embodiment may be omitted. Herein, "one embodiment," "some embodiments," "example," "specific example," or "some examples" means that it is applicable to at least one embodiment or example, but not all embodiments, according to the present invention. The above terms are not necessarily meant to refer to the same embodiment or example. Those skilled in the art may combine and combine the features of the different embodiments or examples described in this specification and of the different embodiments or examples without contradiction.
The exemplary systems and methods of the present invention have been particularly shown and described with reference to the foregoing embodiments, which are merely examples of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes may be made to the embodiments of the systems and methods described herein in practicing the systems and/or methods without departing from the spirit and scope of the invention as defined in the following claims.

Claims (12)

1. A method of directing live interaction, comprising:
triggering an interactive guidance timing in response to a user-triggered operation associated with a live space of an application, wherein the interactive guidance timing is executed independently of whether the user is located in the live space or whether the live space is located in a foreground of the application;
detecting whether the live broadcast space is positioned at the foreground of the application program when the guide interaction timing reaches a preset duration threshold;
and triggering the prompt of the interactive guide message if the live space is detected to be positioned at the foreground of the application program.
2. The method as recited in claim 1, further comprising:
if the live broadcast space is detected not to be located at the foreground of the application program, caching an event that the guide interaction timing reaches a preset duration threshold value and not triggering the prompt of the interaction guide message;
And when the live broadcast space is switched back to the foreground of the application program, reading the cached event, and triggering the prompt of the interactive guide message if the cached event is read.
3. The method as recited in claim 1, further comprising:
after triggering the interactive guidance timing and before the guidance interactive timing reaches a preset duration threshold, if the live space leaves the foreground of the application program and the application program is positioned at the system foreground, continuing the live interaction guidance timing.
4. A method according to any one of claims 1 to 3, further comprising:
after triggering the interactive guidance timing and before the interactive guidance timing reaches the preset duration threshold, suspending the interactive guidance timing and recording the elapsed timing duration of the interactive guidance timing when the application is switched to a system background;
and when the application is switched back from the system background to the system foreground, reading the recorded timing duration, and continuing the timing from the recorded timing duration.
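A hypothetical sketch of the pause-and-resume behaviour of claim 4 follows; segmentStartMs and elapsedMs are illustrative names for the recorded timing state:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.ScheduledFuture
import java.util.concurrent.TimeUnit

// Illustrative sketch of claim 4: suspend the timing when the whole application
// goes to the system background, record how much of the threshold has elapsed,
// and continue from that remainder when the application returns.
class PausableGuidanceTiming(
    private val thresholdMs: Long,
    private val onThresholdReached: () -> Unit
) {
    private val scheduler = Executors.newSingleThreadScheduledExecutor()
    private var pending: ScheduledFuture<*>? = null
    private var segmentStartMs = 0L   // start of the current timing segment
    private var elapsedMs = 0L        // timing duration recorded across pauses

    // Start (or resume) timing for whatever remains of the threshold.
    fun start() {
        segmentStartMs = System.currentTimeMillis()
        pending = scheduler.schedule(
            Runnable { onThresholdReached() },
            thresholdMs - elapsedMs, TimeUnit.MILLISECONDS
        )
    }

    // Application switched to the system background: suspend and record.
    fun onAppBackgrounded() {
        pending?.cancel(false)
        elapsedMs += System.currentTimeMillis() - segmentStartMs
    }

    // Application switched back to the system foreground: continue timing.
    fun onAppForegrounded() {
        if (elapsedMs < thresholdMs) start()
    }
}
```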
5. A method according to any one of claims 1 to 3, further comprising:
after triggering the interactive guidance timing and before the interactive guidance timing reaches the preset duration threshold, suspending the interactive guidance timing when the application is switched to a system background, and recording a triggering time point of the interactive guidance timing;
when the application is switched back from the system background to the system foreground, reading the triggering time point, and determining the duration elapsed since the triggering time point;
if the elapsed duration is greater than or equal to the preset duration threshold, determining that the interactive guidance timing has reached the preset duration threshold;
and if the elapsed duration is less than the preset duration threshold, continuing the timing from the elapsed duration.
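Claim 5 can instead be sketched by storing only the triggering time point and comparing wall-clock time on return; the class and member names below are hypothetical:

```kotlin
// Illustrative sketch of claim 5: record only the triggering time point; when
// the application returns to the system foreground, the elapsed wall-clock
// time decides whether the threshold is already met or how much remains.
class TimestampGuidanceTiming(private val thresholdMs: Long) {
    private var triggeredAtMs = 0L

    // Called on the user-triggered operation: record the triggering time point.
    fun onUserOperation(nowMs: Long = System.currentTimeMillis()) {
        triggeredAtMs = nowMs
    }

    // Called when the application returns to the system foreground.
    // Returns 0 when the threshold is already reached, otherwise the
    // remaining duration that still needs to be timed.
    fun remainingOnForeground(nowMs: Long = System.currentTimeMillis()): Long {
        val elapsed = nowMs - triggeredAtMs
        return if (elapsed >= thresholdMs) 0L else thresholdMs - elapsed
    }
}
```

Note the design difference between the two variants: the accumulated-duration approach of claim 4 excludes time spent in the system background from the threshold, while the time-point approach of claim 5 counts it.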
6. A method according to any one of claims 1 to 3, wherein said triggering an interactive guidance timing in response to a user-triggered operation associated with a live space of an application comprises:
responsive to the user-triggered operation, recording an identification number of the live space associated with the user-triggered operation, and associating the identification number with the interactive guidance timing;
and wherein the detecting whether the live space is located in the foreground of the application comprises:
reading the recorded identification number of the live space;
and determining, based on the read identification number, whether the live space is located in the foreground of the application.
7. The method of claim 6, wherein the determining whether the live space is in the foreground based on the read identification number comprises:
checking whether the read identification number is consistent with an identification number of the live space currently located in the foreground of the application, so as to confirm whether the live space located in the foreground of the application is the live space associated with the user-triggered operation.
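An illustrative sketch of claims 6 and 7: the identification number of the live space that triggered the operation is bound to the timing, and "the live space is in the foreground" is evaluated by comparing it with the identifier of whatever live space is currently foregrounded (foregroundRoomId is a hypothetical accessor):

```kotlin
// Illustrative sketch of claims 6 and 7: associate the recorded room identifier
// with the guidance timing and confirm the foreground state by comparing ids.
class RoomBoundGuidance(private val foregroundRoomId: () -> String?) {
    private var recordedRoomId: String? = null

    // Record the identification number of the live space on the user operation.
    fun onUserOperation(roomId: String) {
        recordedRoomId = roomId
    }

    // The live space associated with the user-triggered operation counts as
    // "in the foreground" only if the currently foregrounded live space
    // carries the same identification number.
    fun isTriggeringRoomInForeground(): Boolean =
        recordedRoomId != null && recordedRoomId == foregroundRoomId()
}
```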
8. A method according to any of claims 1 to 3, wherein the user-triggered operation comprises at least one of the user entering the live space and the user interacting with a host in the live space.
9. An apparatus for guiding live interaction, comprising:
a timer unit configured to trigger an interactive guidance timing in response to a user-triggered operation of a live space of an application, wherein the timer unit is configured to perform the interactive guidance timing irrespective of whether the user is located in the live space or whether the live space is located in a foreground of the application;
a detection unit configured to detect whether the live space is located in the foreground of the application when the interactive guidance timing reaches a preset duration threshold;
and a triggering unit configured to trigger a prompt of an interactive guidance message if the live space is detected to be located in the foreground of the application.
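The unit decomposition of claim 9 could, as one hypothetical reading, be mirrored by three small interfaces wired together by the hosting application:

```kotlin
// Illustrative sketch of the apparatus of claim 9: a timer unit, a detection
// unit and a triggering unit cooperating as the method claims describe.
interface TimerUnit {
    fun startGuidanceTiming(onThresholdReached: () -> Unit)
}

interface DetectionUnit {
    fun isLiveSpaceInForeground(): Boolean
}

interface TriggeringUnit {
    fun showGuidancePrompt()
}

class GuidanceApparatus(
    private val timer: TimerUnit,
    private val detection: DetectionUnit,
    private val triggering: TriggeringUnit
) {
    fun onUserOperation() = timer.startGuidanceTiming {
        if (detection.isLiveSpaceInForeground()) triggering.showGuidancePrompt()
    }
}
```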
10. The apparatus for guiding live interaction of claim 9, further comprising:
a timing buffer unit configured to, after the interactive guidance timing is triggered and before the interactive guidance timing reaches the preset duration threshold, suspend the interactive guidance timing and record the elapsed timing duration of the interactive guidance timing in a timing buffer when the application is switched to a system background;
wherein the timer unit is further configured to read the recorded timing duration from the timing buffer when the application is switched back from the system background to the system foreground, and to continue the interactive guidance timing from the recorded timing duration.
11. An electronic device, comprising: a processor and a memory storing a computer program, the processor being configured to implement the method of any one of claims 1 to 8 when running the computer program.
12. A storage medium storing a computer program which, when executed, implements the method of any one of claims 1 to 8.
CN202110760392.1A 2021-07-06 2021-07-06 Method and device for guiding live interaction, electronic equipment and storage medium Active CN113542782B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110760392.1A CN113542782B (en) 2021-07-06 2021-07-06 Method and device for guiding live interaction, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110760392.1A CN113542782B (en) 2021-07-06 2021-07-06 Method and device for guiding live interaction, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113542782A CN113542782A (en) 2021-10-22
CN113542782B true CN113542782B (en) 2023-11-03

Family

ID=78097792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110760392.1A Active CN113542782B (en) 2021-07-06 2021-07-06 Method and device for guiding live interaction, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113542782B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114257833A (en) * 2021-12-29 2022-03-29 广州方硅信息技术有限公司 Live broadcast room recommending and entering method, system, device, equipment and storage medium
CN114615515B (en) * 2022-03-15 2024-04-16 广州歌神信息科技有限公司 Online singing hall space scheduling method and device, equipment, medium and product

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107734390A (en) * 2017-10-27 2018-02-23 广州酷狗计算机科技有限公司 Live broadcasting method, device and storage medium
CN108521607A (en) * 2018-04-04 2018-09-11 Oppo广东移动通信有限公司 The processing method of advertisement, device, storage medium and intelligent terminal in video
CN109121012A (en) * 2018-07-24 2019-01-01 北京潘达互娱科技有限公司 A kind of response method, device, electronic equipment and storage medium
CN110362266A (en) * 2019-07-19 2019-10-22 北京字节跳动网络技术有限公司 Prompt information display methods, system, electronic equipment and computer-readable medium
CN110830844A (en) * 2019-11-20 2020-02-21 四川长虹电器股份有限公司 Intelligent pushing method for television terminal
CN110858926A (en) * 2018-08-24 2020-03-03 武汉斗鱼网络科技有限公司 Sharing method and device for live broadcast room, terminal and storage medium
CN111683263A (en) * 2020-06-08 2020-09-18 腾讯科技(深圳)有限公司 Live broadcast guiding method, device, equipment and computer readable storage medium
CN112243157A (en) * 2020-10-14 2021-01-19 北京字节跳动网络技术有限公司 Live broadcast control method and device, electronic equipment and computer readable medium
CN112770128A (en) * 2020-12-31 2021-05-07 百果园技术(新加坡)有限公司 Playing system, method and device of live gift and server
CN112995695A (en) * 2021-04-20 2021-06-18 北京映客芝士网络科技有限公司 Live broadcast interaction method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104476B2 (en) * 2010-04-07 2015-08-11 Apple Inc. Opportunistic multitasking of VOIP applications

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107734390A (en) * 2017-10-27 2018-02-23 广州酷狗计算机科技有限公司 Live broadcasting method, device and storage medium
CN108521607A (en) * 2018-04-04 2018-09-11 Oppo广东移动通信有限公司 The processing method of advertisement, device, storage medium and intelligent terminal in video
CN109121012A (en) * 2018-07-24 2019-01-01 北京潘达互娱科技有限公司 A kind of response method, device, electronic equipment and storage medium
CN110858926A (en) * 2018-08-24 2020-03-03 武汉斗鱼网络科技有限公司 Sharing method and device for live broadcast room, terminal and storage medium
CN110362266A (en) * 2019-07-19 2019-10-22 北京字节跳动网络技术有限公司 Prompt information display methods, system, electronic equipment and computer-readable medium
CN110830844A (en) * 2019-11-20 2020-02-21 四川长虹电器股份有限公司 Intelligent pushing method for television terminal
CN111683263A (en) * 2020-06-08 2020-09-18 腾讯科技(深圳)有限公司 Live broadcast guiding method, device, equipment and computer readable storage medium
CN112243157A (en) * 2020-10-14 2021-01-19 北京字节跳动网络技术有限公司 Live broadcast control method and device, electronic equipment and computer readable medium
CN112770128A (en) * 2020-12-31 2021-05-07 百果园技术(新加坡)有限公司 Playing system, method and device of live gift and server
CN112995695A (en) * 2021-04-20 2021-06-18 北京映客芝士网络科技有限公司 Live broadcast interaction method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113542782A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN109640188B (en) Video preview method and device, electronic equipment and computer readable storage medium
CN109600678B (en) Information display method, device and system, server, terminal and storage medium
CN100454239C (en) System and method for accessing system software in a gaming console system via an input device
CN111147878B (en) Stream pushing method and device in live broadcast and computer storage medium
EP3902278B1 (en) Music playing method, device, terminal and storage medium
CN113542782B (en) Method and device for guiding live interaction, electronic equipment and storage medium
CN107079186B (en) Enhanced interactive television experience
WO2020134560A1 (en) Live broadcast room switching method and apparatus, and terminal, server and storage medium
CN111263181A (en) Live broadcast interaction method and device, electronic equipment, server and storage medium
CN111966275B (en) Program trial method, system, device, equipment and medium
CN108694073B (en) Control method, device and equipment of virtual scene and storage medium
CN112511850B (en) Wheat connecting method, live broadcast display device, equipment and storage medium
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN109275013B (en) Method, device and equipment for displaying virtual article and storage medium
JP2016502706A (en) Hybrid advertising support and user-owned content presentation
EP4184412A1 (en) Method and apparatus for presenting resources
CN113230655B (en) Virtual object control method, device, equipment, system and readable storage medium
US8845429B2 (en) Interaction hint for interactive video presentations
CN114302160B (en) Information display method, device, computer equipment and medium
CN112004134B (en) Multimedia data display method, device, equipment and storage medium
CN112188268B (en) Virtual scene display method, virtual scene introduction video generation method and device
US20220254082A1 (en) Method of character animation based on extraction of triggers from an av stream
CN110808985B (en) Song on-demand method, device, terminal, server and storage medium
CN113867977A (en) Equipment control method and equipment
CN112464019A (en) Audio playing method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant