CN113284523A - Dynamic effect display method and device, computer equipment and storage medium - Google Patents

Dynamic effect display method and device, computer equipment and storage medium

Info

Publication number
CN113284523A
CN113284523A (application number CN202010104441.1A)
Authority
CN
China
Prior art keywords
dynamic effect, time point, playing, audio, playing time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010104441.1A
Other languages
Chinese (zh)
Inventor
刘小萱
庞晟立
陈家盛
林毅雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Cyber Tianjin Co Ltd
Original Assignee
Tencent Cyber Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Cyber Tianjin Co Ltd
Priority to CN202010104441.1A
Publication of CN113284523A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B20/00: Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10: Digital recording or reproducing
    • G11B20/10527: Audio or video recording; Data buffering arrangements
    • G11B2020/10537: Audio or video recording
    • G11B2020/10546: Audio or video recording specifically adapted for audio data

Abstract

The embodiments of the application disclose a dynamic effect display method and apparatus, a computer device, and a storage medium. The method includes: displaying an audio playing page, where the audio playing page includes playing information of an audio; and, when it is detected that the playing time point of the audio has reached a dynamic effect playing time point, displaying on the audio playing page a target dynamic effect corresponding to that time point and object information of an interactive object, where the dynamic effect playing time point is a time point at which the interactive object performed an interactive operation on the audio. The scheme can improve the interactivity of audio playing.

Description

Dynamic effect display method and device, computer equipment and storage medium
Technical Field
The application relates to the field of internet technology, and in particular to a dynamic effect display method and apparatus, a computer device, and a storage medium.
Background
With the development of internet technology, people's lifestyles have changed greatly. For example, the way people listen to songs has changed: songs can now be played through an application with an audio playing function on a smart device (such as a smartphone, tablet computer, or smart TV).
As technology develops and demands change, people want more communication and interaction around audio content while listening. However, the prior art falls noticeably short in interactivity related to audio content during audio playback.
Disclosure of Invention
The embodiment of the application provides a dynamic effect display method and device, computer equipment and a storage medium, which can improve interactivity of audio playing.
The embodiment of the application provides a dynamic effect display method, which comprises the following steps:
displaying an audio playing page, wherein the audio playing page comprises playing information of audio;
when it is detected that the playing time point of the audio has reached a dynamic effect playing time point, displaying, on the audio playing page, a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object, where the dynamic effect playing time point is a time point at which the interactive object performed an interactive operation on the audio.
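The two method steps above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the record structure, names, and tolerance value are all assumptions. The idea is that the player keeps a list of effect records and, on each play-position update, picks out those whose playing time point has been reached.

```python
from dataclasses import dataclass

@dataclass
class EffectRecord:
    """One dynamic effect tied to an interactive operation time point (seconds)."""
    time_point: float   # the dynamic effect playing time point
    effect_id: str      # which target dynamic effect to show
    object_info: dict   # interactive object's info, e.g. nickname or avatar

def effects_due(records, play_time, tolerance=0.05):
    """Return the records whose playing time point the current play time has
    reached, within a small tolerance that absorbs polling granularity."""
    return [r for r in records if abs(r.time_point - play_time) <= tolerance]

# Example data, mirroring the "falling snowflake" illustration later in the text.
records = [
    EffectRecord(20.0, "snowflake", {"nickname": "A"}),
    EffectRecord(50.0, "hearts", {"nickname": "B"}),
]
```

With these records, a play-position update at roughly 20 s would yield the snowflake effect together with object A's information, and nothing elsewhere.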
Correspondingly, the embodiment of the application provides a dynamic effect display device, including:
the page display module, configured to display an audio playing page, where the audio playing page includes playing information of an audio;
the dynamic effect display module, configured to display, on the audio playing page, a target dynamic effect corresponding to a dynamic effect playing time point and object information of an interactive object when it is detected that the playing time point of the audio has reached the dynamic effect playing time point, where the dynamic effect playing time point is a time point at which the interactive object performed an interactive operation on the audio.
In some embodiments of the present application, the audio playing page includes a dynamic effect display area and a playing progress display control, and the dynamic effect display module is specifically configured to:
and when the fact that the playing time point of the audio reaches the dynamic effect playing time point is detected, displaying object information of the interactive object on a playing progress display control, and displaying a target dynamic effect corresponding to the dynamic effect playing time point in a dynamic effect display area.
In some embodiments of the present application, the audio playing page includes a playing progress display control, and the dynamic effect display device further includes an object information display module, wherein,
the object information display module, configured to display, at a target position of the playing progress display control, the object information of the interactive object corresponding to the dynamic effect playing time point, where the target position is the position of the dynamic effect playing time point in the playing progress display control.
In some embodiments of the present application, the audio playing page includes a playing progress display control, and the dynamic effect display device further includes a reminding dynamic effect display module, wherein,
the reminding dynamic effect display module, configured to display, at the target position of the playing progress display control, a reminding dynamic effect corresponding to the dynamic effect playing time point, where the target position is the position of the dynamic effect playing time point in the playing progress display control.
In some embodiments of the present application, the object information display module includes:
the display position determining submodule, configured to determine the display position of each playing time point on the playing progress display control based on the playing duration of the audio and the display range of the playing progress display control;
the target position determining submodule, configured to determine, from the display positions of the playing time points, the target position corresponding to the dynamic effect playing time point;
the display submodule, configured to display the object information at the target position of the playing progress display control.
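A minimal sketch of the position determination these sub-modules describe. The linear mapping is an assumption; the patent only says the position is determined from the audio's playing duration and the control's display range.

```python
def target_position(time_point_s, audio_duration_s, control_width_px):
    """Map a dynamic effect playing time point onto the playing progress
    display control by linear proportion, clamped to the control's range."""
    if audio_duration_s <= 0:
        raise ValueError("audio duration must be positive")
    ratio = min(max(time_point_s / audio_duration_s, 0.0), 1.0)  # clamp to track
    return round(ratio * control_width_px)
```

For a 3-minute song rendered on a 300-pixel progress bar, an interaction at 90 seconds would be marked at the 150-pixel offset, the midpoint of the control.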
In some embodiments of the present application, the audio playing page includes an object information display area and a dynamic effect display area corresponding to the object information display area, and the dynamic effect display module is specifically configured to:
when it is detected that the playing time point of the audio has reached the dynamic effect playing time point, display the object information of the interactive object in the object information display area, and display the target dynamic effect corresponding to the dynamic effect playing time point in the dynamic effect display area.
In some embodiments of the present application, the audio playing page includes progress display controls in multiple dimensions of audio playing, and the dynamic effect display module is specifically configured to:
when it is detected that the playing time point of the audio has reached the dynamic effect playing time point, display object information of the interactive object on a first playing progress display control, and display the target dynamic effect corresponding to the dynamic effect playing time point on a second playing progress display control, where the first and second playing progress display controls are any two of the progress display controls in the multiple dimensions.
In some embodiments of the present application, the audio playing page further includes a dynamic effect hiding control, and the dynamic effect displaying apparatus further includes:
the hiding module, configured to hide the object information and the target dynamic effect on the audio playing page when a triggering operation on the dynamic effect hiding control is detected.
In some embodiments of the present application, the audio playing page further includes an interactive control, and the dynamic effect display device further includes:
the acquisition module, configured to display the interactive effect corresponding to the interactive control when an interactive operation on the interactive control is detected, and to acquire the object information of the interactive object and the playing time point of the audio, where that playing time point is the interactive operation time point;
the uploading module, configured to upload the interactive operation time point and the object information to a server.
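The acquisition and uploading modules might produce a request body along these lines. The field names and structure are illustrative assumptions; the patent specifies only that the interactive operation time point and the object information are uploaded to the server.

```python
import json

def build_interaction_payload(object_info, play_time_point_s, audio_id):
    """Serialize the interactive operation time point and object information
    for upload to the server (schema is hypothetical)."""
    return json.dumps({
        "audio_id": audio_id,               # which audio was interacted with
        "time_point": play_time_point_s,    # becomes a dynamic effect playing time point
        "object": object_info,              # nickname, avatar, label, etc.
    })

payload = build_interaction_payload({"nickname": "A"}, 20.0, "song-1")
```

The server would store this record so that later listeners of the same audio see the target dynamic effect fire at the recorded time point.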
In some embodiments of the present application, the dynamic effect display module comprises:
the determining submodule, configured to determine, when it is detected that the playing time point of the audio has reached the dynamic effect playing time point, the target dynamic effect corresponding to that time point based on a first mapping relation, where the first mapping relation is a mapping between reference dynamic effect playing time points of the audio and preset dynamic effects;
the display submodule, configured to display the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the audio playing page.
In some embodiments of the present application, the dynamic effect display device further comprises:
the first determining module, configured to determine a target dynamic effect identifier corresponding to the dynamic effect playing time point based on a second mapping relation, where the second mapping relation is a mapping between reference dynamic effect playing time points of the audio and dynamic effect identifiers;
the second determining module, configured to determine a target dynamic effect storage address corresponding to the target dynamic effect identifier;
the target dynamic effect obtaining module, configured to obtain the target dynamic effect corresponding to the dynamic effect playing time point from the target dynamic effect storage address.
In some embodiments of the present application, the second determining module comprises:
the obtaining submodule, configured to obtain a third mapping relation between dynamic effect identifiers and dynamic effect storage addresses;
the determining submodule, configured to determine the target dynamic effect storage address corresponding to the target dynamic effect identifier based on the third mapping relation.
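The second and third mapping relations can be chained as shown below. The table contents and the CDN-style address are made-up illustrations; the patent only describes the two lookups.

```python
# Second mapping relation: reference dynamic effect playing time point -> effect identifier.
time_to_effect_id = {20.0: "fx_snow", 50.0: "fx_hearts"}

# Third mapping relation: effect identifier -> dynamic effect storage address.
effect_id_to_address = {
    "fx_snow": "https://cdn.example.com/fx/snow.json",
    "fx_hearts": "https://cdn.example.com/fx/hearts.json",
}

def resolve_effect_address(time_point):
    """Resolve a dynamic effect playing time point to the storage address of
    its target dynamic effect, or None if no effect is registered there."""
    effect_id = time_to_effect_id.get(time_point)   # second mapping relation
    if effect_id is None:
        return None
    return effect_id_to_address.get(effect_id)      # third mapping relation
```

Indirecting through the identifier, rather than mapping time points straight to addresses, lets many time points share one stored effect and lets an effect's storage location change without touching the per-audio timetable.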
Correspondingly, an embodiment of the present application further provides a storage medium storing a computer program, the computer program being adapted to be loaded by a processor to perform any of the dynamic effect display methods provided by the embodiments of the present application.
Correspondingly, an embodiment of the present application further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement any of the dynamic effect display methods provided by the embodiments of the present application.
In the embodiments of the application, an audio playing page is first displayed, the audio playing page including playing information of an audio. Then, when it is detected that the playing time point of the audio has reached a dynamic effect playing time point, a target dynamic effect corresponding to that time point and object information of an interactive object are displayed on the audio playing page, the dynamic effect playing time point being a time point at which the interactive object performed an interactive operation on the audio. The scheme can improve the interactivity of audio playing in social scenarios.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a scene schematic diagram of a dynamic effect display method provided by an embodiment of the present invention;
FIG. 2 is a flow chart of a dynamic effect display method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a partial page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 4 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 5 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 6 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 7 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 8 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 9 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 10 is another flow chart of a dynamic effect display method provided by an embodiment of the present application;
FIG. 11 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 12 is a schematic view of another part of a page of a dynamic effect display method provided in an embodiment of the present application;
FIG. 13 is a flowchart of song action determination provided by an embodiment of the present application;
FIG. 14 is a flowchart of a song animation display provided by an embodiment of the present application;
FIG. 15 is a schematic structural diagram of a dynamic effect display device according to an embodiment of the present application;
FIG. 16 is a schematic structural diagram of a dynamic effect display device according to an embodiment of the present application;
FIG. 17 is a schematic structural diagram of a dynamic effect display device according to an embodiment of the present application;
FIG. 18 is a schematic structural diagram of a computer device provided in an embodiment of the present application;
fig. 19 is an alternative structure diagram of the distributed system 110 applied to the blockchain system according to the embodiment of the present application;
fig. 20 is an alternative schematic diagram of a block structure provided in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the application provide a dynamic effect display method and apparatus, a computer device, and a storage medium. Specifically, the embodiments may be integrated in a first dynamic effect display device and a second dynamic effect display device. The first dynamic effect display device may be integrated in a first computer device, which may be an electronic device such as a terminal or a server: the terminal may be a smartphone, tablet computer, notebook computer, or personal computer, and the server may be a single server or a server cluster, for example a web server, application server, or data server.
The second dynamic effect display device may be integrated in a second computer device, which may likewise be a terminal or a server of the kinds just described.
In the embodiment of the application, a dynamic effect display method is described by taking a first computer device as a terminal and a second computer device as a server as an example.
As shown in fig. 1, in the solution of the embodiment of the present application, the terminal and the server may be connected through a network, for example a wired or wireless network. This embodiment describes a dynamic effect display system, taking as an example a first dynamic effect display device integrated on the terminal 10 and a second dynamic effect display device integrated on the server 20.
Specifically, the terminal 10 may display an audio playing page that includes playing information of an audio. When it is detected that the playing time point of the audio has reached a dynamic effect playing time point, the terminal 10 may display, on the audio playing page, the target dynamic effect corresponding to that time point and object information of the interactive object, where the dynamic effect playing time point is a time point at which the interactive object performed an interactive operation on the audio.
For example, a song clip is played on the terminal and the song's performance information is displayed. When the playing time point of the song clip is detected to have reached a dynamic effect playing time point, the terminal displays the target dynamic effect corresponding to that time point (for example, falling snowflakes) and the object information of the corresponding interactive object (for example, the object's avatar).
Specifically, the server 20 may store the interactive operation time points and object information of the audio uploaded by the terminal 10. The server 20 may determine the target dynamic effect corresponding to a dynamic effect playing time point based on a first mapping relation, where the first mapping relation is a mapping between reference dynamic effect playing time points of the audio and preset dynamic effects. The server 20 may also determine a target dynamic effect identifier corresponding to the dynamic effect playing time point based on a second mapping relation (a mapping between reference dynamic effect playing time points of the audio and dynamic effect identifiers), determine the target dynamic effect storage address corresponding to that identifier, and obtain the target dynamic effect corresponding to the dynamic effect playing time point from that storage address.
The following are detailed below. It should be noted that the order of description of the following embodiments is not intended to limit the order of the embodiments.
The embodiments will first be described from the perspective of the first dynamic effect display device, which may in particular be integrated in a terminal.
The dynamic effect display method provided by the embodiments of the present application may be executed by a processor of the terminal. As shown in fig. 2, the flow of the dynamic effect display method may be as follows:
201. Display an audio playing page, where the audio playing page includes playing information of the audio.
The audio in the present application may be an audio file produced and played by a computer device. Common audio files use coding formats such as MPEG-1 Audio Layer III (MP3), and different audio coding formats suit different application scenarios; for example, an audio file stored locally on the terminal may use the MP3 format. Specifically, the content of the audio file may be a song, an audiobook, a podcast, a recording, a radio drama, a radio program, an alert tone, and so on. The audio file may be generated by recording or production, or by converting the format of a video file, and so on.
Audio playing may be the process by which an application program on the computer device processes and plays an audio file; playback may occur through a playback device on the computer device or through an external playback device connected to the computer by wire or wirelessly. For example, an audio playing application on the terminal processes a song file a.mp3 and plays song A through a speaker connected wirelessly.
The audio playing page is a page displayed on the computer device during audio playback. It may be a page of the software that plays the audio, showing the playback state and containing the playing information of the audio. It may also be a page outside that software, for example the home page of the computer device; in that case the audio playing page is more flexible, and a user can obtain more information through the computer device while the audio plays.
The audio playing page may contain playing information of the audio, where the playing information is information related to the audio to be played or being played. Specifically, the playing information may include production information of the audio (such as the recording engineer, lyricist, singer, author, or producer), attribute information (such as the style, type, or genre the audio belongs to), promotional information (such as a poster or an audio/video trailer), content information (such as the song, script, or melody), and so on.
For example, when the audio playing software P plays the song M, the audio playing page is the page of song M displayed on the software, and the playing information it contains includes the lyricist, poster, and producer of song M.
As shown in fig. 3, the audio playing page displays the song being played, "song 1", and its song information: singer 1.
202. When it is detected that the playing time point of the audio has reached a dynamic effect playing time point, display, on the audio playing page, the target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object, where the dynamic effect playing time point is a time point at which the interactive object performed an interactive operation on the audio.
A dynamic effect in the present application may be an image or a video that serves as a prompt, whose content can be matched to the actual application scenario. A dynamic effect may also be any other multimedia form that serves as a prompt, such as an audio special effect or a text special effect, or a combination of at least two multimedia forms, which enriches the display and achieves a better prompting effect; for example, an image or video effect may be combined with a sound effect. The target dynamic effect is the dynamic effect played on the audio playing page at the dynamic effect playing time point; there may be one or more target dynamic effects.
In the present application, the dynamic effect may be determined by the application program: the program may collect only the interactive operation time point of the interactive object, analyze the interactive object, the audio style, and the interactive operation time point, and determine the dynamic effect corresponding to that playing time point of the audio. Alternatively, the dynamic effect may be selected and confirmed by the interactive object during the interactive operation from a preset dynamic effect library provided by the application program, and so on.
The interactive object is the object whose interactive operation determines an interactive operation time point of the audio; the interactive object can perform related operations. The interactive object may be a user who experiences and operates the application program, or, in the development stage before the application program is released, a tester during manual testing, or a test program or intelligent test device during automated testing, and so on.
The dynamic effect playing time point may be an interactive operation time point determined from an interactive operation of the interactive object on the audio: the terminal may take the time point at which the interactive object performs the related operation. If the interactive operation spans a time period, the terminal may reduce it to a single time point based on a preset rule, for example a rule that selects the end time point of the period as the interactive operation time point, and so on.
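The preset rule for collapsing a period-long interaction into one time point might look like this. The rule names are illustrative; the end-point rule is the one the paragraph gives as an example, and the others are plausible alternatives, not from the patent.

```python
def interaction_time_point(start_s, end_s, rule="end"):
    """Collapse an interactive operation lasting from start_s to end_s
    into one interactive operation time point, per a preset rule."""
    if rule == "end":        # the paragraph's example: take the period's end
        return end_s
    if rule == "start":
        return start_s
    if rule == "midpoint":
        return (start_s + end_s) / 2
    raise ValueError(f"unknown rule: {rule!r}")
```

So an interaction held from 10 s to 14 s would, under the end-point rule, register a dynamic effect playing time point of 14 s.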
The object information may be a nickname, avatar, or label of the interactive object; when object information is displayed, one or more kinds may be shown at once, for example the interactive object's avatar together with its label. A label may be set by the interactive object itself, or determined by the application program from the object's historical behavior on the application. For example, a label may be one the interactive object set, such as "love violin" or "love book", or one the application determined from its history, such as "pure music singer", "love song", or "pure music sprout".
The related operation may target a playing time point of the audio, may target the content of the audio, or may be a response to an interactive operation already performed by another object, and so on. For example, during testing, a tester marks the start and end time points of the audio to check whether the audio playing application records the marking operations correctly. Or, while listening to the audio, object A sees an operation by friend B on the application at a certain moment and responds to it, for example by commenting on friend B's operation, the comment content being "very good to hear!", and so on.
Whether the playing time point of the audio has reached a dynamic effect playing time point may be detected at a set time interval, where the precision of the interval is no coarser than the precision of the dynamic effect playing time points: if the dynamic effect playing time points are collected with second precision, the interval precision may be seconds or milliseconds, and so on. In a practical scenario, the interval can be set flexibly based on actual needs; for example, with the interval set to 1 ms, the computer device samples the playing time point every 1 ms.
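The polling described above can be simulated as follows: with the set time interval (1 ms here) no coarser than the effect time points' precision, no time point is skipped. The loop and data are illustrative, not the patent's code.

```python
def fired_time_points(effect_times_s, duration_s, interval_ms=1):
    """Poll the play position every interval_ms and report, once each,
    every dynamic effect playing time point that has been reached."""
    fired, seen = [], set()
    steps = int(duration_s * 1000 / interval_ms)
    for i in range(steps + 1):
        now = i * interval_ms / 1000.0   # simulated current playing time point
        for t in effect_times_s:
            if t not in seen and now >= t:
                seen.add(t)              # fire each effect only once
                fired.append(t)
    return fired
```

In a real player the loop body would instead run on a timer callback that reads the actual playback position.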
When it is detected that the playing time point of the audio has reached a dynamic effect playing time point, the target dynamic effect corresponding to that time point and the object information of the interactive object can be displayed on the audio playing page. If the playing time point corresponds to several dynamic effect playing time points, the target dynamic effect and object information for each may be displayed simultaneously on the audio playing page, or displayed one after another in a certain order. In addition, if displaying a target dynamic effect takes a period of time, then for two or more dynamic effect playing time points that lie close together, the target dynamic effects may be displayed at their actual playing time points, or the actual display time of each target dynamic effect may be adjusted flexibly around its display duration. For example, if target dynamic effect 1 is due at 1 minute 20 seconds, target dynamic effect 2 is due at 1 minute 21 seconds, and the display duration of target dynamic effect 1 is 5 seconds, one possible display mode shows target dynamic effect 1 at 1 minute 20 seconds and target dynamic effect 2 at 1 minute 25 seconds.
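The adjustment in the example above, where closely spaced effects are serialized so that each is shown in full, can be sketched as a simple greedy schedule. This is one of several strategies the paragraph allows, not the only one.

```python
def schedule_displays(time_points_s, display_duration_s):
    """Return (requested, actual) display start times: each target dynamic
    effect starts at its playing time point, or when the previous effect
    finishes, whichever is later."""
    schedule, next_free = [], 0.0
    for t in sorted(time_points_s):
        start = max(t, next_free)
        schedule.append((t, start))
        next_free = start + display_duration_s
    return schedule

# The paragraph's example: effects due at 1:20 and 1:21, each shown for 5 s.
plan = schedule_displays([80.0, 81.0], 5.0)
```

Here the first effect shows at 1 minute 20 seconds and the second is pushed back to 1 minute 25 seconds, matching the display mode described in the example.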
In this embodiment, the target special effect corresponding to an interactive operation from another interactive object can be displayed in real time while the audio is playing. Usually, however, the object may not be viewing the audio playing page at every moment of the audio playing process.
For example, song M has 3 target dynamic effects: a short video dynamic effect MA corresponding to interactive object A at the 20-second mark of the song, a short video dynamic effect MB corresponding to interactive object B at the 50-second mark, and a short video dynamic effect MC corresponding to interactive object C at 1 minute 12 seconds. When song M has been playing for 20 seconds in the application, the dynamic effect MA and the object information of interactive object A are displayed on the audio playing page.
As shown in fig. 3, when the playing time point of the song reaches the dynamic effect playing time point, a target dynamic effect (three hexagons) and the avatar (smiling face) of the interactive object are displayed on the audio playing page.
In some embodiments, the audio playing page includes a dynamic effect display area and a playing progress display control, and the step "when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object on the audio playing page" may include:
When it is detected that the playing time point of the audio reaches the dynamic effect playing time point, object information of the interactive object is displayed on the playing progress display control, and a target dynamic effect corresponding to the dynamic effect playing time point is displayed in the dynamic effect display area.
The playing progress display control may display the playing progress of the audio. Examples include a lyric progress display control that displays the lyrics of the audio as it plays, a progress bar control that displays the duration of the audio and the current playing time point as the audio plays, and a waveform progress display control that displays melody information such as the pitch or rhythm of the audio. The dynamic effect display area is an area for displaying a target dynamic effect; its actual position on the audio playing page can be set flexibly according to the layout of the audio playing page and the actual requirements of the application scenario, and is not limited here.
This scheme separates the display position of the object information from the display position of the target dynamic effect, which makes the display process more consistent and the display rules easier for the object to understand.
For example, when the song M displays the animation MA, object information of the interactive object a, such as an avatar of the interactive object a, may be displayed on the progress bar control of the song, and the animation MA may be displayed in the animation display area.
As shown in fig. 4, the playing progress display control may be a progress bar control. When the playing time point of the audio reaches the dynamic effect playing time point, the avatar (smiling face) of the interactive object is displayed on the progress bar, and the target dynamic effect (three hexagons) is displayed in the dynamic effect display area.
In some embodiments, the audio playback page includes a playback progress display control,
the dynamic effect display method can further comprise the following steps: and displaying the object information of the interactive object corresponding to the dynamic effect playing time point on a target position of the playing progress display control, wherein the target position is the position of the dynamic effect playing time point in the playing progress display control.
The playing progress display control may display the entire duration of the audio, or only part of it. For example, if the duration of an audio is 2 minutes, the playing progress display control may display the full 2-minute playing duration, or it may always display a 20-second window that advances as the audio plays.
In order to improve the interaction interest of the dynamic effect display, object information can be displayed at corresponding positions of the dynamic effect playing time points on the progress bar so as to prompt the interactive operation of the interactive object at the playing time points.
Specifically, the object information of the interactive object may be displayed, and the object may determine, from the object information displayed on the playing progress display control, which interactive object performed the interactive operation at that time point.
For example, the playing progress display control of song M may be a progress bar displaying the entire duration of the audio. On this progress bar, object information of the interactive object corresponding to each dynamic effect playing time point is displayed, such as the nickname of the interactive object: object A is displayed at the target position corresponding to 20 seconds of the audio, object B at the target position corresponding to 50 seconds, and object C at the target position corresponding to 1 minute 12 seconds.
As shown in fig. 5, the audio playing interface includes a playing progress display control, which may be a progress bar control. A section of the playing time of a song is displayed on the progress bar, together with the avatars of the interactive objects (the avatars are the letters A, C, and D, respectively) corresponding to the dynamic effect playing time points within that section. As shown in the figure, each avatar corresponds to one playing time point on the progress bar.
In some embodiments, the audio playback page includes a playback progress display control.
The dynamic effect display method can further comprise the following steps:
and displaying a reminding dynamic effect corresponding to the dynamic effect playing time point on a target position of the playing progress display control, wherein the target position is the position of the dynamic effect playing time point in the playing progress display control.
The reminding dynamic effect can be any dynamic effect that can be displayed on the audio playing page, such as an image, a video, or a text special effect. It differs from the target dynamic effect in that the reminding dynamic effect is usually simple in form; multiple reminding dynamic effects are displayed on the audio playing page to prompt the object that interactive operations by interactive objects exist.
the interactive operation can be adding element sound to original audio, the element sound can be sound of percussion of musical instruments, special effect of music, sound from nature (such as rain sound and the like), the audio can be optionally added into the sound added by the interactive operation during playing, and the reminding action can be used for reminding the playing time point of the element sound.
Specifically, a reminding dynamic effect may be displayed, and the reminding dynamic effect may prompt the object that the target dynamic effect appears based on the interactive operation of the interactive object at the playing time point.
For example, the playing progress display control of song M may be a progress bar displaying the entire duration of the audio. On this progress bar, the reminding dynamic effects corresponding to all the dynamic effect playing time points are displayed. The reminding dynamic effects may be rotating polygons of different colors, so a green rotating quadrangle may be displayed at the target position corresponding to 20 seconds of the audio, a yellow rotating quadrangle at the target position corresponding to 50 seconds, and a green rotating hexagon at the target position corresponding to 1 minute 12 seconds.
As shown in fig. 6, a playing progress display control is included on the audio playing interface, which may be a progress bar control. A section of the playing time of a song is displayed on the progress bar, together with the reminding dynamic effects corresponding to the dynamic effect playing time points within that section. As shown in the figure, each polygon is a reminding dynamic effect corresponding to a playing time point on the progress bar.
In some embodiments, the step of "displaying on the target position of the play progress display control" may include:
Based on the playing duration of the audio and the display duration range of the playing progress display control, the display position of each playing time point on the playing progress display control is determined; the target position corresponding to the dynamic effect playing time point is then determined from the display positions of the playing time points, and the display is performed at that target position on the playing progress display control.
The display duration range is the length of the playing progress display control on the audio playing interface. Because application versions differ and the display screen sizes of computer devices differ, the display duration range differs across computer devices and application versions.
After the display duration range is determined, the display position of each playing time point on the playing progress display control can be determined proportionally according to the playing duration of the audio. Since the dynamic effect playing time point is a specific playing time point, its display position on the playing progress display control, namely the target position, can thereby be determined.
For example, if the display duration range of the playing progress display control is 15 centimeters and the playing duration of the audio is 15 seconds, the display position of each playing time point on the playing progress display control can be determined (each second of audio corresponds to 1 centimeter of the control), and the position of the target position on the audio playing page can then be determined from the display position of the playing progress display control on the audio playing page.
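The proportional mapping in this example can be written as a one-line calculation. A minimal sketch (the function name is an assumption; units follow the example: seconds in, centimeters out):

```python
def target_position_cm(effect_time_s, audio_duration_s, control_length_cm):
    """Map a dynamic effect playing time point to its position on the
    playing progress display control, proportionally along its length."""
    return control_length_cm * effect_time_s / audio_duration_s

# 15 cm control and 15 s audio give 1 cm per second,
# so an effect point at 5 s sits 5 cm along the control:
print(target_position_cm(5, 15, 15))   # 5.0
```

The same ratio applies when the control shows only a window of the audio: `audio_duration_s` then becomes the window length and `effect_time_s` the offset within the window.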
In some embodiments, the audio playback page includes an object information display region and a dynamic effect display region corresponding to the object information display region,
the step "when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object on the audio playing page", may include:
and when the audio playing time point is detected to reach the dynamic effect playing time point, displaying the object information of the interactive object in the object information display area, and displaying the target dynamic effect corresponding to the dynamic effect playing time point in the dynamic effect display area.
In this embodiment, the display area of the target dynamic effect is determined based on the display position of the object information, and the object information is displayed in the object information display area. This strengthens the relationship between the displayed elements (the object information and the target dynamic effect) and improves the display effect.
For example, the object information may be displayed at the bottom of the audio playing page, and the target dynamic effect may be displayed in a specific area above the object information. If the position of the object information changes as the audio plays, the position of the target dynamic effect, or other display factors (such as the display duration), may also change along with it.
As shown in fig. 7, when the playing time point reaches the dynamic effect playing time point, the avatar (smiling face) of the interactive object is displayed at the center of the page, and the target dynamic effect (four hexagons) is displayed in the area directly above the avatar of the interactive object.
In some embodiments, the audio playback page includes progress display controls in multiple dimensions of audio playback,
the step "when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object on the audio playing page", may include:
when the fact that the playing time point of the audio reaches the dynamic effect playing time point is detected, object information of the interactive object is displayed on the first playing progress display control, and a target dynamic effect corresponding to the dynamic effect playing time point is displayed on the second playing progress display control, wherein the first playing progress display control and the second playing progress display control are any two of the progress display controls in multiple dimensions.
The progress display controls in multiple dimensions may include a lyric progress display control that displays the lyrics of the audio as it plays, a progress bar control that displays the duration of the audio and the current playing time point as the audio plays, a waveform progress display control that displays melody information such as the pitch or rhythm of the audio, and the like.
For example, the first playing progress display control may be designated as a lyric progress display control, the second playing progress display control may be designated as a waveform progress display control, and when it is detected that the playing time point of the audio reaches the animation effect playing time point, the object information of the interactive object may be displayed on the lyric progress display control, and the target animation effect may be displayed on the waveform progress display control.
As shown in fig. 8, when the playing time point reaches the dynamic effect playing time point, the avatar (smiling face) of the interactive object is displayed on the lyric progress display control, and the target dynamic effect (seven hexagons) is displayed on the progress bar control.
In some embodiments, the audio playing page further includes a dynamic effect hiding control, and the dynamic effect displaying method further includes:
and when the triggering operation aiming at the dynamic effect hiding control is detected, hiding the object information and the target dynamic effect on the audio playing page.
The dynamic effect hiding control can be in the form of a button, a sliding frame and the like.
The dynamic effect hiding control is used for hiding content. When the object does not want to see content related to the interactive operations of interactive objects during audio playback, such as the target dynamic effect, the reminding dynamic effect, and the object information of the interactive object, the content can be hidden through the dynamic effect hiding control. The dynamic effect hiding control enhances the flexibility of the scheme and gives the object more selectable audio playing modes.
As shown in fig. 9, the dynamic effect hiding control may be a "hidden dynamic effect" button in the drawing, and when a trigger operation for the "hidden dynamic effect" button is detected, the special effect (three hexagons) on the audio playing page and the avatar (smiling face) of the interactive object are hidden.
In some embodiments, the audio playing page further includes an interactive control, and the dynamic effect display method further includes:
when the interactive operation aiming at the interactive control is detected, the interactive effect corresponding to the interactive control is displayed, the object information of the interactive object and the playing time point of the audio are collected, the playing time point is the interactive operation time point, and the interactive operation time point and the object information are uploaded to the server.
The interactive control can be in the form of a button or an edit box.
The interactive control is used for performing interactive operation, specifically, the interactive operation can be in various forms, different interactive operations can be realized through different interactive controls, for example, audio content can be evaluated through a 'approval' button or a 'disapproval' button, and for example, comments can be made in the forms of characters, pictures, dynamic effects and the like through an editing control.
If the interactive operation includes operation content (such as text or pictures), the operation content can be uploaded to the server together with the object information of the interactive object and the interactive operation time point.
For example, if the interactive control is an emoticon control, then when an interactive operation directed at the emoticon control is detected, the playing time point of the audio can be determined as the interactive operation time point, and the operation content determined by the interactive operation (such as emoticon 1) can be determined. The interactive operation time point and the object information of the interactive object are then uploaded to the server together.
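The collection-and-upload step can be sketched as assembling a payload from the interactive operation time point, the object information, and any operation content. This is an illustrative sketch only; the function and field names are assumptions, not part of the patent:

```python
import json

def build_interaction_payload(object_id, playing_time_s, content=None):
    """Collect the interactive operation time point and object information
    into the message body that would be uploaded to the server."""
    payload = {
        "object_id": object_id,                       # object information
        "interactive_operation_time_s": playing_time_s,
    }
    if content is not None:                           # e.g. text, picture id,
        payload["content"] = content                  # or an emoticon id
    return json.dumps(payload, sort_keys=True)

# Emoticon interaction at the 20-second playing time point:
print(build_interaction_payload("object_A", 20, content="emoticon_1"))
```

The server would persist this record so that the playing time point later serves as a dynamic effect playing time point for other listeners.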
In some embodiments, the step "when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object on the audio playing page" may include:
when the fact that the playing time point of the audio reaches the dynamic effect playing time point is detected, the target dynamic effect corresponding to the dynamic effect playing time point is determined based on the first mapping relation, and the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object are displayed on the audio playing page.
The first mapping relation is a mapping relation between reference dynamic effect playing time points of the audio and preset dynamic effects; it stores the preset dynamic effect corresponding to each reference dynamic effect playing time point. A reference dynamic effect playing time point may be a specific playing time point of the audio, in which case the target dynamic effect corresponding to the dynamic effect playing time point can be determined directly through the first mapping relation. A reference dynamic effect playing time point may also be a playing time period of the audio, in which case the playing time period to which the dynamic effect playing time point belongs is determined first, and then the target dynamic effect corresponding to that playing time period is determined. The target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object are then displayed on the audio playing page.
For example, if the playing time point of the audio is 20 seconds and it is detected that 20 seconds is a dynamic effect playing time point, the target dynamic effect A2 corresponding to the 20-second point of the audio can be determined through the first mapping relation, and the target dynamic effect A2 and the object information of the interactive object are displayed on the audio playing page.
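A first mapping relation that supports both kinds of reference keys (exact time points and time periods) can be sketched with range keys, where an exact point t is stored as the degenerate range (t, t). The representation and names here are assumptions for illustration:

```python
def lookup_effect(first_mapping, time_point_s):
    """Resolve a dynamic effect playing time point to its preset effect.

    first_mapping maps a reference range (start_s, end_s) to an effect;
    an exact reference time point t is stored as the range (t, t).
    """
    for (start, end), effect in first_mapping.items():
        if start <= time_point_s <= end:
            return effect
    return None    # no effect registered for this time point

mapping = {
    (20, 20): "effect_A2",   # exact reference time point at 20 s
    (50, 59): "effect_B1",   # reference playing time period 50-59 s
}
print(lookup_effect(mapping, 20))   # effect_A2
print(lookup_effect(mapping, 55))   # effect_B1
```

For the time-period case, the containment check above is exactly the "determine the playing time period to which the dynamic effect playing time point belongs" step in the text.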
In some embodiments, the dynamic effect display method may further include:
and determining a target dynamic effect identifier corresponding to the dynamic effect playing time point based on the second mapping relation, determining a target dynamic effect storage address corresponding to the target dynamic effect identifier, and acquiring the target dynamic effect corresponding to the dynamic effect playing time point from the target dynamic effect storage address.
At this time, in some embodiments, the step of "determining a target dynamic effect storage address corresponding to the target dynamic effect identifier" may include:
and acquiring a third mapping relation between the dynamic effect identification and the dynamic effect storage address, and determining a target dynamic effect storage address corresponding to the target dynamic effect identification based on the third mapping relation.
The second mapping relation is the mapping relation between the reference dynamic effect playing time point of the audio and the target dynamic effect identifier, and the third mapping relation is the mapping relation between the dynamic effect identifier and the dynamic effect storage address.
In some cases, the target dynamic effects may occupy considerable memory resources. Therefore, in order to save memory resources and ensure that the application runs well, the target dynamic effects may be stored on the server. Before audio playback, the target dynamic effect identifier of the audio is determined through the second mapping relation, the storage address of the target dynamic effect is determined through the third mapping relation, and the target dynamic effect is obtained from the server according to the storage address. Then, during audio playback, when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object are displayed on the audio playing page.
For example, before audio playback, the server may, according to a request message from the terminal, determine the target dynamic effect identifier corresponding to the dynamic effect playing time point based on the second mapping relation, then determine the target dynamic effect storage address corresponding to the target dynamic effect identifier based on the third mapping relation, obtain the target dynamic effect corresponding to the dynamic effect playing time point from the target dynamic effect storage address, and send the target dynamic effect to the terminal.
In this scheme, an audio playing page is first displayed, the audio playing page including the playing information of the audio. Then, when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, a target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object are displayed on the audio playing page, where the dynamic effect playing time point is an interactive operation time point of the interactive object for the audio. This scheme can improve the interactivity of audio playback in social scenarios.
The method described in the above embodiments is further illustrated in detail by way of example.
The embodiment will describe the dynamic effect display method in detail by taking social software capable of playing songs as an example.
As shown in fig. 10, the specific flow of the dynamic effect display method may be as follows:
301. and the terminal displays a song playing page, wherein the song playing page comprises the playing information of the song.
As shown in fig. 11, the social software has a "music box" function. When the user plays a song through the music box, the social software displays a song playing page, and the song playing page can display the playing information of the song: the song title "song 1" and the artist "artist 1".
302. And the server determines a target dynamic effect identifier of each dynamic effect playing time point of the song from the song special effect table according to a target dynamic effect obtaining request based on the song sent by the terminal.
The server has a song special effect table (which may be stored in a database). One record in the song special effect table stores the information related to one praise operation of an interactive object for a song, including a song ID, an object ID, a praise time T, and the corresponding target dynamic effect ID. For example, the song special effect table contains 3 records for song 1: (song 1, object 1, 20s, dynamic effect 32); (song 1, object 5, 27s, dynamic effect 12); (song 1, object 2, 35s, dynamic effect 01).
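The song special effect table and the step-302 lookup can be sketched as follows. The record layout mirrors the three example rows above; the type and function names are assumptions:

```python
from collections import namedtuple

# One record per praise operation of an interactive object for a song.
Record = namedtuple("Record", "song_id object_id praise_time_s effect_id")

song_effect_table = [
    Record("song_1", "object_1", 20, "effect_32"),
    Record("song_1", "object_5", 27, "effect_12"),
    Record("song_1", "object_2", 35, "effect_01"),
]

def effects_for_song(table, song_id):
    """Step 302: collect each dynamic effect playing time point of the song
    together with its target dynamic effect identifier."""
    return [(r.praise_time_s, r.effect_id)
            for r in table if r.song_id == song_id]

print(effects_for_song(song_effect_table, "song_1"))
# [(20, 'effect_32'), (27, 'effect_12'), (35, 'effect_01')]
```

In step 303 the server would then resolve each identifier against the special effect table and send the resulting effects to the terminal.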
303. And the server acquires the target dynamic effect from the special effect table according to the target dynamic effect identification and sends the target dynamic effect to the terminal.
For example, the server has a special effect table, and may obtain a target dynamic effect from the special effect table according to the target dynamic effect id, and send the target dynamic effect to the terminal, where the target dynamic effect is dynamic effect 32, dynamic effect 12, and dynamic effect 01, respectively.
304. The terminal stores the target animation of the song sent by the server.
305. And when the fact that the playing time point of the song reaches the dynamic effect playing time point is detected, the terminal displays a target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the song playing page.
As shown in fig. 12, when it is detected that the playing time point of the song is 20s, the target dynamic effect 32 and the object information of object 1 are displayed on the song playing page: the avatar of object 1 is the letter D, and the target dynamic effect 32 is a star.
In this application, the interactive operation may be a praise. The client of the application collects the praise operation of the user, and the process of determining the dynamic effect corresponding to the interactive operation time point may be as shown in fig. 13. The client monitors whether the user gives a praise during song playback (for example, by monitoring whether the user triggers a praise control). If not, monitoring continues; if so, the ID of the song being played, the user ID of the praise operation, and the user's praise time T (i.e., the interactive operation time point) are acquired. Then, according to the song ID and the praise time T, the corresponding special effect ID (namely the target dynamic effect identifier) is determined from the song special effect database, and the song ID, the user ID, the praise time T, and the special effect ID are stored in a praise database. This completes the process of collecting the user's praise and determining the corresponding special effect.
In this application, the interactive operation may be a praise, and the process by which the client of the application displays the target dynamic effect may be as shown in fig. 14. When the client is about to play or is playing a song, it first obtains from the user praise database at least one pair of a praise time T and its corresponding special effect ID (i.e., the target dynamic effect identifier) recorded under the song ID of that song. It then obtains the special effect (i.e., the target dynamic effect) corresponding to each special effect ID from the special effect database and preloads it. The client then monitors the playing process of the song: it determines whether the song has finished playing, and if not, determines whether playback has reached a praise time T. If so, it displays the special effect corresponding to that praise time T and continues monitoring the playing process until the song finishes.
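The preload-then-monitor flow of fig. 14 can be condensed into a sketch. This is an illustrative simplification (the databases are plain Python structures and the names are assumptions), not the client's actual implementation:

```python
def play_with_effects(song_id, praise_db, effect_db, playback_positions_s):
    """Fig. 14 sketch: preload the effects for a song, then display each one
    as playback reaches its praise time T."""
    # Preload: praise time T -> special effect, via the two databases.
    preloaded = {t: effect_db[eid]
                 for sid, uid, t, eid in praise_db if sid == song_id}
    displayed = []
    for pos in playback_positions_s:        # monitor the playing process
        if pos in preloaded:                # playback reached a praise time T
            displayed.append((pos, preloaded[pos]))
    return displayed

# Praise database records: (song ID, user ID, praise time T, special effect ID).
praise_db = [("song_1", "user_1", 20, "e32"), ("song_1", "user_5", 27, "e12")]
effect_db = {"e32": "star", "e12": "spark"}   # special effect database

print(play_with_effects("song_1", praise_db, effect_db, range(0, 30)))
# [(20, 'star'), (27, 'spark')]
```

Preloading ensures that when a praise time T is reached during monitoring, the special effect is already local and can be displayed immediately.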
In the embodiment of the application, the terminal displays a song playing page that includes the playing information of a song. The server determines the target dynamic effect identifier of each dynamic effect playing time point of the song from the song special effect table according to a target dynamic effect obtaining request for the song sent by the terminal, obtains the target dynamic effects from the special effect table according to the target dynamic effect identifiers, and sends them to the terminal, which stores them. Finally, when it is detected that the playing time point of the song reaches a dynamic effect playing time point, the terminal displays the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the song playing page. This scheme can improve the interactivity of audio playback in social scenarios.
In order to better implement the dynamic effect display method provided by the embodiment of the present application, the embodiment of the present application further provides a device based on the dynamic effect display method. The terms used here have the same meanings as in the dynamic effect display method above, and for specific implementation details, reference may be made to the description in the method embodiment.
As shown in fig. 15, fig. 15 is a schematic structural diagram of a dynamic effect display device according to an embodiment of the present application, where the dynamic effect display device may include a page display module 401 and a dynamic effect display module 402, where,
a page display module 401, configured to display an audio playing page, where the audio playing page includes audio playing information;
the dynamic effect display module 402 is configured to display, on the audio playing page, a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object when it is detected that the audio playing time point reaches the dynamic effect playing time point, where the dynamic effect playing time point is an interactive operation time point of the interactive object for the audio.
In some embodiments of the present application, the audio playing page includes a dynamic effect display area and a playing progress display control, and the dynamic effect display module 402 is specifically configured to:
and when the fact that the playing time point of the audio reaches the dynamic effect playing time point is detected, displaying object information of the interactive object on the playing progress display control, and displaying a target dynamic effect corresponding to the dynamic effect playing time point in a dynamic effect display area.
In some embodiments of the present application, the audio playing page includes a playing progress display control, and the dynamic effect display device further includes an object information display module, wherein,
and the object information display module is specifically used for displaying the object information of the interactive object corresponding to the dynamic effect playing time point at a target position of the playing progress display control, where the target position is the position of the dynamic effect playing time point in the playing progress display control.
In some embodiments of the present application, the audio playing page includes a playing progress display control, and the dynamic effect display device further includes a reminding dynamic effect display module, wherein,
and the reminding dynamic effect display module is specifically used for displaying the reminding dynamic effect corresponding to the dynamic effect playing time point at a target position of the playing progress display control, where the target position is the position of the dynamic effect playing time point in the playing progress display control.
In some embodiments of the present application, the object information display module includes:
the display position determining submodule, configured to determine the display position of each playing time point on the playing progress display control based on the playing duration of the audio and the display duration range of the playing progress display control;
the target position determining submodule, configured to determine, from the display positions of the playing time points, the target position corresponding to the dynamic effect playing time point;
and the display submodule, configured to perform display at the target position of the playing progress display control.
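The submodules above amount to a linear mapping from a playing time point to a coordinate on the progress control. A minimal sketch under assumed units (seconds in, pixels out; function and parameter names are invented for illustration):

```python
def display_position(time_point: float, audio_duration: float, control_width: float) -> float:
    """Map a playing time point to an offset on the playing progress display
    control, assuming the control spans the full playing duration linearly."""
    if audio_duration <= 0:
        raise ValueError("audio duration must be positive")
    fraction = min(max(time_point / audio_duration, 0.0), 1.0)  # clamp to [0, 1]
    return fraction * control_width
```

A 90 s time point in a 180 s track on a 300 px control would land at the 150 px mark; out-of-range time points are clamped to the ends of the control.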
In some embodiments of the present application, the audio playing page includes an object information display area and a dynamic effect display area corresponding to the object information display area, and the dynamic effect display module 402 is specifically configured to:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, display the object information of the interactive object in the object information display area, and display the target dynamic effect corresponding to the dynamic effect playing time point in the dynamic effect display area.
In some embodiments of the present application, the audio playing page includes progress display controls in multiple dimensions of audio playing, and the dynamic effect display module 402 is specifically configured to:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, display object information of the interactive object on a first playing progress display control, and display a target dynamic effect corresponding to the dynamic effect playing time point on a second playing progress display control, where the first playing progress display control and the second playing progress display control are any two of the progress display controls in the multiple dimensions.
In some embodiments of the present application, the audio playing page further includes a dynamic effect hiding control, and the dynamic effect displaying device further includes:
the hiding module, configured to hide the object information and the target dynamic effect on the audio playing page when a triggering operation on the dynamic effect hiding control is detected.
In some embodiments of the present application, the audio playing page further includes an interactive control, as shown in fig. 16, and the dynamic effect display device further includes:
the acquisition module 403, configured to: when an interactive operation on the interactive control is detected, display an interactive effect corresponding to the interactive control, and acquire object information of the interactive object and the playing time point of the audio, the playing time point being the interactive operation time point;
and the uploading module 404, configured to upload the interactive operation time point and the object information to the server.
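A sketch of what the acquisition and uploading modules might assemble: the payload pairs the interactive object's information with the playing time point captured at the moment of the interaction. The field names are assumptions for illustration, not taken from the application:

```python
import json

def build_interaction_payload(object_info: dict, play_time_point: float) -> str:
    """Serialize the interactive-operation time point together with the
    object information for upload to the server (field names hypothetical)."""
    payload = {
        "object": object_info,                                # e.g. user id, avatar
        "interaction_time_point": round(play_time_point, 3),  # seconds into the audio
    }
    return json.dumps(payload, ensure_ascii=False)
```

The server can later redistribute these (time point, object information) pairs to other listeners of the same audio, which is what makes the timed display possible.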
In some embodiments of the present application, as shown in fig. 17, the dynamic display module 402 includes:
the determining submodule 4021, configured to: when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, determine a target dynamic effect corresponding to the dynamic effect playing time point based on a first mapping relationship, where the first mapping relationship is a mapping relationship between a reference dynamic effect playing time point of the audio and a preset dynamic effect;
the display sub-module 4022 is configured to display a target dynamic effect corresponding to the dynamic effect playing time point and object information of the interactive object on the audio playing page.
In some embodiments of the present application, the dynamic effect display device further comprises:
the first determining module, configured to determine a target dynamic effect identifier corresponding to the dynamic effect playing time point based on a second mapping relationship, where the second mapping relationship is a mapping relationship between a reference dynamic effect playing time point of the audio and a dynamic effect identifier;
the second determining module, configured to determine a target dynamic effect storage address corresponding to the target dynamic effect identifier;
and the target dynamic effect obtaining module, configured to obtain the target dynamic effect corresponding to the dynamic effect playing time point from the target dynamic effect storage address.
In some embodiments of the present application, the second determining module comprises:
the obtaining submodule, configured to obtain a third mapping relationship between a dynamic effect identifier and a dynamic effect storage address;
and the determining submodule, configured to determine, based on the third mapping relationship, the target dynamic effect storage address corresponding to the target dynamic effect identifier.
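The second and third mapping relationships together form a two-step lookup: time point → effect identifier → storage address. A dictionary-based sketch (the identifiers, URLs, and time points are invented examples):

```python
# Hypothetical data for the two mappings described above.
SECOND_MAPPING = {12.0: "fx_like", 47.5: "fx_heart"}   # time point -> effect identifier
THIRD_MAPPING = {                                       # effect identifier -> storage address
    "fx_like": "https://cdn.example.com/fx/like.webp",
    "fx_heart": "https://cdn.example.com/fx/heart.webp",
}

def resolve_effect_address(time_point: float):
    """Resolve a dynamic-effect playing time point to the storage address of
    its target dynamic effect, or None if no effect is registered for it."""
    effect_id = SECOND_MAPPING.get(time_point)
    return THIRD_MAPPING.get(effect_id) if effect_id is not None else None
```

Splitting the lookup this way lets many time points share one stored effect asset: only the second mapping grows with the number of interactions.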
In this embodiment of the present application, the page display module 401 first displays an audio playing page, where the audio playing page includes playing information of audio; then, when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, the dynamic effect display module 402 displays, on the audio playing page, a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object, the dynamic effect playing time point being an interactive operation time point of the interactive object for the audio. This scheme can improve the interactivity of audio playing in social scenarios.
In addition, an embodiment of the present application further provides a computer device. The computer device may be a terminal or a server. As shown in fig. 18, which is a schematic structural diagram of the computer device according to this embodiment of the present application, specifically:
the computer device may include components such as a processor 501 of one or more processing cores, memory 502 of one or more computer-readable storage media, a power supply 503, and an input unit 504. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 18 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components in combination, or a different arrangement of components. Wherein:
the processor 501 is a control center of the computer device, connects various parts of the entire computer device by using various interfaces and lines, and performs various functions of the computer device and processes data by running or executing software programs and/or modules stored in the memory 502 and calling data stored in the memory 502, thereby monitoring the computer device as a whole. Optionally, processor 501 may include one or more processing cores; preferably, the processor 501 may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user pages, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 501.
The memory 502 may be configured to store software programs and modules, and the processor 501 performs various functional applications and data processing by running the software programs and modules stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the computer device, and the like. In addition, the memory 502 may include a high-speed random access memory, and may further include a non-volatile memory such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 502 may further include a memory controller to provide the processor 501 with access to the memory 502.
The computer device further includes a power supply 503 that supplies power to the various components. Preferably, the power supply 503 may be logically connected to the processor 501 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The power supply 503 may further include one or more of a direct-current or alternating-current power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
The computer device may also include an input unit 504, and the input unit 504 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the computer device may further include a display unit and the like, which are not described in detail here. Specifically, in this embodiment, the processor 501 in the computer device loads, according to the following instructions, the executable files corresponding to the processes of one or more application programs into the memory 502, and the processor 501 runs the application programs stored in the memory 502 to implement the following functions:
displaying an audio playing page, where the audio playing page includes playing information of audio; and when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, the dynamic effect playing time point being an interactive operation time point of the interactive object for the audio.
For specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not described again here.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be completed by instructions, or by instructions controlling related hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application further provides a storage medium storing a computer program that can be loaded by a processor to perform the steps in any of the dynamic effect display methods provided in the embodiments of the present application. For example, the computer program may perform the following steps:
displaying an audio playing page, where the audio playing page includes playing information of audio; and when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, the dynamic effect playing time point being an interactive operation time point of the interactive object for the audio.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
Since the computer program stored in the storage medium can execute the steps in any of the dynamic effect display methods provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods; for details, refer to the foregoing embodiments, which are not described again here.
The system related to the embodiment of the application can be a distributed system formed by connecting a client and a plurality of nodes (computer devices in any form in an access network, such as servers and terminals) in a network communication mode.
Taking the distributed system being a blockchain system as an example, referring to fig. 19, fig. 19 is an optional schematic structural diagram of the distributed system 110 applied to a blockchain system according to this embodiment of the present application. The system is formed by a plurality of nodes 1101 (computing devices in any form in an access network, such as servers and user terminals) and a client 1102. A peer-to-peer (P2P) network is formed between the nodes, and the P2P protocol is an application-layer protocol running on top of the Transmission Control Protocol (TCP). In the distributed system, any machine, such as a server or a terminal, can join and become a node; a node includes a hardware layer, an intermediate layer, an operating system layer, and an application layer.
Referring to the functions of each node in the blockchain system shown in fig. 19, the functions involved include:
1) Routing: a basic function of a node, used to support communication between nodes.
Besides the routing function, the node may also have the following functions:
2) Application: deployed in the blockchain to implement specific services according to actual service requirements. The application records data related to the implemented functions to form record data, carries a digital signature in the record data to indicate the source of the task data, and sends the record data to other nodes in the blockchain system, so that the other nodes add the record data to a temporary block when the source and integrity of the record data are successfully verified.
For example, the services implemented by the application include:
2.1) Wallet: provides functions for electronic money transactions, including initiating a transaction (that is, sending a transaction record of the current transaction to other nodes in the blockchain system; after the other nodes verify it successfully, the record data of the transaction is stored in a temporary block of the blockchain as an acknowledgment that the transaction is valid). The wallet also supports querying the electronic money remaining at an electronic money address.
2.2) Shared ledger: provides functions for operations such as storage, query, and modification of account data. Record data of an operation on the account data is sent to other nodes in the blockchain system; after the other nodes verify its validity, the record data is stored in a temporary block as an acknowledgment that the account data is valid, and a confirmation may be sent to the node that initiated the operation.
2.3) Smart contract: a computerized agreement that can execute the terms of a contract, implemented by code that is deployed on the shared ledger and executed when certain conditions are met. The code completes automated transactions according to actual service requirements, for example, querying the logistics status of goods purchased by a buyer, or transferring the buyer's electronic money to the merchant's address after the buyer signs for the goods. Of course, smart contracts are not limited to executing contracts for transactions; they may also execute contracts that process received information.
3) Blockchain: includes a series of blocks that are linked to one another in the chronological order of their generation. Once a new block is added to the blockchain, it cannot be removed; the blocks record the record data submitted by the nodes in the blockchain system.
In this embodiment, the dynamic effect playing time point of the audio, the target dynamic effect, the reminding dynamic effect, the object information of the interactive object, and the like may be stored in the shared ledger of the blockchain through the nodes, and a computer device (for example, a terminal or a server) may obtain the dynamic effect playing time point and the object information of the interactive object based on the data stored in the shared ledger.
Referring to fig. 20, fig. 20 is an optional schematic diagram of a block structure provided in this embodiment. Each block includes the hash value of the transaction records stored in that block (the hash value of the block) and the hash value of the previous block, and the blocks are linked by these hash values to form a blockchain. A block may further include information such as a timestamp indicating when the block was generated. A blockchain is essentially a decentralized database: a chain of data blocks associated with one another by cryptography, where each data block contains information used to verify the validity (anti-counterfeiting) of its information and to generate the next block.
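The hash-linked structure in fig. 20 can be sketched as follows: each block stores a hash of its own contents plus the hash of the previous block, so tampering with any stored record breaks the chain. This is a generic illustration of the technique, not the application's implementation; all field names are assumptions:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents (records, previous hash, timestamp)."""
    body = {k: block[k] for k in ("records", "prev_hash", "timestamp")}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(records: list, prev_hash: str, timestamp: int) -> dict:
    """Assemble a block and stamp it with its own content hash."""
    block = {"records": records, "prev_hash": prev_hash, "timestamp": timestamp}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain: list) -> bool:
    """Check every block's own hash and its link to the previous block."""
    for i, block in enumerate(chain):
        if block_hash(block) != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```

In the scheme described above, the records could be the (dynamic effect playing time point, object information) pairs; once committed to a block, altering one silently would invalidate every subsequent block's link.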
For specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not described again here.
The dynamic effect display method and apparatus, the computer device, and the storage medium provided in the embodiments of the present application are described in detail above. Specific examples are used in this specification to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and core ideas of the present application. Meanwhile, a person skilled in the art may make changes to the specific implementations and the application scope according to the ideas of the present application. In conclusion, the content of this specification shall not be construed as a limitation on the present application.

Claims (15)

1. A dynamic effect display method is characterized by comprising the following steps:
displaying an audio playing page, wherein the audio playing page comprises playing information of audio;
when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, wherein the dynamic effect playing time point is an interactive operation time point of the interactive object for the audio.
2. The method of claim 1, wherein the audio playback page comprises a dynamic display area and a playback progress display control,
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, including:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying the object information of the interactive object on the playing progress display control, and displaying the target dynamic effect corresponding to the dynamic effect playing time point in the dynamic effect display area.
3. The method of claim 1, wherein the audio playback page includes a playback progress display control,
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, before displaying the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the audio playing page, the method further includes:
displaying the object information of the interactive object corresponding to the dynamic effect playing time point at a target position of the playing progress display control, wherein the target position is the position of the dynamic effect playing time point on the playing progress display control.
4. The method of claim 1, wherein the audio playback page includes a playback progress display control,
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, before displaying the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the audio playing page, the method further includes:
displaying a reminding dynamic effect corresponding to the dynamic effect playing time point at a target position of the playing progress display control, wherein the target position is the position of the dynamic effect playing time point on the playing progress display control.
5. The method according to claim 3 or 4, wherein the displaying at the target position of the playing progress display control comprises:
determining the display position of each playing time point on the playing progress display control based on the playing duration of the audio and the display duration range of the playing progress display control;
determining, from the display positions of the playing time points, the target position corresponding to the dynamic effect playing time point;
and performing display at the target position of the playing progress display control.
6. The method of claim 1, wherein the audio playback page comprises an object information display area and a dynamic effect display area corresponding to the object information display area;
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, including:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying the object information of the interactive object in the object information display area, and displaying the target dynamic effect corresponding to the dynamic effect playing time point in the dynamic effect display area.
7. The method of claim 1, wherein the audio playback page includes progress display controls in multiple dimensions of audio playback,
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, including:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, displaying object information of the interactive object on a first playing progress display control, and displaying a target dynamic effect corresponding to the dynamic effect playing time point on a second playing progress display control, wherein the first playing progress display control and the second playing progress display control are any two of the progress display controls in the multiple dimensions.
8. The method according to any one of claims 1, 2, 3, 4, 6 and 7, wherein the audio playing page further comprises a dynamic effect hiding control, and the method further comprises:
when a triggering operation on the dynamic effect hiding control is detected, hiding the object information and the target dynamic effect on the audio playing page.
9. The method of claim 1, wherein the audio playback page further comprises an interactive control, the method further comprising:
when an interactive operation on the interactive control is detected, displaying an interactive effect corresponding to the interactive control, and
acquiring object information of the interactive object and the playing time point of the audio, wherein the playing time point is the interactive operation time point;
and uploading the interactive operation time point and the object information to a server.
10. The method according to claim 1, wherein the displaying a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page when it is detected that the playing time point of the audio reaches a dynamic effect playing time point comprises:
when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, determining a target dynamic effect corresponding to the dynamic effect playing time point based on a first mapping relation, wherein the first mapping relation is a mapping relation between a reference dynamic effect playing time point of the audio and a preset dynamic effect;
and displaying the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object on the audio playing page.
11. The method according to claim 1, wherein before the target dynamic effect corresponding to the dynamic effect playing time point and the object information of the interactive object are displayed on the audio playing page when it is detected that the playing time point of the audio reaches the dynamic effect playing time point, the method further comprises:
determining a target dynamic effect identifier corresponding to the dynamic effect playing time point based on a second mapping relation, wherein the second mapping relation is a mapping relation between a reference dynamic effect playing time point of the audio and a dynamic effect identifier;
determining a target dynamic effect storage address corresponding to the target dynamic effect identifier;
and acquiring the target dynamic effect corresponding to the dynamic effect playing time point from the target dynamic effect storage address.
12. The method of claim 11, wherein the determining a target dynamic effect storage address corresponding to the target dynamic effect identifier comprises:
acquiring a third mapping relation between a dynamic effect identifier and a dynamic effect storage address;
and determining, based on the third mapping relation, the target dynamic effect storage address corresponding to the target dynamic effect identifier.
13. A dynamic effect display device, comprising:
the page display module, configured to display an audio playing page, wherein the audio playing page includes playing information of audio;
and the dynamic effect display module, configured to: when it is detected that the playing time point of the audio reaches a dynamic effect playing time point, display a target dynamic effect corresponding to the dynamic effect playing time point and object information of an interactive object on the audio playing page, wherein the dynamic effect playing time point is an interactive operation time point of the interactive object for the audio.
14. A storage medium, characterized in that the storage medium stores a plurality of computer programs adapted to be loaded by a processor to perform the steps of the method according to any one of claims 1 to 12.
15. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method according to any of claims 1 to 12 are implemented when the computer program is executed by the processor.
CN202010104441.1A 2020-02-20 2020-02-20 Dynamic effect display method and device, computer equipment and storage medium Pending CN113284523A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010104441.1A CN113284523A (en) 2020-02-20 2020-02-20 Dynamic effect display method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010104441.1A CN113284523A (en) 2020-02-20 2020-02-20 Dynamic effect display method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113284523A true CN113284523A (en) 2021-08-20

Family

ID=77275029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010104441.1A Pending CN113284523A (en) 2020-02-20 2020-02-20 Dynamic effect display method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113284523A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113885830A (en) * 2021-10-25 2022-01-04 北京字跳网络技术有限公司 Sound effect display method and terminal equipment
CN113885829A (en) * 2021-10-25 2022-01-04 北京字跳网络技术有限公司 Sound effect display method and terminal equipment
CN113885829B (en) * 2021-10-25 2023-10-31 北京字跳网络技术有限公司 Sound effect display method and terminal equipment
CN115220625A (en) * 2022-07-19 2022-10-21 广州酷狗计算机科技有限公司 Audio playing method and device, electronic equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN110719524B (en) Video playing method and device, intelligent playing equipment and storage medium
US8285776B2 (en) System and method for processing a received media item recommendation message comprising recommender presence information
US9164993B2 (en) System and method for propagating a media item recommendation message comprising recommender presence information
CN110809175B (en) Video recommendation method and device
CN113284523A (en) Dynamic effect display method and device, computer equipment and storage medium
US20090259623A1 (en) Systems and Methods for Associating Metadata with Media
US20080301241A1 (en) System and method of generating a media item recommendation message with recommender presence information
JP5113796B2 (en) Emotion matching device, emotion matching method, and program
CN111325004B (en) File commenting and viewing method
CN110995569B (en) Intelligent interaction method and device, computer equipment and storage medium
CN113785283A (en) Managing access to digital assets
US20110081967A1 (en) Interactive media social game
US10990625B2 (en) Playlist preview
Prior Beyond Napster: Popular music and the normal Internet
CN108292411A (en) Video content item is generated using subject property
US10216824B2 (en) Explanatory animation generation
Sandler et al. Semantic web technology for new experiences throughout the music production-consumption chain
US9685190B1 (en) Content sharing
JP6144477B2 (en) Collaboration singing video display system
Allik et al. Join my party! How can we enhance social interactions in music streaming?
US20240050849A1 (en) Intelligent song writer system
WO2023174073A1 (en) Video generation method and apparatus, and device, storage medium and program product
GÜVEN Thinking on the Changing Representation of Music on Cyberspace
JP2003108157A (en) Method, video server and karaoke playing terminal for receiving and using contribution of karaoke video work on karaoke playing terminal
Heuguet How Music Changed YouTube

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40050659

Country of ref document: HK

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination