CN110688046B - Song playing method and device and storage medium - Google Patents

Song playing method and device and storage medium

Info

Publication number
CN110688046B
CN110688046B · Application CN201910906717.5A
Authority
CN
China
Prior art keywords
playing
time
song
lyric
target
Prior art date
Legal status
Active
Application number
CN201910906717.5A
Other languages
Chinese (zh)
Other versions
CN110688046A (en)
Inventor
林若曦
符跃峰
Current Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Original Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Music Entertainment Technology Shenzhen Co Ltd filed Critical Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority to CN201910906717.5A priority Critical patent/CN110688046B/en
Publication of CN110688046A publication Critical patent/CN110688046A/en
Application granted granted Critical
Publication of CN110688046B publication Critical patent/CN110688046B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 — Scrolling or panning
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 — Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The application discloses a song playing method, a song playing apparatus, and a storage medium, and belongs to the technical field of multimedia. In the application, when a gesture operation performed by a user on a playing progress bar is detected, the operation position corresponding to the gesture operation is obtained, the target lyric line corresponding to the operation position is highlighted, the starting playing time of the target lyric line is then determined as a first playing time, and the song is played from the first playing time. Since the first playing time is the starting playing time of the target lyric line, that is, the song is played from the first word of the target lyric line, the user knows that the playing time corresponds to the first word of the highlighted line, and can therefore anticipate the position from which the song will be played at that time.

Description

Song playing method and device and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a song playing method, an apparatus, and a storage medium.
Background
Currently, a variety of multimedia applications can be installed on a touch screen device, and a user can watch or listen to audio and video through the multimedia applications. For example, when the multimedia application is a music application, the touch screen device may play a song through the music application, and in the process of playing the song, a play progress bar of the song may be displayed on the touch screen device, and the user may operate the play progress bar to adjust the play progress of the song.
In the related art, when detecting a gesture operation performed by a user on a playing progress bar, a touch screen device may obtain the operation position at which the gesture operation ends, determine the playing time corresponding to that operation position, display the lyric line corresponding to the playing time, and play the song from that playing time.
However, when the song is played from that playing time, the user cannot know which word of the displayed lyric line corresponds to it, and therefore has no expectation of the position from which the song will be played.
Disclosure of Invention
The embodiment of the application provides a song playing method, a song playing apparatus, and a storage medium. The technical scheme is as follows:
in one aspect, a song playing method is provided, and the method includes:
when a gesture operation performed by a user on a playing progress bar is detected, acquiring an operation position corresponding to the gesture operation, and highlighting a target lyric line corresponding to the operation position;
determining a starting playing time of the target lyric line as a first playing time;
and playing the song from the first playing time.
Optionally, when the gesture operation is a sliding operation, the method further comprises:
determining a sliding direction of the gesture operation;
correspondingly, the highlighting the target lyric line corresponding to the operation position comprises:
displaying the target lyric line on a background bar according to the sliding direction, wherein the direction in which the transparency of the background bar decreases is the same as the sliding direction.
Optionally, after the obtaining of the operation position corresponding to the gesture operation, the method further includes:
displaying a previous lyric line and a next lyric line of the target lyric line;
when a selection operation performed by the user on either the previous lyric line or the next lyric line is detected, taking the starting playing time of the selected lyric line as a second playing time;
and starting to play the song from the second playing time.
Optionally, when the gesture operation is a long-press operation, before the determining the starting playing time of the target lyric line as the first playing time, the method further includes:
if no sliding operation performed by the user on the playing progress bar starting from the operation position is detected, performing the step of determining the starting playing time of the target lyric line as the first playing time.
Optionally, if a sliding operation performed by the user on the playing progress bar starting from the operation position is detected, the words in the target lyric line are highlighted in sequence according to the sliding track of the sliding operation on the playing progress bar;
the end position of the sliding operation is determined, and the time corresponding to the end position of the sliding operation on the playing progress bar is determined as a third playing time;
and the song is played from the third playing time.
In another aspect, there is provided a song playback apparatus including:
the acquisition module is used for acquiring an operation position corresponding to gesture operation when the gesture operation of a user on the playing progress bar is detected;
the first display module is used for highlighting the target lyric line corresponding to the operation position;
the first determining module is used for determining the starting playing time of the target lyric line as a first playing time;
and the first playing module is used for playing the song from the first playing time.
Optionally, when the gesture operation is a sliding operation, the apparatus further includes:
the second determination module is used for determining the sliding direction of the gesture operation;
correspondingly, the first display module is specifically configured to display the target lyric line on a background bar according to the sliding direction, wherein the direction in which the transparency of the background bar decreases is the same as the sliding direction.
Optionally, the apparatus further comprises:
the second display module is used for displaying a previous lyric line and a next lyric line of the target lyric line;
a third determining module, configured to, when a selection operation performed by the user on either the previous lyric line or the next lyric line is detected, take the starting playing time of the selected lyric line as a second playing time;
and the second playing module is used for playing the song from the second playing moment.
Optionally, when the gesture operation is a long press operation, the apparatus further includes:
and the triggering module is used for triggering the first determining module to determine the starting playing time of the target lyric line as the first playing time if no sliding operation performed by the user on the playing progress bar starting from the operation position is detected.
Optionally, the apparatus further comprises:
a third display module, configured to highlight, in sequence, the words in the target lyric line according to the sliding track of the sliding operation on the playing progress bar if a sliding operation performed by the user on the playing progress bar starting from the operation position is detected;
a fourth determining module, configured to determine an end position of the sliding operation, and determine a time corresponding to the end position of the sliding operation on the play progress bar as a third play time;
and the third playing module is used for playing the song from the third playing moment.
In another aspect, a song playback apparatus is provided, the song playback apparatus including a processor, a communication interface, a memory, and a communication bus;
the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing computer programs;
the processor is used for executing the program stored in the memory so as to realize the song playing method.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the song playing method provided above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
in the embodiment of the application, when a gesture operation performed by the user on the playing progress bar is detected, the operation position corresponding to the gesture operation is obtained, the target lyric line corresponding to the operation position is highlighted, the starting playing time of the target lyric line is then determined as a first playing time, and the song is played from the first playing time. Since the first playing time is the starting playing time of the target lyric line, that is, the song is played from the first word of the target lyric line, the user knows that the playing time corresponds to the first word of the highlighted line, and can therefore anticipate the position from which the song will be played at that time.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a song playing method provided in an embodiment of the present application;
fig. 2 is a schematic diagram illustrating a display of a target lyric line according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating another display of a target lyric line according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating another display of a target lyric line according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating another display of a target lyric line according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating another display of a target lyric line according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a song playback apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the embodiments of the present application in detail, an application scenario related to the embodiments of the present application will be described.
Currently, a variety of multimedia applications can be installed on a touch screen device, and a user can watch or listen to audio and video through the multimedia applications. For example, when the multimedia application is a music application, the touch screen device may play a song through the music application. In the process of playing a song, a playing progress bar and lyrics of the song are displayed on the touch screen device, and in this case, the touch screen device can adjust the playing progress bar or the displayed lyrics by using the method provided by the embodiment of the application, so that the playing progress of the song is adjusted. For another example, when the touch screen device plays a music short or other musical works, the method provided by the embodiment of the present application may also be used to adjust the playing progress.
Next, a song playback method provided in an embodiment of the present application will be described.
Fig. 1 is a flowchart of a song playing method according to an embodiment of the present application. As shown in fig. 1, the method comprises the steps of:
step 101: when the gesture operation of the user on the playing progress bar is detected, the operation position corresponding to the gesture operation is obtained, and the target song sentence corresponding to the operation position is highlighted and displayed.
In the embodiment of the application, the touch screen device can play a song through an installed music application or another application. In the process of playing the song, a playing progress bar of the song is displayed on the touch screen device, and the user can perform a gesture operation on the playing progress bar. When the touch screen device detects a gesture operation performed by the user on the playing progress bar, it can obtain the operation position corresponding to the gesture operation, highlight the target lyric line corresponding to the operation position, and then adjust the playing progress of the song according to the operation position.
It should be noted that the gesture operation performed on the play progress bar by the user may be a click operation, a slide operation, a long-press operation, or the like. When the gesture operation is a click operation, the operation position corresponding to the gesture operation actually refers to a position where the user clicks the screen. When the gesture operation is a sliding operation, the touch screen device may detect an initial sliding position where a user finger starts sliding, and when it is detected that the user finishes sliding, determine a position where the finger is located when the sliding is finished as an operation position corresponding to the end of the sliding operation. When the gesture operation is a long-press operation, the touch screen device may detect an operation duration of a certain operation position of the playing progress bar by a user, and when the operation duration is greater than a first threshold, it is determined that the gesture operation is the long-press operation, and at this time, the touch screen device may use a position where the user presses for a long time as an operation position corresponding to the gesture operation.
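The three gesture types described above can be distinguished from the touch-down position, the touch-up position, and the press duration. The following Python sketch illustrates one possible classification; the event representation, the threshold values, and all names are assumptions for illustration, not details taken from the patent:

```python
def classify_gesture(down_x, up_x, duration_s,
                     move_tolerance_px=10, long_press_threshold_s=0.5):
    """Classify a touch on the progress bar as click, long-press, or slide.

    down_x / up_x are the x coordinates (px) where the finger touched
    down and lifted; duration_s is how long the finger stayed on the
    screen. Threshold values are illustrative, not from the patent.
    Returns (gesture_type, operation_position).
    """
    if abs(up_x - down_x) > move_tolerance_px:
        # Sliding: the operation position is where the finger stops.
        return "slide", up_x
    if duration_s > long_press_threshold_s:
        # Long press: the operation position is where the user held down.
        return "long_press", down_x
    # Click: the operation position is the tapped point.
    return "click", down_x
```

A caller would feed this the raw touch events and dispatch on the returned gesture type.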
In the embodiment of the application, after the operation position corresponding to the gesture operation is obtained, the target lyric line corresponding to the operation position can be determined according to the time corresponding to the operation position on the playing progress bar.
Generally, when a song is played, the audio is played according to the audio file of the song, and the lyrics are displayed according to the lyric file of the song. The lyric file comprises a lyric text and time tags: the lyric text comprises all lyric lines of the song, each lyric line corresponds to one time tag, and the time tag indicates the playing time of that line during playback. The lyric file can be downloaded from a server when the song is played, or acquired from a locally stored file.
In one implementation, the time tag of each lyric line may include a starting playing time, which indicates the playing time corresponding to the first word of that line during playback. Correspondingly, after the operation position corresponding to the gesture operation is obtained, the time corresponding to the operation position on the playing progress bar is compared with the starting playing time included in the time tag of each lyric line in the lyric file, and if that time is located between the starting playing times of two adjacent lyric lines, the former of the two lines is taken as the target lyric line.
In another implementation, the time tag of each lyric line may include a starting playing time and an ending playing time. The starting playing time indicates the playing time corresponding to the first word of the line during playback, and the ending playing time indicates the playing time corresponding to the last word of the line. Correspondingly, after the time corresponding to the operation position on the playing progress bar is determined, it is compared with the starting and ending playing times included in the time tag of each lyric line in the lyric file, and if it falls between the starting and ending playing times of a certain lyric line, that line is taken as the target lyric line.
It should be noted that when the time corresponding to the operation position on the playing progress bar is the same as any one of the starting playing times included in the lyric file, the lyric line corresponding to that starting playing time is determined as the target lyric line.
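The start-time lookup described above — taking the former of the two adjacent lines, and taking the line itself when the time lands exactly on a start time — can be sketched as follows. The data layout (a sorted list of `(start_seconds, text)` pairs) and the function name are assumptions:

```python
import bisect

def find_target_line(lyrics, t):
    """Return the lyric line active at time t (seconds).

    lyrics: list of (start_seconds, text) pairs sorted by start time.
    The line with the last start time <= t is chosen, i.e. the former
    of the two adjacent lines whose start times bracket t; a time
    before the first line falls back to the first line.
    """
    starts = [s for s, _ in lyrics]
    i = bisect.bisect_right(starts, t) - 1
    return lyrics[max(i, 0)]

# Times mirror the worked example later in the text: 1:13, 1:18, 1:24.
lyrics = [(73, "suddenly want you"),
          (78, "where you will be"),
          (84, "happy or committed")]
```

For instance, a time of 79 s (00:01:19) falls between the start times 78 and 84, so the lookup returns the line starting at 78 s.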
In addition, since a song usually has interlude sections without corresponding lyrics, in the embodiment of the application, if the time corresponding to the operation position on the playing progress bar falls between the starting and ending playing times of an interlude, the touch screen device may take the lyric line that starts immediately after the interlude ends as the target lyric line, or take the lyric line that ends immediately before the interlude starts as the target lyric line.
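Under the second time-tag layout (start and end times per line), the interlude rule can be sketched as below. The `prefer_next` flag is a hypothetical parameter standing in for the two alternatives the text describes (line after the interlude versus line before it):

```python
def resolve_target(timed_lyrics, t, prefer_next=True):
    """Pick the target lyric line for time t, handling interludes.

    timed_lyrics: list of (start, end, text) tuples sorted by start time.
    If t lies inside a line's [start, end] interval, that line is the
    target. If t lies in a gap between lines (an interlude), return the
    line after the gap when prefer_next is True, else the line before it.
    """
    for i, (start, end, text) in enumerate(timed_lyrics):
        if start <= t <= end:
            return text
        if t < start:  # t fell in the gap before this line
            return text if prefer_next or i == 0 else timed_lyrics[i - 1][2]
    return timed_lyrics[-1][2]  # past the last line
```

Either policy keeps the invariant that a gesture always resolves to some lyric line, so the first playing time is always defined.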
As can be seen from the foregoing description, the gesture operation performed by the user on the playing progress bar of a song may be a click operation, a sliding operation, a long-press operation, or the like. Based on this, after determining the target lyric line, the touch screen device may highlight it in different ways according to the different gesture operations.
For example, when the gesture operation is a click operation, the touch screen device may highlight the target lyric line, for example, by displaying it in a font, font size, or color different from the other lyrics. In addition, the touch screen device may highlight the target lyric line on a background bar, where the background bar may have a specified shape or a specified color; for example, the background bar may be a rectangle or a rounded rectangle, and its color may be gray, red, or the like, which is not limited in the embodiment of the application.
Fig. 2 is a schematic diagram illustrating a display of a target lyric line provided in an embodiment of the present application. Referring to fig. 2, when it is detected that the user clicks the playing progress bar, the target lyric line determined at the end of the click operation is "where you will be", and "where you will be" is highlighted on a background bar, which, as shown in fig. 2, is a rounded rectangle displayed in black.
Optionally, when the gesture operation is a sliding operation, the touch screen device may further determine the sliding direction of the gesture operation upon detecting it. The sliding direction may be leftward or rightward; typically, sliding leftward rewinds the playing progress of the song, and sliding rightward fast-forwards it. Specifically, the touch screen device may detect the operation position at the start time of the sliding operation and the operation position at its end time, and determine the sliding direction from these two positions. Accordingly, after determining the sliding direction, the touch screen device may highlight the target lyric line on the background bar according to the sliding direction, wherein the direction in which the transparency of the background bar decreases is the same as the sliding direction. For example, when the sliding direction is leftward, the transparency of the background bar may gradually decrease from right to left, and when the sliding direction is rightward, the transparency of the background bar may gradually decrease from left to right. In addition, the background bar may have a specified shape, a specified color, or the like, which is not limited in the embodiment of the application.
Optionally, in a possible implementation, the touch screen device may further display, on the background bar, text information indicating fast forward or fast backward according to the sliding direction. For example, when the sliding direction is rightward, the text "fast forward" may be displayed on the background bar, and when the sliding direction is leftward, the text "fast backward" may be displayed. In addition, the touch screen device may indicate fast backward or fast forward in other manners according to the sliding direction, which is not limited in the embodiment of the application.
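The direction-dependent feedback described in the last two paragraphs can be summarized in a small helper; the tuple layout and all names are illustrative assumptions:

```python
def slide_feedback(start_x, end_x):
    """Derive UI hints from a horizontal slide on the progress bar.

    Sliding left rewinds: the background bar's transparency decreases
    from right to left and the "fast backward" label is shown. Sliding
    right fast-forwards: transparency decreases from left to right and
    the "fast forward" label is shown.
    Returns (direction, fade_direction, label).
    """
    if end_x < start_x:
        return "left", "right_to_left", "fast backward"
    return "right", "left_to_right", "fast forward"
```

The rendering layer would then draw the gradient and the label from this tuple.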
Fig. 3 is a schematic diagram illustrating another display of a target lyric line according to an embodiment of the present application. Referring to fig. 3, the user slides leftward on the playing progress bar, and the target lyric line determined at the end of the sliding operation is "where you will be". The line is highlighted on the background bar, and since the sliding direction is leftward, the transparency of the background bar decreases from right to left.
Fig. 4 is a schematic diagram illustrating another display of a target lyric line provided in an embodiment of the present application. Referring to fig. 4, the user slides rightward on the playing progress bar, and the target lyric line determined at the end of the sliding operation is "where you will be". The line is highlighted on the background bar, and since the sliding direction is rightward, the transparency of the background bar decreases from left to right.
Step 102: the starting playing time of the target lyric line is determined as a first playing time.
In the embodiment of the application, besides determining the target lyric line corresponding to the operation position, the touch screen device can also obtain the starting playing time of the target lyric line and determine it as the first playing time.
In the embodiment of the application, when the time tag of each lyric line in the lyric file comprises a starting playing time, the starting playing time corresponding to the target lyric line can be taken as the first playing time while or after the target lyric line is determined according to the foregoing method.
Illustratively, the lyric text of a song includes "suddenly want you", "where you will be", and "happy or committed". The time tag of "suddenly want you" is [00:01:13], that is, the starting playing time of its first word is 1 minute 13 seconds. The time tag of "where you will be" is [00:01:18], that is, the starting playing time of its first word is 1 minute 18 seconds. The time tag of "happy or committed" is [00:01:24], that is, the starting playing time of its first word is 1 minute 24 seconds. If the time corresponding to the operation position at the end of the gesture operation is [00:01:19], since [00:01:19] is located between [00:01:18] and [00:01:24], "where you will be" is taken as the target lyric line and [00:01:18] is taken as the first playing time.
In the embodiment of the application, when the time tag of each lyric line in the lyric file comprises a starting playing time and an ending playing time, the starting playing time corresponding to the target lyric line can likewise be taken as the first playing time while or after the target lyric line is determined according to the foregoing method.
Illustratively, the lyric text of a song includes "suddenly want you", "where you will be", and "happy or committed", where "suddenly want you" has a starting playing time of [00:01:13] and an ending playing time of [00:01:17], "where you will be" has a starting playing time of [00:01:18] and an ending playing time of [00:01:23], and "happy or committed" has a starting playing time of [00:01:24] and an ending playing time of [00:01:30]. If the time corresponding to the operation position on the playing progress bar is [00:01:19], since [00:01:19] is between [00:01:18] and [00:01:23], "where you will be" is the target lyric line and the first playing time is [00:01:18].
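The worked example above can be checked with a short sketch, assuming the time tags have been parsed into seconds; the `(start, end, text)` tuple layout and the function name are assumptions:

```python
def first_play_time(timed_lyrics, t):
    """Return the starting playing time of the line whose [start, end]
    interval contains t, or None if t falls in an interlude gap.

    timed_lyrics: (start_seconds, end_seconds, text) tuples.
    """
    for start, end, _ in timed_lyrics:
        if start <= t <= end:
            return start
    return None  # interlude: handled by the adjacent-line rule above

# The example's three lines: [00:01:13]-[00:01:17], [00:01:18]-[00:01:23],
# and [00:01:24]-[00:01:30], converted to seconds.
song = [(73, 77, "suddenly want you"),
        (78, 83, "where you will be"),
        (84, 90, "happy or committed")]
```

An operation time of 79 s (00:01:19) lies inside [78, 83], so the first playing time is 78 s, i.e. [00:01:18], matching the example.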
Step 103: the song is played from the first play time.
In an embodiment of the application, after the first play time is determined, the touch screen device may start to play the song from the first play time.
Optionally, after the operation position corresponding to the gesture operation of the user is obtained, a previous lyric line and a next lyric line of the target lyric line may further be displayed on the interface of the touch screen device. When a selection operation performed by the user on either of these two lines is detected, the starting playing time of the selected line may be taken as a second playing time, and the song is played from the second playing time.
In the embodiment of the application, the touch screen interface can display not only the target lyric line but also its previous and next lyric lines, and the user can select any displayed line to adjust the playing progress of the song.
For example, after determining the target lyric line, the touch screen device may further determine its previous and next lyric lines. Specifically, in the lyric text, the lyric lines are arranged in the order in which they are played. Based on this, after the target lyric line is determined, the touch screen device can obtain its previous and next lines from the lyric text, and then display the previous line above the target lyric line and the next line below it. The user can click either of the two displayed lines; when a selection of either line is detected, the touch screen device can obtain, from the time tag of the selected line, its starting playing time as the second playing time, and then play the song from the second playing time.
Referring to fig. 2, when it is detected that the user clicks the playing progress bar, the target lyric line determined at the end of the click operation is "where you will be". After the target lyric line is determined, its previous line "suddenly want you" and its next line "happy or committed" are also determined; "where you will be" is then highlighted, with "suddenly want you" displayed above it and "happy or committed" displayed below it. The user can perform a click operation on either the previous or the next line, and when the touch screen device detects the click operation, it can play the song from the starting playing time of the clicked line.
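The previous/next-line selection can be sketched as follows, again assuming lyric lines stored as `(start_seconds, text)` pairs in playback order; all names are illustrative:

```python
def neighbors(lyrics, target_index):
    """Return (previous, target, next) lyric lines for display.

    lyrics is a list of (start_seconds, text) in playback order; the
    previous/next slot is None when the target is the first/last line.
    """
    prev_line = lyrics[target_index - 1] if target_index > 0 else None
    next_line = (lyrics[target_index + 1]
                 if target_index + 1 < len(lyrics) else None)
    return prev_line, lyrics[target_index], next_line

def second_play_time(selected_line):
    """The second playing time is simply the selected line's start tag."""
    start_seconds, _ = selected_line
    return start_seconds
```

Selecting the next line of the Fig. 2 example ("happy or committed", starting at 00:01:24) would thus resume playback at 84 s.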
As can be seen from the foregoing, the first playing time is the starting playing time of the target lyric line. That is, when the touch screen device adjusts the playing progress of the song according to the user's operation on the play progress bar, it starts playing from the first word of the target lyric line corresponding to the time at the operation position, while highlighting the target lyric line, so the user knows that the adjusted playing time corresponds to the first word of that line. Similarly, when the song is played from the second playing time, it is also played from the first word of the line the user selected, so the user can anticipate which part of the song will be played at that time.
In addition to playing from the first word of the target lyric line selected by the user, the song can also be played from any word within that line. For example, when the gesture operation in step 101 is a long-press operation, before determining the starting playing time of the target lyric line as the first playing time, the touch screen device may further detect whether the user performs a sliding operation on the play progress bar starting from the operation position at which the long press ended. If no such sliding operation is detected, the step of determining the starting playing time of the target lyric line as the first playing time is executed. If a sliding operation from that position is detected, the words in the target lyric line are highlighted in sequence according to the sliding track on the play progress bar, the end position of the sliding operation is determined, the time corresponding to that end position on the play progress bar is determined as a third playing time, and the song is played from the third playing time.
When the gesture operation is detected to be a long-press operation, the touch screen device can highlight the target lyric line as described above, and can also display a prompt text or prompt icon reminding the user that a sliding operation on the play progress bar after the long press can select any word within the target lyric line. The user may then decide, based on the prompt, whether to perform the sliding operation. Correspondingly, if the touch screen device detects that the user starts sliding on the play progress bar from the operation position at which the long press ended, the words in the target lyric line are highlighted in sequence according to the sliding track.
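The long-press branch above amounts to a small dispatch: seek to the line's start (first playing time) if no slide follows the long press, otherwise seek to the time under the slide's end position (third playing time). A sketch under that assumption, with hypothetical event times in seconds:

```python
def resolve_seek_time(line_start_time, slide_times):
    """Pick the seek target after a long press ends.

    slide_times: times (seconds) under the finger while sliding after the
    long press, in order; empty if the user lifted without sliding.
    """
    if not slide_times:
        return line_start_time   # first playing time: start of the target line
    return slide_times[-1]       # third playing time: the slide's end position
```

The same dispatch naturally extends to the click/slide gestures handled elsewhere in the method.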
In the embodiment of the application, after the touch screen device determines the target lyric line from the operation position at which the long press ended, it may, when displaying the line, highlight the word corresponding to the playing time at that position and display the other words normally. When a sliding operation from that position is detected, the device can track the slide's position on the play progress bar in real time to determine its sliding track. The track may run left-to-right or right-to-left from the long-press position, and the words in the target lyric line are highlighted in sequence, forward or backward from the word corresponding to the long press, according to the playing time of each point on the track.
It should be noted that the time tag of each lyric line in the lyric text may further include a playing time for each word in the line, indicating when that word is played. While the user slides along the play progress bar, the device can detect the slide's operation position in real time, compare the time corresponding to that position with the playing time of each word in the target lyric line, and highlight the word whose playing time matches.
Illustratively, the target lyric line displayed on the touch screen device is "where you will be", and the word corresponding to the long-press position on the play progress bar is "you". When the user slides the play progress bar to the right, the words in the line are highlighted starting from "you" and proceeding rightward; if the last word highlighted when the slide ends is "at", the words highlighted in sequence are "you", "will" and "at". The manner of highlighting each word may follow the manner of highlighting the target lyric line introduced above, which is not repeated here. When the sliding operation ends, the end position is determined, the time corresponding to that position on the play progress bar is determined as a third playing time, and the song is played from the third playing time.
Optionally, in this embodiment of the application, the time tag of each lyric line may further include a starting playing time and an ending playing time for each word, indicating when the word starts and stops being played. When the current operation position of the sliding operation is detected in real time and the corresponding time on the play progress bar is determined, that time can be compared with the starting and ending playing times of each word in the target lyric line; if it falls between the starting and ending playing times of some word, that word is highlighted as the target word.
Illustratively, for the target lyric line "where you will be", suppose "you" has a starting playing time of [00:01:18] and an ending playing time of [00:01:19]; "will" spans [00:01:19] to [00:01:20]; "at" spans [00:01:20] to [00:01:22]; "where" spans [00:01:22] to [00:01:23]; and "be" spans [00:01:23] to [00:01:24]. Assuming the time corresponding to the slide position detected in real time by the touch screen device is [00:01:21], it lies between the starting playing time [00:01:20] and the ending playing time [00:01:22], so "at" is highlighted.
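Under the assumption that each word carries a start/end interval as in the example above (times converted from [mm:ss] to seconds; the word spellings are illustrative, since the translated example names only some of them), the per-word lookup is a simple interval scan:

```python
# Per-word time tags for "where you will be" from the example,
# in played order; [00:01:18] = 78 s, and so on.
WORD_TAGS = [
    ("you",   78, 79),
    ("will",  79, 80),
    ("at",    80, 82),
    ("where", 82, 83),
    ("be",    83, 84),
]

def word_under(slide_time):
    """Return the word whose [start, end) interval contains the slide time."""
    for word, start, end in WORD_TAGS:
        if start <= slide_time < end:
            return word
    return None  # the time falls outside the target lyric line
```

At [00:01:21] (81 s) this returns the third word, matching the word highlighted in the example.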
It should be noted that the long-press operation and the sliding operation after the long-press operation may be continuous, that is, the user may continue to perform the sliding operation without lifting the finger after performing the long-press operation.
In addition, after sliding the play progress bar to the right, the user may slide it back to the left before ending the sliding operation. For example, if the words highlighted when the rightward slide ends are "you will at", the words highlighted after sliding back to the left may be "you will"; that is, the highlight on "at" is cancelled while sliding left.
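The grow-and-shrink behavior of the highlight run (rightward slides extend it, leftward slides cancel words back toward the long-press word) can be sketched as below. Whether the anchor word itself stays highlighted when the finger passes left of it is not specified above, so that detail is an assumption here.

```python
def highlighted_run(words, anchor_idx, current_idx):
    """Words highlighted mid-slide: from the long-press anchor word through
    the word under the current slide position. Sliding back past the anchor
    is assumed to leave only the anchor word highlighted."""
    current_idx = max(current_idx, anchor_idx)
    return words[anchor_idx:current_idx + 1]
```

With the example line, sliding right to the third word then back one word shrinks the run from three highlighted words to two.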
Fig. 5 is a schematic diagram illustrating another example of displaying a target lyric line according to the present embodiment. Referring to fig. 5, the user presses the play progress bar for a long time to highlight the target lyric line. The touch screen device may highlight the line on the background bar and display a prompt text such as "after a long press, slide to refine word by word" on the interface, or display a prompt icon at the operation position of the long press to indicate that a long-press operation is in progress. If the word corresponding to the position at which the long press ends is "you", then, as shown in fig. 5, "you" may be highlighted.
Fig. 6 is a schematic diagram illustrating another example of displaying a target lyric line according to the present embodiment. Referring to fig. 6, the user continues with a sliding operation after the long press on the play progress bar. The word corresponding to the position at which the long press ended is "you"; the play progress bar is then slid rightward, the words in the target lyric line are highlighted in sequence according to the sliding track, the target word when the slide ends is determined to be "at", and the words highlighted in a color-changing manner during the slide are "you", "will" and "at".
In the embodiment of the application, when a gesture operation of the user on the play progress bar is detected, the operation position corresponding to the gesture operation is obtained, the target lyric line corresponding to that position is highlighted, the starting playing time of the target lyric line is determined as a first playing time, and the song is played from the first playing time. Since the first playing time is the starting playing time of the target lyric line, i.e. the song is played from the first word of that line, the user knows the playing time corresponds to the first word of the highlighted line and can therefore anticipate which part of the song will be played at that time.
Referring to fig. 7, an embodiment of the present application provides a song playback apparatus 700, where the song playback apparatus may be applied to a client, where the apparatus 700 includes:
the obtaining module 701 is configured to obtain an operation position corresponding to a gesture operation when the gesture operation of the user on the play progress bar is detected;
a first display module 702, configured to highlight and display the target phrase corresponding to the operation position;
a first determining module 703, configured to determine that a play start time of a target song phrase is a first play time;
a first playing module 704, configured to play the song from a first playing time.
Optionally, when the gesture operation is a sliding operation, the apparatus further comprises:
the second determination module is used for determining the sliding direction of the gesture operation;
correspondingly, the first display module is specifically configured to display the target song sentence on the background bar according to the sliding direction, and the weakening direction of the transparency of the background bar is opposite to the sliding direction.
Optionally, the apparatus further comprises:
the second display module is used for displaying the lyrics of the previous sentence and the lyrics of the next sentence of the target lyrics;
a third determining module, configured to, when a selection operation of a user on any one of the lyrics of the previous sentence and the lyrics of the next sentence is detected, take a starting playing time of the selected lyrics as a second playing time;
and the second playing module is used for playing the song from the second playing moment.
Optionally, when the gesture operation is a long press operation, the apparatus further includes:
and the triggering module is used for triggering the first determining module to determine that the starting playing time of the target song sentence is the first playing time if the sliding operation of the user for the playing progress bar from the operation position is not detected.
Optionally, the apparatus further comprises:
the third display module is used for sequentially highlighting the characters in the target song words and sentences according to the sliding track of the sliding operation on the playing progress bar if the sliding operation of the user for the playing progress bar from the operation position is detected;
the fourth determining module is used for determining the end position of the sliding operation and determining the corresponding moment of the end position of the sliding operation on the playing progress bar as a third playing moment;
and the third playing module is used for playing the song from the third playing moment.
In summary, in the embodiment of the present application, when a gesture operation of the user on the play progress bar is detected, the operation position corresponding to the gesture operation is obtained, the target lyric line corresponding to that position is highlighted, the starting playing time of the target lyric line is determined as a first playing time, and the song is played from the first playing time. Since the first playing time is the starting playing time of the target lyric line, i.e. the song is played from the first word of that line, the user knows the playing time corresponds to the first word of the highlighted line and can therefore anticipate which part of the song will be played at that time.
It should be noted that: in the song playing apparatus provided in the above embodiment, only the division of the above functional modules is used for illustration when a song is played, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the song playing apparatus and the song playing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments and are not described herein again.
Fig. 8 is a block diagram of a terminal 800 according to an embodiment of the present disclosure. The terminal 800 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The terminal 800 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, the terminal 800 includes: a processor 801 and a memory 802.
The processor 801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 801 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 801 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 801 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 802 may include one or more computer-readable storage media, which may be non-transitory. Memory 802 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 802 is used to store at least one instruction for execution by processor 801 to implement a song playback method provided by method embodiments herein.
In some embodiments, the terminal 800 may further include: a peripheral interface 803 and at least one peripheral. The processor 801, memory 802 and peripheral interface 803 may be connected by bus or signal lines. Various peripheral devices may be connected to peripheral interface 803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 804, a touch screen display 805, a camera 806, an audio circuit 807, a positioning component 808, and a power supply 809.
The peripheral interface 803 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 801 and the memory 802. In some embodiments, the processor 801, memory 802, and peripheral interface 803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral interface 803 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 804 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 804 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 804 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 805 is a touch display, the display 805 also has the ability to capture touch signals on or above the surface of the display 805. The touch signal may be input to the processor 801 as a control signal for processing. At this point, the display 805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 805 may be a front panel disposed on the terminal 800; in other embodiments, the display 805 may be at least two, respectively disposed on different surfaces of the terminal 800 or in a folded design; in other embodiments, the display 805 may be a flexible display disposed on a curved surface or a folded surface of the terminal 800. Even further, the display 805 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 805 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 806 is used to capture images or video. Optionally, camera assembly 806 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 806 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 801 for processing or inputting the electric signals to the radio frequency circuit 804 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 800. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 807 may also include a headphone jack.
The positioning component 808 is used to locate the current geographic position of the terminal 800 for navigation or LBS (Location Based Service). The positioning component 808 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 809 is used to provide power to various components in terminal 800. The power supply 809 can be ac, dc, disposable or rechargeable. When the power supply 809 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 800 also includes one or more sensors 810. The one or more sensors 810 include, but are not limited to: acceleration sensor 811, gyro sensor 812, pressure sensor 813, fingerprint sensor 814, optical sensor 815 and proximity sensor 816.
The acceleration sensor 811 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 800. For example, the acceleration sensor 811 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 801 may control the touch screen 805 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 811. The acceleration sensor 811 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 812 may detect a body direction and a rotation angle of the terminal 800, and the gyro sensor 812 may cooperate with the acceleration sensor 811 to acquire a 3D motion of the user with respect to the terminal 800. From the data collected by the gyro sensor 812, the processor 801 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 813 may be disposed on the side bezel of terminal 800 and/or underneath touch display 805. When the pressure sensor 813 is disposed on the side frame of the terminal 800, the holding signal of the user to the terminal 800 can be detected, and the processor 801 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 813. When the pressure sensor 813 is disposed at a lower layer of the touch display screen 805, the processor 801 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 805. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 814 is used for collecting a fingerprint of the user, and the processor 801 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 identifies the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 801 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying for and changing settings, etc. Fingerprint sensor 814 may be disposed on the front, back, or side of terminal 800. When a physical button or a vendor Logo is provided on the terminal 800, the fingerprint sensor 814 may be integrated with the physical button or the vendor Logo.
The optical sensor 815 is used to collect the ambient light intensity. In one embodiment, the processor 801 may control the display brightness of the touch screen 805 based on the ambient light intensity collected by the optical sensor 815. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 805 is increased; when the ambient light intensity is low, the display brightness of the touch display 805 is turned down. In another embodiment, the processor 801 may also dynamically adjust the shooting parameters of the camera assembly 806 based on the ambient light intensity collected by the optical sensor 815.
A proximity sensor 816, also known as a distance sensor, is typically provided on the front panel of the terminal 800. The proximity sensor 816 is used to collect the distance between the user and the front surface of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually decreases, the processor 801 controls the touch display 805 to switch from the screen-on state to the screen-off state; when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually increases, the processor 801 controls the touch display 805 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 8 is not intended to be limiting of terminal 800 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, where instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the song playing method provided in the embodiments shown in fig. 1 to 6.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a computer, cause the computer to execute the song playing method provided in the embodiments shown in fig. 1 to 6.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (7)

1. A method for playing a song, the method comprising:
when a gesture operation of a user on a playing progress bar is detected, acquiring an operation position corresponding to the gesture operation and emphatically displaying a target lyric sentence corresponding to the operation position, wherein different gesture operations correspond to different manners of emphatically displaying the target lyric sentence;
determining the starting playing time of the target song words and sentences as a first playing time;
starting to play the song from the first playing time;
the lyric file of the song comprises a time tag, each lyric corresponds to one time tag, the time tag is used for indicating the playing time of each lyric in the playing process of the song, the time tag of each lyric comprises a starting playing time and an ending playing time, and before the target lyric corresponding to the operation position is highlighted, the method further comprises the following steps:
if the time corresponding to the operation position on the playing progress bar is between the starting playing time and the ending playing time of an interlude part, acquiring the lyric whose starting playing time is adjacent to the ending playing time of the interlude as the target lyric sentence; or,
if the time corresponding to the operation position on the playing progress bar is between the starting playing time and the ending playing time of an interlude part, acquiring the lyric whose ending playing time is adjacent to the starting playing time of the interlude as the target lyric sentence;
when the gesture operation is a long-press operation, the determining that the starting playing time of the target song sentence is before a first playing time further includes:
if the sliding operation of the user for the playing progress bar from the operation position is not detected, the step of determining the starting playing time of the target song sentence as a first playing time is executed;
if the sliding operation of the user for the playing progress bar from the operation position is detected, highlighting the words in the target song words and sentences in sequence according to the sliding track of the sliding operation on the playing progress bar; determining the end position of the sliding operation, and determining the corresponding moment of the end position of the sliding operation on the playing progress bar as a third playing moment; and starting to play the song from the third playing time.
2. The method according to claim 1, wherein when the gesture operation is a sliding operation, the method further comprises:
determining a sliding direction of the gesture operation;
correspondingly, the highlighting of the target lyric line corresponding to the operation position comprises:
displaying the target lyric line on a background bar according to the sliding direction, wherein the direction in which the transparency of the background bar fades is the same as the sliding direction.
3. The method according to claim 1, wherein after the acquiring of the operation position corresponding to the gesture operation, the method further comprises:
displaying the previous lyric line and the next lyric line of the target lyric line;
when a selection operation of the user on either the previous lyric line or the next lyric line is detected, taking the starting playing time of the selected lyric line as a second playing time;
and starting to play the song from the second playing time.
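The neighbour-selection step of claim 3 can be sketched in the same illustrative model; the `LINES` data and helper names are assumptions for the example:

```python
# Hypothetical lyric model: starting/ending playing times (ms) per line.
LINES = [
    {"text": "first line", "start": 0, "end": 3000},
    {"text": "second line", "start": 3000, "end": 6000},
    {"text": "third line", "start": 6000, "end": 9000},
]

def neighbours(target_idx, lines=LINES):
    """Previous and next lyric lines displayed alongside the target line."""
    prev_line = lines[target_idx - 1] if target_idx > 0 else None
    next_line = lines[target_idx + 1] if target_idx + 1 < len(lines) else None
    return prev_line, next_line

def second_playing_time(selected_line):
    """Playback restarts from the starting playing time of whichever
    neighbouring line the user selected."""
    return selected_line["start"]
```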
4. A song playing apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire, when a gesture operation of a user on a playing progress bar is detected, an operation position corresponding to the gesture operation;
a first display module, configured to highlight a target lyric line corresponding to the operation position, wherein different gesture operations correspond to different manners of highlighting the target lyric line;
a first determining module, configured to determine a starting playing time of the target lyric line as a first playing time;
a first playing module, configured to play the song from the first playing time;
wherein a lyric file of the song comprises time tags, each lyric line corresponds to one time tag, the time tag is used for indicating the playing time of that lyric line during the playing of the song, and the time tag of each lyric line comprises a starting playing time and an ending playing time; the apparatus further comprises a module configured to:
if the time corresponding to the operation position on the playing progress bar falls between the starting playing time and the ending playing time of an interlude, acquire the lyric line adjacent to the ending playing time of the interlude as the target lyric line; or,
if the time corresponding to the operation position on the playing progress bar falls between the starting playing time and the ending playing time of an interlude, acquire the lyric line adjacent to the starting playing time of the interlude as the target lyric line;
when the gesture operation is a long-press operation, the apparatus further comprises:
a triggering module, configured to trigger the first determining module to determine the starting playing time of the target lyric line as the first playing time if no sliding operation of the user on the playing progress bar starting from the operation position is detected;
a third display module, configured to highlight the words in the target lyric line one by one according to a sliding track of a sliding operation on the playing progress bar if the sliding operation of the user on the playing progress bar starting from the operation position is detected;
a fourth determining module, configured to determine an end position of the sliding operation, and determine the time corresponding to the end position on the playing progress bar as a third playing time;
and a third playing module, configured to play the song from the third playing time.
5. The apparatus according to claim 4, wherein when the gesture operation is a sliding operation, the apparatus further comprises:
a second determining module, configured to determine a sliding direction of the gesture operation;
correspondingly, the first display module is specifically configured to display the target lyric line on a background bar according to the sliding direction, wherein the direction in which the transparency of the background bar fades is the same as the sliding direction.
6. The apparatus according to claim 4, further comprising:
a second display module, configured to display the previous lyric line and the next lyric line of the target lyric line;
a third determining module, configured to, when a selection operation of the user on either the previous lyric line or the next lyric line is detected, take the starting playing time of the selected lyric line as a second playing time;
and a second playing module, configured to play the song from the second playing time.
7. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 3.
CN201910906717.5A 2019-09-24 2019-09-24 Song playing method and device and storage medium Active CN110688046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910906717.5A CN110688046B (en) 2019-09-24 2019-09-24 Song playing method and device and storage medium


Publications (2)

Publication Number Publication Date
CN110688046A CN110688046A (en) 2020-01-14
CN110688046B true CN110688046B (en) 2022-02-25

Family

ID=69110172

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910906717.5A Active CN110688046B (en) 2019-09-24 2019-09-24 Song playing method and device and storage medium

Country Status (1)

Country Link
CN (1) CN110688046B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114495993A (en) * 2021-12-24 2022-05-13 北京梧桐车联科技有限责任公司 Progress adjusting method, device and equipment and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103839565A (en) * 2012-11-23 2014-06-04 宏达国际电子股份有限公司 Method of displaying music lyrics and portable handheld electronic device using the same
CN104091609A (en) * 2013-05-22 2014-10-08 腾讯科技(深圳)有限公司 Lyric preview method and device of music player
CN105611371A (en) * 2015-12-22 2016-05-25 深圳市金立通信设备有限公司 Multimedia playing method and terminal
CN107765987A (en) * 2017-11-03 2018-03-06 北京密境和风科技有限公司 A kind of user interaction approach and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7725828B1 (en) * 2003-10-15 2010-05-25 Apple Inc. Application of speed effects to a video presentation
US9176658B1 (en) * 2013-12-10 2015-11-03 Amazon Technologies, Inc. Navigating media playback using scrollable text
CN104967910B (en) * 2014-10-29 2018-11-23 广州酷狗计算机科技有限公司 Multimedia progress monitoring method and device
CN105609121B (en) * 2014-11-20 2019-03-12 广州酷狗计算机科技有限公司 Multimedia progress monitoring method and device



Similar Documents

Publication Publication Date Title
CN107908929B (en) Method and device for playing audio data
CN109144346B (en) Song sharing method and device and storage medium
CN109327608B (en) Song sharing method, terminal, server and system
CN108965922B (en) Video cover generation method and device and storage medium
CN109922356B (en) Video recommendation method and device and computer-readable storage medium
CN110572716B (en) Multimedia data playing method, device and storage medium
CN109346111B (en) Data processing method, device, terminal and storage medium
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
CN111061405B (en) Method, device and equipment for recording song audio and storage medium
CN110288689B (en) Method and device for rendering electronic map
CN109982129B (en) Short video playing control method and device and storage medium
CN111081277B (en) Audio evaluation method, device, equipment and storage medium
CN111092991B (en) Lyric display method and device and computer storage medium
CN110798327B (en) Message processing method, device and storage medium
CN109743461B (en) Audio data processing method, device, terminal and storage medium
CN111083526A (en) Video transition method and device, computer equipment and storage medium
CN108509127B (en) Method and device for starting screen recording task and computer equipment
CN112069350A (en) Song recommendation method, device, equipment and computer storage medium
CN110688046B (en) Song playing method and device and storage medium
CN112118482A (en) Audio file playing method and device, terminal and storage medium
CN108763521B (en) Method and device for storing lyric phonetic notation
CN109005359B (en) Video recording method, apparatus and storage medium
CN111370096A (en) Interactive interface display method, device, equipment and storage medium
CN111063372B (en) Method, device and equipment for determining pitch characteristics and storage medium
CN109101166B (en) Audio control method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant