CN111741366A - Audio playing method, device, terminal and storage medium - Google Patents

Audio playing method, device, terminal and storage medium

Info

Publication number
CN111741366A
Authority
CN
China
Prior art keywords
video
playing
audio
interface
playing interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010687831.6A
Other languages
Chinese (zh)
Inventor
吴晗
李文涛
唐贝希
陈恒全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd filed Critical Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202010687831.6A
Publication of CN111741366A

Classifications

    All classifications fall under H (ELECTRICITY) > H04 (ELECTRIC COMMUNICATION TECHNIQUE) > H04N (PICTORIAL COMMUNICATION, e.g. TELEVISION) > H04N21/00 (Selective content distribution, e.g. interactive television or video on demand [VOD]) > H04N21/40 (Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof):
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/439 Processing of audio elementary streams
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44016 Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H04N21/440263 Reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally

Abstract

The application provides an audio playing method, an audio playing device, a terminal and a storage medium, and belongs to the field of internet technologies. The method includes: playing a first audio while displaying a playing interface of the first audio; detecting a first sliding operation on the playing interface; and if the sliding direction of the first sliding operation is a target direction, switching the content in the playing interface to a video picture in a first background video while continuing to play the first audio, the first background video being the background video corresponding to the first audio. The method enables the user to control the terminal to play the video picture corresponding to the audio through a sliding operation, which effectively improves operation efficiency. Moreover, playing the video picture does not affect playing the audio, so the user can watch the video picture corresponding to the audio without interrupting listening, which improves the playing effect.

Description

Audio playing method, device, terminal and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to an audio playing method, an audio playing device, a terminal, and a storage medium.
Background
In daily life, users often use terminals to play music for leisure and entertainment, and they also watch music videos (MVs) while the music is playing.
In the related art, when music is played, an audio playing interface is displayed. The audio playing interface includes a button for obtaining more information about the music. After the user clicks the button, buttons corresponding to various pieces of information about the music pop up, such as the music source, the author, and the album to which the music belongs. After the user finds the button corresponding to the MV of the music and clicks it, the video playing interface is entered, and the MV is played in the video playing interface.
In this method, the user needs to perform multiple click operations to control the terminal to play the music MV, so the operation efficiency is low.
Disclosure of Invention
The embodiments of the present application provide an audio playing method, an audio playing device, a terminal and a storage medium, which can improve the operation efficiency of the terminal. The technical solutions are as follows:
in one aspect, an audio playing method is provided, and the method includes:
playing a first audio while displaying a playing interface of the first audio;
detecting a first sliding operation on the playing interface;
if the sliding direction of the first sliding operation is the target direction, the content in the playing interface is switched to a video picture in a first background video while the first audio is continuously played, wherein the first background video is a background video corresponding to the first audio.
In one possible implementation manner, the switching the content in the playing interface to a video frame in a first background video while continuing to play the first audio includes:
displaying a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while continuing to play the first audio;
and if the sliding distance of the video cover in the playing interface reaches a reference distance, switching the content in the playing interface into the video picture.
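The cover-slide switching described above can be sketched as simple threshold logic. This is a minimal illustration: the function name, the field layout, and the value of the reference distance are assumptions, not anything fixed by the disclosure.

```python
# Minimal sketch of the cover-slide switching logic: while the video
# cover of the first background video slides in from the edge, the
# interface switches to the video picture once the sliding distance
# reaches a reference distance. The 0.5 fraction is an assumed value.

REFERENCE_DISTANCE = 0.5  # fraction of the interface width (assumed)

def handle_cover_slide(slide_distance: float, interface_width: float) -> str:
    """Return what the playing interface should show for a given
    sliding distance of the video cover."""
    fraction = slide_distance / interface_width
    if fraction >= REFERENCE_DISTANCE:
        # Reference distance reached: switch the interface content to
        # the video picture (the first audio keeps playing throughout).
        return "video_picture"
    # Otherwise keep showing the slide-out dynamic effect of the cover.
    return "cover_animation"
```

A usage example: a 600-pixel slide on a 1000-pixel-wide interface crosses the assumed 0.5 threshold and triggers the switch, while a 100-pixel slide only shows the animation.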
In another possible implementation manner, the playing interface includes text information, and the text information is displayed on an upper layer of the playing interface in the process of displaying the dynamic effect.
In another possible implementation manner, the switching the content in the play interface to the video frame if the sliding distance of the video cover in the play interface reaches the reference distance includes:
and if the sliding distance of the video cover in the playing interface reaches the reference distance, the video picture is played in the playing interface in a full screen mode, or the video picture is played in a picture playing area in the playing interface.
In another possible implementation manner, the switching the content in the playing interface to a video frame in a first background video while continuing to play the first audio includes:
acquiring the playing progress of the first audio when the first sliding operation is detected;
determining an initial playing point of the video picture according to the playing progress;
and switching the content in the playing interface to the video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
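The progress-synchronised start point in the steps above can be sketched as follows. Wrapping with modulo when the background video is shorter than the audio is an assumed policy; the disclosure only states that the initial playing point is determined according to the playing progress.

```python
def initial_play_point(audio_progress_s: float, video_duration_s: float) -> float:
    """Determine where the background video starts playing so that it
    matches the playing progress of the first audio at the moment the
    first sliding operation is detected.

    Wrapping with modulo when the video is shorter than the audio is
    an assumed policy, not specified by the disclosure."""
    if video_duration_s <= 0:
        return 0.0
    return audio_progress_s % video_duration_s
```

For example, 75 seconds into the audio with a 60-second background video, the assumed wrap-around starts the video picture at the 15-second mark.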
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and when the playing duration of the video picture reaches the reference duration, adjusting the text information in the playing interface from the current position to the target position.
In another possible implementation manner, after the text information in the playing interface is adjusted from the current position to the target position, the method further includes:
and when the display restoring operation on the playing interface is detected, restoring the text information to the current position.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and hiding the target interface control in the playing interface when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, after hiding the target interface control in the play interface, the method further includes:
and when the display resuming operation on the playing interface is detected, resuming to display the target interface control.
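The hide-and-restore behaviour of the target interface control can be sketched as a pure function of the video picture's play time. The reference duration value and the function signature are assumptions for illustration.

```python
REFERENCE_DURATION_S = 5.0  # assumed value; the disclosure fixes no number

def target_control_visible(video_play_time_s: float,
                           resume_display_requested: bool = False) -> bool:
    """Hide the target interface control once the video picture has
    played for the reference duration; a display resuming operation on
    the playing interface makes the control visible again."""
    if resume_display_requested:
        return True
    return video_play_time_s < REFERENCE_DURATION_S
```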
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and when the playing duration of the video picture reaches the reference duration, performing blurring processing on the areas in the playing interface other than the picture playing area of the first background video.
In another possible implementation manner, after performing the blurring processing on the areas in the playing interface other than the picture playing area of the first background video, the method further includes:
and when the display restoring operation on the playing interface is detected, restoring the display of those areas to their state before the blurring processing.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
when the audio switching operation on the playing interface is detected, playing a second audio according to the audio switching operation, and playing a video picture in a second background video in the playing interface, wherein the second background video is a background video corresponding to the second audio.
In another possible implementation manner, before the switching the content in the playing interface to the video frame in the first background video, the method further includes:
determining publishing user identifiers of a plurality of background videos corresponding to the first audio;
and if the plurality of publishing user identifiers include the currently logged-in user identifier, determining the background video corresponding to the currently logged-in user identifier as the first background video.
In another possible implementation manner, the method further includes:
and if the plurality of publishing user identifiers do not include the currently logged-in user identifier, selecting the first background video from the plurality of background videos according to the heat values of the plurality of background videos.
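One way to read this selection rule as code is sketched below. The field names `publisher_id` and `heat` are illustrative, and the exact definition of a video's heat value is left open by the disclosure.

```python
def select_first_background_video(videos: list, current_user_id: str) -> dict:
    """Select the first background video for the first audio.

    Prefer a background video published by the currently logged-in
    user; otherwise fall back to the video with the highest heat
    value. `videos` is a list of dicts with 'publisher_id' and 'heat'
    keys (illustrative field names)."""
    for video in videos:
        if video["publisher_id"] == current_user_id:
            return video
    return max(videos, key=lambda v: v["heat"])
```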
In another possible implementation manner, if the sliding direction of the first sliding operation is a target direction, switching the content in the playing interface to a video picture in a first background video while continuing to play the first audio includes:
if the sliding direction of the first sliding operation is a target direction, sending a background video acquisition request for the first audio to a server, wherein the server is used for inquiring the first background video corresponding to the first audio;
receiving the first background video sent by the server;
and switching the content in the playing interface to the video picture while continuing to play the first audio.
In another possible implementation manner, the server is further configured to receive a publishing request sent by any terminal, where the publishing request carries the identifier of the first audio and the first background video; and to store the first background video in correspondence with the identifier of the first audio, wherein the first background video is obtained by the terminal through editing in a background video setting interface of the first audio.
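The publish-and-query exchange between terminal and server described above can be sketched with a toy in-memory store. The class and method names are illustrative assumptions, and the dict stands in for whatever storage the server actually uses.

```python
class BackgroundVideoStore:
    """Toy server-side store mapping an audio identifier to its
    published background videos, mirroring the publishing-request flow
    described above."""

    def __init__(self) -> None:
        self._by_audio_id: dict = {}

    def publish(self, audio_id: str, video: dict) -> None:
        # Store the background video in correspondence with the
        # identifier of the first audio carried in the request.
        self._by_audio_id.setdefault(audio_id, []).append(video)

    def query(self, audio_id: str) -> list:
        # Answer a background video acquisition request for the audio;
        # an audio with no published videos yields an empty list.
        return self._by_audio_id.get(audio_id, [])
```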
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
detecting a second sliding operation on the playing interface;
and if the sliding direction of the second sliding operation is opposite to the target direction, switching the video picture into the original content of the playing interface.
In another possible implementation manner, the switching the content in the play interface to a video frame in a first background video while continuing to play the first audio includes:
and switching the background picture into the video picture while continuing to play the first audio.
In another aspect, an audio playing apparatus is provided, the apparatus including:
the first playing module is configured to play a first audio while displaying a playing interface of the first audio;
an operation detection module configured to detect a first sliding operation on the play interface;
and the second playing module is configured to switch the content in the playing interface to a video picture in a first background video while continuing to play the first audio if the sliding direction of the first sliding operation is the target direction, wherein the first background video is a background video corresponding to the first audio.
In a possible implementation manner, the second playing module includes:
the effect display sub-module is configured to display a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while the first audio is continuously played;
and the picture playing submodule is configured to switch the content in the playing interface into the video picture if the sliding distance of the video cover in the playing interface reaches a reference distance.
In another possible implementation manner, the playing interface includes text information, and the effect display sub-module is configured to display the text information on an upper layer of the playing interface in the process of displaying the dynamic effect.
In another possible implementation manner, the picture playing sub-module is configured to play the video picture in a full screen manner in the playing interface or play the video picture in a picture playing area in the playing interface if the sliding distance of the video cover in the playing interface reaches the reference distance.
In another possible implementation manner, the second playing module is configured to obtain a playing progress of the first audio when the first sliding operation is detected; determining an initial playing point of the video picture according to the playing progress; and switching the content in the playing interface to the video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
In another possible implementation manner, the apparatus further includes:
and the position adjusting module is configured to adjust the text information in the playing interface from the current position to the target position when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, the position adjusting module is further configured to restore the text information to the current position when a display restoring operation on the play interface is detected.
In another possible implementation manner, the apparatus further includes a control hiding module configured to hide a target interface control in the playing interface when the playing duration of the video picture reaches a reference duration.
In another possible implementation manner, the control hiding module is further configured to resume displaying the target interface control when a resume display operation on the play interface is detected.
In another possible implementation manner, the apparatus further includes:
and the blurring processing module is configured to perform blurring processing on the areas in the playing interface other than the picture playing area of the first background video when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, the blurring processing module is further configured to, when a display resuming operation on the playing interface is detected, restore the display of those areas to their state before the blurring processing.
In another possible implementation manner, the apparatus further includes:
the third playing module is configured to play a second audio according to the audio switching operation when the audio switching operation on the playing interface is detected, and play a video picture in a second background video in the playing interface, wherein the second background video is a background video corresponding to the second audio.
In another possible implementation manner, the apparatus further includes:
an identification determination module configured to determine a publishing user identification of a plurality of background videos corresponding to the first audio;
the video determining module is configured to determine a background video corresponding to the currently logged-in user identifier as the first background video if the plurality of published user identifiers include the currently logged-in user identifier.
In another possible implementation manner, the video determining module is further configured to select the first background video from the plurality of background videos according to the heat values of the plurality of background videos if the plurality of publishing user identifiers do not include the currently logged-in user identifier.
In another possible implementation manner, the second playing module is configured to send a background video obtaining request for the first audio to a server if the sliding direction of the first sliding operation is a target direction, where the server is configured to query the first background video corresponding to the first audio; receiving the first background video sent by the server; and switching the content in the playing interface to the video picture while continuing to play the first audio.
In another possible implementation manner, the server is further configured to receive a publishing request sent by any terminal, where the publishing request carries the identifier of the first audio and the first background video; and to store the first background video in correspondence with the identifier of the first audio, wherein the first background video is obtained by the terminal through editing in a background video setting interface of the first audio.
In another possible implementation manner, the apparatus further includes:
an interface display module configured to detect a second sliding operation on the play interface; and if the sliding direction of the second sliding operation is opposite to the target direction, switching the video picture into the original content of the playing interface.
In another possible implementation manner, the content in the playing interface includes a background picture of the first audio, and the second playing module is configured to switch the background picture to the video picture while continuing to play the first audio.
In another aspect, a terminal is provided, where the terminal includes a processor and a memory, where the memory stores at least one program code, and the program code is loaded by the processor and executed to implement the operations executed in the audio playing method in any one of the above possible implementations.
In another aspect, a computer-readable storage medium is provided, in which at least one program code is stored, and the program code is loaded and executed by a processor to implement the operations performed in the audio playing method in any one of the above possible implementation manners.
In yet another aspect, a computer program product is provided, where the computer program product includes at least one program code, and the program code is loaded and executed by a processor to implement the operations performed in the audio playing method in any one of the above possible implementation manners.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
in the embodiment of the application, if the sliding operation in the target direction is detected, the content in the playing interface is switched to the video picture in the first background video corresponding to the first audio while the first audio continues to be played, so that the user can control the terminal to play the video picture corresponding to the audio through the sliding operation, and thus, the operation efficiency can be effectively improved. Moreover, the playing of the audio is not affected by the playing of the video picture, so that the user can watch the video picture corresponding to the audio under the condition that the listening of the user to the audio is not affected, and the playing effect is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
fig. 2 is a flowchart of an audio playing method provided in an embodiment of the present application;
fig. 3 is a flowchart of an audio playing method provided in an embodiment of the present application;
fig. 4 is a schematic view of a playing interface provided in an embodiment of the present application;
fig. 5 is a schematic view of a playing interface provided in an embodiment of the present application;
fig. 6 is a block diagram of an audio playing apparatus provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It will be understood that, as used herein, the term "plurality" refers to two or more, "each" refers to each of the corresponding plurality, and "any" refers to any one of the corresponding plurality. For example, if the plurality of background videos includes 10 background videos, each background video refers to each of the 10 background videos, and any background video refers to any one of the 10 background videos.
Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102. The terminal 101 and the server 102 are connected through a wireless or wired network, and the terminal 101 is installed with a target application served by the server 102, through which the terminal 101 can implement functions such as data transmission, message interaction, and the like.
Optionally, the terminal 101 is a computer, a mobile phone, a tablet computer, or other terminal. The target application is any application installed on the terminal 101, and optionally, the target application is an application in an operating system of the terminal 101 or an application provided by a third party. For example, the target application is a music application having an audio playing function and a video playing function, and optionally, the music application further has other functions, such as a social function, a game function, a function of recording music or video, and the like. Optionally, the server 102 is a background server corresponding to the target application, or the server 102 is a cloud server providing services such as cloud computing or cloud storage, which is not limited in this application.
In the embodiment of the present application, the terminal 101 is configured to play audio and play background video of the audio, and the server 102 is configured to store and send the background video of the audio to the terminal. The audio playing method in the present application is applicable to any audio playing scene, and the present application is not limited thereto.
Fig. 2 is a flowchart of an audio playing method according to an embodiment of the present application. Referring to fig. 2, the main execution body of the embodiment is a terminal, and the embodiment includes:
step 201: and playing the first audio while displaying the playing interface of the first audio.
Step 202: a first sliding operation on a playing interface is detected.
Step 203: if the sliding direction of the first sliding operation is the target direction, the content in the playing interface is switched to the video picture in the first background video while the first audio is continuously played, and the first background video is the background video corresponding to the first audio.
In the embodiment of the application, if a sliding operation in the target direction is detected, the content in the playing interface is switched to the video picture in the first background video corresponding to the first audio while the first audio continues to be played. The user can thus control the terminal to play the video picture corresponding to the audio through a sliding operation, which effectively improves operation efficiency. Moreover, playing the video picture does not affect the playing of the audio, so the user can watch the video picture corresponding to the audio without interrupting listening, which improves the playing effect.
In one possible implementation manner, switching the content in the playing interface to a video frame in the first background video while continuing to play the first audio includes:
displaying a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while continuing to play the first audio;
and if the sliding distance of the video cover in the playing interface reaches the reference distance, switching the content in the playing interface into a video picture.
In another possible implementation manner, the playing interface includes text information, and the text information is displayed on an upper layer of the playing interface in the process of displaying the dynamic effect.
In another possible implementation manner, if the sliding distance of the video cover in the playing interface reaches the reference distance, switching the content in the playing interface to a video frame includes:
and if the sliding distance of the video cover in the playing interface reaches the reference distance, playing the video picture in the full screen mode in the playing interface or playing the video picture in a picture playing area in the playing interface.
In another possible implementation manner, switching the content in the playing interface to a video frame in the first background video while continuing to play the first audio includes:
acquiring the playing progress of a first audio when the first sliding operation is detected;
determining an initial playing point of a video picture according to the playing progress;
and switching the content in the playing interface to a video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and when the playing duration of the video picture reaches the reference duration, adjusting the text information in the playing interface from the current position to the target position.
In another possible implementation manner, after the text information in the playing interface is adjusted from the current position to the target position, the method further includes:
and when the display restoring operation on the playing interface is detected, restoring the text information to the current position.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and hiding the target interface control in the playing interface when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, after hiding the target interface control in the play interface, the method further includes:
and when the display resuming operation on the playing interface is detected, resuming to display the target interface control.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and when the playing duration of the video picture reaches the reference duration, performing fuzzy processing on other areas except the picture playing area of the first background video in the playing interface.
In another possible implementation manner, after blurring processing is performed on other areas in the playing interface except for the picture playing area of the first background video, the method further includes:
and when the display restoring operation on the playing interface is detected, restoring to display other areas before the fuzzy processing.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
and when the audio switching operation on the playing interface is detected, playing a second audio according to the audio switching operation, and playing a video picture in a second background video in the playing interface, wherein the second background video is a background video corresponding to the second audio.
In another possible implementation manner, before switching the content in the playing interface to the video frame in the first background video, the method further includes:
determining the publishing user identifiers of a plurality of background videos corresponding to the first audio;
and if the plurality of publishing user identifiers include the currently logged-in user identifier, determining the background video corresponding to the currently logged-in user identifier as the first background video.
In another possible implementation manner, the method further includes:
and if the plurality of publishing user identifiers do not include the currently logged-in user identifier, selecting the first background video from the plurality of background videos according to the heat values of the plurality of background videos.
In another possible implementation manner, if the sliding direction of the first sliding operation is the target direction, switching the content in the playing interface to the video frame in the first background video while continuing to play the first audio includes:
if the sliding direction of the first sliding operation is the target direction, sending a background video acquisition request for the first audio to a server, wherein the server is used for inquiring a first background video corresponding to the first audio;
receiving a first background video sent by a server;
and switching the content in the playing interface into a video picture while continuing to play the first audio.
In another possible implementation manner, the server is further configured to receive a publishing request sent by any terminal, where the publishing request carries the identifier of the first audio and the first background video; and correspondingly storing the identifier of the first audio and the first background video, wherein the first background video is obtained by editing any terminal in a background video setting interface of the first audio.
In another possible implementation manner, after the content in the playing interface is switched to the video frame in the first background video, the method further includes:
detecting a second sliding operation on the playing interface;
and if the sliding direction of the second sliding operation is opposite to the target direction, switching the video picture into the original content of the playing interface.
In another possible implementation manner, the content in the playing interface includes a background picture of the first audio, and switching the content in the playing interface to a video picture in the first background video while continuing to play the first audio includes:
and switching the background picture into a video picture while continuing to play the first audio.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 3 is a flowchart of an audio playing method according to an embodiment of the present application. Referring to fig. 3, the embodiment includes:
step 301: the terminal plays the first audio while displaying a playing interface of the first audio.
In the embodiment of the application, the terminal has an audio playing function, a user can trigger the playing operation of any audio on the terminal, and the terminal plays the audio when detecting the playing operation of any audio.
Optionally, each audio corresponds to a playing interface, and the content in the playing interface includes text information related to the audio, such as the name, author, lyrics, and the like of the audio. Optionally, the content in the playing interface further includes picture information related to the audio, such as album cover, photo of the author, and the like. Optionally, the content in the playing interface further includes various interface controls, for example, a pause control, a download control, a collection control, a sharing control, and the like, which is not limited in the present application.
When the terminal plays the audio, if the user wants to check the information of the audio, the display operation of the playing interface is performed on the terminal, and correspondingly, when the terminal detects the display operation of the playing interface of the audio, the playing interface of the audio is displayed.
The first audio may include any type of audio, such as songs, pure music, drama, vocals, commentary, etc., and the present application is not limited thereto.
Step 302: the terminal detects a first sliding operation on the playing interface, and if the sliding direction of the first sliding operation is the target direction, sends a background video obtaining request for the first audio to the server, and the server is used for inquiring a first background video corresponding to the first audio.
The background video may be a landscape video or a portrait video. Optionally, the background video is a video formed by splicing short videos, where the short videos used for splicing are original videos recorded by the user or film and television clips captured by the user; alternatively, the background video is another video, which is not limited in this application. The first background video is any one of the background videos corresponding to the first audio, where a background video corresponding to the first audio is a background video edited for the first audio by a user.
In the embodiment of the application, the audio playing interface has a function of synchronously playing the background video corresponding to the audio, and when the audio is played in the audio playing interface, if a user wants to watch the background video corresponding to the audio, the user directly performs a first sliding operation in a target direction on the playing interface, so that the terminal can be controlled to play the background video on the playing interface. Correspondingly, when the terminal detects a first sliding operation on the playing interface and the sliding direction of the first sliding operation is the target direction, a background video acquisition request is sent to the server to acquire a background video corresponding to the audio.
In one possible implementation, the sliding direction of the first sliding operation being the target direction means that the angle between the sliding direction of the first sliding operation and a reference direction is smaller than a reference angle. Accordingly, step 302 includes: the terminal detects a first sliding operation on the playing interface, obtains the angle between the sliding direction of the first sliding operation and the reference direction, and sends a background video acquisition request for the first audio to the server when the angle is smaller than the reference angle. That is, the target direction is any direction whose angle with the reference direction is smaller than the reference angle. Optionally, the reference direction is an arbitrary direction, for example, the leftward or rightward direction when the terminal is held in portrait orientation, and the reference angle is an arbitrary angle, for example, 10 degrees, which is not limited in this application.
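The angle test described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the leftward reference direction and the 10-degree reference angle are only the example values given in this paragraph.

```python
import math

def is_target_direction(dx: float, dy: float,
                        ref_dx: float = -1.0, ref_dy: float = 0.0,
                        ref_angle_deg: float = 10.0) -> bool:
    """Return True when the angle between the swipe vector (dx, dy) and the
    reference direction (ref_dx, ref_dy) is smaller than ref_angle_deg."""
    swipe_len = math.hypot(dx, dy)
    ref_len = math.hypot(ref_dx, ref_dy)
    if swipe_len == 0.0 or ref_len == 0.0:
        return False  # no movement: not a valid sliding operation
    cos_a = (dx * ref_dx + dy * ref_dy) / (swipe_len * ref_len)
    angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle_deg < ref_angle_deg
```

A nearly horizontal leftward swipe passes the test, while a vertical or rightward swipe does not, so only the intended gesture triggers the background video request.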
In a possible implementation manner, an implementation manner in which the terminal sends the background video acquisition request for the first audio to the server is as follows: and the terminal sends a background video acquisition request carrying the identifier of the first audio to the server. Correspondingly, the server receives the background video acquisition request, and inquires a first background video corresponding to the first audio according to the identifier of the first audio.
In this embodiment of the present application, a plurality of audio identifiers are stored in the server, each audio identifier corresponds to at least one background video, and optionally, the at least one background video is obtained by editing by each terminal and is published in the server.
In a possible implementation manner, the server querying the first background video corresponding to the first audio according to the identifier of the first audio includes: the server acquires a plurality of background videos corresponding to the first audio according to the identifier of the first audio and determines the publishing user identifiers of the plurality of background videos; if the plurality of publishing user identifiers include the currently logged-in user identifier, the background video corresponding to the currently logged-in user identifier is determined as the first background video; if the plurality of publishing user identifiers do not include the currently logged-in user identifier, the first background video is selected from the plurality of background videos according to the heat values of the plurality of background videos. A publishing user identifier represents the user's identity; optionally, it is the user's account number or mobile phone number, which is not limited in this application.
Before selecting a first background video from a plurality of background videos according to the heat values of the plurality of background videos, determining the heat value of the background video, wherein the implementation mode is as follows: for any background video in the plurality of background videos, the server obtains at least one of the complete play rate, click rate, collection times or sharing times of the background video, and determines the heat value of the background video according to the at least one of the complete play rate, click rate, collection times or sharing times of the background video. Or, the heat value of the background video is determined according to other parameters of the background video, which is not limited in this application.
Optionally, the implementation manner of determining the heat value of the background video according to at least one of the complete play rate, the click rate, the collection times or the sharing times of the background video is as follows: and the server performs weighting processing on at least one of the complete playing rate, the click rate, the collection times or the sharing times of the background video to obtain the heat value of the background video.
Optionally, the selecting the first background video from the plurality of background videos according to the heat value of the plurality of background videos is implemented by: the server selects any background video with the heat value larger than the reference threshold value from the plurality of background videos as a first background video. Alternatively, the server selects a background video with the highest heat value from the plurality of background videos as the first background video. This is not limited by the present application.
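The selection logic of the preceding paragraphs can be sketched as follows. The field names and weights are assumptions for illustration only; the patent states merely that at least one of the four metrics is weighted to obtain the heat value.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class BackgroundVideo:
    publisher_id: str
    complete_play_rate: float   # fraction of plays that ran to the end
    click_rate: float
    collection_count: int
    share_count: int

def heat_value(v: BackgroundVideo,
               weights=(0.4, 0.3, 0.2, 0.1)) -> float:
    """Weighted combination of the four metrics named in the patent
    (weights are hypothetical)."""
    w1, w2, w3, w4 = weights
    return (w1 * v.complete_play_rate + w2 * v.click_rate
            + w3 * v.collection_count + w4 * v.share_count)

def select_first_background_video(videos: Sequence[BackgroundVideo],
                                  current_user_id: str) -> Optional[BackgroundVideo]:
    """Prefer a video published by the logged-in user; otherwise fall back
    to the video with the highest heat value."""
    for v in videos:
        if v.publisher_id == current_user_id:
            return v
    return max(videos, key=heat_value, default=None)
```

This mirrors the two-stage rule above: the user's own edit always wins, and popularity decides only when the user has published nothing for this audio.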
In the embodiment of the application, the background video corresponding to the currently logged-in user identifier is acquired preferentially, so that the user preferentially watches the background video he or she edited for the audio; since users generally prefer background videos they edited themselves, this improves user stickiness. In addition, when no background video corresponds to the currently logged-in user identifier, a background video is selected according to the heat value. Because the heat value indicates the popularity of a background video, the background video selected according to the heat value is the one the user is most likely to enjoy, which also improves user stickiness.
In the embodiment of the present application, the server is further configured to receive a publishing request of any terminal. In a possible implementation manner, before querying a first background video corresponding to a first audio, a server receives a publishing request sent by any terminal, where the publishing request carries an identifier of the first audio and the first background video, and the server correspondingly stores the identifier of the first audio and the first background video, where the first background video is obtained by editing any terminal in a background video setting interface of the first audio. Optionally, the editing includes various processing operations such as splicing, intercepting, and adding a special effect, which is not limited in the present application.
In the embodiment of the application, a user can edit the background video corresponding to the audio through the background video setting interface of the audio. Optionally, the background video setting interface of the audio includes a plurality of short videos, and the user can select a plurality of favorite short videos from the short videos, and the short videos are spliced through the background video setting interface to obtain a background video corresponding to the audio. Optionally, the background video setting interface includes a video recording button, and a user can enter the video recording interface through the video recording button and record the background video through the video recording interface. Optionally, the background video setting interface includes a local video selection button, and the user can select a plurality of favorite short videos from the locally stored short videos through the local video selection button, and splice the plurality of short videos through the background video setting interface to obtain the background video. After that, the user can distribute the edited background video in the server.
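A minimal sketch of the server-side publish-and-query mapping described above; the class and method names are assumptions, not taken from the patent.

```python
from collections import defaultdict

class BackgroundVideoStore:
    """Stores each published background video under its audio identifier,
    so the server can answer background video acquisition requests."""

    def __init__(self) -> None:
        self._videos_by_audio = defaultdict(list)

    def publish(self, audio_id: str, background_video: str,
                publisher_id: str) -> None:
        # A publishing request carries the audio identifier and the
        # edited background video; store them correspondingly.
        self._videos_by_audio[audio_id].append((publisher_id, background_video))

    def query(self, audio_id: str) -> list:
        # Returns every (publisher_id, video) pair published for this audio.
        return list(self._videos_by_audio[audio_id])
```

In this sketch one audio identifier accumulates many published videos, matching the description that each audio identifier corresponds to at least one background video.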
In a possible implementation manner, the terminal directly obtains the first background video corresponding to the first audio from local storage. Optionally, the first background video was obtained from the server and stored in the terminal when the terminal obtained the first audio from the server.
Step 303: the terminal receives the first background video sent by the server, and switches the content in the playing interface into a video picture in the first background video while continuing to play the first audio.
The terminal continues to play the first audio, which means that the terminal continues to play the first audio on the premise of keeping the playing progress of the first audio.
Optionally, any content in the playing interface is switched to the video screen, for example, album covers in the playing interface are switched to the video screen, or photos of an author in the playing interface are switched to the video screen, which is not limited in this application.
In a possible implementation manner, the step of switching the content in the playing interface to the video picture in the first background video while the terminal continues to play the first audio includes: the terminal displays a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while continuously playing the first audio; and if the sliding distance of the video cover in the playing interface reaches the reference distance, the terminal switches the content in the playing interface into a video picture.
Optionally, the video cover is any frame of the first background video, for example, the second frame of the first background video, which is not limited in this application. Likewise, the reference distance is an arbitrary value, for example, 1/4 of the length of the bottom edge of the playing interface, which is not limited in this application.
When a user performs a first sliding operation in a target direction on a playing interface, the terminal displays a dynamic effect that a video cover gradually slides from the edge of the playing interface to the playing interface according to the first sliding operation, in the process, the terminal detects the sliding distance of the video cover on the playing interface, and if the sliding distance reaches a reference distance, the terminal switches the content in the playing interface into a video picture.
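The distance threshold described above reduces to a simple predicate; the 1/4 fraction below is only the example value mentioned earlier, not a fixed requirement.

```python
def should_switch_to_video(slide_distance: float,
                           interface_width: float,
                           reference_fraction: float = 0.25) -> bool:
    """True once the video cover has slid at least the reference distance
    (here a fraction of the playing interface width) into the interface."""
    return slide_distance >= reference_fraction * interface_width
```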
Optionally, the terminal detects the sliding distance of the video cover on the playing interface as follows: the terminal measures the distance between a first cover edge of the video cover and a first interface edge of the playing interface as the sliding distance, where the first cover edge is the edge of the video cover that enters the playing interface first, and the first interface edge is the edge of the playing interface from which the video cover slides out.
In a possible implementation manner, the content in the playing interface includes a background picture of the first audio. Correspondingly, the terminal switching the content in the playing interface to the video picture in the first background video while continuing to play the first audio includes: the terminal switches the background picture to the video picture in the first background video while continuing to play the first audio. Optionally, this is implemented as follows: while continuing to play the first audio, the terminal displays a dynamic effect in which the video cover of the first background video slides out from the edge of the playing interface and gradually covers the background picture; if the sliding distance of the video cover in the playing interface reaches the reference distance, the terminal switches the background picture to the video picture. This improves the display effect of the playing interface when the background picture is switched to the video picture, which can improve user stickiness. Optionally, the background picture includes the album cover of the audio, a photo of the author, and the like, which is not limited in this application.
In a possible implementation manner, if a sliding distance of the video cover in the playing interface reaches a reference distance, the terminal switches the content in the playing interface to a video frame, including: and if the sliding distance of the video cover in the playing interface reaches the reference distance, the terminal plays the video picture in a full screen mode in the playing interface or in a picture playing area in the playing interface.
Optionally, the picture playing area is any area on the playing interface, for example, the picture playing area is a middle area of the playing interface, which is not limited in the present application. When the sliding distance of the video cover in the playing interface reaches the reference distance, the terminal automatically plays the video picture in the full screen mode in the playing interface or plays the video picture in the picture playing area in the playing interface, on one hand, misoperation of a user can be prevented, and on the other hand, the operation efficiency of the terminal is improved.
The terminal playing the video picture of the first background video in the playing interface covers two situations: either the first background video has only a picture and no audio, or the first background video has audio but that audio is not played during playback. In either case, playing the video picture of the first background video does not affect the playing of the first audio, which ensures the playing effect of the audio.
In another possible implementation manner, step 303 is implemented as follows: the terminal displays a dynamic effect in which the video cover of the first background video slides out from the edge of the playing interface while continuing to play the first audio; if the first cover edge of the video cover slides to a reference area in the playing interface, the terminal switches the content in the playing interface to the video picture in the first background video, where the first cover edge is the edge of the video cover that enters the playing interface first.
Optionally, the reference area is any area in the playing interface, for example, the 3/4 area near a second interface edge of the playing interface, where the second interface edge is the edge of the playing interface opposite to the first interface edge.
When the user performs a first sliding operation in the target direction on the playing interface, the terminal displays, according to the first sliding operation, a dynamic effect in which the video cover gradually slides from the edge of the playing interface into the playing interface. During this process, the terminal detects the area in which the first cover edge (the edge of the video cover that enters the playing interface first) is located; if that area is the reference area, the terminal switches the content in the playing interface to the video picture of the first background video.
In a possible implementation manner, the terminal displaying a dynamic effect in which the video cover of the first background video slides out from the edge of the playing interface includes: the terminal displays the dynamic effect according to at least one of the operation speed, operation distance, operation strength, operation duration, or operation direction of the first sliding operation. For example, when the terminal displays the dynamic effect according to the operation speed, the terminal determines the sliding speed of the video cover from the operation speed. For another example, when the terminal displays the dynamic effect according to the operation distance, the terminal determines the sliding distance of the video cover from the operation distance.
In one possible implementation, the playing interface includes text information, for example, the name, author, or lyrics of the first audio. During the display of the dynamic effect, the text information is displayed on the upper layer of the playing interface. Optionally, if the text information in the original playing interface is not on the upper layer of the playing interface, the terminal moves it to the upper layer when it starts displaying the dynamic effect, so that the text information is displayed normally while the video cover slides into the playing interface. The user's browsing of the text information is therefore not affected, which improves user stickiness.
Optionally, the display position of the lyrics in the original playing interface differs from their display position in the playing interface that plays the video picture; for example, the lyrics in the original playing interface are aligned left, while the lyrics in the playing interface that plays the video picture are centered. Correspondingly, while the terminal displays the dynamic effect in which the video cover of the first background video slides out from the edge of the playing interface, the lyrics are displayed on the upper layer of the playing interface, and in response to the content in the playing interface being switched to the video picture of the first background video, the terminal adjusts the lyrics in the playing interface from the initial position to the target position. Optionally, this adjustment is implemented as follows: the terminal displays a dynamic effect of the lyrics fading out at the original position and a dynamic effect of the lyrics fading in at the target position.
Optionally, the playing interface includes interface controls, such as a sharing control, a downloading control, a progress bar control, a comment control, a pause control, an audio switching control, a collection control, and a sound effect control, and optionally, the interface controls are displayed on an upper layer of the playing interface in the process of displaying the dynamic effect by the terminal. The implementation manner is the same as the manner of displaying the text information, and is not described herein again.
Referring to fig. 4, fig. 4 is an interface diagram of the terminal switching a background picture (an album cover) in a playing interface to a background video. It can be seen that the video cover of the background video has slid across 1/3 of the playing interface, and that the interface controls and text information in the playing interface are displayed on an upper layer of the playing interface: the collection control, download control, and sharing control at the upper right of the playing interface, as well as the audio name, author, and lyrics at the bottom of the playing interface, are all displayed on the upper layer of the interface and are not affected by the sliding operation.
In this embodiment of the application, because the terminal continues to play the first audio on the premise of maintaining the playing progress of the first audio, correspondingly, the terminal switches the content in the playing interface to the video picture in the first background video while continuing to play the first audio, which includes: the method comprises the steps that when a terminal detects a first sliding operation, the playing progress of a first audio is obtained; determining an initial playing point of a video picture according to the playing progress; and switching the content in the playing interface to a video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
In the embodiment of the application, since the total duration of the first background video is the same as the total duration of the first audio, the starting playing point of the video picture of the first background video is determined according to the playing progress of the first audio, which avoids the situation where the audio finishes playing before the video picture does. In addition, because the video picture of the first background video corresponds in time to the first audio (for example, the climax of the first audio corresponds to the climax of the video picture), determining the starting playing point of the video picture according to the playing progress of the first audio keeps the first audio and the video picture in step and ensures the playing effect of the video picture.
Optionally, the played time length of the first audio represents the playing progress of the first audio, and correspondingly, the terminal obtains the played time length of the first audio when the first sliding operation is detected; determining an initial playing point of the video picture according to the played time length; and switching the content in the playing interface to a video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point. For example, if the played time of the first audio is 10 seconds, the starting playing point of the video frame is 10 seconds.
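The progress mapping described above can be sketched as follows (an illustrative sketch only, not part of the claimed embodiment; the helper name and the seconds-based parameterization are assumptions, and the clamp is a defensive addition for durations that differ slightly):

```python
def video_start_point(audio_played_s: float, video_total_s: float) -> float:
    """Map the audio's played duration to the background video's starting
    play point. The text assumes the two total durations are equal, so the
    mapping is the identity; clamp defensively in case they differ."""
    return min(max(audio_played_s, 0.0), video_total_s)

# Per the example in the text: if the first audio has played for 10 seconds,
# the video picture starts from its 10-second mark.
```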
It should be noted that, after the content in the playing interface is switched to the video picture, if the playing interface includes a progress bar control, the progress bar control controls not only the playing progress of the first audio but also the playing progress of the video picture. In the process of playing the first audio and the video picture, if the user wants to browse a played video segment again, listen again to a played segment of the first audio, or skip the currently played video segment or audio segment, the user directly performs a sliding operation on the progress bar control. In this way, the video picture of the background video and the audio always keep consistent progress, which guarantees the playing effect of the audio and the background video.
In a possible implementation manner, after the terminal switches the content in the playing interface to the video picture, a publishing user identifier corresponding to the first background video is also displayed on the playing interface, and when a trigger operation on the publishing user identifier is detected, a detail page of the publishing user identifier is displayed, where the detail page includes at least one background video corresponding to the publishing user identifier. Therefore, if the user likes the video picture of the currently played background video, the user can enter the detail page of the publishing user identifier to browse more background videos edited by that author, which makes it convenient for the user to find favorite background videos and improves user stickiness.
Step 304: and when the playing duration of the video picture reaches the reference duration, the terminal adjusts the text information in the playing interface from the current position to the target position.
In the embodiment of the application, after the terminal switches the content in the playing interface to the video picture of the first background video, the terminal waits for a period of time and then automatically enters the pure mode. The playing interface in the pure mode is simplified, for example, the number of interface controls displayed in the playing interface is reduced, the display position of the text information in the playing interface is adjusted, and so on, so that occlusion of or interference with the video picture is reduced, which facilitates immersive browsing by the user and improves the playing effect of the video picture.
Step 304 is a possible implementation manner of the terminal entering the pure mode, in which the occlusion of the video picture when the text information is at the target position is smaller than when the text information is at the current position. For example, the current position of the text information is in the upper middle of the playing interface, and the target position is at the lower left of the playing interface. Optionally, the area of the video picture occluded by the text information at the target position is smaller than at the current position; for example, when the video picture is a landscape video played on a portrait screen, the video picture does not fill the playing interface, and if the current position of the text information is a position that occludes the video picture, the target position is a position that does not. Optionally, the reference duration is set to any duration as needed, for example 10 seconds, which is not limited in this application.
Optionally, the implementation manner of the terminal entering the pure mode includes: when the playing duration of the video picture reaches the reference duration, the terminal hides a target interface control in the playing interface. Optionally, the target interface control is hidden in either of two ways: the terminal adjusts the transparency of the target interface control in the playing interface to a reference value, for example 100% or another value, which is not limited in this application; or the terminal removes the target interface control from the playing interface, that is, the playing interface of the terminal no longer includes the target interface control.
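The two hiding strategies can be sketched as follows (an illustrative sketch, assuming a hypothetical interface model in which each control carries a transparency value from 0, fully opaque, to 100, fully transparent; the class and method names are not from the original):

```python
class PlayingInterface:
    """Hypothetical model of the playing interface's controls."""

    def __init__(self, controls):
        # control name -> transparency (0 = opaque, 100 = invisible)
        self.controls = {name: 0 for name in controls}

    def enter_pure_mode(self, targets, remove=False):
        for name in targets:
            if remove:
                # Strategy 2: remove the target control outright.
                self.controls.pop(name, None)
            else:
                # Strategy 1: set its transparency to the reference value.
                self.controls[name] = 100

ui = PlayingInterface(["collect", "download", "comment", "sound_effect"])
ui.enter_pure_mode(["collect", "download"])
```

With strategy 1 the hidden controls still exist in the interface (only invisible), so restoring display on exit from the pure mode amounts to resetting their transparency.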
Optionally, the target interface controls are set to be any controls as needed, and the number of the target interface controls is any number, for example, the target interface controls include a collection control, a download control, a comment control, a sound effect control, and the like, which is not limited in the present application.
Optionally, the implementation manner of the terminal entering the pure mode includes: when the playing duration of the video picture reaches the reference duration, the terminal blurs the areas of the playing interface other than the picture playing area of the first background video.
Optionally, the picture playing area of the first background video is smaller than the playing interface for one of two reasons. First, the format of the background video: for example, the playing interface is a portrait interface and the first background video is a landscape video, so the picture playing area is smaller than the playing interface. Second, the picture playing area set by the terminal is smaller than the playing interface.
When the picture playing area of the first background video is smaller than the playing interface, blurring the areas of the playing interface outside the picture playing area highlights the video picture of the currently played background video and reduces the interference of the other areas with the video picture, thereby improving the playing effect of the video picture.
Optionally, an implementation manner of the terminal blurring the areas of the playing interface other than the picture playing area of the first background video includes: the terminal scales the video picture of the first background video while preserving its aspect ratio so that the video picture fills the playing interface, blurs the video picture that fills the playing interface, and plays the video picture of the first background video on an upper layer of the blurred picture, where the picture playing area of the upper-layer first background video is smaller than the playing interface. In this way, the blurred picture in the other areas stays synchronized with the video picture in the picture playing area, which improves the interface display effect.
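The aspect-ratio-preserving scaling of the blurred bottom layer can be sketched as follows (an illustrative sketch; the function name and pixel-based parameterization are assumptions):

```python
def scale_to_fill(video_w: int, video_h: int,
                  iface_w: int, iface_h: int) -> tuple:
    """Scale the video frame, preserving its aspect ratio, so that it
    fully covers the playing interface (the blurred bottom layer).
    The larger of the two per-axis ratios guarantees full coverage."""
    scale = max(iface_w / video_w, iface_h / video_h)
    return round(video_w * scale), round(video_h * scale)

# A 1920x1080 landscape video on a 1080x1920 portrait interface is scaled
# until its height fills the screen; the overflowing sides are blurred
# behind the smaller, sharp top-layer picture.
```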
Optionally, the condition for entering the pure mode includes: the playing duration of the video picture reaches the reference duration, and the terminal detects no trigger operation on the playing interface within the time range corresponding to that playing duration. This prevents the terminal from automatically entering the pure mode while the user is operating the playing interface, which improves user stickiness.
Another point to be noted is that the method is described only by taking the terminal automatically triggering entry into the pure mode as an example. Optionally, the terminal supports the user manually entering the pure mode; it suffices to replace the trigger condition "when the playing duration of the video picture reaches the reference duration" with "if the terminal detects a change display operation on the playing interface". Optionally, the change display operation is any operation, such as a click operation or a long-press operation, which is not limited in this application. This makes it convenient for the user to manually and quickly enter the pure mode.
Referring to fig. 5, fig. 5 is a schematic view of the playing interface when the terminal enters the pure mode: the playing area of the background video is located in the middle of the playing interface, the areas of the playing interface outside the picture playing area are blurred, and the audio name, author, lyrics, and the user identifier "watermelon" corresponding to the background video are displayed at the lower left of the playing interface.
It should be noted that in another embodiment, step 304 is not performed, or step 304 is performed in another manner.
Step 305: and when the display restoring operation on the playing interface is detected, the terminal restores the text information to the current position.
Optionally, after entering the pure mode, if the user wants to exit the pure mode, the user directly performs a display restoring operation on the playing interface. If the pure mode includes adjusting the position of the text information, the text information is restored to the current position when the display restoring operation on the playing interface is detected; if the pure mode includes hiding the target interface control, the terminal restores display of the target interface control when it detects the display restoring operation on the playing interface; and if the pure mode includes blurring the areas outside the picture playing area, the terminal restores display of those areas as they were before blurring when it detects the display restoring operation on the playing interface. The display restoring operation is a click operation or another operation, for example a long-press operation, which is not limited in this application.
It should be noted that, in another embodiment, step 305 is not performed, or step 305 is performed in another manner.
Step 306: and the terminal detects a second sliding operation on the playing interface, and if the sliding direction of the second sliding operation is opposite to the target direction, the video picture is switched to the original content of the playing interface.
In this application, if the user does not want the video picture to continue playing in the playing interface, the user directly performs, on the playing interface, a second sliding operation whose sliding direction is opposite to the target direction, so as to control the terminal to switch the video picture back to the original content of the playing interface. That is, the user can restore the previous playing interface with a single sliding operation, which effectively improves the operation efficiency of the terminal.
It should be noted that, this step is a process opposite to the above-mentioned process of switching the content in the play interface to the video picture in the first background video when the sliding operation in the target direction is detected, but the implementation manner is the same, and details are not described here.
In a possible implementation manner, after the terminal plays the video picture of the first background video in the playing interface, if the user does not like the video picture of the currently played first background video, a third sliding operation in a preset direction is directly performed on the playing interface to switch the background video in the currently played interface. Correspondingly, after the terminal plays the video picture of the first background video on the playing interface, detecting a third sliding operation on the playing interface, and when the sliding direction of the third sliding operation is a preset direction, acquiring a third background video corresponding to the first audio, and switching the video picture of the first background video into the video picture of the third background video.
The manner of acquiring the third background video by the terminal is the same as the manner of acquiring the first background video, and is not described herein again. Optionally, the implementation manner of switching the video picture of the first background video to the video picture of the third background video by the terminal is as follows: and the terminal displays the dynamic effect that the video picture of the first background video disappears from the playing interface in a sliding manner and the dynamic effect that the video cover of the third background video appears in a sliding manner in the playing interface, and if the sliding distance of the video cover of the third background video in the playing interface reaches a preset distance, the video picture in the third background video is played in the playing interface.
It should be noted that the third sliding operation in the preset direction is used to switch between the video pictures of the background videos corresponding to the audio in the playing interface, while the first sliding operation in the target direction is used either to play the video picture of the background video on a playing interface that has no background picture, or to switch the background picture in the playing interface to the video picture of the background video. The preset direction differs from the target direction: taking the target direction as leftward in portrait orientation as an example, the preset direction is upward or downward in portrait orientation, or any other direction different from the target direction. Optionally, the preset distance may be any value, such as 1/3 of the playing interface, which is not limited in this application.
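The slide-distance threshold that commits a background-video switch can be sketched as follows (an illustrative sketch; the function name and pixel units are assumptions, and the 1/3 default is the example value given in the text):

```python
def commits_switch(slide_distance_px: float, interface_width_px: float,
                   threshold: float = 1 / 3) -> bool:
    """Commit the switch to the incoming background video only once its
    cover has slid at least `threshold` of the playing interface."""
    return slide_distance_px >= interface_width_px * threshold
```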
In a possible implementation manner, when the terminal plays the video picture of the first background video in the playing interface, a catalog popup display button is displayed on the playing interface. When a trigger operation on the catalog popup display button is detected, a catalog popup is displayed in a preset area of the playing interface, where the catalog popup includes at least one background video corresponding to the first audio, and the terminal, in response to a trigger operation on any background video, switches the video picture of the currently played background video in the playing interface to the video picture of that background video. In this way, the user can freely select the played video picture, which improves user stickiness.
In a possible implementation manner, after the terminal switches the content in the playing interface to the video picture in the first background video, if the user switches the currently played audio, the video picture of the played background video is also switched automatically. Correspondingly, when the terminal detects an audio switching operation on the playing interface, the terminal plays the second audio according to the audio switching operation and plays the video picture in the second background video on the playing interface, where the second background video is the background video corresponding to the second audio. Therefore, the currently played audio and the currently played video picture are guaranteed to always match, which improves the playing effect of the audio and the video picture.
Before playing the video picture in the second background video, the terminal needs to acquire the second background video first, and the implementation manner of acquiring the second background video by the terminal is the same as the manner of acquiring the first background video, which is not described herein again.
It should be noted that in another embodiment, step 306 is not performed, or step 306 is performed in another manner.
Another point to be explained is that the video picture of the background video is played by adding a video playing function to the audio playing interface; that is, the background video is played directly on the audio playing interface without any interface jump.
In the embodiment of the application, if the sliding operation in the target direction is detected, the content in the playing interface is switched to the video picture in the first background video corresponding to the first audio while the first audio continues to be played, so that the user can control the terminal to play the video picture corresponding to the audio through the sliding operation, and thus, the operation efficiency can be effectively improved. Moreover, the playing of the audio is not affected by the playing of the video picture, so that the user can watch the video picture corresponding to the audio under the condition that the listening of the user to the audio is not affected, and the playing effect is improved.
Fig. 6 is a block diagram of an audio playing apparatus according to an embodiment of the present application. Referring to fig. 6, the apparatus includes:
a first playing module 601 configured to play the first audio while displaying a playing interface of the first audio;
an operation detection module 602 configured to detect a first sliding operation on the play interface;
the second playing module 603 is configured to, if the sliding direction of the first sliding operation is the target direction, switch the content in the playing interface to a video frame in the first background video while continuing to play the first audio, where the first background video is a background video corresponding to the first audio.
In a possible implementation manner, the second playing module 603 includes:
the effect display submodule is configured to display a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while the first audio is continuously played;
and the picture playing submodule is configured to switch the content in the playing interface into a video picture if the sliding distance of the video cover in the playing interface reaches the reference distance.
In another possible implementation manner, the playing interface includes text information, and the effect display sub-module is configured to display the text information on an upper layer of the playing interface in the process of displaying the dynamic effect.
In another possible implementation manner, the picture playing sub-module is configured to play the video picture in a full screen manner in the playing interface or in a picture playing area in the playing interface if the sliding distance of the video cover in the playing interface reaches the reference distance.
In another possible implementation manner, the second playing module 603 is configured to obtain a playing progress of the first audio when the first sliding operation is detected; determining an initial playing point of the first background video according to the playing progress; and switching the content in the playing interface to a video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
In another possible implementation manner, the apparatus further includes:
and the position adjusting module is configured to adjust the text information in the playing interface from the current position to the target position when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, the position adjusting module is further configured to restore the text information to the current position when a display restoring operation on the play interface is detected.
In another possible implementation manner, the control hiding module is configured to hide a target interface control in the playing interface when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, the control hiding module is further configured to resume displaying the target interface control when a resume displaying operation on the playing interface is detected.
In another possible implementation manner, the apparatus further includes:
and the fuzzy processing module is configured to perform fuzzy processing on other areas except the picture playing area of the first background video in the playing interface when the playing duration of the video picture reaches the reference duration.
In another possible implementation manner, the blurring processing module is further configured to, when a display resuming operation on the playing interface is detected, resume displaying other areas before the blurring processing.
In another possible implementation manner, the apparatus further includes:
and the third playing module is configured to play a second audio according to the audio switching operation when the audio switching operation on the playing interface is detected, and play a video picture in a second background video in the playing interface, wherein the second background video is a background video corresponding to the second audio.
In another possible implementation manner, the apparatus further includes:
the identification determining module is configured to determine the publishing user identification of a plurality of background videos corresponding to the first audio;
the video determining module is configured to determine a background video corresponding to the currently logged-in user identifier as a first background video if the plurality of published user identifiers include the currently logged-in user identifier.
In another possible implementation manner, the video determination module is further configured to select the first background video from the plurality of background videos according to a heat value of the plurality of background videos if the plurality of publishing user identifiers do not include the currently logged-in user identifier.
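The selection logic of the identification determining module and the video determination module can be sketched as follows (an illustrative sketch; the function name and the tuple representation of a background video as (publisher identifier, heat value, video identifier) are assumptions, and ties on the heat value are broken arbitrarily):

```python
def pick_background_video(videos: list, current_user_id: str):
    """Pick the first background video: prefer one published by the
    currently logged-in user; otherwise take the one with the highest
    heat value. `videos` holds (publisher_id, heat_value, video_id)."""
    own = [v for v in videos if v[0] == current_user_id]
    if own:
        return own[0][2]
    return max(videos, key=lambda v: v[1])[2] if videos else None
```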
In another possible implementation manner, the second playing module 603 is configured to send a background video obtaining request for the first audio to the server if the sliding direction of the first sliding operation is the target direction, where the server is used to query the first background video corresponding to the first audio; receiving a first background video sent by a server; and switching the content in the playing interface into a video picture while continuing to play the first audio.
In another possible implementation manner, the server is further configured to receive a publishing request sent by any terminal, where the publishing request carries the identifier of the first audio and the first background video; and correspondingly storing the first audio and the first background video according to the identifier of the first audio, wherein the first background video is obtained by editing any terminal in a background video setting interface of the first audio.
In another possible implementation manner, the apparatus further includes:
the interface display module is configured to detect a second sliding operation on the playing interface; and if the sliding direction of the second sliding operation is opposite to the target direction, switching the video picture into the original content of the playing interface.
In another possible implementation manner, the content in the playing interface includes a background picture of the first audio, and the second playing module is configured to switch the background picture to a video picture while continuing to play the first audio.
In the embodiment of the application, if the sliding operation in the target direction is detected, the content in the playing interface is switched to the video picture in the first background video corresponding to the first audio while the first audio continues to be played, so that the user can control the terminal to play the video picture corresponding to the audio through the sliding operation, and thus, the operation efficiency can be effectively improved. Moreover, the playing of the audio is not affected by the playing of the video picture, so that the user can watch the video picture corresponding to the audio under the condition that the listening of the user to the audio is not affected, and the playing effect is improved.
It should be noted that: in the audio playing apparatus provided in the foregoing embodiment, when playing audio, only the division into the above functional modules is illustrated as an example; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the terminal is divided into different functional modules to complete all or part of the functions described above. In addition, the audio playing apparatus and the audio playing method provided by the above embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not described herein again.
Fig. 7 shows a block diagram of a terminal 700 according to an exemplary embodiment of the present application. The terminal 700 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 700 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so on.
In general, terminal 700 includes: a processor 701 and a memory 702.
The processor 701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 701 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 701 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 702 may include one or more computer-readable storage media, which may be non-transitory. Memory 702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 702 is used to store at least one instruction for execution by processor 701 to implement an audio playback method provided by method embodiments herein.
In some embodiments, the terminal 700 may further optionally include: a peripheral interface 703 and at least one peripheral. The processor 701, the memory 702, and the peripheral interface 703 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 703 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 704, a display screen 705, a camera assembly 706, an audio circuit 707, a positioning component 708, and a power source 709.
The peripheral interface 703 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 701 and the memory 702. In some embodiments, processor 701, memory 702, and peripheral interface 703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 701, the memory 702, and the peripheral interface 703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 704 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 704 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 704 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 704 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 704 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 705 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 705 is a touch display screen, the display screen 705 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 701 as a control signal for processing. At this point, the display 705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 705, disposed on the front panel of the terminal 700; in other embodiments, there may be at least two displays 705, respectively disposed on different surfaces of the terminal 700 or in a folded design; in other embodiments, the display 705 may be a flexible display disposed on a curved surface or a folded surface of the terminal 700. The display 705 may even be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The display 705 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 706 is used to capture images or video. Optionally, camera assembly 706 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 706 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 707 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs the electrical signals to the processor 701 for processing, or to the radio frequency circuit 704 to realize voice communication. For stereo sound collection or noise reduction, multiple microphones may be provided at different portions of the terminal 700. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 701 or the radio frequency circuit 704 into sound waves. The speaker may be a conventional membrane speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 707 may also include a headphone jack.
The positioning component 708 is used to locate the current geographic position of the terminal 700 to implement navigation or LBS (Location Based Service). The positioning component 708 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 709 supplies power to the various components of the terminal 700. The power supply 709 may be an alternating current source, a direct current source, a disposable battery, or a rechargeable battery. When the power supply 709 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charging technology.
In some embodiments, terminal 700 also includes one or more sensors 710. The one or more sensors 710 include, but are not limited to: acceleration sensor 711, gyro sensor 712, pressure sensor 713, fingerprint sensor 714, optical sensor 715, and proximity sensor 716.
The acceleration sensor 711 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 700. For example, the acceleration sensor 711 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 701 may control the display screen 705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 711. The acceleration sensor 711 may also be used to collect motion data for games or user activity.
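As a non-normative illustration (not from the patent), the landscape/portrait decision just described can be reduced to comparing the gravity components on the device's x (short edge) and y (long edge) axes; the function name and tie-breaking rule here are assumptions:

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Choose a UI orientation from the gravity components (m/s^2)
    measured on the device's x (short edge) and y (long edge) axes."""
    # Gravity projecting mainly onto the long edge means the phone is upright.
    if abs(gy) >= abs(gx):
        return "portrait"
    return "landscape"

# Device held upright: gravity lies almost entirely on the y axis.
print(choose_orientation(0.5, 9.7))   # portrait
# Device turned on its side: gravity shifts to the x axis.
print(choose_orientation(9.7, 0.5))   # landscape
```

A real implementation would also debounce rapid flips near the 45° boundary; that is omitted here.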
The gyro sensor 712 may detect the body orientation and rotation angle of the terminal 700, and may cooperate with the acceleration sensor 711 to capture the user's 3D motion of the terminal 700. From the data collected by the gyro sensor 712, the processor 701 may implement functions such as motion sensing (e.g., changing the UI according to the user's tilting operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 713 may be disposed on a side frame of the terminal 700 and/or beneath the display screen 705. When the pressure sensor 713 is disposed on a side frame of the terminal 700, it can detect the user's grip signal on the terminal 700, and the processor 701 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 713. When the pressure sensor 713 is disposed beneath the display screen 705, the processor 701 controls the operable controls on the UI according to the user's pressure operation on the display screen 705. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 714 collects the user's fingerprint, and either the processor 701 or the fingerprint sensor 714 itself identifies the user from the collected fingerprint. When the user's identity is recognized as trusted, the processor 701 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 714 may be disposed on the front, back, or side of the terminal 700. When a physical button or vendor logo is provided on the terminal 700, the fingerprint sensor 714 may be integrated with the physical button or vendor logo.
The optical sensor 715 collects the ambient light intensity. In one embodiment, the processor 701 may control the display brightness of the display screen 705 based on the ambient light intensity collected by the optical sensor 715: when the ambient light intensity is high, the display brightness of the display screen 705 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 701 may also dynamically adjust the shooting parameters of the camera assembly 706 based on the ambient light intensity collected by the optical sensor 715.
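The brightness control described above can be sketched as a clamped linear mapping from ambient lux to a normalized brightness level; the 400-lux ramp and the 0.1–1.0 bounds are illustrative assumptions rather than values from the patent:

```python
def display_brightness(lux: float, min_b: float = 0.1, max_b: float = 1.0) -> float:
    """Map ambient light intensity (lux) to a brightness level in [min_b, max_b].

    Brightness rises linearly with ambient light and saturates at 400 lux,
    so a brighter environment always yields an equal or brighter screen.
    """
    level = min_b + (max_b - min_b) * min(lux, 400.0) / 400.0
    return round(level, 3)

print(display_brightness(0.0))     # 0.1  (dark room: dimmest)
print(display_brightness(200.0))   # 0.55 (indoor lighting: mid level)
print(display_brightness(1000.0))  # 1.0  (sunlight: clamped to maximum)
```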
The proximity sensor 716, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 700. The proximity sensor 716 collects the distance between the user and the front surface of the terminal 700. In one embodiment, when the proximity sensor 716 detects that this distance is gradually decreasing, the processor 701 controls the display screen 705 to switch from the screen-on state to the screen-off state; when the proximity sensor 716 detects that the distance is gradually increasing, the processor 701 controls the display screen 705 to switch from the screen-off state to the screen-on state.
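The proximity-driven switching can be sketched as a small state update: the screen turns off as the user approaches and back on as they move away. The 5 cm threshold and the use of two successive readings are assumptions for illustration:

```python
NEAR_CM = 5.0  # assumed "near the face" threshold, in centimetres

def screen_state(prev_distance: float, distance: float, current: str) -> str:
    """Return the next screen state ('on' or 'off') given two successive
    proximity readings in centimetres and the current state."""
    approaching = distance < prev_distance
    receding = distance > prev_distance
    if current == "on" and distance < NEAR_CM and approaching:
        return "off"   # face moved close to the screen
    if current == "off" and distance >= NEAR_CM and receding:
        return "on"    # face moved away again
    return current     # otherwise keep the current state
```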
Those skilled in the art will appreciate that the configuration shown in fig. 7 does not constitute a limitation of the terminal 700, which may include more or fewer components than those shown, combine some components, or use a different arrangement of components.
Fig. 8 is a schematic structural diagram of a server according to an embodiment of the present application. The server 800 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 801 and one or more memories 802, where the memory 802 stores at least one program code that is loaded and executed by the processor 801 to implement the audio playing method provided by each of the method embodiments above. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for performing input and output, and may include other components for implementing device functions, which are not described herein again.
The embodiment of the present application further provides a terminal, where the terminal includes a processor and a memory, where at least one program code is stored in the memory, and the at least one program code is loaded and executed by the processor, so as to implement the operations executed in the audio playing method of the foregoing embodiment.
The embodiment of the present application further provides a computer-readable storage medium, where at least one program code is stored in the computer-readable storage medium, and the at least one program code is loaded and executed by a processor to implement the operations executed in the audio playing method of the foregoing embodiment.
The embodiment of the present application further provides a computer program product, where the computer program product includes at least one program code, and the at least one program code is loaded and executed by a processor to implement the operations executed in the audio playing method of the foregoing embodiment.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by program code instructing the relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The above description covers only exemplary embodiments of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall be included in its scope of protection.

Claims (21)

1. An audio playing method, the method comprising:
playing a first audio while displaying a playing interface of the first audio;
detecting a first sliding operation on the playing interface;
if the sliding direction of the first sliding operation is the target direction, the content in the playing interface is switched to a video picture in a first background video while the first audio is continuously played, wherein the first background video is a background video corresponding to the first audio.
2. The method of claim 1, wherein switching the content in the playing interface to a video frame in a first background video while continuing to play the first audio comprises:
displaying a dynamic effect that a video cover of the first background video slides out from the edge of the playing interface while continuing to play the first audio;
and if the sliding distance of the video cover in the playing interface reaches a reference distance, switching the content in the playing interface into the video picture.
3. The method according to claim 2, wherein the playing interface includes text information, and the text information is displayed on an upper layer of the playing interface during the displaying of the dynamic effect.
4. The method of claim 2, wherein switching the content in the playing interface to the video frame if the sliding distance of the video cover in the playing interface reaches a reference distance comprises:
and if the sliding distance of the video cover in the playing interface reaches the reference distance, the video picture is played in the playing interface in a full screen mode, or the video picture is played in a picture playing area in the playing interface.
5. The method of claim 1, wherein switching the content in the playing interface to a video frame in a first background video while continuing to play the first audio comprises:
acquiring the playing progress of the first audio when the first sliding operation is detected;
determining an initial playing point of the video picture according to the playing progress;
and switching the content in the playing interface to the video picture while continuing to play the first audio, wherein the video picture is played from the initial playing point.
6. The method according to claim 1, wherein after the switching the content in the playing interface to the video frame in the first background video, the method further comprises:
and when the playing duration of the video picture reaches the reference duration, adjusting the text information in the playing interface from the current position to the target position.
7. The method according to claim 6, wherein after the text information in the playing interface is adjusted from the current position to the target position, the method further comprises:
and when the display restoring operation on the playing interface is detected, restoring the text information to the current position.
8. The method according to claim 1, wherein after the switching the content in the playing interface to the video frame in the first background video, the method further comprises:
and hiding the target interface control in the playing interface when the playing duration of the video picture reaches the reference duration.
9. The method of claim 8, wherein after hiding the target interface control in the playback interface, the method further comprises:
and when the display resuming operation on the playing interface is detected, resuming to display the target interface control.
10. The method according to claim 1, wherein after the switching the content in the playing interface to the video frame in the first background video, the method further comprises:
and when the playing duration of the video picture reaches the reference duration, blurring the areas of the playing interface other than the picture playing area of the first background video.
11. The method according to claim 10, wherein after blurring the other regions of the playback interface except the frame playback region of the first background video, the method further comprises:
and when the display restoring operation on the playing interface is detected, restoring the display of the other areas as they were before the blurring.
12. The method according to claim 1, wherein after the switching the content in the playing interface to the video frame in the first background video, the method further comprises:
when the audio switching operation on the playing interface is detected, playing a second audio according to the audio switching operation, and playing a video picture in a second background video in the playing interface, wherein the second background video is a background video corresponding to the second audio.
13. The method of claim 1, wherein before switching the content in the playing interface to the video frame in the first background video, the method further comprises:
determining publishing user identifiers of a plurality of background videos corresponding to the first audio;
and if the plurality of publishing user identifiers include the currently logged-in user identifier, determining the background video corresponding to the currently logged-in user identifier as the first background video.
14. The method of claim 13, further comprising:
and if the plurality of publishing user identifiers do not include the currently logged-in user identifier, selecting the first background video from the plurality of background videos according to the heat values of the plurality of background videos.
15. The method according to claim 1, wherein if the sliding direction of the first sliding operation is a target direction, switching the content in the playing interface to a video frame in a first background video while continuing to play the first audio, comprises:
if the sliding direction of the first sliding operation is a target direction, sending a background video acquisition request for the first audio to a server, wherein the server is used for inquiring the first background video corresponding to the first audio;
receiving the first background video sent by the server;
and switching the content in the playing interface to the video picture while continuing to play the first audio.
16. The method according to claim 15, wherein the server is further configured to receive a publishing request sent by any terminal, where the publishing request carries the identifier of the first audio and the first background video; and correspondingly storing the identifier of the first audio and the first background video, wherein the first background video is obtained by editing any terminal in a background video setting interface of the first audio.
17. The method according to claim 1, wherein after the switching the content in the playing interface to the video frame in the first background video, the method further comprises:
detecting a second sliding operation on the playing interface;
and if the sliding direction of the second sliding operation is opposite to the target direction, switching the video picture into the original content of the playing interface.
18. The method of claim 1, wherein the content in the playing interface comprises a background picture of the first audio, and the switching the content in the playing interface to a video picture in a first background video while the playing of the first audio is continued comprises:
and switching the background picture into the video picture while continuing to play the first audio.
19. An audio playback apparatus, comprising:
the first playing module is configured to play a first audio while displaying a playing interface of the first audio;
an operation detection module configured to detect a first sliding operation on the play interface;
and the second playing module is configured to switch the content in the playing interface to a video picture in a first background video while continuing to play the first audio if the sliding direction of the first sliding operation is the target direction, wherein the first background video is a background video corresponding to the first audio.
20. A terminal, characterized in that the terminal comprises a processor and a memory, wherein at least one program code is stored in the memory, and the program code is loaded and executed by the processor to implement the operations performed by the audio playback method according to any one of claims 1 to 18.
21. A computer-readable storage medium, in which at least one program code is stored, the program code being loaded and executed by a processor to implement the operations performed by the audio playback method according to any one of claims 1 to 18.
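Claim 1 hinges on the sliding direction of a touch operation. As a non-normative sketch, a gesture's direction can be classified from its start and end coordinates by its dominant axis; the 60-pixel minimum distance, the assumed target direction, and the screen-coordinate convention (y grows downward) are illustrative assumptions, not part of the claims:

```python
import math

def swipe_direction(x0: float, y0: float, x1: float, y1: float,
                    min_dist: float = 60.0):
    """Classify a gesture as 'left', 'right', 'up' or 'down' by its
    dominant axis; return None for a tap or a too-short drag."""
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_dist:
        return None  # too short to count as a sliding operation
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # screen y axis grows downward

# If the detected direction matches the target direction, the terminal
# would switch the playing interface to the background video picture.
TARGET_DIRECTION = "left"  # assumed target direction for illustration
print(swipe_direction(300, 400, 80, 390) == TARGET_DIRECTION)  # True
```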
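Claim 5 determines the video picture's initial playing point from the first audio's playing progress, so audio and picture stay in step after the switch. A sketch of one plausible mapping follows; looping the background video when it is shorter than the audio is an assumption, not something the claims specify:

```python
def initial_play_point(audio_progress_s: float, video_duration_s: float,
                       loop: bool = True) -> float:
    """Return the second at which the background video should start so
    that it matches the audio's current playing progress."""
    if video_duration_s <= 0:
        return 0.0
    if loop:
        # A short video is treated as repeating, so the start point wraps.
        return audio_progress_s % video_duration_s
    return min(audio_progress_s, video_duration_s)

# 75 s into the audio with a 30 s looping video: resume mid-loop at 15 s.
print(initial_play_point(75.0, 30.0))  # 15.0
```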
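Claims 13 and 14 pick the first background video by publisher and, failing that, by heat value. A sketch of that selection logic under an assumed (publisher_id, heat_value, video_id) tuple layout; the data shape is invented for illustration:

```python
def pick_background_video(videos, current_user_id):
    """videos: list of (publisher_id, heat_value, video_id) tuples.

    Prefer a video published by the currently logged-in user (claim 13);
    otherwise fall back to the video with the highest heat value (claim 14).
    """
    own = [v for v in videos if v[0] == current_user_id]
    if own:
        return own[0][2]
    return max(videos, key=lambda v: v[1])[2]

videos = [("user_a", 50, "vid_1"), ("user_b", 90, "vid_2"), ("user_c", 70, "vid_3")]
print(pick_background_video(videos, "user_c"))  # vid_3 (the user's own video)
print(pick_background_video(videos, "user_z"))  # vid_2 (highest heat value)
```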
CN202010687831.6A 2020-07-16 2020-07-16 Audio playing method, device, terminal and storage medium Pending CN111741366A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010687831.6A CN111741366A (en) 2020-07-16 2020-07-16 Audio playing method, device, terminal and storage medium


Publications (1)

Publication Number Publication Date
CN111741366A true CN111741366A (en) 2020-10-02

Family

ID=72654792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010687831.6A Pending CN111741366A (en) 2020-07-16 2020-07-16 Audio playing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111741366A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003157089A (en) * 1996-08-30 2003-05-30 Daiichikosho Co Ltd Operating method of karaoke system which utilizes two media, i.e., communication and broadcasting
CN101640057A (en) * 2009-05-31 2010-02-03 北京中星微电子有限公司 Audio and video matching method and device therefor
CN105404438A (en) * 2014-08-13 2016-03-16 小米科技有限责任公司 Background fuzzy method and apparatus and terminal device
CN105872647A (en) * 2015-12-14 2016-08-17 乐视网信息技术(北京)股份有限公司 Video switching method and system
CN107562349A (en) * 2017-09-11 2018-01-09 广州酷狗计算机科技有限公司 A kind of method and apparatus for performing processing
CN109302538A (en) * 2018-12-21 2019-02-01 广州酷狗计算机科技有限公司 Method for playing music, device, terminal and storage medium
CN111338537A (en) * 2020-02-11 2020-06-26 北京字节跳动网络技术有限公司 Method, apparatus, electronic device, and medium for displaying video


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113141538A (en) * 2021-04-25 2021-07-20 广州酷狗计算机科技有限公司 Media resource playing method, device, terminal, server and storage medium
CN113141538B (en) * 2021-04-25 2023-08-11 广州酷狗计算机科技有限公司 Media resource playing method, device, terminal, server and storage medium
CN113207022A (en) * 2021-05-08 2021-08-03 广州酷狗计算机科技有限公司 Video playing method and device, computer equipment and storage medium
CN113377995A (en) * 2021-06-18 2021-09-10 广州酷狗计算机科技有限公司 Media resource playing method and device, storage medium and electronic equipment
CN113411684A (en) * 2021-06-24 2021-09-17 广州酷狗计算机科技有限公司 Video playing method and device, storage medium and electronic equipment
WO2023071466A1 (en) * 2021-10-25 2023-05-04 北京字节跳动网络技术有限公司 Method and device for playing sound effect of music
WO2024077498A1 (en) * 2022-10-11 2024-04-18 广州酷狗计算机科技有限公司 Playback interface display method and apparatus, device, and readable storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination