CN111628925B - Song interaction method, device, terminal and storage medium - Google Patents

Song interaction method, device, terminal and storage medium

Info

Publication number
CN111628925B
CN111628925B (Application CN202010450193.6A)
Authority
CN
China
Prior art keywords
interaction
interactive
target
song
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010450193.6A
Other languages
Chinese (zh)
Other versions
CN111628925A (en)
Inventor
苏裕贤
Current Assignee
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd filed Critical Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202010450193.6A priority Critical patent/CN111628925B/en
Publication of CN111628925A publication Critical patent/CN111628925A/en
Application granted granted Critical
Publication of CN111628925B publication Critical patent/CN111628925B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046Interoperability with other network applications or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/185Arrangements for providing special services to substations for broadcast or conference, e.g. multicast with management of multicast group membership
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/18Commands or executable codes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/52User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services

Abstract

The application discloses a song interaction method, device, terminal and storage medium, belonging to the field of network technology. The method comprises the following steps: displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry, and different candidate interaction entries correspond to different interaction groups; receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the target interaction entry corresponds to a target interaction group; and displaying an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by the interaction objects in the target interaction group. With the song interaction method provided by the embodiments of the application, the terminal user receives real-time interaction information sent by interacting users through the interactive interface and exchanges information synchronously, which solves the problem in the related art that terminal users cannot interact synchronously while a song is playing, and improves the interactivity between the terminal user and interaction objects while listening to songs.

Description

Song interaction method, device, terminal and storage medium
Technical Field
The embodiments of the application relate to the field of network technology, and in particular to a song interaction method, device, terminal and storage medium.
Background
With the rapid development of internet technology, online interaction modes are increasingly diverse. For example, when watching a video, viewers can send bullet comments to interact; when playing a song, listeners can enter the comment section to leave messages, and so on.
Regarding song comments, when a target song is played on the terminal, a comment entry for posting comments on the target song is displayed. In the related art, commenting on a target song is an asynchronous interaction mode and cannot achieve real-time communication among multiple users. Real-time interaction matters especially to users who are currently playing a song: good real-time interaction creates an ideal social environment for users and promotes interaction among them.
Disclosure of Invention
The embodiments of the application provide a song interaction method, device, terminal and storage medium. The technical scheme is as follows:
in one aspect, a song interaction method is provided, the method including:
displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry, and different candidate interaction entries correspond to different interaction groups;
receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the target interaction entry corresponds to a target interaction group;
and displaying an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by the interaction objects in the target interaction group.
In another aspect, a song interaction apparatus is provided, the apparatus comprising:
the song playing interface display module is used for displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry, and different candidate interaction entries correspond to different interaction groups;
the trigger operation receiving module is used for receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the target interaction entry corresponds to a target interaction group;
and the interactive interface display module is used for displaying an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by the interaction objects in the target interaction group.
In another aspect, a terminal is provided, which includes a processor and a memory; the memory stores at least one instruction, and the instruction is executed by the processor to implement the song interaction method described in the above aspects.
In another aspect, a computer-readable storage medium is provided, storing at least one instruction, and the instruction is executed by a processor to implement the song interaction method described in the above aspect.
In another aspect, a computer program product is provided, storing at least one instruction, and the instruction is loaded and executed by a processor to implement the song interaction method of the above aspect.
In the embodiments of the application, a song interaction method is provided. Unlike the song playing interface in the related art, the song playing interface in the embodiments of the application includes at least one candidate interaction entry. When a trigger operation on a target interaction entry among the at least one candidate interaction entry is received, the terminal displays the interactive interface corresponding to the target interaction group; the terminal user then receives, through the interactive interface, real-time interaction information sent by interacting users and exchanges information synchronously. This solves the problem in the related art that terminal users cannot interact synchronously while a song is playing, and improves the interactivity between the terminal user and the corresponding interaction objects while listening to songs.
Drawings
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 illustrates a flow chart of a song interaction method provided by an exemplary embodiment of the present application;
FIG. 3 illustrates an interface diagram for entering an interactive interface provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a flow chart of a song interaction method provided by another exemplary embodiment of the present application;
FIG. 5 is a diagram illustrating an interface diagram of an interactive group creation process provided by an exemplary embodiment of the present application;
FIG. 6 illustrates a flow chart of a song interaction method provided by another exemplary embodiment of the present application;
FIG. 7 illustrates a schematic diagram of an interactive interface provided by an exemplary embodiment of the present application;
FIG. 8 illustrates a block diagram of a song interaction apparatus according to an exemplary embodiment of the present application;
fig. 9 is a block diagram showing a structure of a terminal according to an exemplary embodiment of the present application;
fig. 10 is a block diagram showing the structure of a server according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
References herein to "a plurality" mean two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" may indicate that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates an "or" relationship between the objects before and after it.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application. Referring to fig. 1, the implementation environment may include: a first terminal 100, a server 200 and a second terminal 300.
It should be noted that the first terminal 100 corresponds to the terminal user in the embodiments of the application, the second terminal 300 corresponds to the terminal device of an interaction object of the terminal user, and there is at least one second terminal 300. Clients of the same music playing software are installed and running in both the first terminal 100 and the second terminal 300.
The first terminal 100 is connected to the server 200 through a wireless network or a wired network.
The server 200 is a relay for the interaction information between the terminal user and its interaction objects: it receives real-time interaction information from the first terminal 100 and forwards it to the second terminal 300, so that the user of the second terminal 300 can see, on the interactive interface, the real-time interaction information sent by the user of the first terminal 100.
Alternatively, the server 200 may be an independent server, or any one of a server cluster, virtual cloud storage, or a cloud computing center. Taking the server 200 as a server cluster as an example, in the embodiments of the application the server 200 may include an interaction server 210 and a comment server 220, connected to each other through a wireless or wired network. On the first terminal 100 side, the first terminal 100 is connected to the interaction server 210 and the comment server 220 through a wireless or wired network, respectively; on the second terminal 300 side, the second terminal 300 is likewise connected to the interaction server 210 and the comment server 220 through a wireless or wired network.
In combination with the above composition of the server 200, in an embodiment of the application the interaction server 210 may perform the following actions: when the first terminal 100 receives an interaction group creation operation, it sends an interaction group creation request to the interaction server 210, and the interaction server 210 creates a corresponding interaction entry for the interaction group; when the first terminal 100 receives an interaction information sending instruction, it sends the input target real-time interaction information to the interaction server 210, and the interaction server 210 forwards the target real-time interaction information to the other interaction objects in the target interaction group (i.e., the terminal users corresponding to the second terminals 300); when the first terminal 100 receives a closing operation on the interactive interface, it sends an interaction exit instruction to the interaction server 210, and the interaction server 210 removes the current interaction object from the target interaction group according to the instruction.
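The three server-side actions just described (group creation, message forwarding, member removal) can be sketched roughly as follows. This is an illustrative sketch only; all class and method names are assumptions, since the patent does not specify an implementation.

```python
# Hypothetical sketch of the interaction server 210's three actions:
# creating an interaction group, forwarding real-time interaction
# information to the other members, and removing a member on exit.
from dataclasses import dataclass, field

@dataclass
class InteractionGroup:
    name: str
    songs: list                          # interactive songs associated with the group
    members: set = field(default_factory=set)

class InteractionServer:
    def __init__(self):
        self.groups = {}                 # group id -> InteractionGroup
        self.outbox = []                 # (recipient, message) pairs delivered to terminals

    def create_group(self, group_id, name, songs):
        """Handle an interaction group creation request; creating the group
        implicitly creates its interaction entry."""
        self.groups[group_id] = InteractionGroup(name, songs)

    def join(self, group_id, user):
        self.groups[group_id].members.add(user)

    def send(self, group_id, sender, message):
        """Forward target real-time interaction information to every other
        interaction object in the target interaction group."""
        for member in self.groups[group_id].members:
            if member != sender:
                self.outbox.append((member, message))

    def exit_group(self, group_id, user):
        """Handle an interaction exit instruction by removing the current
        interaction object from the target interaction group."""
        self.groups[group_id].members.discard(user)
```

For example, if users A, B and C have joined a group and A sends a message, only B and C receive it; after C closes the interactive interface, only A and B remain in the group.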
Furthermore, in an embodiment of the application, the comment server 220 may provide the following function: when the first terminal 100 receives a comment posting operation on target real-time interaction information, it sends the target real-time interaction information to the comment server 220, and the comment server 220 posts it to the comment area corresponding to the target song.
Optionally, the server 200 may further include a user identifier management server for managing events such as user identifier creation and supervision during information interaction; optionally, the server 200 may further include an interactive interface management server for managing events such as the creation and supervision of interactive interface identifiers. The servers listed as part of the server 200 are merely exemplary and may be extended; the embodiments of the application are not limited in this respect.
When the server 200 is an independent server, the server 200 itself provides the background service functions of each of the above servers.
In the related art, when a terminal user listens to a song, a song playing interface is displayed on the application program interface of the terminal, and a comment entry for viewing and posting comments is provided on the song playing interface. In one example, user A is playing song 1 and enters the comment area through a trigger operation on the comment entry, where comments previously posted by other terminal users are displayed; further, user A may post an independent comment in the comment area or reply to comments posted by other terminal users.
In the above example, that is, in the comment interaction method provided in the related art, terminal users listening to the same song can only interact asynchronously. If a terminal user is a comment poster, other terminal users must re-enter the comment area before they can browse and reply to the comment, so comment interaction between terminal users takes a long time; if the terminal user is a comment replier, it likewise takes a long time to receive the replies of the user being commented on, and so on for each round of interaction.
In the embodiments of the application, a song interaction method is provided, which can solve the problem in the related art that different terminal users cannot interact instantly when commenting on songs.
Referring to fig. 2, a flowchart of a song interaction method according to an exemplary embodiment of the present application is shown. The method is applicable to the implementation environment shown in fig. 1, and comprises the following steps:
step 201, displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entrance.
Wherein the song playing interface comprises at least one candidate interaction entrance. In the above example, the song playing interface in the related art includes a comment portal, and it should be noted that the comment portal is not the same as the candidate interaction portal in the various embodiments of the present application.
In the embodiment of the application, different candidate interaction entrances correspond to different interaction groups. Optionally, if the song playing interface includes a candidate interaction entry, the interaction group corresponding to the candidate interaction entry may be an interaction group uniquely corresponding to the target song, and in the interaction group, each terminal user is a user who listens to the target song at the same time; if the song playing interface includes at least two candidate interaction inlets, the interaction group corresponding to the candidate interaction inlet is an interaction group with a quantity related to the target song, for example, each interaction group is respectively associated with a plurality of songs, and the plurality of songs contained in each interaction group commonly contain the target song displayed by the song playing interface.
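The mapping just described, in which each candidate interaction entry corresponds to an interaction group whose associated song set contains the target song, can be sketched as follows (all names are hypothetical, not from the patent):

```python
# Illustrative sketch: resolve the candidate interaction entries for a
# target song. Each group id whose associated song set contains the
# target song yields one candidate interaction entry.
def candidate_entries(target_song, groups):
    """groups maps group id -> set of associated songs."""
    return [gid for gid, songs in groups.items() if target_song in songs]
```

With a single matching group this yields the one-entry case; with several matching groups, the song playing interface would show several candidate interaction entries.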
Step 202, receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the target interaction entry corresponds to a target interaction group.
When the song playing interface includes at least two candidate interaction entries, the user can choose among the current candidate interaction entries. The client receives the terminal user's trigger operation on a target interaction entry among the at least one candidate interaction entry, completing the user's selection of the target interaction entry. The target interaction entry corresponds to a target interaction group.
In one example, as shown in fig. 3, the terminal interface 300 displays a song playing interface together with an entry control 301. When the terminal user triggers the entry control 301, there are two possible interface display scenarios. If triggering the entry control 301 takes the user directly into the interactive interface uniquely corresponding to the target song, there is a single candidate interaction entry, and an interactive interface such as the one displayed by the terminal interface 320 is shown, displaying a chat area named "Zz exclusive chat room 1". If, after the entry control 301 is triggered, the interface displays the interactive interface shown by the terminal interface 310, more than one candidate interaction entry is currently available and the terminal user needs to make a further selection; as shown in fig. 3, the terminal interface 310 displays three candidate interaction entries, icons 311 to 313, and when the terminal user selects the icon 311, the interactive interface displayed by the terminal interface 320 is entered. That is, the terminal user can select a candidate interaction entry as the target interaction entry according to personal preference and interaction requirements, and the client then receives the terminal user's trigger operation on the target interaction entry among the three candidate interaction entries.
Step 203, displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by the interaction objects in the target interaction group.
Correspondingly, in response to the terminal user's trigger operation on the target interaction entry among the at least one candidate interaction entry, the terminal interface corresponding to the client displays the interactive interface corresponding to the target interaction group.
The interactive interface is used to display the real-time interaction information sent by the interaction objects in the target interaction group. The interaction objects are other terminal users, online at the same time as the terminal user, so on the interactive interface the terminal user can synchronously receive the real-time interaction information they send. Furthermore, the terminal user may send real-time interaction information on the interactive interface, such as shared links, text, pictures or voice; the embodiments of the application do not limit the specific form of the real-time interaction information.
Schematically, as shown in fig. 3, the interactive interface displayed by the terminal interface 320 corresponds to a target interaction group whose members are the interaction objects of the terminal user. In fig. 3, the terminal user may also view all interaction objects by triggering the interaction object viewing control 321, and use other functions such as adding an interaction object as a friend or sending a private message.
In summary, the embodiments of the application provide a song interaction method. Unlike the song playing interface in the related art, the song playing interface in the embodiments of the application includes at least one candidate interaction entry. When a trigger operation on a target interaction entry among the at least one candidate interaction entry is received, the terminal displays the interactive interface corresponding to the target interaction group; the terminal user then receives, through the interactive interface, real-time interaction information sent by interacting users and exchanges information synchronously. This solves the problem in the related art that terminal users cannot interact synchronously while a song is playing, and improves the interactivity between the terminal user and the corresponding interaction objects while listening to songs.
The above embodiments have indicated that a candidate interaction entry may be an interaction entry that already exists during song playing, or an interaction entry created by a terminal user. The following embodiment describes the case where the terminal user creates the interaction entry.
Referring to fig. 4, a flowchart of a song interaction method according to another exemplary embodiment of the present application is shown. The method is applicable to the implementation environment shown in fig. 1, and comprises the following steps:
Step 401, in response to receiving an interaction group creation operation, an interaction group creation request is sent to the interaction server.
In one possible implementation, when a terminal user wants to create an interaction group corresponding to the target song, this can be done by triggering the relevant control on the client interface. When the user clicks a control for creating an interaction group, an interaction group creation operation is generated; after receiving it, the client sends an interaction group creation request to the interaction server.
The interaction group creation request includes the group name and the interactive songs corresponding to the interaction group, and the interaction server creates a corresponding interaction entry for the group. Alternatively, the interactive songs may be a song or a collection of songs with similar characteristics, such as the same lyricist or the same musical style.
As shown in fig. 5, an interactive interface such as the one displayed by the terminal interface 500 is shown, displaying a chat area named "Zz exclusive chat room 1" together with a hall jump control 501. When the terminal user triggers the hall jump control 501, the current interactive interface jumps to the hall interface displayed by the terminal interface 510; optionally, the hall interface provides a "recommended room" function and a function of selecting chat rooms by type under "recommended classification". Further, the terminal interface 510 also displays a room creation control 511. When the terminal user triggers the room creation control 511, an interaction group creation operation is generated, and in response the terminal sends an interaction group creation request to the interaction server; the current interface then jumps to the room creation interface displayed by the terminal interface 520, which optionally displays "room name", "room classification" and "associated song" fill-in boxes. After the terminal user completes the input according to the prompts of the fill-in boxes, the confirm creation control 521 can be triggered to confirm.
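The interaction group creation request assembled from the room creation interface can be sketched minimally as follows, assuming a JSON payload carrying the fields of the fill-in boxes; all field names are illustrative assumptions, not taken from the patent:

```python
# Hypothetical payload a client might send to the interaction server when
# the confirm creation control is triggered on the room creation screen.
import json

def build_creation_request(room_name, room_category, associated_songs):
    """Assemble an interaction group creation request carrying the group
    name, its classification, and the associated interactive songs."""
    return json.dumps({
        "room_name": room_name,
        "room_category": room_category,
        "associated_songs": associated_songs,
    })
```

The interaction server would parse such a payload, create the interaction group, and register a candidate interaction entry for each associated song.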
Step 402, in response to a song playing request for the target song, an interaction entry acquisition request is sent to the interaction server.
In one possible implementation, in response to a song playing request for the target song, the client sends an interaction entry acquisition request to the interaction server. The song playing request is triggered by the terminal user.
In one example, the song interaction method provided by the application is implemented as a new function of song playing software. When the terminal user triggers the song playing control on the song playing interface of the target song, a song playing request is generated and the client sends an interaction entry acquisition request to the interaction server; once the interaction server responds, the terminal user sees at least one candidate interaction entry displayed on the song playing interface while the target song is playing.
Optionally, step 402 includes the following two parts.
First, in response to the song playing request, the song information of the target song is obtained, where the song information includes at least one of the song name, singer, musical style, album and era.
Second, an interaction entry acquisition request is sent to the interaction server according to the song information.
The interactive song corresponding to a candidate interaction entry is the target song itself, or corresponds to the same singer as the target song, or belongs to the same musical style, the same album, or the same era as the target song.
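The matching rule above can be sketched as a simple predicate over song metadata; the dictionary keys are assumptions chosen to mirror the listed song information fields:

```python
# Hypothetical predicate: a stored interactive song qualifies as a match
# for the target song if it is the target song itself, or shares the
# singer, musical style, album, or era with it.
def matches(candidate, target):
    if candidate["name"] == target["name"]:
        return True
    return any(candidate[k] == target[k]
               for k in ("singer", "style", "album", "era"))
```

The interaction server could apply this predicate to the interactive songs of each interaction group to decide which candidate interaction entries to return for the song playing interface.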
Step 403, at least one candidate interaction entry is displayed on the song playing interface according to the candidate interaction entry information sent by the interaction server.
Step 404, receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, where the target interaction entry corresponds to a target interaction group.
For this step, refer to step 202; the details are not repeated here.
Step 405, displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by the interaction objects in the target interaction group.
For this step, refer to step 203; the details are not repeated here.
On the basis of the above embodiments, this embodiment further discloses that the client can send an interaction group creation request to the interaction server, improving terminal users' participation in interaction groups and enriching the ways in which interaction groups are created. In addition, obtaining candidate interaction entries according to the song playing request of the target song implements the function of automatically displaying candidate interaction entries on the song playing interface when the target song is played, improving the convenience of the function for users.
Referring to fig. 6, a flowchart of a song interaction method according to another exemplary embodiment of the present application is shown. The method is applicable to the implementation environment shown in fig. 1, and comprises the following steps:
Step 601, in response to receiving an interaction group creation operation, an interaction group creation request is sent to the interaction server.
For this step, refer to step 401; the details are not repeated here.
Step 602, in response to a song playing request for the target song, an interaction entry acquisition request is sent to the interaction server.
For this step, refer to step 402; the details are not repeated here.
And step 603, displaying at least one candidate interaction entrance on the song playing interface according to the candidate interaction entrance information sent by the interaction server.
For details of this step, refer to step 403 above; the description is not repeated here.
Step 604, receiving a triggering operation on a target interaction portal in at least one candidate interaction portal, where the target interaction portal corresponds to a target interaction group.
For details of this step, refer to step 404 above; the description is not repeated here.
Step 605, in response to the triggering operation of the target interaction portal, reporting the first location information to the interaction server.
In this embodiment of the application, geographic-location interaction between the end user and other interactive users can be realized. The first location information is the location information of the end user's current geographic position.
Step 606, displaying an interactive interface corresponding to the target interactive group, where the interactive interface is used to display real-time interactive information sent by the interactive objects in the target interactive group.
For details of this step, refer to step 405 above; the description is not repeated here.
In step 607, the real-time interaction information sent by the target interaction object and the second location information of the target interaction object are received.
To realize the geographic-location interaction function between the end user and interactive users, in addition to the real-time interaction information sent by the target interactive object, the client also receives the second location information of the target interactive object.
The second location information is reported by the target interactive object; both the real-time interaction information and the second location information are forwarded by the interaction server. The target interactive object is any one of the end users among the interactive objects.
At step 608, real-time interaction information, geographic location and distance information are displayed at the interaction interface.
The geographic location is determined from the second location information, and the distance information is determined from the first and second location information. Optionally, the distance information may be computed either by the interaction server or by the client that provided the first location information.
In one example, as shown in fig. 7, an interactive interface such as terminal interface 700 displays a chat area named "Zz-specific chat room 1". When the target interactive object sends real-time interaction information, it may be displayed in a format such as that shown in example block 701: the interactive interface displays the avatar icon, nickname, gender, geographic location, and distance information of the target interactive object, together with the user's message content (i.e., the real-time interaction information sent by the target interactive object). A text prompt such as "first school, distance 2.3 km" is displayed beside the avatar icon of the target interactive object, where "first school" is the geographic location of the target interactive object and "2.3 km" is the distance information.
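The "2.3 km" prompt above implies a distance computed from the two reported coordinates. A minimal sketch of such a computation (the haversine helper and the coordinate format are illustrative assumptions; the patent does not specify how the distance is derived):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def distance_label(first_pos, second_pos):
    """Render the distance prompt shown beside the interactive object's avatar,
    from the first location info (this client) and second location info (peer)."""
    km = haversine_km(first_pos[0], first_pos[1], second_pos[0], second_pos[1])
    return f"distance {km:.1f} km"
```

Either side of the forwarding path could run this: the interaction server (which sees both reports) or the client that provided the first location information.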
It should be noted that, the steps 601 to 608 may be implemented as one embodiment.
Optionally, step 609 is further included after step 606.
In step 609, in response to receiving the interaction information sending instruction, the input target real-time interaction information is sent to the interaction server.
The target real-time interaction information comprises at least one of text information, picture information and audio/video information, and the interaction server is used for forwarding the target real-time interaction information to other interaction objects in the target interaction group.
In one example, as shown in fig. 5, in the interactive interface corresponding to terminal interface 500, the end user inputs text information through dialogue input box 502 and triggers send key 503. The client then receives the interaction-information sending instruction and sends the input target real-time interaction information to the interaction server, which forwards the text information to the other interactive objects in the target interaction group.
In another example, as shown in fig. 5, from the interactive interface corresponding to terminal interface 500, the end user triggers the interaction selection control 504 to display terminal interface 530, which includes an interaction option selection box 531. If the end user selects the voice option, an interaction-information sending instruction is generated. On receiving it, the client has the interaction server generate a call request and send it to the other interactive objects in the target interaction group. When one of those interactive objects confirms the call request, the interaction server establishes an instant communication channel between the end user and that interactive object, realizing an online call function.
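The sending flow described above can be sketched as a small payload builder on the client side. All field names below are illustrative assumptions; the patent only requires that the target real-time interaction information carry at least one of text, picture, or audio/video information:

```python
ALLOWED_TYPES = {"text", "picture", "audio_video"}

def build_interaction_payload(group_id, sender_id, msg_type, content):
    """Package target real-time interaction information for the interaction
    server, which forwards it to the other interactive objects in the group."""
    if msg_type not in ALLOWED_TYPES:
        raise ValueError(f"unsupported interaction type: {msg_type}")
    return {
        "group_id": group_id,
        "sender_id": sender_id,
        "type": msg_type,
        "content": content,
    }
```

A voice selection in option box 531 would map onto the same shape (an `audio_video` payload) before the server escalates it to a call request.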
Optionally, if the target real-time interaction information is displayed in the interaction interface, step 606 is followed by step 610.
In step 610, in response to receiving the comment posting operation on the target real-time interaction information, the target real-time interaction information is sent to a comment server, where the comment server is configured to post the target real-time interaction information to a comment area corresponding to the target song.
In one example, the end user posts his or her own message (i.e., the target real-time interaction information) as a comment in the comment area corresponding to the target song. In this process, the end user selects the target real-time interaction information and sends it to the comment server by triggering a publish control; the comment server then publishes the target real-time interaction information to the comment area corresponding to the target song.
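The comment-publishing flow can be sketched as follows, with the comment server modeled as a simple in-memory list standing in for the network call; the field names are illustrative assumptions:

```python
def publish_as_comment(comment_area, song_id, interaction_message):
    """Forward a displayed real-time interaction message to the comment area
    of the target song, as triggered by the user's publish control."""
    comment = {
        "song_id": song_id,
        "body": interaction_message["content"],  # the user's original message
    }
    comment_area.append(comment)  # stand-in for the request to the comment server
    return comment
```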
Optionally, step 611 is further included after step 606.
In step 611, in response to receiving the close operation to the interactive interface, an interaction exit instruction is sent to the interaction server.
At this time, the interaction server is configured to remove the current interaction object from the target interaction group according to the interaction exit instruction.
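A server-side sketch of handling the interaction exit instruction, with group membership modeled as an in-memory mapping (an illustrative assumption; the patent does not specify the server's data structures):

```python
def handle_exit_instruction(group_members, group_id, object_id):
    """Remove the current interactive object from the target interaction
    group when the interaction server receives an exit instruction."""
    members = group_members.get(group_id, set())
    members.discard(object_id)  # no error if the object already left
    return members
```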
Optionally, step 612 is further included after step 606.
In step 612, a group reservation operation is received for the target interaction group.
In one possible implementation, after listening to the target song, the end user may be unable to stay on the interactive interface because of other matters. In that case, the end user triggers a group reservation operation on the target interaction group so that the group can conveniently be re-entered the next time the user listens to the song.
Optionally, step 613 is further included after step 606.
In step 613, in response to receiving the closing operation of the interactive interface, a target interaction entry corresponding to the target interaction group is displayed in a predetermined area of the user interface.
In one possible implementation, after the end user finishes listening to the target song, no group reservation operation on the target interaction group is required. When the client receives a closing operation on the interactive interface, a target interaction entrance corresponding to the target interaction group is displayed in a predetermined area of the user interface, so that the end user can conveniently re-enter the interactive interface through the target interaction entrance after exiting accidentally.
On the basis of the above embodiments, a song interaction method is further provided. Unlike a song playing interface in the related art, the song playing interface in this embodiment of the application includes at least one candidate interaction entrance. When a triggering operation on a target interaction entrance among the at least one candidate interaction entrance is received, the terminal displays the interactive interface corresponding to the target interaction group, and through that interface the end user receives real-time interaction information sent by interactive users and exchanges synchronized information. This solves the problem in the related art that end users cannot exchange synchronized information while playing songs, and improves the interactivity between end users and the corresponding interactive objects while listening to songs. The client can also send an interactive group creation request to the interaction server, which increases end users' participation in interactive groups and enriches the ways in which interactive groups can be created. In addition, by acquiring candidate interaction entrances according to the song play request for the target song, candidate interaction entrances are displayed automatically on the song playing interface when the target song is played, improving the convenience of the feature for users. Finally, in the interactive interface the end user can share a geographic location with interactive objects, further improving the interactive experience when listening to songs.
Referring to fig. 8, a block diagram of a song interaction apparatus according to an exemplary embodiment of the present application is shown, where the apparatus includes:
a playing interface display module 801, configured to display a song playing interface, where the song playing interface includes at least one candidate interaction entry, and different candidate interaction entries correspond to different interaction groups;
a trigger operation receiving module 802, configured to receive a trigger operation for a target interaction portal in at least one candidate interaction portal, where the target interaction portal corresponds to a target interaction group;
the interactive interface display module 803 is configured to display an interactive interface corresponding to the target interactive group, where the interactive interface is configured to display real-time interactive information sent by an interactive object in the target interactive group.
Optionally, the playing interface display module 801 includes:
the first display unit is used for responding to a song playing request of a target song and sending an interaction entrance acquisition request to the interaction server;
and the second display unit is used for displaying at least one candidate interaction entrance on the song playing interface according to the candidate interaction entrance information sent by the interaction server, wherein the interactive song corresponding to each candidate interaction entrance has a preset relationship with the target song.
Optionally, the first display unit is further configured to:
obtaining song information of the target song in response to the song playing request, wherein the song information comprises at least one of a song name, a singer, a song style, an album, and a year;
sending the interactive entrance acquisition request to the interactive server according to the song information;
the interactive songs corresponding to the candidate interactive entries are the target songs, or the interactive songs corresponding to the candidate interactive entries and the target songs correspond to the same singer, or the interactive songs corresponding to the candidate interactive entries and the target songs belong to the same song style, or the interactive songs corresponding to the candidate interactive entries and the target songs belong to the same album, or the interactive songs corresponding to the candidate interactive entries and the target songs belong to the same year.
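The enumerated alternatives above amount to a simple predicate over song metadata. A sketch, assuming dictionary-shaped song records with illustrative field names:

```python
def has_preset_relationship(candidate_song, target_song):
    """True if the candidate entrance's interactive song is the target song
    itself, or shares its singer, song style, album, or year."""
    if candidate_song["name"] == target_song["name"]:
        return True
    shared_keys = ("singer", "style", "album", "year")
    return any(candidate_song[k] == target_song[k] for k in shared_keys)
```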
Optionally, the apparatus further includes:
the system comprises a creation request sending module, an interaction server and an interaction group creation module, wherein the creation request sending module is used for responding to the received interaction group creation operation and sending an interaction group creation request to the interaction server, the interaction group creation request comprises a group name and an interaction song corresponding to the interaction group, and the interaction server is used for creating a corresponding interaction inlet for the interaction group.
Optionally, the apparatus further includes:
the information reporting module is used for responding to the triggering operation of the target interaction entrance and reporting first position information to an interaction server, wherein the first position information is the position information of the current geographic position;
optionally, the apparatus further includes:
the information receiving module is used for receiving the real-time interaction information sent by the target interaction object and second position information of the target interaction object, wherein the second position information is reported by the target interaction object, and the real-time interaction information and the second position information are forwarded by the interaction server;
the information display module is used for displaying the real-time interaction information, the geographic position and the distance information on the interaction interface, wherein the geographic position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information.
Optionally, the apparatus further includes:
the first information sending module is used for sending input target real-time interaction information to the interaction server in response to receiving the interaction information sending instruction, wherein the target real-time interaction information comprises at least one of text information, picture information and audio/video information, and the interaction server is used for forwarding the target real-time interaction information to other interaction objects in the target interaction group.
Optionally, the apparatus further includes:
the second information sending module is used for responding to the comment issuing operation of the target real-time interaction information, sending the target real-time interaction information to a comment server, and the comment server is used for issuing the target real-time interaction information to a comment area corresponding to a target song.
Optionally, the apparatus further includes:
the instruction sending module is used for responding to the receiving of the closing operation of the interactive interface and sending an interactive exit instruction to the interactive server, and the interactive server is used for removing the current interactive object from the target interactive group according to the interactive exit instruction.
Optionally, the apparatus further includes:
the operation reservation module is used for receiving group reservation operation on the target interaction group;
and the entrance display module is used for responding to the receiving of the closing operation of the interactive interface and displaying the target interactive entrance corresponding to the target interactive group in a preset area of a user interface.
Referring to fig. 9, a block diagram of a terminal 900 according to an exemplary embodiment of the present application is shown. The terminal 900 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player. The terminal 900 may also be referred to by other names such as user equipment or portable terminal. Optionally, the terminal 900 may be the terminal device corresponding to the push end 100 shown in fig. 1.
In general, the terminal 900 includes: a processor 901 and a memory 902.
The processor 901 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor: the main processor, also referred to as a CPU (Central Processing Unit), processes data in the awake state; the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 901 may integrate a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 901 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 902 may include one or more computer-readable storage media, which may be tangible and non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement a method provided by an embodiment of the present application.
In some embodiments, the terminal 900 may further optionally include: a peripheral interface 903, and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 904, a touch display 905, a camera 906, audio circuitry 907, positioning components 908, and a power source 909.
The peripheral interface 903 may be used to connect at least one peripheral device associated with an I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 901, the memory 902, and the peripheral interface 903 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 904 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 904 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and converting received electromagnetic signals back into electrical signals. Optionally, the radio frequency circuit 904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The touch display 905 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. The touch display 905 also has the ability to capture touch signals on or above its surface; such a touch signal may be input to the processor 901 as a control signal for processing. The touch display 905 is also used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one touch display 905, disposed on the front panel of the terminal 900; in other embodiments, there may be at least two touch displays 905, respectively disposed on different surfaces of the terminal 900 or in a folded design; in still other embodiments, the touch display 905 may be a flexible display disposed on a curved or folded surface of the terminal 900. The touch display 905 may even be arranged in an irregular, non-rectangular pattern, i.e., a shaped screen. The touch display 905 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 906 is used to capture images or video. Optionally, the camera assembly 906 includes a front camera and a rear camera. In general, a front camera is used for realizing video call or self-photographing, and a rear camera is used for realizing photographing of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and the rear cameras are any one of a main camera, a depth camera and a wide-angle camera, so as to realize fusion of the main camera and the depth camera to realize a background blurring function, and fusion of the main camera and the wide-angle camera to realize a panoramic shooting function and a Virtual Reality (VR) shooting function. In some embodiments, camera assembly 906 may also include a flash. The flash lamp can be a single-color temperature flash lamp or a double-color temperature flash lamp. The dual-color temperature flash lamp refers to a combination of a warm light flash lamp and a cold light flash lamp, and can be used for light compensation under different color temperatures.
Audio circuitry 907 is used to provide an audio interface between the user and terminal 900. The audio circuit 907 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 901 for processing, or inputting the electric signals to the radio frequency circuit 904 for voice communication. For purposes of stereo acquisition or noise reduction, the microphone may be plural and disposed at different portions of the terminal 900. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 907 may also include a headphone jack.
The positioning component 908 is used to locate the current geographic location of the terminal 900 to enable navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 909 is used to supply power to the various components in the terminal 900. The power supply 909 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power source 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 900 can further include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyroscope sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 901 may control the touch display 905 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 911. The acceleration sensor 911 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may collect a 3D motion of the user on the terminal 900 in cooperation with the acceleration sensor 911. The processor 901 may implement the following functions according to the data collected by the gyro sensor 912: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 913 may be provided at a side frame of the terminal 900 and/or a lower layer of the touch display 905. When the pressure sensor 913 is provided at a side frame of the terminal 900, a grip signal of the terminal 900 by a user may be detected, and left-right hand recognition or shortcut operation may be performed according to the grip signal. When the pressure sensor 913 is disposed at the lower layer of the touch display 905, control of the operability control on the UI interface can be achieved according to the pressure operation of the user on the touch display 905. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 914 is used for capturing a fingerprint of a user to identify the identity of the user based on the captured fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 901 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. The fingerprint sensor 914 may be provided on the front, back or side of the terminal 900. When a physical key or a vendor Logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical key or the vendor Logo.
The optical sensor 915 is used to collect the intensity of ambient light. In one embodiment, the processor 901 may control the display brightness of the touch display 905 based on the intensity of ambient light collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the touch display 905 is turned up; when the ambient light intensity is low, the display brightness of the touch display panel 905 is turned down. In another embodiment, the processor 901 may also dynamically adjust the shooting parameters of the camera assembly 906 based on the ambient light intensity collected by the optical sensor 915.
A proximity sensor 916, also referred to as a distance sensor, is typically disposed on the front side of the terminal 900. Proximity sensor 916 is used to collect the distance between the user and the front of terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually decreases, the processor 901 controls the touch display 905 to switch from the bright screen state to the off screen state; when the proximity sensor 916 detects that the distance between the user and the front surface of the terminal 900 gradually increases, the processor 901 controls the touch display 905 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 9 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
Referring to fig. 10, a schematic structural diagram of a server 1000 according to an embodiment of the application is shown. The server 1000 may be used to implement the song interaction method provided in the above embodiments, and may be the push server 200 described in the embodiment of fig. 1. Specifically:
The server 1000 includes a Central Processing Unit (CPU) 1001, a system memory 1004 including a Random Access Memory (RAM) 1002 and a Read Only Memory (ROM) 1003, and a system bus 1005 connecting the system memory 1004 and the central processing unit 1001. The server 1000 also includes a basic input/output system (I/O system) 1006 for aiding in the transfer of information between the various devices within the computer, and a mass storage device 1007 for storing an operating system 1013, application programs 1014, and other program modules 1015.
The basic input/output system 1006 includes a display 1008 for displaying information and an input device 1009, such as a mouse, keyboard, etc., for a user to input information. Wherein the display 1008 and the input device 1009 are connected to the central processing unit 1001 via an input output controller 1010 connected to a system bus 1005. The basic input/output system 1006 may also include an input/output controller 1010 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input output controller 1010 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1007 is connected to the central processing unit 1001 through a mass storage controller (not shown) connected to the system bus 1005. The mass storage device 1007 and its associated computer-readable media provide non-volatile storage for the server 1000. That is, the mass storage device 1007 may include a computer readable medium (not shown) such as a hard disk or CD-ROM drive.
The computer readable medium may include computer storage media and communication media without loss of generality. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will recognize that the computer storage medium is not limited to the one described above. The system memory 1004 and mass storage devices 1007 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1000 may also operate with remote computers connected through a network, such as the Internet. That is, the server 1000 may be connected to the network 1012 through a network interface unit 1011 connected to the system bus 1005; alternatively, the network interface unit 1011 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further stores one or more programs configured to be executed by the one or more processors. The one or more programs include instructions for implementing the server-side song interaction method described above.
Embodiments of the present application also provide a computer-readable storage medium having at least one instruction stored therein, the at least one instruction being loaded and executed by a processor to implement the song interaction method provided in the above embodiments.
Alternatively, the computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM).
The serial numbers of the foregoing embodiments of the present application are merely for description and do not imply the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing description relates merely to preferred embodiments of the present application and is not intended to limit the application; the scope of protection of the application is defined by the appended claims.

Claims (7)

1. A method of song interaction, the method comprising:
displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry and is used for playing a target song, interaction songs corresponding to the candidate interaction entries have a preset relationship with the target song, and different candidate interaction entries correspond to different interaction groups;
in response to receiving an interaction group creation operation, sending an interaction group creation request to an interaction server, wherein the interaction group creation request comprises a group name and an interaction song corresponding to the interaction group, the interaction server is used for creating a corresponding interaction entry for the interaction group, and the interaction song is a set of one song or a plurality of songs with similar characteristics;
in response to receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, reporting first position information to the interaction server, wherein the first position information is position information of a current geographic position, and the target interaction entry corresponds to a target interaction group;
displaying an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by an interaction object in the target interaction group;
in response to receiving an interaction information sending instruction, sending input target real-time interaction information to the interaction server, wherein the interaction server is used for forwarding the target real-time interaction information to other interaction objects in the target interaction group;
receiving real-time interaction information sent by a target interaction object and second position information of the target interaction object;
displaying the real-time interaction information, a geographic position, and distance information on the interactive interface, wherein the geographic position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information;
and in response to receiving a comment publishing operation on the target real-time interaction information, sending the target real-time interaction information to a comment server, wherein the comment server is used for publishing the target real-time interaction information in a comment area of the target song.
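Claim 1 leaves open how the distance information is computed from the first position information and the second position information. A minimal sketch, assuming each position is a latitude/longitude pair in degrees, is the haversine great-circle distance; the function name and the example coordinates below are illustrative assumptions, not part of the patent:

```python
import math

def haversine_km(pos1, pos2):
    """Great-circle distance in kilometers between two (lat, lon) pairs in degrees."""
    lat1, lon1 = map(math.radians, pos1)
    lat2, lon2 = map(math.radians, pos2)
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # 6371 km: mean Earth radius

# Distance shown beside a peer's message: derived from the local first
# position information and the received second position information.
first_position = (23.1291, 113.2644)   # e.g. Guangzhou (hypothetical values)
second_position = (39.9042, 116.4074)  # e.g. Beijing (hypothetical values)
distance_km = haversine_km(first_position, second_position)
```

Any other distance metric over the two reported positions would satisfy the claim language equally well; the haversine formula is simply a common choice for geographic coordinates.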
2. The method of claim 1, wherein sending an interaction entry acquisition request to the interaction server in response to a song play request for the target song comprises:
obtaining song information of the target song in response to the song play request, wherein the song information comprises at least one of a song name, a singer, a song style, an album, and a release year;
and sending the interaction entry acquisition request to the interaction server according to the song information.
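The acquisition request of claim 2 can be pictured as a payload assembled from whichever song-information fields are available. The field names, request type, and validation below are hypothetical illustrations; the patent does not specify a wire format:

```python
def build_entry_acquisition_request(song_info):
    """Build an interaction entry acquisition request from song information.

    song_info may carry any subset of the fields recited in claim 2
    (the key names here are illustrative only, not from the patent).
    """
    allowed = ("song_name", "singer", "song_style", "album", "release_year")
    payload = {k: song_info[k] for k in allowed if k in song_info}
    if not payload:
        raise ValueError("song information must contain at least one field")
    return {"type": "entry_acquisition", "song_info": payload}

request = build_entry_acquisition_request(
    {"song_name": "Example Song", "singer": "Example Singer"}
)
```

The interaction server would match this payload against its interaction groups to return candidate interaction entries whose interaction songs have the preset relationship with the target song.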
3. The method according to claim 1 or 2, wherein after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
and in response to receiving a closing operation on the interactive interface, sending an interaction exit instruction to the interaction server, wherein the interaction server is used for removing the current interaction object from the target interaction group according to the interaction exit instruction.
4. The method according to claim 1 or 2, wherein after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
receiving a group reservation operation for the target interaction group;
and in response to receiving a closing operation on the interactive interface, displaying the target interaction entry corresponding to the target interaction group in a preset area of a user interface.
5. A song interaction apparatus, the apparatus comprising:
a song playing interface display module, used for displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry and is used for playing a target song, interaction songs corresponding to the candidate interaction entries have a preset relationship with the target song, and different candidate interaction entries correspond to different interaction groups;
a creation request sending module, used for sending an interaction group creation request to an interaction server in response to receiving an interaction group creation operation, wherein the interaction group creation request comprises a group name and an interaction song corresponding to the interaction group, the interaction server is used for creating a corresponding interaction entry for the interaction group, and the interaction song is a set of one song or a plurality of songs with similar characteristics;
a trigger operation receiving module, used for reporting first position information to the interaction server in response to receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the first position information is position information of a current geographic position, and the target interaction entry corresponds to a target interaction group;
an interactive interface display module, used for displaying an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by an interaction object in the target interaction group;
a first information sending module, used for sending input target real-time interaction information to the interaction server in response to receiving an interaction information sending instruction, wherein the interaction server is used for forwarding the target real-time interaction information to other interaction objects in the target interaction group;
an information receiving module, used for receiving real-time interaction information sent by a target interaction object and second position information of the target interaction object;
an information display module, used for displaying the real-time interaction information, a geographic position, and distance information on the interactive interface, wherein the geographic position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information;
and a second information sending module, used for sending the target real-time interaction information to a comment server in response to receiving a comment publishing operation on the target real-time interaction information, wherein the comment server is used for publishing the target real-time interaction information in a comment area of the target song.
6. A terminal, the terminal comprising a processor and a memory; the memory stores at least one instruction for execution by the processor to implement the song interaction method of any one of claims 1 to 4.
7. A computer readable storage medium storing at least one instruction for execution by a processor to implement the song interaction method of any one of claims 1 to 4.
CN202010450193.6A 2020-05-25 2020-05-25 Song interaction method, device, terminal and storage medium Active CN111628925B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010450193.6A CN111628925B (en) 2020-05-25 2020-05-25 Song interaction method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111628925A CN111628925A (en) 2020-09-04
CN111628925B true CN111628925B (en) 2023-11-14

Family

ID=72260695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010450193.6A Active CN111628925B (en) 2020-05-25 2020-05-25 Song interaction method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111628925B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518253A (en) * 2021-04-29 2021-10-19 广州酷狗计算机科技有限公司 Song playing method and device, terminal equipment and storage medium
CN114625466B (en) * 2022-03-15 2023-12-08 广州歌神信息科技有限公司 Interactive execution and control method and device for online singing hall, equipment, medium and product
CN114885200B (en) * 2022-04-26 2024-01-02 北京达佳互联信息技术有限公司 Message processing method, device, electronic equipment and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104092596A (en) * 2014-01-20 2014-10-08 腾讯科技(深圳)有限公司 Music user group management method, device and system
CN105897867A (en) * 2016-03-29 2016-08-24 乐视控股(北京)有限公司 Share processing method of interaction information, vehicle terminal, server and system
CN106231436A (en) * 2016-08-30 2016-12-14 乐视控股(北京)有限公司 Message treatment method and processing means
CN106341695A (en) * 2016-08-31 2017-01-18 腾讯数码(天津)有限公司 Interaction method, device and system of live streaming room
CN110209871A (en) * 2019-06-17 2019-09-06 广州酷狗计算机科技有限公司 Song comments on dissemination method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109391850B (en) * 2017-08-02 2021-06-18 腾讯科技(深圳)有限公司 Method, device and storage medium for interacting messages in video page


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant