CN108574878B - Data interaction method and device - Google Patents

Data interaction method and device

Info

Publication number
CN108574878B
CN108574878B (application CN201710135742.9A)
Authority
CN
China
Prior art keywords
information
client
account
interaction
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710135742.9A
Other languages
Chinese (zh)
Other versions
CN108574878A (en)
Inventor
俄万有
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201710135742.9A
Priority to PCT/CN2018/078115 (published as WO2018161887A1)
Publication of CN108574878A
Application granted
Publication of CN108574878B
Legal status: Active

Classifications

    (All under section H Electricity, class H04 Electric communication technique; H04N covers pictorial communication, e.g. television, and H04L covers transmission of digital information.)
    • H04N21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4756 End-user interface for inputting end-user data for rating content, e.g. scoring a recommended movie
    • H04N21/4758 End-user interface for inputting end-user data for providing answers, e.g. voting
    • H04N21/485 End-user interface for client configuration
    • H04N21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
    • H04N21/488 Data services, e.g. news ticker
    • H04N21/4882 Data services for displaying messages, e.g. warnings, reminders
    • H04N21/4888 Data services for displaying teletext characters
    • H04L51/52 User-to-user messaging in packet-switching networks for supporting social networking services


Abstract

The invention discloses a data interaction method and device. The method comprises the following steps: when a first client of a first application plays a target frame of a first media file, detecting an interaction instruction generated by an interaction operation performed on the target frame, where the instruction carries object indication information of the object to be interacted with that was selected by the operation, and the first client is logged in with a first account; acquiring a second account, in a second application, of the object indicated by the object indication information, where the first account and the second account are associated; acquiring first interaction information to be exchanged; and sending the first interaction information to the second account of the second application through the first client. The invention solves the technical problem that the range of data interaction is limited in existing data interaction modes.

Description

Data interaction method and device
Technical Field
The invention relates to the field of computers, in particular to a data interaction method and device.
Background
With the development of the mobile internet and social media, more and more user terminals can log in to an application platform with a user account and then exchange data with other user accounts on that platform. On media playing platforms, the two common data interaction modes are comments and bullet screens ("barrage"), which work as follows:
1. Comments are usually published in an independent comment section, separate from the media playing window in the display interface, as shown in fig. 1; the comment section allows users both to publish comment content and to view popular comments from other users.
2. Bullet screens display comment content published by many different user accounts as a large volume of subtitles overlaid on the media playing window, as shown in fig. 2.
In either mode, however, the current user account (a viewer account) can only interact with other user accounts (other viewer accounts) logged in to the same platform. That is, in the related art, a media playing platform provides channels such as the comment section and the bullet screen only for interaction among the accounts logged in to that platform; it cannot support data interaction between a viewer account and a character shown in the media playing window at the current playing progress. The range of data interaction is therefore limited.
In view of the above problems, no effective solution has yet been proposed.
Disclosure of Invention
Embodiments of the invention provide a data interaction method and device that at least solve the technical problem of the limited data interaction range of existing data interaction modes.
According to one aspect of the embodiments of the present invention, a data interaction method is provided, including: when a first client of a first application plays a target frame of a first media file, detecting an interaction instruction generated by an interaction operation performed on the target frame, where the instruction carries object indication information of the object to be interacted with that was selected by the operation, and the first client is logged in with a first account; acquiring a second account, in a second application, of the object indicated by the object indication information, where the first account and the second account are associated; acquiring first interaction information to be exchanged; and sending the first interaction information to the second account of the second application through the first client.
According to another aspect of the embodiments of the present invention, there is also provided a data interaction method, including: acquiring a data interaction request sent by a first client of a first application, where the request carries at least object indication information indicating an object to be interacted with, the object is displayed in the frame picture of a target frame of a first media file played by the first client, and the first client is logged in with a first account; acquiring, according to the object indication information, a second account of the object in a second application, where the first account and the second account are associated; acquiring first interaction information to be exchanged; and sending the first interaction information to the second account of the second application.
According to another aspect of the embodiments of the present invention, there is also provided a data interaction apparatus, including: a detection unit configured to detect, when a first client of a first application plays a target frame of a first media file, an interaction instruction generated by an interaction operation performed on the target frame, where the instruction carries object indication information of the object to be interacted with selected by the operation, and the first client is logged in with a first account; a first acquisition unit configured to acquire a second account, in a second application, of the object indicated by the object indication information, where the first account and the second account are associated; a second acquisition unit configured to acquire first interaction information to be exchanged; and a sending unit configured to send the first interaction information to the second account of the second application through the first client.
According to another aspect of the embodiments of the present invention, there is also provided a data interaction apparatus, including: a first acquisition unit configured to acquire a data interaction request sent by a first client of a first application, where the request carries at least object indication information indicating an object to be interacted with, the object is displayed in the frame picture of a target frame of a first media file played by the first client, and the first client is logged in with a first account; a second acquisition unit configured to acquire, according to the object indication information, a second account of the object in a second application, where the first account and the second account are associated; a third acquisition unit configured to acquire first interaction information to be exchanged; and a sending unit configured to send the first interaction information to the second account of the second application.
In the embodiments of the invention, when a first client of a first application plays a target frame of a first media file and an interaction instruction generated by an interaction operation on the target frame is detected, the object to be interacted with that was selected in the frame picture, indicated by the object indication information carried in the instruction, is identified, and its second account in a second application, associated with the first account, is acquired. After the first interaction information is acquired, it is sent through the first client to the second account of the second application. The first client can thus interact in real time with an object appearing in the frame it is playing; data interaction is no longer limited to the accounts used on the playing platform, which expands the range of data interaction. Moreover, because the interaction takes place directly with the object shown in the target frame, the user need not repeatedly log in to different application platforms, which also improves the efficiency of data interaction.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a data interaction method according to the prior art;
FIG. 2 is a schematic diagram of another data interaction method according to the prior art;
FIG. 3 is a schematic diagram of an application environment of a data interaction method according to an embodiment of the present invention;
FIG. 4 is a flow chart of a method of data interaction according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a data interaction method according to an embodiment of the invention;
FIG. 6 is a schematic diagram of another data interaction method according to an embodiment of the invention;
FIG. 7 is a schematic diagram of yet another data interaction method according to an embodiment of the invention;
FIG. 8 is a flow chart of another method of data interaction according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a data interaction device according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of another data interaction device, according to an embodiment of the present invention;
FIG. 11 is a flow chart of yet another method of data interaction according to an embodiment of the present invention;
FIG. 12 is a flow chart of yet another method of data interaction according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of a data interaction terminal according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of a data interaction server according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
An embodiment of the above data interaction method is provided. As an optional implementation, the method may be, but is not limited to being, applied in the application environment shown in fig. 3. A first client of a first application, logged in with a first account, runs on a terminal 302. When the first client plays a target frame of a first media file, an interaction instruction generated by an interaction operation on the target frame is detected; the instruction carries object indication information of the object to be interacted with selected by the operation. The second account, in a second application, of the indicated object is obtained from a server 306 through a network 304. First interaction information is then acquired and sent by the first client, via the server 306, to the second account of the second application.
In this embodiment, when the first client (logged in with the first account) plays the target frame, the detected interaction instruction carries object indication information of the selected object, whose second account in the second application, associated with the first account, is then obtained. After the first interaction information is acquired, it is sent through the first client to that second account. Real-time data interaction with an object shown in the played target frame is thereby achieved; interaction is no longer limited to accounts used on the playing platform, so the range of data interaction is expanded. In addition, because the interaction happens directly with the object in the target frame, no repeated login to different application platforms is needed, which improves interaction efficiency.
Optionally, in this embodiment, the terminal may include, but is not limited to, at least one of the following: a mobile phone, a tablet computer, a notebook computer, a desktop PC, a digital television, or another hardware device that plays media files. The network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, or a local area network. The above are only examples, and this embodiment is not limited thereto.
In embodiments of the present invention, the data interaction method may be, but is not limited to being, applied in a data interaction system comprising a data interaction terminal and a data interaction server: the terminal acquires the first interaction information to be exchanged, and the server processes that information and sends the processed first interaction information to the second account through a server of the second application.
According to an embodiment of the present invention, there is provided a data interaction method, as shown in fig. 4, the method including:
s402, when a first client of a first application plays a target frame of a first media file, detecting an interactive instruction generated by executing interactive operation on the target frame, wherein the interactive instruction carries object indication information of an object to be interacted selected by the interactive operation, and the first client logs in by using a first account;
s404, acquiring a second account of the object to be interacted in the second application, wherein the object indication information indicates that the object to be interacted is in the second application, and the first account and the second account have an association relationship;
s406, acquiring first interaction information to be interacted;
s408, the first interaction information is sent to a second account of the second application through the first client.
Optionally, in this embodiment, the data interaction method may be applied to, but is not limited to, a media playing platform, such as a television platform, a digital television platform, or a network media playing platform. Optionally, the first application may be, but is not limited to, an application for playing media files, and the second application may be, but is not limited to, an instant messaging application. That is, on the media playing platform, the object to be interacted with in the target frame of the played media file is identified so that interaction information can be sent to that object's account in the messaging application, achieving real-time data interaction with an object displayed in the media file and thereby expanding the range of data interaction. For example, as shown in fig. 5(a), when the first client (playing application App1) plays the target frame of the first media file, a click selection (the interaction operation) on an object A to be interacted with (e.g., the character on the right of the figure) in the frame picture generates an interaction instruction carrying object indication information, here identifying object A. Then, as shown in fig. 5(b), the first client displays a prompt for choosing the interaction platform and interaction account for object A; assume the account ID_02 of object A on the S platform (where the instant messaging application App2 runs) is selected. Further, as shown in fig. 5(c), interaction information is acquired, for example through voice input, and sent to the second account ID_02 of object A in the second application (instant messaging application App2). The display interface of the interaction space of the second client, logged in to the second application with account ID_02, may then appear as shown in fig. 5(d): a post published five minutes earlier by the first account ID_01 of the first application, containing, for example, graphic-and-text information such as a text comment published from the first client together with a screenshot of the corresponding target frame. Further, in this embodiment, the second account may, but is not limited to, send reply information back to the first account ID_01 of the first application. The above is only an example and does not limit this embodiment.
It should be noted that in this embodiment, when the first client (logged in with the first account) plays the target frame and the interaction instruction is detected, the object selected in the frame picture is identified from the object indication information carried in the instruction, and its second account in the second application, associated with the first account, is acquired; the first interaction information is then sent through the first client to that second account. This achieves real-time data interaction between the first client and an object shown in the target frame it is playing, no longer limits interaction to the accounts used on the playing platform, and thus expands the range of data interaction. Since the interaction occurs directly with the object in the target frame, no repeated login to different application platforms is required, which also improves interaction efficiency.
Optionally, in this embodiment, the first interaction information may include, but is not limited to, input information and frame picture indication information, where the frame picture indication information indicates the frame picture corresponding to the target frame in the first media file.
Optionally, in this embodiment, the input information may include, but is not limited to, at least one of character information and voice information. Character information may include, but is not limited to, text and icons, where icons may be, but are not limited to, emoticons. In addition, voice information may be, but is not limited to being, recognized and converted into text in the server of the first application, to facilitate forwarding of the text to the second account of the second application.
Optionally, in this embodiment, the first interaction information may be, but is not limited to being, sent to the server of the first application, which forwards it to a second client of the second application through the server of the second application. The first interaction information may be sent synchronously or asynchronously. In synchronous sending, the input information and the frame indication information are transmitted together to the server of the first application, which directly composes them into interaction information in a predetermined format for forwarding to the server of the second application. In asynchronous sending, the input information and the frame indication information are transmitted to the server of the first application separately, which improves transmission efficiency and reduces transmission load.
In addition, in this embodiment, the interaction information in the predetermined format sent by the server of the first application to the server of the second application may be, but is not limited to, text information, image information, link information, or a combination of any two or three of these. Image information may include, but is not limited to, still pictures (e.g., a screenshot of the frame picture of the target frame) and dynamic pictures (a set of consecutive frame pictures including the target frame). Link information may, but is not limited to, point to the frame picture of the target frame in the first media file. The above is only an example and does not limit this embodiment.
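As a concrete, purely illustrative example of such a predetermined format, the server might combine the three kinds of content into a single structure. The field names, screenshot path, and URI scheme below are assumptions for illustration; the disclosure does not specify them.

```python
def compose_interaction_message(text: str, media_id: str, frame_number: int) -> dict:
    """Combine text, image, and link information into one interaction
    message in a hypothetical predetermined format."""
    return {
        # character information entered (or transcribed from voice) by the user
        "text": text,
        # still picture: assumed screenshot path for the target frame
        "image": f"{media_id}/frames/{frame_number}.jpg",
        # link information pointing at the frame picture of the target frame
        # within the first media file (assumed URI scheme)
        "link": f"app1://play?media={media_id}&frame={frame_number}",
    }
```

A dynamic-picture variant could carry a list of consecutive frame paths instead of a single screenshot.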
Optionally, in this embodiment, the object to be interacted with may be, but is not limited to being, determined according to the operation position of the interaction operation. The determination may be made in, but is not limited to, the following ways:
1) When the first media file is played full screen, the object identifier of the object to be interacted with at the operation coordinate in the target frame may be, but is not limited to being, looked up in a preset mapping according to the frame number of the target frame and the operation coordinate of the interaction operation on the terminal's playing interface, thereby determining the object;
it should be noted that the preset mapping relationship may be, but is not limited to, that before the interactive operation is detected, the server of the first application identifies the object to be interacted displayed in each frame through a face recognition technology, and establishes a one-to-one mapping relationship between the object to be interacted and the frame number and the display coordinate of the corresponding display, so that after the frame number and the operation coordinate of the interactive operation of the target frame are obtained according to the detected interactive operation, the object identifier of the object to be interacted in the frame picture of the target frame is obtained by using the mapping relationship, and then the data interaction with the object to be interacted is achieved.
2) When the first media file is played in a media playing window, mode 1) above may be adopted, which is not described here again. Alternatively, but not exclusively, a predetermined identifier displayed at the operation position of the interaction operation on the first client may be obtained, and the object identifier of the object to be interacted may then be obtained according to the predetermined identifier to determine the object to be interacted.
It should be noted that an image identifier, such as the avatar identifier of the object to be interacted, may be, but is not limited to being, displayed at the above-mentioned operation position, as shown in fig. 6. Specifically, when the interaction operation is triggered by clicking the avatar identifier "object A" shown in fig. 6, the object identifier of the corresponding object to be interacted is obtained according to the avatar identifier, so as to determine the object to be interacted.
Optionally, in this embodiment, after the first interaction information is sent to the second account of the second application by the first client, the method may further include, but is not limited to: receiving second interaction information sent by a second account of a second application through a first client; and displaying the second interactive information at the first client.
It should be noted that the second interaction information may be, but is not limited to, interaction information associated with the first interaction information; for example, the second interaction information is reply information for the first interaction information. As shown in fig. 5 (d), after the first interaction message published by the first account is clicked in the second client, the second account ID_02 may send the second interaction information to the first account ID_01 of the first application through the second client, so as to implement real-time interaction with the first account. Here, the second interaction information may be, but is not limited to being, routed according to a message reply policy: after the server of the second application acquires the second interaction information, it may identify the sending target path of the second interaction information by using the source identifier of the first interaction information, thereby sending the second interaction information to the first client through the server of the first application. When the first account is offline, the second interaction information may be, but is not limited to being, cached in the server of the first application and pushed from the server of the first application to the first client after the first account logs in.
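The reply routing and offline caching just described can be sketched as follows. All names (`FirstAppServer`, `route_reply`, the shape of the source identifier) are illustrative assumptions; the patent does not specify data formats.

```python
class FirstAppServer:
    """Caches replies for offline accounts and pushes them on login."""

    def __init__(self):
        self.online = {}          # account id -> connected client (a list here)
        self.offline_cache = {}   # account id -> replies pending delivery

    def deliver(self, account_id, message):
        if account_id in self.online:
            self.online[account_id].append(message)   # push immediately
        else:
            self.offline_cache.setdefault(account_id, []).append(message)

    def login(self, account_id, client):
        self.online[account_id] = client
        # Push everything cached while the first account was offline.
        for message in self.offline_cache.pop(account_id, []):
            client.append(message)


def route_reply(first_app_server, reply, source_id):
    """The second application's server resolves the sending target path of
    the second interaction information from the source identifier carried
    by the first interaction information, then hands the reply over."""
    first_app_server.deliver(source_id["first_account"], reply)


server = FirstAppServer()
route_reply(server, {"text": "reply: ..."}, {"first_account": "ID_01"})
client = []
server.login("ID_01", client)   # the cached reply is pushed after login
```

Because the first account was offline when the reply arrived, it lands in the cache and is flushed to the client on login, matching the behavior described above.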
Optionally, in this embodiment, when the first account logs in, displaying the second interaction information at the first client may be, but is not limited to:
s1, judging whether the first client continues playing the first media file;
s2, when the first client continues playing the first media file, displaying the second interactive information on the playing picture of the first media file;
s3, when the first client does not play the first media file continuously, the first client prompts to receive the second interactive information; and after the control instruction for displaying the second interactive information is acquired, displaying the second interactive information in a preset window of the first client.
The predetermined window may be, but is not limited to, a message center window of the first client, which is only an example, and this is not limited in this embodiment.
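Steps S1-S3 above amount to a small display-dispatch decision, which can be sketched as below. The function name, constants, and client-state dictionary are assumptions made for illustration.

```python
OVERLAY_ON_PLAYBACK = "overlay_on_playback"
PROMPT_THEN_WINDOW = "prompt_then_predetermined_window"

def display_second_interaction(client_state, second_info):
    """S1: check whether the first client is still playing the first media
    file. S2: if so, display the information on the playing picture.
    S3: otherwise prompt receipt, and show it in a predetermined window
    (e.g. the message center) once a display control instruction arrives."""
    if client_state["playing_first_media_file"]:          # S1 -> S2
        client_state["overlay"].append(second_info)
        return OVERLAY_ON_PLAYBACK
    client_state["prompts"].append(second_info)           # S3
    return PROMPT_THEN_WINDOW
```

The S3 branch only queues a prompt; actually rendering the information in the predetermined window would wait for the user's control instruction.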
For example, as shown in fig. 7 (a), after "reply" is clicked on the second client, the second interaction information is input and sent. If the first account is online and the first client is still playing the first media file, as shown in fig. 7 (b), the replied second interaction information may be displayed on the playing screen of the first client, such as "reply: …".
Through the embodiment provided by this application, when a first client logged in with a first account of a first application plays a target frame of a first media file and an interaction instruction generated by performing an interaction operation on the target frame is detected, the object to be interacted selected by the interaction operation in the frame picture of the target frame, as indicated by the object indication information carried in the interaction instruction, is acquired, and a second account of the object to be interacted in a second application is acquired, where the second account has an association relationship with the first account. After the first interaction information to be interacted is acquired, the first interaction information is sent to the second account of the second application through the first client. This achieves real-time data interaction with the object to be interacted in the target frame played by the first client, so that data interaction is no longer limited to the accounts used for playing the media file, thereby expanding the range of data interaction.
As an optional scheme, the obtaining first interaction information to be interacted includes:
S1, acquiring input information and frame picture indication information, where the frame picture indication information is used to indicate a frame picture corresponding to the target frame in the first media file, and the first interaction information includes the input information and the frame picture indication information.
Optionally, in this embodiment, the acquiring the input information includes:
1) acquiring input character information, wherein the character information comprises: text and/or icons; or
2) Acquiring input voice information, and recognizing the character information to be used for interaction from the voice information.
It should be noted that the icons may be, but are not limited to, expressions for interaction, and the above is only an example, and this is not limited in this embodiment.
In addition, in this embodiment, the server performs format conversion on the voice information sent by the first client to recognize the corresponding character information, so that the character information can be provided directly to the second client for display. Moreover, inputting information by voice greatly simplifies the interaction operation and improves interaction efficiency.
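The server-side handling of the two input kinds can be sketched as a small normalization step. The `recognize` function below is a placeholder standing in for whatever speech-to-text backend the server would use; the patent only says that character information is recognized from the voice information, so every name here is an assumption.

```python
def recognize(voice_bytes):
    """Placeholder speech-to-text. A real deployment would call an ASR
    service here; this stub just decodes bytes for demonstration."""
    return voice_bytes.decode("utf-8", errors="ignore")


def normalize_input(input_info):
    """Return the character information to forward to the second client."""
    if input_info["type"] == "text":     # text and/or icons pass through
        return input_info["payload"]
    if input_info["type"] == "voice":    # converted on the server side
        return recognize(input_info["payload"])
    raise ValueError("unsupported input type: %r" % input_info["type"])
```

Either way, the second client receives plain character information and never needs to handle raw audio itself.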
Optionally, in this embodiment, the acquiring the frame picture indication information includes one of:
1) acquiring a frame picture on a target frame in a first media file;
2) acquiring a group of continuous frame pictures including a target frame in a first media file;
3) acquiring first link information, wherein the first link information is used for indicating a frame picture on a target frame in a first media file;
4) second link information is acquired, wherein the second link information is used for indicating a group of continuous frame pictures including the target frame in the first media file.
It should be noted that, in this embodiment, the server of the first application may, but is not limited to, perform a composition operation on the input information and the frame picture indicated by the frame picture indication information sent by the first client, so as to obtain the predetermined format information to be sent. That is, at least one of the items of frame picture information for the target frame listed above may be, but is not limited to being, synthesized into the predetermined format information.
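The composition operation can be sketched as packing the input information together with one of the four kinds of frame picture indication information into a single message. The JSON shape and field names below are assumptions for illustration; the patent does not define the predetermined format.

```python
import json

def synthesize(input_text, frame_indication):
    """Merge the input information with the frame picture indication
    information (a frame picture, a set of consecutive frame pictures, or
    first/second link information) into one predetermined-format message."""
    return json.dumps({
        "kind": frame_indication["kind"],    # e.g. "still", "dynamic", "link"
        "text": input_text,
        "frame": frame_indication["value"],
    })


packed = synthesize("great scene!", {"kind": "still", "value": "frame_1024.png"})
```

The resulting single message is what the server of the first application forwards to the server of the second application.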
In addition, in this embodiment, the first link information and the second link information may be, but are not limited to being, displayed on the second client, so that the second account can jump, through the second client, to the position indicated by the link information and start playing the first media file played by the first client. The playing progress of the first client can also be shared; for example, the second account can start playing from the target frame. This ensures the real-time performance of data interaction, and the content association further ensures the intuitiveness of the interaction content.
For example, as shown in fig. 5 (d), the frame picture on the target frame in the first media file and the input information "show-while-stick" are synthesized to obtain the predetermined format information, and the predetermined format information is transmitted to the second account and displayed in the second client.
According to the embodiment provided by the application, the input information and the frame indication information are acquired, and the information obtained by synthesizing the input information and the frame indication information is sent to the second account of the second application, so that the data interaction is realized, and meanwhile, the second account is enabled to synchronize the playing progress of the first account, so that the real-time data interaction related to the content is realized.
As an optional scheme, after the first interaction information is sent to the second account of the second application by the first client, the method further includes:
s1, receiving second interaction information sent by a second account of a second application through a first client;
and S2, displaying the second interactive information on the first client.
Optionally, in this embodiment, the receiving, by the first client, the second interaction information sent by the second account of the second application includes: and receiving second interaction information through the first client, wherein the second interaction information is sent by a second client of a second application and is sent to the first client through a server of the second application and a server of the first application, and the second client logs in by using a second account.
According to the embodiment provided by the application, the second interactive information sent by the second client is received and displayed in the first client, so that one-to-one real-time interaction between the first account and the second account is achieved, and privacy and safety of data interaction are guaranteed.
As an alternative, the displaying the second interaction information at the first client includes:
s1, judging whether the first client continues playing the first media file;
s2, when the first client continues playing the first media file, displaying the second interactive information on the playing picture of the first media file;
s3, when the first client does not play the first media file continuously, the first client prompts to receive the second interactive information; and after the control instruction for displaying the second interactive information is acquired, displaying the second interactive information in a preset window of the first client.
Optionally, in this embodiment, after the server of the first application receives the second interaction information, it is determined whether the first account is online. When the first account is offline, the server of the first application may, but is not limited to, cache the second interaction information in the server of the first application, and push the second interaction information from the server of the first application to the first client after the first account logs in. When the first account is online, it may be, but is not limited to, determining whether the first client continues to play the first media file; and then determining the display position of the second interactive information according to the judgment result.
For example, as shown in fig. 7 (a), after "reply" is clicked on the second client, the second interaction information is input and sent. If the first account is online and the first client is still playing the first media file, as shown in fig. 7 (b), the replied second interaction information may be displayed on the playing screen of the first client, such as "reply: …". If the first account is online but the first client is not continuing to play the first media file, or the first account is offline, receipt of the second interaction information may be prompted at a message center of the first client; after a control instruction for displaying the second interaction information is obtained according to the prompt information (e.g., the prompt information is clicked, or the message center is entered), the second interaction information is displayed in a predetermined window of the first client, e.g., at the message center, or in a new window popped up at the first client.
Through the embodiment provided by the application, whether the first client continues to play the first media file is judged, so that different display of the second interactive information is realized according to different judgment results, the first client is ensured not to miss the second interactive information to be displayed, and the effect of improving the flexibility of display of the second interactive information is realized.
According to an embodiment of the present invention, there is provided a data interaction method, as shown in fig. 8, the method including:
s802, a data interaction request sent by a first client of a first application is obtained, wherein the data interaction request at least carries object indication information used for indicating an object to be interacted, the object to be interacted is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client logs in by using a first account;
s804, acquiring a second account of the object to be interacted in a second application according to the object indication information, wherein the first account and the second account have an association relationship;
s806, acquiring first interaction information to be interacted;
and S808, sending the first interaction information to a second account of the second application.
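Steps S802-S808 can be sketched end to end on the server side as below. All names and data shapes (`handle_data_interaction_request`, the request dictionary, the `account_directory` that records the first-account/second-account association) are illustrative assumptions, not specified by the patent.

```python
class SecondAppStub:
    """Stands in for the server of the second application."""
    def __init__(self):
        self.sent = []

    def send(self, account, info):
        self.sent.append((account, info))


def handle_data_interaction_request(request, account_directory, second_app):
    # S802: the request carries object indication information for the object
    # displayed in the frame picture of the target frame.
    object_id = request["object_indication"]
    # S804: look up the object's second account; the association relationship
    # between the first and second accounts is assumed to live in the directory.
    second_account = account_directory[object_id]
    # S806: obtain the first interaction information to be interacted.
    info = request["interaction_info"]
    # S808: send it to the second account of the second application.
    second_app.send(second_account, info)
    return second_account


app2 = SecondAppStub()
target = handle_data_interaction_request(
    {"object_indication": "object_a", "interaction_info": {"text": "hi"}},
    {"object_a": "ID_02"},
    app2,
)
```

Note that the first client never addresses the second account directly; it only names the object, and the server resolves the account.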
Optionally, in this embodiment, the data interaction method may be applied to, but is not limited to, a media playing platform, such as a television platform, a digital television platform, or a network media playing platform. Optionally, in this embodiment, the first application may be, but is not limited to, an application for playing a media file, and the second application may be, but is not limited to, an application for instant messaging. That is to say, on the media playing platform, the object to be interacted in the target frame of the played media file is acquired, so that the interaction information is sent to the account of the object to be interacted in the communication application, thereby achieving real-time data interaction with the object to be interacted displayed in the media file and expanding the data interaction range. For example, as shown in fig. 5 (a), when a first client (playing application App1) of a first application plays to a target frame of a first media file, a click selection operation (i.e., an interaction operation) is performed on an object A to be interacted (e.g., the character on the right side of the figure) in the frame picture of the target frame to generate an interaction instruction, where the interaction instruction carries object indication information of the object to be interacted, for example, the object indication information indicates the object A to be interacted. Then, as shown in fig. 5 (b), prompt information for determining an interaction platform and an interaction account for interacting with the object A to be interacted is displayed in the first client of the first application, and it is assumed that the account ID_02 of the object A to be interacted on the S platform (the platform where the instant messaging application App2 is located) is selected for interaction. Further, as shown in fig. 5 (c), the interaction information is acquired, for example, in a voice input manner, and is sent to the second account ID_02 of the object A to be interacted in the second application (the instant messaging application App2). The display interface of the interaction space of the second client, which logs in to the second application using the second account ID_02, may be as shown in fig. 5 (d), where the graphic information published 5 minutes earlier by the first account ID_01 of the first application is displayed; for example, the graphic information includes the text comment published by the first client and the screenshot of the corresponding target frame. Further, in this embodiment, the second account may, but is not limited to, send reply information back to the first account ID_01 of the first application through the second client. The above is only an example, and this is not limited in this embodiment.
It should be noted that, in this embodiment, a data interaction request sent by a first client of a first application is acquired, where the data interaction request carries at least object indication information for indicating an object to be interacted, and the object to be interacted is displayed in the frame picture of a target frame of a first media file played by the first client. A second account of the object to be interacted in a second application is then acquired according to the object indication information, where the first account and the second account have an association relationship; the first interaction information to be interacted is acquired and sent to the second account of the second application. In this way, after the first interaction information is acquired, it is sent to the second account of the second application on behalf of the first client, so that real-time data interaction is performed with the object to be interacted in the target frame played by the first client, and data interaction is no longer limited to the accounts used for playing the media file, thereby expanding the data interaction range. Furthermore, because the data interaction is carried out directly and in real time with the object to be interacted in the target frame played by the first client, the login operation does not need to be repeatedly executed to log in to different application platforms, which further improves data interaction efficiency.
Optionally, in this embodiment, the first interaction information may include, but is not limited to: the input information and frame picture indication information, the frame picture indication information is used for indicating the frame picture corresponding to the target frame in the first media file.
Optionally, in this embodiment, the input information may include, but is not limited to, at least one of the following: character information, voice information. The character information may include, but is not limited to: text, icons, where the icons may be, but are not limited to, emoticons. In addition, the voice message may be, but is not limited to, recognized and converted into text message in the server of the first application, so as to facilitate forwarding of the text message to the second account of the second application.
Optionally, in this embodiment, the first interaction information may be, but is not limited to being, sent to the server of the first application, and the server of the first application sends the first interaction information to the second client of the second application through the server of the second application. The sending method of the first interaction information may include, but is not limited to, synchronous sending and asynchronous sending. That is, the input information and the frame picture indication information may be sent to the server of the first application at the same time, so that the server of the first application directly synthesizes the content into interaction information in the predetermined format for forwarding to the server of the second application. Alternatively, the input information and the frame picture indication information may be sent asynchronously to the server of the first application, so as to improve the transmission efficiency of the information and reduce the transmission load.
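The difference between the two sending modes can be sketched as follows: in the synchronous mode both parts arrive in one call, while in the asynchronous mode each part is buffered until its counterpart arrives. The class and its method names are assumptions for illustration.

```python
class InteractionAssembler:
    """Server-side assembly of the first interaction information.

    Synchronous mode: input information and frame picture indication
    information arrive together and are merged at once. Asynchronous mode:
    each part is buffered until the other arrives, spreading the
    transmission load across two smaller uploads.
    """

    def __init__(self):
        self._partial = {}   # request id -> parts received so far

    def receive_synchronous(self, request_id, text, frame_ref):
        return {"id": request_id, "text": text, "frame": frame_ref}

    def receive_asynchronous(self, request_id, part_name, value):
        parts = self._partial.setdefault(request_id, {})
        parts[part_name] = value
        if "text" in parts and "frame" in parts:   # both halves arrived
            del self._partial[request_id]
            return {"id": request_id, **parts}
        return None   # still waiting for the other part
```

In either mode the assembled result is the same; only the arrival pattern differs, which is what lets the asynchronous mode reduce per-request transmission load.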
In addition, in this embodiment, the interactive information in the predetermined format sent by the server of the first application to the server of the second application may be, but is not limited to, text information, image information, link information, and combination information of any two or three of the above information. The image information may include, but is not limited to: still pictures (e.g., screenshot of frame of target frame), dynamic pictures (a set of consecutive frame pictures comprising target frame). The link information may be, but not limited to, a frame picture indicating a target frame in the first media file. The above is only an example, and this is not limited in this embodiment.
Optionally, in this embodiment, the server of the first application may further, but is not limited to: receive second interaction information sent by the second account through the server of the second application; and send the second interaction information to the first client for display.
It should be noted that the second interaction information may be, but is not limited to, interaction information associated with the first interaction information; for example, the second interaction information is reply information for the first interaction information. As shown in fig. 5 (d), after the first interaction message published by the first account is clicked in the second client, the second account ID_02 may send the second interaction information to the first account ID_01 of the first application through the second client, so as to implement real-time interaction with the first account. Here, the second interaction information may be, but is not limited to being, routed according to a message reply policy: after the server of the second application acquires the second interaction information, it may identify the sending target path of the second interaction information by using the source identifier of the first interaction information, thereby sending the second interaction information to the first client through the server of the first application.
According to the embodiment provided by this application, a data interaction request sent by a first client of a first application is obtained, where the data interaction request carries at least object indication information for indicating an object to be interacted, and the object to be interacted is displayed in the frame picture of a target frame of a first media file played by the first client. A second account of the object to be interacted in a second application is then obtained according to the object indication information, where the first account and the second account have an association relationship; the first interaction information to be interacted is obtained and sent to the second account of the second application. Therefore, after the first interaction information is obtained, it is sent to the second account of the second application on behalf of the first client, achieving real-time data interaction with the object to be interacted in the target frame played by the first client, so that data interaction is no longer limited to the accounts used for playing the media file, thereby expanding the data interaction range.
As an optional scheme, the obtaining first interaction information to be interacted includes:
S1, obtaining information and frame picture indication information sent by the first client, where the frame picture indication information is used to indicate a frame picture corresponding to the target frame in the first media file, and the first interaction information includes the information and the frame picture indication information sent by the first client.
Optionally, in this embodiment, the obtaining information sent by the first client includes:
1) acquiring character information sent by a first client, wherein the character information comprises: text and/or icons; or
2) Acquiring voice information sent by a first client; and recognizing character information to be interacted from the voice information.
It should be noted that the icons may be, but are not limited to, expressions for interaction, and the above is only an example, and this is not limited in this embodiment.
In addition, in this embodiment, the server of the first application performs format conversion on the voice information sent by the first client to recognize the corresponding character information, so that the character information can be provided directly to the second client for display. Moreover, inputting information by voice greatly simplifies the interaction operation and improves interaction efficiency.
Optionally, in this embodiment, sending the first interaction information to the second account of the second application includes:
s1, synthesizing the information sent by the first client and the frame picture indication information into the information with the preset format;
s2, transmitting the predetermined format information.
It should be noted that, in this embodiment, the server of the first application may, but is not limited to, perform a composition operation on the information sent by the first client and the frame picture indicated by the frame picture indication information, so as to obtain the predetermined format information to be sent. At least one of the following items of frame picture information for the target frame may be, but is not limited to being, synthesized into the above-mentioned predetermined format information:
1) A frame on a target frame in a first media file;
2) a set of consecutive frame pictures comprising a target frame in the first media file;
3) first link information, wherein the first link information is used for indicating a frame picture on a target frame in the first media file;
4) second link information, wherein the second link information is used for indicating a group of consecutive frame pictures including the target frame in the first media file.
In addition, in this embodiment, the first link information and the second link information may be, but are not limited to being, displayed on the second client, so that the second account can jump, through the second client, to the position indicated by the link information and start playing the first media file played by the first client. The playing progress of the first client can also be shared; for example, the second account can start playing from the target frame. This ensures the real-time performance of data interaction, and the content association further ensures the intuitiveness of the interaction content.
For example, as shown in fig. 5 (d), the frame picture on the target frame in the first media file and the input information "show-while-stick" are synthesized to obtain the predetermined format information, and the predetermined format information is transmitted to the second account and displayed in the second client.
Through the embodiment provided by the application, the second account can directly jump to the playing platform through the link information, and the playing position of the first account, namely the target frame of the first media file, is synchronously played, so that the second account can continuously play the first media file from the target frame, and the effect of sharing the playing progress is realized.
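One way to picture the first/second link information is as a deep link that names the media file and the target frame, so the second client can open the same file and seek to the shared position. The `app1://` scheme and parameter names below are invented for this sketch; the patent does not define a link format.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def make_play_link(media_id, target_frame):
    """Build link information carrying the media file id and the target
    frame, so the playing progress of the first client can be shared."""
    return "app1://play?" + urlencode({"media": media_id, "frame": target_frame})


def resume_from_link(link):
    """What the second client would do on tap: parse the link, open the
    first media file, and start playing from the target frame."""
    params = parse_qs(urlparse(link).query)
    return params["media"][0], int(params["frame"][0])


link = make_play_link("media_001", 1024)
```

Parsing the link back recovers exactly the media identifier and frame number, which is the "synchronized playing progress" effect described above.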
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
According to an embodiment of the present invention, there is also provided a data interaction apparatus for implementing the data interaction method, as shown in fig. 9, the apparatus includes:
1) a detecting unit 902, configured to detect, when a first client of a first application plays a target frame of a first media file, an interaction instruction generated by performing an interaction operation on the target frame, where the interaction instruction carries object indication information of an object to be interacted, the object being selected by the interaction operation, and the first client logs in using a first account;
2) a first obtaining unit 904, configured to obtain a second account, in the second application, of the object to be interacted indicated by the object indication information, where the first account and the second account have an association relationship;
3) a second obtaining unit 906, configured to obtain first interaction information to be interacted;
4) a sending unit 908, configured to send the first interaction information to the second account of the second application through the first client.
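The four units above can be sketched structurally as one class with a method per unit. Every name and body below is an illustrative placeholder; only the unit layout (902-908) follows the text.

```python
class DataInteractionApparatus:
    """Structural sketch of the apparatus of fig. 9."""

    def __init__(self, account_directory, transport):
        self._directory = account_directory   # first/second account associations
        self._transport = transport           # callable(second_account, info)

    def detecting_unit(self, playback_event):             # unit 902
        """Pick up the interaction instruction raised on the target frame."""
        return playback_event.get("interaction_instruction")

    def first_obtaining_unit(self, instruction):          # unit 904
        """Resolve the second account from the object indication information."""
        return self._directory[instruction["object_indication"]]

    def second_obtaining_unit(self, instruction):         # unit 906
        """Obtain the first interaction information to be interacted."""
        return instruction["interaction_info"]

    def sending_unit(self, second_account, info):         # unit 908
        self._transport(second_account, info)


sent = []
apparatus = DataInteractionApparatus({"object_a": "ID_02"},
                                     lambda acct, info: sent.append((acct, info)))
instr = apparatus.detecting_unit({"interaction_instruction": {
    "object_indication": "object_a", "interaction_info": {"text": "hello"}}})
apparatus.sending_unit(apparatus.first_obtaining_unit(instr),
                       apparatus.second_obtaining_unit(instr))
```

The driver at the bottom chains the units in the order the method embodiment describes: detect, resolve the account, fetch the interaction information, send.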
Optionally, in this embodiment, the data interaction apparatus may be applied to, but is not limited to, a media playing platform, such as a television platform, a digital television platform, or a network media playing platform. Optionally, in this embodiment, the first application may be, but is not limited to, an application for playing a media file, and the second application may be, but is not limited to, an application for instant messaging. That is to say, on the media playing platform, the object to be interacted in the target frame of the played media file is acquired, so that the interaction information is sent to the account of the object to be interacted in the communication application, thereby achieving real-time data interaction with the object to be interacted displayed in the media file and expanding the data interaction range. For example, as shown in fig. 5 (a), when a first client (playing application App1) of a first application plays to a target frame of a first media file, a click selection operation (i.e., an interaction operation) is performed on an object A to be interacted (e.g., the character on the right side of the figure) in the frame picture of the target frame to generate an interaction instruction, where the interaction instruction carries object indication information of the object to be interacted, for example, the object indication information indicates the object A to be interacted. Then, as shown in fig. 5 (b), prompt information for determining an interaction platform and an interaction account for interacting with the object A to be interacted is displayed in the first client of the first application, and it is assumed that the account ID_02 of the object A to be interacted on the S platform (the platform where the instant messaging application App2 is located) is selected for interaction. Further, as shown in fig. 5 (c), the interaction information is acquired, for example, in a voice input manner, and is sent to the second account ID_02 of the object A to be interacted in the second application (the instant messaging application App2). The display interface of the interaction space of the second client, which logs in to the second application using the second account ID_02, may be as shown in fig. 5 (d), where the graphic information published 5 minutes earlier by the first account ID_01 of the first application is displayed; for example, the graphic information includes the text comment published by the first client and the screenshot of the corresponding target frame. Further, in this embodiment, the second account may, but is not limited to, send reply information back to the first account ID_01 of the first application through the second client. The above is only an example, and this is not limited in this embodiment.
It should be noted that, in this embodiment, when a first client of a first application, logged in with a first account, plays a target frame of a first media file and an interaction instruction generated by performing an interaction operation on the target frame is detected, the object to be interacted with that was selected by the interaction operation in the frame picture of the target frame, as indicated by the object indication information carried in the interaction instruction, is acquired, and a second account of that object in a second application is acquired, where the second account has an association relationship with the first account. After the first interaction information is acquired, it is sent to the second account of the second application by the first client. Real-time data interaction with the object to be interacted with in the target frame played by the first client is thereby achieved through the first client, rather than data interaction being limited to the accounts used for playing the media file, so the range of data interaction is expanded. Furthermore, the data interaction is carried out directly and in real time with the object in the target frame played by the first client, without repeatedly executing login operations on different application platforms, which further improves data interaction efficiency.
Optionally, in this embodiment, the first interaction information may include, but is not limited to: input information and frame picture indication information, where the frame picture indication information is used to indicate the frame picture corresponding to the target frame in the first media file.
Optionally, in this embodiment, the input information may include, but is not limited to, at least one of the following: character information and voice information. The character information may include, but is not limited to, text and icons, where the icons may be, but are not limited to, emoticons. In addition, the voice information may be, but is not limited to being, recognized and converted into text information in the server of the first application, so as to facilitate forwarding the text information to the second account of the second application.
Optionally, in this embodiment, the sending unit includes: a sending module, configured to send the first interaction information to a server of the first application through the first client, and to instruct the server of the first application to send the first interaction information to a second client of the second application through a server of the second application, where the second client logs in with the second account.
That is, the first interaction information may be, but is not limited to being, sent to a server of the first application, and the server of the first application sends it to a second client of the second application through a server of the second application. The sending method of the first interaction information may include, but is not limited to, synchronous sending and asynchronous sending. That is, the input information and the frame picture indication information may be sent to the server of the first application at the same time, so that the server directly synthesizes the content into interaction information in the predetermined format for forwarding to the server of the second application. Alternatively, the input information and the frame picture indication information may be sent to the server of the first application asynchronously, so as to improve transmission efficiency and reduce transmission load.
In addition, in this embodiment, the interaction information in the predetermined format sent by the server of the first application to the server of the second application may be, but is not limited to, text information, image information, link information, or a combination of any two or three of the above. The image information may include, but is not limited to, still pictures (e.g., a screenshot of the frame picture of the target frame) and dynamic pictures (a set of consecutive frame pictures including the target frame). The link information may be, but is not limited to being, used to indicate the frame picture of the target frame in the first media file. The above is only an example, and this is not limited in this embodiment.
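A minimal sketch of one possible shape for this predetermined-format interaction information, combining text, image, and link parts. This is an illustration only; all class and field names (`InteractiveMessage`, `still_picture`, the `app1://` link scheme) are assumptions, not structures defined by this embodiment:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InteractiveMessage:
    source_account: str                    # first account, e.g. "ID_01"
    target_account: str                    # second account, e.g. "ID_02"
    text: Optional[str] = None             # character information (text/emoticons)
    still_picture: Optional[bytes] = None  # screenshot of the target frame
    frame_sequence: List[bytes] = field(default_factory=list)  # consecutive frames
    link: Optional[str] = None             # link indicating the target frame

    def is_valid(self) -> bool:
        # The predetermined format allows any single part or combination,
        # but at least one content part must be present.
        return any([self.text, self.still_picture, self.frame_sequence, self.link])

msg = InteractiveMessage(source_account="ID_01", target_account="ID_02",
                         text="Great scene!", link="app1://media/1?frame=1024")
assert msg.is_valid()
```

The combination rule ("any two or three of the above") then reduces to populating more than one of the optional fields before forwarding to the server of the second application.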
Optionally, in this embodiment, the object to be interacted may be determined according to, but not limited to, an operation position of the interaction operation. The determining method may include, but is not limited to:
1) when the first media file is played in full screen mode, the object identifier of the object to be interacted with corresponding to the operation coordinate in the target frame may be, but is not limited to being, searched from a preset mapping relationship according to the frame number of the target frame and the operation coordinate of the interaction operation on the terminal playing interface, so as to determine the object to be interacted with;
it should be noted that the preset mapping relationship may be, but is not limited to being, established as follows: before the interaction operation is detected, the server of the first application identifies the objects to be interacted with displayed in each frame through a face recognition technology, and establishes a one-to-one mapping between each object and the frame number and display coordinates at which it is displayed. After the frame number of the target frame and the operation coordinate of the interaction operation are obtained from the detected interaction operation, the object identifier of the object in the frame picture of the target frame is obtained using this mapping, and data interaction with the object is then carried out.
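The lookup described above can be sketched as a bounding-box hit test. This is a hypothetical illustration (the patent does not specify the mapping's data layout); the frame numbers, coordinates, and object identifiers below are invented for demonstration:

```python
from typing import Dict, List, Optional, Tuple

Box = Tuple[int, int, int, int]  # (x_min, y_min, x_max, y_max)

# Preset mapping built ahead of time by the server (e.g. via face recognition):
# frame number -> list of (object identifier, display bounding box).
frame_object_map: Dict[int, List[Tuple[str, Box]]] = {
    1024: [("object_A", (600, 100, 900, 500)),
           ("object_B", (50, 120, 350, 480))],
}

def resolve_object(frame_no: int, x: int, y: int) -> Optional[str]:
    """Map (frame number, operation coordinate) to an object identifier."""
    for object_id, (x0, y0, x1, y1) in frame_object_map.get(frame_no, []):
        if x0 <= x <= x1 and y0 <= y <= y1:
            return object_id
    return None  # the click did not hit any recognized object

assert resolve_object(1024, 700, 300) == "object_A"
```

In windowed (non-full-screen) playback, the client-side operation coordinate would first need to be scaled from window coordinates to the frame's native coordinate system before this lookup applies.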
2) when the first media file is played in a media playing window, the above mode 1) may be adopted, which is not described again in this example; alternatively, but not limited to this, a predetermined identifier displayed at the operation position of the interaction operation on the first client may be acquired, and the object identifier of the object to be interacted with is then obtained according to the predetermined identifier so as to determine the object.
It should be noted that the above operation position may, but is not limited to, display an image identifier, such as an avatar identifier of an object to be interacted with, as shown in fig. 6. Specifically, when the interaction operation is triggered by clicking the avatar identifier "object A" shown in fig. 6, the object identifier of the corresponding object is obtained according to the avatar identifier, so as to determine the object to be interacted with.
Optionally, in this embodiment, the apparatus further includes: a receiving unit, configured to receive, through the first client, second interaction information sent by the second account of the second application after the first interaction information has been sent to that account; and a display unit, configured to display the second interaction information on the first client.
It should be noted that the second interaction information may be, but is not limited to being, interaction information associated with the first interaction information; for example, the second interaction information is reply information for the first interaction information. As shown in fig. 5 (d), after the first interaction message published by the first account is clicked in the second client, the second account ID_02 may send the second interaction information to the first account ID_01 of the first application through the second client, so as to implement real-time interaction with the first account. Here, the second interaction information may be, but is not limited to being, returned according to a message reply policy: after the server of the second application acquires the second interaction information, it may, but is not limited to, identify the sending target path of the second interaction information using the source identifier of the first interaction information, thereby sending the second interaction information to the first client through the server of the first application. When the first account is offline, the second interaction information may be, but is not limited to being, cached in the server of the first application and pushed from that server to the first client after the first account logs in.
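The message-reply policy above can be sketched as a source-identifier route table kept by (or shared with) the server of the second application. The table structure and function names here are assumptions for illustration, not part of the patent's claims:

```python
# source identifier of the first interaction information -> return path
route_table = {}  # source_id -> (origin platform/server, origin account)

def register_source(source_id: str, platform: str, account: str) -> None:
    """Record the return path when the first interaction information arrives."""
    route_table[source_id] = (platform, account)

def route_reply(source_id: str, reply_text: str) -> dict:
    """Resolve the delivery target for second interaction information."""
    platform, account = route_table[source_id]
    return {"forward_to": platform, "account": account, "body": reply_text}

# First interaction information msg-001 came from ID_01 via App1's server.
register_source("msg-001", "App1-server", "ID_01")
target = route_reply("msg-001", "Thanks for watching!")
assert target == {"forward_to": "App1-server", "account": "ID_01",
                  "body": "Thanks for watching!"}
```

The key property is that the second client never needs an account on the first platform: the source identifier alone is enough for the servers to relay the reply back through the server of the first application.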
Optionally, in this embodiment, the display unit includes:
(1) a judging module, configured to judge whether the first client continues to play the first media file;
(2) a first display module, configured to display the second interaction information on the playing picture of the first media file when the first client continues to play the first media file;
(3) a second display module, configured to prompt, when the first client does not continue to play the first media file, that the first client has received the second interaction information, and to display the second interaction information in a predetermined window of the first client after a control instruction for displaying the second interaction information is acquired.
The predetermined window may be, but is not limited to, a message center window of the first client. The above is only an example, and this is not limited in this embodiment.
For example, as shown in fig. 7 (a), after "reply" is clicked in the second client, the second interaction information is input and sent. If the first account is online and the first client is still playing the first media file, as shown in fig. 7 (b), the replied second interaction information may be displayed on the playing screen of the first client, such as "Reply: …".
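The judging/display modules above amount to a two-way branch on the playing state. A minimal sketch, with return values invented purely to label the two display paths:

```python
def display_second_interaction(is_playing: bool, info: str) -> str:
    """Decide where the first client shows second interaction information."""
    if is_playing:
        # First display module: overlay on the playing picture of the media file.
        return "overlay:" + info
    # Second display module: prompt first; the information is shown in the
    # predetermined window (e.g. message center) only after a control
    # instruction for displaying it is acquired.
    return "message_center_prompt:" + info

assert display_second_interaction(True, "Reply: nice!") == "overlay:Reply: nice!"
assert display_second_interaction(False, "Reply: nice!").startswith("message_center_prompt")
```

Deferring the not-playing case behind a prompt avoids interrupting whatever the user switched to, while still guaranteeing the reply is not lost.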
Through the embodiment provided by this application, when a first client logged in with a first account of a first application plays a target frame of a first media file and an interaction instruction generated by performing an interaction operation on the target frame is detected, the object to be interacted with that was selected by the interaction operation in the frame picture of the target frame, as indicated by the object indication information carried in the interaction instruction, is acquired, and a second account of that object in a second application is acquired, where the second account has an association relationship with the first account. After the first interaction information is acquired, it is sent to the second account of the second application through the first client, achieving real-time data interaction with the object to be interacted with in the target frame played by the first client, rather than data interaction being limited to the accounts used for playing the media file, so the range of data interaction is expanded.
As an optional solution, the second obtaining unit includes:
1) a first obtaining module, configured to obtain input information and frame picture indication information, where the frame picture indication information is used to indicate the frame picture corresponding to the target frame in the first media file, and the first interaction information includes the input information and the frame picture indication information.
Optionally, in this embodiment, the first obtaining module includes:
(1) a first obtaining sub-module, configured to obtain input character information, where the character information includes text and/or icons; or
(2) a second obtaining sub-module, configured to obtain input voice information, and to recognize the character information to be interacted from the voice information.
It should be noted that the icons may be, but are not limited to, emoticons used for interaction. The above is only an example, and this is not limited in this embodiment.
In addition, in this embodiment, the server performs format conversion on the voice information sent by the first client to recognize the corresponding character information, so that it can be conveniently provided to the second client for direct display. Furthermore, inputting information by voice greatly simplifies the interaction operation and improves interaction efficiency.
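The two sub-modules can be sketched as a single entry point that accepts either character information directly or voice information routed through recognition. `recognize_speech` here is a stand-in placeholder; a real system would call an actual speech-recognition service, which this embodiment does not name:

```python
from typing import Optional

def recognize_speech(voice_bytes: bytes) -> str:
    # Placeholder for the server-side format conversion / recognition step.
    # Real deployments would invoke a speech-to-text engine here.
    return voice_bytes.decode("utf-8")  # stand-in for demonstration only

def obtain_input(character_info: Optional[str] = None,
                 voice_info: Optional[bytes] = None) -> str:
    """First obtaining module: return the character information to send."""
    if character_info is not None:
        return character_info                  # text and/or icons, used as-is
    if voice_info is not None:
        return recognize_speech(voice_info)    # character info recognized from voice
    raise ValueError("either character or voice information is required")

assert obtain_input(character_info="nice!") == "nice!"
assert obtain_input(voice_info=b"hello") == "hello"
```

Converting voice to text on the server, rather than forwarding audio, is what lets the second client display the reply content directly.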
Optionally, in this embodiment, the first obtaining module includes one of:
(1) a third obtaining sub-module, configured to obtain the frame picture of the target frame in the first media file;
(2) a fourth obtaining sub-module, configured to obtain a set of consecutive frame pictures including the target frame in the first media file;
(3) a fifth obtaining sub-module, configured to obtain first link information, where the first link information is used to indicate the frame picture of the target frame in the first media file;
(4) a sixth obtaining sub-module, configured to obtain second link information, where the second link information is used to indicate a set of consecutive frame pictures including the target frame in the first media file.
It should be noted that, in this embodiment, the server of the first application may, but is not limited to, perform a synthesis operation on the input information and the frame picture indicated by the frame picture indication information sent by the first client, so as to obtain the predetermined-format information to be sent. The frame picture indication information may be used to synthesize at least one of the above kinds of frame picture information of the target frame into the predetermined-format information.
In addition, in this embodiment, the first link information and the second link information may be, but are not limited to being, displayed on the second client, so that the second account can jump, through the second client, to the position indicated by the link information and start playing the first media file played by the first client. The playing progress of the first client can also be shared; for example, the second account can start playing from the target frame, thereby ensuring the real-time performance of the data interaction and, through content association, the intuitiveness of the interactive content.
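One way to realize such progress-sharing links is to encode the media file and target frame in the link itself, so the second client can parse them and resume playback from the same point. The `app1://` scheme and parameter names below are purely illustrative assumptions:

```python
from urllib.parse import parse_qs, urlencode, urlparse

def build_link(media_id: str, frame_no: int, sequence: bool = False) -> str:
    """First/second link information: sequence=True indicates a set of
    consecutive frame pictures rather than a single frame picture."""
    query = urlencode({"frame": frame_no, "seq": int(sequence)})
    return "app1://media/" + media_id + "?" + query

def parse_link(link: str):
    """Second client side: recover media id, target frame, and mode."""
    parts = urlparse(link)
    qs = parse_qs(parts.query)
    media_id = parts.path.strip("/").split("/")[-1]
    return media_id, int(qs["frame"][0]), bool(int(qs["seq"][0]))

link = build_link("film42", 1024)
assert parse_link(link) == ("film42", 1024, False)
```

Because the frame number rides along in the link, jumping to it lands the second account exactly at the first client's playing progress, which is what keeps the interaction content-associated.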
For example, as shown in fig. 5 (d), the frame picture of the target frame in the first media file and the input information "show-while-stick" are synthesized to obtain the information in the predetermined format, which is sent to the second account and displayed in the second client.
According to the embodiment provided by this application, the input information and the frame picture indication information are acquired, and the information obtained by synthesizing them is sent to the second account of the second application. Data interaction is thus realized while the second account synchronizes to the playing progress of the first account, achieving real-time, content-associated data interaction.
As an optional scheme, the apparatus further includes:
1) a receiving unit, configured to receive, through the first client, second interaction information sent by the second account of the second application after the first interaction information has been sent to that account;
2) a display unit, configured to display the second interaction information on the first client.
Optionally, in this embodiment, the receiving unit includes:
(1) a receiving module, configured to receive the second interaction information through the first client, where the second interaction information is sent by a second client of the second application and delivered to the first client through the server of the second application and the server of the first application, and the second client logs in with the second account.
According to the embodiment provided by this application, the second interaction information sent by the second client is received and displayed in the first client, achieving one-to-one real-time interaction between the first account and the second account and ensuring the privacy and security of the data interaction.
As an alternative, the display unit includes:
1) a judging module, configured to judge whether the first client continues to play the first media file;
2) a first display module, configured to display the second interaction information on the playing picture of the first media file when the first client continues to play the first media file;
3) a second display module, configured to prompt, when the first client does not continue to play the first media file, that the first client has received the second interaction information, and to display the second interaction information in a predetermined window of the first client after a control instruction for displaying the second interaction information is acquired.
Optionally, in this embodiment, after the server of the first application receives the second interaction information, it determines whether the first account is online. When the first account is offline, the server of the first application may, but is not limited to, cache the second interaction information and push it to the first client after the first account logs in. When the first account is online, it may be, but is not limited to being, determined whether the first client continues to play the first media file, and the display position of the second interaction information is then determined according to the judgment result.
For example, as shown in fig. 7 (a), after "reply" is clicked in the second client, the second interaction information is input and sent. If the first account is online and the first client is still playing the first media file, as shown in fig. 7 (b), the replied second interaction information may be displayed on the playing screen of the first client, such as "Reply: …". If the first account is online but the first client does not continue playing the first media file, or the first account is offline, receipt of the second interaction information may be prompted at a message center of the first client. After a control instruction for displaying the second interaction information is obtained via the prompt information (e.g., the prompt is clicked, or the message center is entered), the second interaction information is displayed in a predetermined window of the first client, for example in the message center, or in a new window popped up by the first client.
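The server-side half of this logic, online check plus offline caching with push-on-login, can be sketched as follows. Function names and return labels are assumptions used only to mark the two delivery paths:

```python
from typing import Dict, List

offline_cache: Dict[str, List[str]] = {}  # first account -> pending replies

def deliver(account: str, online: bool, info: str) -> str:
    """Server of the first application deciding how to handle a reply."""
    if not online:
        # First account offline: cache in the server of the first application.
        offline_cache.setdefault(account, []).append(info)
        return "cached"
    # First account online: push to the first client, which then applies the
    # playing / not-playing display rule locally.
    return "pushed_to_client"

def on_login(account: str) -> List[str]:
    """Push everything cached while the account was offline."""
    return offline_cache.pop(account, [])

assert deliver("ID_01", False, "Reply: hi") == "cached"
assert on_login("ID_01") == ["Reply: hi"]
```

Keeping the online/offline decision on the server and the playing/not-playing decision on the client cleanly separates delivery guarantees from presentation.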
Through the embodiment provided by this application, whether the first client continues to play the first media file is judged, so that the second interaction information is displayed differently according to different judgment results. This ensures that the first client does not miss the second interaction information to be displayed, and improves the flexibility of its display.
According to an embodiment of the present invention, there is also provided a data interaction apparatus for implementing the data interaction method, as shown in fig. 10, the apparatus includes:
1) a first obtaining unit 1002, configured to obtain a data interaction request sent by a first client of a first application, where the data interaction request at least carries object indication information used for indicating an object to be interacted, the object to be interacted is displayed in a frame of a target frame of a first media file played by the first client, and the first client logs in by using a first account;
2) a second obtaining unit 1004, configured to obtain, according to the object indication information, a second account of the object to be interacted in the second application, where the first account and the second account have an association relationship;
3) a third obtaining unit 1006, configured to obtain first interaction information to be interacted;
4) a sending unit 1008, configured to send the first interaction information to a second account of a second application.
Optionally, in this embodiment, the data interaction apparatus may be applied to, but is not limited to, a media playing platform, such as a television platform, a digital television platform, or a network media playing platform. Optionally, in this embodiment, the first application may be, but is not limited to, an application for playing a media file, and the second application may be, but is not limited to, an application for instant messaging. That is to say, on the media playing platform, the object to be interacted with in a target frame of the played media file is acquired, so that the interaction information is sent to the account of that object in the communication application. Data interaction with the object displayed in the media file is thus achieved in real time, thereby expanding the range of data interaction. For example, as shown in fig. 5 (a), when a first client (playing application App1) of a first application plays a target frame of a first media file, a click selection operation (i.e., an interaction operation) is performed on an object A to be interacted with (e.g., the character on the right side of the figure) in the frame picture of the target frame to generate an interaction instruction, where the interaction instruction carries object indication information of the object to be interacted with, for example, object A itself. Then, as shown in fig. 5 (b), prompt information for determining an interaction platform and an interaction account for interacting with object A is displayed in the first client, and it is assumed that the account ID_02 of object A on an S platform (the platform where the instant messaging application App2 runs) is selected. Further, as shown in fig. 5 (c), the interaction information is acquired, for example, by voice input, and sent to the second account ID_02 of object A in the second application (the instant messaging application App2). The display interface of the interaction space of the second client, logged in to the second application with the second account ID_02, may then be as shown in fig. 5 (d), where the information published 5 minutes earlier by the first account ID_01 of the first application is shown, for example graphic-text information including the text comment published by the first client and a screenshot of the corresponding target frame. Further, in this embodiment, the second account may, but is not limited to, send reply information back to the first account ID_01 of the first application through the second client. The above is only an example, and this is not limited in this embodiment.
It should be noted that, in this embodiment, a data interaction request sent by a first client of a first application is acquired, where the data interaction request carries at least object indication information indicating an object to be interacted with, and the object is displayed in the frame picture of a target frame of a first media file played by the first client. A second account of the object in a second application is acquired according to the object indication information, where the first account and the second account have an association relationship; the first interaction information is then acquired and sent to the second account of the second application. Thus, after the first interaction information is acquired, it is sent to the second account through the first client, achieving real-time data interaction with the object to be interacted with in the target frame played by the first client, rather than data interaction being limited to the accounts used for playing the media file, so the range of data interaction is expanded. Furthermore, the data interaction is carried out directly and in real time with the object in the target frame played by the first client, without repeatedly executing login operations on different application platforms, which further improves data interaction efficiency.
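The server-side flow of the fig. 10 apparatus, acquire the request, resolve the associated second account from the object indication information, acquire the interaction information, and send, can be sketched end to end. The binding table and request keys below are illustrative assumptions, not fields defined by this embodiment:

```python
from typing import Dict, Tuple

# Association relationship: object identifier -> (second application, second account)
account_bindings: Dict[str, Tuple[str, str]] = {
    "object_A": ("App2", "ID_02"),
}

def handle_interaction_request(request: dict) -> dict:
    """Server of the first application processing a data interaction request."""
    object_id = request["object_indication"]            # first obtaining unit 1002
    app, second_account = account_bindings[object_id]   # second obtaining unit 1004
    first_info = request["interaction_info"]            # third obtaining unit 1006
    # Sending unit 1008: forward to the second account of the second application.
    return {"to_app": app, "to_account": second_account, "payload": first_info}

out = handle_interaction_request({"object_indication": "object_A",
                                  "interaction_info": "Great scene!"})
assert out == {"to_app": "App2", "to_account": "ID_02", "payload": "Great scene!"}
```

The binding table is the concrete form of the "association relationship" between accounts: it is what lets the first client address an on-screen object without ever knowing the second account directly.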
Optionally, in this embodiment, the first interaction information may include, but is not limited to: input information and frame picture indication information, where the frame picture indication information is used to indicate the frame picture corresponding to the target frame in the first media file.
Optionally, in this embodiment, the input information may include, but is not limited to, at least one of the following: character information and voice information. The character information may include, but is not limited to, text and icons, where the icons may be, but are not limited to, emoticons. In addition, the voice information may be, but is not limited to being, recognized and converted into text information in the server of the first application, so as to facilitate forwarding the text information to the second account of the second application.
Optionally, in this embodiment, the first interaction information may be, but is not limited to being, sent to a server of the first application, and the server of the first application sends it to a second client of the second application through a server of the second application. The sending method of the first interaction information may include, but is not limited to, synchronous sending and asynchronous sending. That is, the input information and the frame picture indication information may be sent to the server of the first application at the same time, so that the server directly synthesizes the content into interaction information in the predetermined format for forwarding to the server of the second application. Alternatively, the input information and the frame picture indication information may be sent to the server of the first application asynchronously, so as to improve transmission efficiency and reduce transmission load.
In addition, in this embodiment, the interaction information in the predetermined format sent by the server of the first application to the server of the second application may be, but is not limited to, text information, image information, link information, or a combination of any two or three of the above. The image information may include, but is not limited to, still pictures (e.g., a screenshot of the frame picture of the target frame) and dynamic pictures (a set of consecutive frame pictures including the target frame). The link information may be, but is not limited to being, used to indicate the frame picture of the target frame in the first media file. The above is only an example, and this is not limited in this embodiment.
Optionally, in this embodiment, the server of the first application may be further configured to, but is not limited to: receive second interaction information sent by the second account through the server of the second application; and send the second interaction information to the first client for display.
It should be noted that the second interaction information may be, but is not limited to being, interaction information associated with the first interaction information; for example, the second interaction information is reply information for the first interaction information. As shown in fig. 5 (d), after the first interaction message published by the first account is clicked in the second client, the second account ID_02 may send the second interaction information to the first account ID_01 of the first application through the second client, so as to implement real-time interaction with the first account. Here, the second interaction information may be, but is not limited to being, returned according to a message reply policy: after the server of the second application acquires the second interaction information, it may, but is not limited to, identify the sending target path of the second interaction information using the source identifier of the first interaction information, thereby sending the second interaction information to the first client through the server of the first application.
According to the embodiment provided by this application, a data interaction request sent by a first client of a first application is acquired, where the data interaction request carries at least object indication information indicating an object to be interacted with, and the object is displayed in the frame picture of a target frame of a first media file played by the first client. A second account of the object in a second application is acquired according to the object indication information, where the first account and the second account have an association relationship; the first interaction information is then acquired and sent to the second account of the second application. Real-time data interaction with the object to be interacted with in the target frame played by the first client is thereby achieved, rather than data interaction being limited to the accounts used for playing the media file, and the range of data interaction is expanded.
As an optional solution, the third obtaining unit includes:
1) a first obtaining module, configured to obtain the information sent by the first client and frame picture indication information, where the frame picture indication information is used to indicate the frame picture corresponding to the target frame in the first media file, and the first interaction information includes the information sent by the first client and the frame picture indication information.
Optionally, in this embodiment, the first obtaining module includes:
(1) a first obtaining submodule, configured to obtain character information sent by the first client, where the character information includes: text and/or icons; or
(2) a second obtaining submodule, configured to obtain voice information sent by the first client and recognize character information to be interacted from the voice information.
It should be noted that the icons may be, but are not limited to, emoticons used for interaction; the above is only an example, and this is not limited in this embodiment.
In addition, in this embodiment, the server of the first application performs format conversion and recognition on the voice information sent by the first client to obtain the corresponding character information, so that the character information can be provided directly to the second client for display. Furthermore, entering information by voice greatly simplifies the interaction operation and improves interaction efficiency.
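The voice-to-character conversion path can be sketched as follows. The stubbed recognizer stands in for a real speech back end, and all function names here are assumptions for illustration, not the patented implementation:

```python
def recognize_speech(audio_bytes):
    # Stub standing in for a real speech-recognition back end; a production
    # server would decode the audio here. We return fixed text so the shape
    # of the pipeline stays visible.
    return "great performance"

def voice_to_character_info(audio_bytes):
    """Convert voice information from the first client into character
    information that the second client can display directly."""
    text = recognize_speech(audio_bytes)
    return {"kind": "character_info", "text": text}

msg = voice_to_character_info(b"\x00\x01fake-audio")
print(msg["text"])  # -> great performance
```

Performing the conversion on the server rather than the client means the second client receives ready-to-display text regardless of its own capabilities.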
As an optional scheme, the sending unit includes:
1) a synthesis module, configured to synthesize the information sent by the first client and the frame picture indication information into predetermined format information;
2) a sending module, configured to send the predetermined format information.
It should be noted that, in this embodiment, the server of the first application may, but is not limited to, perform a composition operation on the input information and the frame picture indicated by the frame picture indication information sent by the first client, so as to obtain the predetermined format information to be sent. The frame picture indication information may indicate, but is not limited to, at least one of the following kinds of frame picture information to be synthesized into the above-mentioned predetermined format information:
1) A frame on a target frame in a first media file;
2) a set of consecutive frame pictures comprising a target frame in the first media file;
3) first link information, wherein the first link information is used for indicating a frame picture on the target frame in the first media file;
4) second link information, wherein the second link information is used for indicating a group of continuous frame pictures including the target frame in the first media file.
In addition, in this embodiment, the first link information and the second link information may, but are not limited to, be displayed on the second client, so that the second account can jump through the second client to the position indicated by the link information and start playing the first media file played by the first client. The playing progress of the first client can also be shared; for example, the second account can start playing from the target frame, which ensures the real-time performance of data interaction and, through content association, the intuitiveness of the interactive content.
For example, as shown in fig. 5 (d), the frame picture on the target frame in the first media file and the input information "show-while-stick" are synthesized to obtain the predetermined format information, and the predetermined format information is transmitted to the second account and displayed in the second client.
Through the embodiment provided by the application, the second account can jump directly to the playing platform through the link information and play synchronously from the playing position of the first account, namely the target frame of the first media file. The second account can thus continue playing the first media file from the target frame, realizing the effect of sharing the playing progress.
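The synthesis into predetermined-format information described above can be sketched as follows. The JSON-like layout, the kind names, and the example link are assumptions for illustration only:

```python
# The four kinds of frame picture indication information listed above,
# under assumed labels: a single frame picture, a group of consecutive
# frame pictures, first link information, and second link information.
ALLOWED_KINDS = {"frame", "frame_group", "first_link", "second_link"}

def compose_predetermined_format(input_text, frame_indication):
    """Synthesize the client's input information and the frame picture
    indication information into one predetermined-format message."""
    if frame_indication.get("kind") not in ALLOWED_KINDS:
        raise ValueError("unsupported frame picture indication")
    return {
        "format": "image-text",
        "text": input_text,
        "frame_indication": frame_indication,
    }

msg = compose_predetermined_format(
    "great performance",
    {"kind": "second_link", "url": "https://example.invalid/video?frame=1024"},
)
print(msg["format"])  # -> image-text
```

Carrying the link rather than the raw frames keeps the message small while still letting the second client jump to, and resume from, the target frame.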
Example 3
The application environment of this embodiment of the present invention may refer to the application environment in embodiment 1, which is not described herein again.
Optionally, in this embodiment, the data interaction method may be applied to, but is not limited to, a data interaction system, where the system includes a data interaction terminal and a data interaction server. The first data interaction terminal is exemplified by a TV (Television), and the second data interaction terminal is exemplified by a mobile phone. The first data interaction terminal includes a TV and an input module, and the first data interaction server corresponding to the first data interaction terminal includes: an account analysis module, a voice conversion module, a message synthesis module, and a message sending module. The second data interaction server corresponding to the second data interaction terminal includes: a message reply module, a message distribution module, and a message output module. The first media file is exemplified by a video, and the object to be interacted is exemplified by an actor.
The description is made with specific reference to fig. 11-12:
After a user logs in using a login account (e.g., a first account), the user watches the video on the TV (e.g., a first client). If, while watching the video, the user needs to interact with a certain actor in the video, the user initiates an interaction operation request through the input module. After receiving the user's interactive operation command, the input module notifies the TV of the synchronous interactive operation. After receiving the interactive operation notification, the TV performs a video scene analysis operation, analyzing the actor information in the current video (such as the object identifier, actor ID), the current video playing information (the frame number of the currently played target frame, a screenshot of the video image), and the like. The TV initiates a request to the account analysis module according to the actor ID, requesting resolution of the social account information (such as a second account) corresponding to the actor ID. The account analysis module in the first data interaction server responds to the TV with the social account information corresponding to the actor ID, and the TV outputs and displays the corresponding account information (such as the second account). Further, the corresponding actor (i.e., the social account information corresponding to the actor ID) is selected on the TV through the input module. The TV then asynchronously reports the video scene information (the currently played video ID, the current target frame number of the video, the video screenshot, and the actor's social account information) to the message synthesis module.
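The actor-to-account resolution step performed by the account analysis module amounts to a lookup from object identifier to social account. The table contents and function names below are assumed for illustration:

```python
# Assumed mapping maintained by the account analysis module:
# object identifier (actor ID) -> social account in the second application.
ACTOR_ACCOUNTS = {"actor-7": "ID-02"}

def resolve_social_account(actor_id):
    """Resolve the second account associated with an actor ID, or None if
    the actor has no associated social account."""
    return ACTOR_ACCOUNTS.get(actor_id)

print(resolve_social_account("actor-7"))   # -> ID-02
print(resolve_social_account("actor-99"))  # -> None
```

Returning `None` for an unknown actor lets the TV suppress the interaction entry for objects that have no associated account, instead of failing.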
Then, the interactive content is input by voice through the input module. After receiving the input voice, the input module synchronously sends it to the voice conversion module in the first data interaction server for voice conversion. The voice conversion module receives the synchronized voice, starts voice conversion, and converts the voice into characters. After finishing the conversion, the voice conversion module reports the result to the message synthesis module in the first data interaction server. The message synthesis module synthesizes the information, editing the characters converted from the user's interactive voice, the frame number of the current target frame of the video, the video screenshot, and the like into interactive information in a picture-text format, and sends the interactive information to the actor's social account (such as the second account) in the second application.
Further, after receiving the synthesized interactive information, the actor calls a message reply interface to send an interactive message reply to the message reply module, where the reply carries the source message ID of the message being replied to. The message reply module in the second data interaction server parses the source message ID and, according to it, queries the user ID that published the source message and the corresponding video information (the video ID playing at that moment). The message reply module synchronizes the queried user information (such as the first account), the video information (such as the video ID), and the interactive message replied by the actor to the message distribution module. The message distribution module then makes a message distribution decision, which is determined according to the current online state of the user and the video playing state. For example, if the first account is currently offline, the interactive message is sent directly to the message center of the first client; if the first account is currently online and watching the same video, the interactive message is sent directly to the first client in the form of a bullet screen; and if the first account is currently online but not watching the same video, the interactive message is sent to the message center of the first client and a prompt message is sent to the user.
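The three-branch distribution decision described above can be sketched as a small policy function. The state fields and channel names are assumptions chosen to mirror the three cases:

```python
def distribute(online, watching_video_id, message_video_id):
    """Decide delivery channels for an interactive message according to the
    first account's online state and video playing state."""
    if not online:
        # Offline: deliver to the first client's message center.
        return ["message_center"]
    if watching_video_id == message_video_id:
        # Online and watching the same video: show as a bullet screen.
        return ["bullet_screen"]
    # Online but watching a different video (or none): message center + prompt.
    return ["message_center", "prompt"]

print(distribute(False, None, "vid-1"))    # -> ['message_center']
print(distribute(True, "vid-1", "vid-1"))  # -> ['bullet_screen']
print(distribute(True, "vid-2", "vid-1"))  # -> ['message_center', 'prompt']
```

Keeping the decision in one place makes it easy to extend the policy, e.g. with per-user notification preferences, without touching the reply path.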
Example 4
According to an embodiment of the present invention, there is also provided a data interaction terminal for implementing the data interaction method, as shown in fig. 13, the terminal includes:
1) a processor 1302, configured to detect an interaction instruction generated by performing an interaction operation on a target frame when a first client of a first application plays the target frame of a first media file, where the interaction instruction carries object indication information of an object to be interacted selected by the interaction operation, and the first client logs in by using a first account; the processor is further configured to obtain a second account of the object to be interacted in a second application, where the second account is indicated by the object indication information, and the first account and the second account have an association relationship;
2) a communication interface 1304, connected to the processor 1302 and configured to obtain first interaction information to be interacted; the communication interface is further configured to send the first interaction information to the second account of the second application.
3) a memory 1306, connected to the processor 1302 and configured to store the first interaction information, the first account, and the frame picture of the target frame.
According to an embodiment of the present invention, there is also provided a data interaction server for implementing the data interaction method, as shown in fig. 14, the server includes:
1) the communication interface 1402 is configured to obtain a data interaction request sent by a first client of a first application, where the data interaction request at least carries object indication information for indicating an object to be interacted, the object to be interacted is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client logs in by using a first account;
2) the processor 1404 is connected with the communication interface 1402, and configured to acquire a second account of the object to be interacted in the second application according to the object indication information, where the first account and the second account have an association relationship;
the communication interface 1402 is further configured to obtain first interaction information to be interacted; and sending the first interaction information to a second account of a second application.
3) a memory 1406, connected to the processor 1404 and the communication interface 1402, and configured to store the second account of the object to be interacted in the second application and the first interaction information.
Optionally, the specific examples in this embodiment may refer to the examples described in embodiment 1 and embodiment 2, and this embodiment is not described herein again.
Example 5
The embodiment of the invention also provides a storage medium. Optionally, in this embodiment, the storage medium may be located in at least one of a plurality of network devices in a network.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
S1, when a first client of a first application plays a target frame of a first media file, detecting an interaction instruction generated by performing interaction operation on the target frame, wherein the interaction instruction carries object indication information of an object to be interacted selected by the interaction operation, and the first client logs in by using a first account;
S2, acquiring a second account of the object to be interacted in the second application, wherein the second account is indicated by the object indication information, and the first account and the second account have an association relationship;
S3, acquiring first interaction information to be interacted;
and S4, sending the first interaction information to a second account of the second application through the first client.
Optionally, the storage medium is further arranged to store program code for performing the steps of:
S1, acquiring a data interaction request sent by a first client of a first application, wherein the data interaction request at least carries object indication information for indicating an object to be interacted, the object to be interacted is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client logs in by using a first account;
S2, acquiring a second account of the object to be interacted in the second application according to the object indication information, wherein the first account and the second account have an association relationship;
S3, acquiring first interaction information to be interacted;
and S4, sending the first interaction information to a second account of the second application.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk or an optical disk, and other media capable of storing program code.
Optionally, the specific examples in this embodiment may refer to the examples described in embodiment 1 and embodiment 2, and this embodiment is not described herein again.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (25)

1. A method for data interaction, comprising:
when a first client of a first application plays a target frame of a first media file, detecting an interaction instruction generated by performing interaction operation on the target frame, wherein the interaction instruction carries object indication information of an object to be interacted selected by the interaction operation, and the first client logs in by using a first account;
acquiring a second account of the object to be interacted in a second application, wherein the second account is indicated by the object indication information, and the first account and the second account have an association relationship;
acquiring first interaction information to be interacted, comprising the following steps: acquiring input information and frame picture indication information, wherein the frame picture indication information is used for indicating a frame picture corresponding to the target frame in the first media file, and the first interaction information comprises: the input information and the frame picture indication information;
and sending the first interaction information to the second account of the second application through the first client.
2. The method of claim 1, wherein sending, by the first client, the first interaction information to the second account of the second application comprises:
and sending the first interaction information to a server of the first application through the first client, and instructing the server of the first application to send the first interaction information to a second client of the second application through a server of the second application, wherein the second client logs in by using the second account.
3. The method of claim 1, wherein the obtaining the input information comprises:
acquiring input character information, wherein the character information comprises: text and/or icons; or
acquiring input voice information; and identifying character information to be interacted from the voice information.
4. The method according to claim 1, wherein the obtaining frame picture indication information comprises one of:
acquiring a frame picture on the target frame in the first media file;
acquiring a group of continuous frame pictures including the target frame in the first media file;
acquiring first link information, wherein the first link information is used for indicating a frame picture on the target frame in the first media file;
acquiring second link information, wherein the second link information is used for indicating a group of continuous frame pictures including the target frame in the first media file.
5. The method of claim 1, further comprising, after sending, by the first client, the first interaction information to the second account of the second application:
receiving second interaction information sent by the second account of the second application through the first client;
and displaying the second interactive information at the first client.
6. The method of claim 5, wherein the receiving, by the first client, second interaction information sent by the second account of the second application comprises:
and receiving the second interaction information through the first client, wherein the second interaction information is sent by a second client of the second application and is sent to the first client through a server of the second application and a server of the first application, and the second client logs in by using the second account.
7. The method of claim 5, wherein displaying the second interaction information at the first client comprises:
judging whether the first client continues to play the first media file;
when the first client continues to play the first media file, displaying the second interactive information on a playing picture of the first media file;
when the first client does not continue to play the first media file, prompting the first client to receive the second interactive information; and after a control instruction for displaying the second interactive information is acquired, displaying the second interactive information in a preset window of the first client.
8. A method for data interaction, comprising:
acquiring a data interaction request sent by a first client of a first application, wherein the data interaction request at least carries object indication information for indicating an object to be interacted, the object to be interacted is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client logs in by using a first account;
acquiring a second account of the object to be interacted in a second application according to the object indication information, wherein the first account and the second account have an association relationship;
acquiring first interaction information to be interacted;
and sending the first interaction information to the second account of the second application.
9. The method of claim 8, wherein the obtaining the first interaction information to be interacted comprises:
acquiring information and frame picture indication information sent by the first client, wherein the frame picture indication information is used for indicating a frame picture corresponding to the target frame in the first media file, and the first interaction information includes: the information sent by the first client and the frame picture indication information.
10. The method of claim 9, wherein the obtaining information sent by the first client comprises:
acquiring character information sent by the first client, wherein the character information comprises: text and/or icons; or
acquiring voice information sent by the first client; and identifying character information to be interacted from the voice information.
11. The method of claim 9, wherein sending the first interaction information to the second account of the second application comprises:
synthesizing the information sent by the first client and the frame indication information into predetermined format information;
and transmitting the predetermined format information.
12. A data interaction device, comprising:
the system comprises a detection unit, a first client side and a second client side, wherein the detection unit is used for detecting an interaction instruction generated by executing interaction operation on a target frame when the target frame of a first media file is played by the first client side of a first application, the interaction instruction carries object indication information of an object to be interacted selected by the interaction operation, and the first client side logs in by using a first account;
a first obtaining unit, configured to obtain a second account of the object to be interacted in a second application, where the second account is indicated by the object indication information, and the first account and the second account have an association relationship;
a second obtaining unit, configured to obtain first interaction information to be interacted, where the second obtaining unit includes: a first obtaining module, configured to obtain input information and frame indication information, where the frame indication information is used to indicate a frame corresponding to the target frame in the first media file, and the first interaction information includes: the input information and the frame picture indication information;
a sending unit, configured to send the first interaction information to the second account of the second application through the first client.
13. The apparatus of claim 12, wherein the sending unit comprises:
and the sending module is used for sending the first interaction information to the server of the first application through the first client, and instructing the server of the first application to send the first interaction information to a second client of the second application through the server of the second application, wherein the second client logs in by using the second account.
14. The apparatus of claim 12, wherein the first obtaining module comprises:
the first obtaining sub-module is used for obtaining input character information, wherein the character information comprises: text and/or icons; or
the second acquisition submodule is used for acquiring input voice information; and identifying character information to be interacted from the voice information.
15. The apparatus of claim 12, wherein the first obtaining module comprises one of:
a third obtaining sub-module, configured to obtain a frame picture on the target frame in the first media file;
a fourth obtaining sub-module, configured to obtain a group of consecutive frame pictures that include the target frame in the first media file;
a fifth obtaining sub-module, configured to obtain first link information, where the first link information is used to indicate a frame picture on the target frame in the first media file;
a sixth obtaining sub-module, configured to obtain second link information, where the second link information is used to indicate a group of consecutive frame pictures that include the target frame in the first media file.
16. The apparatus of claim 12, further comprising:
a receiving unit, configured to receive, by the first client, second interaction information sent by the second account of the second application after the first interaction information is sent to the second account of the second application by the first client;
and the display unit is used for displaying the second interactive information on the first client.
17. The apparatus of claim 16, wherein the receiving unit comprises:
a receiving module, configured to receive, by the first client, the second interaction information, where the second interaction information is sent by a second client of the second application and sent to the first client through a server of the second application and a server of the first application, and the second client logs in by using the second account.
18. The apparatus of claim 16, wherein the display unit comprises:
the judging module is used for judging whether the first client side continuously plays the first media file;
the first display module is used for displaying the second interactive information on a playing picture of the first media file when the first client continues to play the first media file;
a second display module, configured to prompt the first client to receive the second interaction information when the first client does not continue to play the first media file; and after a control instruction for displaying the second interactive information is acquired, displaying the second interactive information in a preset window of the first client.
19. A data interaction device, comprising:
a first obtaining unit, configured to obtain a data interaction request sent by a first client of a first application, where the data interaction request carries at least object indication information used for indicating an object to be interacted, the object to be interacted is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client logs in using a first account;
a second obtaining unit, configured to obtain, according to the object indication information, a second account of the object to be interacted in a second application, where the first account and the second account have an association relationship;
the third acquisition unit is used for acquiring first interaction information to be interacted;
a sending unit, configured to send the first interaction information to the second account of the second application.
20. The apparatus of claim 19, wherein the third obtaining unit comprises:
a first obtaining module, configured to obtain information and frame indication information sent by the first client, where the frame indication information is used to indicate a frame corresponding to the target frame in the first media file, and the first interaction information includes: the information sent by the first client and the frame picture indication information.
21. The apparatus of claim 20, wherein the first obtaining module comprises:
the first obtaining submodule is configured to obtain character information sent by the first client, where the character information includes: text and/or icons; or
the second obtaining submodule is used for obtaining the voice information sent by the first client; and identifying character information to be interacted from the voice information.
22. The apparatus of claim 20, wherein the sending unit comprises:
the synthesis module is used for synthesizing the information sent by the first client and the frame picture indication information into predetermined format information;
and the sending module is used for sending the predetermined format information.
23. A data interaction terminal, comprising:
the processor detects an interactive instruction generated by executing interactive operation on a target frame when a first client of a first application plays the target frame of a first media file, wherein the interactive instruction carries object indication information of an object to be interacted selected by the interactive operation, and the first client logs in by using a first account; acquiring a second account of the object to be interacted in a second application, wherein the second account is indicated by the object indication information, and the first account and the second account have an association relationship;
the communication interface is connected with the processor, acquires first interaction information to be interacted, and sends the first interaction information to the second account of the second application through the first client;
and the memory is connected with the processor and is used for storing the first interaction information, the first account and the frame picture of the target frame.
24. A data interaction server, comprising:
a communication interface, configured to: acquire a data interaction request sent by a first client of a first application, wherein the data interaction request carries at least object indication information indicating an object to be interacted with, the object to be interacted with is displayed in a frame picture of a target frame of a first media file played by the first client, and the first client is logged in with a first account; acquire first interaction information to be exchanged; and send the first interaction information to a second account of a second application;
a processor, connected to the communication interface and configured to acquire, according to the object indication information, the second account of the object to be interacted with in the second application, wherein the first account and the second account have an association relationship;
and a memory, connected to the communication interface and the processor and configured to store the second account of the object to be interacted with in the second application and the first interaction information.
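The server-side flow of claim 24 (receive a data interaction request, resolve the indicated object's second-application account from the association relationship, store and forward the interaction information) admits a similar sketch. Again, `InteractionServer`, `handle_request`, and the sample accounts are hypothetical names for illustration, not part of the patent.

```python
# Hypothetical sketch of the claim-24 server flow; names are illustrative only.

class InteractionServer:
    def __init__(self, associations):
        # associations: {(object_id, first_account): second_account},
        # modeling the association between the first and second accounts.
        self.associations = associations
        self.store = []  # memory: persisted (second_account, interaction_info) pairs

    def handle_request(self, first_account, object_id, interaction_info):
        # The data interaction request carries object indication info (object_id);
        # resolve the second account of the object to be interacted with.
        second = self.associations.get((object_id, first_account))
        if second is None:
            raise LookupError("no associated second account for this object")
        # Store, then send the first interaction information to the second account.
        self.store.append((second, interaction_info))
        return {"to": second, "info": interaction_info}
```

In use, a request from `bob@app1` naming object `singer_7` would be routed to the associated account `singer7@app2`, with the interaction information retained in the server's store.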
25. A storage medium having a computer program stored thereon, wherein the computer program, when executed, performs the method of any one of claims 1 to 7 or the method of any one of claims 8 to 11.
CN201710135742.9A 2017-03-08 2017-03-08 Data interaction method and device Active CN108574878B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710135742.9A CN108574878B (en) 2017-03-08 2017-03-08 Data interaction method and device
PCT/CN2018/078115 WO2018161887A1 (en) 2017-03-08 2018-03-06 Data interaction method and device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710135742.9A CN108574878B (en) 2017-03-08 2017-03-08 Data interaction method and device

Publications (2)

Publication Number Publication Date
CN108574878A CN108574878A (en) 2018-09-25
CN108574878B true CN108574878B (en) 2021-03-26

Family

ID=63447309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710135742.9A Active CN108574878B (en) 2017-03-08 2017-03-08 Data interaction method and device

Country Status (2)

Country Link
CN (1) CN108574878B (en)
WO (1) WO2018161887A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112035275A (en) * 2020-07-30 2020-12-04 长沙市到家悠享网络科技有限公司 Data processing method and server side equipment
CN114125017B (en) * 2020-08-10 2024-04-09 腾讯科技(深圳)有限公司 Media information display method and device, storage medium and electronic equipment
CN114765700B (en) * 2021-01-13 2023-07-14 腾讯科技(深圳)有限公司 Information interaction method and device, storage medium and electronic equipment
CN115473865A (en) * 2022-08-03 2022-12-13 北京达佳互联信息技术有限公司 Information interaction method, server, client and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
WO2016196067A1 (en) * 2015-05-29 2016-12-08 Microsoft Technology Licensing, Llc Video messaging
CN106385603A (en) * 2016-09-12 2017-02-08 腾讯科技(深圳)有限公司 Message transmission method and device for media file

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US9094578B2 (en) * 2008-07-16 2015-07-28 Echostar Technologies L.L.C. Pay-per-view sharing
CN103095841A (en) * 2013-01-25 2013-05-08 周良文 Communication system and communication method based on network video
CN103929352A (en) * 2013-04-06 2014-07-16 周良文 Service system and method for social contact with object in website through common terminal
CN103873945A (en) * 2014-02-21 2014-06-18 周良文 System and method for socializing with object in video program
CN103888341A (en) * 2014-02-25 2014-06-25 周良文 System and method for performing socialization with objects in music website
CN104333775B (en) * 2014-11-25 2017-11-07 广州华多网络科技有限公司 Virtual objects interactive approach, device and system in a kind of direct broadcast band

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
WO2016196067A1 (en) * 2015-05-29 2016-12-08 Microsoft Technology Licensing, Llc Video messaging
CN106385603A (en) * 2016-09-12 2017-02-08 腾讯科技(深圳)有限公司 Message transmission method and device for media file

Also Published As

Publication number Publication date
WO2018161887A1 (en) 2018-09-13
CN108574878A (en) 2018-09-25

Similar Documents

Publication Publication Date Title
CN106658200B (en) Live video sharing and acquiring method and device and terminal equipment thereof
US9984408B1 (en) Method, medium, and system for live video cooperative shopping
KR102019410B1 (en) Methods and systems for providing functional extensions with a landing page of a creative
CN108574878B (en) Data interaction method and device
CN110149549B (en) Information display method and device
CN106980479B (en) Multi-screen interaction method and device and server
CN105592331A (en) Method for processing barrage messages, related equipment, and system
CN104936035A (en) Barrage processing method and system
US20180209807A1 (en) Moving track sharing method and apparatus, and storage medium
CN111901695B (en) Video content interception method, device and equipment and computer storage medium
CN105939483A (en) Video processing method and device
CN109413056B (en) Method and apparatus for processing information
CN110647827A (en) Comment information processing method and device, electronic equipment and storage medium
CN110809172A (en) Interactive special effect display method and device and electronic equipment
CN107241651B (en) Media data playing method and device and intelligent terminal
CN108769261B (en) Multi-screen interaction system, method and interaction screen equipment
CN112752134B (en) Video processing method and device, storage medium and electronic device
CN107508745B (en) Prompting message associated input method and device and computing equipment
CN110673886A (en) Method and device for generating thermodynamic diagram
CN112528052A (en) Multimedia content output method, device, electronic equipment and storage medium
CN108289056B (en) Method and device for sharing dynamic chart and computing equipment
US20180270176A1 (en) Information processing method and device
CN115086747A (en) Information processing method and device, electronic equipment and readable storage medium
CN114666643A (en) Information display method and device, electronic equipment and storage medium
CN114270389A (en) Information acquisition method, device, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant