CN108449605B - Information synchronous playing method, device, equipment, system and storage medium - Google Patents

Info

Publication number
CN108449605B
CN108449605B (application CN201810205908.4A)
Authority
CN
China
Prior art keywords
target information
video frame
information
identifier
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810205908.4A
Other languages
Chinese (zh)
Other versions
CN108449605A (en)
Inventor
任金鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201810205908.4A priority Critical patent/CN108449605B/en
Publication of CN108449605A publication Critical patent/CN108449605A/en
Application granted granted Critical
Publication of CN108449605B publication Critical patent/CN108449605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • H04N21/8352Generation of protective data, e.g. certificates involving content or source identification data, e.g. Unique Material Identifier [UMID]

Abstract

The present disclosure provides a method, an apparatus, a device, a system, and a storage medium for synchronized information playback. The method includes: obtaining a live video stream, target information, and a mapping between a video frame identifier and an information identifier, where the video frame identifier identifies the video frame in the live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information; and, based on the mapping, playing the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier. Applying the disclosed embodiments keeps the audio/video consistent with the target information.

Description

Information synchronous playing method, device, equipment, system and storage medium
Technical Field
The present application relates to the field of live broadcast technology, and in particular to a method, an apparatus, a device, a system, and a storage medium for synchronized information playback.
Background
In network live video broadcasting, a user at the viewer end watches, over the network, live audio and video of activities conducted by a user at the anchor end, such as sports events, meetings, lectures, or demonstrations. In a live information-push scenario, the viewer end plays not only the anchor end's live video stream but also other target information. For example, in a live quiz scenario, the viewer end can display an answer box containing a question over the live video, so that when a viewer hears the host announce the question in the live video, the viewer also sees the answer box on the terminal screen and can answer it.
However, because the two data paths travel separately, network conditions and other factors can introduce a delay between them, leaving a video frame out of sync with its associated information. For example, the answer box containing a question may already be shown on screen before the host has announced it. Live quiz answering is highly time-sensitive: an answer is valid only if it is submitted within the allotted answer time, so displaying the announced question and the answer box out of sync undermines the judgment of answer validity.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a method, an apparatus, a device, a system, and a storage medium for synchronously playing information.
According to a first aspect of the embodiments of the present disclosure, a method for synchronized information playback is provided, the method including:
determining a mapping between a video frame identifier and an information identifier, where the video frame identifier identifies the video frame in a live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information;
and sending the live video stream to a viewer end through a live broadcast server, and sending the mapping and the target information to the viewer end through a communication server, so that the viewer end, based on the mapping, plays the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier.
In an alternative implementation, the target information includes a quiz question; and/or
the information identifier is the information delivery time at which the anchor end sends the target information; and/or
the video frame identifier is the video frame time of the current video frame at the moment the anchor end sends the target information.
In an optional implementation, determining the information delivery time and the video frame time includes:
upon receiving a delivery instruction for the target information, taking the current time as the information delivery time and the time of the current video frame within the video stream as the video frame time;
where the delivery instruction is generated when a designated person presses a key at the moment the target information starts to be announced.
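As a minimal sketch of the step above (the function and field names are illustrative assumptions, not the patent's implementation), the anchor end could record both timestamps the moment the delivery instruction fires:

```python
import time

def build_mapping(current_frame_pts_ms, now_ms=None):
    """On receiving the delivery instruction, pair the information delivery
    time (wall clock) with the video frame time (position of the currently
    generated frame within the stream)."""
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return {"info_delivery_time": now_ms,              # information identifier
            "video_frame_time": current_frame_pts_ms}  # video frame identifier

# Example: the instruction fires while the frame at 120 040 ms is being encoded.
mapping = build_mapping(current_frame_pts_ms=120_040, now_ms=1_700_000_000_000)
```

The mapping is then what travels over the communication-server path alongside the target information.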
According to a second aspect of the embodiments of the present disclosure, a method for synchronized information playback is provided, the method including:
obtaining a live video stream, target information, and a mapping between a video frame identifier and an information identifier, where the video frame identifier identifies the video frame in the live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information;
and, based on the mapping, playing the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier.
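The viewer-end side of this method can be sketched as follows (a hypothetical structure, assuming the video frame identifier is a frame time in milliseconds): target information arriving over the communication path is buffered until the frame named in the mapping is rendered on the live path.

```python
from collections import deque

class SyncPlayer:
    """Hold target information until its associated video frame plays."""

    def __init__(self, tolerance_ms=40):          # ~1 frame at 25 fps
        self.pending = deque()                    # (video_frame_time, info)
        self.tolerance_ms = tolerance_ms

    def on_target_info(self, mapping, target_info):
        # Communication-path arrival: buffer the info with its frame time.
        self.pending.append((mapping["video_frame_time"], target_info))

    def on_frame_rendered(self, frame_pts_ms):
        # Live-path render callback: release every buffered item whose
        # frame has now been reached, so info and frame appear together.
        due = []
        while self.pending and frame_pts_ms >= self.pending[0][0] - self.tolerance_ms:
            due.append(self.pending.popleft()[1])
        return due

player = SyncPlayer()
player.on_target_info({"video_frame_time": 5000}, "question #1")
```

Because the info is keyed to the frame time rather than shown on arrival, an early-arriving question simply waits for its frame instead of leaking ahead of the host's announcement.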
In an alternative implementation, the target information includes a quiz question; and/or
the information identifier is the information delivery time at which the anchor end sends the target information; and/or
the video frame identifier is the video frame time of the current video frame at the moment the anchor end sends the target information.
In an optional implementation, the method further includes:
starting a countdown when the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier begin playing synchronously, and, when the countdown ends, stopping playback of the target information or disabling the operation control associated with it;
where the countdown duration is determined as the standard countdown duration minus the terminal network delay, the terminal network delay being the network delay between the viewer end and the communication server.
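The countdown rule above can be written out directly (a sketch with illustrative names): the later the target information reaches a viewer, the shorter that viewer's countdown, so every viewer's answer window closes at roughly the same moment.

```python
def effective_countdown_ms(standard_ms, network_delay_ms):
    """Countdown actually run at the viewer end: the standard countdown
    minus the measured viewer-to-communication-server delay, floored at
    zero so a very slow link never yields a negative duration."""
    return max(0, standard_ms - network_delay_ms)

# A 10 s standard countdown with 800 ms of delay leaves 9.2 s to answer.
remaining = effective_countdown_ms(10_000, 800)
```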
According to a third aspect of the embodiments of the present disclosure, an apparatus for synchronized information playback is provided, the apparatus including:
a relation determining module configured to determine a mapping between a video frame identifier and an information identifier, where the video frame identifier identifies the video frame in a live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information;
and an information sending module configured to send the live video stream to a viewer end through a live broadcast server, and to send the mapping and the target information to the viewer end through a communication server, so that the viewer end, based on the mapping, plays the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier.
In an alternative implementation, the target information includes a quiz question; and/or
the information identifier is the information delivery time at which the anchor end sends the target information; and/or
the video frame identifier is the video frame time of the current video frame at the moment the anchor end sends the target information.
In an optional implementation, the relation determining module is specifically configured to:
upon receiving a delivery instruction for the target information, take the current time as the information delivery time and the time of the current video frame within the video stream as the video frame time, thereby obtaining the mapping between the video frame identifier and the information identifier;
where the delivery instruction is generated when a designated person presses a key at the moment the target information starts to be announced.
According to a fourth aspect of the embodiments of the present disclosure, an apparatus for synchronized information playback is provided, the apparatus including:
an information obtaining module configured to obtain a live video stream, target information, and a mapping between a video frame identifier and an information identifier, where the video frame identifier identifies the video frame in the live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information;
and an information playing module configured to play, based on the mapping, the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier.
In an alternative implementation, the target information includes a quiz question; and/or
the information identifier is the information delivery time at which the anchor end sends the target information; and/or
the video frame identifier is the video frame time of the current video frame at the moment the anchor end sends the target information.
In an optional implementation, the apparatus further includes a countdown module configured to:
start a countdown when the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier begin playing synchronously, and, when the countdown ends, stop playback of the target information or disable the operation control associated with it;
where the countdown duration is determined as the standard countdown duration minus the terminal network delay, the terminal network delay being the network delay between the viewer end and the communication server.
According to a fifth aspect of the embodiments of the present disclosure, a live broadcast system is provided, including an anchor end, a live broadcast server, a communication server, and a viewer end;
the anchor end determines a mapping between a video frame identifier and an information identifier, sends the live video stream to the viewer end through the live broadcast server, and sends the mapping and target information to the viewer end through the communication server, where the video frame identifier identifies the video frame in the live video stream that is expected to play in synchronization with the target information, and the information identifier identifies the target information;
and the viewer end obtains the live video stream, the target information, and the mapping, and, based on the mapping, plays the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier.
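The system's data flow can be sketched end to end with two in-memory queues standing in for the live broadcast server and the communication server (all names illustrative; a real deployment uses network transports):

```python
from queue import Queue

live_channel = Queue()   # live broadcast server path: (frame time, frame)
comm_channel = Queue()   # communication server path: (mapping, target info)

# Anchor end: stream frames, and deliver the mapping plus target
# information when the host announces the question (here: at 80 ms).
for pts in (0, 40, 80, 120):
    live_channel.put((pts, f"frame@{pts}"))
comm_channel.put(({"video_frame_time": 80}, "question #1"))

# Viewer end: read the mapping, then play frames until the identified
# frame arrives, showing the target information together with it.
mapping, info = comm_channel.get()
shown_with = None
while not live_channel.empty():
    pts, frame = live_channel.get()
    if shown_with is None and pts >= mapping["video_frame_time"]:
        shown_with = frame            # info and this frame play together
```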
According to a sixth aspect of embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information;
and based on the mapping relation, simultaneously playing the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
According to a seventh aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of any of the methods described above.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects:
In the embodiments above, the anchor end determines the mapping between the video frame identifier and the information identifier, and the viewer end obtains that mapping, so the question corresponding to the information identifier can be played simultaneously with the video frame corresponding to the video frame identifier. This keeps the audio/video consistent with the target information and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram illustrating an application scenario for implementing live broadcasting according to an exemplary embodiment of the present disclosure.
Fig. 2A is a schematic diagram of a live system shown in the present disclosure according to an example embodiment.
Fig. 2B is a flowchart illustrating a method for synchronously playing information according to an exemplary embodiment of the present disclosure.
Fig. 2C is a schematic diagram of a live answer interface shown in the present disclosure according to an exemplary embodiment.
Fig. 3 is a diagram illustrating an application scenario of a method for synchronously playing information according to an exemplary embodiment of the present disclosure.
Fig. 4A and fig. 4B are flowcharts illustrating other methods for synchronously playing information according to exemplary embodiments of the present disclosure.
Fig. 5 is a block diagram illustrating an information synchronized playback system according to an exemplary embodiment of the present disclosure.
Fig. 6 to fig. 8 are block diagrams illustrating information synchronized playback apparatuses according to exemplary embodiments of the present disclosure.
Fig. 9 is a block diagram illustrating an apparatus for synchronized playback of information according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
In network live video broadcasting, a user at the viewer end watches, over the network, live audio and video of activities conducted by a user at the anchor end, such as sports events, meetings, lectures, or demonstrations. In some live scenarios, the viewer end plays not only the anchor end's live video stream but also other target information. For example, in a live quiz scenario, the target information may include a quiz question. In live quiz answering, a user (viewer) logs into the live room within a specified time and answers questions online under the guidance of a host; viewers who answer a set number of questions (for example, 12) correctly share in the prize pool set for that round. At the viewer end, an answer box containing the question can be displayed over the live video, so that when a viewer hears the host announce the question in the live video, the viewer also sees the answer box on the screen and can answer it. As another example, in a live flash-sale scenario, the target information may include information about goods for sale. A user logs into the live room within a specified time and shops online under the guidance of a host: the viewer end displays the information about the goods over the live video, and while listening to the host introduce the goods, the viewer can also operate the screen, pick a preferred item, and place an order or purchase.
It can be understood that the disclosed embodiments can be applied in the following scenario: during a live broadcast, the viewer end receives two data paths, one carrying the live video stream and the other carrying the target information, and the video frames in the live video stream associated with the target information need to play synchronously with the target information. The target information is associated with at least one video frame in the live video stream; for example, the association may be that the live video stream contains video frames in which the target information is announced. The target information may be any data whose playback time is constrained, such as quiz questions, information about goods for sale, or material to be memorized, which are not exhaustively listed here.
As shown in fig. 1, fig. 1 is a schematic diagram of an application scenario for implementing live broadcast according to an exemplary embodiment of the present disclosure. The scenario may include an anchor end 110, a server end 120, a first viewer end 131, and a second viewer end 132. The scenario is illustrated with two viewer ends; in practice there may be one or more. The anchor end is the originator of the video stream and the viewer ends are its receivers; in some scenarios the anchor end may be called the host end and the viewer end the user end. The anchor end and the viewer ends (such as the first and second viewer ends) may be software installed on electronic devices, or may be terminal devices themselves. The anchor end can invoke a camera to record video, take photos, and so on to produce live broadcast data, and then send that data to the server end over the network. Both the viewer ends and the anchor end can send user-entered interactive messages (such as bullet-screen comments) to the server end, so that the messages can be displayed at every end in the same live room. Target information, in particular, is typically sent by the anchor end to the server end and distributed by the server end to each viewer end. The server end provides the background services of internet live broadcast, such as storing the correspondence between anchor ends and viewer ends, distributing live broadcast data, distributing interactive messages, and distributing target information.
The server end may collectively refer to multiple server devices, or to at least one piece of software installed on a server device. In one example, live video stream distribution and data distribution may be implemented by the same server. In another example, to separate different services and improve the timeliness of information processing, the server end may include a live broadcast server and a communication server, among others: the live broadcast server distributes the live video streams, while the communication server distributes data, in particular the target information. It can be understood that, according to actual requirements, the server end may further include a bullet-screen server, a gift server, and the like, which are not detailed here.
The electronic device may be any device that can run software, handheld or otherwise. For example, it may be a cellular phone, a media player or another handheld portable device, a smaller portable device such as a wristwatch-style or other wearable or miniaturized device, or a PDA (personal digital assistant), tablet computer, notebook computer, desktop computer, television, computer integrated into a computer display, or other electronic equipment.
The disclosed embodiments are illustrated with one such live system. As shown in fig. 2A, fig. 2A is a schematic diagram of a live system according to an exemplary embodiment of the present disclosure. As shown in fig. 2B, fig. 2B is a flowchart illustrating a method for synchronized information playback according to an exemplary embodiment of the present disclosure. The method is applied to a live system, which may include an anchor end 210, a live broadcast server 220, a communication server 230, and a viewer end 240. This embodiment takes as its example a server end comprising a live broadcast server that distributes the live video stream and a communication server that distributes data; in other examples, the live broadcast server and the communication server may be the same server. The method includes the following steps:
the steps performed by the anchor 210 may include steps 201 to 203:
in step 201, a mapping relationship between the video frame identifier and the information identifier is determined.
In step 202, the live video stream is sent to the viewer through the live server.
In step 203, the mapping relationship and the target information are sent to the viewer through the communication server.
The video frame identifier may be the identifier of the first associated video frame among multiple associated video frames, so that the target information starts playing at the same moment that first frame plays. The video frame identifier may also be the set of identifiers of all associated video frames, or the identifiers of the first and last associated video frames, so that the target information plays while the associated frames play. The information identifier is an identifier of the target information.
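The three identifier variants just described can be handled by one check at the viewer end. The encoding below is an illustrative assumption, not from the patent: an int for the first associated frame, a set for all associated frames, a pair for first and last.

```python
def info_visible(frame_id, identifier):
    """Should the target information be on screen while frame_id plays?"""
    if isinstance(identifier, tuple):      # (first, last) associated frames
        first, last = identifier
        return first <= frame_id <= last
    if isinstance(identifier, set):        # set of all associated frames
        return frame_id in identifier
    return frame_id >= identifier          # first associated frame only
```

The first variant shows the information from the identified frame onward; the other two also tell the viewer end when to take it down again.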
The steps performed by the spectator terminal 240 may include steps 204 and 205:
in step 204, acquiring a live video stream, target information and a mapping relation;
in step 205, based on the mapping relationship, the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier are played simultaneously.
The server ends are not shown in fig. 2B. The anchor end can invoke a camera to record video and take photos, and can also obtain audio and video data directly from a capture end to produce live broadcast data. In one example, to keep audio and video synchronized, after the audio data and image data are obtained at the live broadcast end, they may be combined into one stream, ensuring synchronization between them and avoiding picture and audio drifting apart in transmission.
In a scenario where a video frame associated with the target information in the live video stream must play synchronously with the target information, a user logs into the live room within a specified time and operates online under the guidance of a host. The video frames in the live video stream associated with the target information may be called associated video frames. A capture end films the host and records the corresponding audio, and the target information is delivered at the moment the host announces it, so that the viewer end receives both the live video stream data and the target information and plays the target information while playing the live video stream.
The target information is information related to the live video stream. Divided by type, it may be text, image, audio, or video information; divided by content, it may be a quiz question, information about goods for sale, and so on. During the live broadcast, the user at the anchor end (such as the host) announces the target information. If the target information is text or image information, announcing it may mean reading out its specific content or giving an indication of it; if it is text, image, audio, or video information, announcing it may mean playing it while introducing it, where playing covers both display and playback. Taking a quiz question as the target information, in one example the anchor user may announce the question's specific content, for example: "In which country is the famous 'Manneken Pis' bronze statue located?", and may even read out the answer options, such as: 1. the Netherlands; 2. Belgium; 3. Ireland. In another example, the anchor user may announce only the question number, such as: "Please look at the first question."
To play the target information in synchronization with its associated video frames in the live video stream, the embodiments of the present disclosure determine a mapping relationship between a video frame identifier and an information identifier.
The video frame identifier is an identifier indicating an associated video frame, i.e., a video frame in the live video stream that is expected to be played in synchronization with the target information. If the live video stream includes multiple associated video frames, the video frame identifier may be the identifier of the first of those frames, so that the target information starts playing at the moment the first associated frame is played. In one example, the video frame identifier may be tag information of the associated video frame, such as a name, number, or ID. In practice, the viewer-side program often locates video frames by time; to minimize changes to the viewer-side program, in another example the video frame identifier may be the time of the current video frame (the frame currently being generated) at the moment the anchor side transmits the target information, referred to here as the video frame time to distinguish it from other times. The video frame time may be the position of that frame within the video stream. Because the target information is usually triggered and issued exactly when the anchor side needs it, the frame being generated when the anchor side sends the target information is the frame associated with it, and the two need to be played simultaneously at the viewer side.
The information identifier is information that identifies the target information. In one example, it may be tag information of the target information, such as a name, number, or ID. In another example, it may be the time at which the anchor side transmits the target information, referred to here as the information delivery time to distinguish it from other times.
In an optional implementation, when an issuing instruction for the target information is received, the current time may be taken as the information delivery time, and the time of the current video frame in the video stream may be taken as the video frame time.
Here, the issuing instruction is an instruction to send the target information. It may be triggered at the moment when the anchor side is generating (or about to generate) the associated video frame and is about to play the target information. For example, the issuing instruction may be triggered when broadcasting of the target information begins. Using the start of the broadcast as the trigger condition allows the viewer side to play the target information in synchronization with the video frames captured during the broadcast.
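The pairing of the information delivery time with the video frame time can be sketched as below. This is a minimal illustration, not the patented implementation; the function and field names are hypothetical, and the frame timestamp is assumed to be supplied by the anchor-side encoder.

```python
import time

def build_mapping(current_frame_pts_ms):
    """On receiving the issuing instruction, pair the current wall-clock
    time (information delivery time) with the timestamp of the video
    frame currently being generated (video frame time)."""
    info_delivery_time_ms = time.time() * 1000  # the information identifier
    return {
        "video_frame_time_ms": current_frame_pts_ms,  # the video frame identifier
        "info_delivery_time_ms": info_delivery_time_ms,
    }

# Example: the issuing instruction fires while the frame at 125400 ms
# of the stream is being generated.
mapping = build_mapping(current_frame_pts_ms=125400.0)
```

The resulting mapping is what the anchor side pushes to the viewer side through the communication server alongside the target information.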
In one example, the issuing instruction may be generated by a specific person pressing a key when broadcasting of the target information begins. The key may be a physical key or a touch-sensitive virtual key. For example, when the host begins broadcasting the target information, such as reciting "The first question is ...", a staff member working with the host clicks a key on the device to trigger generation of the issuing instruction for the target information. The sending moment is thus controlled manually, which is easy to implement.
In another example, the issuing instruction may be generated when the device recognizes a preset keyword or preset key audio in the live video frames. For example, when the host begins broadcasting the target information by reciting "The first question is ...", the anchor side recognizes this speech as the preset key audio and triggers generation of the issuing instruction. The issuing instruction can thus be triggered automatically through speech or image recognition, saving labor cost.
It can be understood that the issuing instruction may also be triggered by other conditions, which are not enumerated here.
The anchor side sends the mapping relationship and the target information to the viewer side through the communication server, transmitting the data stream over a communication channel that may be implemented as a long link. A long link is a connection between the terminal and the server over which data can be transmitted in both directions. Typically a TCP or UDP link is established; once established, the server pushes data to the terminal on the downlink, and the terminal sends heartbeat packets to the server on the uplink. If a heartbeat packet is not detected or not acknowledged, the server breaks the link and the client reconnects.
In the embodiments of the present disclosure, the anchor side may send target information containing the specific content to the viewer side. To improve security, the transmitted target information may instead be the content of the target information in encrypted form. Sending the encrypted target information to the viewer side through the communication server thus preserves the timeliness of the target information while also protecting its specific content.
It can be understood that the embodiments of the present disclosure take sending the target information itself to the viewer side as an example; in other optional implementations, the anchor side may send only the ID of the target information. In one example, a database containing the target information may be preloaded at the viewer side. When the viewer side receives the ID of the target information, it obtains the corresponding target information from this pre-stored database.
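The preloaded-database variant can be sketched as follows. The schema and names are illustrative assumptions; the patent does not specify how the viewer-side database is structured.

```python
# Hypothetical preloaded viewer-side database mapping question IDs
# to question content (schema is an assumption for illustration).
question_db = {
    "q1": {
        "text": "The famous 'Manneken Pis' bronze statue is located in which country?",
        "options": ["Netherlands", "Belgium", "Ireland"],
    },
}

def resolve_target_info(info_id):
    """Viewer side: given only the ID pushed over the data channel,
    look up the full target information in the preloaded database."""
    return question_db.get(info_id)
```

If the ID is unknown (or, as in the variant below, the local database is considered too easy to crack), the viewer side would instead fetch the content from a service server.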
However, to avoid the risk of the database stored at the viewer side being cracked, in another example, after obtaining the target information ID, the viewer side may retrieve the target information corresponding to that ID from a service server that stores the target information, thereby improving the security of the target information.
After the viewer side obtains the live video stream, the target information, and the mapping relationship, it can play the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier based on the mapping relationship. Playing here covers both presentation and playback: for example, displaying the target information while playing the video data stream. In practice, the data stream usually travels faster than the video stream, so the viewer side may receive the target information before its associated video frame; the target information is therefore not displayed immediately upon receipt but is stored locally. Based on the correspondence between the information identifier and the video frame identifier, the target information corresponding to the information identifier is played when the video frame corresponding to the video frame identifier is played. For example, according to the correspondence between the information delivery time and the video frame time, the question topic is displayed the moment the player reaches the frame at the video frame time, ensuring consistency among picture, audio, and target information.
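The buffer-then-display behavior described above can be sketched as a pair of viewer-side callbacks. This is a minimal sketch; the callback names and the assumption of a per-frame player hook are illustrative.

```python
pending = {}  # video frame time -> target information, held until that frame plays

def on_target_info(video_frame_time_ms, target_info):
    """Store target information on arrival instead of showing it at once,
    since the data channel usually outruns the video channel."""
    pending[video_frame_time_ms] = target_info

def on_frame_rendered(frame_pts_ms, display):
    """Hypothetical player callback per rendered frame: show any target
    information whose associated video frame time has now been reached."""
    for t in sorted(k for k in pending if k <= frame_pts_ms):
        display(pending.pop(t))
```

Keying the pending store by video frame time (rather than by arrival order) is what keeps picture, audio, and target information consistent even when the data channel delivers early.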
If the target information is display information, it can be displayed over the picture corresponding to the live video stream. It may be displayed directly, or processed before display so as to provide a control the viewer can operate. Taking a question topic as the target information, an answer frame containing the question may be displayed over the video frame. In one example, the answer frame contains not only the question content but also an answer input control, through which the viewer inputs an answer. In another example, to improve answering efficiency, each option may itself be a trigger control; when the viewer clicks or touches an option, the triggered option serves as the answer submitted by the viewer. As shown in fig. 2C, fig. 2C is a schematic view of a live question-answering interface according to an exemplary embodiment of the present disclosure. In this illustration, each option is a trigger control, and when the user clicks one of the options, the triggered option (Belgium) serves as the answer submitted by the viewer.
In an optional implementation, the playing time of the target information is limited and may be controlled by a countdown. Specifically, the countdown starts when the target information begins to play; when it ends, playback of the target information stops or the operation control associated with the target information is disabled, so that the viewer can no longer operate once the countdown has finished.
The countdown time may be a preset fixed duration, for example 10 s. Setting the countdown time directly in this way is easy to implement.
In practice, however, different viewer-side devices (on different networks, for example) receive the target information at different times. With a fixed countdown that starts when the target information is received and displayed, the following can occur: because devices receive the target information at different moments, a viewer whose device receives it late could learn the target information in advance by watching it on a device that receives it early. To avoid this, the countdown time is determined from the difference between the standard countdown time and the terminal network delay, where the terminal network delay may be the network delay between the viewer side and the communication server.
In one example, the terminal network delay may be determined from the transmission time of a time-synchronization request initiated by the communication server and the time at which the viewer side receives that request. For example, the communication server periodically sends the viewer side a time-synchronization request carrying its transmission time, and the viewer side determines the terminal network delay from the difference between the reception time of the request and the transmission time carried in it.
In another example, the terminal network delay may be determined from the time at which the communication server sends the target information and the time at which the viewer side receives it, so that the delay is measured on every transmission of target information and is updated in real time.
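The fair-countdown computation described above can be sketched as below. It assumes (as the patent's examples do) that the send time is carried in the message and that clocks are comparable; the function names are illustrative.

```python
def terminal_network_delay_ms(send_time_ms, recv_time_ms):
    """Delay between communication server and viewer side, measured from
    a time-sync request (or target-information push) carrying its
    transmission time."""
    return recv_time_ms - send_time_ms

def countdown_ms(standard_countdown_ms, delay_ms):
    """Fair countdown: a viewer whose device received the target
    information late gets correspondingly less operating time."""
    return max(0.0, standard_countdown_ms - delay_ms)

# Example: 10 s standard countdown, 300 ms measured delay.
remaining = countdown_ms(10000.0, terminal_network_delay_ms(0.0, 300.0))
```

Clamping at zero is a defensive choice for pathological delays; the source does not discuss that case.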
To indicate the progress of the countdown, a reminder may be given, either by voice or by display. When displayed, the remaining time may be shown as a number or represented by a progress bar or similar means. For example, as shown in fig. 2C, the countdown progress is displayed as a bar-shaped progress bar. It can be understood that the progress bar may be bar-shaped, circular, and so on, without limitation.
The viewer side can return the operation result to a server side, for example to the communication server or another business server. The server side can perform statistical analysis on the operation results and feed the analysis back to the viewer side and the anchor side. Taking a question topic as the target information, the server side can count the proportion of viewers selecting each option and then feed the statistics back to the viewer side and the anchor side.
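The per-option statistics can be sketched with a simple counter. This is an illustrative aggregation only; the patent does not specify the statistical method.

```python
from collections import Counter

def option_statistics(answers):
    """Server side: proportion of viewers choosing each option,
    to be fed back to the viewer side and the anchor side."""
    counts = Counter(answers)
    total = sum(counts.values())
    return {option: counts[option] / total for option in counts}

# Example: four submitted answers across options A, B, C.
stats = option_statistics(["B", "B", "A", "C"])
```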
The technical features in the above embodiments can be combined arbitrarily as long as the combinations involve no conflict or contradiction. For reasons of space the combinations are not described one by one, but any combination of these technical features also belongs to the scope disclosed in this specification. One combination is illustrated in the following embodiment:
As shown in fig. 3, fig. 3 is an application scenario diagram of an information synchronized playing method according to an exemplary embodiment of the present disclosure. The method is applied to a live question-answering system, which may include an anchor side 310, a live service side 320, a communication server 330, and a viewer side 340. The viewer sides may include a first viewer side through an Nth viewer side, where N is an integer greater than 1.
The anchor side generates a live data stream from the audio and video collected by the acquisition end. When a question topic is broadcast, the current time is taken as the information delivery time (i.e., the topic delivery time) and the time of the current video frame in the video stream as the video frame time, yielding the mapping relationship between the video frame time and the topic delivery time. The live video stream is sent to the viewer side through the live service side, and the mapping relationship and the question topic are sent to the viewer side through the communication server.
As for sending the question to the viewer side through the communication server: in one example, the question may be sent directly to the communication server, which forwards it to the viewer side. In another example, the communication server may store the question topics in advance, and even their answers; the anchor side then sends a topic identifier (e.g., an ID) to the communication server, which determines the question from the identifier and sends it to the viewer side. Furthermore, the communication server can encrypt the question before sending it to the viewer side, improving the security of the question.
Having obtained the live video stream, the question topics, and the mapping relationship, the viewer side can play the question corresponding to the topic delivery time simultaneously with the video frame corresponding to the video frame time according to the mapping relationship. For example, a received question is first saved locally; based on the mapping between the video frame time and the topic delivery time, the question corresponding to the topic delivery time is displayed on screen when the video frame corresponding to the video frame time is played, ensuring consistency among picture, audio, and topic.
Further, the playing time of the target information can be limited by a countdown. For example, the countdown starts when the question begins to play; when it ends, playback of the target information stops or the operation control associated with the target information is disabled, so that the viewer can no longer operate. The countdown time may be determined from the difference between the standard countdown time and the terminal network delay, which may be the network delay between the viewer side and the communication server.
After receiving an answer from the viewer side, the communication server can judge its validity. The answering time may be a preset duration, for example 10 s, equal to the standard countdown time. In one example, a grace value is allowed on the answer timeout to account for the time consumed by communication. When judging validity, the difference between the time the question was sent and the time the answer was received is compared with the sum of the standard countdown time and the grace value; validity is determined from the comparison result, and invalid answers may be deleted. The grace value may be a time value obtained from statistics on historical communication time, for example adding 1 s to the original answering time so that all answers received within 11 s are considered valid. The grace value may also be obtained from the difference between the time the communication server issued the question and the time the first associated video frame was issued.
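The validity check described above can be sketched as below, assuming (as in the example) a 10 s standard countdown plus a 1 s grace value; the defaults and names are illustrative.

```python
def answer_is_valid(question_sent_ms, answer_received_ms,
                    standard_countdown_ms=10000.0, grace_ms=1000.0):
    """Communication-server side: an answer is valid if it arrives within
    the standard countdown plus a grace value covering transport time
    (e.g. 10 s + 1 s, so answers received within 11 s count)."""
    elapsed_ms = answer_received_ms - question_sent_ms
    return elapsed_ms <= standard_countdown_ms + grace_ms
```

Invalid answers would simply be discarded before the statistics step.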
It can be understood that in this embodiment the mapping relationship, the questions, and the result analysis and statistics are all handled by the same server (the communication server); in other embodiments, the mapping relationship and the questions may be sent by the communication server while the result analysis and statistics are performed by an answer server (a business server). These variants are not enumerated here.
It can be seen from the above embodiments that, because the mapping relationship between the video frame time and the topic delivery time is recorded, the question corresponding to the topic delivery time can be played simultaneously with the video frame corresponding to the video frame time according to that mapping, ensuring consistency among picture, audio, and topic and improving the user experience, while a fair timing strategy achieves a fair answer countdown.
Next, the embodiments of the present disclosure are illustrated from the perspective of the anchor side.
As shown in fig. 4A, fig. 4A is a flowchart of another information synchronized playing method according to an exemplary embodiment of the present disclosure. The method may be used at the anchor side and includes the following steps:
in step 401, a mapping relationship between a video frame identifier and an information identifier is determined, where the video frame identifier is an identifier of a video frame in a live video stream that is expected to be played in synchronization with the target information, and the information identifier is an identifier of the target information;
in step 402, the live video stream is sent to the viewer side through the live service side, and the mapping relationship and the target information are sent to the viewer side through the communication server, so that the viewer side can simultaneously play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
The video frame identifier is the identifier of a video frame in the live video stream that is expected to be played simultaneously with the target information. In one example, if multiple frames in the live video stream are expected to be played simultaneously with the target information, the video frame identifier may be the identifier of the first of those associated video frames, so that the target information starts playing when the first associated frame is played. In another example, the video frame identifier may be the set of identifiers of all associated video frames, or the identifiers of the first and last associated frames, so that the target information is played throughout the playback of the associated frames. The information identifier is an identifier of the target information.
The target information is information related to the live video stream. Divided by type, it may be text, image, audio, or video information, among others; divided by content, it may be a question topic, information on goods for sale, and the like. The video frames in the live video stream associated with the target information may be referred to as associated video frames, and the video frame identifier may be an identifier representing an associated video frame. In an optional implementation, the video frame identifier may be the time of the current video frame (the frame currently being generated) at the moment the anchor side transmits the target information, referred to here as the video frame time to distinguish it from other times. The video frame time may be the position of that frame within the video stream. Because the target information is usually triggered and issued exactly when the anchor side needs it, the frame being generated when the anchor side sends the target information is the frame associated with it, and the two need to be played simultaneously at the viewer side. In an optional implementation, the information identifier may be the time at which the anchor side sends the target information, referred to here as the information delivery time to distinguish it from other times.
The anchor side can send the live video stream to the viewer side through the live service side, and send the mapping relationship and the target information to the viewer side through the communication server, so that after obtaining the live video stream, the target information, and the mapping relationship, the viewer side can simultaneously play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier based on the mapping relationship.
It can be understood that aspects of this embodiment that are the same as those of the embodiment shown in fig. 2B are not repeated here.
It can be seen from this embodiment that the anchor side determines the mapping relationship between the video frame identifier and the information identifier, which improves the accuracy of the mapping relationship; after the mapping relationship is sent to the viewer side, the viewer side can play the question topic corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier according to that mapping, ensuring consistency among picture, audio, and target information and improving the user experience.
Next, the embodiments of the present disclosure are illustrated from the perspective of the viewer side.
As shown in fig. 4B, fig. 4B is a flowchart of another information synchronized playing method according to an exemplary embodiment of the present disclosure. The method may be used at the viewer side and includes the following steps:
in step 403, a live video stream, target information, and a mapping relationship between a video frame identifier and an information identifier are acquired, where the video frame identifier is an identifier of a video frame in the live video stream that is expected to be played in synchronization with the target information, and the information identifier is an identifier of the target information;
in step 404, based on the mapping relationship, the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier are played simultaneously.
The video frame identifier is the identifier of a video frame in the live video stream that is expected to be played simultaneously with the target information. In one example, if multiple frames in the live video stream are expected to be played simultaneously with the target information, the video frame identifier may be the identifier of the first of those associated video frames, so that the target information starts playing when the first associated frame is played. In another example, the video frame identifier may be the set of identifiers of all associated video frames, or the identifiers of the first and last associated frames, so that the target information is played throughout the playback of the associated frames. The information identifier is an identifier of the target information.
The target information is information related to the live video stream. Divided by type, it may be text, image, audio, or video information, among others; divided by content, it may be a question topic, information on goods for sale, and the like. The video frames in the live video stream associated with the target information may be referred to as associated video frames, and the video frame identifier may be an identifier representing an associated video frame. In an optional implementation, the video frame identifier may be the time of the current video frame (the frame currently being generated) at the moment the anchor side transmits the target information, referred to here as the video frame time to distinguish it from other times. The video frame time may be the position of that frame within the video stream. Because the target information is usually triggered and issued exactly when the anchor side needs it, the frame being generated when the anchor side sends the target information is the frame associated with it, and the two need to be played simultaneously at the viewer side. In an optional implementation, the information identifier may be the time at which the anchor side sends the target information, referred to here as the information delivery time to distinguish it from other times.
As for how the target information is obtained, the viewer side may receive the target information sent by the server, or may obtain, from a local database or another server, the target information corresponding to an ID sent by the communication server; this is not limited here.
After the viewer side obtains the live video stream, the target information, and the mapping relationship, it can play the target information corresponding to the information identifier simultaneously with the video frame corresponding to the video frame identifier based on the mapping relationship. Playing here covers both presentation and playback: for example, displaying the target information while playing the video data stream. In practice, the data stream usually travels faster than the video stream, so the viewer side may receive the target information before its associated video frame; the target information is therefore not displayed immediately upon receipt but is stored locally. Based on the correspondence between the information identifier and the video frame identifier, the target information corresponding to the information identifier is played when the video frame corresponding to the video frame identifier is played, ensuring consistency among picture, audio, and target information.
If the target information is display information, it can be displayed over the picture corresponding to the live video stream. It may be displayed directly, or processed before display so as to provide a control the viewer can operate.
In an optional implementation, the playing time of the target information is limited and may be controlled by a countdown. Specifically, the countdown starts when the target information begins to play; when it ends, playback of the target information stops or the operation control associated with the target information is disabled, so that the viewer can no longer operate once the countdown has finished.
The countdown time may be a preset fixed duration, for example 10 s. Setting the countdown time directly in this way is easy to implement.
In practice, however, different viewer-side devices (on different networks, for example) receive the target information at different times. With a fixed countdown that starts when the target information is received and displayed, the following can occur: because devices receive the target information at different moments, a viewer whose device receives it late could learn the target information in advance by watching it on a device that receives it early. To avoid this, the countdown time is determined from the difference between the standard countdown time and the terminal network delay, where the terminal network delay may be the network delay between the viewer side and the communication server.
In one example, the terminal network delay may be determined from the transmission time of a time-synchronization request initiated by the communication server and the time at which the viewer side receives that request. For example, the communication server periodically sends the viewer side a time-synchronization request carrying its transmission time, and the viewer side determines the terminal network delay from the difference between the reception time of the request and the transmission time carried in it.
In another example, the terminal network delay may be determined from the time at which the communication server sends the target information and the time at which the viewer side receives it, so that the delay is measured on every transmission of target information and is updated in real time.
The viewer side can return the operation result to a server side, for example to the communication server or another business server. The server side can perform statistical analysis on the operation results and feed the analysis back to the viewer side and the anchor side. Taking a question topic as the target information, the server side can count the proportion of viewers selecting each option and then feed the statistics back to the viewer side and the anchor side.
It is understood that the embodiments of the present disclosure are the same as the related art of the embodiment shown in fig. 2B, and are not repeated herein.
According to the embodiment, the audience terminal can obtain the mapping relation between the video frame identification and the information identification, so that the problem topic corresponding to the information identification and the video frame corresponding to the video frame identification can be played simultaneously according to the mapping relation, the consistency of the picture, the audio and the target information is ensured, and the user experience is improved.
Corresponding to the embodiment of the information synchronous playing method, the disclosure also provides embodiments of an information synchronous playing device, equipment and a system applied by the device, and a storage medium.
As shown in fig. 5, fig. 5 is a block diagram of an information synchronized playing system shown in the present disclosure according to an exemplary embodiment, and is used for a live system, where the live system includes a main broadcast end, a live broadcast service end, a communication service end, and a viewer end, the system includes a relationship determining module 510 and an information sending module 520 that are provided at the main broadcast end, and an information obtaining module 530 and an information playing module 540 that are provided at the viewer end:
a relation determining module 510 configured to determine a mapping relation between the video frame identifier and the information identifier;
an information sending module 520, configured to send a live video stream to a viewer through a live service end, and send the mapping relationship and target information to the viewer through a communication service end, where the video frame identifier is an identifier of a video frame that is expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information;
an information obtaining module 530 configured to obtain a live video stream, target information, and a mapping relationship;
and the information playing module 540 is configured to play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relationship.
As shown in fig. 6, fig. 6 is a block diagram of an information synchronized playback apparatus shown in the present disclosure according to an exemplary embodiment, the apparatus including:
a relationship determining module 610 configured to determine a mapping relationship between a video frame identifier and an information identifier, where the video frame identifier is an identifier of a video frame expected to be played in synchronization with the target information in a live video stream, and the information identifier is an identifier of the target information;
the information sending module 620 is configured to send the live video stream to the audience through the live service end, and send the mapping relationship and the target information to the audience through the communication service end, so that the audience can play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relationship. In an alternative implementation, the target information includes a question topic.
In an optional implementation manner, the information identifier is information delivery time when the anchor side sends the target information.
In an optional implementation manner, the video frame identifier is a video frame time of a current video frame when the anchor side transmits the target information.
In an optional implementation manner, the relationship determining module 610 is specifically configured to:
when receiving the issuing instruction of the target information, taking the current time as the information issuing time, taking the time of the current video frame in the video stream as the video frame time, and obtaining the mapping relation between the video frame identifier and the information identifier;
and the issuing instruction is an instruction generated by triggering a key by a specified object when the target information starts to be broadcasted.
As shown in fig. 7, fig. 7 is a block diagram of another information synchronized playback apparatus according to an exemplary embodiment of the present disclosure, the apparatus including:
an information obtaining module 710 configured to obtain a live video stream, target information, and a mapping relationship between a video frame identifier and an information identifier, where the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information;
and the information playing module 720 is configured to play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relationship.
In an alternative implementation, the target information includes a question topic.
In an optional implementation manner, the information identifier is information delivery time when the anchor side sends the target information.
In an optional implementation manner, the video frame identifier is a video frame time of a current video frame when the anchor side transmits the target information.
As shown in fig. 8, fig. 8 is a block diagram of another information synchronized playback apparatus shown in the present disclosure according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 7, and the apparatus further includes a countdown module 730 configured to:
starting countdown when starting to synchronously play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier, and stopping playing the target information or forbidding an operation control related to the target information when the countdown is finished;
the countdown time of the countdown is determined based on the difference between the standard countdown time and the terminal network delay, and the terminal network delay is the network delay between the audience terminal and the communication service terminal.
Correspondingly, the present disclosure also provides a live broadcast system, which includes a main broadcast terminal, a live broadcast server terminal, a communication server terminal, and a spectator terminal;
the method comprises the steps that a main broadcasting end determines a mapping relation between a video frame identifier and an information identifier, a live video stream is sent to a spectator end through a live broadcasting service end, the mapping relation and target information are sent to the spectator end through a communication service end, the video frame identifier is an identifier of a video frame which is expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information;
and the audience terminal obtains the live broadcast video stream, the target information and the mapping relation, and simultaneously plays the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier based on the mapping relation.
Correspondingly, the present disclosure also provides an electronic device, which includes a processor; a memory for storing processor-executable instructions; wherein the processor is configured to:
acquiring a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information;
and based on the mapping relation, simultaneously playing the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
Accordingly, the present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above.
The present disclosure may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
The specific details of the implementation process of the functions and actions of each module in the device are referred to the implementation process of the corresponding step in the method, and are not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
As shown in fig. 9, fig. 9 is a block diagram of an apparatus for synchronized playback of information according to an exemplary embodiment of the present disclosure. The apparatus 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to fig. 9, apparatus 900 may include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 909, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 generally controls overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. Processing component 902 may include one or more processors 920 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 902 can include one or more modules that facilitate interaction between processing component 902 and other components. For example, the processing component 902 may include a multimedia module to facilitate interaction between the multimedia component 909 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the apparatus 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 904 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 906 provides power to the various components of the device 900. The power components 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 909 includes a screen that provides an output interface between the device 900 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 909 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 900 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, audio component 910 includes a Microphone (MIC) configured to receive external audio signals when apparatus 900 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 904 or transmitted via the communication component 916. In some embodiments, audio component 910 also includes a speaker for outputting audio signals.
I/O interface 912 provides an interface between processing component 902 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessment of various aspects of the apparatus 900. For example, sensor assembly 914 may detect an open/closed state of device 900, the relative positioning of components, such as a display and keypad of device 900, the change in position of device 900 or one of the components of device 900, the presence or absence of user contact with device 900, the orientation or acceleration/deceleration of device 900, and the change in temperature of device 900. The sensor assembly 914 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate communications between the apparatus 900 and other devices in a wired or wireless manner. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 904 comprising instructions, executable by the processor 920 of the apparatus 900 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Wherein the instructions in the storage medium, when executed by the processor, enable the apparatus 900 to perform a method for synchronized playback of information, comprising:
determining a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in a live video stream, and the information identifier is an identifier of the target information; and sending the live video stream to a viewer through a live broadcast server, and sending the mapping relation and the target information to the viewer through a communication server, so that the viewer can play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier simultaneously based on the mapping relation.
Alternatively, the first and second electrodes may be,
acquiring a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information; and based on the mapping relation, simultaneously playing the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (15)

1. An information synchronous playing method is characterized by comprising the following steps:
determining a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with target information in a live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit;
sending the live video stream to a spectator end through a live broadcast server end, and sending the mapping relation and the target information to the spectator end through a communication server end so that the spectator end can play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relation; and the time for receiving the target information by the audience is earlier than the time for receiving the video frame expected to be played synchronously with the target information.
2. The method of claim 1,
the target information comprises question questions; and/or the presence of a gas in the gas,
the information identification is information issuing time when the anchor terminal sends the target information; and/or the presence of a gas in the gas,
and the video frame identifier is the video frame time of the current video frame when the anchor terminal sends the target information.
3. The method of claim 2, wherein the step of determining the information delivery time and the video frame time comprises:
when receiving the issuing instruction of the target information, taking the current time as the information issuing time and taking the time of the current video frame in the video stream as the video frame time;
and the issuing instruction is an instruction generated by triggering a key by a specified object when the target information starts to be broadcasted.
4. An information synchronous playing method is characterized by comprising the following steps:
acquiring a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit; receiving the target information at a time earlier than a time of receiving a video frame expected to be played in synchronization with the target information;
and based on the mapping relation, simultaneously playing the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
5. The method of claim 4, wherein the target information comprises a question topic; and/or the presence of a gas in the gas,
the information identification is information issuing time when the anchor terminal sends the target information; and/or the presence of a gas in the gas,
and the video frame identifier is the video frame time of the current video frame when the anchor terminal sends the target information.
6. The method of claim 4, further comprising:
starting countdown when starting to synchronously play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier, and stopping playing the target information or forbidding an operation control related to the target information when the countdown is finished;
the countdown time of the countdown is determined based on the difference between the standard countdown time and the terminal network delay, and the terminal network delay is the network delay between the audience terminal and the communication service terminal.
7. An information synchronous playing device, characterized in that the device comprises:
the relation determining module is configured to determine a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with target information in a live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit;
the information sending module is configured to send a live video stream to a viewer through a live service end, and send the mapping relation and the target information to the viewer through a communication service end, so that the viewer can play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relation; and the time for receiving the target information by the audience is earlier than the time for receiving the video frame expected to be played synchronously with the target information.
8. The apparatus of claim 7,
the target information comprises question questions; and/or the presence of a gas in the gas,
the information identification is information issuing time when the anchor terminal sends the target information; and/or the presence of a gas in the gas,
and the video frame identifier is the video frame time of the current video frame when the anchor terminal sends the target information.
9. The apparatus of claim 8, wherein the relationship determination module is specifically configured to:
when receiving the issuing instruction of the target information, taking the current time as the information issuing time, taking the time of the current video frame in the video stream as the video frame time, and obtaining the mapping relation between the video frame identifier and the information identifier;
and the issuing instruction is an instruction generated by triggering a key by a specified object when the target information starts to be broadcasted.
10. An information synchronous playing device, characterized in that the device comprises:
the information acquisition module is configured to acquire a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit; receiving the target information at a time earlier than a time of receiving a video frame expected to be played in synchronization with the target information;
and the information playing module is configured to play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier at the same time based on the mapping relation.
11. The apparatus of claim 10, wherein the target information comprises a question topic; and/or the presence of a gas in the gas,
the information identification is information issuing time when the anchor terminal sends the target information; and/or the presence of a gas in the gas,
and the video frame identifier is the video frame time of the current video frame when the anchor terminal sends the target information.
12. The apparatus of claim 10, further comprising a countdown module configured to:
starting countdown when starting to synchronously play the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier, and stopping playing the target information or forbidding an operation control related to the target information when the countdown is finished;
the countdown time of the countdown is determined based on the difference between the standard countdown time and the terminal network delay, and the terminal network delay is the network delay between the audience terminal and the communication service terminal.
13. A live broadcast system is characterized by comprising a main broadcast end, a live broadcast service end, a communication service end and an audience end;
the method comprises the steps that a main broadcasting end determines a mapping relation between a video frame identifier and an information identifier, a live video stream is sent to a spectator end through a live broadcasting service end, the mapping relation and target information are sent to the spectator end through a communication service end, the video frame identifier is an identifier of a video frame which is expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit;
the audience terminal obtains a live video stream, target information and a mapping relation, and simultaneously plays the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier based on the mapping relation; and the time for receiving the target information by the audience is earlier than the time for receiving the video frame expected to be played synchronously with the target information.
14. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a live video stream, target information and a mapping relation between a video frame identifier and an information identifier, wherein the video frame identifier is an identifier of a video frame expected to be played synchronously with the target information in the live video stream, and the information identifier is an identifier of the target information; the target information is data with play time limit; the time for receiving the target information is earlier than the time for receiving the video frame expected to be played synchronously with the target information;
and based on the mapping relation, simultaneously playing the target information corresponding to the information identifier and the video frame corresponding to the video frame identifier.
15. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201810205908.4A 2018-03-13 2018-03-13 Information synchronous playing method, device, equipment, system and storage medium Active CN108449605B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810205908.4A CN108449605B (en) 2018-03-13 2018-03-13 Information synchronous playing method, device, equipment, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810205908.4A CN108449605B (en) 2018-03-13 2018-03-13 Information synchronous playing method, device, equipment, system and storage medium

Publications (2)

Publication Number Publication Date
CN108449605A CN108449605A (en) 2018-08-24
CN108449605B true CN108449605B (en) 2020-10-30

Family

ID=63194207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810205908.4A Active CN108449605B (en) 2018-03-13 2018-03-13 Information synchronous playing method, device, equipment, system and storage medium

Country Status (1)

Country Link
CN (1) CN108449605B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109714622B (en) * 2018-11-15 2021-04-16 北京奇艺世纪科技有限公司 Video data processing method and device and electronic equipment
CN109819300A (en) * 2019-01-03 2019-05-28 北京潘达互娱科技有限公司 Message content display method, sending method, device, terminal and system
CN112118488A (en) * 2019-06-20 2020-12-22 京东方科技集团股份有限公司 Live broadcast method, electronic equipment and live broadcast system
CN110650353B (en) * 2019-09-25 2020-12-04 广州华多网络科技有限公司 Multi-person co-hosting video mixing method and device, storage medium and electronic equipment
CN110602260A (en) * 2019-11-12 2019-12-20 南京创维信息技术研究院有限公司 Method and device for synchronous scene playing of intelligent equipment based on AIOT technology

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7721308B2 (en) * 2005-07-01 2010-05-18 Microsoft Corporation Synchronization aspects of interactive multimedia presentation management
US8843959B2 (en) * 2007-09-19 2014-09-23 Orlando McMaster Generating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time
CN104811774A (en) * 2015-04-29 2015-07-29 北京中传数广技术有限公司 Live television precise advertisement and information delivery method and system
CN106488291A (en) * 2016-11-17 2017-03-08 百度在线网络技术(北京)有限公司 Method and apparatus for synchronously displaying files in live video streaming
CN106534944A (en) * 2016-11-30 2017-03-22 北京锤子数码科技有限公司 Video display method and device

Similar Documents

Publication Publication Date Title
CN108260016B (en) Live broadcast processing method, device, equipment, system and storage medium
CN108449605B (en) Information synchronous playing method, device, equipment, system and storage medium
CN112218103B (en) Live broadcast room interaction method and device, electronic equipment and storage medium
US20210281909A1 (en) Method and apparatus for sharing video, and storage medium
CN110166788B (en) Information synchronous playing method, device and storage medium
CN109348239B (en) Live broadcast fragment processing method and device, electronic equipment and storage medium
CN111182318B (en) Contribution score generation method and device in live broadcast, electronic equipment and storage medium
CN113315986B (en) Live broadcast interaction method and device, product evaluation method and device, electronic equipment and storage medium
CN109151565B (en) Method and device for playing voice, electronic equipment and storage medium
JP2016535351A (en) Video information sharing method, apparatus, program, and recording medium
CN112738544B (en) Live broadcast room interaction method and device, electronic equipment and storage medium
CN112153407B (en) Live broadcast room data interaction method, related device and equipment
CN109754298B (en) Interface information providing method and device and electronic equipment
US20230007312A1 (en) Method and apparatus for information interaction in live broadcast room
CN114025180A (en) Game operation synchronization system, method, device, equipment and storage medium
CN113573092B (en) Live broadcast data processing method and device, electronic equipment and storage medium
US20220078221A1 (en) Interactive method and apparatus for multimedia service
CN111866531A (en) Live video processing method and device, electronic equipment and storage medium
US20180007420A1 (en) Method, device and system for recording television program
CN110191367B (en) Information synchronization processing method and device and electronic equipment
CN112270561A (en) Electronic resource distribution method and device, electronic equipment and storage medium
CN113518240A (en) Live broadcast interaction method, virtual resource configuration method, virtual resource processing method and device
CN110620956A (en) Live broadcast virtual resource notification method and device, electronic equipment and storage medium
CN109729367A (en) The method, apparatus and electronic equipment of live media content information are provided
CN109788364B (en) Video call interaction method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant