CN111556370A - Interactive video interaction method, device, system and storage medium - Google Patents

Interactive video interaction method, device, system and storage medium

Info

Publication number
CN111556370A
CN111556370A (application CN202010257083.8A)
Authority
CN
China
Prior art keywords
interactive
interactive video
server
interaction
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010257083.8A
Other languages
Chinese (zh)
Other versions
CN111556370B (en)
Inventor
林默
于敬延
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd filed Critical Beijing QIYI Century Science and Technology Co Ltd
Priority to CN202010257083.8A priority Critical patent/CN111556370B/en
Publication of CN111556370A publication Critical patent/CN111556370A/en
Application granted granted Critical
Publication of CN111556370B publication Critical patent/CN111556370B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/251Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the invention relate to an interactive video interaction method, device, system and storage medium, wherein the method comprises: sending first state data to a server when playback reaches the interaction node of any interactive video segment in the interactive video; and sending second state data to the server when an interaction operation on any of the displayed preset interaction options is detected. By defining this interaction mechanism between the client and the server, the user can participate in shaping the subsequent storyline while frequent interaction between client and server is avoided as far as possible. That is, user participation and user experience are improved, and at the same time the interaction pressure and interaction cost between the client and the server are reduced.

Description

Interactive video interaction method, device, system and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to an interactive video interaction method, device and system and a storage medium.
Background
In the traditional video-watching mode, a user finds the video resource to be played and clicks to play it; during playback the user may seek within the video or exit. In this process the user watches the video purely as a spectator: the scenes play out in a fixed order according to a preset timeline, and if the storyline is not what the user hoped for, nothing can be changed. The disadvantage of this approach is that the user cannot participate in shaping the development of the video's storyline, and once the storyline departs from the user's expectations, the user experience inevitably suffers.
Interactive video is a new video type in which the user can interact with the client while watching, and the result of this interaction influences how the subsequent storyline develops. This increases the user's participation in the design of the storyline and effectively improves the user experience.
While the user continuously interacts with the client in the foreground, the client in fact also needs to interact with the server during its background operation, for example by reporting the user's operation instructions to the server in some form so that the server can respond accordingly.
In existing systems, the number of interactions between client and server is usually chosen arbitrarily: too many interactions increase the interaction cost, while too few cause data loss. An interaction mechanism that resolves these drawbacks is therefore urgently needed.
Disclosure of Invention
In view of the above, embodiments of the present invention provide an interactive video interaction method, apparatus, system and storage medium to solve the above technical problems or some of the technical problems.
In a first aspect, an embodiment of the present invention provides an interactive video interaction method, where the method includes:
when the interactive node of any interactive video segment in the interactive video is played, first state data is sent to a server; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking an interactive node as a starting point in the interactive video segment;
and when the interaction operation of any one of the displayed preset interaction options is detected, sending second state data to the server.
In one possible embodiment, the method further comprises: and when the interactive video is determined to be started to play, sending third state data to the server.
In one possible embodiment, the method further comprises:
sending state request information to a server, wherein the state request information comprises user identification information and an interactive video playing request;
receiving previous interactive video playing progress information which is fed back by the server and corresponds to the user identification information and the interactive video playing request;
and determining a starting point of the started playing of the interactive video according to the playing progress information of the previous interactive video.
In one possible embodiment, the method further comprises: and sending fourth state data to the server when playing the video clip corresponding to the interactive operation after responding to the interactive operation.
In one possible embodiment, the method further comprises: and when the end of the interactive video is determined to be played, sending fifth state data to the server.
In one possible embodiment, the method further comprises: and when a control instruction for quitting playing the interactive video is received, sending sixth state data to the server.
In one possible embodiment, the method further comprises: and when the current playing progress is traced back to any previous playing progress in the playing process, the seventh state data is sent to the server.
In one possible embodiment, the method further comprises: and after the current playing progress is traced back to any previous playing progress, eighth state data is sent to the server.
In one possible implementation, the interactive video includes at least one interactive video segment, and in the interactive video segment, at least two preset interaction options are displayed in a preset interaction interval with an interaction node as a starting point, which specifically includes:
and an interaction node is arranged in the preset interaction interval, and at least two preset interaction options are displayed on the interaction node.
In one possible embodiment, the method further comprises: and sending the equipment information of the interactive video watched by the user to a server.
In a second aspect, an embodiment of the present invention provides an interactive video interaction device, where the device includes: the system comprises a processing unit, a sending unit and a display unit;
the processing unit is used for determining whether the interactive node of any interactive video segment in the interactive video is played;
the sending unit is used for sending first state data to the server when the processing unit determines that the interactive node of any interactive video segment in the interactive video is played; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking an interactive node as a starting point in the interactive video segment;
the processing unit is also used for detecting whether the displayed preset interaction options are subjected to interaction operation or not;
the sending unit is further used for sending second state data to the server when the processing unit detects the interactive operation on any one of the displayed preset interactive options.
In a third aspect, an embodiment of the present invention provides an interactive video interaction system, where the system includes:
at least one processor and memory;
the processor is configured to execute the interactive video interaction program stored in the memory to implement the interactive video interaction method as described in any one of the embodiments of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer storage medium, where one or more programs are stored, and the one or more programs are executable by the interactive video interaction system described in the third aspect to implement the interactive video interaction method described in any implementation manner of the first aspect.
According to the interactive video interaction method provided by the embodiment of the invention, the interactive video may comprise at least one interactive video segment. When it is detected that playback has reached the interaction node of any interactive video clip, first state data is sent to the server. In the interactive video clip, at least two preset interaction options are displayed within a preset interaction interval starting at the interaction node, and each preset interaction option is associated with a subsequent storyline of the interactive video; depending on which option a user selects, the subsequent plot develops entirely differently. When an interaction operation on any of the displayed preset interaction options is detected, second state data is sent to the server, so that the server can, for example, compile statistics on users' storyline preferences from their selections. By defining this interaction mechanism between the client and the server, the user can participate in shaping the subsequent storyline while frequent client-server interaction is avoided as far as possible. That is, user participation and user experience are improved, and at the same time the interaction pressure and interaction cost between the client and the server are reduced.
Drawings
Fig. 1 is a schematic flowchart of an interactive video interaction method according to an embodiment of the present invention;
FIG. 2a is a schematic diagram, provided by the present invention, of a current interactive video clip displaying preset interaction options at an interaction node;
FIG. 2b is a schematic view of the scene, provided by the present invention, when the female lead chooses to knock on the giraffe room door;
FIG. 2c is a schematic illustration, provided by the present invention, of a conversation between the female lead and a man in the giraffe room;
FIG. 2d is a schematic view of the scene, provided by the present invention, when the female lead chooses to knock on the zebra room door;
FIG. 2e is a schematic illustration, provided by the present invention, of a conversation between the female lead and another man in the zebra room;
fig. 3 is a schematic process diagram of status identifiers that may be transmitted in an interactive video playing process according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an interactive video interaction device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an interactive video interaction system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
For the convenience of understanding of the embodiments of the present invention, the following description will be further explained with reference to specific embodiments, which are not to be construed as limiting the embodiments of the present invention.
Fig. 1 is a schematic flowchart of an interactive video interaction method according to an embodiment of the present invention. Before the method steps of this embodiment are described in detail, some terms appearing in this application are first explained, to help the reader better understand the application.
The interactive video may be composed of a plurality of video segments. The video segments include at least one interactive video segment. For convenience of illustration, the different video clips may be named V1, V2, V3 … Vn.
An interaction node is a preset node at which the user can interact with the client; the nodes may be denoted I1, I2, I3 … In. By interacting with the client while watching the video, the user influences the development of the subsequent storyline, can explore different branches repeatedly and watch different content, which increases the user's degree of participation.
However, if the user interacts with the client too frequently, viewing is frequently interrupted and the experience degrades; frequent user-client interaction also creates pressure on the subsequent interaction between client and server. For these reasons, only a small number of interaction nodes should be set in each video clip — in a typical implementation no more than two — and their placement can be determined according to how the video's storyline develops.
During playback, the client must interact with the background server continuously enough that the user's data is uploaded to the server in time, so that playback can correctly resume the next time and the user's access records can later be reviewed. The moments at which the client and server interact therefore also need to be determined; they can be understood as interaction nodes between the client and the server. When such a node is reached, the client sends state data to the server, from which the server can compile statistics on video playback, the user's choices among the interaction options, how the user's storyline developed, and so on.
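The client-to-server reporting described above can be sketched minimally as follows. The patent does not specify an implementation, so all names here (`StateReporter`, `send`, the symbol values and parameter names) are illustrative assumptions drawn from the terminology of this description, not the patent's actual interface.

```python
import json

# Illustrative sketch only: a client-side reporter that packages a state
# identifier (symbol) together with its state parameters and "sends" it.
# The `sent` list stands in for the network channel to the server.
class StateReporter:
    def __init__(self):
        self.sent = []

    def send(self, symbol, **params):
        """Package a state identifier plus its parameters as one message."""
        message = {"symbol": symbol, **params}
        self.sent.append(json.dumps(message))
        return message

reporter = StateReporter()
# First state data: playback reached an interaction node of a segment.
first = reporter.send(1, tvId="V1", blockId="I1", currentTime=120.5)
# Second state data: the user picked one of the displayed options.
second = reporter.send(2, tvId="V1", blockId="I1", actionId="O2", autoSelect=False)
```

The key design point mirrored here is that every message carries both kinds of information the text distinguishes: the state identifier and the parameters corresponding to it.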
Referring to fig. 1, the specific steps of the method include:
step 110, when the interactive node of any interactive video segment in the interactive video is played, sending first state data to a server.
Specifically, as introduced above, the interactive video comprises a plurality of video segments, at least one of which is an interactive video segment. The interaction with the user takes place chiefly in the interactive video clips; interaction options are therefore set in the interactive video clips for the user to select, so as to determine the subsequent development of the video's storyline.
When the current playback progress of the interactive video is determined to have reached an interaction node of any interactive video segment, first state data is sent to the server. It should be noted that the first state data, like the other state data described below, actually comprises two kinds of information: a state identifier, and the state parameters corresponding to that identifier. The specific state parameters are shown in Table 1.
TABLE 1
(Table 1 appears as an image in the original filing; it lists, for each state-data symbol, which state parameters are included.)
Table 1 shows the parameter information included in the transmitted state data in the different states. The symbols in Table 1 correspond to the state data: for example, 1 denotes the first state data, 2 denotes the second state data, and so on; more specifically, the symbol 0 denotes the seventh state data. The parameter names include Id, App_v, Platform_id, App_lm, and so on. The attribute value of each parameter is "yes" or "no" and mainly indicates whether that state parameter is sent to the server: if the attribute value of a parameter under a given symbol is "yes", the state data corresponding to that symbol includes the parameter; if "no", it does not. For example, the state parameters included in the first state data are Id, App_v, Platform_id, App_lm, uniqueId, protocolVersion, blockId, tvId, currentTime, Switch_type, Switch_time, and so on. The meanings of the individual parameters are also listed in Table 1; they are well known to those skilled in the art and are not explained in further detail here.
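The "yes/no" inclusion scheme of Table 1 can be sketched as a filter over a full parameter set. Since Table 1 itself is only an image in this document, the symbol-to-parameter mapping below is reconstructed solely from the parameter lists quoted in the surrounding text (symbols 1, 2 and 6); everything else is an assumption for illustration.

```python
# Reconstructed from the parameter lists given in the text; illustrative only.
INCLUDED_PARAMS = {
    1: {"Id", "App_v", "Platform_id", "App_lm", "uniqueId", "protocolVersion",
        "blockId", "tvId", "currentTime", "Switch_type", "Switch_time"},
    2: {"Id", "App_v", "Platform_id", "App_lm", "uniqueId", "protocolVersion",
        "blockId", "actionId", "tvId", "currentTime", "autoSelect"},
    6: {"Id", "App_v", "Platform_id", "App_lm", "uniqueId", "protocolVersion"},
}

def build_state_data(symbol, all_params):
    """Keep only the parameters whose attribute value is 'yes' for this symbol."""
    allowed = INCLUDED_PARAMS[symbol]
    return {k: v for k, v in all_params.items() if k in allowed}

# Switch_time is "yes" for symbol 1 but "no" for symbol 2, so it is dropped here.
payload = build_state_data(2, {"Id": "u42", "tvId": "V1", "actionId": "O1",
                               "Switch_time": 3.0})
```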
A state identifier, as the name implies, is identification information indicating the current state. The first state identifier in the first state data indicates that playback has reached the interaction node of one of the interactive video segments in the interactive video. In the interactive video clip, with the interaction node as the starting node, at least two preset interaction options are displayed to the user within a preset interaction interval. In practice, an interaction node is set within the interaction interval, and when that node is reached, at least two interaction options are displayed for the user to choose from.
For example, when there are only two interaction options, one option may instruct a new video clip to be played, so that the storyline moves toward the plot corresponding to the new clip. The other option continues with the currently playing clip, developing along that clip's subsequent storyline (in which case another interaction node can be set, again displaying at least two preset interaction options, so that the user can choose whether to continue the current clip's storyline or switch to a new clip for playback, and so on). Alternatively, both options may lead to the storylines of new video clips. The concrete design can be decided entirely according to the actual situation.
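The branching just described amounts to a small graph over clips and options. A minimal sketch, with an entirely assumed structure and assumed clip/option names:

```python
# Illustrative script graph: each interaction option of a clip either jumps
# to a new clip or continues the current storyline. Names are assumptions.
SCRIPT = {
    "V1": {
        "interaction_node": 120.0,          # playback time of node I1, seconds
        "options": {
            "O1": "V2",                     # jump to a new clip's storyline
            "O2": "V1-cont",                # continue the current clip's plot
        },
    },
}

def next_clip(current_clip, chosen_option):
    """Resolve which clip plays after the user's (or default) choice."""
    return SCRIPT[current_clip]["options"][chosen_option]
```

Whether a target clip is itself interactive is simply a property of that clip's own entry in the script, which matches the remark below that the clip corresponding to an option may or may not be interactive.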
And step 120, when the interaction operation on any one of the displayed preset interaction options is detected, sending second state data to the server.
For example, fig. 2 shows an example from a television drama in which the choice of which room to knock on drives the subsequent story development. Fig. 2a shows a current interactive video segment with the preset interaction options displayed at the interaction node; the options are the giraffe room, the rainforest room and the zebra room. Fig. 2b shows the scene when the female lead chooses to knock on the giraffe room door. Fig. 2c shows a conversation between the female lead and a man in the giraffe room. Fig. 2d shows the scene when the female lead chooses to knock on the zebra room door. Fig. 2e shows a conversation between the female lead and another man in the zebra room.
Whichever preset interaction option is selected, the client sends the second state data to the server and plays the video clip corresponding to the selected option. The second state data comprises a second state identifier and the state parameters corresponding to it, namely: Id, App_v, Platform_id, App_lm, uniqueId, protocolVersion, blockId, actionId, tvId, currentTime, and autoSelect.
It should further be noted that the video clip corresponding to an interaction option may itself be an interactive video segment or a non-interactive one. The second state identifier indicates that playback has reached the interaction node and that the user has selected one of the preset interaction options; alternatively, if the user makes no selection within a certain time, the client can select an interaction option by default and play it directly. The state parameters corresponding to the second state identifier are transmitted to the server so that the server can record the selected option (whether chosen by the user or defaulted by the client) and perform big-data statistics, mainly to determine which storyline directions users prefer.
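The default-selection behavior above pairs naturally with the `autoSelect` parameter of the second state data. A minimal sketch, with the function name and default option assumed for illustration:

```python
# Illustrative only: if the user makes no choice before the interaction
# interval elapses, the client auto-selects and flags it via autoSelect.
def resolve_choice(user_choice, default_option="O1"):
    """Return (chosen option, autoSelect flag) for the second state data."""
    if user_choice is None:          # interaction interval elapsed, no input
        return default_option, True  # client picked the default option
    return user_choice, False        # the user chose explicitly

choice, auto = resolve_choice(None)  # no user input within the interval
```

Either way a second state data message is sent; only the `autoSelect` flag tells the server which case occurred, which is what lets the server keep its preference statistics honest.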
Optionally, when it is determined that the interactive video is started to play, the method further includes: and sending the third state data to the server.
That is, if the interactive video has only just started playing, third state data needs to be sent to the server. The third state data includes a third state identifier and the state parameters corresponding to it; the third state identifier mainly informs the server that the current video has started playing. The state parameters in the third state data are: Id, App_v, Platform_id, App_lm, uniqueId, protocolVersion, blockId, tvId, and currentTime.
Further optionally, determining a play starting point of the current start of the interactive video specifically includes:
step a, sending state request information to a server, wherein the state request information comprises user identification information and an interactive video playing request.
And b, receiving previous interactive video playing progress information which is fed back by the server and corresponds to the user identification information and the interactive video playing request.
And c, determining a playing starting point of the current start of the interactive video according to the playing progress information of the previous interactive video.
Specifically, the user identification information and the interactive video playing request are sent to the server, so that the server can determine from them the progress the user had reached the last time this interactive video was watched, and feed that progress back to the client. The client can then use the previous playback progress as the starting point for the current playback of the interactive video.
Specifically, as described above, whatever stage the video has been played to, state data is sent to the server. So when the previous session exited playback at some time node, the client also sent a piece of state data to the server — the sixth state data described below — and that state data contains a state parameter: see currentTime under the symbol 6 in Table 1, which records the playing time of the current tvId. The server can therefore, based on the user identification information and the interactive video playing request, first locate the video state statistics table corresponding to that user, look up the currentTime parameter of Table 1 in that table, and from it find the time node at which the previous playback stopped, i.e. the starting point for playback. That playback progress is then fed back to the client.
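The resume flow above can be sketched end-to-end: the server keeps the last reported `currentTime` per user and video and answers the client's state request with it. The class, its storage, and the default of 0.0 for a first-time viewer are all illustrative assumptions.

```python
# Illustrative only: an in-memory dict stands in for the server's
# video state statistics table keyed by user and video identifiers.
class Server:
    def __init__(self):
        self.progress = {}  # (user_id, tvId) -> last reported currentTime

    def record_exit(self, user_id, tvId, currentTime):
        # Corresponds to receiving the sixth state data when playback quits.
        self.progress[(user_id, tvId)] = currentTime

    def resume_point(self, user_id, tvId):
        # Corresponds to answering the client's state request information;
        # a video never seen before starts from the beginning (assumed 0.0).
        return self.progress.get((user_id, tvId), 0.0)

server = Server()
server.record_exit("u42", "V1", 95.0)   # previous session quit at 95 s
start = server.resume_point("u42", "V1")  # this session resumes from 95 s
```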
And then, when the current playing progress of the interactive video is determined to be at the playing starting point of the current start, the third state data can be sent to the server.
Further optionally, when the video segment corresponding to the interactive operation is played after the interactive operation is responded, the method may further include: and sending the fourth state data to the server.
Although it is called fourth state data here, the state identifier and state parameters contained in the fourth state data are in fact identical to those of the third state data, and the meaning of the identifier is identical to that of the third state identifier. The specific reason is that playing the video clip corresponding to an interaction option can, in a broad sense, be understood as starting to play a video clip anew; thus the state identifier and state parameters in the fourth and third state data are the same. In this application, state data sent under different trigger conditions is distinguished by name, which is why the terms third state data and fourth state data are both used.
Optionally, when it is determined that the current playback progress of the interactive video has reached the end of the interactive video, the method further includes: sending fifth state data to the server. The fifth state identifier in the fifth state data informs the server that the currently playing interactive video has finished. The state parameters corresponding to the fifth state identifier are likewise detailed in Table 1, and may include: Id, App_v, Platform_id, App_lm, uniqueId, protocolVersion, blockId, tvId, and currentTime.
Optionally, when a control instruction to quit playing the interactive video is received, sixth state data is sent to the server, and playback is quit after the transmission is completed.
Specifically, after the client receives a control instruction to quit playing the interactive video, it may send sixth state data to the server. The sixth state data includes a sixth state identifier indicating the quit and the corresponding state parameters, which may include: Id, App_v, Platform_id, App_lm, uniqueId, and protocolVersion. Playback is quit after the transmission is completed.
Optionally, when it is determined that the current playback progress is being traced back to some previous playback progress, the method further includes: transmitting seventh state data to the server. The seventh state data includes a seventh state identifier and the state parameters corresponding to it; the seventh state identifier informs the server that the video is backtracking to a previous playback progress, and the state parameters in the seventh state data include: Id, App_v, Platform_id, App_lm, uniqueId, and protocolVersion.
Further optionally, after tracing back the current playing progress to any previous playing progress, the method further includes: and transmitting the eighth state data to the server. Similar to the fourth status data, the eighth status flag in the eighth status data actually has the same function as the third status flag, and the status parameter corresponding to the eighth status flag is also the same as the status parameter corresponding to the third status flag, which will not be described in detail herein.
The specific reason is that when any previous playing progress is traced back, a new video clip can be played in a broad sense. Therefore, the eighth status flag in the eighth status data and the third status flag in the third status data have the same function, and the status parameters have the same content. Thus, in table 1, the state parameters in the third state data, the state parameters in the fourth state data, and the state parameters in the eighth state data are all in the same column of table 1, with corresponding symbols 3,4, and 8.
In any case, after the third state data (or the fourth state data, or the eighth state data) is sent to the server, the playing process may again reach the interactive node of some interactive video segment and the preset interactive interval starting from that node, where the interactive options are displayed to the user; playback may also be quit, traced back to some previous progress, played to the end, and so on. That is, after the third state data (or the fourth state data, or the eighth state data) is sent to the server, the situations described above in which the client sends other state data to the server may occur again.
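As a rough illustration of the state data discussed above (a sketch, not the patent's own implementation), each record can be modeled as a state identifier attached to the shared parameter set of table 1. The builder function and the concrete parameter values below are hypothetical assumptions for illustration only:

```python
# Hypothetical sketch: attach a state identifier (1-8) to the shared
# parameter set described for table 1. The builder function and the
# sample values are illustrative assumptions, not the patent's API.
def build_state_data(state_id: int, params: dict) -> dict:
    """Return a state-data record with the given state identifier."""
    required = {"Id", "App_v", "Platform_id", "App_lm", "uniqueId", "protocolVersion"}
    missing = required - params.keys()
    if missing:
        raise ValueError(f"missing state parameters: {sorted(missing)}")
    return {"state": state_id, **params}

params = {
    "Id": "video-123",         # interactive video identifier (assumed value)
    "App_v": "7.0.1",          # client application version (assumed value)
    "Platform_id": 2,          # playback platform identifier (assumed value)
    "App_lm": "zh",            # client language mode (assumed value)
    "uniqueId": "device-abc",  # unique device/user identifier (assumed value)
    "protocolVersion": 1,      # interaction protocol version (assumed value)
}

# The third, fourth, and eighth state data share the same parameter set,
# differing only in the state identifier (columns 3, 4, and 8 in table 1).
s3 = build_state_data(3, params)
s8 = build_state_data(8, params)
```

This mirrors the observation above that only the state identifier distinguishes the third, fourth, and eighth state data.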
In one embodiment, the present invention provides a process diagram of the status identifiers that may be transmitted during the playing of an interactive video on the client side. Referring specifically to fig. 3, fig. 3 is a schematic diagram illustrating the state changes that may occur during the playing process. Vi in the figure represents the ith video clip; for example, video clips V1, V2, and V3 are shown, and the others are omitted. Ii represents the ith interactive node; shown are interactive node I1 in video clip V1 and interactive node I2 in video clip V2. Oi represents the ith option offered to the user; for example, the figure shows that video clip V1 includes two interactive options, O1 and O2. Si represents the ith state data.
Before describing the state process diagram shown in fig. 3, it is first made clear how the client detects the current playing state of the video. Specifically, the client may determine the current playing state in the following ways:
The first type: the interaction node, the interaction interval, the display node of the interaction options (i.e., the interaction node), the video segments to be played corresponding to the interaction options, and the trigger node at which the video plays to its ending part are all determined by a script obtained in advance by the client.
Specifically, after the client acquires the interactive video to be played according to a trigger instruction of the user, the client requests interactive video data from the server, and the server issues an interactive script along with the interactive video data. The script presets the time node at which the video ends, the interaction node of each interactive video clip, the range of the interaction interval, the interaction options to be displayed in the interaction interval, and the video clip bound to each interaction option (that is, selecting a different interaction option corresponds to a different video clip to be played next). It also specifies that, after the interaction options are displayed, if the user makes no selection within a preset default time (for example, 1 s), the video clip corresponding to the default option set by the playing system is played, and so on.
Therefore, the client determines the video playing state according to the script, and can automatically send the corresponding state data to the server after the corresponding operation is executed.
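The script of the first type described above can be pictured as a declarative structure of nodes, intervals, and option-to-clip bindings. The sketch below is a hypothetical reconstruction of such a script; the key names are illustrative assumptions, while the 1 s default timeout and the option-to-clip binding follow the description above:

```python
# Hypothetical interaction script, mirroring the fields described above:
# the end-time node, per-clip interaction nodes and intervals, options
# bound to follow-up clips, and a default option with a timeout.
from typing import Optional

script = {
    "end_time": 600.0,                 # time node at which the video ends (s)
    "clips": {
        "V1": {
            "interaction_node": 120.0,       # where the options are shown (s)
            "interaction_interval": 10.0,    # length of the interaction interval (s)
            "options": {"O1": "V2", "O2": "V3"},  # option -> next clip binding
            "default_option": "O1",          # played if the user does not choose
            "default_timeout": 1.0,          # preset default time (e.g. 1 s)
        },
    },
}

def next_clip(script: dict, clip: str, chosen: Optional[str]) -> str:
    """Resolve which clip plays after the interaction interval of `clip`."""
    entry = script["clips"][clip]
    if chosen is None:  # no selection within the default timeout
        chosen = entry["default_option"]
    return entry["options"][chosen]
```

For example, `next_clip(script, "V1", "O2")` resolves to clip V3, while `next_clip(script, "V1", None)` falls back to the default option's clip.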
The second type: the client can determine the playing state according to an operation instruction of the user, and then send the corresponding state data to the server after executing the operation corresponding to the user's operation instruction.
The state process diagram schematically shows an interactive video being played. After the video starts playing, the client sends the third state data S3 to the server.
For example, when it is determined that the current playing progress of the interactive video has reached the interactive node I1 of an interactive video clip, the client transmits the first state data S1 to the server.
In the preset interaction interval starting from the preset interaction node I1, the user interacts with the client. For example, the client sets several subsequent video development directions and shows them to the user in the form of options (see interaction options O1 to O4 in fig. 3), and the user can select the subsequent development of the plot as desired, so that the story unfolds according to the user's wishes. After the user finishes the selection, the client plays the video clip corresponding to the selected interaction option; if the user makes no selection within the specified time, the client selects a default interaction option according to the script setting and plays the video clip corresponding to that default option. In either case, the client sends the second state data S2 to the server, and thereafter sends the fourth state data S4. The video segment corresponding to the interaction option is itself an interactive video segment; see interactive video segment V2 in fig. 3. Thus, when playback reaches the interactive node I2 of the interactive video clip V2, the first state data S1 is again transmitted to the server. The interaction options O3 and O4 are displayed in the interaction interval for the user to select, the video clip corresponding to the option finally selected (or selected by default) is played, and then the second state data S2 and, thereafter, the fourth state data S4 are sent to the server. When the interactive video is played to the end, the fifth state data S5 is sent to the server. Since backtracking to a previous playing progress is not shown, the seventh state data S7 does not appear in the figure; the quit-playing state is not shown either, so the sixth state data S6 does not appear.
Similarly, since tracing the current playing progress back to any previous progress is not shown, the eighth state data is naturally not sent to the server, so S8 likewise does not appear in fig. 3. All the other state data are shown in fig. 3.
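The fig. 3 walkthrough above can be sketched as a simple trace generator. The function below is an illustrative reconstruction of that flow (S3 at start, then S1/S2/S4 per interactive clip, S5 at the end), not code from the patent; it covers only the happy path with no backtracking or quitting:

```python
# Illustrative reconstruction of the fig. 3 state flow: S3 when playback
# starts, then S1 (interactive node reached), S2 (option chosen), and
# S4 (bound clip starts) for each interactive clip, and S5 at the end.
def state_trace(num_interactive_clips: int) -> list:
    trace = ["S3"]                      # playback started
    for _ in range(num_interactive_clips):
        trace += ["S1", "S2", "S4"]     # node reached, option chosen, next clip
    trace.append("S5")                  # played to the end
    return trace

# Two interactive clips, as with V1 and V2 in fig. 3:
fig3_trace = state_trace(2)
```

For the two interactive clips of fig. 3 this yields S3, S1, S2, S4, S1, S2, S4, S5, matching the order described above.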
Optionally, in addition to the status data, device information of the user watching the interactive video may be sent to the server. The specific time at which it is transmitted can be set according to actual conditions; for example, it may be sent immediately before the first status data, before the second status data, or after it is determined that the video has started to play. This data is transmitted to the server mainly to facilitate subsequent big-data statistics on the server side, for example, to count users' preferences for the development of a video story, which segments of a story line users prefer, or which stars users like, and also to know clearly which device each user uses to watch the video. Furthermore, videos of the same type can subsequently be recommended to users according to their preferences, further improving the user experience.
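On the server side, the collected second state data lends itself to exactly this kind of preference statistics. The tally below is a hypothetical sketch; the shape of the event records (a `uniqueId` plus the chosen `option`) is an illustrative assumption:

```python
# Hypothetical server-side tally: count how often each interaction option
# was chosen across received second state data, as a proxy for users'
# plot-development preferences. The event record shape is assumed.
from collections import Counter

def option_preferences(s2_events: list) -> Counter:
    """Count option selections over a batch of second-state-data events."""
    return Counter(event["option"] for event in s2_events)

events = [
    {"uniqueId": "u1", "option": "O1"},
    {"uniqueId": "u2", "option": "O1"},
    {"uniqueId": "u3", "option": "O2"},
]
prefs = option_preferences(events)
```

Here most users chose option O1, which the server could read as a preference for that plot direction when deciding what to recommend.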
According to the interactive video interaction method provided by the embodiment of the invention, the interactive video can comprise at least one interactive video segment. When it is detected that playing has reached the interactive node of any interactive video clip, first state data is sent to the server. In the interactive video clip, at least two preset interactive options are displayed in a preset interactive interval taking the interactive node as a starting point, and each preset interactive option is associated with a subsequent plot development of the interactive video; depending on which interaction option the user selects, the subsequent plot develops entirely differently. When an interactive operation on any one of the displayed preset interactive options is detected, second state data is sent to the server, so that the server can compile statistics on users' plot-development preferences according to their selections, and so on. By setting up this interaction mechanism between the client and the server, the user can participate in designing the subsequent plot development while frequent interaction between the client and the server is avoided as much as possible. That is, user participation and user experience are improved, while the interaction pressure and interaction cost between the client and the server are reduced.
Fig. 4 is an interactive video interaction apparatus according to an embodiment of the present invention, where the apparatus includes: a processing unit 401 and a sending unit 402.
A processing unit 401, configured to determine whether an interactive node of any one interactive video segment in an interactive video has been played;
a sending unit 402, configured to send first state data to the server when the processing unit 401 determines that an interactive node of any one interactive video segment in the interactive video has been played; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking an interactive node as a starting point in the interactive video segment;
the processing unit 401 is further configured to detect whether an interactive operation has been performed on the displayed preset interactive option;
the sending unit 402 is further configured to send second state data to the server when the processing unit detects an interaction operation on any one of the displayed preset interaction options.
Optionally, the processing unit 401 is further configured to determine whether the interactive video is started to be played.
The sending unit 402 is further configured to send third status data to the server when the processing unit 401 determines that the interactive video is started to be played.
Optionally, the apparatus further comprises a receiving unit 403.
The sending unit 402 is specifically configured to send status request information to a server, where the status request information includes user identification information and an interactive video playing request;
a receiving unit 403, configured to receive previous interactive video playing progress information corresponding to the user identification information and the interactive video playing request, where the previous interactive video playing progress information is fed back by the server;
the processing unit 401 is specifically configured to determine a starting point at which the interactive video is started to be played this time according to the previous interactive video playing progress information.
Optionally, the sending unit 402 is further configured to send the fourth state data to the server when the processing unit 401 plays the video segment corresponding to the interactive operation after responding to the interactive operation.
Optionally, when the processing unit 401 determines that the playing is finished, the sending unit 402 is further configured to send fifth status data to the server.
Optionally, the receiving unit 403 is further configured to receive a control instruction for quitting playing the interactive video.
The sending unit 402 is further configured to send sixth status data to the server when the receiving unit 403 receives a control instruction to quit playing the interactive video.
Optionally, when the processing unit 401 determines to trace back the current playing progress to any previous playing progress, the sending unit 402 is further configured to send seventh status data to the server.
Optionally, the sending unit 402 is further configured to send eighth status data to the server after the processing unit 401 traces back the current playing progress to any previous playing progress.
Optionally, the apparatus further comprises a presentation unit 404.
An interaction node is disposed in the preset interaction interval, and the display unit 404 is configured to display at least two preset interaction options at the interaction node.
Optionally, the sending unit 402 is further configured to send device information of the user watching the interactive video to the server.
The functions executed by the functional components of the interactive video interaction device provided in this embodiment have been described in detail in the embodiment corresponding to fig. 1, and therefore are not described herein again.
According to the interactive video interaction device provided by the embodiment of the invention, the interactive video can comprise at least one interactive video segment. When it is detected that playing has reached the interactive node of any interactive video clip, first state data is sent to the server. In the interactive video clip, at least two preset interactive options are displayed in a preset interactive interval taking the interactive node as a starting point, and each preset interactive option is associated with a subsequent plot development of the interactive video; depending on which interaction option the user selects, the subsequent plot develops entirely differently. When an interactive operation on any one of the displayed preset interactive options is detected, second state data is sent to the server, so that the server can compile statistics on users' plot-development preferences according to their selections, and so on. By setting up this interaction mechanism between the client and the server, the user can participate in designing the subsequent plot development while frequent interaction between the client and the server is avoided as much as possible. That is, user participation and user experience are improved, while the interaction pressure and interaction cost between the client and the server are reduced.
Fig. 5 is a schematic structural diagram of an interactive video interaction system according to an embodiment of the present invention, where the interactive video interaction system 500 shown in fig. 5 includes: at least one processor 501, memory 502, at least one network interface 503, and other user interfaces 504. The various components of interactive video interaction system 500 are coupled together by a bus system 505. It is understood that the bus system 505 is used to enable connection communications between these components. The bus system 505 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 505 in FIG. 5.
The user interface 504 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen).
It is to be understood that the memory 502 in embodiments of the present invention may be either volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 502 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 502 stores elements, executable units or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system 5021 and application programs 5022.
The operating system 5021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 5022 includes various applications, such as a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services. The program for implementing the method according to the embodiment of the present invention may be included in the application program 5022.
In the embodiment of the present invention, by calling a program or an instruction stored in the memory 502, specifically, a program or an instruction stored in the application 5022, the processor 501 is configured to execute the method steps provided by the method embodiments, for example, including:
when the interactive node of any interactive video segment in the interactive video is played, first state data is sent to a server; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking an interactive node as a starting point in the interactive video segment;
and when the interaction operation of any one of the displayed preset interaction options is detected, sending second state data to the server.
Optionally, when it is determined that the interactive video is started to be played, the third state data is sent to the server.
Optionally, sending status request information to the server, where the status request information includes user identification information and an interactive video playing request;
receiving previous interactive video playing progress information which is fed back by the server and corresponds to the user identification information and the interactive video playing request;
and determining a starting point of the started playing of the interactive video according to the playing progress information of the previous interactive video.
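The three steps above (status request, progress feedback, start-point determination) form a small resume handshake, which can be sketched roughly as follows. The server stand-in and field layout are illustrative assumptions, not the patent's protocol format:

```python
# Hypothetical sketch of the resume flow described above: the client sends
# a status request (user identification + play request), the server feeds
# back the previous playing progress, and the client takes that progress
# as the starting point of the current playback (or 0.0 if there is none).
def request_previous_progress(server_state: dict, user_id: str, video_id: str) -> float:
    """Stand-in for the server: return the stored progress in seconds."""
    return server_state.get((user_id, video_id), 0.0)

def determine_start_point(server_state: dict, user_id: str, video_id: str) -> float:
    """Client side: resume where the user left off, or start from 0.0."""
    progress = request_previous_progress(server_state, user_id, video_id)
    return progress

server_state = {("u1", "video-123"): 245.5}   # previously reported progress
start = determine_start_point(server_state, "u1", "video-123")
```

A first-time viewer ("u2", say) would get 0.0 back and start from the beginning of the interactive video.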
Optionally, when the video segment corresponding to the interactive operation is played after the interactive operation is responded, the fourth state data is sent to the server.
Optionally, when it is determined that the end of the interactive video is played, the fifth status data is sent to the server.
Optionally, when a control instruction for quitting playing the interactive video is received, the sixth state data is sent to the server.
Optionally, when the current playing progress is traced back to any previous playing progress in the playing process, the seventh status data is sent to the server.
Optionally, after the current playing progress is traced back to any previous playing progress, the eighth status data is sent to the server.
Optionally, an interaction node is arranged in the preset interaction interval, and at least two preset interaction options are displayed at the interaction node.
Optionally, the device information of the user watching the interactive video is sent to the server.
The method disclosed by the above embodiments of the present invention may be applied to the processor 501, or implemented by the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 501. The processor 501 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software elements in the decoding processor. The software elements may be located in RAM, flash memory, ROM, PROM or EEPROM, registers, or other storage media well known in the art. The storage medium is located in the memory 502, and the processor 501 reads the information in the memory 502 and completes the steps of the method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented in one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions of the present application, or a combination thereof.
For a software implementation, the techniques herein may be implemented by means of units performing the functions herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The interactive video interaction system provided in this embodiment may be the interactive video interaction system shown in fig. 5, and may perform all the steps of the interactive video interaction method shown in fig. 1, so as to achieve the technical effect of the interactive video interaction method shown in fig. 1.
An embodiment of the invention also provides a storage medium (a computer-readable storage medium) storing one or more programs. The storage medium may include volatile memory, such as random access memory; it may also include non-volatile memory, such as read-only memory, flash memory, a hard disk, or a solid-state disk; it may also comprise a combination of the above kinds of memory.
When one or more programs in the storage medium can be executed by one or more processors, the interactive video interaction method executed on the interactive video interaction system side is realized.
The processor is used for executing the interactive video interaction program stored in the memory so as to realize the following steps of the interactive video interaction method executed on the interactive video interaction system side:
when the interactive node of any interactive video segment in the interactive video is played, first state data is sent to a server; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking an interactive node as a starting point in the interactive video segment;
and when the interaction operation of any one of the displayed preset interaction options is detected, sending second state data to the server.
Optionally, when it is determined that the interactive video is started to be played, the third state data is sent to the server.
Optionally, sending status request information to the server, where the status request information includes user identification information and an interactive video playing request;
receiving previous interactive video playing progress information which is fed back by the server and corresponds to the user identification information and the interactive video playing request;
and determining a starting point of the started playing of the interactive video according to the playing progress information of the previous interactive video.
Optionally, when the video segment corresponding to the interactive operation is played after the interactive operation is responded, the fourth state data is sent to the server.
Optionally, when it is determined that the end of the interactive video is played, the fifth status data is sent to the server.
Optionally, when a control instruction for quitting playing the interactive video is received, the sixth state data is sent to the server.
Optionally, when the current playing progress is traced back to any previous playing progress in the playing process, the seventh status data is sent to the server.
Optionally, after the current playing progress is traced back to any previous playing progress, the eighth status data is sent to the server.
Optionally, an interaction node is arranged in the preset interaction interval, and at least two preset interaction options are displayed at the interaction node.
Optionally, the device information of the user watching the interactive video is sent to the server.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, it should be understood that the above embodiments are merely exemplary embodiments of the present invention and are not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (11)

1. An interactive video interaction method, wherein the method is executed by a client, and the method comprises:
when the interactive node of any interactive video segment in the interactive video is played, first state data is sent to a server; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking the interactive node as a starting point in the interactive video segment;
and when the interaction operation of any one of the displayed preset interaction options is detected, sending second state data to the server.
2. The method of claim 1, further comprising:
and when the interactive video is determined to be started to play, sending third state data to the server.
3. The method of claim 2, further comprising:
sending state request information to the server, wherein the state request information comprises user identification information and an interactive video playing request;
receiving previous interactive video playing progress information which is fed back by the server and corresponds to the user identification information and the interactive video playing request;
and determining a starting point of the started playing of the interactive video according to the playing progress information of the previous interactive video.
4. The method of claim 1, further comprising:
and sending fourth state data to the server when playing the video clip corresponding to the interactive operation after responding to the interactive operation.
5. The method according to any one of claims 1-4, further comprising:
and when the interactive video is determined to be played to the end of the interactive video, sending fifth state data to the server.
6. The method according to any one of claims 1-4, further comprising:
and when a control instruction for quitting playing the interactive video is received, sending sixth state data to the server.
7. The method according to any one of claims 1-4, further comprising:
and when the current playing progress is traced back to any previous playing progress in the playing process, sending seventh state data to the server.
8. The method of claim 7, further comprising: and after the current playing progress is traced back to any previous playing progress, eighth state data is sent to the server.
9. An interactive video interaction device, the device comprising: the system comprises a processing unit, a sending unit and a display unit;
the processing unit is used for determining whether the interactive node of any interactive video segment in the interactive video is played;
the sending unit is used for sending first state data to the server when the processing unit determines that the interactive node of any interactive video segment in the interactive video is played; the interactive video comprises at least one interactive video segment, and at least two preset interactive options are displayed in a preset interactive interval taking the interactive node as a starting point in the interactive video segment;
the processing unit is also used for detecting whether the displayed preset interaction options are subjected to interaction operation or not;
the sending unit is further used for sending second state data to the server when the processing unit detects the interactive operation on any one of the displayed preset interactive options.
10. An interactive video interaction system, the system comprising: at least one processor and memory;
the processor is used for executing the interactive video interaction program stored in the memory so as to realize the interactive video interaction method of any one of claims 1-8.
11. A computer storage medium storing one or more programs executable by the interactive video interaction system of claim 10 to implement the interactive video interaction method of any one of claims 1 to 8.
CN202010257083.8A 2020-04-02 2020-04-02 Interactive video interaction method, device, system and storage medium Active CN111556370B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010257083.8A CN111556370B (en) 2020-04-02 2020-04-02 Interactive video interaction method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010257083.8A CN111556370B (en) 2020-04-02 2020-04-02 Interactive video interaction method, device, system and storage medium

Publications (2)

Publication Number Publication Date
CN111556370A true CN111556370A (en) 2020-08-18
CN111556370B CN111556370B (en) 2022-10-25

Family

ID=72002396

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010257083.8A Active CN111556370B (en) 2020-04-02 2020-04-02 Interactive video interaction method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN111556370B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970563A (en) * 2020-08-28 2020-11-20 维沃移动通信有限公司 Video processing method and device and electronic equipment
CN112261481A (en) * 2020-10-16 2021-01-22 腾讯科技(深圳)有限公司 Interactive video creating method, device and equipment and readable storage medium
CN113099281A (en) * 2021-02-22 2021-07-09 互影科技(北京)有限公司 Video interaction method and device, storage medium and terminal
CN113347498A (en) * 2021-05-28 2021-09-03 北京爱奇艺科技有限公司 Video playing method and device and computer readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104410920A (en) * 2014-12-31 2015-03-11 合一网络技术(北京)有限公司 Video segment playback amount-based method for labeling highlights
CN104639986A (en) * 2014-12-31 2015-05-20 小米科技有限责任公司 Multimedia data playing method and device
CN105812889A (en) * 2016-03-31 2016-07-27 北京奇艺世纪科技有限公司 Method and system for displaying playing progress bar
CN106998486A * 2016-01-22 2017-08-01 百度在线网络技术(北京)有限公司 Video playing method and device
CN107295359A * 2016-04-11 2017-10-24 腾讯科技(北京)有限公司 Video playing method and device
CN108156523A * 2017-11-24 2018-06-12 互影科技(北京)有限公司 Interaction method and device for interactive video playing
CN108366278A * 2018-02-01 2018-08-03 北京奇艺世纪科技有限公司 Method and device for implementing user interaction in video playing
CN109982114A (en) * 2017-12-28 2019-07-05 优酷网络技术(北京)有限公司 Video interaction method and device
CN110719530A (en) * 2019-10-21 2020-01-21 北京达佳互联信息技术有限公司 Video playing method and device, electronic equipment and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104410920A (en) * 2014-12-31 2015-03-11 合一网络技术(北京)有限公司 Video segment playback amount-based method for labeling highlights
CN104639986A (en) * 2014-12-31 2015-05-20 小米科技有限责任公司 Multimedia data playing method and device
CN106998486A * 2016-01-22 2017-08-01 百度在线网络技术(北京)有限公司 Video playing method and device
CN105812889A * 2016-03-31 2016-07-27 北京奇艺世纪科技有限公司 Method and system for displaying playing progress bar
CN107295359A * 2016-04-11 2017-10-24 腾讯科技(北京)有限公司 Video playing method and device
US20180376216A1 * 2016-04-11 2018-12-27 Tencent Technology (Shenzhen) Company Limited Video playback method and apparatus, and computer readable storage medium
CN108156523A * 2017-11-24 2018-06-12 互影科技(北京)有限公司 Interaction method and device for interactive video playing
CN109982114A * 2017-12-28 2019-07-05 优酷网络技术(北京)有限公司 Video interaction method and device
CN108366278A * 2018-02-01 2018-08-03 北京奇艺世纪科技有限公司 Method and device for implementing user interaction in video playing
CN110719530A (en) * 2019-10-21 2020-01-21 北京达佳互联信息技术有限公司 Video playing method and device, electronic equipment and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970563A (en) * 2020-08-28 2020-11-20 维沃移动通信有限公司 Video processing method and device and electronic equipment
CN112261481A (en) * 2020-10-16 2021-01-22 腾讯科技(深圳)有限公司 Interactive video creating method, device and equipment and readable storage medium
CN113099281A (en) * 2021-02-22 2021-07-09 互影科技(北京)有限公司 Video interaction method and device, storage medium and terminal
CN113099281B (en) * 2021-02-22 2022-05-17 互影科技(北京)有限公司 Video interaction method and device, storage medium and terminal
CN113347498A (en) * 2021-05-28 2021-09-03 北京爱奇艺科技有限公司 Video playing method and device and computer readable storage medium

Also Published As

Publication number Publication date
CN111556370B (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN111556370B (en) Interactive video interaction method, device, system and storage medium
CN110784752B (en) Video interaction method and device, computer equipment and storage medium
US11412307B2 (en) Interaction information processing method, client, service platform, and storage medium
US10225613B2 (en) Method and apparatus for video playing processing and television
US11140462B2 (en) Method, apparatus, and device for generating an essence video and storage medium
US20210160577A1 (en) Method for playing video, electronic device and storage medium
US9055193B2 (en) System and method of a remote conference
CN111586452A (en) Cross-device interaction method and device and playing device
CN109495427B (en) Multimedia data display method and device, storage medium and computer equipment
CN112203111A (en) Multimedia resource preloading method and device, electronic equipment and storage medium
CN106878825B (en) Live broadcast-based sound effect display method and device
CN111294606A (en) Live broadcast processing method and device, live broadcast client and medium
EP2754112B1 (en) System amd method for producing complex multimedia contents by an author and for using such complex multimedia contents by a user
WO2017140226A1 (en) Video processing method and device therefor
CN113727129B (en) Live interaction method, device, system, equipment and storage medium
CN106792251A (en) Method for information display, device and terminal
CN103747280A (en) Method for creating a program and device thereof
CN114257572A (en) Data processing method and device, computer readable medium and electronic equipment
CN107484040A Method for realizing network acceleration
CN113727136A (en) Live broadcast pushing method, system, device, equipment and storage medium
CN112019858A (en) Video playing method and device, computer equipment and storage medium
JP6970729B2 (en) TV desktop display method and equipment
US9894396B1 (en) Media production system with dynamic modification of multiple media items
CN102833608A Method for avoiding dual focuses of a control displayed on a smart television screen
US10917696B2 (en) Content provision server, content provision program, content provision system and user program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant