CN111193963A - Video interaction method and terminal - Google Patents

Video interaction method and terminal

Info

Publication number
CN111193963A
CN111193963A (application CN202010009041.2A; granted publication CN111193963B)
Authority
CN
China
Prior art keywords
interactive
interaction
video
data
event
Prior art date
Legal status
Granted
Application number
CN202010009041.2A
Other languages
Chinese (zh)
Other versions
CN111193963B (en)
Inventor
刘德建
李上杰
方振华
郭玉湖
陈宏�
Current Assignee
Fujian Tianquan Educational Technology Ltd
Original Assignee
Fujian Tianquan Educational Technology Ltd
Priority date
Filing date
Publication date
Application filed by Fujian Tianquan Educational Technology Ltd
Priority to CN202010009041.2A
Publication of CN111193963A
Application granted
Publication of CN111193963B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/472: End-user interface for requesting content, additional data or services, or for interacting with content, e.g. for content reservation, setting reminders, requesting event notification, or manipulating displayed content
    • H04N 21/47202: End-user interface for requesting content on demand, e.g. video on demand
    • H04N 21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/488: Data services, e.g. news ticker
    • H04N 21/4882: Data services for displaying messages, e.g. warnings, reminders

Abstract

The invention discloses a video interaction method and a terminal. Interactive stream data is added to video data to obtain interactive video data; the interactive video data is decoded in real time, and if interactive stream data is read, an interactive operation is performed according to it. By adding interactive stream data to the video data, interactive operations are performed according to the read interactive stream data during subsequent playback without manual intervention, and skip playing is performed automatically according to the interactive stream data, making the video interaction mode more intelligent and automatic. Users can also define the interaction mode themselves; after the interaction mode is encoded as interactive stream data and added to the video data, video interaction follows the user's own settings, making the interaction mode more flexible and diverse and bringing users a good video interaction experience.

Description

Video interaction method and terminal
Technical Field
The invention relates to the technical field of videos, in particular to a video interaction method and a terminal.
Background
Video broadly refers to the various techniques for capturing, recording, processing, storing, transmitting, and reproducing a series of still images as electrical signals. When continuous images change at more than 24 frames per second, the human eye cannot distinguish the individual still pictures, owing to the persistence of vision; the result appears as a smooth, continuous visual effect, and such a continuous sequence of pictures is called a video.
Video technology has developed over many years, has long since matured, and has become an indispensable part of daily life. In conventional video technology, however, videos are generally played in sequence: without manual intervention, a video can only play from the beginning, or from a specific position, through to the end. Although a player can implement logic such as play, pause, and jump, such interaction modes are too simple to give users a good video interaction experience.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a video interaction method and a terminal that offer more flexible and diversified video interaction modes, thereby bringing users a good video interaction experience.
In order to solve the technical problems, the invention adopts the technical scheme that:
a video interaction method, comprising the steps of:
s1, adding interactive stream data in the video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data are read, performing interactive operation according to the interactive stream data.
In order to solve the technical problem, the invention adopts another technical scheme as follows:
a video interaction terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
s1, adding interactive stream data in the video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data are read, performing interactive operation according to the interactive stream data.
The invention has the following beneficial effects: interactive stream data is added to the video data, and when the video is subsequently played, interactive operations are performed according to the read interactive stream data without manual intervention; skip playing is performed automatically according to the interactive stream data, making the video interaction mode more intelligent and automatic. Users can define the interaction mode themselves; after the interaction mode is encoded as interactive stream data and added to the video data, video interaction follows the user's own settings, making the interaction mode more flexible and diverse and bringing users a good video interaction experience.
Drawings
Fig. 1 is a schematic flowchart of a video interaction method according to an embodiment of the present invention;
FIG. 2 is a timing diagram illustrating interaction selection according to an embodiment of the present invention;
fig. 3 is a screenshot of interactive video data in a playing process according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a video interaction terminal according to an embodiment of the present invention.
Description of reference numerals:
1. a video interaction terminal; 2. a processor; 3. a memory.
Detailed Description
To explain the technical content, objects, and effects of the present invention in detail, the following description refers to the accompanying drawings in combination with the embodiments.
Before that, to facilitate understanding of the technical solution of the present invention, the English abbreviations and terms used herein are explained as follows:
(1) OVERLAY: literally, an overlay; a digital video display technology that allows a video signal to be displayed on the screen directly from video memory, without processing by the graphics card, so that a newly opened window directly overlays the original one.
Referring to fig. 1 to 3, a video interaction method includes the steps of:
s1, adding interactive stream data in the video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data are read, performing interactive operation according to the interactive stream data.
From the above description, the beneficial effects of the present invention are: interactive stream data is added to the video data, and interactive operations are performed according to the read interactive stream data during subsequent playback without manual intervention; skip playing is performed automatically according to the interactive stream data, making the video interaction mode more intelligent and automatic. Users can define the interaction mode themselves; after the interaction mode is encoded as interactive stream data and added to the video data, video interaction follows the user's own settings, making the interaction mode more flexible and diverse and bringing users a good video interaction experience.
Further, the step S1 is specifically:
writing interactive information and playing control information into interactive stream data, and embedding the interactive stream data into video data through OVERLAY to obtain interactive video data, wherein the interactive video data comprises video stream data, audio stream data and the interactive stream data, and the interactive information comprises interactive event type, interactive event data and interactive event time point.
It can be seen from the above description that embedding the interactive stream data in the video data through OVERLAY does not affect playback of the video file, so the interactive stream data can be triggered as the video plays to implement video interaction.
Further, the step S2 is specifically:
s21, decoding the video stream data, the audio stream data and the interactive stream data in real time;
s22, playing the video, and starting to read the interactive stream data corresponding to each frame;
s23, if the interactive information is read, triggering video pause, acquiring the interactive event data in the interactive information, and popping event display information according to the interactive event data;
s24, acquiring event interaction behaviors of a user, and performing branch playing according to the event interaction behaviors, the interaction event data and the interaction event time points;
and S25, if the control interaction behavior of the user is received, jumping to the appointed time according to the interaction logic in the playing control information.
It can be seen from the above description that pop-up interaction is introduced into video playback; after an interaction is completed, a preset interaction event or script drives the subsequent course of the video, and the user can also manually perform the control interaction behavior corresponding to the playing control information. This makes the interaction between user and video richer and more diverse, and the video playback more vivid and interesting.
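Steps S21 to S25 above can be sketched as a per-frame loop: render frames in order, and when a frame carries interactive information, pause, pop up the event display information, play the selected branch, then jump onward. This is a minimal illustration, not the patent's implementation; the frame representation, the controller API, and all field names (`interactive_info`, `time_points`, `jump_point`, and so on) are assumptions.

```python
def play_interactive_video(frames, controller):
    """Play decoded frames, branching when interactive info is read (S21-S25 sketch)."""
    i = 0
    while i < len(frames):
        frame = frames[i]
        info = frame.get("interactive_info")      # S22: read per-frame interactive data
        if info is not None:                      # S23: interactive info found
            controller.pause()                    # trigger video pause
            # pop event display info, obtain the user's event interaction behavior
            choice = controller.show_popup(info["event_data"])
            # S24: branch playing over the interval mapped to the chosen button
            start, end = info["time_points"][choice]
            for j in range(start, end + 1):
                controller.render(frames[j])
            i = info["jump_point"]                # then jump to the preset time point
            controller.resume()
            continue
        controller.render(frame)                  # ordinary sequential playback
        i += 1
```

A controller here stands in for whatever player object drives rendering and pop-ups; the loop only assumes `pause`, `resume`, `show_popup`, and `render` methods.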
Further, the interaction event types in the step S1 include interaction selection, input content and submitted answer;
the step S23 specifically includes:
if interactive information whose interactive event type is interactive selection is read, triggering video pause, acquiring the interactive event data in the interactive information, and popping up event display information according to the interactive event data, wherein the event display information includes interactive selection data and at least two interactive selection buttons;
the step S24 specifically includes:
acquiring a first interaction selection button selected by a user, and obtaining a first playing time interval corresponding to the first interaction selection button according to the interaction event time point, wherein the interaction event time point comprises a playing time interval corresponding to each interaction selection button and a jumping time point after the interaction event data is completed each time, and the first playing time interval comprises a first playing starting time point and a first playing ending time point;
triggering the video to jump to the picture with the time of the first playing starting time point and playing the picture to the first playing ending time point;
and triggering the video to jump to the picture with the jumping time point, and continuing to play the video.
From the above description, when the interactive event is an interactive selection, the user selects an interactive selection button and the video automatically jumps to the corresponding picture and plays it, providing a preferred implementation of the interactive mode.
Further, the event interaction behavior and the control interaction behavior both belong to user interaction behaviors, and the user interaction behaviors comprise mouse interaction behaviors, question answering interaction behaviors and AI interaction behaviors.
As can be seen from the above description, a variety of interactive control manners can be implemented to facilitate the interaction between the user and the video.
Referring to fig. 4, a video interaction terminal includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
s1, adding interactive stream data in the video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data are read, performing interactive operation according to the interactive stream data.
From the above description, the beneficial effects of the present invention are: interactive stream data is added to the video data, and interactive operations are performed according to the read interactive stream data during subsequent playback without manual intervention; skip playing is performed automatically according to the interactive stream data, making the video interaction mode more intelligent and automatic. Users can define the interaction mode themselves; after the interaction mode is encoded as interactive stream data and added to the video data, video interaction follows the user's own settings, making the interaction mode more flexible and diverse and bringing users a good video interaction experience.
Further, when the processor executes the step S1, the following steps are specifically implemented:
writing interactive information and playing control information into interactive stream data, and embedding the interactive stream data into video data through OVERLAY to obtain interactive video data, wherein the interactive video data comprises video stream data, audio stream data and the interactive stream data, and the interactive information comprises interactive event type, interactive event data and interactive event time point.
It can be seen from the above description that embedding the interactive stream data in the video data through OVERLAY does not affect playback of the video file, so the interactive stream data can be triggered as the video plays to implement video interaction.
Further, when the processor executes the step S2, the following steps are specifically implemented:
s21, decoding the video stream data, the audio stream data and the interactive stream data in real time;
s22, playing the video, and starting to read the interactive stream data corresponding to each frame;
s23, if the interactive information is read, triggering video pause, acquiring the interactive event data in the interactive information, and popping event display information according to the interactive event data;
s24, acquiring event interaction behaviors of a user, and performing branch playing according to the event interaction behaviors, the interaction event data and the interaction event time points;
and S25, if the control interaction behavior of the user is received, jumping to the appointed time according to the interaction logic in the playing control information.
It can be seen from the above description that pop-up interaction is introduced into video playback; after an interaction is completed, a preset interaction event or script drives the subsequent course of the video, and the user can also manually perform the control interaction behavior corresponding to the playing control information. This makes the interaction between user and video richer and more diverse, and the video playback more vivid and interesting.
Further, the interaction event types in the step S1 include interaction selection, input content and submitted answer;
when the processor executes the step S23 of the computer program, the following steps are specifically implemented:
if interactive information whose interactive event type is interactive selection is read, triggering video pause, acquiring the interactive event data in the interactive information, and popping up event display information according to the interactive event data, wherein the event display information includes interactive selection data and at least two interactive selection buttons;
when the processor executes the step S24 of the computer program, the following steps are specifically implemented:
acquiring a first interaction selection button selected by a user, and obtaining a first playing time interval corresponding to the first interaction selection button according to the interaction event time point, wherein the interaction event time point comprises a playing time interval corresponding to each interaction selection button and a jumping time point after the interaction event data is completed each time, and the first playing time interval comprises a first playing starting time point and a first playing ending time point;
triggering the video to jump to the picture with the time of the first playing starting time point and playing the picture to the first playing ending time point;
and triggering the video to jump to the picture with the jumping time point, and continuing to play the video.
From the above description, when the interactive event is an interactive selection, the user selects an interactive selection button and the video automatically jumps to the corresponding picture and plays it, providing a preferred implementation of the interactive mode.
Further, the event interaction behavior and the control interaction behavior both belong to user interaction behaviors, and the user interaction behaviors comprise mouse interaction behaviors, question answering interaction behaviors and AI interaction behaviors.
As can be seen from the above description, a variety of interactive control manners can be implemented to facilitate the interaction between the user and the video.
Referring to fig. 1 to fig. 3, a first embodiment of the present invention is:
a video interaction method, comprising the steps of:
s1, adding interactive stream data in the video data to obtain interactive video data;
in this embodiment, step S1 specifically includes:
writing interactive information and playing control information into interactive stream data, embedding the interactive stream data into video data through OVERLAY to obtain interactive video data, wherein the interactive video data comprises video stream data, audio stream data and interactive stream data, the interactive information comprises interactive event types, interactive event data and interactive event time points, and the interactive event types comprise interactive selection, input content and submitted answers; in this embodiment, the format of the interaction stream data includes a total length of the interaction data, a type code of the interaction, a corresponding data length, and a final data storage area;
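The record format described above (total length of the interactive data, type code of the interaction, corresponding data length, and final data storage area) is a length-prefixed TLV-style layout. The sketch below shows one plausible way such records might be packed and unpacked; the field widths (4-byte lengths, 2-byte type code), the big-endian byte order, and the type-code values are assumptions, since the embodiment does not fix them.

```python
import struct

# Hypothetical type codes for the three interactive event types named in the
# embodiment (interactive selection, input content, submitted answer).
EVENT_TYPES = {1: "interactive_selection", 2: "input_content", 3: "submit_answer"}

def pack_record(type_code: int, payload: bytes) -> bytes:
    """Pack one record as: total_len(4) | type_code(2) | data_len(4) | data."""
    body = struct.pack(">HI", type_code, len(payload)) + payload
    return struct.pack(">I", 4 + len(body)) + body

def unpack_record(buf: bytes):
    """Read back one record; returns (event type name, payload bytes)."""
    total_len, type_code, data_len = struct.unpack_from(">IHI", buf, 0)
    assert total_len == 10 + data_len  # header is 4 + 2 + 4 = 10 bytes
    data = buf[10:10 + data_len]
    return EVENT_TYPES.get(type_code), data
```

The total-length prefix lets a decoder skip over records it does not understand, which is one reason a format like this can be embedded alongside video and audio streams without disturbing players that ignore it.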
and S2, decoding the interactive video data in real time, and if the interactive stream data is read, performing interactive operation according to the interactive stream data.
In this embodiment, step S2 specifically includes:
s21, decoding video stream data, audio stream data and interactive stream data in real time;
s22, playing the video, and starting to read the interactive stream data corresponding to each frame;
as shown in fig. 2, the video starts playing from 00:00;
s23, if the interactive information is read, triggering the video to pause, acquiring the interactive event data in the interactive information, and popping out the event display information according to the interactive event data;
in this embodiment, if interactive information whose interactive event type is interactive selection is read, video pause is triggered, the interactive event data in the interactive information is acquired, and event display information is popped up according to the interactive event data, wherein the event display information includes interactive selection data and at least two interactive selection buttons;
as shown in fig. 2, when playback reaches a certain time, an interactive selection event is triggered, for example asking whether the answer 1+2=3 is correct; the popped-up information then includes two interactive selection buttons, "yes" and "no", corresponding to branch 1 and branch 2 in fig. 2;
s24, acquiring the event interaction behavior of the user, and performing branch playing according to the event interaction behavior, the interaction event data and the interaction event time point;
in this embodiment, a first interactive selection button selected by a user is obtained, and a first play time interval corresponding to the first interactive selection button is obtained according to an interactive event time point, where the interactive event time point includes a play time interval corresponding to each interactive selection button and a jump time point after interactive event data is completed each time, and the first play time interval includes a first play start time point and a first play end time point;
triggering the video to jump to a picture with the time of a first playing starting time point, and playing the picture to a first playing ending time point;
triggering the video to jump to the picture at the jumping time point and continuing to play the video; playback can continue in the current sequence, or from any time point, the jumping time point being preset by the user when the interactive data stream is encoded;
as shown in fig. 2, the "yes" interactive selection button corresponds to branch 1 in fig. 2 and "no" corresponds to branch 2. When the user's event interaction behavior is selecting the "yes" button, the video jumps to the picture at time point 03:10, plays the "answered correctly" sound, and shows a correct-answer prompt picture; if the user selects the "no" button, the video jumps to the picture at time point 04:25, plays the "answered incorrectly" sound, and shows a wrong-answer prompt picture.
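The yes/no branching just described amounts to a small lookup table mapping each interactive selection button to its playback target. A minimal illustration follows; the time points 03:10 and 04:25 come from the embodiment, while the table layout and helper names are assumptions.

```python
def to_seconds(mmss: str) -> int:
    """Convert an mm:ss time point into seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

# Branch table for the "is 1+2=3 correct?" example: each interactive
# selection button maps to a jump target and a feedback kind.
BRANCHES = {
    "yes": {"start": to_seconds("03:10"), "feedback": "correct"},  # branch 1
    "no":  {"start": to_seconds("04:25"), "feedback": "wrong"},    # branch 2
}

def resolve_branch(choice: str):
    """Return (jump time in seconds, feedback kind) for the chosen button."""
    branch = BRANCHES[choice]
    return branch["start"], branch["feedback"]
```

In a real encoder these intervals would be written into the interactive event time points of the interactive stream data, so the player can resolve branches without any external logic.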
And S25, if the control interactive behavior of the user is received, jumping to the appointed time according to the interactive logic in the playing control information.
In this embodiment, the following website can be consulted to understand this embodiment more fully: http://gcncs.101.com/v0.1/static/ai101ppt/www/MyMovie.mp4?serviceName=101ppt. Fig. 3 is a screenshot corresponding to this website; as can be seen from fig. 3, the event in the video asks which of the following is known as "liquid bread", with interactive selection buttons for milk, soybean milk, and beer, after which the video jumps to a different picture according to the interactive selection button the user clicks. As can be seen on the right side of fig. 3, the user's control interaction behaviors may include play, pause, page jump, and the like.
In this embodiment, the event interaction behavior and the control interaction behavior both belong to user interaction behaviors, and the user interaction behaviors include a mouse interaction behavior, a question answering interaction behavior, and an AI interaction behavior, so as to facilitate interaction between a user and a video.
Therefore, based on this embodiment, the method is applied in the interactive video production stage, and the playing stage strings together the operation flow of the whole interactive video. The resulting video is also easier to distribute and reduces application complexity; with this technique, 3D real-time-rendered content can be encoded into the video, lowering the machine performance that 3D real-time rendering would otherwise demand.
Referring to fig. 4, a second embodiment of the present invention is:
a video interactive terminal 1 comprises a memory 3, a processor 2 and a computer program stored on the memory 3 and operable on the processor, wherein the processor 2 implements the steps of the first embodiment when executing the computer program.
In summary, in the video interaction method and terminal provided by the present invention, the interactive stream data is embedded into the video data through OVERLAY, so playback of the video file is not affected; when the video is subsequently played, interactive operations are performed according to the read interactive stream data without manual intervention, and skip playing is performed automatically according to the interactive stream data, making the video interaction mode more intelligent and automatic. Users can define the interaction mode themselves; after the interaction mode is encoded as interactive stream data and added to the video data, video interaction follows the user's own settings. For example, pop-up interaction is introduced into video playback, and after an interaction is completed, a preset interaction event or script drives the subsequent course of the video, while the user can also manually perform the control interaction behavior corresponding to the playing control information. This makes the interaction between user and video richer and more diverse, makes video playback more vivid and interesting, and brings users a good video interaction experience; in addition, diversified interaction control manners facilitate interaction between the user and the video.
The above is only an embodiment of the present invention and does not limit the scope of the invention; all equivalent changes made using the contents of this specification and the drawings, whether applied directly or indirectly in related technical fields, are included within the scope of the present invention.

Claims (10)

1. A video interaction method, comprising the steps of:
s1, adding interactive stream data in the video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data are read, performing interactive operation according to the interactive stream data.
2. The video interaction method according to claim 1, wherein the step S1 specifically comprises:
writing interactive information and playing control information into interactive stream data, and embedding the interactive stream data into video data through OVERLAY to obtain interactive video data, wherein the interactive video data comprises video stream data, audio stream data and the interactive stream data, and the interactive information comprises interactive event type, interactive event data and interactive event time point.
3. The video interaction method according to claim 2, wherein the step S2 specifically comprises:
s21, decoding the video stream data, the audio stream data and the interactive stream data in real time;
s22, playing the video, and starting to read the interactive stream data corresponding to each frame;
s23, if the interactive information is read, triggering video pause, acquiring the interactive event data in the interactive information, and popping event display information according to the interactive event data;
s24, acquiring event interaction behaviors of a user, and performing branch playing according to the event interaction behaviors, the interaction event data and the interaction event time points;
and S25, if the control interaction behavior of the user is received, jumping to the appointed time according to the interaction logic in the playing control information.
4. The video interaction method of claim 3, wherein the interaction event types in step S1 include interaction selection, input content and submitted answer;
the step S23 specifically includes:
if the interactive event type is read to include interactive information of interactive selection, triggering video pause, acquiring interactive event data in the interactive information, and popping event display information according to the interactive event data, wherein the event display information includes interactive selection data and at least two interactive selection buttons;
the step S24 specifically includes:
acquiring a first interaction selection button selected by a user, and obtaining a first playing time interval corresponding to the first interaction selection button according to the interaction event time point, wherein the interaction event time point comprises a playing time interval corresponding to each interaction selection button and a jumping time point after the interaction event data is completed each time, and the first playing time interval comprises a first playing starting time point and a first playing ending time point;
triggering the video to jump to the picture with the time of the first playing starting time point and playing the picture to the first playing ending time point;
and triggering the video to jump to the picture with the jumping time point, and continuing to play the video.
5. A video interaction method as claimed in claim 3 or 4, wherein: the event interaction behavior and the control interaction behavior belong to user interaction behaviors, and the user interaction behaviors comprise mouse interaction behaviors, answer interaction behaviors and AI interaction behaviors.
6. A video interaction terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
S1, adding interactive stream data to video data to obtain interactive video data;
and S2, decoding the interactive video data in real time, and if interactive stream data is read, performing an interactive operation according to the interactive stream data.
7. The video interaction terminal of claim 6, wherein the processor, when executing step S1, implements the following steps:
writing interactive information and playing control information into interactive stream data, and embedding the interactive stream data into the video data through OVERLAY to obtain the interactive video data, wherein the interactive video data includes video stream data, audio stream data and the interactive stream data, and the interactive information includes an interactive event type, interactive event data and an interactive event time point.
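Claim 7 names only the OVERLAY embedding, not the payload format. As a sketch of what the interactive stream data itself might carry, the snippet below assumes a JSON payload keyed by interactive event time point; every field name is hypothetical.

```python
# Illustrative sketch of claim 7's container layout: interactive stream
# data (interactive information plus playing control information) is
# serialized per time point and carried alongside the video and audio
# streams. The JSON schema here is an assumption for illustration.

import json

def build_interactive_stream(events, control_info):
    """Serialize interactive information into a per-time-point stream."""
    stream = {
        "control": control_info,            # playing control information
        "events": {
            str(e["time_point"]): {
                "type": e["type"],          # e.g. "interaction_selection"
                "data": e["data"],          # interactive event data
            }
            for e in events
        },
    }
    return json.dumps(stream)

def mux(video_stream, audio_stream, interactive_stream):
    """Combine the three streams into one interactive-video container."""
    return {
        "video": video_stream,
        "audio": audio_stream,
        "interactive": interactive_stream,  # embedded as an extra track
    }
```

On the player side, step S21's real-time decoding would parse this track back out and index the events by time point.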
8. The video interaction terminal of claim 7, wherein the processor, when executing the step S2, implements the following steps:
S21, decoding the video stream data, the audio stream data and the interactive stream data in real time;
S22, playing the video, and starting to read the interactive stream data corresponding to each frame;
S23, if interactive information is read, triggering a video pause, acquiring the interactive event data in the interactive information, and popping up event display information according to the interactive event data;
S24, acquiring an event interaction behavior of a user, and performing branch playing according to the event interaction behavior, the interaction event data and the interaction event time point;
and S25, if a control interaction behavior of the user is received, jumping to a specified time point according to the interaction logic in the playing control information.
9. The video interaction terminal of claim 8, wherein the interaction event types in step S1 include interaction selection, content input and answer submission;
when executing step S23 of the computer program, the processor specifically implements the following steps:
if interactive information whose interactive event type is interaction selection is read, triggering a video pause, acquiring the interactive event data in the interactive information, and popping up event display information according to the interactive event data, wherein the event display information includes interaction selection data and at least two interaction selection buttons;
when executing step S24 of the computer program, the processor specifically implements the following steps:
acquiring a first interaction selection button selected by the user, and obtaining a first playing time interval corresponding to the first interaction selection button according to the interaction event time point, wherein the interaction event time point includes the playing time interval corresponding to each interaction selection button and a jump time point to resume from after each interaction event is completed, and the first playing time interval includes a first playing starting time point and a first playing ending time point;
triggering the video to jump to the frame at the first playing starting time point and to play through to the first playing ending time point;
and triggering the video to jump to the frame at the jump time point and to continue playing.
10. The video interaction terminal of claim 8 or 9, wherein the event interaction behavior and the control interaction behavior belong to user interaction behaviors, and the user interaction behaviors include mouse interaction behaviors, answer interaction behaviors and AI interaction behaviors.
CN202010009041.2A 2020-01-06 2020-01-06 Video interaction method and terminal Active CN111193963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010009041.2A CN111193963B (en) 2020-01-06 2020-01-06 Video interaction method and terminal

Publications (2)

Publication Number Publication Date
CN111193963A true CN111193963A (en) 2020-05-22
CN111193963B CN111193963B (en) 2022-10-21

Family

ID=70710697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010009041.2A Active CN111193963B (en) 2020-01-06 2020-01-06 Video interaction method and terminal

Country Status (1)

Country Link
CN (1) CN111193963B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106162301A * 2015-04-14 2016-11-23 北京奔流网络信息技术有限公司 Information pushing method
CN106803993A * 2017-03-01 2017-06-06 腾讯科技(深圳)有限公司 Method and device for implementing video branch selection playback
US20190111346A1 (en) * 2017-10-13 2019-04-18 Microsoft Technology Licensing, Llc Shareable video experience tailored to video-consumer device

Also Published As

Publication number Publication date
CN111193963B (en) 2022-10-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant