CN110825383A - Video interaction method and device and computer readable storage medium - Google Patents


Info

Publication number
CN110825383A
CN110825383A
Authority
CN
China
Prior art keywords
script
language
host
compiled
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910975068.4A
Other languages
Chinese (zh)
Other versions
CN110825383B (en)
Inventor
董熠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd filed Critical Beijing QIYI Century Science and Technology Co Ltd
Priority to CN201910975068.4A priority Critical patent/CN110825383B/en
Publication of CN110825383A publication Critical patent/CN110825383A/en
Application granted granted Critical
Publication of CN110825383B publication Critical patent/CN110825383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/40Transformation of program code
    • G06F8/41Compilation
    • G06F8/43Checking; Contextual analysis
    • G06F8/433Dependency analysis; Data or control flow analysis
    • G06F8/434Pointers; Aliasing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44521Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F9/44526Plug-ins; Add-ons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The invention provides a video interaction method, a video interaction device, and a computer-readable storage medium. The video interaction method comprises: acquiring a script file and an interactive video sent by a server; when playback of the interactive video reaches a trigger time point, rendering the script paragraph in the script file corresponding to that trigger time point in the host language of the client, according to the mapping relationship between the scripting language of the script file and the host language, to generate an interactive interface; when a user operation event on the interactive interface is received, determining the execution event corresponding to the user operation event according to the mapping relationship and the script paragraph; and executing the execution event. Because the scripting language is general-purpose and can be mapped to and from most host languages, a single script file can implement the interactive functions of the interactive video across the various host environments of clients, thereby improving the compatibility of the interactive video.

Description

Video interaction method and device and computer readable storage medium
Technical Field
The invention belongs to the technical field of computers, and particularly relates to a video interaction method and device and a computer readable storage medium.
Background
Interactive video integrates interactive experiences into linear video through various technical means. With increasing broadband access speeds and the maturing of multimedia playback technology, interactive video is being applied more and more widely.
In the prior art, clients may run in any of several different host environments. Applying an interactive video to these environments involves developing, for each host environment, the system controls that the interactive video uses in that environment, and triggering the interactive events of the video through the corresponding system controls when the video is played in that host environment.
However, in this scheme the system controls for interactive video must be developed from scratch for each type of host environment so that they match that environment, and controls built for one host environment are not compatible with another. This results in poor compatibility of interactive video.
Disclosure of Invention
In view of this, the present invention provides a video interaction method, an apparatus and a computer-readable storage medium, which solve the problem of poor compatibility of interactive videos in the current scheme to a certain extent.
According to a first aspect of the present invention, there is provided a video interaction method, which may include:
acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on the time axis of the interactive video, the trigger time point being used to trigger generation of a corresponding interactive interface;
when playback of the interactive video reaches the trigger time point, rendering, in the host language of a client and according to a mapping relationship between the scripting language of the script file and the host language, the script paragraph in the script file corresponding to the trigger time point, to generate an interactive interface;
when a user operation event on the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, wherein the processing language of the execution event is the host language;
and executing the execution event.
According to a second aspect of the present invention, there is provided a video interaction apparatus, which may include:
an acquisition module, configured to acquire the script file and the interactive video sent by the server, wherein at least one trigger time point is arranged on the time axis of the interactive video and is used to trigger generation of a corresponding interactive interface;
a rendering module, configured to, when playback of the interactive video reaches the trigger time point, render the script paragraph in the script file corresponding to the trigger time point in the host language of a client, according to a mapping relationship between the scripting language of the script file and the host language, to generate an interactive interface;
a determining module, configured to, when a user operation event on the interactive interface is received, determine an execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, wherein the processing language of the execution event is the host language;
and an execution module, configured to execute the execution event.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the steps of the video interaction method according to the first aspect.
Compared with the prior art, the invention has the following advantages:
the invention provides a video interaction method, which comprises the following steps: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on the time axis of the interactive video; when the interactive video is played to a triggering time point, rendering a script paragraph corresponding to the triggering time point in the script file according to a host language of a client according to a mapping relation between the script language of the script file and the host language of the client to generate an interactive interface; when a user operation event aiming at the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is a host language; an execution event is executed. The method and the device can utilize the universality of the script language in the script file and the mapping interchangeability between the script language and most of host languages, so that the script file can realize the interactive function corresponding to the interactive video on the premise of being applied to various host environments of the client, thereby improving the compatibility of the interactive video.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart illustrating steps of a video interaction method according to an embodiment of the present invention;
fig. 2 is a system architecture diagram of a video interaction method according to an embodiment of the present invention;
FIG. 3 is an interface diagram of a video interaction method according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating steps of another video interaction method according to an embodiment of the present invention;
fig. 5 is a block diagram of a video interaction device according to an embodiment of the present invention;
fig. 6 is a block diagram of a rendering module according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Fig. 1 is a flowchart of steps of a video interaction method provided in an embodiment of the present invention, which is applied to a terminal, and as shown in fig. 1, the method may include:
step 101, acquiring the script file and the interactive video sent by the server, wherein at least one trigger time point is arranged on a time axis of the interactive video.
And the triggering time point is used for triggering and generating a corresponding interactive interface.
In the embodiment of the invention, applying the interactive video to different host environments can be realized through the script file. The script file may be a Lua script file; Lua is designed to be embedded flexibly into an application program, providing flexible extension and customization capabilities for the application. Lua itself is implemented in standard C (the C programming language) and can be compiled and run on almost all operating systems and platforms.
In addition, a Lua script file can easily be called from C or Java code, and can in turn call C or Java functions, so Lua scripts can be used widely within application programs. A Lua script file can serve not only as an extension script but also as an ordinary configuration file, and is easy to understand and maintain.
In the embodiment of the present invention, the host environment of the client to which the interactive video is applied includes, but is not limited to, a C-language host environment (an iOS system) or a Java-language host environment (an Android system). Rather than developing the interactive video's controls directly in the host language of each host environment, the embodiment of the invention develops them through a single, general-purpose, independent script file, which describes both the style of the interactive interfaces in the interactive video and the business logic of its interactive events. Because the script file is written in Lua, which is not a host language, object mapping between the scripting language of the script file and the host language of the client is required when the script file is applied in a host environment.
Referring to fig. 2, a system architecture diagram of a video interaction method according to an embodiment of the present invention is shown, where the system architecture diagram includes: a client and a server. The client can be a mobile terminal, a personal computer, an application running on a computing device, and the like; the server can be a cloud server, a business server, and the like.
In an implementation manner of the embodiment of the present invention, the server may establish a corresponding script file according to an interaction manner of the interactive video, and actively issue the script file and the interactive video to the client, so as to achieve a purpose that the client passively acquires the script file and the interactive video. It should be noted that, in another implementation manner of the embodiment of the present invention, after the server establishes the corresponding script file according to the interactive manner of the interactive video, the script file and the interactive video may also be added to a response to the script request according to the script request sent by the client, so as to achieve the purpose that the client actively acquires the script file and the interactive video.
The interactive video can be divided into a plurality of video segments, and the joining time point between adjacent segments can serve as a trigger time point. When playback reaches a trigger time point, the segment to be played next is determined by the user's interaction.
For example, an interactive video is divided into 3 video segments: segment A, segment B and segment C. When segment A is played, its ending time point serves as a trigger time point, at which an interactive interface offering option 1 and option 2 can be generated. If the user selects option 1, segment B is played after segment A finishes; if the user selects option 2, segment C is played after segment A finishes.
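The branching in this example amounts to a lookup keyed by the current segment and the user's choice. A minimal sketch in Python (the function and table names are illustrative, not from the patent):

```python
# Minimal sketch of the branching example above: the clip played after
# segment A depends on which option the user selects at the trigger
# time point. Names are illustrative only.

def next_segment(current, choice, branch_table):
    """Return the segment to play after `current` for the user's choice."""
    return branch_table.get((current, choice))

# Segment A's ending time point is the trigger point; option 1 leads to
# segment B, option 2 leads to segment C, matching the example above.
BRANCHES = {
    ("A", "option1"): "B",
    ("A", "option2"): "C",
}
```

For instance, `next_segment("A", "option1", BRANCHES)` yields `"B"`, while an undefined choice yields `None` (no branch).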
And 102, when the interactive video is played to the triggering time point, rendering a script paragraph corresponding to the triggering time point in the script file according to a host language of a client according to a mapping relation between the script language of the script file and the host language of the client, and generating an interactive interface.
In this step, in order to satisfy the object mapping between the scripting language of the script file and the host language of the client, the embodiment of the present invention may establish the mapping relationship between the scripting language and the host language in the preset memory space of the client according to the characteristics of the scripting language and the host language.
Specifically, the mapping relationship includes, but is not limited to, a functional mapping relationship between the script language and the host language, a control mapping relationship between the script language and the host language, and a class mapping relationship between the script language and the host language.
The function mapping relationship may include a mapping relationship between a specific function in the script file for implementing a corresponding function and a function in the host environment of the client for implementing the same function; the control mapping relationship may include a mapping relationship between a specific control for implementing a corresponding function in the script file and a control for implementing the same function in a host environment of the client; the class mapping relationship may include a mapping relationship between an entity class in the script file and an entity class in the hosting environment of the client. Through the three mapping relations, the script file compiled by the script language can be converted into the host language which can be processed in the host environment of the client, and the application of the script file in the host environment of the client is realized.
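The three mapping relationships can be pictured as simple lookup tables from script-language identifiers to host-language identifiers. The following Python sketch is purely illustrative (the host-side names are hypothetical examples for a C-language/iOS host, not taken from the patent):

```python
# Illustrative sketch of the three mapping relationships: functions,
# controls, and entity classes in the scripting language mapped to their
# host-language counterparts. All concrete names are hypothetical.

SCRIPT_TO_HOST = {
    "function": {"show_dialog": "presentAlert"},        # same function, both sides
    "control":  {"Button": "UIButton", "Label": "UILabel"},
    "class":    {"VideoClip": "PlayerItem"},            # entity class mapping
}

def to_host(kind, script_name):
    """Translate a script-language identifier into the host language."""
    return SCRIPT_TO_HOST[kind][script_name]
```

With such tables, a script paragraph referring to `Button` can be rendered with the host environment's own `UIButton`-style control.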
In the embodiment of the invention, the script file can be used to realize the interaction between the user and the interactive video when playback reaches a trigger time point. When the interactive video includes a plurality of trigger time points, the script file includes a script paragraph for each trigger time point, and each script paragraph implements the interactive content corresponding to its trigger time point.
For example, an interactive video includes a trigger time point 1 and a trigger time point 2, the script file includes a script paragraph a corresponding to the trigger time point 1 and a script paragraph B corresponding to the trigger time point 2, the script paragraph a describes a code for generating an interactive interface a, and the script paragraph B describes a code for generating an interactive interface B.
When the interactive video is played to the trigger time point 1, rendering the script paragraph A corresponding to the trigger time point 1 to generate an interactive interface a for displaying; when the interactive video is played to the trigger time point 2, the script paragraph B corresponding to the trigger time point 2 can be rendered, and an interactive interface B is generated for display.
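The correspondence in this example can be sketched as a map from trigger time points to script paragraphs, consulted when playback reaches a trigger point. A small illustrative sketch (names are hypothetical):

```python
# Sketch of the correspondence described above: each trigger time point
# maps to a script paragraph, and reaching a trigger point during
# playback selects the paragraph to render. Names are illustrative.

SCRIPT_FILE = {
    1: "paragraph A",   # renders interactive interface a
    2: "paragraph B",   # renders interactive interface b
}

def paragraph_for(trigger_point):
    """Return the script paragraph to render, or None if no interaction."""
    return SCRIPT_FILE.get(trigger_point)
```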
Specifically, because the script file is written in a scripting language that usually differs from the host language of the client's host environment, the client cannot directly read and render a script paragraph of the script file. Therefore, according to the mapping relationship between the scripting language and the host language stored on the client, the client converts the script paragraph into the host language it understands, and parses and renders it in the host-language environment to generate the interactive interface.
Step 103, when a user operation event for the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein a processing language of the execution event is the host language.
In the embodiment of the present invention, a script section corresponding to the interactive interface may define one or more user operation events and an execution event corresponding to the user operation event. The client can further find the execution event corresponding to the user operation event in the script paragraph through the mapping relation according to the user operation event executed by the user on the interactive interface.
For example, a script paragraph corresponding to a trigger time point describes a touch interface with a function button 1 and a function button 2, and defines that when the user triggers function button 1 the execution event is a confirmation operation, and when the user triggers function button 2 the execution event is a cancel operation. After the client renders and displays the touch interface according to the script paragraph, if it receives the user's touch on function button 1, it determines the corresponding execution event to be the confirmation operation; if it receives the user's touch on function button 2, it determines the corresponding execution event to be the cancel operation.
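The lookup from user operation events to execution events can be sketched in a few lines of Python. This is a hedged illustration of the idea, not the patent's implementation; the event names are hypothetical:

```python
# Sketch of step 103: the script paragraph defines user operation events
# and their execution events; the client resolves the execution event
# for the operation it received. All names are illustrative.

PARAGRAPH_EVENTS = {
    "tap:button1": "confirm",   # triggering function button 1 -> confirm
    "tap:button2": "cancel",    # triggering function button 2 -> cancel
}

def resolve_execution_event(user_event, paragraph_events):
    """Look up the execution event defined for a user operation event."""
    return paragraph_events.get(user_event)
```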
Specifically, because the script file is written in a scripting language that differs from the host language of the client's host environment, while the user operation event received by the client is expressed in the host language, the client cannot directly look up in the script paragraph the execution event corresponding to that host-language event. Therefore, according to the stored mapping relationship between the scripting language and the host language, the client converts the host-language user operation event into the scripting language understood by the script paragraph, and determines, in the scripting-language environment, the scripting-language execution event corresponding to the user operation event.
After the scripting-language execution event is determined, it can be converted, again through the stored mapping relationship between the scripting language and the host language, into the host language understood by the client, yielding the host-language execution event.
And step 104, executing the execution event.
In this step, since the execution event of the host language is output in step 103, the execution event can be executed in the host environment in the client to implement the corresponding function.
An execution event may include, but is not limited to, playing another piece of video, performing a function operation, presenting a link, etc.
Specifically, fig. 3 shows an interface diagram of the video interaction method provided by the embodiment of the present invention. An interactive video 20 is playing in the screen 10 of a client, and on the time axis 21 of the interactive video 20, playback has reached a trigger time point X. At this moment, the client can render the script paragraph corresponding to the trigger time point in the host language to generate an interactive interface 30 for display. The interactive interface 30 includes two interface elements, a button 31 and a button 32: after the user presses the button 31, video clip A is played after the trigger time point X; after the user presses the button 32, video clip B is played after the trigger time point X.
In the embodiment of the invention, because of the generality of the scripting language and its mapping interchangeability with most host languages, the script file can be applied in various host environments of clients. When the interaction mode of the interactive video needs to change, the server only needs to update the original script file and issue the new script file to the clients; the update does not depend on any particular host environment. This makes the update operation simpler, allows the interaction mode of an interactive video to be updated quickly, improves the reusability of interactive video, and enables fast iteration of its interactive functions.
To sum up, the video interaction method provided by the embodiment of the present invention comprises: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on the time axis of the interactive video; when playback reaches a trigger time point, rendering the script paragraph in the script file corresponding to that trigger time point in the host language of the client, according to the mapping relationship between the scripting language of the script file and the host language, to generate an interactive interface; when a user operation event on the interactive interface is received, determining the execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, the processing language of the execution event being the host language; and executing the execution event. Because the scripting language is general-purpose and can be mapped to and from most host languages, the script file can implement the interactive functions of the interactive video across the various host environments of clients, thereby improving the compatibility of the interactive video.
Fig. 4 is a flowchart illustrating steps of another video interaction method according to an embodiment of the present invention, as shown in fig. 4, the method may include:
step 401, obtaining the script file and the interactive video sent by the server, wherein at least one trigger time point is arranged on a time axis of the interactive video.
This step may specifically refer to step 101, which is not described herein again.
Optionally, before step 403, the method may further include:
step 402, establishing a stack structure model according to the script language and the host language, wherein the stack structure model comprises a mapping relation between the script language and the host language.
In the embodiment of the present invention, the script file includes a plurality of script paragraphs written in the scripting language. According to the business logic of the interactive video, a script paragraph contains a number of code statements with a defined order among them, and when the script paragraph is parsed and executed, that order must be followed strictly. Therefore, the mapping conversion of the script file, and the mapping conversion of host objects in the host environment, are performed in that order according to the mapping relationship between the scripting language and the host language.
Specifically, a stack is a data structure in which data items are arranged in order and can be inserted and deleted only at one end, called the top of the stack. The stack structure model may include such a stack data structure, used to store data items with a defined order and to output them in that order.
In the embodiment of the present invention, a stack structure model may be established in a preset storage space of the client. The model involves two storage structures, the heap and the stack. The heap can be viewed as a tree (as in heap sort); its advantage is that memory can be allocated dynamically, and an object's lifetime need not be known to the compiler in advance. The stack is a first-in, last-out data structure; its advantage is faster access.
Data items input into the stack structure model are placed into its storage spaces in the order in which they are pushed. In each storage space, a data item can be converted from the host language to the scripting language, or from the scripting language to the host language, based on the mapping relationship contained in the stack structure model.
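This behavior can be sketched as a stack whose elements are translated through the mapping relationship as they are processed. The class below is an assumption-laden illustration of the idea, not the patent's implementation (Lua's own C API uses a similar virtual stack for exchanging values between languages):

```python
# Illustrative sketch of the stack structure model: items are pushed in
# order, and each is converted between languages via the mapping
# relationship as it comes off the stack. Not the patent's actual code.

class ConversionStack:
    def __init__(self, mapping):
        self._items = []
        self._mapping = mapping      # e.g. script-language -> host-language

    def push(self, item):
        self._items.append(item)

    def pop_converted(self):
        # Items come off in first-in, last-out order, converted on the way out.
        return self._mapping[self._items.pop()]
```

Pushing `"a"` then `"b"` onto a stack built over the mapping `{"a": "A", "b": "B"}` pops `"B"` first and then `"A"`, illustrating the order constraint.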
Optionally, the mapping relationship includes: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
The function mapping relationship may include a mapping relationship between a specific function in the script file for implementing a corresponding function and a function in the host environment of the client for implementing the same function; the control mapping relationship may include a mapping relationship between a specific control for implementing a corresponding function in the script file and a control for implementing the same function in a host environment of the client; the class mapping relationship may include a mapping relationship between an entity class in the script file and an entity class in the hosting environment of the client. Through the three mapping relations, the script file compiled by the script language can be converted into the host language which can be processed in the host environment of the client, and the application of the script file in the host environment of the client is realized.
And 403, when the interactive video is played to the trigger time point, determining a script paragraph corresponding to the trigger time point in the script file according to the first corresponding relation.
Wherein the script file includes: a first correspondence between the script paragraph and the trigger time point.
In this step, when the interactive video includes a plurality of trigger time points, the script file includes a first corresponding relationship between script segments and the trigger time points, and each script segment is used to implement the interactive content corresponding to the trigger time point.
When the interactive video is played to the triggering time point, the script paragraph corresponding to the triggering time point in the script file can be determined according to the first corresponding relation.
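As a minimal sketch of this lookup, the first correspondence can be held as a table from trigger time points to script paragraphs, matched against the playback position. The time points, paragraph names, and tolerance window are illustrative assumptions, not values from the patent.

```python
import bisect

# Hypothetical first correspondence: trigger time points (in seconds)
# mapped to script paragraph identifiers.
FIRST_CORRESPONDENCE = {30.0: "paragraph_A", 95.0: "paragraph_B"}


def paragraph_for(position: float, tolerance: float = 0.5):
    """Return the script paragraph whose trigger time point the current
    playback position has just reached, or None if no trigger point
    falls within the tolerance window."""
    points = sorted(FIRST_CORRESPONDENCE)
    i = bisect.bisect_right(points, position) - 1
    if i >= 0 and position - points[i] <= tolerance:
        return FIRST_CORRESPONDENCE[points[i]]
    return None
```

The tolerance models the "interactive interval" mentioned below: a trigger point corresponds to a short time window rather than a single instant.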
Specifically, when the interactive video is played to a trigger time point, it can be understood as having entered the corresponding interactive interval, i.e., the time or time period corresponding to the trigger time point. At this time, an entry function of the script file class can be triggered, an interactive view layer based on the script file is created, and a player callback is registered in the interactive view layer. A callback here is a function passed by reference (for example, via a function pointer) as a parameter to another method; the other party invokes the callback when a specific event or condition occurs, in order to respond to that event or condition.
In the embodiment of the present invention, registering the player callback with the interactive view layer may be understood as notifying the player, through the callback function, of the method for parsing the script file, so that the player can convert a script paragraph compiled by the scripting language into a script paragraph compiled by the host language.
Step 404, inputting the script paragraphs into the stack structure model, and outputting the script paragraphs compiled by the host language.
For a script paragraph input into the stack structure model, the codes in the script paragraph can be imported into the storage spaces of the stack structure model in stacking order. In each storage space, the codes can be converted from the scripting language to the host language based on the mapping relationship between the two languages included in the stack structure model. This yields the script paragraph compiled by the host language, which the host environment of the client can parse and process.
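Abstracting away the storage-space details, step 404 amounts to translating each code token of the paragraph through the mapping table in stacking order. The sketch below assumes a token-level mapping; the table and token names are hypothetical.

```python
# Illustrative scripting-to-host mapping for individual code tokens
# (names are assumptions for demonstration only).
MAPPING = {"script_button": "host_button", "script_on_click": "host_on_click"}


def convert_paragraph(tokens):
    """Translate each code token of a script paragraph into the host
    language; unmapped tokens pass through unchanged."""
    converted = []
    for tok in tokens:  # imported in stacking order
        converted.append(MAPPING.get(tok, tok))
    return converted
```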
Step 405, rendering the script paragraphs compiled by the host language to generate the interactive interface.
In the embodiment of the invention, the script paragraph serves to describe the interactive interface. The client parses the script paragraph compiled by the host language to obtain an interface document with a plurality of interface elements, and the interactive interface can then be generated by adding corresponding interface data to the interface elements in the interface document. The interactive interface, combined with a user operation event of the user, is used to trigger the execution event corresponding to that user operation event, thereby realizing the interactive function of the interactive video.
Optionally, step 405 may specifically include:
substep 4051, parsing the script paragraphs compiled by the host language to obtain the class object, the control object and the function object of the script paragraphs compiled by the host language.
In the embodiment of the invention, the script paragraphs compiled by the host language are parsed to obtain the class object, the control object and the function object of those script paragraphs. A class is a collection of things or events with common characteristics; a class object is an instantiation of the class, and all objects of one class correspond to the same class object. A control can be understood as a component, or tool, in the interface; a control object is an instantiation of the control. A class that overloads the function call operator produces objects often referred to as function objects, i.e. objects that behave like functions, also called functors: program objects that allow themselves to be called like ordinary functions. These three types of objects are the basis for building the relevant interface elements in the interactive interface and may exist in the form of structural labels.
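The notion of a function object can be shown in a few lines: any object that implements the call operator behaves like a function. The event name below is an illustrative assumption.

```python
class OnClick:
    """A minimal function object: an instance is callable like a
    function. The event name is a hypothetical example."""

    def __init__(self, event: str):
        self.event = event

    def __call__(self) -> str:
        # Invoked when the instance itself is called, e.g. handler()
        return f"executed {self.event}"


handler = OnClick("confirm")
```

Calling `handler()` runs the overloaded call operator, which is what lets such objects serve as interface-event handlers.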
Substep 4052, generating at least one interface element according to the script paragraph compiled from the host language, the class object, the control object, and the function object.
In this step, the class object, the control object, and the function object may be constructed into corresponding interface elements according to interface structure rules defined in the script paragraphs compiled by the host language. The interface structure rules include appearance constraints and business logic constraints for constructing the three kinds of objects into interface elements: the appearance constraints define the visual form of the generated interface elements (for example, buttons or dialog boxes), and the business logic constraints define their business logic (for example, that an interface element triggers a confirm or cancel operation).
Substep 4053, obtaining interface data corresponding to the interface element.
In this step, referring to fig. 2, after the client generates the interface elements, it needs to add corresponding interface data to them so that each element takes its required appearance and implements its business logic. The client may send a data request to the server; after receiving the request, the server may place the interface data, in JSON (JavaScript Object Notation) form, into its response so that the client can extract the corresponding interface data from the response.
It should be noted that the client may also establish the interface database locally in advance, and obtain the interface data from the local interface database when the interface data is needed.
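A minimal sketch of extracting JSON interface data from a server response follows; the field names (`button_text`, `action`) are hypothetical, chosen only to illustrate the shape of the exchange.

```python
import json

# A hypothetical server response body carrying interface data in JSON form.
response_body = '{"button_text": "Continue", "action": "confirm"}'


def extract_interface_data(body: str) -> dict:
    """Parse the JSON response body into a dictionary of interface data
    that can be attached to the generated interface elements."""
    return json.loads(body)
```

Whether the data arrives over the network or from a pre-built local interface database, the extraction step is the same once the JSON body is in hand.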
Substep 4054, generating said interactive interface based on said interface elements and said interface data.
In this step, a rendering operation is performed according to the interface elements and the interface data to obtain the interactive interface. In this process, the interface elements take their required appearance, and when an interface element in the interactive interface receives a user interaction event, it triggers the corresponding business logic.
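The combination of elements and data in substep 4054 can be sketched as merging each element's structure with its fetched data before drawing. The element/data schema below (an `id` key per element) is an illustrative assumption.

```python
def render_interface(elements, data):
    """Attach the interface data for each interface element (matched by a
    hypothetical 'id' key), yielding the final interface description
    passed to the drawing layer."""
    return [{**el, **data.get(el["id"], {})} for el in elements]
```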
Step 406, when a user operation event for the interactive interface is received, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the scripting language.
Specifically, when the user operation event for the interactive interface is received, the stack structure model can additionally be notified, through the callback function, of the method for converting between a user operation event compiled by the scripting language and one compiled by the host language, so that the stack structure model can convert such events in either direction.
In the embodiment of the present invention, because the user operation event received by the client is usually generated based on the host language, the client cannot directly determine, from that event, the corresponding execution event in the script paragraph. Therefore, the client may input the user operation event compiled by the host language into the stack structure model and, according to the mapping relationship between the scripting language and the host language stored there, convert it into a user operation event compiled by the scripting language that the script paragraph can understand. The execution event corresponding to the user operation event can then be determined in the script paragraph, in the scripting-language environment.
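This host-to-script direction is simply the inverse of the mapping used in step 404; a sketch under assumed event names:

```python
# Hypothetical event-name mapping (scripting-language name -> host-language
# name); the reverse table converts host events back to script events.
SCRIPT_TO_HOST = {"script_tap": "host_touch_up"}
HOST_TO_SCRIPT = {v: k for k, v in SCRIPT_TO_HOST.items()}


def to_script_event(host_event: str) -> str:
    """Convert a user operation event compiled by the host language into
    the scripting-language form; unmapped events pass through."""
    return HOST_TO_SCRIPT.get(host_event, host_event)
```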
Step 407, determining an execution event compiled by the script language according to the user operation event compiled by the script language and the second corresponding relation.
Wherein the script paragraph comprises: a second correspondence between the user operation event and the execution event.
In the embodiment of the present invention, after the user operation event compiled by the scripting language is determined, the execution event of the scripting language corresponding to the user operation event compiled by the scripting language may be further determined according to a second corresponding relationship between the user operation event compiled by the scripting language defined in the script paragraph and the execution event of the scripting language.
Step 408, inputting the execution event compiled by the scripting language into the stack structure model, and outputting the execution event compiled by the host language.
In this embodiment of the present invention, because the execution event obtained in step 407 is generated based on the scripting language, the client cannot process it directly. The client may therefore input the scripting-language execution event into the stack structure model and, according to the mapping relationship between the scripting language and the host language stored there, convert it into an execution event compiled by the host language, which the client can then execute in its host environment.
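Putting steps 406 to 408 together, the event pipeline is: host event → script event (stack model) → execution event (second correspondence) → host execution event (stack model again). A sketch with hypothetical event names:

```python
# All event names and tables below are illustrative assumptions.
HOST_TO_SCRIPT = {"host_touch_up": "script_tap"}
SECOND_CORRESPONDENCE = {"script_tap": "script_play_branch"}
SCRIPT_TO_HOST = {"script_play_branch": "host_play_branch"}


def resolve_execution_event(host_event: str) -> str:
    """Trace one user operation event through steps 406-408, yielding the
    execution event compiled by the host language."""
    script_event = HOST_TO_SCRIPT[host_event]          # step 406
    script_exec = SECOND_CORRESPONDENCE[script_event]  # step 407
    return SCRIPT_TO_HOST[script_exec]                 # step 408
```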
Step 409, executing the execution event.
For this step, reference may be made to step 105, which is not repeated here.
In summary, the video interaction method provided in the embodiment of the present invention includes: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on the time axis of the interactive video; when the interactive video is played to a trigger time point, rendering, according to the mapping relationship between the scripting language of the script file and the host language of the client, the script paragraph corresponding to the trigger time point in the script file according to the host language, to generate an interactive interface; when a user operation event for the interactive interface is received, determining the execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, the processing language of the execution event being the host language; and executing the execution event. By exploiting the universality of the scripting language in the script file and its mapping interchangeability with most host languages, the script file can realize the interactive function corresponding to the interactive video in a variety of client host environments, thereby improving the compatibility of the interactive video.
Fig. 5 is a block diagram of a video interaction apparatus according to an embodiment of the present invention, and as shown in fig. 5, the apparatus 50 may include:
an obtaining module 501, configured to obtain the script file and the interactive video sent by the server, where a time axis of the interactive video is provided with at least one trigger time point;
an establishing module 502, configured to establish a stack structure model according to the script language and the host language, where the stack structure model includes a mapping relationship between the script language and the host language.
Optionally, the mapping relationship includes: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
A rendering module 503, configured to render, according to a mapping relationship between a scripting language of the script file and a host language of a client, a script paragraph in the script file corresponding to the trigger time point according to the host language when the interactive video is played to the trigger time point, so as to generate an interactive interface;
a determining module 504, configured to determine, when a user operation event for the interactive interface is received, an execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, where a processing language of the execution event is the host language;
optionally, the script paragraph includes: a second correspondence between the user operation event and the execution event; the determining module 504 is specifically configured to:
when a user operation event aiming at the interactive interface is received, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the script language;
determining an execution event compiled by the script language according to the user operation event compiled by the script language and the second corresponding relation;
and inputting the execution event compiled by the script language into the stack structure model, and outputting the execution event compiled by the host language.
An execution module 505, configured to execute the execution event.
Optionally, referring to fig. 6, fig. 6 is a block diagram of a rendering module according to an embodiment of the present invention, where the script file includes: a first correspondence between the script paragraph and the trigger time point; the rendering module 503 includes:
the determining sub-module 5031 is configured to determine, according to the first corresponding relationship, a script paragraph corresponding to the trigger time point in the script file when the interactive video is played to the trigger time point;
a stack submodule 5032, configured to input the script paragraphs into the stack structure model, and output script paragraphs compiled by the host language;
the rendering submodule 5033 is configured to render the script paragraphs compiled by the host language, and generate the interactive interface.
Optionally, the rendering sub-module 5033 is specifically configured to:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to the script paragraph compiled by the host language, the class object, the control object and the function object;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface elements and the interface data.
In summary, the video interaction apparatus provided in the embodiment of the present invention performs operations including: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on the time axis of the interactive video; when the interactive video is played to a trigger time point, rendering, according to the mapping relationship between the scripting language of the script file and the host language of the client, the script paragraph corresponding to the trigger time point in the script file according to the host language, to generate an interactive interface; when a user operation event for the interactive interface is received, determining the execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, the processing language of the execution event being the host language; and executing the execution event. By exploiting the universality of the scripting language in the script file and its mapping interchangeability with most host languages, the script file can realize the interactive function corresponding to the interactive video in a variety of client host environments, thereby improving the compatibility of the interactive video.
For the above device embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and for the relevant points, refer to the partial description of the method embodiment.
Preferably, an embodiment of the present invention further provides a terminal including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the video interaction method embodiment and achieves the same technical effect; to avoid repetition, details are not repeated here.
The embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements each process of the video interaction method embodiment and achieves the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Those skilled in the art will readily appreciate that any combination of the above embodiments is possible; any such combination is therefore an embodiment of the present invention, but details are not given here for reasons of space.
The video interaction methods provided herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The structure required to construct a system incorporating aspects of the present invention will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the video interaction method according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.

Claims (13)

1. A video interaction method, the method comprising:
the method comprises the steps that a script file and an interactive video sent by a server are obtained, wherein at least one trigger time point is arranged on the time axis of the interactive video and is used for triggering and generating a corresponding interactive interface;
when the interactive video is played to the triggering time point, rendering a script paragraph in the script file corresponding to the triggering time point according to a mapping relation between a script language of the script file and a host language of a client side and generating an interactive interface;
when a user operation event aiming at the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is the host language;
executing the execution event.
2. The method according to claim 1, wherein before the rendering the script paragraph corresponding to the trigger time point in the script file according to the host language according to the mapping relationship between the script language of the script file and the host language of the client to generate the interactive interface, the method further comprises:
and establishing a stack structure model according to the script language and the host language, wherein the stack structure model comprises a mapping relation between the script language and the host language.
3. The method of claim 2, wherein the mapping comprises: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
4. The method of claim 2, wherein the script file comprises: a first correspondence between the script paragraph and the trigger time point;
when the interactive video is played to the triggering time point, rendering a script paragraph corresponding to the triggering time point in the script file according to a mapping relation between a script language of the script file and a host language of a client according to the host language to generate an interactive interface, including:
when the interactive video is played to the trigger time point, determining a script paragraph corresponding to the trigger time point in the script file according to the first corresponding relation;
inputting the script paragraphs into the stack structure model and outputting the script paragraphs compiled by the host language;
and rendering the script paragraphs compiled by the host language to generate the interactive interface.
5. The method of claim 4, wherein rendering the script paragraphs compiled in the host language to generate the interactive interface comprises:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to the script paragraph compiled by the host language, the class object, the control object and the function object;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface elements and the interface data.
6. The method of claim 2, wherein the script paragraph comprises: a second correspondence between the user operation event and the execution event;
when a user operation event for the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, including:
when a user operation event aiming at the interactive interface is received, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the script language;
determining an execution event compiled by the script language according to the user operation event compiled by the script language and the second corresponding relation;
and inputting the execution event compiled by the script language into the stack structure model, and outputting the execution event compiled by the host language.
7. A video interaction apparatus, the apparatus comprising:
the acquisition module is used for acquiring the script file and the interactive video sent by the server, at least one trigger time point is arranged on the time axis of the interactive video, and the trigger time point is used for triggering and generating a corresponding interactive interface;
the rendering module is used for rendering a script paragraph corresponding to the trigger time point in the script file according to a host language of a client according to a mapping relation between the script language of the script file and the host language of the client when the interactive video is played to the trigger time point, so as to generate an interactive interface;
the determining module is used for determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph when the user operation event aiming at the interactive interface is received, wherein the processing language of the execution event is the host language;
and the execution module is used for executing the execution event.
8. The apparatus of claim 7, further comprising:
and the establishing module is used for establishing a stack structure model according to the script language and the host language, and the stack structure model comprises a mapping relation between the script language and the host language.
9. The apparatus of claim 8, wherein the mapping relationship comprises: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
10. The apparatus of claim 8, wherein the script file comprises: a first correspondence between the script paragraph and the trigger time point; the rendering module includes:
the determining submodule is used for determining a script paragraph corresponding to the trigger time point in the script file according to the first corresponding relation when the interactive video is played to the trigger time point;
the stack submodule is used for inputting the script paragraphs into the stack structure model and outputting the script paragraphs compiled by the host language;
and the rendering submodule is used for rendering the script paragraphs compiled by the host language to generate the interactive interface.
11. The apparatus of claim 10, wherein the rendering submodule is specifically configured to:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to the script paragraph compiled by the host language, the class object, the control object and the function object;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface elements and the interface data.
12. The apparatus of claim 8, wherein the script paragraph comprises: a second correspondence between the user operation event and the execution event; the determining module is specifically configured to:
when a user operation event aiming at the interactive interface is received, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the script language;
determining an execution event compiled by the script language according to the user operation event compiled by the script language and the second corresponding relation;
and inputting the execution event compiled by the script language into the stack structure model, and outputting the execution event compiled by the host language.
13. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements a video interaction method as claimed in any one of claims 1 to 6.
CN201910975068.4A 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium Active CN110825383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910975068.4A CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910975068.4A CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110825383A true CN110825383A (en) 2020-02-21
CN110825383B CN110825383B (en) 2023-08-22

Family

ID=69549150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910975068.4A Active CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110825383B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106946A1 (en) * 2005-11-07 2007-05-10 Philip Goetz Method and system for developing interactive Web applications in a unified framework
CN1763717A (en) * 2005-11-24 2006-04-26 北京中星微电子有限公司 System and method for calling host software functions by using script and its compiler
WO2017028720A1 (en) * 2015-08-19 2017-02-23 阿里巴巴集团控股有限公司 Object sending method and device
CN108345458A (en) * 2018-01-25 2018-07-31 微梦创科网络科技(中国)有限公司 A kind of call method and system of static compilation language and script
CN108933969A (en) * 2018-07-25 2018-12-04 深圳市茁壮网络股份有限公司 A kind of method and system for realizing digital TV video frequency animation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Chengzhong; Li Min: "Research on the Application of Scripting Languages in the Development of Complex User Interfaces" *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112040296A (en) * 2020-09-06 2020-12-04 刘承昊 Interactive content playing platform
CN114007146A (en) * 2021-10-29 2022-02-01 湖南快乐阳光互动娱乐传媒有限公司 Interactive method and device of interactive video, storage medium and electronic equipment
CN115460468A (en) * 2022-08-10 2022-12-09 北京爱奇艺科技有限公司 Interactive video file creating method and interactive video playing method and device
CN115460468B (en) * 2022-08-10 2023-09-15 北京爱奇艺科技有限公司 Interactive video file creation method, interactive video playing method, device, electronic equipment and medium


Similar Documents

Publication Publication Date Title
US9342237B2 (en) Automated testing of gesture-based applications
US8347272B2 (en) Call graph dependency extraction by static source code analysis
US20060117267A1 (en) System and method for property-based focus navigation in a user interface
CN110825383B (en) Video interaction method and device and computer readable storage medium
JP2015534145A (en) User interface control framework for stamping out controls using declarative templates
US20220032192A1 (en) User interface processing method and device
US10705858B2 (en) Automatic import of third party analytics
CN113918195A (en) Application interface updating method and device, electronic equipment and readable storage medium
CN110955409A (en) Method and device for creating resources on cloud platform
CN113778897B (en) Automatic test method, device and equipment for interface and storage medium
US20200110584A1 (en) Automated code generation for functional testing of software applications
CN105005596B (en) page display method and device
CN112965716B (en) Page processing method and device, electronic equipment and readable storage medium
US10142446B2 (en) Dialog server
CN112068879A (en) Method and device for constructing client application development framework based on configuration
CN105095398B (en) A kind of information providing method and device
CN114911541B (en) Processing method and device of configuration information, electronic equipment and storage medium
WO2014024255A1 (en) Terminal and video playback program
CN114661402A (en) Interface rendering method and device, electronic equipment and computer readable medium
CN111782196A (en) MVP architecture-based development method and device
CN112068895A (en) Code configuration method and device, video playing equipment and storage medium
CN113961279A (en) Page rendering method, device, server and storage medium
CN111045674A (en) Interactive method and device of player
CN115309376B (en) Application creation method and device, electronic equipment and storage medium
CN112306324B (en) Information processing method, apparatus, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant