CN110825383B - Video interaction method and device and computer readable storage medium - Google Patents


Info

Publication number
CN110825383B
CN110825383B (application CN201910975068.4A)
Authority
CN
China
Prior art keywords
script
language
host
compiled
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910975068.4A
Other languages
Chinese (zh)
Other versions
CN110825383A (en)
Inventor
董熠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd
Priority to CN201910975068.4A
Publication of CN110825383A
Application granted
Publication of CN110825383B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/40 Transformation of program code
    • G06F8/41 Compilation
    • G06F8/43 Checking; Contextual analysis
    • G06F8/433 Dependency analysis; Data or control flow analysis
    • G06F8/434 Pointers; Aliasing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F9/44526 Plug-ins; Add-ons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Stored Programmes (AREA)

Abstract

The invention provides a video interaction method, a video interaction device and a computer readable storage medium. The method comprises the following steps: acquiring a script file and an interactive video sent by a server; when the interactive video is played to a trigger time point of the interactive video, rendering a script paragraph corresponding to the trigger time point in the script file in the host language according to the mapping relation between the scripting language of the script file and the host language of the client, and generating an interactive interface; when a user operation event for the interactive interface is received, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph; and executing the execution event. By exploiting the universality of the scripting language of the script file and the mapping interchangeability between the scripting language and most host languages, the invention enables the script file to realize the interactive functions of the interactive video in the various host environments of the client, thereby improving the compatibility of the interactive video.

Description

Video interaction method and device and computer readable storage medium
Technical Field
The invention belongs to the technical field of computers, and particularly relates to a video interaction method, a video interaction device and a computer readable storage medium.
Background
Interactive video refers to integrating an interactive experience into a linear video through various technical means. With the improvement of broadband access speed and the maturing of multimedia playing technology, interactive video is being applied more and more widely.
In the prior art, different clients have multiple different host environments, and applying an interactive video to these different environments specifically includes: for the host environment of each client, developing a system control for the interactive video in that host environment, so that when the interactive video is played in that host environment, the interactive events in the interactive video are triggered through the corresponding system control.
However, in the current scheme, for each different type of host environment, a system control for the interactive video has to be developed from scratch in that host environment so that the system control matches the host environment; the system controls of different host environments are not mutually compatible, resulting in poor compatibility of the interactive video.
Disclosure of Invention
In view of the above, the present invention provides a video interaction method, apparatus and computer readable storage medium, which solve the problem of poor compatibility of interactive video in the current scheme to a certain extent.
According to a first aspect of the present invention, there is provided a video interaction method, the method may comprise:
acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video, and the trigger time point is used for triggering and generating a corresponding interactive interface;
when the interactive video is played to the trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering a script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface;
when receiving a user operation event aiming at the interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is the host language;
executing the execution event.
According to a second aspect of the present invention, there is provided a video interaction device, the device may comprise:
the acquisition module is used for acquiring the script file and the interactive video sent by the server, wherein at least one trigger time point is arranged on a time axis of the interactive video and is used for triggering and generating a corresponding interactive interface;
The rendering module is used for, when the interactive video is played to the trigger time point, rendering a script paragraph corresponding to the trigger time point in the script file in the host language according to the mapping relation between the scripting language of the script file and the host language of the client, and generating an interactive interface;
the determining module is used for determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph when receiving the user operation event aiming at the interactive interface, wherein the processing language of the execution event is the host language;
and the execution module is used for executing the execution event.
In a third aspect, an embodiment of the present invention provides a computer readable storage medium, where a computer program is stored on the computer readable storage medium, the computer program implementing the steps of the video interaction method according to the first aspect when being executed by a processor.
Compared with the prior art, the invention has the following advantages:
the invention provides a video interaction method, which comprises the following steps: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video; when the interactive video is played to a trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering the script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface; when receiving a user operation event aiming at an interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is a host language; executing the execution event. The invention can utilize the universality of the script language in the script file and the mapping interchangeability between the script language and most host languages, so that the script file can realize the corresponding interactive function of the interactive video on the premise of being applied to various host environments of the client, thereby improving the compatibility of the interactive video.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be more clearly understood and implemented in accordance with the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more apparent, specific embodiments of the present invention are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of steps of a video interaction method according to an embodiment of the present invention;
FIG. 2 is a system architecture diagram of a video interaction method according to an embodiment of the present invention;
FIG. 3 is an interface schematic diagram of a video interaction method according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating steps of another video interaction method according to an embodiment of the present invention;
FIG. 5 is a block diagram of a video interaction device according to an embodiment of the present invention;
FIG. 6 is a block diagram of a rendering module provided by an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Fig. 1 is a flowchart of the steps of a video interaction method provided in an embodiment of the present invention. The method is applied to a terminal and, as shown in fig. 1, may include:
Step 101, acquiring the script file and the interactive video sent by the server, wherein at least one trigger time point is arranged on a time axis of the interactive video.
The triggering time point is used for triggering and generating a corresponding interactive interface.
In the embodiment of the invention, the application of the interactive video in different host environments can be realized through a script file. The script file may be a Lua script file; Lua is designed to be flexibly embedded into application programs, providing flexible extension and customization capabilities for them. Lua is written in standard C (the C Programming Language) and can be compiled and run on almost all operating systems and platforms.
In addition, a Lua script file can easily call C-language or Java-language code, and can in turn be called back by C-language or Java-language functions, which allows Lua scripts to be used widely within application programs. A Lua script file can serve not only as an extension script but also as an ordinary configuration file, and is easier to understand and maintain.
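To illustrate this two-way calling on a C-language host, the following minimal sketch embeds a Lua 5.x state, runs an inline script chunk, lets the script call back into a host function, and lets the host call a script function. The function names and the script content are illustrative assumptions, not part of the original disclosure.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical host function exposed to the script (not from the patent). */
    static int host_log(lua_State *L) {
        const char *msg = luaL_checkstring(L, 1);   /* argument from the script */
        printf("[host] %s\n", msg);
        return 0;                                   /* number of return values  */
    }

    int main(void) {
        lua_State *L = luaL_newstate();             /* create a Lua state        */
        luaL_openlibs(L);                           /* load standard libraries   */
        lua_register(L, "host_log", host_log);      /* script -> host call       */

        /* A tiny script "paragraph" as an inline chunk (illustrative only). */
        const char *chunk =
            "function greet(name)\n"
            "  host_log('script called back into the host')\n"
            "  return 'hello ' .. name\n"
            "end\n";
        if (luaL_dostring(L, chunk) != LUA_OK) {
            fprintf(stderr, "load error: %s\n", lua_tostring(L, -1));
            return 1;
        }

        lua_getglobal(L, "greet");                  /* host -> script call */
        lua_pushstring(L, "player");
        if (lua_pcall(L, 1, 1, 0) == LUA_OK) {
            printf("script returned: %s\n", lua_tostring(L, -1));
            lua_pop(L, 1);
        }
        lua_close(L);
        return 0;
    }

The same pattern applies on a Java host through a Lua binding, with Java methods registered in place of the C functions.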
In an embodiment of the present invention, the host environment of the client to which the interactive video is applied includes, but is not limited to, a C-language host environment (an iOS system) or a Java-language host environment (an Android system). For the host environment of the client, the embodiment of the invention does not develop the controls of the interactive video directly in the host language of that environment; instead, control development is realized through a universal, independent script file, which describes the style of the interactive interface in the interactive video and the business logic of the interactive events in the interactive video. Because the script file is written in the Lua scripting language rather than the host language, applying it in the host environment requires object mapping between the scripting language of the script file and the host language of the client.
Referring to fig. 2, a system architecture diagram of a video interaction method provided by an embodiment of the present invention is shown, where the system architecture diagram includes: a client and a server. The client may be a mobile terminal, a personal computer, an application running on a computing device, or the like; the server may be a cloud server, a business server, or the like.
In one implementation manner of the embodiment of the invention, the server can establish a corresponding script file according to the interaction mode of the interactive video and actively transmit the script file and the interactive video to the client, so that the client passively acquires the script file and the interactive video. It should be noted that, in another implementation manner of the embodiment of the present invention, after the server establishes the corresponding script file according to the interaction mode of the interactive video, the server may further add the script file and the interactive video to the response to a script request sent by the client, so that the client actively obtains the script file and the interactive video.
The interactive video may be divided into a plurality of video clips, and the connection time point between adjacent video clips may serve as a trigger time point; when the interactive video is played to the trigger time point, the video clip to be played after the trigger time point needs to be determined according to the interaction with the user.
For example, an interactive video is divided into 3 video segments: segment A, segment B and segment C. When segment A is played, the ending time point of segment A is taken as a trigger time point, and an interactive interface offering option 1 and option 2 is generated; if the user selects option 1, segment B is further played after segment A; if the user selects option 2, segment C is further played after segment A.
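One way such a branch could be expressed in a script file is sketched below, with the branch table embedded as a C string for brevity; the table layout and field names are assumptions for illustration only, not the script format of this embodiment.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical branch description for the A/B/C example above. */
    static const char *branch_chunk =
        "branches = {\n"
        "  [1] = { option = 'Option 1', next_segment = 'B' },\n"
        "  [2] = { option = 'Option 2', next_segment = 'C' },\n"
        "}\n";

    /* Writes into `out` the segment to play after the user picks option `choice`. */
    static void next_segment(lua_State *L, int choice, char *out, size_t n) {
        lua_getglobal(L, "branches");
        lua_rawgeti(L, -1, choice);                 /* branches[choice]        */
        lua_getfield(L, -1, "next_segment");
        snprintf(out, n, "%s", lua_tostring(L, -1));
        lua_pop(L, 3);                              /* table, entry, string    */
    }

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaL_dostring(L, branch_chunk);
        char seg[16];
        next_segment(L, 1, seg, sizeof seg);
        printf("option 1 -> play segment %s\n", seg);
        next_segment(L, 2, seg, sizeof seg);
        printf("option 2 -> play segment %s\n", seg);
        lua_close(L);
        return 0;
    }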
Step 102, when the interactive video is played to the trigger time point, rendering a script paragraph corresponding to the trigger time point in the script file in the host language according to the mapping relation between the scripting language of the script file and the host language of the client, and generating an interactive interface.
In this step, in order to satisfy the object mapping between the scripting language of the scripting file and the host language of the client, the embodiment of the present invention may establish a mapping relationship between the scripting language and the host language in the preset memory space of the client according to the characteristics of the scripting language and the host language.
Specifically, the mapping relationship includes, but is not limited to, a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
The function mapping relation can comprise a specific function for realizing corresponding functions in the script file and a mapping relation between the specific function and a function for realizing the same function in a host environment of the client; the control mapping relation can comprise a mapping relation between a specific control for realizing a corresponding function in the script file and a control for realizing the same function in the host environment of the client; the class mapping relationship may include a mapping relationship between an entity class in the script file and an entity class in the hosting environment of the client. Through the three mapping relations, the script file compiled by the script language can be converted into a host language capable of being processed in the host environment of the client, and application of the script file in the host environment of the client is realized.
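On a C-language host, the function mapping and control mapping could, for instance, be realized by registering host-side functions under names that the script paragraphs may call; the following sketch assumes hypothetical names such as create_button and open_link, which are not disclosed by this embodiment.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical host-side control factory mapped to a script-visible name. */
    static int host_create_button(lua_State *L) {
        const char *label = luaL_checkstring(L, 1);
        /* A real client would create a native button control here. */
        printf("[host] native button created: %s\n", label);
        lua_pushinteger(L, 42);          /* illustrative control handle */
        return 1;
    }

    /* Hypothetical function mapping: open a URL through the host environment. */
    static int host_open_link(lua_State *L) {
        printf("[host] opening link: %s\n", luaL_checkstring(L, 1));
        return 0;
    }

    /* Install the mapping entries into a fresh script state. */
    static lua_State *new_mapped_state(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        lua_register(L, "create_button", host_create_button);  /* control mapping  */
        lua_register(L, "open_link", host_open_link);           /* function mapping */
        return L;
    }

    int main(void) {
        lua_State *L = new_mapped_state();
        /* A script paragraph can now drive native controls by name. */
        luaL_dostring(L, "local h = create_button('OK'); open_link('https://example.com')");
        lua_close(L);
        return 0;
    }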
In the embodiment of the invention, the script file can be used for realizing the interaction between the user and the interactive video when the interactive video is played to the trigger time point. And when the interactive video comprises a plurality of trigger time points, a plurality of script paragraphs corresponding to the trigger time points are included in the script file, and each script paragraph is used for realizing the interactive content corresponding to the trigger time points.
For example, an interactive video includes a trigger time point 1 and a trigger time point 2, and the script file includes a script paragraph a corresponding to the trigger time point 1 and a script paragraph B corresponding to the trigger time point 2, where the script paragraph a describes a code for generating the interactive interface a and the script paragraph B describes a code for generating the interactive interface B.
When the interactive video is played to the trigger time point 1, rendering the script paragraph A corresponding to the trigger time point 1 to generate an interactive interface a for display; when the interactive video is played to the trigger time point 2, rendering can be performed on the script paragraph B corresponding to the trigger time point 2, and an interactive interface B is generated for display.
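A possible shape of this correspondence is sketched below: script paragraphs are stored as functions keyed by their trigger time points, and the host looks up and runs the paragraph for the current point. The time values, paragraph contents and function names are illustrative assumptions.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Illustrative script file: paragraphs keyed by trigger time points (seconds). */
    static const char *script_file =
        "paragraphs = {}\n"
        "paragraphs[10] = function() return 'interactive interface a' end\n"
        "paragraphs[25] = function() return 'interactive interface b' end\n";

    /* Called by the player when playback reaches a trigger time point. */
    static void on_trigger_point(lua_State *L, int seconds) {
        lua_getglobal(L, "paragraphs");
        lua_rawgeti(L, -1, seconds);          /* look up the paragraph for this point */
        lua_remove(L, -2);                    /* drop the table, keep the paragraph   */
        if (lua_isfunction(L, -1) && lua_pcall(L, 0, 1, 0) == LUA_OK) {
            printf("t=%ds render: %s\n", seconds, lua_tostring(L, -1));
        }
        lua_pop(L, 1);                        /* result, error or non-function value  */
    }

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaL_dostring(L, script_file);
        on_trigger_point(L, 10);              /* trigger time point 1 */
        on_trigger_point(L, 25);              /* trigger time point 2 */
        lua_close(L);
        return 0;
    }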
Specifically, in the embodiment of the present invention, since the script file is compiled and generated by a scripting language, the scripting language is generally different from a hosting language of a hosting environment in the client, and the client cannot directly read and render a script paragraph of the script file. Therefore, in the embodiment of the invention, the client can convert the script paragraphs compiled by the script language into the host language which can be understood by the client for analysis and rendering according to the mapping relation between the script language and the host language stored in the client, so that the rendering of the script paragraphs is realized under the environment of the host language, and an interactive interface is generated.
Step 103, when receiving a user operation event aiming at the interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is the host language.
In the embodiment of the invention, the script paragraph corresponding to the interactive interface can define one or more user operation events and execution events corresponding to the user operation events. The client side can further find the execution event corresponding to the user operation event in the script paragraph through the mapping relation according to the user operation event executed by the user on the interactive interface.
For example, for a script paragraph corresponding to a trigger time point, the script paragraph describes a touch interface with a functional button 1 and a functional button 2, and defines that after the user triggers functional button 1 the execution event is to perform a confirmation operation, and after the user triggers functional button 2 the execution event is to perform a cancel operation. After the touch interface is rendered according to the script paragraph and displayed, if the touch operation of the user on functional button 1 is received, the client further determines that the corresponding execution event is the confirmation operation; if the touch operation of the user on functional button 2 is received, the corresponding execution event is further determined to be the cancel operation.
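A minimal sketch of such a definition follows, assuming the script paragraph exposes a table that maps user operation event names to execution event names; the names themselves are hypothetical.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical script paragraph: correspondence between user operation
       events and execution events (names are illustrative). */
    static const char *paragraph =
        "events = {\n"
        "  button_1 = 'confirm',\n"
        "  button_2 = 'cancel',\n"
        "}\n";

    /* Map a host-side user operation event to its execution event. */
    static void lookup_execution_event(lua_State *L, const char *user_event,
                                       char *out, size_t n) {
        lua_getglobal(L, "events");
        lua_getfield(L, -1, user_event);
        snprintf(out, n, "%s", lua_isstring(L, -1) ? lua_tostring(L, -1) : "none");
        lua_pop(L, 2);
    }

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaL_dostring(L, paragraph);
        char exec[32];
        lookup_execution_event(L, "button_1", exec, sizeof exec);
        printf("button_1 -> execution event: %s\n", exec);   /* confirm */
        lua_close(L);
        return 0;
    }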
Specifically, in the embodiment of the present invention, since the script file is compiled and generated by a scripting language, the scripting language is different from a host language of a host environment in the client, and the user operation event received by the client is usually based on the host language, the client cannot directly compile the user operation event according to the host language, and determines an execution event corresponding to the user operation event compiled by the host language in the script paragraph. Therefore, in the embodiment of the invention, the client can convert the user operation event compiled by the host language into the script language which can be understood by the script paragraph according to the mapping relation between the script language and the host language stored in the client, so that the execution event of the script language corresponding to the user operation event is determined in the script paragraph under the script language environment.
After determining the execution event of the scripting language, the execution event of the scripting language can be further converted into a host language which can be understood by the client through the mapping relation between the scripting language and the host language stored in the client, so as to obtain the execution event of the host language.
Step 104, executing the execution event.
In this step, since the step 103 outputs the execution event of the host language, the execution event can be executed in the host environment in the client to realize the corresponding function.
An execution event may include, but is not limited to, playing another video segment, performing a certain functional operation, presenting a link, and the like.
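On the host side, acting on the returned execution event could be as simple as the dispatcher sketched below; the event names and actions are illustrative examples drawn from the list above, not the concrete implementation of this embodiment.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical execution-event dispatcher in the host environment. */
    static void execute_event(const char *event, const char *arg) {
        if (strcmp(event, "play_segment") == 0) {
            printf("player: switching to video segment %s\n", arg);
        } else if (strcmp(event, "open_link") == 0) {
            printf("host: presenting link %s\n", arg);
        } else if (strcmp(event, "confirm") == 0) {
            printf("host: performing confirmation operation\n");
        } else {
            printf("host: unknown execution event '%s'\n", event);
        }
    }

    int main(void) {
        execute_event("play_segment", "B");
        execute_event("open_link", "https://example.com/extra");
        return 0;
    }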
Specifically, referring to fig. 3, an interface diagram of a video interaction method provided by the embodiment of the present invention is shown. An interactive video 20 is being played in a screen 10 of a client; on the time axis 21 of the interactive video 20, playback has currently reached a trigger time point X. At this time, according to the script paragraph corresponding to the trigger time point, the client may render the script paragraph in the host language, generate an interactive interface 30 and display it, where the interactive interface 30 includes two interface elements: a button 31 and a button 32. If the user presses button 31 in the interactive interface 30, video clip A may be further played after trigger time point X; if the user presses button 32, video clip B may be further played after trigger time point X.
In the embodiment of the invention, owing to the universality of the scripting language and the mapping interchangeability between the scripting language and most host languages, the script file can be applied to the various host environments of the client. When the interaction mode of the interactive video needs to be updated, the server only needs to update the original script file to obtain a new script file and transmit the new script file to the client; the original script file is thus updated without considering the influence of the host environment. The updating operation is simpler, the interaction mode of the interactive video can be versioned and updated, the reusability of the interactive video is improved, and quick iteration of the interactive functions of the interactive video is realized.
In summary, the video interaction method provided by the embodiment of the invention includes: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video; when the interactive video is played to a trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering the script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface; when receiving a user operation event aiming at an interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is a host language; executing the execution event. The invention can utilize the universality of the script language in the script file and the mapping interchangeability between the script language and most host languages, so that the script file can realize the corresponding interactive function of the interactive video on the premise of being applied to various host environments of the client, thereby improving the compatibility of the interactive video.
Fig. 4 is a flowchart of steps of another video interaction method according to an embodiment of the present invention, as shown in fig. 4, the method may include:
Step 401, acquiring the script file and the interactive video sent by the server, wherein at least one trigger time point is set on a time axis of the interactive video.
This step may refer to step 101, and will not be described herein.
Optionally, before step 403, the method may further include:
step 402, building a stack structure model according to the scripting language and the host language, wherein the stack structure model comprises a mapping relation between the scripting language and the host language.
In the embodiment of the invention, the script file comprises a plurality of script paragraphs constructed in the scripting language. According to the business logic of the interactive video, a script paragraph comprises a number of code statements among which a certain order relation holds, and the script paragraph must be parsed and executed strictly according to that order. Therefore, when mapping conversion is performed on the script file, or on a host object in the host environment, according to the mapping relation between the scripting language and the host language, the conversion must also follow that order relation.
In particular, a stack is a data structure in which data items are arranged in sequence and can be inserted and deleted only at one end, called the stack top. A stack structure model may therefore include a stack data structure for storing data items that have a defined order and outputting them in that order.
In the embodiment of the invention, a stack structure model may be established in a preset storage space of the client. The stack structure model is provided with two storage structures, a heap and a stack. The heap can be seen as a tree (as in heap sort); its advantage is that memory can be allocated dynamically and the lifetime does not have to be told to the compiler in advance. The stack is a first-in, last-out data structure; its advantage is faster access.
For data items input into the stack structure model, they can be imported into storage spaces of the corresponding order in stack order, and in each storage space a data item can be converted from the host language into the scripting language, or from the scripting language into the host language, based on the mapping relation between the scripting language and the host language included in the stack structure model.
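Lua's virtual stack is one concrete example of such a stack-based exchange between the host language and the scripting language: the host pushes arguments onto the stack in order, the call consumes them, and the results are read back from the stack top. A minimal sketch, with an illustrative script function:

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);

        /* Illustrative script function that the host will call via the stack. */
        luaL_dostring(L, "function scale(x, factor) return x * factor end");

        lua_getglobal(L, "scale");      /* push the function            */
        lua_pushnumber(L, 3.0);         /* push arguments in call order */
        lua_pushnumber(L, 2.5);
        if (lua_pcall(L, 2, 1, 0) == LUA_OK) {    /* 2 args in, 1 result out */
            printf("result popped from the stack: %g\n", lua_tonumber(L, -1));
            lua_pop(L, 1);
        } else {
            fprintf(stderr, "error: %s\n", lua_tostring(L, -1));
            lua_pop(L, 1);
        }
        lua_close(L);
        return 0;
    }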
Optionally, the mapping relationship includes: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
The function mapping relation can comprise a specific function for realizing corresponding functions in the script file and a mapping relation between the specific function and a function for realizing the same function in a host environment of the client; the control mapping relation can comprise a mapping relation between a specific control for realizing a corresponding function in the script file and a control for realizing the same function in the host environment of the client; the class mapping relationship may include a mapping relationship between an entity class in the script file and an entity class in the hosting environment of the client. Through the three mapping relations, the script file compiled by the script language can be converted into a host language capable of being processed in the host environment of the client, and application of the script file in the host environment of the client is realized.
Step 403, determining a script paragraph corresponding to the trigger time point in the script file according to the first correspondence when the interactive video is played to the trigger time point.
Wherein, the script file includes: and a first corresponding relation between the script paragraph and the triggering time point.
In this step, when the interactive video includes a plurality of trigger time points, the script file includes a first correspondence between the script paragraphs and the trigger time points, and each script paragraph is used for realizing the interactive content corresponding to its trigger time point.
When the interactive video is played to the trigger time point, according to the first corresponding relation, a script paragraph corresponding to the trigger time point in the script file can be determined.
Specifically, when the interactive video is played to a trigger time point, it is played into the corresponding interactive interval, namely the moment or period corresponding to the trigger time point. At this time, an entry function of the script-file class can be triggered, an interactive view layer based on the script file is created, and a player callback is registered to the interactive view layer. A callback is a function invoked through a function pointer: a method passed as a parameter to another method and called when a specific event or condition occurs, in order to respond to that event or condition.
In the embodiment of the invention, registering the player callback with the interactive view layer can be understood as informing the player, through the callback function, how the script file is to be parsed, so that the player can translate script paragraphs written in the scripting language into script paragraphs in the host language.
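A sketch of this registration on a C-language host follows: a hypothetical player callback is registered into the script state, and an assumed entry function of the script file is invoked when playback enters the interactive interval. None of the names are taken from the original disclosure.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical player callback exposed to the interactive view layer:
       the script asks the player to jump to a segment. */
    static int player_play_segment(lua_State *L) {
        printf("[player callback] play segment %s\n", luaL_checkstring(L, 1));
        return 0;
    }

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        lua_register(L, "play_segment", player_play_segment);  /* register callback */

        /* Illustrative entry function of the script file: invoked when playback
           enters the interactive interval of a trigger time point. */
        luaL_dostring(L,
            "function on_enter_interval(t)\n"
            "  print('creating interactive view layer for trigger point ' .. t)\n"
            "  play_segment('A')   -- callback into the player\n"
            "end\n");

        lua_getglobal(L, "on_enter_interval");
        lua_pushinteger(L, 10);                  /* trigger time point, in seconds */
        lua_pcall(L, 1, 0, 0);
        lua_close(L);
        return 0;
    }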
Step 404, inputting the script paragraph into the stack structure model, and outputting the script paragraph compiled by the host language.
For script paragraphs input into the stack structure model, codes in the script paragraphs can be imported into storage spaces of corresponding orders in the stack structure model according to the order of the stack, and in each storage space, the codes can be converted from the script language into a host language based on a mapping relation between the script language and the host language included in the stack structure model, so that script paragraphs compiled by the host language can be obtained, and the script paragraphs compiled by the host language can be analyzed and processed by a host environment of a client.
Step 405, rendering the script paragraph compiled by the host language to generate the interactive interface.
In the embodiment of the invention, the script paragraph describes the interactive interface. By parsing the script paragraph in the host language, the client obtains an interface document containing a plurality of interface elements; the interactive interface can then be generated by adding corresponding interface data to the interface elements in the interface document. The interactive interface is used, in combination with the user operation events of the user, to trigger the execution events corresponding to those user operation events and realize the interactive functions of the interactive video.
Optionally, step 405 may specifically include:
sub-step 4051, parsing the script paragraph compiled in the host language to obtain a class object, a control object and a function object of the script paragraph compiled in the host language.
In the embodiment of the invention, the class objects, control objects and function objects of the script paragraph in the host language can be obtained by parsing the script paragraph in the host language. A class is an abstraction of a group of things or events that share common characteristics; a class object is an instantiation of a class, and all objects of a class correspond to the same class object. A control can be understood as a component, or tool, in an interface, and a control object is an instantiation of a control. A class that overloads the function call operator produces objects that are often called function objects, i.e. objects that behave like functions, also known as functors; a function object is an object designed so that it can be called like an ordinary function. These three types of objects are the basis for constructing the relevant interface elements in the interactive interface and may exist in the form of structural tags.
Sub-step 4052, generating at least one interface element from the script paragraph, the class object, the control object, and the function object compiled by the host language.
In this step, class objects, control objects and function objects may be built into corresponding interface elements according to the interface structure rules defined in the script paragraph in the host language. The interface structure rules comprise layout constraints and business logic constraints for building class objects, control objects and function objects into the corresponding interface elements: the layout constraints define the appearance form of the generated interface element (such as a button or a dialog box), and the business logic constraints define the business logic of the generated interface element (such as whether the element is used for triggering a confirm or cancel operation).
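The sketch below assumes a script paragraph that describes one control as a table with a kind, a label and a bound function name, and shows how a host could read those parsed objects into an interface element structure; the table format and field names are illustrative assumptions.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Hypothetical host-side interface element built from the parsed objects. */
    struct interface_element {
        char kind[16];      /* control object, e.g. "button"        */
        char label[32];     /* appearance constraint                */
        char on_click[32];  /* function object bound to the element */
    };

    /* Illustrative script paragraph describing one control (not the real format). */
    static const char *paragraph =
        "element = { kind = 'button', label = 'OK', on_click = 'confirm' }";

    static void read_field(lua_State *L, const char *key, char *out, size_t n) {
        lua_getfield(L, -1, key);
        snprintf(out, n, "%s", lua_tostring(L, -1));
        lua_pop(L, 1);
    }

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaL_dostring(L, paragraph);

        struct interface_element e;
        lua_getglobal(L, "element");          /* the table describing the control */
        read_field(L, "kind", e.kind, sizeof e.kind);
        read_field(L, "label", e.label, sizeof e.label);
        read_field(L, "on_click", e.on_click, sizeof e.on_click);
        lua_pop(L, 1);

        printf("interface element: %s '%s' -> %s\n", e.kind, e.label, e.on_click);
        lua_close(L);
        return 0;
    }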
Sub-step 4053, obtaining interface data corresponding to the interface element.
In this step, referring to fig. 2, after the client generates the interface element, it needs to add corresponding interface data to the interface element so that the element can take the required appearance and implement the corresponding business logic. The client may send a data request to the server; after receiving the data request, the server may add the interface data to the response in JavaScript Object Notation (JSON) form, and the client extracts the corresponding interface data from the response.
It should be noted that, the client may also build an interface database locally in advance, and obtain the interface data from the local interface database when the interface data is needed.
Sub-step 4054, generating said interactive interface from said interface element and said interface data.
In this step, a rendering operation is carried out according to the interface elements and the interface data, so that the interactive interface is obtained. In this process the interface elements take their required appearance, and when an interface element in the interactive interface receives a user interaction event, it triggers the corresponding business logic.
Step 406, when receiving a user operation event for the interactive interface, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the scripting language.
Specifically, when receiving the user operation event aiming at the interactive interface, the callback function can further inform the stack structure model of a method for mutually converting the user operation event compiled by the scripting language and the user operation event compiled by the host language, so that the stack structure model can mutually convert the user operation event compiled by the scripting language and the user operation event compiled by the host language.
In the embodiment of the invention, because the user operation event received by the client is usually generated based on the host language, the client cannot directly compile the user operation event according to the host language, and the execution event corresponding to the user operation event compiled by the host language is determined in the script paragraph. Therefore, in the embodiment of the invention, the client can input the user operation event compiled by the host language into the stack structure model, and convert the user operation event compiled by the host language into the script language which can be understood by the script paragraph according to the mapping relation between the script language and the host language stored in the stack structure model, so as to obtain the user operation event compiled by the script language, and further determine the execution event of the script language corresponding to the user operation event in the script paragraph under the script language environment.
Step 407, determining an execution event compiled by the scripting language according to the user operation event compiled by the scripting language and the second corresponding relation.
Wherein, script paragraph includes: and a second correspondence between the user operation event and the execution event.
In the embodiment of the invention, after determining the user operation event compiled by the scripting language, the execution event of the scripting language corresponding to the user operation event compiled by the scripting language can be further determined through the second corresponding relation between the user operation event compiled by the scripting language and the execution event of the scripting language defined in the scripting paragraph.
Step 408, inputting the execution event compiled by the scripting language into the stack structure model, and outputting the execution event compiled by the host language.
In the embodiment of the present invention, since the execution event of the scripting language obtained in step 407 is generated based on the scripting language, the client cannot directly process the execution event of the scripting language, so in the embodiment of the present invention, the client may input the execution event of the scripting language into the stack structure model, and convert the execution event of the scripting language into the execution event of the host language that the client can understand and process according to the mapping relationship between the scripting language and the host language stored in the stack structure model, so as to further execute the execution event of the host language in the host environment of the client.
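Steps 406 to 408 can be condensed into the following sketch on a C-language host: the user operation event is pushed onto the stack, an assumed handler in the script paragraph applies the second correspondence, and the resulting execution event is read back in the host language. All names are illustrative.

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Illustrative script paragraph: maps user operation events to execution
       events (the second correspondence). */
    static const char *paragraph =
        "function handle_event(user_event)\n"
        "  local map = { button_31 = 'play_segment_A', button_32 = 'play_segment_B' }\n"
        "  return map[user_event] or 'none'\n"
        "end\n";

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaL_dostring(L, paragraph);

        /* Step 406: push the host-language user operation event onto the stack. */
        lua_getglobal(L, "handle_event");
        lua_pushstring(L, "button_31");

        /* Step 407: the script determines the execution event. */
        if (lua_pcall(L, 1, 1, 0) == LUA_OK) {
            /* Step 408: read the execution event back into the host language. */
            printf("execution event for the host: %s\n", lua_tostring(L, -1));
            lua_pop(L, 1);
        }
        lua_close(L);
        return 0;
    }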
Step 409, executing the execution event.
This step may refer to step 104 described above, and will not be described here again.
In summary, the video interaction method provided by the embodiment of the invention includes: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video; when the interactive video is played to a trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering the script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface; when receiving a user operation event aiming at an interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is a host language; executing the execution event. The invention can utilize the universality of the script language in the script file and the mapping interchangeability between the script language and most host languages, so that the script file can realize the corresponding interactive function of the interactive video on the premise of being applied to various host environments of the client, thereby improving the compatibility of the interactive video.
Fig. 5 is a block diagram of a video interaction device according to an embodiment of the present invention, and as shown in fig. 5, the device 50 may include:
the obtaining module 501 is configured to obtain the script file and the interactive video sent by the server, where at least one trigger time point is set on a time axis of the interactive video;
and the building module 502 is configured to build a stack structure model according to the scripting language and the host language, where the stack structure model includes a mapping relationship between the scripting language and the host language.
Optionally, the mapping relationship includes: one or more of a functional mapping relationship between the scripting language and the host language, a control mapping relationship between the scripting language and the host language, and a class mapping relationship between the scripting language and the host language.
A rendering module 503, configured to render, when the interactive video is played to the trigger time point, a script paragraph corresponding to the trigger time point in the script file in the host language according to a mapping relationship between a scripting language of the script file and a host language of a client, and generate an interactive interface;
A determining module 504, configured to determine, when a user operation event for the interactive interface is received, an execution event corresponding to the user operation event according to the mapping relationship and the script paragraph, where a processing language of the execution event is the host language;
optionally, the script paragraph includes: a second correspondence between the user operation event and the execution event; the determining module 504 is specifically configured to:
when receiving a user operation event aiming at the interactive interface, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the scripting language;
determining an execution event compiled by the scripting language according to the user operation event compiled by the scripting language and the second corresponding relation;
and inputting the execution event compiled by the script language into the stack structure model, and outputting the execution event compiled by the host language.
An execution module 505 is configured to execute the execution event.
Optionally, referring to fig. 6, fig. 6 is a block diagram of a rendering module provided by an embodiment of the present invention, where the script file includes: a first correspondence between the script paragraph and the trigger time point; the rendering module 503 includes:
A determining submodule 5031, configured to determine, according to the first correspondence, a script paragraph corresponding to the trigger time point in the script file when the interactive video is played to the trigger time point;
a stack submodule 5032, configured to input the script paragraph into the stack structure model and output a script paragraph compiled in the host language;
and a rendering submodule 5033, configured to render the script paragraph compiled by the host language, and generate the interactive interface.
Optionally, the rendering submodule 5033 is specifically configured to:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to script paragraphs compiled by the host language, the class objects, the control objects and the function objects;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface element and the interface data.
In summary, the video interaction device provided by the embodiment of the present invention includes: acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video; when the interactive video is played to a trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering the script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface; when receiving a user operation event aiming at an interactive interface, determining an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is a host language; executing the execution event. The invention can utilize the universality of the script language in the script file and the mapping interchangeability between the script language and most host languages, so that the script file can realize the corresponding interactive function of the interactive video on the premise of being applied to various host environments of the client, thereby improving the compatibility of the interactive video.
As the above-described device embodiment is substantially similar to the method embodiments, its description is relatively simple; for relevant details, reference may be made to the description of the method embodiments.
Preferably, the embodiment of the present invention further provides a terminal, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program when executed by the processor implements each process of the embodiment of the video interaction method, and the same technical effects can be achieved, so that repetition is avoided, and no redundant description is given here.
The embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the processes of the video interaction method embodiment described above and can achieve the same technical effects; to avoid repetition, details are not described here again. The computer readable storage medium may be, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
In this specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts between the embodiments reference may be made to one another.
As will be readily appreciated by those skilled in the art: any combination of the above embodiments is possible and constitutes an embodiment of the present invention, but for reasons of space such combinations are not described one by one in this specification.
The video interaction methods provided herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The required structure for a system constructed with aspects of the present invention will be apparent from the description above. In addition, the present invention is not directed to any particular programming language. It will be appreciated that the teachings of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in a video interaction method according to embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The words first, second, third, etc. do not denote any order; they may be interpreted as names.

Claims (9)

1. A method of video interaction, the method comprising:
acquiring a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video, and the trigger time point is used for triggering and generating a corresponding interactive interface;
Establishing a stack structure model according to a script language and a host language, wherein the stack structure model comprises a mapping relation between the script language and the host language;
when the interactive video is played to the trigger time point, according to the mapping relation between the script language of the script file and the host language of the client, rendering a script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface; the script paragraph includes: a second correspondence between user operation events and execution events;
when receiving a user operation event aiming at the interactive interface, inputting the user operation event into the stack structure model, and outputting the user operation event compiled by the scripting language; determining an execution event compiled by the scripting language according to the user operation event compiled by the scripting language and the second corresponding relation; inputting the execution event compiled by the script language into the stack structure model, and outputting the execution event compiled by the host language; the processing language of the execution event is the host language;
executing the execution event.
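To make the flow of claim 1 concrete, the following is a minimal sketch in Kotlin, assuming a text-based script language and Kotlin as the client's host language. All names (StackStructureModel, ScriptParagraph, onUserOperation) and the event strings are hypothetical illustrations, not the claimed implementation.

```kotlin
/** Maps event identifiers between the script language and the host language in both directions. */
class StackStructureModel(private val scriptToHost: Map<String, String>) {
    private val hostToScript = scriptToHost.entries.associate { (s, h) -> h to s }
    private val stack = ArrayDeque<String>()

    /** Push a host-language event, pop its script-language counterpart. */
    fun toScript(hostEvent: String): String {
        stack.addLast(hostEvent)
        return hostToScript[stack.removeLast()] ?: error("no mapping for $hostEvent")
    }

    /** Push a script-language event, pop its host-language counterpart. */
    fun toHost(scriptEvent: String): String {
        stack.addLast(scriptEvent)
        return scriptToHost[stack.removeLast()] ?: error("no mapping for $scriptEvent")
    }
}

/** A script paragraph tied to a trigger time point, carrying the second correspondence. */
data class ScriptParagraph(
    val triggerTimeMs: Long,
    val userOpToExecEvent: Map<String, String>  // second correspondence, in script-language terms
)

fun onUserOperation(hostUserOp: String, paragraph: ScriptParagraph, model: StackStructureModel) {
    val scriptUserOp = model.toScript(hostUserOp)                             // host -> script
    val scriptExecEvent = paragraph.userOpToExecEvent.getValue(scriptUserOp)  // second correspondence
    val hostExecEvent = model.toHost(scriptExecEvent)                         // script -> host
    println("executing host event: $hostExecEvent")                           // stand-in for executing the event
}

fun main() {
    val model = StackStructureModel(mapOf("on_click" to "onClick", "play_branch" to "playBranch"))
    val paragraph = ScriptParagraph(
        triggerTimeMs = 30_000,
        userOpToExecEvent = mapOf("on_click" to "play_branch")
    )
    onUserOperation(hostUserOp = "onClick", paragraph = paragraph, model = model)
}
```

In this reading, the stack structure model acts as a bidirectional translation table: a host-language user operation event is pushed and its script-language counterpart popped, the second correspondence selects the execution event, and the model translates that event back into the host language for execution.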
2. The method of claim 1, wherein the mapping relation comprises: one or more of a function mapping relation between the script language and the host language, a control mapping relation between the script language and the host language, and a class mapping relation between the script language and the host language.
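As an illustration of the three mapping kinds named in claim 2, the sketch below groups them into one structure; all entries are hypothetical examples (the host class com.example.player.HostPlayer in particular is invented for illustration).

```kotlin
// Hypothetical grouping of the function, control, and class mappings of claim 2.
data class MappingRelation(
    val functionMap: Map<String, String>,  // script function name -> host method name
    val controlMap: Map<String, String>,   // script control name  -> host control/view class
    val classMap: Map<String, String>      // script class name    -> host class name
)

val exampleMapping = MappingRelation(
    functionMap = mapOf("show_dialog" to "showDialog"),
    controlMap = mapOf("button" to "android.widget.Button"),
    classMap = mapOf("Player" to "com.example.player.HostPlayer")  // hypothetical host class
)

fun main() = println(exampleMapping)
```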
3. The method of claim 1, wherein the script file comprises: a first correspondence between the script paragraph and the trigger time point;
when the interactive video is played to the trigger time point, according to a mapping relation between a script language of the script file and a host language of the client, rendering a script paragraph corresponding to the trigger time point in the script file according to the host language, and generating an interactive interface, wherein the method comprises the following steps:
when the interactive video is played to the trigger time point, determining a script paragraph corresponding to the trigger time point in the script file according to the first correspondence;
inputting the script paragraph into the stack structure model, and outputting the script paragraph compiled by the host language;
and rendering the script paragraphs compiled by the host language to generate the interactive interface.
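One possible reading of claim 3 is sketched below: the first correspondence is modeled as a lookup from trigger time to the raw script paragraph, which is then translated by the stack structure model and rendered. ScriptFile, compileToHost, and render are hypothetical placeholders, not the claimed implementation.

```kotlin
// Sketch of claim 3, assuming the first correspondence is a map from trigger time (ms) to paragraph text.
class ScriptFile(private val paragraphsByTrigger: Map<Long, String>) {  // first correspondence
    fun paragraphAt(triggerTimeMs: Long): String? = paragraphsByTrigger[triggerTimeMs]
}

fun onPlaybackReached(triggerTimeMs: Long, scriptFile: ScriptFile) {
    val scriptParagraph = scriptFile.paragraphAt(triggerTimeMs) ?: return  // 1. look up the paragraph
    val hostParagraph = compileToHost(scriptParagraph)                     // 2. through the stack structure model
    render(hostParagraph)                                                  // 3. render the interactive interface
}

// Placeholders standing in for the stack structure model and the rendering step.
fun compileToHost(scriptParagraph: String): String = scriptParagraph
fun render(hostParagraph: String) = println("rendering: $hostParagraph")

fun main() = onPlaybackReached(30_000L, ScriptFile(mapOf(30_000L to "choice_panel { button }")))
```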
4. The method of claim 3, wherein rendering the script paragraphs compiled by the host language to generate the interactive interface comprises:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to script paragraphs compiled by the host language, the class objects, the control objects and the function objects;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface element and the interface data.
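The four steps of claim 4 might be organized as follows; the types, the parse routine, and fetchInterfaceData are hypothetical stand-ins for the claimed parsing, element generation, data acquisition, and interface assembly.

```kotlin
// Possible shape of the four rendering steps of claim 4 (hypothetical names and placeholder logic).
data class ParsedParagraph(
    val classObjects: List<String>,
    val controlObjects: List<String>,
    val functionObjects: List<String>
)
data class InterfaceElement(val control: String, val boundFunction: String?)
data class InteractiveInterface(val elements: List<InterfaceElement>, val data: Map<String, String>)

fun buildInterface(hostParagraph: String): InteractiveInterface {
    val parsed = parse(hostParagraph)                                    // 1. class/control/function objects
    val elements = parsed.controlObjects.map { control ->                // 2. generate interface elements
        InterfaceElement(control, parsed.functionObjects.firstOrNull())
    }
    val data = elements.associate { it.control to fetchInterfaceData(it.control) }  // 3. interface data
    return InteractiveInterface(elements, data)                          // 4. assemble the interactive interface
}

// Placeholder parser and data lookup.
fun parse(hostParagraph: String) = ParsedParagraph(
    classObjects = listOf("ChoicePanel"),
    controlObjects = listOf("button"),
    functionObjects = listOf("playBranch")
)
fun fetchInterfaceData(control: String): String = "data-for-$control"

fun main() = println(buildInterface("compiled paragraph"))
```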
5. A video interaction device, the device comprising:
an acquisition module, configured to acquire a script file and an interactive video sent by a server, wherein at least one trigger time point is arranged on a time axis of the interactive video, and the trigger time point is used for triggering generation of a corresponding interactive interface;
a rendering module, configured to, when the interactive video is played to the trigger time point, render script paragraphs corresponding to the trigger time point in the script file according to a mapping relation between a script language of the script file and a host language of the client, and generate an interactive interface according to the host language;
a determining module, configured to, when a user operation event for the interactive interface is received, determine an execution event corresponding to the user operation event according to the mapping relation and the script paragraph, wherein the processing language of the execution event is the host language;
an execution module, configured to execute the execution event;
a building module, configured to establish a stack structure model according to the script language and the host language, wherein the stack structure model comprises the mapping relation between the script language and the host language;
the script paragraph includes: a second correspondence between the user operation event and the execution event; the determining module is specifically configured to:
when receiving a user operation event for the interactive interface, input the user operation event into the stack structure model, and output the user operation event compiled by the script language;
determine an execution event compiled by the script language according to the user operation event compiled by the script language and the second correspondence;
and input the execution event compiled by the script language into the stack structure model, and output the execution event compiled by the host language.
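For the apparatus side, a compact wiring sketch of how the determining and execution modules of claim 5 could cooperate is shown below; every interface and event string here is hypothetical.

```kotlin
// Hypothetical composition of two of the modules named in claim 5.
interface DeterminingModule { fun determine(hostUserOp: String): String }  // user operation -> host execution event
interface ExecutionModule { fun execute(hostExecEvent: String) }

class VideoInteractionDevice(
    private val determining: DeterminingModule,
    private val execution: ExecutionModule
) {
    // When a user operation event arrives from the interactive interface: determine, then execute.
    fun onUserOperation(hostUserOp: String) = execution.execute(determining.determine(hostUserOp))
}

fun main() {
    val device = VideoInteractionDevice(
        determining = object : DeterminingModule {
            override fun determine(hostUserOp: String) = "playBranch"  // placeholder for the second correspondence
        },
        execution = object : ExecutionModule {
            override fun execute(hostExecEvent: String) = println("executing $hostExecEvent")
        }
    )
    device.onUserOperation("onClick")
}
```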
6. The apparatus of claim 5, wherein the mapping relation comprises: one or more of a function mapping relation between the script language and the host language, a control mapping relation between the script language and the host language, and a class mapping relation between the script language and the host language.
7. The apparatus of claim 5, wherein the script file comprises: a first correspondence between the script paragraph and the trigger time point; the rendering module includes:
a determining sub-module, configured to determine, when the interactive video is played to the trigger time point, script paragraphs corresponding to the trigger time point in the script file according to the first correspondence;
a stack sub-module, configured to input the script paragraph into the stack structure model, and output a script paragraph compiled by the host language;
and a rendering sub-module, configured to render the script paragraphs compiled by the host language and generate the interactive interface.
8. The apparatus of claim 7, wherein the rendering sub-module is specifically configured to:
analyzing the script paragraphs compiled by the host language to obtain class objects, control objects and function objects of the script paragraphs compiled by the host language;
generating at least one interface element according to script paragraphs compiled by the host language, the class objects, the control objects and the function objects;
acquiring interface data corresponding to the interface elements;
and generating the interactive interface according to the interface element and the interface data.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the video interaction method according to any of claims 1 to 4.
CN201910975068.4A 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium Active CN110825383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910975068.4A CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910975068.4A CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110825383A CN110825383A (en) 2020-02-21
CN110825383B true CN110825383B (en) 2023-08-22

Family

ID=69549150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910975068.4A Active CN110825383B (en) 2019-10-14 2019-10-14 Video interaction method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110825383B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112040296A (en) * 2020-09-06 2020-12-04 刘承昊 Interactive content playing platform
CN114007146A (en) * 2021-10-29 2022-02-01 湖南快乐阳光互动娱乐传媒有限公司 Interactive method and device of interactive video, storage medium and electronic equipment
CN115460468B (en) * 2022-08-10 2023-09-15 北京爱奇艺科技有限公司 Interactive video file creation method, interactive video playing method, device, electronic equipment and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1763717A (en) * 2005-11-24 2006-04-26 北京中星微电子有限公司 System and method for calling host software functions by using script and its compiler
WO2017028720A1 (en) * 2015-08-19 2017-02-23 阿里巴巴集团控股有限公司 Object sending method and device
CN108345458A (en) * 2018-01-25 2018-07-31 微梦创科网络科技(中国)有限公司 A kind of call method and system of static compilation language and script
CN108933969A (en) * 2018-07-25 2018-12-04 深圳市茁壮网络股份有限公司 A kind of method and system for realizing digital TV video frequency animation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106946A1 (en) * 2005-11-07 2007-05-10 Philip Goetz Method and system for developing interactive Web applications in a unified framework

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1763717A (en) * 2005-11-24 2006-04-26 北京中星微电子有限公司 System and method for calling host software functions by using script and its compiler
WO2017028720A1 (en) * 2015-08-19 2017-02-23 阿里巴巴集团控股有限公司 Object sending method and device
CN108345458A (en) * 2018-01-25 2018-07-31 微梦创科网络科技(中国)有限公司 A kind of call method and system of static compilation language and script
CN108933969A (en) * 2018-07-25 2018-12-04 深圳市茁壮网络股份有限公司 A kind of method and system for realizing digital TV video frequency animation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Chengzhong; Li Min. Research on the Application of Scripting Languages in Complex User Interface Production. Information Technology and Informatization, 2007, (03), 48-50. *

Also Published As

Publication number Publication date
CN110825383A (en) 2020-02-21

Similar Documents

Publication Publication Date Title
CN110825383B (en) Video interaction method and device and computer readable storage medium
US9715370B2 (en) Method and system for providing content
US9977770B2 (en) Conversion of a presentation to Darwin Information Typing Architecture (DITA)
US8615750B1 (en) Optimizing application compiling
US8965890B2 (en) Context sensitive media and information
US9823908B2 (en) Apparatus for providing framework to develop client application executed on multiple platforms, and method using the same
CN110209967B (en) Page loading method and device, terminal equipment and computer readable medium
CN110020329B (en) Method, device and system for generating webpage
CN111068328A (en) Game advertisement configuration table generation method, terminal device and medium
CN109582317B (en) Method and apparatus for debugging hosted applications
KR20060047998A (en) Method and system for embedding context information in a document
CN114036439A (en) Website building method, device, medium and electronic equipment
US20200110584A1 (en) Automated code generation for functional testing of software applications
CN105005596B (en) page display method and device
US20030069998A1 (en) Motion services protocol accessible through uniform resource locator (URL)
JP4702835B2 (en) Web service customization system
CN115098092A (en) Page generation method, device, equipment and storage medium
CN110489124B (en) Source code execution method, source code execution device, storage medium and computer equipment
CN112068895A (en) Code configuration method and device, video playing equipment and storage medium
CN112714148A (en) Interface configuration method, device, equipment and medium
CN112306324B (en) Information processing method, apparatus, device and medium
CN115309376B (en) Application creation method and device, electronic equipment and storage medium
Sikora Dart Essentials
EP3872630B1 (en) Request processing method and apparatus, electronic device, and computer storage medium
KR100723913B1 (en) Structured data broadcasting application, recording medium thereof, and operating method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant