CN113064590A - Method and device for processing interactive components in interactive video - Google Patents


Info

Publication number
CN113064590A
Authority
CN
China
Prior art keywords
interactive, component template, template, interactive component, component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911364509.3A
Other languages
Chinese (zh)
Other versions
CN113064590B (en)
Inventor
Lei Bin (雷彬)
Liu Jiaxin (刘嘉鑫)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911364509.3A
Publication of CN113064590A
Application granted
Publication of CN113064590B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/38 — Creation or generation of source code for implementing user interfaces
    • G06F 9/4451 — Configuring for program initiating: user profiles; roaming
    • G06F 9/451 — Execution arrangements for user interfaces

Abstract

An embodiment of the present application provides a method for processing an interactive component in an interactive video. The method comprises the following steps: displaying a configuration information editing interface for an interactive component template in response to an editing request for the template; displaying a code editing interface for the template based on the configuration information obtained through the configuration information editing interface; and generating and storing a target interactive component template based on the component code obtained through the code editing interface. With this technical solution, a user can generate an interactive component template according to actual requirements when producing an interactive video, ensuring that the template fits the interactive video.

Description

Method and device for processing interactive components in interactive video
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for processing an interactive component in an interactive video.
Background
With the development of multimedia playing technology, interactive video has emerged, allowing the audience to participate in the development of the plot of a video. Viewers can select a plot line of interest through the interactive components in an interactive video, thereby interacting with it. However, how to determine an appropriate interactive component for a given interactive video, so as to ensure that the component fits the video, is a technical problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the present application provide a method and device for processing interactive components in an interactive video, so that an appropriate interactive component can, at least to some extent, be determined according to the interactive video, ensuring the degree of fit between the video and the component.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of an embodiment of the present application, there is provided a processing method for an interactive component in an interactive video, the processing method including:
displaying a configuration information editing interface of the interactive component template according to an editing request of the interactive component template;
displaying a code editing interface of the interaction component template based on the configuration information acquired by the configuration information editing interface;
and generating and storing a target interaction component template based on the component codes acquired by the code editing interface.
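The three steps above can be sketched as a minimal workflow. This is a hypothetical illustration only: the class and method names (`TemplateEditor`, `show_config_editor`, and so on) are assumptions, not from the patent, and real interfaces would be graphical rather than dictionaries.

```python
# Hypothetical sketch of the claimed three-step workflow; all names are
# illustrative assumptions, not defined by the patent.

class TemplateEditor:
    def show_config_editor(self, edit_request):
        # Step 1: present the configuration-information editing interface
        # and collect configuration options for the template.
        return {"style": edit_request.get("style", "button"),
                "position": edit_request.get("position", (0, 0)),
                "show_at": edit_request.get("show_at", 0.0)}

    def show_code_editor(self, config):
        # Step 2: present a code editing interface pre-filled from the
        # configuration; here we just return a stub component-code string.
        return f"render(style={config['style']!r}, at={config['show_at']})"

    def generate_template(self, component_code, store):
        # Step 3: generate the target template and store it.
        template = {"code": component_code}
        store.append(template)
        return template

store = []
editor = TemplateEditor()
config = editor.show_config_editor({"style": "branch", "show_at": 12.5})
code = editor.show_code_editor(config)
template = editor.generate_template(code, store)
```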
According to an aspect of the embodiments of the present application, there is provided a processing apparatus for an interactive component in an interactive video, the processing apparatus including:
the first display module is used for displaying a configuration information editing interface of the interactive component template according to an editing request aiming at the interactive component template;
the second display module is used for displaying a code editing interface of the interaction component template based on the configuration information acquired by the configuration information editing interface;
and the processing module is used for generating and storing a target interaction component template based on the component codes acquired by the code editing interface.
In some embodiments of the present application, based on the foregoing solution, the interactive component template is an existing interactive component template, and the second display module is configured to: and displaying a code editing interface of the interactive component template based on the configuration information acquired by the configuration information editing interface, wherein the code editing interface comprises the component codes of the interactive component template.
In some embodiments of the present application, based on the foregoing, the processing module is configured to: receiving and storing modifications to the component code; generating a target interaction component template according to the modification aiming at the component codes; and storing the target interaction component template.
In some embodiments of the present application, based on the foregoing, the processing module is further configured to: upload the target interactive component template to an interactive component template library in response to a sharing request for the target interactive component template.
In some embodiments of the present application, based on the foregoing, the processing module is further configured to: displaying the interactive component templates contained in the interactive component template library according to the access request of the interactive component template library; and responding to the interaction component template selected in the interaction component template library, and acquiring the selected interaction component template.
In some embodiments of the present application, based on the foregoing solution, the interactive component template is an existing interactive component template, and the processing module is further configured to: determining the identification information of the target interaction component template according to the identification information of the interaction component template, wherein the identification information of the interaction component template is associated with the identification information of the target interaction component template; and associating the determined identification information of the target interaction component template with the target interaction component template.
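One way to realize the identification scheme above is to derive the target template's identifier from the source template's identifier, so the two stay associated. The suffix convention below is purely an illustrative assumption; the patent does not specify an identifier format.

```python
# Hypothetical sketch: derive an identifier for the target template from
# the source template's identifier; the "-vN" suffix is an assumption.
import itertools

_counter = itertools.count(1)

def derive_target_id(source_id: str) -> str:
    # Append a version suffix so the lineage is visible in the id itself.
    return f"{source_id}-v{next(_counter)}"

def associate(target_template: dict, source_id: str) -> dict:
    target_template["id"] = derive_target_id(source_id)
    target_template["derived_from"] = source_id
    return target_template

t = associate({"code": "..."}, "branch-choice")
```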
In some embodiments of the present application, based on the foregoing, the processing module is further configured to: determining a related interactive component template according to the identification information of the interactive component template; and dividing the associated interactive component template into the same component version management library for storage.
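The grouping of associated templates into one version management library could look like the following sketch, assuming identifiers carry a shared root (an illustrative convention, not from the patent).

```python
# Hypothetical sketch: group templates whose identifiers share a common
# root into the same version-management library.
from collections import defaultdict

def root_of(template_id: str) -> str:
    # Treat everything before the first "-v" suffix as the family root.
    return template_id.split("-v")[0]

def build_version_libraries(templates):
    libraries = defaultdict(list)
    for t in templates:
        libraries[root_of(t["id"])].append(t)
    return dict(libraries)

libs = build_version_libraries([
    {"id": "branch-choice"},
    {"id": "branch-choice-v1"},
    {"id": "vote-v2"},
])
```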
In some embodiments of the present application, based on the foregoing, the processing module is further configured to: displaying a configuration information editing interface of an existing interactive component template according to a use request of the existing interactive component template; displaying a parameter configuration interface of the existing interactive component template based on the configuration information acquired by the configuration information editing interface; and generating and storing a target interaction component according to the parameter information acquired by the parameter configuration interface.
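Instantiating a component from an existing template, as described above, amounts to filling the template's parameters with the author's values. The sketch below is hypothetical; the parameter names are assumptions.

```python
# Hypothetical sketch of generating a target interactive component from an
# existing template plus parameter information; field names are illustrative.
def instantiate_component(template: dict, params: dict) -> dict:
    # Fill the template's declared parameters with the creator's values,
    # falling back to the template defaults.
    filled = {**template.get("defaults", {}), **params}
    return {"template_id": template["id"], "params": filled}

comp = instantiate_component(
    {"id": "branch-choice", "defaults": {"branches": 2}},
    {"branches": 3, "show_at": 42.0},
)
```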
In some embodiments of the present application, based on the foregoing, the processing module is further configured to: generate a block according to the target interactive component template, and upload the block to a blockchain network for storage.
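Packaging a template into a block can be sketched as below, using a simple hash-linked chain as a stand-in for the blockchain network described later in the drawings; this is a minimal illustration under that assumption, not the patent's actual block format.

```python
# Hypothetical sketch: package a target template into a block and append it
# to a simple hash-linked chain (a stand-in for the blockchain network).
import hashlib
import json

def make_block(template: dict, prev_hash: str) -> dict:
    payload = json.dumps(template, sort_keys=True)
    block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    return {"prev_hash": prev_hash, "payload": payload, "hash": block_hash}

chain = [make_block({"id": "genesis"}, "0" * 64)]
chain.append(make_block({"id": "branch-choice-v1", "code": "..."},
                        chain[-1]["hash"]))
```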
According to an aspect of the embodiments of the present application, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the processing method of the interactive component in the interactive video as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; a storage device for storing one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the processing method of the interactive component in the interactive video as described in the above embodiments.
In the technical solutions provided in some embodiments of the present application, a configuration information editing interface for an interactive component template is displayed in response to an editing request for the template; a code editing interface for the template is displayed based on the configuration information obtained through the configuration information editing interface; and a target interactive component template is generated and stored based on the component code obtained through the code editing interface. In this way, the author of an interactive video can create an interactive component suited to the video, rather than merely selecting from existing interactive component templates, which ensures the degree of fit between the interactive video and the interactive component.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which aspects of embodiments of the present application may be applied;
FIG. 2 is a flow diagram illustrating a method for processing interactive components in an interactive video according to one embodiment of the present application;
FIG. 3 is a flowchart illustrating step S230 of the method for processing interactive components in the interactive video of FIG. 2 according to an embodiment of the present application;
FIG. 4 is a flow diagram illustrating a process for accessing an interactive component template library further included in a method for processing interactive components in an interactive video according to an embodiment of the present application;
fig. 5 is a schematic flow chart illustrating the determination of the identification information of the target interactive component template further included in the processing method of the interactive component in the interactive video according to an embodiment of the present application;
FIG. 6 is a flow chart illustrating the establishment of a component version management library further included in the processing method of interactive components in an interactive video according to an embodiment of the present application;
FIG. 7 is a flow chart illustrating the use of an existing interactive component template further included in the interactive component processing method in the interactive video according to an embodiment of the present application;
fig. 8A shows a schematic diagram of a blockchain data sharing system according to an embodiment of the present application;
fig. 8B shows a schematic structural diagram of a blockchain according to an embodiment of the present application;
FIG. 8C shows a schematic diagram of the generation of a new tile according to an embodiment of the present application;
fig. 9 to 10 are schematic diagrams illustrating terminal interfaces of a processing method of an interactive component in an interactive video according to an embodiment of the present application;
FIG. 11 shows a block diagram of a processing device of an interactive component in an interactive video according to an embodiment of the present application;
FIG. 12 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of the embodiments of the present application can be applied.
As shown in fig. 1, the system architecture may include at least one video production end 110, at least one storage server 120, and at least one video playback end 130. The connection between the video production end 110 and the storage server 120, and between the storage server 120 and the video playing end 130 may be through a network, and it should be noted that the network may include various connection types, such as a wired communication link, a wireless communication link, and so on.
It should be understood that the number of video production ends, storage servers, and video playback ends in fig. 1 is merely illustrative. Any number of video production ends, storage servers and video playing ends can be provided according to implementation requirements. For example, the storage server 120 may be a server cluster composed of a plurality of servers, and the like.
In an embodiment of the present application, an interactive video creator may make an interactive video and an interactive component template through the video making terminal 110, for example, when the interactive video creator wants to make the interactive component template, the video making terminal 110 may display a configuration information editing interface of the interactive component template according to an editing request for the interactive component template, display a code editing interface of the interactive component template based on configuration information obtained by the configuration information editing interface, and generate and store a target interactive component template based on a component code obtained by the code editing interface, so as to obtain the interactive component template required by the interactive video creator.
After the video production end 110 finishes producing the interactive component template, it may upload the template to the storage server 120 for storage. When the video playing end 130 plays the interactive video and reaches a point where an interactive component is set, it can obtain the corresponding configured interactive component template from the storage server 120 for use.
The implementation details of the technical solution of the embodiment of the present application are set forth in detail below:
fig. 2 is a flow chart illustrating a processing method of an interactive component in an interactive video according to an embodiment of the present application. Referring to fig. 2, the method for processing the interactive components in the interactive video at least includes steps S210 to S230, which are described in detail as follows:
in step S210, a configuration information editing interface of the interactive component template is displayed according to an editing request for the interactive component template.
An interactive video is a new kind of video that integrates interactive experiences into a linear video through various technical means. While watching an interactive video, viewers can participate in the development of the plot by interacting with the video, giving them a brand-new viewing experience.
The interactive component may be an interactive component disposed in the interactive video for interacting with the audience. It should be understood that the interactive component may include information related to the interactive video scenario development, and the audience may select a corresponding scenario development route according to the information included in the interactive component, so as to achieve the purpose of participating in the scenario development of the interactive video.
The interactive component template may be a template for making an interactive component, and the interactive video creator may select a corresponding interactive component template and configure the interactive component template, for example, fill in a document of the interactive component template or configure parameter information of the interactive component template. And configuring the interactive component template to obtain the required interactive component for application.
The editing requirement for the interactive component template may be information for requesting editing of the interactive component template, and it should be understood that the editing described herein may be configuration for an existing interactive component template, or may be creation of a brand new interactive component template according to the requirements of an interactive video creator, which is not particularly limited in this application. In one example of the present application, when an interactive video creator is authoring an interactive video, an edit request for an interactive component template may be sent by triggering a specific area (e.g., an "edit interactive component" button, etc.) on the video authoring interface.
The configuration information editing interface may be an interface for editing configuration information of the interactive component template. The configuration information editing interface may include configuration information editing options, which may include, but are not limited to, a pattern of the interactive components, a location of the interactive components, a time point of occurrence of the interactive components, and the like. The video creator can edit the corresponding configuration information editing options according to actual needs to obtain the required interactive component template.
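The configuration options listed above (style, position, time point of appearance) could be modeled as a simple record; the field names below are illustrative assumptions, since the patent does not define a schema.

```python
# Hypothetical sketch of the configuration options mentioned above; the
# field names are illustrative, not defined by the patent.
from dataclasses import dataclass

@dataclass
class ComponentConfig:
    style: str       # visual pattern of the interactive component
    position: tuple  # (x, y) placement within the video frame
    show_at: float   # playback time, in seconds, when the component appears

cfg = ComponentConfig(style="option-branch", position=(0.5, 0.8), show_at=95.0)
```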
In this embodiment, when an editing request of an interactive video creator for an interactive component template is received, a configuration information editing interface of the interactive component template is displayed on a display interface of a terminal device, so that the interactive video creator edits the configuration information editing interface. It should be understood that the terminal device may be any computing device having a display means, for example, the terminal device may be any one of a laptop computer, a desktop computer, a tablet computer, or a smart phone, etc.
Referring to fig. 2, in step S220, a code editing interface of the interactive component template is displayed based on the configuration information acquired by the configuration information editing interface.
The code editing interface can be an interface used for editing the component codes of the interactive component template. The interactive video creator may modify the interactive component code by editing the component code of the interactive component, and may further obtain an interactive component template meeting the requirement, for example, the video creator may modify the shape, size, color, or operation logic of the interactive component template by editing the component code of the interactive component template.
In this embodiment, according to the configuration information edited by the interactive video creator on the configuration information editing interface, the configuration information of the interactive video creator on the interactive component is received, and according to the received configuration information, the code editing interface of the interactive component template is displayed on the display interface of the terminal device.
In an embodiment of the present application, displaying the code editing interface of the interactive component template based on the configuration information acquired through the configuration information editing interface may mean displaying a code editing interface corresponding to the configuration information entered by the interactive video creator. For example, suppose the interactive component to be edited is an option branch: by selecting different branches in the option branch, the viewer determines different plot directions, so that the interactive video enters different plot content. The configuration information editing interface for the option branch may include a branch number as a configuration option. When the creator enters a different branch number, the corresponding code editing interface is displayed based on that number; for example, the code editing interface may include a code editing area for each branch, and the creator may enter the component code for each branch in its editing area to define the operating logic triggered when the viewer selects that branch.
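The branch-number-driven interface described above can be sketched as follows; generating one editing area per branch is an illustrative assumption about how such an interface might be built.

```python
# Hypothetical sketch: given the branch count entered on the configuration
# interface, build one code-editing area per branch.
def build_branch_editors(branch_count: int):
    # Each branch gets its own editing area; the stub comment stands in for
    # the operating logic the creator fills in for that branch.
    return [{"branch": i, "code": f"# handler for branch {i}"}
            for i in range(branch_count)]

editors = build_branch_editors(3)
```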
In step S230, a target interaction component template is generated and stored based on the component codes acquired by the code editing interface.
The target interaction component template can be an interaction component template meeting the use requirements of an interaction video creator, and the interaction video creator can configure and edit the interaction component template according to actual use requirements to obtain the required target interaction component template.
In this embodiment, according to the component code entered by the interactive video creator in the code editing interface, an interactive component template corresponding to that code, namely the target interactive component template, is generated, and the component code is stored, so that when an interactive video that applies the target template is subsequently played, the video playing end can acquire and apply it.
In an embodiment of the present application, the target interaction component template is stored, and the generated target interaction component template may be stored in a server (e.g., the storage server 120 shown in fig. 1, etc.), so that when the video playing end plays the corresponding interactive video, the configured target interaction component template may be obtained from the server and applied. Specifically, the configured target interaction component template can be stored in the server in the form of component codes. When the video playing end plays the interactive video, the video playing end can acquire the component code corresponding to the interactive component from the server and execute the component code according to the interactive component of the interactive video, so that the due configuration information and the due function of the interactive component can be obtained.
It should be noted that the video playing end and the video production end may agree in advance on an editing rule for the component code, so that component code written by the interactive video creator according to this rule can be executed by the video playing end. As a result, the target interactive component template can run without the video production end and the video playing end having to be updated simultaneously; unnecessary update steps at the playing end are avoided, and the update efficiency of the target interactive component template is improved.
In one embodiment of the present application, the video player may be configured with an interactive template engine for parsing and executing the component codes of the interactive component template. It should be understood that the interactive template engine may parse and execute the component code according to the editing rule of the pre-agreed component code. Therefore, the interactive template engine can analyze and execute any component codes written according to the editing rule, namely, the video playing end can use any interactive component template written according to the editing rule, and the trouble that the interactive component template needs to be updated synchronously with the video making end can be avoided.
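A minimal interactive template engine along the lines described above might look like the sketch below. The "one key=value pair per line" editing rule is purely an assumption chosen for illustration; the patent does not specify the rule's form.

```python
# Hypothetical sketch of an interactive template engine: it accepts any
# component code written to an agreed rule -- here, one "key = value" pair
# per line -- so the playing end needs no update for new templates.
def parse_component_code(code: str) -> dict:
    config = {}
    for line in code.strip().splitlines():
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

parsed = parse_component_code("style = option-branch\nbranches = 3")
```

Because the engine only depends on the agreed rule, any template written to that rule is usable without shipping a new player build, which is the update-efficiency point made above.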
In the embodiment shown in fig. 2, the configuration information editing interface of the interactive component template is displayed to the interactive video creator in response to the creator's editing request for the template. Based on the configuration information acquired through that interface, the code editing interface of the template is displayed, and the corresponding target interactive component template is generated from the acquired component code. The production and configuration of interactive component templates can thus be completed at the video production end, so that the creator can use interactive component templates suited to the interactive video according to actual needs; this ensures the diversity of interactive component templates and further improves the degree of fit between the templates and the interactive video.
Based on the embodiment shown in fig. 2, in an embodiment of the present application, the interactive component template may be an existing interactive component template, and the code editing interface for displaying the interactive component template based on the configuration information acquired by the configuration information editing interface includes:
and displaying a code editing interface of the interactive component template based on the configuration information acquired by the configuration information editing interface, wherein the code editing interface comprises the component codes of the interactive component template.
In this embodiment, if the interactive video creator edits an existing interactive component template, the component code of that template may be displayed together with the code editing interface. The creator may then edit the template based on this code; for example, the shape of the interactive component can be changed by replacing the shape-related parameters in the component code. There is no need to write the component code of the interactive component template from scratch, which improves the creator's editing efficiency. Moreover, since editing the template only requires replacing key parameters, the operation is simple and the coding skill required is low; even an interactive video creator with only a small amount of coding knowledge can edit an interactive component template to obtain the one needed, which ensures the practicality of code editing.
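The key-parameter replacement described above can be sketched as simple substitution over the displayed component code; the "key = value" code form here is an illustrative assumption.

```python
# Hypothetical sketch: editing an existing template by replacing key
# parameters in its displayed component code, rather than rewriting it.
def replace_parameter(component_code: str, old: str, new: str) -> str:
    # A creator with little coding knowledge swaps one value for another.
    return component_code.replace(old, new)

code = "shape = circle\ncolor = red"
code = replace_parameter(code, "red", "blue")
code = replace_parameter(code, "circle", "square")
```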
Based on the foregoing embodiments, fig. 3 illustrates a flowchart of step S230 in the processing method of the interactive component in the interactive video of fig. 2 according to an embodiment of the present application. In the embodiment shown in fig. 3, step S230 at least includes steps S310 to S330, which are described in detail as follows:
in step S310, modifications to the component code are received and stored.
In this embodiment, if the interactive video creator edits an existing interactive component template, the component code of the interactive component template is displayed together when the code editing interface of the interactive component template is displayed. The interactive video creator may make modifications according to the displayed component code, such as modifying the color of the interactive component template from red to blue, modifying the shape of the interactive component template from round to square, and so on. The video production end receives and stores the interactive video creator's modifications to the component code according to the creator's operations.
In step S320, a target interactive component template is generated according to the modification to the component code.
In this embodiment, after the interactive video creator finishes editing the component code, the creator may trigger a specific area on the interface (for example, an "execute code" button), so that the video production end runs the component code to generate the interactive component template corresponding to the component code.
In step S330, the target interaction component template is stored.
In an embodiment of the present application, the storing of the target interaction component template may be storing the target interaction component template in a local file of the video production end, or storing the target interaction component template on a video production platform for subsequent use.
In an embodiment of the application, the storing the target interaction component template may be uploading the target interaction component to a storage server for storage, so that a video playing end obtains the target interaction component template when playing the interactive video.
In the embodiment shown in fig. 3, the component code of the existing interactive component template is displayed on the code editing interface of the interactive component template, so that an interactive video creator can edit the existing interactive component template according to the component code, thereby improving the editing efficiency of the interactive component template.
Based on the embodiment shown in fig. 2, in an embodiment of the present application, the method for processing an interactive component in an interactive video further includes:
and uploading the target interactive component template to an interactive component template library according to a sharing request for the target interactive component template.
The sharing request for the target interactive component template may be information used by the interactive video creator to request that the target interactive component template be shared.
The interactive component template library may be a database for storing interactive component templates shared by the interactive video creators. After the interactive video creator completes the creation of the interactive component template, the created interactive component template can be uploaded to the interactive component template library for storage, so that others can obtain the interactive component template.
In this embodiment, the interactive video creator may upload the interactive component template created by the creator to the interactive component template library for storage, so as to be acquired and used by other interactive video creators. When an interactive video creator wants to share an interactive component template made by the creator, the creator can send a sharing request, and the video making end can obtain information of the interactive component template to be shared according to the received sharing request and upload the interactive component template to the interactive component template library for storage for others to obtain.
In an embodiment of the present application, the sharing request may include identification information of the interactive component template to be shared, and it should be understood that the identification information uniquely corresponds to the interactive component template. The video making end can obtain the component codes of the interactive component template corresponding to the identification information according to the identification information contained in the sharing request, and upload the component codes to the interactive component template library for sharing.
In another embodiment of the present application, the sharing request may instead carry the component code of the interactive component template to be shared, in which case the video production end may obtain the component code directly from the sharing request and upload it to the interactive component template library for storage.
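Under the assumption that a sharing request carries either the template's identification information or its component code (the two cases described above), the handling might be sketched as follows. All names here are illustrative, not the platform's actual API:

```javascript
// Illustrative sketch of handling a sharing request that carries either the
// component code itself or identification information used to look it up.
const localTemplates = new Map([
  // identification information -> component code (hypothetical entry)
  ["NO.123456-V2", "/* component code of a branch-option template */"],
]);
const templateLibrary = new Map(); // stands in for the shared template library

function handleShareRequest(request) {
  // Prefer component code carried in the request; otherwise resolve it
  // locally from the identification information.
  const code =
    request.componentCode !== undefined
      ? request.componentCode
      : localTemplates.get(request.templateId);
  if (code === undefined) {
    throw new Error("unknown template: " + request.templateId);
  }
  templateLibrary.set(request.templateId, code); // "upload" for sharing
  return code;
}
```

Either form of request ends in the same place: the component code keyed by its identification information in the shared library, ready for other creators to fetch.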
Therefore, the interactive component template library can contain a large number of interactive component templates made by interactive video creators, so that when making an interactive video, an interactive video creator can select not only from the default interactive component templates of the production platform but also from the interactive component template library. This enriches the diversity of interactive component templates, makes the styles of interactive components more varied, and increases the interest of interactive videos.
Based on the above embodiments, fig. 4 shows a flowchart illustrating a process of accessing an interactive component template library, which is further included in the interactive component processing method in an interactive video according to an embodiment of the present application. In the embodiment shown in fig. 4, accessing the interactive component template library at least includes steps S410 to S420, which are described in detail as follows:
in step S410, an interactive component template included in the interactive component template library is displayed according to an access request to the interactive component template library.
Wherein the access request may be information requesting access to the interactive component template library. In an example, the interactive video production interface may be provided with an "access interactive component template library" button; when the interactive video creator wants to access the interactive component template library, this button may be triggered to send an access request for the interactive component template library.
In this embodiment, when the video production terminal receives an access request to the interactive component template library, the interactive component templates stored in the interactive component template library may be displayed on the interface. In one example, the interactive component templates in the interactive component template library can be displayed in a list form for the interactive video creator to read; in another example, the interactive component template in the interactive component template library can also be displayed in the form of an effect display image, so that an interactive video creator can intuitively know the style of the interactive component template according to the effect display image of the interactive component template, and the interactive video creator can conveniently select the interactive component template; it should be understood that the interactive component templates in the interactive component template library may be displayed in other forms, and those skilled in the art may set the interactive component templates according to actual implementation needs, which is not limited in this application.
In step S420, in response to the interactive component template selected in the interactive component template library, the selected interactive component template is obtained.
In this embodiment, the interactive video creator may select a desired interactive component template according to the interactive component templates in the displayed interactive component template library, for example, click or check the corresponding interactive component template. And the video making end downloads the selected interactive component template from the network so as to be used by an interactive video creator.
In the embodiment shown in fig. 4, the interactive video creator may obtain the required interactive component template from the interactive component template library rather than only from the default template library of the video production end, which helps ensure the degree of fit between the interactive component and the interactive video. In addition, the interactive video creator can obtain the required interactive component template without editing it, improving the efficiency of interactive video production.
Based on the embodiment shown in fig. 2, fig. 5 is a flowchart illustrating a process of determining identification information of a target interactive component template further included in a processing method of an interactive component in an interactive video according to an embodiment of the present application. In the embodiment shown in fig. 5, if the interactive component template is an existing interactive component template, determining the identification information of the target interactive component template at least includes steps S510 to S520, which are described in detail as follows:
in step S510, according to the identification information of the interactive component template, the identification information of the target interactive component template is determined.
In this embodiment, since the interactive component template is an existing interactive component template, the target interactive component template generated by editing according to the existing interactive component template has a relatively large association with the existing interactive component template. The identification information of the target interactive component template can be determined according to the identification information of the interactive component template so as to embody the relevance between the two. For example, if the identification information of the existing interactive component template is No.123456-V1, the identification information of the target interactive component template generated according to the existing interactive component template may be correspondingly determined to be No.123456-V2, and so on.
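The identifier derivation in the example above can be sketched as follows, assuming (purely for illustration) that identification information follows the "series-Vn" pattern of the example:

```javascript
// Illustrative sketch: derive the target template's identification from the
// existing template's identification by incrementing a version suffix.
function nextVersionId(id) {
  const match = id.match(/^(.*-V)(\d+)$/); // e.g. "No.123456-V1"
  if (!match) {
    throw new Error("unrecognized identification format: " + id);
  }
  // Keep the series prefix, increment the version number.
  return match[1] + (Number(match[2]) + 1);
}
```

Because only the version suffix changes, the shared series prefix is what later makes the association between the two templates recoverable from their identifiers alone.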
In step S520, the determined identification information of the target interaction component template is associated with the target interaction component template.
In one embodiment of the present application, associating the determined identification information of the target interaction component template with the target interaction component template may be adding the identification information to the file name of the target interaction component template. For example, if the file name of the target interaction component template is "branch option" and the identification information of the target interaction component template is NO.123456-V2, then after the identification information is added, the file name becomes "branch option-NO.123456-V2", and so on.
In an embodiment of the present application, a correspondence table between the interaction component template and the identification information may be pre-established, and after the identification information of the target interaction component template is determined, the target interaction component template and the identification information are correspondingly stored in the correspondence table. During subsequent use, the identification information corresponding to the interactive component template can be obtained by inquiring the corresponding relation table.
In the embodiment shown in fig. 5, the identification information of the target interaction component template is determined according to the identification information of the existing interaction component template, which may represent the association between the target interaction component template and the existing interaction component template. When the interactive video creator selects the interactive component template, the associated interactive component template can be determined through the identification information, and the interactive video creator can conveniently read and search.
Based on the embodiments shown in fig. 2 and fig. 5, fig. 6 is a flowchart illustrating a process of establishing a component version management library, which is further included in the processing method for interactive components in an interactive video according to an embodiment of the present application. In the embodiment shown in fig. 6, the creating of the component version management library at least includes steps S610 to S620, which are described in detail as follows:
in step S610, an associated interactive component template is determined according to the identification information of the interactive component template.
In this embodiment, since the identification information of a newly generated interactive component template is determined according to the identification information of a pre-existing interactive component template, the associated interactive component templates can be determined from the identification information. It will be appreciated that two templates are associated when the newly generated interactive component template was produced by editing the pre-existing interactive component template.
In step S620, the associated interactive component templates are divided into the same component version management library for storage.
Wherein, the component version management library can be a database for managing the interactive component template with the association. It should be understood that there should be some correlation between the interactive component templates stored in the same component version management library, i.e., the interactive component template generated later is edited based on the interactive component template existing earlier.
In this embodiment, after the associated interactive component templates are determined according to the identification information, they are divided into the same component version management library for storage, so that a certain association exists among the interactive component templates stored in one component version management library, which makes it convenient for the interactive video creator to manage and search the interactive component templates.
In the embodiment shown in fig. 6, the associated interactive component templates are determined according to the identification information, and the associated interactive component templates are divided into the same component version management library for storage. When searching for the interactive component template, the interactive video creator can search for the interactive component templates of different versions in the same series by searching for different component version management libraries, so that the interactive video creator can accurately search for a series of interactive component templates, search efficiency is improved, and management of the interactive component templates is facilitated.
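One way to sketch this grouping, again assuming identifiers of the illustrative "series-Vn" form used in the earlier example:

```javascript
// Illustrative sketch: divide associated templates into the same component
// version management library by the series prefix of their identification.
function groupBySeries(templateIds) {
  const libraries = new Map(); // series prefix -> version management library
  for (const id of templateIds) {
    const series = id.replace(/-V\d+$/, ""); // strip the version suffix
    if (!libraries.has(series)) {
      libraries.set(series, []);
    }
    libraries.get(series).push(id);
  }
  return libraries;
}
```

Searching one library then yields exactly the different versions of one series, which is the lookup behavior described above.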
Based on the embodiment shown in fig. 2, fig. 7 is a flowchart illustrating a process of using an existing interactive component template, which is further included in the processing method of the interactive component in the interactive video according to an embodiment of the present application. In the embodiment shown in fig. 7, using an existing interactive component template at least includes steps S710 to S730, which are described in detail as follows:
in step S710, a configuration information editing interface of an existing interactive component template is displayed according to an application request for the existing interactive component template.
The application request for the existing interactive component template may be information for requesting application of the existing interactive component template. When the interactive video creator wants to use an existing interactive component template, the selected interactive component template can be determined and an application request for the interactive component template is sent.
In this embodiment, when an application request for an existing interactive component template is received, a configuration information editing interface of the interactive component template may be displayed on the interface, and the interactive video creator may edit the configuration information of the interactive component template on this interface according to actual needs. For example, when editing the existing interactive component template "interactive bubble", the text content of the interactive bubbles and the number of interactive bubbles can be edited on the configuration information editing interface of the template, and so on.
In step S720, a parameter configuration interface of the existing interactive component template is displayed based on the configuration information acquired by the configuration information editing interface.
The parameter configuration interface may be a configuration interface for editing parameters of the interactive component template. The parameters may include, but are not limited to, size (e.g., length, width, and height), location, and time of occurrence of the interactive component, among others.
In this embodiment, according to the configuration information acquired by the configuration information editing interface, a parameter configuration interface corresponding to the configuration information is displayed on the interface. For example, when editing the existing interactive component template "interactive bubble", the corresponding parameter configuration interface is displayed according to the acquired configuration information: if the number of interactive bubbles configured in the configuration information is two, the parameter configuration interface should display entries for two interactive bubbles, such as "bubble one" and "bubble two". The interactive video creator can edit the parameter information of the required interactive component template in the parameter configuration interface.
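Deriving the parameter configuration interface from the configuration information might be sketched as below; the field and parameter names are assumptions for illustration (the description names size, position, and time of occurrence as typical parameters):

```javascript
// Illustrative sketch: build the parameter configuration entries from the
// configuration information acquired on the configuration editing interface.
function buildParameterEntries(config) {
  const entries = [];
  for (let i = 1; i <= config.bubbleCount; i++) {
    entries.push({
      name: "bubble " + i,
      // Typical parameters: size, position, and time of occurrence.
      params: { width: null, height: null, position: null, appearTime: null },
    });
  }
  return entries;
}
```

With a bubble count of two in the configuration information, this yields exactly the "bubble one" / "bubble two" entries of the example, each awaiting its parameter values.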
In step S730, a target interaction component is generated and stored according to the parameter information acquired by the parameter configuration interface.
In this embodiment, the video production end may correspondingly generate and store a target interaction component required by the interactive video creator according to the parameter information input by the interactive video creator on the parameter configuration interface, so as to be applied by the interactive video creator.
In an embodiment of the application, storing the target interaction component may be uploading the target interaction component to a storage server to be acquired by a video playing end. Because the target interaction component has already been configured, the video playing end can use it without configuring the interaction component itself.
In the embodiment shown in fig. 7, when an application request for an existing interactive component template is received, a configuration information editing interface of the interactive component template is displayed, a parameter configuration interface of the interactive component template is displayed based on configuration information acquired by the configuration information editing interface, and a corresponding target interactive component is generated based on parameter information acquired by the parameter configuration interface, so that the configuration of the interactive component can be realized at a video production end, and when a video is played at a video playing end, the configuration of the interactive component is not required, thereby increasing the loading speed of the interactive component at the video playing end.
Fig. 8A shows a schematic diagram of a blockchain data sharing system according to an embodiment of the present application.
Referring to the blockchain data sharing system shown in fig. 8A, the blockchain data sharing system refers to a system for performing data sharing between nodes, the blockchain data sharing system may include a plurality of nodes 801, and the plurality of nodes 801 may refer to respective terminal devices or servers in the blockchain data sharing system. Each node 801, in normal operation, may receive input information, which may be information related to a target interactive component template, and maintain shared data within the blockchain data sharing system based on the received input information. In order to ensure information intercommunication in the blockchain data sharing system, information connection can exist between each node in the blockchain data sharing system, and information transmission can be carried out between the nodes through the information connection. For example, when an arbitrary node in the blockchain data sharing system receives input information, other nodes in the blockchain data sharing system acquire the input information according to a consensus algorithm, and store the input information as data in blockchain shared data, so that the data stored in all nodes in the blockchain data sharing system are consistent.
Each node in the blockchain data sharing system has a corresponding node identifier, and each node in the blockchain data sharing system can store the node identifiers of other nodes in the blockchain data sharing system, so that the generated blocks can be broadcast to other nodes in the blockchain data sharing system according to the node identifiers of other nodes. Each node may maintain a node identifier list as shown in the following table, and store the node name and the node identifier in the node identifier list correspondingly. The node identifier may be an IP (Internet Protocol) address and any other information that can be used to identify the node, and table 1 only illustrates the IP address as an example.
Table 1:

Node name    Node identification
Node 1       117.114.151.174
Node 2       117.116.189.145
Node N       119.123.789.258
Each node in the blockchain data sharing system stores one identical blockchain. The block chain is composed of a plurality of blocks, see fig. 8B, and fig. 8B shows a schematic structural diagram of the block chain according to the technical solution of the present application, where the block chain is composed of a plurality of blocks, a starting block includes a block header and a block main body, the block header stores an input information characteristic value, a version number, a timestamp, and a difficulty value, and the block main body stores input information; the next block of the starting block takes the starting block as a parent block, the next block also comprises a block head and a block main body, the block head stores the input information characteristic value of the current block, the block head characteristic value of the parent block, the version number, the timestamp and the difficulty value, and the like, so that the block data stored in each block in the block chain is associated with the block data stored in the parent block, and the safety of the input information in the block is ensured.
Referring to fig. 8C, fig. 8C is a schematic diagram illustrating a process of generating a new block according to the technical solution of an embodiment of the present application, when each block in a block chain is generated, a node where the block chain is located checks input information when the node receives the input information, and after the check is completed, the input information is stored in a memory pool, and a hash tree used for recording the input information is updated; and then, updating the updating time stamp to the time when the input information is received, trying different random numbers, and calculating the characteristic value for multiple times, so that the calculated characteristic value can meet the following formula:
SHA256(SHA256(version+prev_hash+merkle_root+ntime+nbits+x))<TARGET
wherein, SHA256 is a characteristic value algorithm used for calculating a characteristic value; version is version information of the relevant block protocol in the block chain; prev _ hash is a block head characteristic value of a parent block of the current block; merkle _ root is a characteristic value of the input information; ntime is the update time of the update timestamp; nbits is the current difficulty, is a fixed value within a period of time, and is determined again after exceeding a fixed time period; x is a random number; TARGET is a feature threshold, which can be determined from nbits.
Therefore, when the random number meeting the formula is obtained through calculation, the information can be correspondingly stored, and the block head and the block main body are generated to obtain the current block. And then, the node where the block chain is located respectively sends the newly generated blocks to other nodes in the block chain data sharing system where the newly generated blocks are located according to the node identifications of the other nodes in the block chain data sharing system, the newly generated blocks are verified by the other nodes, and the newly generated blocks are added to the block chain stored in the newly generated blocks after the verification is completed.
Based on the foregoing embodiments, in one embodiment of the present application, storing a target interaction component template includes:
and generating a block according to the target interaction component template, and uploading the block to a block chain network for storage.
In this embodiment, generating the block according to the target interactive component template may be generating the block according to the component code of the target interactive component template, and uploading the block to the blockchain network for storage. It will be appreciated that multiple interactive video creators may be configured as nodes in a blockchain network to enable sharing of interactive component templates.
In this embodiment, a block is generated according to the target interactive component template and uploaded to the blockchain network for storage, so that sharing of the target interactive component can be realized and interactive video creators can conveniently acquire the interactive component templates made by others.
Based on the technical solution of the above embodiment, a specific application scenario of an embodiment of the present application is introduced as follows:
fig. 9 to 10 are schematic diagrams of terminal interfaces of a processing method of an interactive component in an interactive video according to an embodiment of the present application.
Referring to fig. 9, fig. 9 is a configuration information editing interface of the interactive component template (in this embodiment, the interactive component template "interactive bubble" is taken as an example for explanation). As shown in fig. 9, when an editing request of an interactive video creator for the interactive component template is received, a configuration information editing interface of the interactive component template is displayed on the interface, and the configuration information editing interface may include a text editing option 910 and a quantity editing option 920. The interactive video creator may enter text information in the text edit box 911 to determine the text content of the interactive component template, and enter a number in the number edit box 921 to determine the number of bubbles in the interactive component template.
Referring to fig. 10, fig. 10 is a code editing interface of the interactive component template. As shown in fig. 10, the code editing interface may include a video display area 1010, a configuration information editing area 1020, and a code editing area 1030.
The video display area 1010 may be configured to display video content of an interactive video, and when an interactive component is newly created or edited, the interactive component may be displayed in the video display area and displayed together with the video content of the interactive video. The interactive video creator can preview the display effect of the interactive component according to the content displayed in the video display area 1010 to determine whether the interactive component meets the requirement.
The configuration information editing area 1020 can be used to modify the configuration information of the interactive component template in real time while the component code of the interactive component is being edited, which makes it convenient for the interactive video creator to modify the configuration information of the interactive component template.
The code editing area 1030 may be used to edit the component code of the interactive component template. In this embodiment, the code editing area may include a CSS (Cascading Style Sheets) editing area, a JavaScript editing area, and an HTML (hypertext markup language) editing area. The CSS editing area may be used to edit the rendering effect of the interactive component template, such as its color and shape; the JavaScript editing area may be used to edit the running logic of the interactive component template, such as selecting a certain branch to enter a certain scenario; and the HTML editing area may be used to edit the size (e.g., length, width, and height), position, etc. of the interactive component template.
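How a template's code might be split across these three editing areas can be illustrated with the sketch below; the content of each part is an assumption for illustration, not the platform's actual template format:

```javascript
// Illustrative sketch: one interactive-bubble template split into the three
// editing areas of the code editing interface.
const interactiveBubbleTemplate = {
  // CSS editing area: rendering effect, e.g. color and shape.
  css: ".bubble { background: red; border-radius: 50%; }",
  // HTML editing area: size and position of the component.
  html: '<div class="bubble" style="width:80px;height:80px;left:10%;top:20%">bubble one</div>',
  // JavaScript editing area: running logic, e.g. which branch a choice enters.
  js: "function onBubbleClick(branch) { return 'enter scenario ' + branch; }",
};
```

Keeping the three concerns in separate areas means a creator can, for instance, recolor a bubble in the CSS area without touching the branch logic in the JavaScript area.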
In the embodiments shown in fig. 9 and 10, through the setting of the configuration information editing interface and the code editing interface, an interactive video creator can configure and edit an interactive component template according to actual needs when creating an interactive video to obtain a required target interactive component template, so that the diversity of the interactive component template is enriched, and the degree of engagement between the interactive component template and the interactive video is also ensured.
The following describes embodiments of an apparatus of the present application, which can be used to perform a processing method for an interactive component in an interactive video in the above embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the processing method of the interactive component in the interactive video described above in the present application.
FIG. 11 shows a block diagram of a processing device for interactive components in an interactive video according to an embodiment of the present application.
Referring to fig. 11, a processing apparatus for an interactive component in an interactive video according to an embodiment of the present application includes:
the first display module 1110 is configured to display a configuration information editing interface of an interactive component template according to an editing request for the interactive component template;
a second display module 1120, configured to display a code editing interface of the interactive component template based on the configuration information obtained by the configuration information editing interface;
and the processing module 1130 is configured to generate and store a target interaction component template based on the component codes acquired by the code editing interface.
In some embodiments of the present application, based on the above scheme, the interactive component template is an existing interactive component template, and the second display module 1120 is configured to: and displaying a code editing interface of the interactive component template based on the configuration information acquired by the configuration information editing interface, wherein the code editing interface comprises the component codes of the interactive component template.
In some embodiments of the present application, based on the foregoing, the processing module 1130 is configured to: receiving and storing modifications to the component code; generating a target interaction component template according to the modification aiming at the component codes; and storing the target interaction component template.
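The receive-modify-generate-store flow above can be sketched as follows. This is a minimal illustration under assumed names (`generateTargetTemplate`, the `code` field, the ID scheme) — the patent does not prescribe an implementation:

```javascript
// Minimal sketch: receive modifications to the component code of an
// existing template and derive a target interactive component template.
function generateTargetTemplate(existingTemplate, codeModifications) {
  return {
    ...existingTemplate,
    // Merge the edited code areas over the existing component code,
    // keeping any area the creator did not touch.
    code: { ...existingTemplate.code, ...codeModifications },
    // The target template gets its own identity but records its origin.
    parentId: existingTemplate.id,
    id: `${existingTemplate.id}-custom-${Date.now()}`
  };
}

const stored = [];                        // stand-in for persistent storage
function storeTargetTemplate(template) { stored.push(template); }

const base = { id: 'vote-001', code: { html: '<div/>', css: '', js: '' } };
const target = generateTargetTemplate(base, { css: '.vote{color:red}' });
storeTargetTemplate(target);
```

The merge preserves untouched code areas, which matches the described behavior that the code editing interface pre-populates the existing template's component code and only the creator's modifications change it.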
In some embodiments of the present application, based on the foregoing, the processing module 1130 is further configured to: upload the target interactive component template to an interactive component template library according to a sharing request for the target interactive component template.
In some embodiments of the present application, based on the foregoing, the processing module 1130 is further configured to: displaying the interactive component templates contained in the interactive component template library according to the access request of the interactive component template library; and responding to the interaction component template selected in the interaction component template library, and acquiring the selected interaction component template.
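The sharing, browsing, and selection operations on the template library can be illustrated with a small in-memory store. All names here are hypothetical; a real system would back this with a service rather than a `Map`:

```javascript
// Hypothetical in-memory interactive component template library.
const templateLibrary = new Map();

function shareTemplate(template) {            // handle a sharing request
  templateLibrary.set(template.id, template);
}
function listTemplates() {                    // handle an access request
  return [...templateLibrary.values()];
}
function selectTemplate(id) {                 // respond to a selection
  return templateLibrary.get(id);
}

shareTemplate({ id: 'quiz-01', name: 'Quiz component' });
const picked = selectTemplate('quiz-01');
```

The point of the sketch is the division of operations: uploading on a sharing request, displaying the contained templates on an access request, and acquiring the one selected.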
In some embodiments of the present application, based on the foregoing solution, the interactive component template is an existing interactive component template, and the processing module 1130 is further configured to: determining the identification information of the target interaction component template according to the identification information of the interaction component template, wherein the identification information of the interaction component template is associated with the identification information of the target interaction component template; and associating the determined identification information of the target interaction component template with the target interaction component template.
In some embodiments of the present application, based on the foregoing, the processing module 1130 is further configured to: determining a related interactive component template according to the identification information of the interactive component template; and dividing the associated interactive component template into the same component version management library for storage.
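The two steps above — deriving the target template's identification information from the existing template's, then grouping associated templates into one component version management library — might look like the following. The `root@version` ID scheme is an assumption made for the example, not taken from the patent:

```javascript
// Sketch: the target template's ID is derived from the existing
// template's ID, so templates sharing a root can be grouped into the
// same component version management library.
function deriveTargetId(existingId, version) {
  const root = existingId.split('@')[0];      // root names the family
  return `${root}@${version}`;
}

function groupByVersionLibrary(templateIds) {
  const libraries = {};
  for (const id of templateIds) {
    const root = id.split('@')[0];
    if (!libraries[root]) libraries[root] = [];
    libraries[root].push(id);                 // same root -> same library
  }
  return libraries;
}

const ids = ['vote@1', deriveTargetId('vote@1', 2), 'quiz@1'];
const libs = groupByVersionLibrary(ids);
// libs: { vote: ['vote@1', 'vote@2'], quiz: ['quiz@1'] }
```

Any scheme that makes the association between the two IDs recoverable would serve the same purpose.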
In some embodiments of the present application, based on the foregoing, the processing module 1130 is further configured to: displaying a configuration information editing interface of an existing interactive component template according to a use request of the existing interactive component template; displaying a parameter configuration interface of the existing interactive component template based on the configuration information acquired by the configuration information editing interface; and generating and storing a target interaction component according to the parameter information acquired by the parameter configuration interface.
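This parameter-only path — where the creator fills in exposed parameters instead of editing code — can be sketched as a placeholder substitution. The `{{placeholder}}` syntax and all names are assumptions for illustration:

```javascript
// Hypothetical parameter-configuration step: generate a target
// interaction component from an existing template plus parameter values
// collected from the parameter configuration interface.
function instantiateComponent(template, params) {
  // Substitute {{placeholders}} in the template's HTML with parameters.
  const html = template.html.replace(/\{\{(\w+)\}\}/g,
    (_, key) => params[key] ?? '');
  return { templateId: template.id, html, params };
}

const countdown = { id: 'countdown-01', html: '<span>{{seconds}}s to choose</span>' };
const component = instantiateComponent(countdown, { seconds: 10 });
// component.html === '<span>10s to choose</span>'
```

This is why the parameter path is lighter-weight than the code-editing path: the creator never sees the component code, only the parameters the template exposes.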
In some embodiments of the present application, based on the foregoing, the processing module 1130 is further configured to: generate a block according to the target interaction component template, and upload the block to a blockchain network for storage.
FIG. 12 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
It should be noted that the computer system of the electronic device shown in fig. 12 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 12, the computer system includes a Central Processing Unit (CPU)1201, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1202 or a program loaded from a storage section 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data necessary for system operation are also stored. The CPU 1201, ROM 1202, and RAM 1203 are connected to each other by a bus 1204. An Input/Output (I/O) interface 1205 is also connected to bus 1204.
The following components are connected to the I/O interface 1205: an input section 1206 including a keyboard, a mouse, and the like; an output section 1207 including a display device such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 1208 including a hard disk and the like; and a communication section 1209 including a network interface card such as a LAN (Local Area Network) card, a modem, and the like. The communication section 1209 performs communication processing via a network such as the Internet. A drive 1210 is also connected to the I/O interface 1205 as needed. A removable medium 1211, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1210 as necessary, so that a computer program read out therefrom is installed into the storage section 1208 as necessary.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1209, and/or installed from the removable medium 1211. When executed by the Central Processing Unit (CPU) 1201, the computer program performs the various functions defined in the system of the present application.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with a computer program embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not, in any case, constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A method for processing an interactive component in an interactive video is characterized by comprising the following steps:
displaying a configuration information editing interface of the interactive component template according to an editing request of the interactive component template;
displaying a code editing interface of the interaction component template based on the configuration information acquired by the configuration information editing interface;
and generating and storing a target interaction component template based on the component codes acquired by the code editing interface.
2. The processing method according to claim 1, wherein if the interactive component template is an existing interactive component template, displaying a code editing interface of the interactive component template based on the configuration information acquired by the configuration information editing interface, includes:
and displaying a code editing interface of the interactive component template based on the configuration information acquired by the configuration information editing interface, wherein the code editing interface comprises the component codes of the interactive component template.
3. The processing method of claim 2, wherein generating and storing a target interaction component template based on the component code obtained by the code editing interface comprises:
receiving and storing modifications to the component code;
generating a target interaction component template according to the modification aiming at the component code;
and storing the target interaction component template.
4. The processing method of claim 1, further comprising:
and uploading the interactive component template to an interactive component template library according to the sharing request of the target interactive component template.
5. The processing method of claim 4, further comprising:
displaying the interactive component templates contained in the interactive component template library according to the access request of the interactive component template library;
and responding to the interaction component template selected in the interaction component template library, and acquiring the selected interaction component template.
6. The processing method of claim 1, wherein the interactive component template is an existing interactive component template, the processing method further comprising:
determining the identification information of the target interaction component template according to the identification information of the interaction component template;
and associating the determined identification information of the target interaction component template with the target interaction component template.
7. The processing method of claim 6, further comprising:
determining a related interactive component template according to the identification information of the interactive component template;
and dividing the associated interactive component template into the same component version management library for storage.
8. The processing method of claim 1, further comprising:
displaying a configuration information editing interface of an existing interactive component template according to an application request of the existing interactive component template;
displaying a parameter configuration interface of the existing interactive component template based on the configuration information acquired by the configuration information editing interface;
and generating and storing a target interaction component according to the parameter information acquired by the parameter configuration interface.
9. The processing method according to any one of claims 1 to 8, wherein storing the target interaction component template comprises:
and generating a block according to the target interaction component template, and uploading the block to a block chain network for storage.
10. A device for processing an interactive component in an interactive video, comprising:
the first display module is used for displaying a configuration information editing interface of the interactive component template according to an editing request aiming at the interactive component template;
the second display module is used for displaying a code editing interface of the interaction component template based on the configuration information acquired by the configuration information editing interface;
and the processing module is used for generating and storing a target interaction component template based on the component codes acquired by the code editing interface.
CN201911364509.3A 2019-12-26 2019-12-26 Processing method and device for interactive components in interactive video Active CN113064590B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911364509.3A CN113064590B (en) 2019-12-26 2019-12-26 Processing method and device for interactive components in interactive video

Publications (2)

Publication Number Publication Date
CN113064590A true CN113064590A (en) 2021-07-02
CN113064590B CN113064590B (en) 2023-10-27

Family

ID=76557878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911364509.3A Active CN113064590B (en) 2019-12-26 2019-12-26 Processing method and device for interactive components in interactive video

Country Status (1)

Country Link
CN (1) CN113064590B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108156523A (en) * 2017-11-24 2018-06-12 互影科技(北京)有限公司 The interactive approach and device that interactive video plays
CN108228836A (en) * 2018-01-04 2018-06-29 武汉斗鱼网络科技有限公司 Video compatible loading method, device and video component
WO2019040543A1 (en) * 2017-08-22 2019-02-28 Codestream, Inc. Systems and methods for providing an instant communication channel within integrated development environments
CN110225412A (en) * 2019-07-05 2019-09-10 腾讯科技(深圳)有限公司 Video interaction method, device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SIEMENS / ITALTEL: "R3-99961 "Info Model and State Management Functions for NodeB logical O&M"", 3GPP TSG_RAN\WG3_IU, no. 3 *
ZHANG Jing; HUANG Xiaofeng: "Code Generation Tool Based on Business Model and Interface Model", Computer and Information Technology, no. 02 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114845131A (en) * 2022-04-29 2022-08-02 北京达佳互联信息技术有限公司 Interactive information configuration method, device, electronic equipment, medium and program product
CN114845131B (en) * 2022-04-29 2023-10-03 北京达佳互联信息技术有限公司 Interactive information configuration method and device, electronic equipment, medium and program product

Also Published As

Publication number Publication date
CN113064590B (en) 2023-10-27

Similar Documents

Publication Publication Date Title
US11664019B2 (en) Content presentation analytics and optimization
US10855765B2 (en) Content atomization
US10565293B2 (en) Synchronizing DOM element references
US20210074276A1 (en) Content Segmentation and Time Reconciliation
CN111079047B (en) Web-oriented page construction system
US20120226776A1 (en) System and Methods for Facilitating the Synchronization of Data
US20110161847A1 (en) System and method for integrating and publishing pages of content
CN110719524A (en) Video playing method and device, intelligent playing equipment and storage medium
CN111031400B (en) Barrage presenting method and system
US20160191975A1 (en) An apparatus for providing, editing and playing video contents and the method thereof
CN110381383B (en) Method and device for generating interactive audio and video based on mobile terminal, computing equipment and storage medium
CN110784753B (en) Interactive video playing method and device, storage medium and electronic equipment
CN108781311A (en) Video player frame for distribution of media and management platform
CN112528203A (en) Webpage-based online document making method and system
CN109032599B (en) Method, device, equipment and medium for generating interactive flow chart based on XML (extensive Makeup language) representation
CN111031399A (en) Bullet screen processing method and system
CN113064590B (en) Processing method and device for interactive components in interactive video
CN112861472B (en) Shared document display method, device and equipment and computer readable storage medium
CN113535177A (en) Form generation method, device and equipment
CN113207039A (en) Video processing method and device, electronic equipment and storage medium
JP5291448B2 (en) Content production server and content production program
CN104111768B (en) Interactive window and method and system for customizing, quoting and synchronizing interactive window
US20110145841A1 (en) System and method for generating pages of content
CN115065866B (en) Video generation method, device, equipment and storage medium
CN112699157B (en) Information recommendation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40049215

Country of ref document: HK

GR01 Patent grant