CN110784753A - Interactive video playing method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN110784753A
Authority
CN
China
Prior art keywords
video
interactive
chapter
playing
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910979613.7A
Other languages
Chinese (zh)
Other versions
CN110784753B (en)
Inventor
雷彬 (Lei Bin)
刘里 (Liu Li)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910979613.7A
Publication of CN110784753A
Application granted
Publication of CN110784753B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44222Analytics of user selections, e.g. selection of programs or purchase activity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8541Content authoring involving branching, e.g. to different story endings

Abstract

The disclosure provides an interactive video playing method and apparatus, an electronic device, and a storage medium, and relates to the field of computer technology. The interactive video playing method comprises the following steps: in response to an interactive video being played on a graphical user interface, acquiring a first chapter video corresponding to the interactive video; acquiring, according to identification information of the first chapter video, interaction configuration information bound to the first chapter video, wherein the interaction configuration information comprises a trigger time; when the playing time of the first chapter video matches the trigger time, rendering an interactive control on the graphical user interface according to the interaction configuration information; and in response to selection of the interactive control, acquiring and playing a second chapter video corresponding to the interactive control to complete playback of the interactive video. The method and apparatus can increase the playing speed of the interactive video and improve the working efficiency of the system.

Description

Interactive video playing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interactive video playing method, an interactive video playing apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of internet technology, people's expectations for entertainment forms keep rising. Interactive Video (IV) is a new type of video in which the viewer interacts with the content while watching, enhancing somatosensory feedback and letting the viewer take part in the development of the plot, which makes for a richer viewing experience.
At present, the background data structure of existing interactive video maintains the relationships among videos, between interactions and videos, and among interaction styles, using an associative scheme. In this scheme each item of content is independent, and the interactions of a video can be read from the file, but the data structure is hard to read; at runtime, finding the interactions for a video requires a full scan of the interaction list (interactBlockList) to collect all interaction configurations used by the current video, which is inefficient. Moreover, the data structure only captures a flat association between videos and interactions and lacks an encapsulation of the video itself: videos and interactions are presented through one large plot story line, which cannot satisfy users' combining, arranging, and packaging of plots, degrading the user experience.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an interactive video playing method, an interactive video playing apparatus, an electronic device, and a computer-readable storage medium, so as to overcome, at least to some extent, the poor system performance and low efficiency in playing interactive video caused by the limitations and defects of the related art.
According to a first aspect of the present disclosure, an interactive video playing method is provided, which is applied to a terminal device with a graphical user interface, and the method includes:
in response to the interactive video being played on the graphical user interface, acquiring a first chapter video corresponding to the interactive video;
acquiring, according to identification information of the first chapter video, interaction configuration information bound to the first chapter video, wherein the interaction configuration information comprises a trigger time;
when the playing time of the first chapter video matches the trigger time, rendering an interactive control on the graphical user interface according to the interaction configuration information;
and in response to selection of the interactive control, acquiring and playing a second chapter video corresponding to the interactive control to complete playback of the interactive video.
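The four steps above can be sketched in code. This is an illustrative Python sketch only, not the patented implementation; all names (`InteractiveVideoPlayer`, `chapter_list`, `interactConfig`, `branches`, ...) are hypothetical, and the rendering step is reduced to building a description of the control.

```python
# Illustrative sketch of the claimed playback flow. All names are
# hypothetical; the patent does not prescribe a concrete implementation.

class InteractiveVideoPlayer:
    def __init__(self, video_list, chapter_list):
        # video_list maps a chapter video id to its video data, with the
        # interaction configuration already bound to each chapter video.
        self.video_list = video_list
        self.chapter_list = chapter_list

    def play(self, interactive_video_id, choose):
        # Step 1: acquire the first chapter video of the interactive video.
        first_id = self.chapter_list[interactive_video_id][0]
        chapter = self.video_list[first_id]
        # Step 2: the interaction configuration is bound directly to the
        # chapter video, so no full scan of an interaction list is needed.
        config = chapter["interactConfig"]
        # Step 3: at the trigger time, an interactive control described by
        # the configuration would be rendered on the graphical user
        # interface; here we only build its description.
        control = {"shown_at": config["triggerTime"],
                   "options": list(config["branches"])}
        # Step 4: in response to the user's selection, acquire and play the
        # second chapter video corresponding to the chosen option.
        second_id = config["branches"][choose(control)]
        return self.video_list[second_id]
```

Here `choose` stands in for the user's interaction with the rendered control; it receives the control description and returns the selected option.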
In an exemplary embodiment of the present disclosure, the interactive video playing method further involves a chapter list, and acquiring the first chapter video corresponding to the interactive video in response to the interactive video being played on the graphical user interface comprises:
in response to the interactive video being played on the graphical user interface, acquiring a chapter list corresponding to the interactive video;
and acquiring the first chapter video corresponding to the interactive video according to the chapter list.
In an exemplary embodiment of the present disclosure, the interactive video playing method further involves a video list, and acquiring the first chapter video corresponding to the interactive video according to the chapter list comprises:
determining identification information of the first chapter video corresponding to the interactive video according to the chapter list;
and extracting the corresponding first chapter video from the video list using the identification information.
In an exemplary embodiment of the disclosure, before acquiring the interaction configuration information bound to the first chapter video, the method further comprises:
acquiring interaction configuration information of the first chapter video in the interactive video;
and binding the interaction configuration information to the first chapter video, and storing the first chapter video in a video list.
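The binding step described above can be sketched as follows. This is a hypothetical sketch: field names (`id`, `interactConfig`) and the dictionary-based video list are assumptions, chosen to show why a bound lookup avoids the full scan the background section criticizes.

```python
# Hypothetical sketch of the binding step: attach the interaction
# configuration to its chapter video and index the chapter by its
# identification information, so that a later lookup is a direct
# dictionary access instead of a full scan of an interaction list.

def bind_and_store(video_list, chapter_video, interact_config):
    bound = dict(chapter_video)            # copy; do not mutate the input
    bound["interactConfig"] = interact_config
    video_list[bound["id"]] = bound        # keyed by identification info
    return video_list
```

With this layout, `video_list["chapter01"]["interactConfig"]` retrieves the bound configuration in a single lookup.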
In an exemplary embodiment of the present disclosure, rendering an interactive control on the graphical user interface according to the interaction configuration information comprises:
acquiring element information in the interactive configuration information; wherein the element information includes one or more of a text element, a picture element, a button element, and a control element;
and rendering an interactive control and interactive content on the graphical user interface according to the element information.
In an exemplary embodiment of the disclosure, before rendering the interactive control and the interactive content on the graphical user interface according to the element information, the method further includes:
acquiring a preset interactive configuration template, and determining configuration parameters corresponding to the interactive configuration template;
and generating the element information according to the configuration parameters.
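Generating element information from a preset template can be sketched as below. The template shape (a mapping from element kind to the parameter names it expects) is an assumption made for illustration; the patent does not specify the template format.

```python
# Hypothetical sketch: a preset interaction configuration template lists,
# per element kind, the configuration parameters it expects; filling the
# template with concrete parameter values yields the element information
# used for rendering.

def generate_element_info(template, params):
    return {kind: {name: params[name] for name in names}
            for kind, names in template.items()}
```

A customized template (see the next embodiment) would simply be a modified copy of the preset mapping passed to the same function.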
In an exemplary embodiment of the present disclosure, the method further comprises:
and acquiring modification data of the interactive configuration template, and modifying the interactive configuration template according to the modification data to generate a user-defined interactive configuration template.
According to a second aspect of the present disclosure, there is provided an interactive video playing device, including:
the chapter video acquisition module is used for acquiring, in response to the interactive video being played on the graphical user interface, a first chapter video corresponding to the interactive video;
the interactive configuration information acquisition module is used for acquiring interactive configuration information bound to the first chapter video according to the identification information of the first chapter video; wherein the interaction configuration information comprises a trigger time;
the interactive control rendering module is used for rendering an interactive control on the graphical user interface according to the interaction configuration information when the playing time of the first chapter video matches the trigger time;
and the interactive video playing module is used for acquiring, in response to selection of the interactive control, a second chapter video corresponding to the interactive control and playing the second chapter video to complete playback of the interactive video.
In an exemplary embodiment of the present disclosure, the chapter video acquisition module further includes:
the chapter list acquisition unit is used for acquiring, in response to the interactive video being played on the graphical user interface, a chapter list corresponding to the interactive video;
and the first chapter video acquisition unit is used for acquiring a first chapter video corresponding to the interactive video according to the chapter list.
In an exemplary embodiment of the present disclosure, the first chapter video acquisition unit is further configured to:
determining identification information of a first chapter video corresponding to the interactive video according to the chapter list;
and extracting the corresponding first chapter video from the video list using the identification information.
In an exemplary embodiment of the present disclosure, the interactive video playing apparatus further includes an interactive configuration information binding unit, and the interactive configuration information binding unit is configured to:
acquiring interaction configuration information of the first chapter video in the interactive video;
and binding the interaction configuration information to the first chapter video, and storing the first chapter video in a video list.
In an exemplary embodiment of the disclosure, the interactive control rendering module is further configured to:
acquiring element information in the interactive configuration information; wherein the element information includes one or more of a text element, a picture element, a button element, and a control element;
and rendering an interactive control and interactive content on the graphical user interface according to the element information.
In an exemplary embodiment of the present disclosure, the interactive video playback apparatus further includes an element information generating unit configured to:
acquiring a preset interactive configuration template, and determining configuration parameters corresponding to the interactive configuration template;
and generating the element information according to the configuration parameters.
In an exemplary embodiment of the present disclosure, the interactive video playing apparatus further includes a custom interactive configuration template generating unit, where the custom interactive configuration template generating unit is configured to:
and acquiring modification data of the interactive configuration template, and modifying the interactive configuration template according to the modification data to generate a user-defined interactive configuration template.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the interactive video playing method provided by an example embodiment of the present disclosure, when an interactive video is played, a first chapter video corresponding to the interactive video is acquired, the interaction configuration information bound to the first chapter video is then acquired, and when the playing time of the first chapter video matches the trigger time, an interactive control is rendered on the graphical user interface according to the interaction configuration information; in response to selection of the interactive control, a second chapter video corresponding to the interactive control is acquired and played to complete playback of the interactive video. On the one hand, dividing the interactive video into a first chapter video and a second chapter video improves the readability of the corresponding data structure; treating the interactive video as a further encapsulation of videos and their interaction relationships also makes it easier for users to use and extend, improving the user experience. On the other hand, when the first chapter video is played, the interaction configuration information bound to it is obtained directly, without a full scan of the interaction configuration list for the configuration corresponding to the first chapter video, which improves the working efficiency of the system.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic diagram illustrating an exemplary system architecture to which the method and apparatus for interactive video playing according to the embodiments of the present disclosure may be applied;
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device used to implement embodiments of the present disclosure;
fig. 3 is a schematic diagram illustrating the relationship between an index script file and interaction interval script files corresponding to an interactive video in the related art, according to an embodiment of the present disclosure;
fig. 4 schematically shows a diagram of an interactive video presentation in the related art according to an embodiment of the present disclosure;
fig. 5 schematically shows a flow diagram of an interactive video playback method according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a schematic diagram of an interactive video correspondence protocol file according to one embodiment of the present disclosure;
fig. 7 schematically shows a schematic block diagram of an interactive video playback device according to an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which the method and apparatus for interactive video playing according to the embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The interactive video playing method provided by the embodiments of the present disclosure is generally executed by the server 105, and accordingly, the interactive video playing apparatus is generally disposed in the server 105. However, as those skilled in the art will readily appreciate, the interactive video playing method may also be executed by the terminal devices 101, 102, and 103, and the interactive video playing apparatus may accordingly be disposed in the terminal devices 101, 102, and 103, which is not particularly limited in this exemplary embodiment. For example, in an exemplary embodiment, a user may upload interactive operations through the terminal devices 101, 102, and 103 to the server 105, and the server, using the interactive video playing method provided by the embodiments of the present disclosure, transmits the acquired or generated interactive video back to the terminal devices 101, 102, and 103.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU)201 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU201, ROM 202, and RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
In particular, the processes described below with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 209 and/or installed from the removable medium 211. The computer program, when executed by a Central Processing Unit (CPU)201, performs various functions defined in the methods and apparatus of the present application. In some embodiments, the computer system 200 may further include an AI (artificial intelligence) processor for processing computing operations related to machine learning.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 5, and so on.
The technical solution of the embodiment of the present disclosure is explained in detail below:
the interactive video can be a completely new video type, when a user watches the video, the user can enhance the somatosensory feedback through the interaction with the video, participate in the plot development and bring richer watching experience to the user. The interactive video interaction capability can be the capability of enabling users to interact in the interactive video, and comprises a branching scenario, visual angle switching, interactive bubbles and the like, wherein the branching scenario is the capability of supporting the users to select a scenario development direction and go to different scenario contents; the method comprises the following steps of switching visual angles, supporting switching among a plurality of visual angles, and acquiring different content watching experiences; the interactive bubbles support adding information such as characters and pictures into the video, and users can click, slide and the like to interact.
An interactive video content configuration data structure scheme describes an interactive video created by an interactive video creator in a structured, standardized, and highly readable data format. In an interactive video, the video is the main medium; interaction capability builds on the video content, so the relationship between video content and interactions needs to be maintained. In addition, interaction capabilities may use materials such as pictures and audio, as well as the video itself; the overall data structure therefore needs to account for the composition of videos, interactions, and materials (pictures, audio, etc.).
The interactive video protocol in the related art is a set of script files carrying the interaction information, and mainly comprises three parts: the index script file (index.json) provides index information for all playing sections and interaction sections and is unique within the script file set; interaction interval script files (interact+number-para.json) are matched with interaction component style files to dynamically configure interaction component content, with one script file per interaction interval (there may be several); and resource files provide the image resources used by the interaction interval scripts. The resource files mainly hold the static resources used by the interactive video. The relationship between the index script file and the interaction interval script files is shown in fig. 3.
Fig. 3 schematically illustrates the index script file and an interaction interval script file of an existing interactive video according to an embodiment of the present disclosure. Referring to fig. 3, a video list 302 (playBlockList) and an interaction list 303 (interactBlockList) are maintained in the index script file 301 (index.json). Each interaction in the interaction list 303 (interactBlockList) is associated with a video in the video list 302 (playBlockList) through an interaction identification blockId, which records the basic relationship between the interaction and the video, for example, at which time point of the video the interaction starts, how long the interaction lasts, the conditions under which the interaction is displayed, and the like. Meanwhile, the style and content of the interaction are recorded in a corresponding file, for example, the interaction interval script file 001 (interact001-para.json) 306. This file 306 describes the detailed information of the interaction whose id is interact001, including the title text, the buttons, the button response events, and the like; for example, in fig. 3, a text element 307 (metas), a picture element 308 (imgs), a button element 309 (btns), and a control element 310 (ctrl) are used to describe the interaction style.
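A minimal sketch of the related-art script file set described above. The top-level field names (playBlockList, interactBlockList, blockId, metas, imgs, btns, ctrl) follow fig. 3; every concrete value and the remaining field names are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical sketch of the related-art script file set. Field names follow
# fig. 3; concrete values and secondary field names are illustrative only.
index_json = {
    "playBlockList": [                      # all playing sections
        {"blockId": "v001", "videoUrl": "v001.mp4"},
    ],
    "interactBlockList": [                  # global list of all interaction sections
        {"blockId": "v001",                 # associates the interaction with a video
         "interactId": "interact001",       # points at interact001-para.json
         "startTime": 12.0,                 # when in the video the interaction starts
         "duration": 5.0},                  # how long the interaction lasts
    ],
}

interact001_para_json = {                   # one script file per interaction interval
    "metas": [{"text": "Choose a path"}],   # text elements
    "imgs": [{"url": "bg.png"}],            # picture elements
    "btns": [{"text": "Go left", "action": "play:v002"}],  # button elements
    "ctrl": {"countdown": 10},              # control elements
}
```

Note that the style and content of an interaction live in a file separate from the index, which is exactly the cross-file association the defect discussion below is concerned with.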
The approach used in the related scheme is to formulate multiple sets of interactive component style files (UIInfo). Referring to fig. 4, UI0001 represents one style of the branching-scenario capability: this style specifies 1 text element in metas (the title), 3 picture elements in imgs (the background image before button selection, the background image after button selection, and the component background image), 1 to 5 button elements in btns, and 1 custom element (a countdown). Because the interaction style is fully specified, only information such as the caption text, the interaction response information, and the countdown time needs to be filled into the interaction interval script file interact001-para.json.
However, the related art maintains the videos, the interactions, the video-interaction relationships, and the interaction styles separately and links them by association. In this way each item of content is independent, and how many videos and interactions exist can be seen clearly from the files, but this way of organizing the data structure has many defects. Readability is poor: viewing the interaction configuration of a video requires switching among a plurality of configuration files. It is unfriendly to the program: at run time the video is played first and then the interactions used by the video are looked up, and in the related art each lookup requires querying the full interaction list (interactBlockList) to find all interaction configurations used by the current video. There is only a single layer of association between videos and interactions, lacking a structural encapsulation of the videos, so the videos and interactions are presented as one large plot storyline, which cannot satisfy a creator's combination, arrangement, and packaging of chapters. Finally, the styles are over-constrained: a component style cannot be modified after being defined, so fine-tuning the current component template style requires copying it, modifying the copy, and renaming it as a new component style file.
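The lookup cost criticized above can be sketched as follows. The structures are hypothetical and follow fig. 3; the point is that every video played must scan the full global interaction list:

```python
# Related-art lookup: every video scans the whole global interaction list
# (structures are hypothetical, following fig. 3).
interact_block_list = [
    {"blockId": "v001", "interactId": "interact001"},
    {"blockId": "v002", "interactId": "interact002"},
    {"blockId": "v001", "interactId": "interact003"},
]

def interactions_for_video(block_id):
    # O(n) over ALL interactions of the entire work, repeated for every video
    return [b["interactId"] for b in interact_block_list
            if b["blockId"] == block_id]

print(interactions_for_video("v001"))  # scans 3 entries to find 2
```

The scheme of this disclosure avoids this by mounting each video's interactions directly under the video entry, so the lookup becomes a direct access rather than a full scan.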
Based on one or more of the problems described above, the present example embodiment provides an interactive video playing method. The interactive video playing method may be applied to the server 105, and may also be applied to one or more of the terminal devices 101, 102, and 103, which is not particularly limited in this exemplary embodiment. Referring to fig. 5, the interactive video playing method may include the following steps S510 to S540:
step S510, in response to an interactive video being played on the graphical user interface, acquiring a first chapter video corresponding to the interactive video;
step S520, acquiring interaction configuration information bound to the first chapter video according to identification information of the first chapter video, wherein the interaction configuration information comprises a trigger time;
step S530, when the playing time of the first chapter video matches the trigger time, rendering an interactive control on the graphical user interface according to the interaction configuration information;
step S540, in response to the selection of the interactive control, acquiring a second chapter video corresponding to the interactive control and playing the second chapter video, so as to complete the playing of the interactive video.
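Steps S510 to S540 can be sketched as the following player loop. All class, field, and value names here are hypothetical illustrations, not part of the disclosed protocol:

```python
# Minimal sketch of steps S510-S540; all names and values are hypothetical.
class InteractivePlayer:
    def __init__(self, chapter_list, video_list):
        self.chapter_list = chapter_list    # chapters -> ordered video IDs
        self.video_list = video_list        # video ID -> video info + bound interactions

    def start(self):
        # S510: in response to playing, acquire the first chapter video
        first_id = self.chapter_list[0]["videoIdList"][0]
        return self.play(first_id)

    def play(self, video_id):
        video = self.video_list[video_id]
        # S520: the interaction configuration is bound to the video itself,
        # so no scan of a global interaction list is needed
        return video, video["interactList"]

    def on_tick(self, interact_list, play_time):
        # S530: when the playing time matches the trigger time, return the
        # controls to render on the graphical user interface
        return [it for it in interact_list if play_time >= it["triggerTime"]]

    def on_select(self, control):
        # S540: the selected control leads to the second chapter video
        return self.play(control["nextVideoId"])

# Usage with a two-video example:
player = InteractivePlayer(
    chapter_list=[{"videoIdList": ["v1"]}],
    video_list={
        "v1": {"interactList": [{"triggerTime": 10.0,
                                 "nextVideoId": "v2",
                                 "text": "Go right"}]},
        "v2": {"interactList": []},
    },
)
video, interacts = player.start()
controls = player.on_tick(interacts, play_time=10.0)
second, _ = player.on_select(controls[0])
```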
In the interactive video playing method provided by this exemplary embodiment, on one hand, the interactive video is divided into the first chapter video and the second chapter video, which improves the readability of the data structure corresponding to the interactive video and, at the same time, serves as a further encapsulation of the video-interaction relationship, which is convenient for the creator to use and extend and improves the user experience; on the other hand, when the first chapter video is played, the interaction configuration information bound to the first chapter video is obtained directly, without querying the full interaction configuration list for the configuration corresponding to the first chapter video, which improves the working efficiency of the system.
The above steps of the present exemplary embodiment will be described in more detail below.
In step S510, in response to an interactive video being played on the graphical user interface, a first chapter video corresponding to the interactive video is acquired.
In an example embodiment of the present disclosure, the interactive video may refer to a video that provides capabilities enabling a user to interact with it; for example, the interactions in the interactive video may include a branching scenario, view angle switching, an interactive bubble, and the like, which is not particularly limited in this example embodiment. The first chapter video may refer to the video, preceding any interaction, that corresponds to a chapter among the chapters divided according to the plot development when the interactive video is produced; for example, the first chapter video may be a video introducing the plot background in the first chapter of the interactive video, or a guide video preceding the interactions in the interactive video. Of course, the first chapter video may also be another video resource, which is not particularly limited in this example embodiment.
Specifically, in response to the interactive video being played on the graphical user interface, a chapter list corresponding to the interactive video is obtained, and the first chapter video corresponding to the interactive video is obtained according to the chapter list. The chapter list may be a list formed by dividing the interactive video into a plurality of chapters according to the plot development or content when the creator produces the interactive video. Because an interactive video may belong to various types such as TV dramas, movies, and variety shows, setting a chapter list for the interactive video adds a layer of encapsulation to the video-interaction relationship, satisfies requirements of the interactive video such as multiple seasons, multiple episodes, and chapter-by-chapter pay unlocking, facilitates use and extension by the creator, and improves the user experience.
Specifically, the identification information of the first chapter video corresponding to the interactive video is determined according to the chapter list, and the corresponding first chapter video is extracted from the video list through the identification information. The identification information may refer to an identifier corresponding to the first chapter video; for example, it may be a video name (ID) or coding information corresponding to the first chapter video, and of course it may also be any other information capable of uniquely identifying the first chapter video, which is not particularly limited in this example embodiment. The video list may be a list recording all the video resources corresponding to the interactive video; the corresponding address link is found in the video list according to the identification information, and the corresponding first chapter video is obtained from the resource file corresponding to the interactive video according to the address link. Managing the video resources in the resource file through the chapter list and the video list can improve the efficiency with which the system acquires the video resource to be played, shorten the loading time of the interactive video, and improve the user experience.
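The two-step resolution described above (chapter list → identification information → video list → address link) might look as follows; the structures, field names, and URL are assumptions for illustration:

```python
# Hypothetical structures: the chapter list yields identification information,
# which the video list resolves to an address link in the resource file.
chapter_list = [{"title": "Chapter 1", "videoIdList": ["vid_0001"]}]
video_list = {"vid_0001": {"url": "https://cdn.example.com/vid_0001.mp4"}}

def first_chapter_video(chapters, videos):
    video_id = chapters[0]["videoIdList"][0]   # identification information
    return videos[video_id]["url"]             # address link into the resource file

print(first_chapter_video(chapter_list, video_list))
```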
In step S520, the interaction configuration information bound to the first chapter video is obtained through the identification information of the first chapter video.
In an example embodiment of the present disclosure, the interaction configuration information may refer to configuration information, set when the creator produces the interactive video, corresponding to an interaction occurring in the video. For example, it may be the configuration information corresponding to an option of a branching scenario in the interactive video, or the configuration information corresponding to an interaction requiring view angle switching; of course, it may also be the configuration information corresponding to an interactive bubble in the interactive video, which is not particularly limited in this example embodiment. The interaction configuration information can be bound to the corresponding video resource in the data structure; when the interactive video is played, it can be obtained quickly according to the identification information of the currently played video, without a full query of a configuration list, which speeds up acquisition of the interaction configuration information, improves the loading efficiency of the interactive video, and improves the user experience.
Further, before the interaction configuration information bound to the first chapter video is acquired, that is, when the creator produces the interactive video, the interaction configuration information set by the creator for the first chapter video (or another chapter video) in the interactive video is acquired, the interaction configuration information is bound to the first chapter video (or the other chapter video), and the first chapter video (or the other chapter video) is stored in the video list (the associated resource file).
In step S530, when the playing time of the first chapter video matches the trigger time, an interactive control is rendered on the graphical user interface according to the interaction configuration information.
In an example embodiment of the present disclosure, the interaction configuration information may include a trigger time, where the trigger time may refer to the time, set by the producer, at which an interaction occurs while the interactive video is played. For example, the trigger time may be the time point at which the plot branches in the interactive video, or the time point at which the video switches view angles; of course, it may also be the time point at which an interactive bubble appears in the interactive video, which is not particularly limited in this example embodiment. When the playing time of the first chapter video matches the trigger time, that point is considered to be a time point at which the interactive control or interactive content can appear in the first chapter video so that the user can interact with the interactive video. The interactive control may be a control rendered in the graphical user interface (interactive video) and operable by the user; for example, it may be an option controlling the direction of the plot, or a button controlling view angle switching in the interactive video, and of course it may also be an interactive bubble in the interactive video, which is not limited in this example embodiment.
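One way the time matching might be implemented is sketched below. Because a player reports progress at discrete ticks, an exact equality test would miss the trigger; the tolerance and the display window derived from the interaction's duration are assumptions of this sketch, not details specified by the disclosure:

```python
# Sketch of matching the playing time against trigger times. The tolerance
# and duration-based window are hypothetical design choices.
TICK_TOLERANCE = 0.25  # seconds; player ticks are discrete, so allow slack

def matched_interactions(interact_list, play_time):
    """Return interactions whose trigger window covers play_time."""
    hits = []
    for it in interact_list:
        start = it["triggerTime"]
        end = start + it.get("duration", 0.0)   # how long the control stays up
        if start - TICK_TOLERANCE <= play_time <= end + TICK_TOLERANCE:
            hits.append(it)
    return hits

interacts = [{"id": "i1", "triggerTime": 10.0, "duration": 5.0}]
print([i["id"] for i in matched_interactions(interacts, 10.1)])  # ['i1']
```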
Specifically, element information in the interaction configuration information is obtained, and the interactive control and the interactive content are rendered on the graphical user interface according to the element information. The element information may refer to the parameter types provided in the interaction configuration template for composing the interaction configuration information; for example, the element information may be a text element (metas) or a picture element (imgs), and of course it may also be a button element (btns) or a control element (ctrl), which is not limited in this example embodiment. The content attribute of each piece of element information is composed of specified parameter configurations, such as the element's name, attribute value, style information, response event, and occurrence condition, to which this exemplary embodiment is not limited. The interactive content may refer to background content and the like generated in the interactive video when the interactive control is provided.
Further, a preset interaction configuration template is obtained, configuration parameters corresponding to the interaction configuration template are determined, and the element information is generated according to the configuration parameters. The interaction configuration template may refer to a configuration template provided in advance by the developer through which a user can configure an interactive video. The user composes the element information by filling in the specified parameters of the interaction configuration template, thereby generating the interaction configuration information; for example, the specified parameters may refer to the element's name, attribute value, style information, response event, occurrence condition, and the like, which is not limited in this example embodiment.
Furthermore, modification data of the interactive configuration template is obtained, and the interactive configuration template is modified according to the modification data to generate a user-defined interactive configuration template. The modification data may be data that a user performs a custom adjustment based on a preset interactive configuration template, and the preset interactive configuration template may be adjusted according to the modification data to generate a custom interactive configuration template that is adapted to an interactive video of the user. The corresponding user-defined interactive configuration template is generated through the modification data of the user, the user can make an expected interactive video, the extensibility of the interactive video making is improved, and the use experience of the user is improved.
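The derive-by-modification mechanism can be sketched as merging a preset template with the creator's modification data. The merge semantics (a shallow merge here) and all field names are assumptions of this sketch:

```python
# Hypothetical sketch: custom template = preset template + modification data,
# without copying and renaming the base file as the related art requires.
preset_template = {
    "id": "UI0001",
    "metas": [{"name": "title", "prop": "Choose"}],
    "btns": [{"name": "optionA", "style": {"x": 100, "y": 400}}],
}

def customize(base, modifications):
    """Derive a custom template from a preset one via modification data."""
    custom = {**base, **modifications}  # shallow merge; deep merge is a design choice
    custom["base"] = base["id"]         # record which template it derives from
    return custom

custom = customize(preset_template,
                   {"id": "UI0001_custom",
                    "btns": [{"name": "optionA", "style": {"x": 50, "y": 400}}]})
print(custom["base"])  # UI0001
```

Recording the base template's ID mirrors the base mechanism described for the interaction template 606 (template) below in connection with fig. 6.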
In step S540, in response to the selection of the interactive control, a second chapter video corresponding to the interactive control is acquired and played to complete the playing of the interactive video.
In an example embodiment of the present disclosure, the responding to the selection of the interactive control may refer to the user or the viewer triggering an operation of the interactive control provided by the interactive video, for example, the operation may be an operation of the user selecting the interactive control through touch by a terminal device, or an operation of triggering the interactive control through a computer cursor, which is not limited in this example embodiment. The second chapter of video may be a video corresponding to the next scenario trend triggered by the user through the interactive control in the first chapter of video, for example, an option a and an option B are provided in the first chapter of video for the user to select, and when the user selects the option B, the second chapter of video corresponding to the option B is obtained.
It should be noted that "first" and "second" in this exemplary embodiment are used to distinguish the video resources (files) on either side of an interaction node (or in different chapters), and should not impose any limitation on this exemplary embodiment.
Fig. 6 schematically shows a schematic diagram of an interactive video correspondence protocol file according to an embodiment of the present disclosure.
Referring to fig. 6, the protocol file 601 corresponding to the interactive video contains a chapter list 602 (chapterList), which gives the creator the ability to organize multiple plots and chapters. If there is actually only one chapter, only one default chapter needs to be provided. Each entry in the chapter list 602 (chapterList) has a video ID list array; this field records the video content information used by the chapter, and the first video ID in the array is the video content entry of the chapter. The video content is maintained by a video list 603 (videoList), which records all the video content used by the entire interactive video; the video content ID is provided to the chapter list 602 (chapterList) as a unique identifier. An interaction list 604 (interactList) is recorded in association with each item in the video list (videoList); that is, the interaction content configured for each video is mounted directly under the video information, and one video can be configured with a plurality of interaction contents. The interaction configuration 605 (interactInfo) locates the interaction template used by its ID, and an interaction content can be defined by configuring the information of its specific elements, where the element information includes text elements (metas), picture elements (imgs), button elements (btns), and control elements (ctrl). In the definition of the interaction template 606 (template), each template configuration is composed of the four element types (metas, imgs, btns, ctrl), and the content attribute of each element is composed of the specified parameter configuration 607 (template base): the name, the attribute value prop, the style type, the response action, and the dynamic display condition showVariableExpress. These parameters together describe the name, attribute value, style information, response event, and occurrence condition of an element.
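The hierarchy of fig. 6 might be serialized as follows. The top-level field names (chapterList, videoList, interactList, interactInfo, template) follow the figure; the nesting details and every concrete value are illustrative assumptions:

```python
import json

# Hypothetical serialization of the fig. 6 protocol file; top-level field
# names follow the figure, everything else is illustrative.
protocol = {
    "chapterList": [
        {"chapterId": "c1", "videoIdList": ["v1", "v2"]},  # first ID = chapter entry
    ],
    "videoList": [
        {"videoId": "v1", "url": "v1.mp4",
         "interactList": [                                 # mounted under the video
             {"interactInfo": {
                 "templateId": "UI0001",                   # locates the template by ID
                 "metas": [{"name": "title", "prop": "Choose a path"}],
                 "btns": [{"name": "optionB", "prop": "Go right",
                           "action": "play:v2"}],
             }},
         ]},
        {"videoId": "v2", "url": "v2.mp4", "interactList": []},
    ],
    "template": [
        {"templateId": "UI0001",                           # element slots per fig. 4
         "metas": [{"name": "title"}], "imgs": [], "btns": [], "ctrl": []},
    ],
}

print(sorted(protocol.keys()))  # ['chapterList', 'template', 'videoList']
```

Because the interaction list is nested under each video entry, the player reaches a video's interactions in one step, in contrast to the global-list scan of the related art.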
The custom component template 608 (custom interactTemplate) means that a creator can make customized modifications based on an existing template, including modifying the style information of a component; thus the base mechanism in the interaction template 606 (template) can be used to derive from an existing template, satisfying the requirement of rewriting and extending a custom template.
Because of its particularity and complexity, the interaction configuration is generally generated by a program rather than written by hand; after it finally lands in an interaction configuration file, however, its accuracy may need to be checked manually, which requires the configuration file to have a certain readability. The program actually consuming the interaction configuration (the interactive player) demands speed and performance: the shortest possible path is required to look up a video and its corresponding interactions and then execute and render, to achieve optimal performance and efficiency. The interactive work may belong to various types such as TV dramas, movies, and variety shows, and requirements such as multiple seasons, multiple episodes, and pay unlocking must be satisfiable, so a layer of encapsulation needs to be added to the video-interaction relationship to facilitate use and extension by the creator. The interaction requirements are diverse, and the creator's interactions, especially their styles, must not be restricted: requiring certain buttons to sit at specified positions would likely constrain the creator's imagination, so the interaction styles need highly friendly extensibility, imposing no restrictions while still organizing the data well.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in this example embodiment, an interactive video playing device is also provided. The interactive video playing device can be applied to a server or terminal equipment. Referring to fig. 7, the interactive video playing apparatus 700 may include a chapter video acquiring module 710, an interactive configuration information acquiring module 720, an interactive control rendering module 730, and an interactive video playing module 740. Wherein:
the chapter video acquisition module 710 is configured to respond to an interactive video played on the graphical user interface and acquire a first chapter video corresponding to the interactive video;
the interactive configuration information obtaining module 720 is configured to obtain interactive configuration information bound to the first chapter video according to the identification information of the first chapter video; wherein the interaction configuration information comprises a trigger time;
the interactive control rendering module 730 is configured to render an interactive control on the graphical user interface through the interactive configuration information when the first chapter video playing time matches the trigger time;
the interactive video playing module 740 is configured to respond to the selection of the interactive control, acquire and play a second chapter video corresponding to the interactive control to complete playing of the interactive video.
In an exemplary embodiment of the present disclosure, the chapter video acquisition module 710 further includes:
the chapter list acquisition unit is used for responding to the interactive video played on the graphical user interface and acquiring a chapter list corresponding to the interactive video;
and the first chapter video acquisition unit is used for acquiring a first chapter video corresponding to the interactive video according to the chapter list.
In an exemplary embodiment of the present disclosure, the first chapter video acquisition unit is further configured to:
determining identification information of a first chapter video corresponding to the interactive video according to the chapter list;
and extracting the corresponding first chapter video from the video list through the identification information.
In an exemplary embodiment of the present disclosure, the interactive video playing apparatus 700 further includes an interactive configuration information binding unit, and the interactive configuration information binding unit is configured to:
acquiring interaction configuration information of the first chapter video in the interaction video;
and binding the interaction configuration information with the first chapter video, and storing the first chapter video into a video list.
In an exemplary embodiment of the disclosure, the interactive control rendering module 730 is further configured to:
acquiring element information in the interactive configuration information; wherein the element information includes one or more of a text element, a picture element, a button element, and a control element;
and rendering an interactive control and interactive content on the graphical user interface according to the element information.
In an exemplary embodiment of the present disclosure, the interactive video playback device 700 further includes an element information generating unit configured to:
acquiring a preset interactive configuration template, and determining configuration parameters corresponding to the interactive configuration template;
and generating the element information according to the configuration parameters.
In an exemplary embodiment of the present disclosure, the interactive video playing apparatus 700 further includes a custom interactive configuration template generating unit, and the custom interactive configuration template generating unit is configured to:
and acquiring modification data of the interactive configuration template, and modifying the interactive configuration template according to the modification data to generate a user-defined interactive configuration template.
The specific details of each module or unit in the interactive video playing apparatus have been described in detail in the corresponding interactive video playing method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An interactive video playing method is applied to terminal equipment with a graphical user interface, and the method comprises the following steps:
responding to the interactive video played on the graphical user interface, and acquiring a first chapter video corresponding to the interactive video;
acquiring interaction configuration information bound to the first chapter video according to the identification information of the first chapter video; wherein the interaction configuration information comprises a trigger time;
when the first chapter video playing time is matched with the triggering time, rendering an interactive control on the graphical user interface according to the interactive configuration information;
and responding to the selection of the interactive control, and acquiring and playing a second chapter video corresponding to the interactive control to finish the playing of the interactive video.
2. The interactive video playing method according to claim 1, wherein the interactive video playing method further includes a chapter list, and the obtaining a first chapter video corresponding to the interactive video in response to playing the interactive video on the graphical user interface includes:
responding to the interactive video played on the graphical user interface, and acquiring a chapter list corresponding to the interactive video;
and acquiring a first chapter video corresponding to the interactive video according to the chapter list.
3. The interactive video playing method according to claim 2, wherein the interactive video playing method further includes a video list, and the obtaining the first chapter video corresponding to the interactive video according to the chapter list includes:
determining identification information of a first chapter video corresponding to the interactive video according to the chapter list;
and extracting the corresponding first chapter video from the video list through the identification information.
4. The interactive video playing method according to claim 1, wherein before obtaining the interactive configuration information of the first chapter video binding, the method further comprises:
acquiring interaction configuration information of the first chapter video in the interaction video;
and binding the interaction configuration information with the first chapter video, and storing the first chapter video into a video list.
5. The interactive video playing method according to claim 1, wherein the rendering an interactive control on the graphical user interface according to the interactive configuration information comprises:
acquiring element information in the interactive configuration information; wherein the element information includes one or more of a text element, a picture element, a button element, and a control element;
and rendering an interactive control and interactive content on the graphical user interface according to the element information.
6. The interactive video playing method of claim 5, wherein before rendering the interactive control and the interactive content on the graphical user interface according to the element information, the method further comprises:
acquiring a preset interactive configuration template, and determining configuration parameters corresponding to the interactive configuration template;
and generating the element information according to the configuration parameters.
7. The interactive video playback method of claim 6, further comprising:
and acquiring modification data of the interactive configuration template, and modifying the interactive configuration template according to the modification data to generate a user-defined interactive configuration template.
8. An interactive video playback device, comprising:
the chapter video acquisition module is used for responding to the interactive video played on the graphical user interface and acquiring a first chapter video corresponding to the interactive video;
the interactive configuration information acquisition module is used for acquiring interactive configuration information bound to the first chapter video according to the identification information of the first chapter video; wherein the interaction configuration information comprises a trigger time;
the interactive control rendering module is used for rendering an interactive control on the graphical user interface through the interactive configuration information when the first chapter video playing time is matched with the triggering time;
and the interactive video playing module is used for responding to the selection of the interactive control, acquiring a second chapter video corresponding to the interactive control and playing the second chapter video so as to finish playing the interactive video.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 7 via execution of the executable instructions.
CN201910979613.7A 2019-10-15 2019-10-15 Interactive video playing method and device, storage medium and electronic equipment Active CN110784753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910979613.7A CN110784753B (en) 2019-10-15 2019-10-15 Interactive video playing method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110784753A true CN110784753A (en) 2020-02-11
CN110784753B CN110784753B (en) 2023-01-17

Family

ID=69385664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910979613.7A Active CN110784753B (en) 2019-10-15 2019-10-15 Interactive video playing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110784753B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040210947A1 (en) * 2003-04-15 2004-10-21 Shusman Chad W. Method and apparatus for interactive video on demand
CN104768083A (en) * 2015-04-07 2015-07-08 无锡天脉聚源传媒科技有限公司 Video playing method and device achieving chapter content display
CN105228015A (en) * 2015-09-30 2016-01-06 天脉聚源(北京)科技有限公司 A kind of method and device configuring TV interaction systems guidance information
CN105471871A (en) * 2015-11-26 2016-04-06 传线网络科技(上海)有限公司 Method and apparatus for providing video clip set
US20170168697A1 (en) * 2015-12-09 2017-06-15 Shahar SHPALTER Systems and methods for playing videos
CN108156523A (en) * 2017-11-24 2018-06-12 互影科技(北京)有限公司 The interactive approach and device that interactive video plays
CN108769814A (en) * 2018-06-01 2018-11-06 腾讯科技(深圳)有限公司 Video interaction method, device and readable medium
CN108924584A (en) * 2018-05-30 2018-11-30 互影科技(北京)有限公司 The packaging method and device of interactive video
CN109597981A (en) * 2017-09-30 2019-04-09 腾讯科技(深圳)有限公司 A kind of methods of exhibiting, device and the storage medium of text interactive information
CN109982142A (en) * 2017-12-28 2019-07-05 优酷网络技术(北京)有限公司 Video broadcasting method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111818371A (en) * 2020-07-17 2020-10-23 腾讯科技(深圳)有限公司 Interactive video management method and related device
CN111818371B (en) * 2020-07-17 2021-12-24 腾讯科技(深圳)有限公司 Interactive video management method and related device
CN111918140A (en) * 2020-08-06 2020-11-10 腾讯科技(深圳)有限公司 Video playing control method and device, computer equipment and storage medium
WO2022062771A1 (en) * 2020-09-27 2022-03-31 北京达佳互联信息技术有限公司 Livestreaming room data exchange method and apparatus
CN114125501A (en) * 2021-10-30 2022-03-01 杭州当虹科技股份有限公司 Interactive video generation method and playing method and device thereof

Also Published As

Publication number Publication date
CN110784753B (en) 2023-01-17

Similar Documents

Publication Publication Date Title
CN110784753B (en) Interactive video playing method and device, storage medium and electronic equipment
CN111294663B (en) Bullet screen processing method and device, electronic equipment and computer readable storage medium
CN109640188B (en) Video preview method and device, electronic equipment and computer readable storage medium
EP3457295A2 (en) Method for recording, editing and reproduction of computer session
US11899907B2 (en) Method, apparatus and device for displaying followed user information, and storage medium
CN113190314B (en) Interactive content generation method and device, storage medium and electronic equipment
CN107908401B (en) Multimedia file making method based on Unity engine
CN111741367B (en) Video interaction method and device, electronic equipment and computer readable storage medium
WO2020220773A1 (en) Method and apparatus for displaying picture preview information, electronic device and computer-readable storage medium
US20220310125A1 (en) Method and apparatus for video production, device and storage medium
CN112165652B (en) Video processing method, device, equipment and computer readable storage medium
US11941728B2 (en) Previewing method and apparatus for effect application, and device, and storage medium
JP2023539815A (en) Minutes interaction methods, devices, equipment and media
KR20190131074A (en) Virtual scene display method and device, and storage medium
US20170185422A1 (en) Method and system for generating and controlling composite user interface control
US20200066304A1 (en) Device-specific video customization
US11838576B2 (en) Video distribution system, method, computing device and user equipment
CN113014985A (en) Interactive multimedia content processing method and device, electronic equipment and storage medium
CN112015927B (en) Method and device for editing multimedia file, electronic equipment and storage medium
CN113365010A (en) Volume adjusting method, device, equipment and storage medium
CN104994429A (en) Video playing method and device
US20230326488A1 (en) Content creation based on text-to-image generation
CN113392260B (en) Interface display control method, device, medium and electronic equipment
KR102545040B1 (en) Video playback methods, devices, electronic devices, storage media and computer program products
AU2020288833B2 (en) Techniques for text rendering using font patching

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant