CN112437352B - Video merging and playing method and device - Google Patents


Info

Publication number
CN112437352B
CN112437352B
Authority
CN
China
Prior art keywords
video
control instruction
playing
user
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011253156.2A
Other languages
Chinese (zh)
Other versions
CN112437352A (en)
Inventor
崔英林
Current Assignee
Lianshang Xinchang Network Technology Co Ltd
Original Assignee
Lianshang Xinchang Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Lianshang Xinchang Network Technology Co Ltd filed Critical Lianshang Xinchang Network Technology Co Ltd
Priority to CN202011253156.2A priority Critical patent/CN112437352B/en
Publication of CN112437352A publication Critical patent/CN112437352A/en
Application granted granted Critical
Publication of CN112437352B publication Critical patent/CN112437352B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/65 Transmission of management data between client and server
    • H04N 21/658 Transmission by the client directed to the server
    • H04N 21/6587 Control parameters, e.g. trick play commands, viewpoint selection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44004 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/482 End-user interface for program selection
    • H04N 21/4825 End-user interface for program selection using a list of items to be played back in a given order, e.g. playlists

Abstract

The embodiment of the application discloses a video merging and playing method and device. One embodiment of the method comprises: caching a video list while downloading it, and reserving cache space for the not-yet-downloaded portion of each video in the list; acquiring the control instruction corresponding to each video in the video list; inserting each control instruction after its video in the list; and, in response to a control instruction being triggered, controlling the playing state of the corresponding video based on that instruction. Through proper memory management and instruction management, this embodiment improves video switching efficiency and alleviates video stuttering.

Description

Video merging and playing method and device
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a video merging and playing method and device.
Background
With the rapid development of the mobile internet, applications of all kinds keep emerging, and short video applications account for a large share of them. Users of short video applications switch between videos frequently, and each switch is costly, especially when the network caching delay is large. A common solution is to cache the video locally, save it to disk, and have the player load the local copy at switch time.
Disclosure of Invention
The embodiment of the application provides a video merging and playing method and device.
In a first aspect, an embodiment of the present application provides a video merging and playing method, including: caching a video list while downloading it, and reserving cache space for the not-yet-downloaded portion of each video in the list; acquiring the control instruction corresponding to each video in the video list; inserting each control instruction after its video in the list; and, in response to a control instruction being triggered, controlling the playing state of the corresponding video based on that instruction.
In some embodiments, the method further comprises: if the corresponding video has finished playing, continuing to control the playing state of subsequent videos based on the control instruction.
In some embodiments, the control instructions include at least one of the following types: playing instructions and user control instructions. A playing instruction is generated by an instruction server and triggered when video playback ends, and includes at least one of the following: replay, play the next video, and play the previous video. A user control instruction is generated based on a user operation and triggered immediately, and includes at least one of the following: fast forward, fast backward, slow forward, and slow backward.
In some embodiments, acquiring the control instruction corresponding to a video in the video list includes: displaying the currently played video on a user interface; in response to detecting a user operation on the user interface, sending the user-operation data to the instruction controller; and receiving a user control instruction generated by the instruction controller based on the user-operation data.
In some embodiments, in response to triggering the control instruction, controlling the playing state of the corresponding video based on the control instruction includes: and if the user operation occurs in one video, controlling the playing state of the video based on the user control instruction.
In some embodiments, controlling the playing state of the corresponding video based on the control instruction in response to the control instruction being triggered further includes: if the user operation spans at least two videos, controlling the playing state of one of the at least two videos based on the user control instruction; and continuing to control the playing state of subsequent videos based on the control instruction includes: if that one video has finished playing, jumping to play the next of the at least two videos, and continuing to control the playing state of that next video based on the user control instruction.
In some embodiments, the videos in the video list are pre-formatted to match the player of the client.
In some embodiments, the buffer space of the video in the video list is determined by the size of the video in the video list.
In a second aspect, an embodiment of the present application provides a video merging and playing apparatus, including: a caching unit configured to cache a video list while downloading it and to reserve cache space for the not-yet-downloaded portion of each video in the list; an acquisition unit configured to acquire the control instruction corresponding to each video in the video list; an insertion unit configured to insert each control instruction after its video in the list; and a first control unit configured to play the videos in the list in order and, in response to a control instruction being triggered, control the playing state of the corresponding video based on that instruction.
In some embodiments, the apparatus further comprises: and the second control unit is configured to continue to control the playing state of the subsequent video based on the control instruction if the playing of the corresponding video is finished.
In some embodiments, the control instructions include at least one of the following types: playing instructions and user control instructions. A playing instruction is generated by an instruction server and triggered when video playback ends, and includes at least one of the following: replay, play the next video, and play the previous video. A user control instruction is generated based on a user operation and triggered immediately, and includes at least one of the following: fast forward, fast backward, slow forward, and slow backward.
In some embodiments, the acquisition unit is further configured to: display the currently played video on a user interface; in response to detecting a user operation on the user interface, send the user-operation data to the instruction controller; and receive a user control instruction generated by the instruction controller based on the user-operation data.
In some embodiments, the first control unit is further configured to: and if the user operation occurs in one video, controlling the playing state of the one video based on the user control instruction.
In some embodiments, the first control unit is further configured to: if the user operation spans at least two videos, control the playing state of one of the at least two videos based on the user control instruction; and the second control unit is further configured to: if that one video has finished playing, jump to play the next of the at least two videos and continue to control the playing state of that next video based on the user control instruction.
In some embodiments, the videos in the video list are pre-formatted to match the player of the client.
In some embodiments, the buffer space of the video in the video list is determined by the size of the video in the video list.
In a third aspect, an embodiment of the present application provides a computer device, including: one or more processors; a storage device having one or more programs stored thereon; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, which, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the video merging and playing method and device provided by the embodiments of the application, a video list is first cached while being downloaded, with cache space reserved for the not-yet-downloaded portion of each video in the list; the control instruction corresponding to each video in the list is then acquired and inserted after its video; finally, the videos in the list are played in order and, in response to a control instruction being triggered, the playing state of the corresponding video is controlled based on that instruction. Through proper memory management and instruction management, video switching efficiency is improved and video stuttering is alleviated.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 is an exemplary system architecture to which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a video merge-play method according to the present application;
FIG. 3 is a flow diagram of yet another embodiment of a video merge-play method according to the present application;
FIG. 4 is a scene diagram of a video merge playing method to which the present application can be applied;
FIG. 5 is a schematic illustration of a video list;
FIG. 6 is a schematic diagram of command control between videos;
FIG. 7 is a block diagram of a computer system suitable for implementing the computer device of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the video merge-play method of the present application may be applied.
As shown in fig. 1, devices 101, 102 and network 103 may be included in system architecture 100. Network 103 is the medium used to provide communication links between devices 101, 102. Network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The devices 101 and 102 may be hardware or software that supports network connectivity to provide various network services. When a device is hardware, it can be any of various electronic devices, including but not limited to smartphones, tablets, laptop computers, desktop computers, and servers; a hardware device may be implemented as a distributed group of devices or as a single device. When a device is software, it can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (for example, to provide a distributed service) or as a single piece of software or software module. No specific limitation is made here.
In practice, a device may provide a respective network service by installing a respective client application or server application. After the device has installed the client application, it may be embodied as a client in network communications. Accordingly, after the server application is installed, it may be embodied in network communications as a server.
As an example, in fig. 1, device 101 is embodied as a client and device 102 is embodied as a server. For example, device 101 may be a client of a short video application and device 102 may be a server of a short video application.
It should be noted that the video merge playing method provided in the embodiment of the present application may be executed by the device 101.
It should be understood that the number of networks and devices in fig. 1 is merely illustrative. There may be any number of networks and devices, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a video merge-play method according to the present application is shown. The video merging and playing method comprises the following steps:
step 201, caching the video list in a manner of downloading and caching simultaneously, and reserving a cache space for the un-downloaded part of the video in the video list.
In this embodiment, when a user opens a short video application, the client of the short video application (e.g., device 101 shown in fig. 1) may send a video list request to the server of the short video application (e.g., device 102 shown in fig. 1). The server can push to the client a video list including, but not limited to, videos the user likes or videos published by the user's friends. The client can then download the video list from the server and cache it at the same time.
Typically, the videos in the client-cached video list are not complete; only part of each video has been downloaded. The client reserves cache space for the not-yet-downloaded portion of each video so that the download can continue into it later, which lets playback start as soon as possible. The cache space is used to buffer the video. To avoid wasting client cache, the size of each reserved space may be determined by the size of its video. Alternatively, when the client has ample cache, the size of the largest video in the list may be determined, and a space no smaller than that size reserved for every video.
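As an illustration only, the reserve-then-fill idea above can be sketched roughly as follows; the `VideoBuffer` class and `cache_video_list` helper are hypothetical names, not part of the patent. Each video gets a buffer pre-allocated at its own size, and downloaded chunks are written into the reserved region while the already-downloaded head can be read for playback.

```python
class VideoBuffer:
    """Pre-allocates space for one video so the un-downloaded part stays reserved."""

    def __init__(self, video_id: str, total_size: int):
        self.video_id = video_id
        self.total_size = total_size
        self.data = bytearray(total_size)  # reserved space, zero-filled
        self.downloaded = 0                # bytes received so far

    def write_chunk(self, chunk: bytes) -> None:
        """Write a downloaded chunk into the reserved region."""
        end = self.downloaded + len(chunk)
        if end > self.total_size:
            raise ValueError("chunk exceeds reserved space")
        self.data[self.downloaded:end] = chunk
        self.downloaded = end

    @property
    def complete(self) -> bool:
        return self.downloaded == self.total_size


def cache_video_list(sizes: dict) -> dict:
    """Reserve one buffer per video, each sized by that video's own size."""
    return {vid: VideoBuffer(vid, size) for vid, size in sizes.items()}
```

Sizing each buffer by its own video (rather than by the largest video) corresponds to the cache-saving variant described above.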
In some embodiments, videos are stored on the server of the short video application in various formats, including but not limited to MP4 (MPEG-4 Part 14), FLV (Flash Video), and so on. If the video list were downloaded from the server as-is, the client's player would not only have to support playing videos in multiple formats, but would also have to switch between formats, increasing the time consumed by video switching. Therefore, the server can convert the videos in the list to a unified format, e.g. via a transcoder. The target format must match the client's player, i.e. be a format the player supports. For example, if the player supports MP4, all videos can be converted to MP4; if the player supports both MP4 and FLV, they can be converted to either MP4 or FLV.
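A minimal sketch of the format-matching rule, assuming a hypothetical `choose_target_format` helper with a simple preference order; the patent only requires that the converted format be one the client's player supports, not any particular selection algorithm:

```python
def choose_target_format(player_supported, preferred_order=("mp4", "flv")):
    """Pick one container format the client's player supports; the server
    then transcodes every video in the list to this single format, so the
    player never switches formats between items."""
    for fmt in preferred_order:
        if fmt in player_supported:
            return fmt
    raise ValueError("no candidate format is supported by the client's player")
```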
Step 202, acquiring a control instruction corresponding to a video in the video list.
In this embodiment, a control instruction can be inserted for each video in the video list, and the client can acquire the control instruction corresponding to each video in the list.
Generally, a control instruction indicates the playing logic of a video, such as whether to replay it or play the next video after playback ends. The control instructions may include, but are not limited to, at least one of the following types: playing instructions, user control instructions, and so on. The instruction server can generate playing instructions according to default playing logic; a playing instruction is triggered at the end of video playback and includes, but is not limited to, at least one of: replay, play the next video, play the previous video, and so on. The instruction controller can generate user control instructions based on user operations (e.g., a seek operation); a user control instruction is triggered immediately and includes, but is not limited to, at least one of: fast forward, fast backward, slow forward, slow backward, and so on.
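The two instruction types and their trigger timing can be modeled minimally as follows; this is a hypothetical sketch, and `ControlInstruction` and its field names are illustrative rather than taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class ControlInstruction:
    kind: str    # "play" (from the instruction server) or "user" (from the instruction controller)
    action: str  # e.g. "replay", "next", "previous", "fast_forward", "slow_backward"

    @property
    def triggers_at_end(self) -> bool:
        # Playing instructions fire when the video finishes;
        # user control instructions fire immediately.
        return self.kind == "play"
```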
And step 203, inserting a corresponding control instruction behind the video in the video list.
In this embodiment, the client may insert each video's control instruction immediately after that video in the video list.
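The step above amounts to interleaving each video with its instruction. A hypothetical sketch (the function name and tuple layout are illustrative, not from the patent):

```python
def merge_playlist(videos, instructions):
    """Interleave each cached video with its control instruction, so the
    player consumes [video1, instr1, video2, instr2, ...] as one stream."""
    merged = []
    for video in videos:
        merged.append(("video", video))
        instr = instructions.get(video)
        if instr is not None:
            merged.append(("instruction", instr))
    return merged
```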
Step 204, playing the videos in the list in order, and in response to a control instruction being triggered, controlling the playing state of the corresponding video based on the control instruction.
In this embodiment, the video manager of the client may deliver the cached videos in the list to the client's player in order. The player can thus play the video list in order, displaying the videos on the client's user interface. When a control instruction is triggered, the playing state of the corresponding video is controlled based on that instruction. For a playing instruction, the playing state of the corresponding video is controlled when playback reaches the instruction, e.g. replay, play the next video, play the previous video, and so on. For a user control instruction inserted into the current video, the playing state of the corresponding video is controlled immediately, e.g. fast forward, fast backward, dragging the playing progress, and so on.
In some embodiments, a control instruction may control not only the playing state of its corresponding video but also that of subsequent videos. Thus, if the corresponding video has finished playing, the playing state of subsequent videos continues to be controlled based on the control instruction. For example, if the user sets the current video to play at double speed, subsequent videos also play at double speed once the current one finishes.
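This carry-over behavior can be sketched as follows, assuming a hypothetical `speed_2x` user instruction: because the instruction sits after its video in the merged stream, its effect naturally persists into every subsequent video until changed.

```python
def play_merged(merged):
    """Walk the merged [video, instruction, ...] stream; a speed-changing
    instruction keeps affecting all subsequent videos."""
    speed = 1.0
    log = []  # (video, playback speed) pairs, in play order
    for kind, item in merged:
        if kind == "video":
            log.append((item, speed))
        elif kind == "instruction" and item == "speed_2x":
            speed = 2.0  # persists for every later video
    return log
```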
According to the video merging and playing method provided by the embodiment of the application, a video list is first cached while being downloaded, with cache space reserved for the not-yet-downloaded portion of each video in the list; the control instruction corresponding to each video in the list is then acquired and inserted after its video; finally, the videos in the list are played in order and, in response to a control instruction being triggered, the playing state of the corresponding video is controlled based on that instruction. Through proper memory management and instruction management, video switching efficiency is improved and video stuttering is alleviated.
With further reference to fig. 3, shown is a flow 300 of yet another embodiment of a video merge-play method according to the present application. The video merging and playing method comprises the following steps:
step 301, caching the video list in a manner of downloading and caching simultaneously, and reserving a cache space for the un-downloaded part of the video in the video list.
In this embodiment, the specific operation of step 301 has been described in detail in step 201 in the embodiment shown in fig. 2, and is not described again here.
Step 302, displaying the currently played video on the user interface.
In this embodiment, when the player plays a current video in the video list, the currently played video may be displayed on the user interface.
Step 303, in response to detecting the user operation of the user on the user interface, sending data of the user operation to the instruction controller.
In this embodiment, the user may perform a user operation on the current video on the user interface. The client can collect data operated by the user and send the data to the instruction controller.
Combining control instructions with the code stream avoids restricting the user to switching videos by paging up and down. A user operation can be defined, as desired, to correspond to a user control instruction; when the operation is performed, the corresponding instruction is inserted into the video and parsed by the player. For example, a video switching instruction corresponding to a click operation may be defined: when the user clicks the user interface, the generated data is transmitted to the instruction controller, which produces a video switching instruction. This instruction is inserted directly at the end of the current video. When the player reaches it, it immediately executes the switch, loading and jumping straight to the head of the next video to start playing. No interface switching occurs at all, which saves user interface operations.
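A rough sketch of this operation-to-instruction mapping, with a hypothetical lookup table and in-playlist insertion (all names are illustrative):

```python
# Hypothetical mapping from UI operations to user control instructions.
USER_OP_TO_INSTRUCTION = {
    "click": "switch_next",  # a click switches to the next video
    "drag": "seek",          # dragging seeks within the current video
}


def on_user_operation(op, playlist, current_index):
    """Translate a UI operation into a control instruction and insert it
    right after the current video, where the player will parse it next."""
    instr = USER_OP_TO_INSTRUCTION[op]
    playlist.insert(current_index + 1, ("instruction", instr))
    return instr
```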
Step 304, receiving a user control command generated by the command controller based on the data of the user operation.
In this embodiment, the instruction controller may generate a user control instruction based on data of a user operation, and send the user control instruction to the client.
And 305, inserting a corresponding control instruction behind the video in the video list.
Step 306, the video list is played in order.
In this embodiment, the specific operations of steps 305-306 are already described in detail in steps 203-204 in the embodiment shown in fig. 2, and are not described herein again.
Step 307, determining whether the user operation occurs within a single video.
In this embodiment, during video playback the client may determine whether the detected user operation occurs within a single video. If it occurs within one video, step 308 is performed; if it spans at least two videos, step 309 is performed.
And 308, controlling the playing state of a video based on the user control instruction.
In this embodiment, if the user operation occurs within one video, the client may control the playing state of that video based on the user control instruction. For example, an ordinary fast-forward within one video only fast-forwards that video.
Step 309, controlling the playing state of one of the at least two videos based on the user control instruction.
In this embodiment, if the user operation occurs between the at least two videos, the client may control the playing state of one of the at least two videos based on the user control instruction.
And step 310, if the playing of one video is finished, skipping to play the next video of the at least two videos, and continuously controlling the playing state of the next video of the at least two videos based on the user control instruction.
In this embodiment, if one video is played completely, the client may jump to play a next video of the at least two videos, and continue to control a playing state of the next video of the at least two videos based on the user control instruction.
Typically, when a user operation spans at least two videos, an ordinary user operation can only be carried to the end of the first video. In the embodiment of the application, control instructions such as jump instructions are supplemented: when control reaches the end of the first video, the jump instruction is triggered and control continues in the next video.
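The boundary-crossing control can be sketched arithmetically, assuming positions are expressed in a common unit such as bytes or milliseconds; the `fast_forward` helper below is hypothetical:

```python
def fast_forward(position, current_end, amount, next_start):
    """Fast-forward that may cross a video boundary: any part of the
    operation that does not fit in the current video carries over into
    the head of the next video, as if a jump instruction had fired."""
    remaining = current_end - position
    if amount <= remaining:
        return position + amount               # stays within the current video
    return next_start + (amount - remaining)   # jump, then keep fast-forwarding
```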
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the flow 300 of the video merge playing method in the present embodiment highlights the user operation control step. Therefore, according to the scheme described in this embodiment, a user can perform user operation based on a requirement, and generate a corresponding user control instruction to control the playing state of the video. And for the user operation between at least two videos, control instructions such as jump are supplemented, when the control is carried out to the end of the first video, the control instructions are triggered, and the control is continued by jumping to the next video.
For ease of understanding, fig. 4 provides an application scenario diagram of the video merge playing method of the present application. As shown in fig. 4, when a user opens a short video application on a client, a video list request is sent to the video server of the short video application. The video server stores a video list consisting of video 1, video 2 and video 3, and transcodes them into a format supported by the client's player. The client downloads the video list from the video server while caching it into the video cache. The videos in the video cache are not complete; each contains only a portion of the video, and the video cache reserves space for the not-yet-downloaded portion of each video so that downloading can continue later. The control instructions corresponding to the videos in the list come from the instruction controller and are of two types: playing instructions and user control instructions. Playing instructions are generated by the instruction server and inserted into the videos in the list through the instruction controller. When the client's video manager delivers the video list in the cache to the player for playing, the currently played video is displayed on the user interface (UI). The user performs a user operation on the current video on the user interface, and the user-operation data is sent to the instruction controller, which generates a user control instruction and inserts it into the video in the list. The video manager delivers the cached videos to the player in order; the player plays the video list in order, and the videos are displayed on the user interface.
And when the control instruction is triggered, controlling the playing state of the corresponding video based on the control instruction.
Fig. 5 shows a schematic diagram of a video list in the video cache. Video 1 comprises video 1 downloaded data and a video 1 not-downloaded reservation, after which its control instruction is inserted. Similarly, video 2 comprises video 2 downloaded data and a video 2 not-downloaded reservation, after which its control instruction is inserted; and video 3 comprises video 3 downloaded data and a video 3 not-downloaded reservation, after which its control instruction is inserted.
Fig. 6 is a schematic diagram showing instruction control between videos. If a user operation spans video 1, video 2 and video 3, then when jumping to the next video, the instruction is parsed, the start address of the target video is located among the addresses of video 1, video 2 and video 3, the instruction is assembled to obtain a jump address, and the player loads the video at the jump address and continues to control its playing state.
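The jump-address step just described can be sketched as below. The function name, the address table, and the instruction values are assumptions for illustration only.

```python
def resolve_jump(video_addresses, current_id, instruction):
    """Assemble a jump address when an instruction spans several videos:
    locate the next video's start address so the player can load it and
    keep applying the instruction (hypothetical sketch)."""
    order = list(video_addresses)  # e.g. ["video1", "video2", "video3"]
    i = order.index(current_id)
    if instruction == "next" and i + 1 < len(order):
        next_id = order[i + 1]
        return next_id, video_addresses[next_id]
    return None  # no further video to jump to

# Assumed start addresses of the three cached videos.
addresses = {"video1": 0x0000, "video2": 0x4000, "video3": 0x8000}
print(resolve_jump(addresses, "video1", "next"))  # ('video2', 16384)
```

Returning both the video identifier and its start address corresponds to the description's "assembling the instruction to obtain a jump address" before the player loads the next video.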
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for implementing a computing device (e.g., device 101 shown in FIG. 1) of an embodiment of the present application. The computer device shown in fig. 7 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including components such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that the computer program read out therefrom is mounted in the storage section 708 as necessary.
In particular, the processes described above with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU) 701, performs the above-described functions defined in the method of the present application.
It should be noted that the computer readable medium of the present application can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or electronic device. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor comprising a cache unit, an acquisition unit, an insertion unit, and a first control unit. The names of these units do not limit the units themselves; for example, the caching unit may also be described as "a unit that caches a video list in a download-while-caching manner and reserves cache space for the un-downloaded portion of a video in the video list".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the computer device described in the above embodiments, or may exist separately without being assembled into the computer device. The computer readable medium carries one or more programs which, when executed by the computing device, cause the computing device to: cache a video list in a download-while-caching manner, reserving cache space for the un-downloaded portion of a video in the video list; acquire a control instruction corresponding to a video in the video list; insert the corresponding control instruction behind the video in the video list; and, in response to a control instruction being triggered, control the playing state of the corresponding video based on the control instruction.
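The four steps carried by the program can be sketched end to end as follows. All interfaces here (`merge_and_play`, the `get_instruction` and `player` callables, the dict layout) are hypothetical stand-ins, not the patent's actual implementation.

```python
def merge_and_play(video_list, get_instruction, player):
    """Sketch of the four claimed steps under assumed interfaces:
    1) cache while downloading, reserving space for each video;
    2) acquire the control instruction for each video;
    3) insert the instruction behind the video in the cached list;
    4) play in order, controlling each video via its instruction."""
    cache = []
    for video in video_list:
        entry = {
            "video": video,
            "reserved": video["size"],                 # step 1: reserved space
            "instruction": get_instruction(video["id"]),  # steps 2-3
        }
        cache.append(entry)
    played = []
    for entry in cache:  # videos are delivered to the player in order
        # step 4: the instruction controls the playing state
        played.append(player(entry["video"], entry["instruction"]))
    return played

videos = [{"id": "v1", "size": 100}, {"id": "v2", "size": 200}]
result = merge_and_play(videos,
                        get_instruction=lambda vid: "next",
                        player=lambda v, ins: (v["id"], ins))
print(result)  # [('v1', 'next'), ('v2', 'next')]
```

The point of the sketch is the ordering: instructions are attached while caching, before playback, so the player only consumes a pre-assembled list.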
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements in which any combination of the features described above or their equivalents does not depart from the spirit of the invention disclosed above. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (8)

1. A video merging and playing method comprises the following steps:
caching a video list in a downloading and caching mode, and reserving a caching space for an un-downloaded part of a video in the video list; wherein the video is a short video;
acquiring a control instruction corresponding to a video in the video list, wherein the control instruction at least comprises a playing instruction and a user control instruction; the playing instruction is generated by an instruction server according to default playing logic, is triggered when video playing finishes, and comprises at least one of the following: replaying, playing the next video, playing the previous video; the user control instruction is generated based on a user operation, is triggered immediately, and comprises at least one of the following, executed in a variable-speed playing mode: fast forward, fast backward, slow forward, slow backward;
inserting a corresponding control instruction behind the video in the video list;
playing the videos in the video list in sequence, and in response to the control instruction being triggered, controlling the playing state of the corresponding video based on the control instruction;
and if the corresponding video has finished playing, continuing to control the playing state of a subsequent video based on the control instruction.
2. The method of claim 1, wherein the obtaining of the control instruction corresponding to the video in the video list comprises:
displaying a currently played video on a user interface;
in response to detecting a user operation of a user on the user interface, sending data of the user operation to an instruction controller;
and receiving a user control instruction generated by the instruction controller based on the data of the user operation.
3. The method of claim 2, wherein the controlling, in response to triggering the control instruction, a play state of a corresponding video based on the control instruction comprises:
and if the user operation occurs in one video, controlling the playing state of the video based on the user control instruction.
4. The method of claim 3, wherein the controlling, in response to triggering the control instruction, a play state of a corresponding video based on the control instruction further comprises:
if the user operation occurs between at least two videos, controlling the playing state of one of the at least two videos based on the user control instruction; and
the continuing to control the playing state of the subsequent video based on the control instruction comprises:
and if the video playing is finished, skipping to play the next video in the at least two videos, and continuously controlling the playing state of the next video in the at least two videos based on the user control instruction.
5. The method according to one of claims 1 to 4, wherein the videos in the video list are pre-formatted to match the player of the client.
6. The method according to one of claims 1 to 4, wherein the buffer space of the videos in the video list is determined by the size of the videos in the video list.
7. A computer device, comprising:
one or more processors;
a storage device on which one or more programs are stored;
the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method recited in any of claims 1-6.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN202011253156.2A 2020-11-11 2020-11-11 Video merging and playing method and device Active CN112437352B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011253156.2A CN112437352B (en) 2020-11-11 2020-11-11 Video merging and playing method and device


Publications (2)

Publication Number Publication Date
CN112437352A CN112437352A (en) 2021-03-02
CN112437352B true CN112437352B (en) 2023-02-17

Family

ID=74701223

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011253156.2A Active CN112437352B (en) 2020-11-11 2020-11-11 Video merging and playing method and device

Country Status (1)

Country Link
CN (1) CN112437352B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113301424B (en) * 2021-05-21 2023-07-11 北京字跳网络技术有限公司 Play control method, device, storage medium and program product

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009301640A (en) * 2008-06-12 2009-12-24 Fujitsu Ten Ltd Reproduction control device, method, and program
CN109391843B (en) * 2017-08-03 2022-02-25 腾讯科技(深圳)有限公司 Online video speed doubling playing method, device, medium and intelligent terminal
CN107547940B (en) * 2017-09-13 2019-11-12 广州酷狗计算机科技有限公司 Video playing processing method, equipment and computer readable storage medium
CN108566519B (en) * 2018-04-28 2022-04-12 腾讯科技(深圳)有限公司 Video production method, device, terminal and storage medium
CN111385660B (en) * 2018-12-28 2022-07-12 广州市百果园信息技术有限公司 Video on demand method, device, equipment and storage medium
CN111510789B (en) * 2019-01-30 2021-09-21 上海哔哩哔哩科技有限公司 Video playing method, system, computer equipment and computer readable storage medium
CN110366024A (en) * 2019-07-01 2019-10-22 北京达佳互联信息技术有限公司 A kind of method and device playing video
CN110856031B (en) * 2019-11-18 2022-01-18 广州市百果园信息技术有限公司 Media resource display system, method, equipment and storage medium
CN110996134B (en) * 2019-12-23 2022-09-09 腾讯科技(深圳)有限公司 Video playing method, device and storage medium
CN111405346A (en) * 2020-03-20 2020-07-10 北京字节跳动网络技术有限公司 Video stream playing control method, device and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant