CN110290398B - Video issuing method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN110290398B
CN110290398B (application CN201910544271.6A)
Authority
CN
China
Prior art keywords
video
issued
transmitted
frame
transparency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910544271.6A
Other languages
Chinese (zh)
Other versions
CN110290398A (en)
Inventor
邓将军
邓卓尧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN201910544271.6A priority Critical patent/CN110290398B/en
Publication of CN110290398A publication Critical patent/CN110290398A/en
Application granted granted Critical
Publication of CN110290398B publication Critical patent/CN110290398B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present disclosure relates to a video issuing method and apparatus, a storage medium, and an electronic device. The method includes: receiving a video issuing instruction; when the video issuing instruction is received, determining a video resource package corresponding to a first video to be issued that is specified by the instruction, the video resource package including a second video to be issued obtained by separating the transparency and the RGB values of the first video to be issued; and sending the determined video resource package to a designated terminal according to the video issuing instruction. This technical solution solves the problem that a video animation with an alpha channel cannot be issued directly to a user terminal for playback, preserves the fidelity of the video animation, greatly reduces implementation cost compared with other solutions, and thereby improves the user experience.

Description

Video issuing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of video processing, and in particular, to a video distribution method and apparatus, a storage medium, and an electronic device.
Background
At present, when a user terminal is to play, in real time, a video sent by a server, a video animation carrying transparency cannot simply be sent down as-is, because the user terminal cannot directly decode a video animation with an alpha channel. Three workarounds are commonly used: implementing the animation in native code, using animated GIFs, or using the cocos2d engine, but each has drawbacks. Implementing complex animation effects in native code is too expensive; animated GIFs lead to excessive resource size; and the cocos2d engine brings high learning and maintenance costs. Moreover, none of these schemes reproduces the designed animation with high fidelity, so the played special effects look poor. How to faithfully reproduce the animation video delivered by the server while keeping implementation cost low has therefore become an urgent problem.
Disclosure of Invention
The purpose of the present disclosure is to provide a video issuing method and apparatus, a storage medium, and an electronic device that solve the problem that a video animation with an alpha channel cannot be issued directly to a user terminal for playback, while preserving the fidelity of the video animation, greatly reducing implementation cost compared with other solutions, and thereby improving the user experience.
In order to achieve the above object, the present disclosure provides a video delivery method, including:
receiving a video issuing instruction;
when the video issuing instruction is received, determining a video resource package corresponding to a first video to be issued specified by the video issuing instruction, wherein the video resource package includes a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued;
and sending the determined video resource package to a designated terminal according to the video issuing instruction.
The present disclosure also provides a video delivery apparatus, the apparatus comprising:
the receiving module is used for receiving a video issuing instruction;
the video determining module is used for determining, when the video issuing instruction is received, a video resource package corresponding to a first video to be issued specified by the video issuing instruction, wherein the video resource package includes a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued;
and the video issuing module is used for sending the determined video resource package to a designated terminal according to the video issuing instruction.
The present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
The present disclosure also provides an electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method described above.
Through the above technical solution, when the server needs to send a video animation to a terminal, the transparency and the RGB values of each video frame of the animation are separated in advance, so that each frame contains two parts storing, respectively, the transparency and the RGB values of the original frame. The user terminal can then directly decode and play the received video animation. This solves the problem that a video animation with an alpha channel cannot be issued directly to a user terminal for playback, preserves the fidelity of the video animation, greatly reduces implementation cost compared with other solutions, and thereby improves the user experience.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a flowchart illustrating a video delivery method according to an exemplary embodiment of the present disclosure.
Fig. 2 is a schematic diagram illustrating the 10th frame of a second video to be issued in a video delivery method according to an exemplary embodiment of the present disclosure.
Fig. 3a and 3b are schematic diagrams illustrating a horizontal-screen and a vertical-screen first video to be issued, respectively, in a video delivery method according to an exemplary embodiment of the present disclosure.
Fig. 4 is a flowchart illustrating a video delivery method according to still another exemplary embodiment of the present disclosure.
Fig. 5 is a block diagram illustrating a structure of a video distribution apparatus according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating a structure of a video distribution apparatus according to still another exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram illustrating an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
Fig. 1 is a flowchart illustrating a video delivery method according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the method includes steps 101 to 103.
In step 101, a video issuing instruction is received. The present disclosure does not limit the source of the video issuing instruction, as long as the server receives an instruction indicating that a specific video needs to be issued.
In step 102, when the video issuing instruction is received, the video resource package corresponding to the first video to be issued specified by the video issuing instruction is determined. The content of the video resource package is already configured on the server and can be configured there dynamically. The video resource package includes a second video to be issued corresponding to the first video to be issued; the second video to be issued is the first video to be issued after transparency and RGB value separation processing, and each video frame in the second video to be issued is divided into two parts: the pixels of the first part store the transparency of each pixel point in the corresponding video frame of the first video to be issued, and the pixels of the second part store the RGB value of each pixel point in the corresponding video frame of the first video to be issued. After receiving the video issuing instruction, the server determines the first video to be issued from the video that the instruction designates for delivery to the user terminal, and then determines, from the first video to be issued, the resource package that actually needs to be sent to the user terminal. The first video to be issued is the original video animation, in which every pixel of every frame contains both an RGB value and a transparency; such a video animation with an alpha channel cannot be decoded and played directly on the user terminal. The resource package determined in step 102, by contrast, contains the second video to be issued obtained after the transparency and the RGB values have been separated. The user terminal can therefore play the video directly from the received resource package, so that the terminal user sees the playback effect of the first video to be issued.
Each video frame in the second video to be issued comprises two parts, which store, respectively, the RGB values and the transparency of the corresponding frame of the original video. For example, as shown in fig. 2, the picture of the 10th frame of the second video to be issued may be divided into a first part 1 and a second part 2: the transparency of all pixel points of the 10th frame of the first video to be issued is stored in the pixel points of the first part 1, and the RGB values of all pixel points of the 10th frame of the first video to be issued are stored in the pixel points of the second part 2.
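As a purely illustrative sketch (the patent does not prescribe any implementation; the use of Python/NumPy and the top/bottom layout of the two parts are assumptions made here), the separation of one frame into a transparency part and an RGB part, and its reconstruction on the terminal, could look as follows:

```python
import numpy as np

def separate_frame(rgba_frame: np.ndarray) -> np.ndarray:
    """Split one RGBA frame (H x W x 4) of the first video to be issued into a
    single RGB frame (2H x W x 3) of the second video to be issued: the first
    (top) part stores transparency, the second (bottom) part stores RGB values.
    The top/bottom layout is an assumption; the patent only requires two parts."""
    rgb = rgba_frame[..., :3]                         # second part: original RGB values
    alpha = rgba_frame[..., 3:]                       # (H, W, 1) transparency plane
    alpha_part = np.repeat(alpha, 3, axis=2)          # store transparency as a grey image
    return np.concatenate([alpha_part, rgb], axis=0)  # alpha part stacked above RGB part

def recombine_frame(packed: np.ndarray) -> np.ndarray:
    """Terminal-side inverse: rebuild the RGBA frame from the two parts."""
    h = packed.shape[0] // 2
    return np.dstack([packed[h:], packed[:h, :, 0]])  # RGB from bottom, alpha from top
```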
The pixel points in the second part 2 correspond one-to-one with the pixel points of the 10th video frame of the first video to be issued; for example, the RGB value stored in the pixel at the first row and first column of the second part 2 is the RGB value of the pixel at the first row and first column of the corresponding video frame of the first video to be issued.
The pixel points in the first part 1 may also correspond one-to-one with the pixel points of the 10th video frame of the first video to be issued, but the positions at which transparency values are stored in the first part 1 may differ from the positions of the pixels they describe in the 10th frame of the first video to be issued. That is, instead of each pixel storing the transparency of exactly one corresponding pixel, each pixel may store several transparency values. For example, the three transparency values of the pixels at the first, second and third columns of the first row of the original video frame may all be stored in the pixel at the first row and first column of the first part 1 of the second video to be issued, occupying the three positions of that pixel that would otherwise hold its RGB values. In this way, the transparency of all pixel points of the original video frame can be stored in only some of the pixel points of the corresponding frame of the second video to be issued. Compressing the transparency of several pixel points into one pixel point still allows the user terminal to parse the video frame, while reducing the size of the second video to be issued to a certain extent, lowering the bandwidth it occupies, and effectively speeding up video delivery.
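The packing of three transparency values into the R, G and B channels of a single pixel of the first part could be sketched as below; the row-major packing order and the NumPy representation are assumptions made here for illustration, not part of the patent:

```python
import numpy as np

def pack_alpha(alpha: np.ndarray) -> np.ndarray:
    """Pack the transparency plane (H x W) so that every pixel of the packed
    first part carries three consecutive transparency values in its R, G and B
    channels, shrinking the first part to roughly one third of the width."""
    h, w = alpha.shape
    pad = (-w) % 3                                   # pad the width to a multiple of 3
    padded = np.pad(alpha, ((0, 0), (0, pad)))
    return padded.reshape(h, -1, 3)                  # (H, ceil(W/3), 3): 3 alphas per pixel

def unpack_alpha(packed: np.ndarray, width: int) -> np.ndarray:
    """Terminal-side inverse: restore one transparency value per original pixel."""
    h = packed.shape[0]
    return packed.reshape(h, -1)[:, :width]
```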
Fig. 2 is intended only to illustrate that, after a video frame of the first video to be issued has undergone transparency and RGB value separation, the corresponding frame of the second video to be issued contains two parts storing the transparency and the RGB values respectively; it is not intended to require that the pictures of the first part 1 and the second part 2 be identical, nor to limit the specific picture content of the second video to be issued.
In step 103, the determined video resource package is sent to the designated terminal according to the video issuing instruction. That is, after determining the resource package to be issued, the server sends it to the user terminal specified by the video issuing instruction.
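A minimal server-side sketch of steps 101 to 103 might read as follows; the identifiers, field names and file path used here are hypothetical and are not taken from the patent:

```python
from pathlib import Path

# Hypothetical mapping from video identifiers to pre-built resource packages;
# in practice the packages are configured dynamically on the server (step 102).
RESOURCE_PACKAGES = {"gift_fireworks": Path("/data/packages/gift_fireworks.zip")}

def handle_delivery_instruction(instruction: dict) -> bytes:
    """Steps 101 to 103 in miniature: receive the video issuing instruction,
    determine the resource package of the specified first video to be issued,
    and return the bytes to be sent to the designated terminal."""
    video_id = instruction["video_id"]           # step 101: which video is requested
    package_path = RESOURCE_PACKAGES[video_id]   # step 102: look up the resource package
    return package_path.read_bytes()             # step 103: payload for the terminal
```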
For example, in a live-streaming scenario, a user watching a live broadcast on a terminal may perform interactive operations such as sending a gift to the streamer. For some gifts, a corresponding animation video has to be displayed to realize the gift's special effect, and that animation video needs to be delivered from the server to the terminal and cached when the terminal's client starts up or at other specific moments. When the client on the user terminal determines that a corresponding animation video resource needs to be cached from the server, it triggers the video delivery process shown in steps 101 to 103.
Through the above technical solution, when the server needs to send a video animation to a terminal, the transparency and the RGB values of each video frame of the animation are separated in advance, so that each frame contains two parts storing, respectively, the transparency and the RGB values of the original frame. The user terminal can then directly decode and play the received video animation. This solves the problem that a video animation with an alpha channel cannot be issued directly to a user terminal for playback, preserves the fidelity of the video animation, greatly reduces implementation cost compared with other solutions, and thereby improves the user experience.
In a possible implementation, the video resource package further includes a configuration file corresponding to the second video to be issued, and the configuration file provides the clipping alignment to be applied when the second video to be issued is mapped to the texture coordinates of the designated terminal. When videos of the same format are played on different terminals, differences between device models mean that a video of a fixed size may be displayed at different sizes. The configuration file therefore provides clipping alignments suited to different device models when the terminal plays the second video to be issued; when the second video to be issued is mapped to texture coordinates on the designated terminal, the mapping can be adjusted automatically according to the clipping alignment given in the configuration file, so that a terminal of any model can display the video animation at the size best suited to that model.
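Purely as an illustration of what such a clipping alignment might compute (the patent does not define the configuration file format; a centered "aspect-fill" rule is assumed here), the terminal-side mapping to texture coordinates could be sketched as:

```python
def texture_crop(video_w: int, video_h: int, screen_w: int, screen_h: int):
    """Return normalized texture coordinates (u0, v0, u1, v1) for a centered
    aspect-fill crop, so the second video to be issued fills the terminal's
    screen without distortion. In practice the alignment rule would come from
    the configuration file delivered in the video resource package."""
    video_ar = video_w / video_h
    screen_ar = screen_w / screen_h
    if video_ar > screen_ar:            # video wider than screen: crop left and right
        visible = screen_ar / video_ar
        return ((1 - visible) / 2, 0.0, (1 + visible) / 2, 1.0)
    visible = video_ar / screen_ar      # video taller than screen: crop top and bottom
    return (0.0, (1 - visible) / 2, 1.0, (1 + visible) / 2)
```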
In a possible implementation, the second video to be issued in the video resource package includes a horizontal-screen video to be issued corresponding to a horizontal-screen first video to be issued and a vertical-screen video to be issued corresponding to a vertical-screen first video to be issued. In a live-streaming scenario, the horizontal-screen first video to be issued may be the gift animation played when the terminal shows the live picture in landscape, as shown in fig. 3a, and the vertical-screen first video to be issued may be the gift animation played when the terminal shows the live picture in portrait, as shown in fig. 3b. Both videos need to be sent to the user terminal so that it can display an animation of suitable size in either display orientation. Therefore, after the transparency and the RGB values of the first video to be issued are separated, the resulting second video to be issued includes both the horizontal-screen video to be issued and the vertical-screen video to be issued; that is, the video resource package determined by the server in step 102 and sent to the designated terminal contains the horizontal-screen and vertical-screen videos to be issued, each with transparency and RGB values separated.
Fig. 4 is a flowchart illustrating a video delivery method according to still another exemplary embodiment of the present disclosure. As shown in fig. 4, the method includes step 401 in addition to steps 101 to 103 shown in fig. 1.
In step 401, the second video to be issued is re-encoded to reduce its size. For example, the second video to be issued may be converted into the H.265 format, whose files are much smaller than conventional H.264 files. Re-encoding the second video to be issued into H.265 does not prevent the designated terminal that receives it from decoding and playing it; the designated terminal can still decode and play the H.265 second video to be issued. In addition, most current terminal models support hardware decoding, and enabling hardware decoding on the designated terminal allows the received H.265 second video to be issued to be decoded and played faster. In a possible implementation, when the second video to be issued includes a horizontal-screen video to be issued and a vertical-screen video to be issued, both are re-encoded to reduce their size before being sent to the designated terminal.
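One possible way to perform the re-encoding, assuming ffmpeg with the libx265 encoder is available (the patent itself does not name a tool, and the chosen CRF value is an assumption):

```python
import subprocess

def reencode_to_h265(src: str, dst: str, crf: int = 28) -> None:
    """Re-encode the second video to be issued with libx265 to shrink the file;
    the quality setting here is illustrative only."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx265", "-crf", str(crf), dst],
        check=True,
    )
```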
Fig. 5 is a block diagram illustrating the structure of a video delivery apparatus according to an exemplary embodiment of the present disclosure. As shown in fig. 5, the apparatus includes: a receiving module 10, configured to receive a video issuing instruction; a video determining module 20, configured to determine, when the video issuing instruction is received, a video resource package corresponding to a first video to be issued specified by the video issuing instruction, where the video resource package includes a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used to store the transparency of each pixel point in the corresponding video frame of the first video to be issued, and the pixels of the second part are used to store the RGB value of each pixel point in the corresponding video frame of the first video to be issued; and a video issuing module 30, configured to send the determined video resource package to a designated terminal according to the video issuing instruction.
Through the above technical solution, when the server needs to send a video animation to a terminal, the transparency and the RGB values of each video frame of the animation are separated in advance, so that each frame contains two parts storing, respectively, the transparency and the RGB values of the original frame. The user terminal can then directly decode and play the received video animation. This solves the problem that a video animation with an alpha channel cannot be issued directly to a user terminal for playback, preserves the fidelity of the video animation, greatly reduces implementation cost compared with other solutions, and thereby improves the user experience.
In a possible implementation manner, the video resource package further includes a configuration file corresponding to the second video to be delivered, where the configuration file is used to provide a corresponding clipping alignment manner when the second video to be delivered is mapped to the texture coordinate of the designated terminal.
In a possible implementation manner, the second video to be delivered in the video resource package includes a horizontal screen video to be delivered corresponding to the first video to be delivered of the horizontal screen and a vertical screen video to be delivered corresponding to the first video to be delivered of the vertical screen.
In a possible implementation, each of some of the pixel points in the first part of each video frame of the second video to be issued contains the transparency of three pixel points of the corresponding video frame of the first video to be issued.
Fig. 6 is a block diagram illustrating the structure of a video delivery apparatus according to still another exemplary embodiment of the present disclosure. As shown in fig. 6, the apparatus further includes, in addition to the modules shown in fig. 5, an encoding module 40 configured to re-encode the second video to be issued so as to reduce its size.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the functional module, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Referring now to FIG. 7, a block diagram of an electronic device 600 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, a background server, and the like. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 7 illustrates an electronic device 600 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flowchart. In such embodiments, the computer program may be downloaded and installed from a network through the communication device 609, or installed from the storage device 608, or installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising the at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects the internet protocol addresses from the at least two internet protocol addresses and returns the internet protocol addresses; receiving an internet protocol address returned by the node evaluation equipment; wherein the obtained internet protocol address indicates an edge node in the content distribution network.
Alternatively, the computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
According to one or more embodiments of the present disclosure, a video delivery method is provided, including: receiving a video issuing instruction; when the video issuing instruction is received, determining a video resource package corresponding to a first video to be issued specified by the video issuing instruction, wherein the video resource package includes a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued; and sending the determined video resource package to a designated terminal according to the video issuing instruction.
According to one or more embodiments of the present disclosure, a video delivery method is further provided, where the video resource package further includes a configuration file corresponding to the second video to be delivered, and the configuration file is used to provide a corresponding clipping alignment manner when the second video to be delivered is mapped to texture coordinates of the designated terminal.
According to one or more embodiments of the present disclosure, a video delivery method is further provided, where the second video to be delivered in the video resource package includes a horizontal screen video to be delivered corresponding to the first video to be delivered of the horizontal screen and a vertical screen video to be delivered corresponding to the first video to be delivered of the vertical screen.
According to one or more embodiments of the present disclosure, a video delivery method is further provided, wherein each of some of the pixel points in the first part of each video frame of the second video to be issued contains the transparency of three pixel points of the corresponding video frame of the first video to be issued.
According to one or more embodiments of the present disclosure, a video delivery method is further provided, where before the sending the determined video resource package to a specific terminal according to the video delivery instruction, the method further includes: and re-encoding the second video to be transmitted so as to reduce the volume of the second video to be transmitted.
According to one or more embodiments of the present disclosure, there is provided a video delivery apparatus, including: a receiving module, configured to receive a video issuing instruction; a video determining module, configured to determine, when the video issuing instruction is received, a video resource package corresponding to a first video to be issued specified by the video issuing instruction, wherein the video resource package includes a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued; and a video issuing module, configured to send the determined video resource package to a designated terminal according to the video issuing instruction.
According to one or more embodiments of the present disclosure, a video delivery apparatus is further provided, where the video resource package further includes a configuration file corresponding to the second video to be delivered, and the configuration file is configured to provide a corresponding clipping alignment manner when the second video to be delivered is mapped to texture coordinates of the designated terminal.
According to one or more embodiments of the present disclosure, a video delivery apparatus is further provided, where the second video to be delivered in the video resource package includes a horizontal screen video to be delivered corresponding to the first video to be delivered of the horizontal screen and a vertical screen video to be delivered corresponding to the first video to be delivered of the vertical screen.
According to one or more embodiments of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, which program, when executed by a processor, performs the steps of the video delivery method described above.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the video delivery method described above.
The foregoing description is only illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example technical solutions in which the above features are replaced with (but not limited to) features having similar functions disclosed in this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (9)

1. A video distribution method is characterized by comprising the following steps:
receiving a video issuing instruction;
under the condition that the video issuing instruction is received, determining a video resource package corresponding to a first video to be issued specified by the video issuing instruction, wherein the video resource package comprises a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued;
sending the determined video resource packet to a designated terminal according to the video issuing instruction;
and each of some of the pixel points in the first part of each video frame of the second video to be issued contains the transparency of three pixel points in the corresponding video frame of the first video to be issued.
2. The method according to claim 1, wherein the video resource package further includes a configuration file corresponding to the second video to be delivered, and the configuration file is used for providing a corresponding clipping alignment manner when the second video to be delivered is mapped to the texture coordinates of the designated terminal.
3. The method according to claim 1 or 2, wherein the second video to be transmitted in the video resource package comprises a horizontal screen video to be transmitted corresponding to the first video to be transmitted of a horizontal screen and a vertical screen video to be transmitted corresponding to the first video to be transmitted of a vertical screen.
4. The method according to claim 1, wherein before the sending the determined video resource package to a designated terminal according to the video delivery instruction, the method further comprises:
and re-encoding the second video to be transmitted so as to reduce the volume of the second video to be transmitted.
5. A video distribution apparatus, comprising:
the receiving module is used for receiving a video issuing instruction;
the video determining module is used for determining a video resource package corresponding to a first video to be issued specified by the video issuing instruction under the condition that the video issuing instruction is received, wherein the video resource package comprises a second video to be issued corresponding to the first video to be issued, the second video to be issued is the first video to be issued after transparency and RGB value separation processing, each video frame in the second video to be issued is divided into two parts, the pixels of the first part are used for storing the transparency of each pixel point in the corresponding video frame in the first video to be issued, and the pixels of the second part are used for storing the RGB value of each pixel point in the corresponding video frame in the first video to be issued;
the video issuing module is used for sending the determined video resource packet to a specified terminal according to the video issuing instruction;
and each of some of the pixel points in the first part of each video frame of the second video to be issued contains the transparency of three pixel points in the corresponding video frame of the first video to be issued.
6. The apparatus according to claim 5, wherein the video resource package further includes a configuration file corresponding to the second video to be delivered, and the configuration file is configured to provide a corresponding clipping alignment manner when the second video to be delivered is mapped to the texture coordinates of the designated terminal.
7. The apparatus according to claim 5 or 6, wherein the second video to be transmitted in the video resource package comprises a horizontal screen video to be transmitted corresponding to the first video to be transmitted of a horizontal screen and a vertical screen video to be transmitted corresponding to the first video to be transmitted of a vertical screen.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
9. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 4.
CN201910544271.6A 2019-06-21 2019-06-21 Video issuing method and device, storage medium and electronic equipment Active CN110290398B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910544271.6A CN110290398B (en) 2019-06-21 2019-06-21 Video issuing method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910544271.6A CN110290398B (en) 2019-06-21 2019-06-21 Video issuing method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110290398A (en) 2019-09-27
CN110290398B (en) 2021-11-05

Family

ID=68004278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910544271.6A Active CN110290398B (en) 2019-06-21 2019-06-21 Video issuing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110290398B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246274A (en) * 2020-02-04 2020-06-05 腾讯科技(深圳)有限公司 Method for determining data for displaying information and method and device for displaying information
CN112291584A (en) * 2020-10-30 2021-01-29 维沃移动通信有限公司 Dynamic effect file processing method and device and electronic equipment
CN113423016A (en) * 2021-06-18 2021-09-21 北京爱奇艺科技有限公司 Video playing method, device, terminal and server
CN114173157B (en) * 2021-12-10 2022-12-16 广州博冠信息科技有限公司 Video stream transmission method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108377359A (en) * 2018-03-14 2018-08-07 苏州科达科技股份有限公司 Video anti-error code method, device, electronic equipment, storage medium
CN108848325A (en) * 2018-06-26 2018-11-20 蒋大武 A kind of image synthesizing method for scratching picture based on natural image
CN109272565A (en) * 2017-07-18 2019-01-25 腾讯科技(深圳)有限公司 Animation playing method, device, storage medium and terminal
CN109348276A (en) * 2018-11-08 2019-02-15 北京微播视界科技有限公司 Video pictures method of adjustment, device, computer equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8189908B2 (en) * 2005-09-02 2012-05-29 Adobe Systems, Inc. System and method for compressing video data and alpha channel data using a single stream
WO2018000126A1 (en) * 2016-06-27 2018-01-04 Intel Corporation Method and system of multi-dynamic range multi-layer video blending with alpha channel sideband for video playback
US10636178B2 (en) * 2017-09-21 2020-04-28 Tiny Pixels Technologies Inc. System and method for coding and decoding of an asset having transparency
CN109191549B (en) * 2018-11-14 2023-11-10 广州酷狗计算机科技有限公司 Method and device for displaying animation
CN109495790A (en) * 2018-11-30 2019-03-19 北京字节跳动网络技术有限公司 Paster adding method, device, electronic equipment and readable medium based on editing machine
CN109727301A (en) * 2018-12-29 2019-05-07 北京字节跳动网络技术有限公司 Generate method, apparatus, electronic equipment and the storage medium of dynamic wallpaper

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272565A (en) * 2017-07-18 2019-01-25 腾讯科技(深圳)有限公司 Animation playing method, device, storage medium and terminal
CN108377359A (en) * 2018-03-14 2018-08-07 苏州科达科技股份有限公司 Video anti-error code method, device, electronic equipment, storage medium
CN108848325A (en) * 2018-06-26 2018-11-20 蒋大武 A kind of image synthesizing method for scratching picture based on natural image
CN109348276A (en) * 2018-11-08 2019-02-15 北京微播视界科技有限公司 Video pictures method of adjustment, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN110290398A (en) 2019-09-27

Similar Documents

Publication Publication Date Title
CN110290398B (en) Video issuing method and device, storage medium and electronic equipment
CN111399956B (en) Content display method and device applied to display equipment and electronic equipment
CN110809189B (en) Video playing method and device, electronic equipment and computer readable medium
US11936924B2 (en) Live room setup method and apparatus, electronic device, and storage medium
US11785195B2 (en) Method and apparatus for processing three-dimensional video, readable storage medium and electronic device
CN112272226B (en) Picture loading method and device and readable storage medium
US11893770B2 (en) Method for converting a picture into a video, device, and storage medium
CN113521728A (en) Cloud application implementation method and device, electronic equipment and storage medium
CN110806846A (en) Screen sharing method, screen sharing device, mobile terminal and storage medium
CN112770159A (en) Multi-screen interaction system, method, device, equipment and storage medium
CN115767181A (en) Live video stream rendering method, device, equipment, storage medium and product
CN115761090A (en) Special effect rendering method, device, equipment, computer readable storage medium and product
CN114095671A (en) Cloud conference live broadcast system, method, device, equipment and medium
CN112053286A (en) Image processing method, image processing device, electronic equipment and readable medium
CN111338729A (en) Method, device, medium and electronic equipment for playing view
CN114860139A (en) Video playing method, video playing device, electronic equipment, storage medium and program product
CN114445600A (en) Method, device and equipment for displaying special effect prop and storage medium
CN116248889A (en) Image encoding and decoding method and device and electronic equipment
CN114979762B (en) Video downloading and transmitting method and device, terminal equipment, server and medium
CN112887742B (en) Live stream processing method, device, equipment and storage medium
CN113766255B (en) Video stream merging method, device, electronic equipment and computer medium
CN114630157A (en) Live broadcast starting method, equipment and program product
CN110570502A (en) method, apparatus, electronic device and computer-readable storage medium for displaying image frame
CN111212296A (en) Live broadcast room gift list configuration method, device, medium and electronic equipment
CN112346682A (en) Image special effect processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant