CN113099232A - Video decoding method, video decoding device, electronic equipment and computer storage medium - Google Patents


Info

Publication number
CN113099232A
Authority
CN
China
Prior art keywords
alpha
layer
color
frame
stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110320486.7A
Other languages
Chinese (zh)
Other versions
CN113099232B (en
Inventor
眭世晨
刘华平
许崝
林勇平
徐贤锋
陈舜
Current Assignee
Hangzhou Netease Cloud Music Technology Co Ltd
Original Assignee
Hangzhou Netease Cloud Music Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Netease Cloud Music Technology Co Ltd filed Critical Hangzhou Netease Cloud Music Technology Co Ltd
Priority to CN202110320486.7A priority Critical patent/CN113099232B/en
Publication of CN113099232A publication Critical patent/CN113099232A/en
Application granted granted Critical
Publication of CN113099232B publication Critical patent/CN113099232B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/21Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding with binary alpha-plane coding for video objects, e.g. context-based arithmetic encoding [CAE]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards

Abstract

The application discloses a video decoding method, a video decoding apparatus, an electronic device and a computer storage medium. The video decoding method comprises the following steps: receiving a video stream to be decoded, and invoking a color layer decoder and an Alpha layer decoder respectively when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; acquiring the color layer encoded stream in the video stream, and decoding the color layer encoded stream by the color layer decoder to obtain color values; acquiring the Alpha layer encoded stream in the video stream, and decoding the Alpha layer encoded stream by the Alpha layer decoder to obtain Alpha values; and obtaining a display image according to the color values and the Alpha values. By adding an Alpha layer decoder to the existing color layer decoder architecture, the application realizes decoding of the Alpha layer frame data of HEVC Video With Alpha format video files and thereby supports the semi-transparent characteristic of the video.

Description

Video decoding method, video decoding device, electronic equipment and computer storage medium
Technical Field
The present invention relates to the field of video encoding and decoding technologies, and in particular to a video decoding method, a video decoding apparatus, an electronic device, and a computer storage medium.
Background
This section is intended to provide a background or context to the embodiments of the application that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
In the field of image coding and decoding technology, the Alpha channel is a parameter that controls the transparency and translucency of an image. For images in PNG and WebP formats, the transparency and translucency of each pixel can be specified by appending an Alpha value to the pixel's RGB values. At display time, the Alpha value determines the mixing ratio between the pixel color RGB values of the display image and those of the background image, so that the display image is overlaid semi-transparently on the background image.
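The mixing rule described above can be sketched as follows (a minimal illustration, not part of the patent; an 8-bit Alpha value of 255 is taken as fully opaque):

```python
def blend_pixel(fg_rgb, bg_rgb, alpha):
    """Blend a foreground (display image) pixel over a background pixel.

    `alpha` is an 8-bit Alpha channel value: 255 means the foreground
    pixel fully covers the background, 0 means fully transparent.
    """
    a = alpha / 255.0
    return tuple(round(a * f + (1.0 - a) * b)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red pixel over a blue background:
print(blend_pixel((255, 0, 0), (0, 0, 255), 128))
```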
In the field of video coding and decoding technology, applying the Alpha channel to each image frame of a video yields a video with a semi-transparent characteristic. HEVC Video With Alpha, an extension standard of the HEVC video format that supports Alpha channels, stores Alpha channel values in an additional layer (the Alpha layer) independent of the color layer; at display time, the pixel values of the Alpha layer are read as the corresponding Alpha values for display blending.
Most related decoding solutions only support decoding of the color layer, so a solution is needed for implementing general video decoding of the HEVC Video With Alpha format on the basis of the existing decoding architecture.
Disclosure of Invention
In view of the above technical problems, an improved method is needed that can implement general video decoding of the HEVC Video With Alpha format on the basis of the existing decoding architecture.
In a first aspect, an embodiment of the present application provides a video decoding method, including:
receiving a video stream to be decoded, and calling a color layer decoder and an Alpha layer decoder respectively when determining that the video stream simultaneously comprises a color layer coding stream and an Alpha layer coding stream;
acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value;
acquiring an Alpha layer coding stream in the video stream, and decoding the Alpha layer coding stream by the Alpha layer decoder to obtain an Alpha value;
and obtaining a display image according to the color value and the Alpha value.
In this way, by adding an Alpha layer decoder on the basis of the existing color layer decoder, video in the HEVC Video With Alpha format can be decoded and its semi-transparent characteristic supported. If the color layer decoder and the Alpha layer decoder are software decoders, the technical solution of the application can run on electronic devices of different types, different models and different operating systems (such as Windows, Android, Linux, macOS, iOS, etc.), and is not limited to electronic devices of a specific model or a specific operating system.
Optionally, determining that the video stream includes both color layer data and Alpha layer data comprises:
acquiring a video parameter set from the video stream, and parsing the extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
In this way, whether an Alpha layer encoded stream exists in the video stream is determined through the extension data identifier in the video parameter set.
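As a sketch of this check (the function name and the dict-based representation of a parsed VPS are illustrative assumptions, not the patent's API; only the syntax element name `vps_extension` comes from the text below):

```python
def has_alpha_layer(vps):
    """Return True when a parsed video parameter set indicates an
    additional (Alpha) extension layer.

    `vps` stands for a dict produced by some upstream VPS parser;
    a non-zero extension flag means the stream carries an extra
    layer on top of the color layer.
    """
    return vps.get("vps_extension", 0) != 0
```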
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for caching color layer reference frames obtained in the process of decoding the color layer coding stream;
and creating an Alpha layer reference frame buffer area, wherein the Alpha layer reference frame buffer area is used for caching the Alpha layer reference frames obtained in the Alpha layer coding stream decoding process.
In this way, the reference frame corresponding to the current frame can be cached by establishing the color layer reference frame buffer area and the Alpha layer reference frame buffer area, so that the decoding is realized.
Optionally, after the color layer decoder and the Alpha layer decoder are called, the method further includes:
acquiring a sequence parameter set from the video stream, and determining whether the sequence parameter set is a color layer sequence parameter set or an Alpha layer sequence parameter set according to the data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining whether the image parameter set is a color layer image parameter set or an Alpha layer image parameter set according to the data layer flag parameter in the image parameter set;
initializing the color layer decoder using the color layer sequence parameter set and the color layer image parameter set;
initializing the Alpha layer decoder using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
In this way, the color layer decoder and the Alpha layer decoder are initialized with the parameter sets corresponding to the color layer and the Alpha layer respectively, so that subsequent decoding of color layer frame data and Alpha layer frame data can proceed according to the parameters in those parameter sets.
Optionally, acquiring a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
sending the Alpha layer frame data to the Alpha layer decoder.
In this way, the color layer frame data and the Alpha layer frame data can be distinguished by the data layer flag parameter nuh_layer_id, so that each can be decoded by the corresponding decoder.
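The dispatch step can be illustrated with a small sketch (hypothetical; plain lists stand in for the two decoders' input queues):

```python
COLOR_LAYER_ID = 0  # nuh_layer_id of the color layer in this scheme

def route_frame(nuh_layer_id, frame_data, color_queue, alpha_queue):
    """Send frame data to the queue of the decoder that owns its layer."""
    if nuh_layer_id == COLOR_LAYER_ID:
        color_queue.append(frame_data)
    else:
        alpha_queue.append(frame_data)

color_q, alpha_q = [], []
route_frame(0, "color frame 0", color_q, alpha_q)
route_frame(1, "alpha frame 0", color_q, alpha_q)
```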
Optionally, the video decoding method further includes:
when the data layer flag parameter is parsed as null or an illegal value, acquiring the frame sequence number of the frame to be decoded from the frame data;
if the frame sequence number is a new frame sequence number, determining that the frame to be decoded is color layer frame data;
and if the frame sequence number is a repeated frame sequence number, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter in the frame to be decoded.
Optionally, determining, according to the frame flag parameter in the frame to be decoded, whether the frame to be decoded is color layer frame data or Alpha layer frame data includes:
when the frame to be decoded is determined to be an initial data packet according to the frame flag parameter, determining that the frame to be decoded is Alpha layer frame data;
and when the frame to be decoded is determined not to be an initial data packet according to the frame flag parameter, determining that the frame to be decoded is color layer frame data.
In this way, the color layer frame data and the Alpha layer frame data are distinguished by the frame sequence number and the frame flag parameter, so that decoding failures caused by an erroneous data layer flag parameter can be avoided.
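One reading of this fallback rule can be sketched as follows (argument names are illustrative; `None` stands for a missing or illegal data layer flag parameter):

```python
def classify_frame(nuh_layer_id, frame_no, seen_frame_nos, is_start_packet):
    """Classify frame data as 'color' or 'alpha'.

    Normal path: nuh_layer_id 0 is the color layer, any other valid
    value the Alpha layer. Fallback path (nuh_layer_id missing or
    illegal): a new frame number means color layer data; a repeated
    frame number is Alpha layer data only when the packet is an
    initial (start) data packet, otherwise color layer data.
    """
    if nuh_layer_id is not None:
        return "color" if nuh_layer_id == 0 else "alpha"
    if frame_no not in seen_frame_nos:
        return "color"
    return "alpha" if is_start_packet else "color"
```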
Optionally, the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and acquiring, by the Alpha layer decoder, the Y component data from the Alpha layer frame data, and decoding the Y component data to obtain the Alpha value.
In this way, by discarding the U component and V component data in the Alpha layer frame data, the decoding speed of the Alpha layer decoder can be increased.
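For planar data this amounts to keeping only the first plane. A sketch, assuming an I420 byte layout (Y plane followed by quarter-size U and V planes), which the text above does not mandate:

```python
def alpha_plane_from_i420(frame, width, height):
    """Return the Y plane of a planar YUV 4:2:0 (I420) frame.

    In an Alpha layer frame the Y plane holds the Alpha values, while
    the U and V planes are constant (128) and can simply be discarded.
    """
    y_size = width * height
    return frame[:y_size]  # skip the trailing U and V planes

# 4x2 frame: 8 Y bytes followed by 2 U bytes and 2 V bytes (all 128)
frame = bytes(range(8)) + bytes([128] * 4)
print(alpha_plane_from_i420(frame, 4, 2))
```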
Optionally, obtaining a display image according to the color value and the Alpha value includes:
outputting the color values and the Alpha values to a graphics processor;
and determining, by the graphics processor, a blending mode according to the color value and the Alpha value of each pixel, and blending the color values and the Alpha values according to the pixel positions.
In this way, the color values and the Alpha values are blended by the graphics processor, realizing the semi-transparent characteristic of the video.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer encoded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines the color value and the Alpha value of each pixel according to the decoding channel flag parameter.
In this way, the downstream graphics processor can distinguish the color values from the Alpha values.
Optionally, the video decoding method further includes:
calling a color layer decoder when the video stream does not comprise an Alpha layer coding stream;
acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value;
and obtaining a display image according to the color value.
In this way, video files that do not include an Alpha layer encoded stream are decoded using the color layer decoder alone, achieving decoding compatibility with video files without an Alpha layer.
In a second aspect, an embodiment of the present application provides a video decoding apparatus, including:
an initialization unit, configured to receive a video stream to be decoded, and to invoke a color layer decoder and an Alpha layer decoder respectively when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
the color layer decoding unit is used for acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value;
an Alpha layer decoding unit, configured to obtain an Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value;
and the display unit is used for obtaining a display image according to the color value and the Alpha value.
Optionally, determining that the video stream includes both color layer data and Alpha layer data comprises:
acquiring a video parameter set from the video stream, and parsing the extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for caching color layer reference frames obtained in the process of decoding the color layer coding stream;
and creating an Alpha layer reference frame buffer area, wherein the Alpha layer reference frame buffer area is used for caching the Alpha layer reference frames obtained in the Alpha layer coding stream decoding process.
Optionally, after the color layer decoder and the Alpha layer decoder are called, the method further includes:
acquiring a sequence parameter set from the video stream, and determining whether the sequence parameter set is a color layer sequence parameter set or an Alpha layer sequence parameter set according to the data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining whether the image parameter set is a color layer image parameter set or an Alpha layer image parameter set according to the data layer flag parameter in the image parameter set;
initializing the color layer decoder using the color layer sequence parameter set and the color layer image parameter set;
initializing the Alpha layer decoder using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
Optionally, acquiring a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
sending the Alpha layer frame data to the Alpha layer decoder.
Optionally, the video decoding apparatus further includes:
a data judgment unit, configured to acquire the frame sequence number of the frame to be decoded from the frame data when the data layer flag parameter is parsed as null or an illegal value;
if the frame sequence number is a new frame sequence number, determine that the frame to be decoded is color layer frame data;
and if the frame sequence number is a repeated frame sequence number, determine whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter in the frame to be decoded.
Optionally, determining, according to the frame flag parameter in the frame to be decoded, whether the frame to be decoded is color layer frame data or Alpha layer frame data includes:
when the frame to be decoded is determined to be an initial data packet according to the frame flag parameter, determining that the frame to be decoded is Alpha layer frame data;
and when the frame to be decoded is determined not to be an initial data packet according to the frame flag parameter, determining that the frame to be decoded is color layer frame data.
Optionally, the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and acquiring, by the Alpha layer decoder, the Y component data from the Alpha layer frame data, and decoding the Y component data to obtain the Alpha value.
Optionally, obtaining a display image according to the color value and the Alpha value includes:
outputting the color values and the Alpha values to a graphics processor;
and determining, by the graphics processor, a blending mode according to the color value and the Alpha value of each pixel, and blending the color values and the Alpha values according to the pixel positions.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer encoded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines the color value and the Alpha value of each pixel according to the decoding channel flag parameter.
Optionally, the initialization unit is further configured to call a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
the display unit is also used for obtaining a display image according to the color value;
when the initialization unit determines that the video stream does not include an Alpha layer encoded stream, the color layer decoding unit acquires the color layer encoded stream in the video stream, and the color layer decoder decodes the color layer encoded stream to obtain a color value.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the video decoding method.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing a computer program, where the computer program is used to implement the video decoding method.
With the video decoding method, the video decoding apparatus, the electronic device and the computer storage medium of the application, an Alpha layer decoder is added to the existing color layer decoder architecture, so that the Alpha layer frame data of HEVC Video With Alpha format video files can be decoded and the semi-transparent characteristic of the video supported.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
fig. 1 is a schematic view of an application scenario of a video decoding method according to an embodiment of the present application;
fig. 2 is a flowchart illustrating a video decoding method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of determining whether the video stream includes color layer data and Alpha layer data at the same time according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating initialization of a decoder using a parameter set according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a process of determining frame data as color layer frame data or Alpha layer frame data according to an embodiment of the present application;
fig. 6 is a schematic flowchart illustrating decoding of Alpha layer frame data according to an embodiment of the present application;
FIG. 7 is a flowchart illustrating a process for displaying decoded video using a graphics processor according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a video decoding apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a computer program product according to an embodiment of the present application.
Detailed Description
The principles and spirit of the present application will be described with reference to a number of exemplary embodiments. It should be understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the present application, and are not intended to limit the scope of the present application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As will be appreciated by one skilled in the art, embodiments of the present application may be embodied as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
In this document, it is to be understood that any number of elements in the figures are provided by way of illustration and not limitation, and any nomenclature is used for differentiation only and not in any limiting sense.
For convenience of understanding, terms referred to in the embodiments of the present application are explained below:
(I) Video coding and decoding:
Video coding is the process of converting a file in an original video format into a file in another video format using compression techniques. The most important codec standards in video streaming include H.263, H.264 and H.265 of the International Telecommunication Union, the M-JPEG format and the MPEG series standards of the ISO Moving Picture Experts Group, and the VPx and AV1 formats promoted by Google, Netflix and other companies. Video decoding is the reverse operation of video encoding.
(II) High Efficiency Video Coding (HEVC):
HEVC is an advanced video coding format published in 2013 and designed to replace the H.264 coding format. Compared with H.264, HEVC offers a higher compression ratio and better compression quality, and supports 4K resolution, 10-12 bit HDR and 3D video formats.
In the HEVC standard, a number of different parameter sets in the video stream carry the codec parameters, including the Video Parameter Set (VPS), the Sequence Parameter Set (SPS) and the Picture Parameter Set (PPS). The parameter sets relevant to the present application are described below:
(1)VPS:
All layers of a given video sequence refer to the same VPS.
The VPS contains information mainly including:
① syntax elements shared by multiple layers and operation points;
② key information of the operation points, such as profile and level;
③ other operation point characteristic information not belonging to the SPS, such as Hypothetical Reference Decoder (HRD) parameters.
(2)SPS:
The SPS contains information mainly including:
① image format information;
② coding parameter information;
③ information related to reference images;
④ parameters related to profile, tier and level;
⑤ temporal hierarchy information;
⑥ video usability information (VUI);
⑦ other information, including the number of the VPS referenced by the current SPS, the SPS identification number, and SPS extension information.
(3)PPS:
The PPS contains information mainly including:
① availability indications of coding tools;
② syntax elements related to the quantization process;
③ Tile-related syntax elements;
④ syntax elements related to deblocking filtering;
⑤ control information in the slice header;
⑥ other information that can be shared when encoding one image.
(III) HEVC Video With Alpha standard:
The HEVC Video With Alpha standard is an extension standard of the HEVC video format that supports Alpha channels. The standard stores Alpha channel values in an additional layer (the Alpha layer) independent of the color layer; at display time, the pixel values of the Alpha layer are read as the corresponding Alpha values for display blending.
For a video file with an Alpha layer, the relevant parts of the standard's coding are as follows:
(1) VPS part:
① nuh_layer_id = 0;
② vps_extension ≠ 0, indicating that the video stream contains an additional extension layer.
(2) SPS part:
① nuh_layer_id = 0 indicates that the SPS is the color layer SPS;
② nuh_layer_id = 1 indicates that the SPS is the Alpha layer SPS.
(3) PPS part:
① nuh_layer_id = 0 indicates that the PPS is the color layer PPS;
② nuh_layer_id = 1 indicates that the PPS is the Alpha layer PPS.
(4) Frame data part:
① nuh_layer_id = 0 indicates that the frame data is color layer frame data. The color layer frame data is coded in the same way as ordinary HEVC format video.
② nuh_layer_id = vps_extension indicates that the frame data is Alpha layer frame data. The Alpha layer frame data is encoded in the YUV color model format, where the Y component data carries the Alpha values, and U = 128, V = 128.
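The nuh_layer_id values above live in the two-byte HEVC NAL unit header (H.265, clause 7.3.1.2: forbidden_zero_bit, nal_unit_type, nuh_layer_id, nuh_temporal_id_plus1). A minimal parsing sketch, independent of the patent:

```python
def parse_hevc_nal_header(b0, b1):
    """Parse the 16-bit HEVC NAL unit header from its two bytes.

    Bit layout: forbidden_zero_bit (1), nal_unit_type (6),
    nuh_layer_id (6), nuh_temporal_id_plus1 (3).
    """
    nal_unit_type = (b0 >> 1) & 0x3F
    nuh_layer_id = ((b0 & 0x01) << 5) | (b1 >> 3)  # spans the byte boundary
    temporal_id_plus1 = b1 & 0x07
    return nal_unit_type, nuh_layer_id, temporal_id_plus1

# An SPS (type 33) on layer 1, lowest temporal sub-layer:
print(parse_hevc_nal_header(0x42, 0x09))  # -> (33, 1, 1)
```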
(IV) RGB color model:
A color standard that obtains a wide variety of colors by superimposing the three color channels Red, Green and Blue. This standard covers almost all colors that human vision can perceive and is one of the most widely used color systems today.
(V) YUV color model:
A color standard that obtains a wide variety of colors by superimposing the three color channels Y, U and V. Y represents luminance (Luminance or Luma); U and V represent chrominance (Chroma), describing the color and saturation of the image.
The principles and spirit of the present application are explained in detail below with reference to several representative embodiments of the present application.
Summary of The Invention
In related technical solutions for decoding video in the HEVC Video With Alpha format, the Alpha layer data is simply ignored and only the color layer data is decoded, so the semi-transparent characteristic of the video cannot be realized.
In order to solve the above problem, the present application provides a video decoding method, including: receiving a video stream to be decoded, and calling a color layer decoder and an Alpha layer decoder respectively when determining that the video stream simultaneously comprises a color layer coding stream and an Alpha layer coding stream; acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value; acquiring an Alpha layer coding stream in the video stream, and decoding the Alpha layer coding stream by the Alpha layer decoder to obtain an Alpha value; and obtaining a display image according to the color value and the Alpha value. According to the Video decoding method, the Alpha layer decoder is added on the existing color layer decoder framework, so that the decoding of the Alpha layer frame data of the HEVC Video With Alpha format Video file can be realized, and the semitransparent characteristic of the Video is supported.
Having described the basic principles of the present application, various non-limiting embodiments of the present application are described in detail below.
Application scene overview
Fig. 1 is a schematic diagram of an application scenario of a video decoding method according to an embodiment of the present application. The application scenario shown in fig. 1 includes terminal devices 101 and 102. The terminal devices 101 and 102 include, but are not limited to, desktop computers, mobile phones, mobile computers, tablet computers, media players, smart wearable devices, smart televisions, vehicle-mounted devices, Personal Digital Assistants (PDAs), and other electronic devices. The terminal devices 101 and 102 may be two different devices or the same device; they are distinguished here only to explain the correspondence between encoding and decoding.
The terminal device 101 is configured to encode an original Video file according to the HEVC Video With Alpha standard to obtain a Video file With a semitransparent characteristic, and deliver the Video file to the terminal device 102. The data transmission method between the terminal device 101 and the terminal device 102 includes, but is not limited to, network connection, reading from a removable storage medium, and the like.
The terminal device 102 is configured to receive the video file, convert the video file into a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream simultaneously includes a color layer encoded stream and an Alpha layer encoded stream; acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value; acquiring an Alpha layer coding stream in the video stream, and decoding the Alpha layer coding stream by the Alpha layer decoder to obtain an Alpha value; and obtaining a display image according to the color value and the Alpha value.
It should be noted that the above application scenarios are only presented to facilitate understanding of the spirit and principles of the present application, and the embodiments of the present application are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Exemplary method
Referring to fig. 2, an embodiment of the present application provides a video decoding method, including:
S201, receive a video stream to be decoded;
S202, judge whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if so, execute step S203;
S203, call a color layer decoder and an Alpha layer decoder, respectively;
S204, acquire the color layer encoded stream in the video stream, and decode the color layer encoded stream with the color layer decoder to obtain a color value;
S205, acquire the Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream with the Alpha layer decoder to obtain an Alpha value;
and S206, obtain a display image according to the color value and the Alpha value.
In a specific implementation process, the color layer decoder and the Alpha layer decoder may be hardware decoders or software decoders.
In this way, by adding an Alpha layer decoder on the basis of the existing color layer decoder, the present application supports the semi-transparent characteristic when decoding video in the HEVC Video With Alpha format. If the color layer decoder and the Alpha layer decoder are software decoders, the technical solution of the present application can run on electronic devices of different types, different models, and different operating systems (such as Windows, Android, Linux, Mac OS, iOS, and the like), and is not limited to electronic devices of a specific model or a specific operating system.
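The overall flow of steps S201-S208 can be sketched as follows. The stream is mocked as a dict and `ColorDecoder`/`AlphaDecoder` are hypothetical stand-ins for real HEVC decoders, not the names of any actual decoder API.

```python
# Toy decoders: pretend each packet decodes to one pixel value.
class ColorDecoder:
    def decode(self, packets):
        return [pkt["rgb"] for pkt in packets]          # (R, G, B) tuples

class AlphaDecoder:
    def decode(self, packets):
        return [pkt["a"] for pkt in packets]            # alpha in [0, 255]

def decode_video_stream(stream):
    color_packets = [p for p in stream["packets"] if p["layer"] == "color"]
    if stream["vps_extension"] != 0:                    # S202: Alpha layer present
        alpha_packets = [p for p in stream["packets"] if p["layer"] == "alpha"]
        colors = ColorDecoder().decode(color_packets)   # S203/S204
        alphas = AlphaDecoder().decode(alpha_packets)   # S205
        return list(zip(colors, alphas))                # S206: hand to blending
    colors = ColorDecoder().decode(color_packets)       # S207
    return [(rgb, 255) for rgb in colors]               # S208: fully opaque
```

The no-Alpha branch shows the compatibility path: a stream without an Alpha layer decodes with the color layer decoder alone, and every pixel is treated as opaque.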
Optionally, the video decoding method further includes:
when determining that the video stream does not include an Alpha layer encoded stream, execute step S207;
S207, call a color layer decoder;
then execute step S204;
and S208, obtain a display image according to the color value obtained in step S204.
In this way, decoding compatibility for video files without Alpha layers is achieved by decoding video files that do not include Alpha layer encoded streams using a color layer decoder.
Optionally, as shown in fig. 3, determining in step S202 whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream includes:
S301, acquire a VPS (Video Parameter Set) from the video stream;
S302, parse the extension data identifier vps_extension in the VPS;
S303, if the extension data identifier vps_extension corresponding to the Alpha layer is parsed out, determine that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
S304, if the extension data identifier vps_extension corresponding to the Alpha layer is not parsed out, determine that the video stream does not include an Alpha layer encoded stream.
According to the HEVC Video With Alpha coding standard, vps_extension ≠ 0 indicates that the video stream includes an additional layer independent of the color layer. In this application, if vps_extension ≠ 0, it is determined that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream. Conversely, vps_extension = 0 indicates that no additional layer independent of the color layer is included; if vps_extension = 0, it is determined that the video stream does not include an Alpha layer encoded stream.
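The check in steps S301-S304 reduces to inspecting vps_extension, as the following sketch shows. The dict form of a parsed VPS is an assumption; a real implementation would read the field with a bitstream parser.

```python
# Sketch of steps S301-S304: decide whether an Alpha layer is present from
# the vps_extension field of a (pre-parsed, hypothetical) VPS dict.
def stream_has_alpha_layer(vps):
    # vps_extension != 0: an additional layer independent of the color layer
    # (here, the Alpha layer) is present; 0: color layer only.
    return vps.get("vps_extension", 0) != 0
```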
Optionally, in step S203, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for caching color layer reference frames obtained in the process of decoding the color layer coding stream;
and creating an Alpha layer reference frame buffer area, wherein the Alpha layer reference frame buffer area is used for caching the Alpha layer reference frames obtained in the Alpha layer coding stream decoding process.
In the encoding process, inter-frame prediction is generally adopted for encoding, that is, a part of image frames in an original video file will refer to information of other image frames (reference frames), and changes relative to the reference frames will be encoded. Correspondingly, in the decoding process, the image information of the current frame needs to be decoded according to the information of the reference frame. In this way, the reference frame corresponding to the current frame can be cached by establishing the color layer reference frame buffer area and the Alpha layer reference frame buffer area, so that the decoding is realized.
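The two reference frame buffers can be pictured as in the toy sketch below. A real decoder manages a decoded-picture buffer with standardized eviction rules; a bounded per-layer map keyed by frame number is enough to show the idea, and all names here are illustrative.

```python
from collections import OrderedDict

class ReferenceFrameBuffer:
    """Toy per-layer reference frame cache (not a real HEVC DPB)."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.frames = OrderedDict()          # frame_num -> decoded frame

    def store(self, frame_num, frame):
        self.frames[frame_num] = frame
        if len(self.frames) > self.capacity:  # evict the oldest reference
            self.frames.popitem(last=False)

    def get(self, frame_num):
        return self.frames.get(frame_num)

color_refs = ReferenceFrameBuffer()   # for the color layer decoder
alpha_refs = ReferenceFrameBuffer()   # for the Alpha layer decoder
```

Keeping the two buffers separate matters because the color layer and the Alpha layer are predicted independently: a color frame must never be used as a reference for Alpha data, and vice versa.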
If the video stream does not include the Alpha layer encoded stream, in step S207, when the color layer decoder is called, a color layer reference frame buffer area is created, where the color layer reference frame buffer area is used to cache the color layer reference frame obtained in the process of decoding the color layer encoded stream.
Before formal decoding is carried out, the decoders need to be initialized with the parameter sets corresponding to the color layer and the Alpha layer, respectively.
Optionally, as shown in fig. 4, after the calling the color layer decoder and the Alpha layer decoder in step S203, the method further includes:
S401, acquire an SPS (Sequence Parameter Set) from the video stream;
S402, determine whether the SPS is a color layer SPS or an Alpha layer SPS according to the data layer flag parameter nuh_layer_id in the SPS;
S403, acquire a PPS (Picture Parameter Set) from the video stream;
S404, determine whether the PPS is a color layer PPS or an Alpha layer PPS according to the data layer flag parameter nuh_layer_id in the PPS;
S405, initialize the color layer decoder using the color layer SPS and the color layer PPS;
S406, initialize the Alpha layer decoder using the Alpha layer SPS and the Alpha layer PPS.
In a specific implementation process, if the video stream includes both the color layer encoded stream and the Alpha layer encoded stream, the video stream includes at least two sets of SPS and two sets of PPS. If nuh_layer_id = 0, the corresponding SPS or PPS is a color layer parameter set; if nuh_layer_id = 1, the corresponding SPS or PPS is an Alpha layer parameter set.
In this way, the color layer decoder and the Alpha layer decoder are initialized through the respective corresponding parameter sets of the color layer and the Alpha layer, so that the subsequent decoding of color layer frame data and Alpha layer frame data is facilitated according to the parameters in the parameter sets.
If the video stream does not include an Alpha layer encoded stream, after the color layer decoder is called, the method further includes:
and acquiring SPS and PPS from the video stream, and initializing the color layer decoder by directly utilizing the SPS and the PPS.
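The routing described in steps S401-S406 can be sketched as below. Parameter sets are mocked as dicts from a hypothetical parser; per the rule above, nuh_layer_id 0 selects the color layer and 1 the Alpha layer.

```python
# Hedged sketch of steps S401-S406: route each parsed SPS/PPS to the layer
# it configures. The dict representation is an illustration, not a real API.
def route_parameter_sets(param_sets):
    layers = {"color": {}, "alpha": {}}
    for ps in param_sets:
        layer = "color" if ps["nuh_layer_id"] == 0 else "alpha"
        layers[layer][ps["type"]] = ps       # ps["type"] is "sps" or "pps"
    return layers
```

Each decoder is then initialized from its own pair, e.g. the color decoder from `layers["color"]["sps"]` and `layers["color"]["pps"]`.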
After the decoders have been initialized with the color layer and Alpha layer parameter sets, the frame data in the video stream needs to be distinguished so that it can be decoded by the corresponding decoder.
Optionally, as shown in fig. 5, the acquiring, in step S204, a color layer encoded stream in the video stream, and the acquiring, in step S205, an Alpha layer encoded stream in the video stream include:
S501, acquire frame data in the video stream;
S502, judge whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter nuh_layer_id of the frame data; when the frame data is determined to be color layer frame data, execute step S503, and when the frame data is determined to be Alpha layer frame data, execute step S504;
S503, send the color layer frame data to the color layer decoder;
S504, send the Alpha layer frame data to the Alpha layer decoder.
According to the HEVC Video With Alpha coding standard, when the nuh_layer_id of frame data is 0, the frame data is color layer frame data; when the nuh_layer_id of frame data equals vps_extension, the frame data is Alpha layer frame data.
In this way, the color layer frame data and the Alpha layer frame data can be distinguished by the data layer flag parameter nuh_layer_id, which facilitates decoding with the corresponding decoder.
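Steps S501-S504 amount to the following classification, written here as a small sketch; passing nuh_layer_id and vps_extension as plain integers is an assumption for illustration.

```python
# Sketch of steps S501-S504: route each frame-data unit by nuh_layer_id,
# following the rule above (0 -> color layer, vps_extension -> Alpha layer).
def classify_frame_data(nuh_layer_id, vps_extension):
    if nuh_layer_id == 0:
        return "color"         # send to the color layer decoder (S503)
    if nuh_layer_id == vps_extension:
        return "alpha"         # send to the Alpha layer decoder (S504)
    return "unknown"           # null/illegal id: a fallback check is needed
```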
Optionally, as shown in fig. 5, the video decoding method further includes:
in step S502, when the data layer flag parameter nuh_layer_id is parsed as null or an illegal value, execute step S505;
In a specific implementation process, when the data layer flag parameter nuh_layer_id is an illegal value, nuh_layer_id ≠ 0 and nuh_layer_id ≠ vps_extension, so it cannot be determined from nuh_layer_id whether the frame data is color layer frame data or Alpha layer frame data.
S505, acquire the frame number frame_num of the frame to be decoded from the frame data;
S506, judge whether the frame number frame_num is a new frame number; if the frame number frame_num is a new frame number, determine that the frame to be decoded is color layer frame data, and execute step S503; if the frame number frame_num is a repeated frame number, execute step S507;
S507, judge whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter first_slice_in_frame in the frame to be decoded, and correspondingly execute step S503 or step S504.
Specifically, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter first_slice_in_frame in the frame to be decoded includes:
when the frame to be decoded is determined to be an initial data packet according to the frame flag parameter first_slice_in_frame, determining that the frame to be decoded is Alpha layer frame data;
and when the frame to be decoded is determined not to be an initial data packet according to the frame flag parameter first_slice_in_frame, determining that the frame to be decoded is color layer frame data.
In a specific implementation process, when the frame flag parameter first_slice_in_frame in the frame to be decoded is 1, the frame to be decoded is determined to be an initial data packet; when first_slice_in_frame is 0, the frame to be decoded is determined not to be an initial data packet.
In this way, by distinguishing the color layer frame data and the Alpha layer frame data using the frame number frame_num, or using the frame number frame_num combined with the frame flag parameter first_slice_in_frame, decoding failure due to an error in the data layer flag parameter nuh_layer_id can be avoided.
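The fallback in steps S505-S507 can be sketched as follows, under the assumption (drawn from the description above) that the color layer and Alpha layer packets of one display frame share the same frame_num and that a new frame_num is first seen on color data.

```python
# Sketch of steps S505-S507: classify frame data when nuh_layer_id is
# missing or illegal, using frame_num and first_slice_in_frame instead.
def classify_without_layer_id(frame_num, first_slice_in_frame, seen_frame_nums):
    if frame_num not in seen_frame_nums:     # S506: a new frame number
        seen_frame_nums.add(frame_num)
        return "color"
    # S507: repeated frame number, decide by first_slice_in_frame
    return "alpha" if first_slice_in_frame == 1 else "color"
```

The caller owns `seen_frame_nums` (a set) so the classifier can track which frame numbers have already appeared in the stream.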
According to the coding standard of HEVC Video With Alpha, the Alpha layer frame data is in a YUV color model format.
In the process of decoding Alpha layer frame data by using the Alpha layer decoder, optionally, as shown in fig. 6, in step S205, the decoding the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value includes:
S601, the Alpha layer decoder acquires the Y color direction data from the Alpha layer frame data;
S602, the Alpha layer decoder decodes the Y color direction data to obtain the Alpha value.
By discarding the data in the U color direction and the V color direction in the Alpha layer frame data, the decoding speed of the Alpha layer decoder can be increased by about 30%.
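Under the assumption that an Alpha layer frame is delivered as planar YUV 4:2:0 bytes, steps S601-S602 reduce to keeping only the luma plane, as this sketch shows; the layout assumption and function name are illustrative.

```python
# Sketch of steps S601-S602: the Alpha value is carried in the Y plane of a
# planar YUV 4:2:0 buffer, so the U and V planes (each a quarter of the
# luma size) are simply skipped rather than decoded.
def extract_alpha_plane(yuv420_bytes, width, height):
    y_size = width * height                  # Y plane: one byte per pixel
    return yuv420_bytes[:y_size]             # discard the chroma planes
```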
Optionally, as shown in fig. 7, in step S206, obtaining a display image according to the color value and the Alpha value includes:
S703, output the color value and the Alpha value to a Graphics Processing Unit (GPU);
S704, the GPU determines a mixing mode according to the color value and the Alpha value of each pixel, and mixes the color values and the Alpha values according to the positions of the pixels.
As an optional implementation manner, the outputting of the color value and the Alpha value to a GPU in step S703 specifically includes:
performing texture sampling on the color value and the Alpha value respectively to obtain two textures, and outputting the two textures to the GPU.
As another optional implementation manner, the outputting of the color value and the Alpha value to a GPU in step S703 specifically includes:
performing texture sampling on the color value and the Alpha value together to obtain one texture, and outputting the texture to the GPU.
In a specific implementation process, if the color value and the Alpha value are sampled into two separate textures in step S703 and then output to the GPU, a shader is first used in the mixing operation of step S704 to merge the two textures into a four-dimensional matrix containing the color value and the Alpha value, and the mixing is then performed. Many schemes can be used to mix the color value and the Alpha value in step S704, and the present application is not limited in this respect. One specific mixing scheme is as follows:
C_R = α × A_R + (1 − α) × B_R
C_G = α × A_G + (1 − α) × B_G
C_B = α × A_B + (1 − α) × B_B
(α is the Alpha value of the corresponding pixel of A, normalized to the range [0, 1].)
where A denotes a pixel of an image frame of the video decoded by the video decoding method of the present application, B denotes the corresponding pixel of an image frame of the background video, C denotes the resulting pixel of the displayed image frame obtained by the final mixing, and the subscripts R, G, and B denote the red, green, and blue tristimulus values of the corresponding pixel, respectively.
Thus, the color values and the Alpha values are mixed through the GPU, and the semitransparent characteristic of the video is achieved.
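The per-pixel mixing scheme described above can be sketched in a few lines. The 8-bit Alpha convention (0 = transparent, 255 = opaque) and the function name are assumptions for the example; in practice this mix runs in a GPU shader rather than on the CPU.

```python
# Per-pixel sketch of the mixing scheme: normalize the 8-bit Alpha value and
# mix the decoded video pixel A over the background pixel B, channel by channel.
def blend_pixel(a_rgb, alpha, b_rgb):
    t = alpha / 255.0
    return tuple(round(t * a + (1.0 - t) * b) for a, b in zip(a_rgb, b_rgb))
```

For example, a fully opaque red pixel completely covers a blue background, while a fully transparent one leaves the background unchanged.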
Correspondingly, if the video stream does not include the Alpha layer encoded stream, in step S208, obtaining the display image according to the color value specifically includes: and outputting the color values to a GPU, and displaying by the GPU according to the color values of each pixel.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
S701, after the color layer decoder decodes the color layer encoded stream to obtain a color value, generate a decoding channel flag parameter output_flag corresponding to the color layer;
and/or, S702, after the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, generate a decoding channel flag parameter output_flag corresponding to the Alpha layer;
so that in step S704, the GPU can determine the color value and the Alpha value of each pixel according to the decoding channel flag parameter output_flag.
In this way, subsequent graphics processors are facilitated to distinguish the color values from the Alpha values.
Exemplary device
Having described the method of the exemplary embodiments of the present application, a video decoding apparatus of the exemplary embodiments of the present application is described next.
Fig. 8 is a schematic structural diagram of a video decoding apparatus 800 according to an embodiment of the present application. In one embodiment, the video decoding device 800 comprises: an initialization unit 801, a color layer decoding unit 802, an Alpha layer decoding unit 803, and a display unit 804.
An initialization unit 801, configured to receive a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream includes a color layer encoded stream and an Alpha layer encoded stream at the same time;
a color layer decoding unit 802, configured to obtain a color layer encoded stream in the video stream, and decode the color layer encoded stream by the color layer decoder to obtain a color value;
an Alpha layer decoding unit 803, configured to obtain an Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value;
and the display unit 804 is configured to obtain a display image according to the color value and the Alpha value.
Optionally, determining that the video stream includes both color layer data and Alpha layer data comprises:
acquiring a video parameter set from the video stream, and analyzing an expansion data identifier in the video parameter set;
and if the extended data identifier corresponding to the Alpha layer is analyzed, determining that the video stream simultaneously comprises a color layer coding stream and an Alpha layer coding stream.
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for caching color layer reference frames obtained in the process of decoding the color layer coding stream;
and creating an Alpha layer reference frame buffer area, wherein the Alpha layer reference frame buffer area is used for caching the Alpha layer reference frames obtained in the Alpha layer coding stream decoding process.
Optionally, after the color layer decoder and the Alpha layer decoder are called, the method further includes:
acquiring a sequence parameter set from the video stream, and determining the sequence parameter set as a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer mark parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining the image parameter set as a color layer image parameter set or an Alpha layer image parameter set according to a data layer mark parameter in the image parameter set;
initializing the color layer decoder using the color layer sequence parameter set and the color layer image parameter set;
initializing the Alpha layer decoder using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
Optionally, acquiring a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining the frame data as color layer frame data or Alpha layer frame data according to a data layer mark parameter of the frame data;
sending the color layer frame data to the color layer decoder;
sending the Alpha layer frame data to the Alpha layer decoder.
Optionally, the video decoding apparatus further includes:
a data determining unit (not shown in fig. 8) configured to obtain a frame number of the frame to be decoded of the frame data when the data layer flag parameter is null or an illegal value;
if the frame serial number is the new frame serial number, determining that the frame to be decoded is color layer frame data;
and if the frame serial number is a repeated frame serial number, determining that the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame marking parameters in the frame to be decoded.
Optionally, determining, according to the frame marker parameter in the frame to be decoded, that the frame to be decoded is color layer frame data or Alpha layer frame data, including:
when the frame to be decoded is determined to be an initial data packet according to the frame marking parameters, determining the frame to be decoded to be Alpha layer frame data;
and when the frame to be decoded is determined not to be the initial data packet according to the frame marking parameters, determining that the frame to be decoded is the color layer frame data.
Optionally, the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires Y color direction data from the Alpha layer frame data and decodes the Y color direction data to obtain the Alpha value.
Optionally, obtaining a display image according to the color value and the Alpha value includes:
outputting the color values and the Alpha values to a graphics processor;
and determining a mixing mode of the color value and the Alpha value according to the color value and the Alpha value of each pixel by the graphics processor, and mixing the color value and the Alpha value according to the position of the pixel.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer coded stream to obtain a color value, generating a decoding channel mark parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer coding stream to obtain an Alpha value, generating a decoding channel mark parameter corresponding to the Alpha layer;
wherein the graphics processor determines a color value and an Alpha value for each pixel according to the decoding channel flag parameter.
Optionally, the initialization unit 801 is further configured to invoke a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
the display unit 804 is further configured to obtain a display image according to the color value;
when the initialization unit 801 determines that the video stream does not include an Alpha layer encoded stream, the color layer decoding unit 802 obtains a color layer encoded stream in the video stream, and the color layer decoder decodes the color layer encoded stream to obtain a color value.
Based on the same inventive concept as the video decoding method, an embodiment of the present application further provides an electronic device, which may specifically be an electronic device such as a desktop computer, a mobile phone, a mobile computer, a tablet computer, a media player, an intelligent wearable device, an intelligent television, a vehicle-mounted device, and a Personal Digital Assistant (PDA).
The electronic device 900 is described below with reference to fig. 9. The electronic device shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 900 may be embodied in the form of a general-purpose computing device, for example, it may be a terminal device and the components of the electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one memory unit 920, and a bus 930 that couples various system components including the memory unit 920 and the processing unit 910.
The Processing Unit 910 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
Bus 930 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The storage unit 920 may include a readable medium in the form of a volatile Memory, such as a Random Access Memory (RAM) 921 and/or a cache Memory 922, and may further include a Read-Only Memory (ROM) 923.
Storage unit 920 may also include programs/utilities 925 having a set (at least one) of program modules 924, such program modules 924 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The program modules 924, when executed by the processing unit 910, cause the processing unit 910 to perform various steps in a video decoding method according to various exemplary embodiments of the present application described in the "exemplary methods" section above in this specification.
For example, the processing unit 910 may perform the video decoding method shown in fig. 2, including: S201, receive a video stream to be decoded; S202, judge whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if so, S203, call a color layer decoder and an Alpha layer decoder, respectively; S204, acquire the color layer encoded stream in the video stream, and decode the color layer encoded stream with the color layer decoder to obtain a color value; S205, acquire the Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream with the Alpha layer decoder to obtain an Alpha value; and S206, obtain a display image according to the color value and the Alpha value.
The electronic device 900 may also communicate with one or more external devices 940 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public Network such as the internet) via the Network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Exemplary program product
The embodiment of the present application provides a computer-readable storage medium for storing computer program instructions for the electronic device, which contains a program for executing the video decoding method in any exemplary embodiment of the present application.
The computer storage media may be any available media or data storage device that can be accessed by a computer, including but not limited to magnetic memory (e.g., floppy disks, hard disks, magnetic tape, magneto-optical disks (MOs), etc.), optical memory (e.g., CDs, DVDs, BDs, HVDs, etc.), and semiconductor memory (e.g., ROMs, EPROMs, EEPROMs, non-volatile memory (NAND FLASH), Solid State Disks (SSDs)), etc.
In some possible embodiments, as shown in fig. 10, various aspects of the present application may also be implemented as a computer program product 1000 comprising program code for causing an electronic device to perform the steps in the video decoding method according to the various exemplary embodiments of the present application described in the above-mentioned "exemplary methods" section of this specification, when the computer program product runs on the electronic device. For example, the electronic device may perform the video decoding method shown in fig. 2, including: S201, receive a video stream to be decoded; S202, judge whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if so, S203, call a color layer decoder and an Alpha layer decoder, respectively; S204, acquire the color layer encoded stream in the video stream, and decode the color layer encoded stream with the color layer decoder to obtain a color value; S205, acquire the Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream with the Alpha layer decoder to obtain an Alpha value; and S206, obtain a display image according to the color value and the Alpha value.
The computer program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer program product for video decoding according to an embodiment of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a server device. However, the program product of the present application is not limited thereto, and in this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, over the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, according to embodiments of the application, the features and functions of two or more units described above may be embodied in one unit. Conversely, the features and functions of one unit described above may be further divided into and embodied by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
While the spirit and principles of the application have been described with reference to several particular embodiments, it is to be understood that the application is not limited to the disclosed embodiments, nor does the division into aspects imply that features in these aspects cannot be combined to advantage; that division is for convenience of description only. The application is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (10)

1. A video decoding method, comprising:
receiving a video stream to be decoded, and invoking a color layer decoder and an Alpha layer decoder respectively upon determining that the video stream comprises both a color layer coding stream and an Alpha layer coding stream;
acquiring a color layer coding stream in the video stream, and decoding the color layer coding stream by the color layer decoder to obtain a color value;
acquiring an Alpha layer coding stream in the video stream, and decoding the Alpha layer coding stream by the Alpha layer decoder to obtain an Alpha value;
and obtaining a display image according to the color value and the Alpha value.
2. The method of claim 1, wherein determining that the video stream comprises both a color layer coding stream and an Alpha layer coding stream comprises:
acquiring a video parameter set from the video stream, and parsing an extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream comprises both a color layer coding stream and an Alpha layer coding stream.
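The check in claim 2 can be sketched as a single predicate over a parsed video parameter set. The field name `"extensions"` and the identifier value are assumptions made for illustration; the actual VPS syntax is codec-specific.

```python
# Hypothetical sketch of claim 2: inspect a parsed video parameter set (VPS)
# for an Alpha-layer extension data identifier. Field names and values are
# invented for illustration, not real bitstream syntax.
ALPHA_LAYER_EXTENSION = 0x01  # assumed identifier marking an Alpha layer

def stream_has_alpha_layer(video_parameter_set: dict) -> bool:
    # If the VPS carries an extension identifier for the Alpha layer, the
    # stream contains both a color layer and an Alpha layer coding stream.
    return ALPHA_LAYER_EXTENSION in video_parameter_set.get("extensions", [])

dual_layer = stream_has_alpha_layer({"extensions": [0x01]})
color_only = stream_has_alpha_layer({"extensions": []})
```

The decoder would invoke both layer decoders only when this predicate is true, and fall back to ordinary single-layer decoding otherwise.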
3. The method of claim 1, wherein, when invoking the color layer decoder and the Alpha layer decoder, the method further comprises:
creating a color layer reference frame buffer, wherein the color layer reference frame buffer is used for caching color layer reference frames obtained in the process of decoding the color layer coding stream;
and creating an Alpha layer reference frame buffer, wherein the Alpha layer reference frame buffer is used for caching Alpha layer reference frames obtained in the process of decoding the Alpha layer coding stream.
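The point of claim 3 is that each layer keeps its own reference frames, so inter prediction in one layer never consults frames from the other. A minimal sketch, with an assumed capacity and string placeholders for frames:

```python
# Sketch of claim 3: one reference-frame buffer per layer. The capacity of 4
# is an arbitrary assumption; real decoders size this from the parameter sets.
from collections import deque

class ReferenceFrameBuffer:
    def __init__(self, capacity=4):
        self.frames = deque(maxlen=capacity)   # oldest reference evicted first

    def push(self, frame):
        self.frames.append(frame)

    def latest(self):
        return self.frames[-1]

color_refs = ReferenceFrameBuffer()   # consulted only by the color decoder
alpha_refs = ReferenceFrameBuffer()   # consulted only by the Alpha decoder
color_refs.push("color-frame-0")
alpha_refs.push("alpha-frame-0")
```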
4. The method of claim 1, wherein, after invoking the color layer decoder and the Alpha layer decoder, the method further comprises:
acquiring a sequence parameter set from the video stream, and determining the sequence parameter set to be a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining the image parameter set to be a color layer image parameter set or an Alpha layer image parameter set according to a data layer flag parameter in the image parameter set;
initializing the color layer decoder using the color layer sequence parameter set and the color layer image parameter set;
initializing the Alpha layer decoder using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
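Claim 4 amounts to routing each parameter set by its layer flag and then initializing each decoder with its own group. The flag values and the dictionary-based "decoder" below are assumptions for illustration only:

```python
# Sketch of claim 4: route sequence/image parameter sets to the matching
# layer by a data layer flag, then initialize each decoder with its group.
COLOR_LAYER, ALPHA_LAYER = 0, 1   # assumed flag values

def route_parameter_sets(param_sets):
    """Split parameter sets into per-layer groups by their layer flag."""
    routed = {COLOR_LAYER: [], ALPHA_LAYER: []}
    for ps in param_sets:
        routed[ps["layer"]].append(ps)
    return routed

def init_decoder(layer, routed):
    # Stand-in for decoder initialization with that layer's SPS and PPS.
    return {"layer": layer, "param_sets": routed[layer]}

param_sets = [
    {"kind": "sps", "layer": COLOR_LAYER},
    {"kind": "pps", "layer": COLOR_LAYER},
    {"kind": "sps", "layer": ALPHA_LAYER},
    {"kind": "pps", "layer": ALPHA_LAYER},
]
routed = route_parameter_sets(param_sets)
color_decoder = init_decoder(COLOR_LAYER, routed)
alpha_decoder = init_decoder(ALPHA_LAYER, routed)
```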
5. The method of claim 1, wherein obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream comprises:
acquiring frame data in the video stream, and determining the frame data to be color layer frame data or Alpha layer frame data according to a data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
6. The method of claim 5, further comprising:
when the data layer flag parameter is parsed to be null or an illegal value, acquiring the frame sequence number of the frame to be decoded of the frame data;
if the frame sequence number is a new frame sequence number, determining that the frame to be decoded is color layer frame data;
and if the frame sequence number is a repeated frame sequence number, determining the frame to be decoded to be color layer frame data or Alpha layer frame data according to a frame flag parameter in the frame to be decoded.
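The dispatch of claims 5 and 6 can be sketched together: a valid layer flag routes the frame directly; a missing or illegal flag triggers the fallback, where a first-seen sequence number implies a color frame and a repeated number implies its companion Alpha frame unless a per-frame flag says otherwise. All field names (`layer`, `seq`, `frame_flag`) are invented for illustration, and defaulting the repeated-number case to `"alpha"` is an assumption.

```python
# Sketch of claims 5-6: route frames by layer flag, with a sequence-number
# fallback when the flag is null or illegal. Field names are illustrative.
def classify_frames(frames):
    seen, labels = set(), []
    for f in frames:
        layer = f.get("layer")
        if layer in ("color", "alpha"):              # normal path (claim 5)
            labels.append(layer)
        else:                                        # fallback path (claim 6)
            if f["seq"] not in seen:
                labels.append("color")               # new sequence number
            else:
                # repeated number: decided by the per-frame flag parameter;
                # "alpha" default is an assumption of this sketch
                labels.append(f.get("frame_flag", "alpha"))
        seen.add(f["seq"])
    return labels

labels = classify_frames([
    {"layer": "color", "seq": 0},
    {"layer": None, "seq": 0},    # flag lost: repeated seq -> alpha
    {"layer": None, "seq": 1},    # flag lost: new seq -> color
])
```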
7. The method of claim 5, wherein the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decoding the Alpha layer coding stream to obtain an Alpha value comprises:
the Alpha layer decoder acquiring Y component data from the Alpha layer frame data, and decoding the Y component data to obtain the Alpha value.
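Per claim 7, the Alpha layer is carried in YUV form but only the Y (luma) plane is meaningful: the decoder reads Y as the per-pixel Alpha value and then blends it with the color layer (step S206). A minimal sketch with flat lists standing in for pixel planes and a standard source-over blend, which is this sketch's assumption for how the display image is formed:

```python
# Sketch of claim 7: take the Y plane of the Alpha layer as the Alpha values,
# then composite against a background. Lists stand in for pixel planes.
def alpha_from_yuv(y_plane, u_plane, v_plane):
    # U/V carry no information here: the Alpha value lives in the Y component.
    return y_plane

def composite(color_pixels, alpha_plane, background):
    out = []
    for c, a, b in zip(color_pixels, alpha_plane, background):
        alpha = a / 255.0                              # normalize 8-bit Alpha
        out.append(round(c * alpha + b * (1 - alpha))) # source-over blend
    return out

alpha = alpha_from_yuv([255, 128, 0], [128, 128, 128], [128, 128, 128])
image = composite([200, 200, 200], alpha, [0, 0, 0])
```

Carrying Alpha in the Y plane lets an ordinary single-plane-capable decoder handle transparency without any new entropy-coding tools.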
8. A video decoding apparatus, comprising:
an initialization unit, configured to receive a video stream to be decoded, and to invoke a color layer decoder and an Alpha layer decoder respectively upon determining that the video stream comprises both a color layer coding stream and an Alpha layer coding stream;
a color layer decoding unit, configured to acquire a color layer coding stream in the video stream, and decode the color layer coding stream by the color layer decoder to obtain a color value;
an Alpha layer decoding unit, configured to acquire an Alpha layer coding stream in the video stream, and decode the Alpha layer coding stream by the Alpha layer decoder to obtain an Alpha value;
and a display unit, configured to obtain a display image according to the color value and the Alpha value.
9. An electronic device, comprising: a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the video decoding method of any of claims 1-7.
10. A computer storage medium, characterized in that the computer storage medium stores a computer program for implementing the video decoding method according to any one of claims 1 to 7.
CN202110320486.7A 2021-03-25 2021-03-25 Video decoding method, video decoding device, electronic equipment and computer storage medium Active CN113099232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110320486.7A CN113099232B (en) 2021-03-25 2021-03-25 Video decoding method, video decoding device, electronic equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN113099232A true CN113099232A (en) 2021-07-09
CN113099232B CN113099232B (en) 2024-01-19

Family

ID=76669913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110320486.7A Active CN113099232B (en) 2021-03-25 2021-03-25 Video decoding the method is a device(s) electronic device and computing machine storage medium

Country Status (1)

Country Link
CN (1) CN113099232B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015101173A1 (en) * 2013-12-30 2015-07-09 Mediatek Singapore Pte. Ltd. Method and apparatus for scaling parameter coding for inter-component residual prediction
US20160165262A1 (en) * 2013-07-18 2016-06-09 Lg Electronics Inc. Method and apparatus for processing video signal
CN106105220A (en) * 2014-01-07 2016-11-09 诺基亚技术有限公司 Method and apparatus for video coding and decoding
CN107071514A (en) * 2017-04-08 2017-08-18 腾讯科技(深圳)有限公司 A kind of photograph document handling method and intelligent terminal
WO2018184483A1 (en) * 2017-04-08 2018-10-11 腾讯科技(深圳)有限公司 Picture file processing method and system, and storage medium
CN110113615A (en) * 2018-02-01 2019-08-09 腾讯科技(深圳)有限公司 Image encoding method, device, calculates equipment and storage medium at coding/decoding method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
APPLE INC.: "HEVC Video with Alpha Interoperability Profile", pages 3 - 6, Retrieved from the Internet <URL:https://developer.apple.com/av-foundation/HEVC-Video-with-Alpha-Interoperability-Profile.pdf> *

Also Published As

Publication number Publication date
CN113099232B (en) 2024-01-19

Similar Documents

Publication Publication Date Title
US11012489B2 (en) Picture file processing method, picture file processing device, and storage medium
US10587897B2 (en) Specifying visual dynamic range coding operations and parameters
US10249019B2 (en) Method and apparatus for mapping omnidirectional image to a layout output format
US20190273929A1 (en) De-Blocking Filtering Method and Terminal
JP6472429B2 (en) Method, apparatus and system for determining LUMA values
JP2022137130A (en) Method and device for encoding/decoding image unit comprising image data represented by luminance channel and at least one chrominance channel
RU2693185C2 (en) Encoding method and colour conversion decoding method and corresponding devices
KR102617258B1 (en) Image processing method and apparatus
CN101340587A (en) Method for encoding input image, method and apparatus for displaying an encoded image
WO2020119449A1 (en) Chroma block prediction method and device
BR112021007865A2 (en) image prediction method and apparatus
BR112021004124A2 (en) video decoding method and video decoder
WO2016197334A1 (en) Image encoding/decoding method and apparatus
BR112021013444A2 (en) IMAGE, DEVICE, AND STORAGE MEDIA METHOD, DEVICE AND PREDICTION SYSTEM
BR112021009911A2 (en) encoder, decoder and corresponding methods using compact mv storage
WO2017063168A1 (en) Image coding method and apparatus, and image processing device
WO2023142127A1 (en) Coding and decoding methods and apparatuses, device, and storage medium
TWI505717B (en) Joint scalar embedded graphics coding for color images
WO2021147463A1 (en) Video processing method and device, and electronic apparatus
CN113099232B (en) Video decoding method, video decoding device, electronic equipment and computer storage medium
US11967345B2 (en) System and method for rendering key and fill video streams for video processing
US9317891B2 (en) Systems and methods for hardware-accelerated key color extraction
WO2022219202A1 (en) System and method for rendering key and fill video streams for video processing
TW202408245A (en) Coding method and apparatus, decoding method and apparatus, and coder, decoder and storage medium
KR20170032605A (en) Method and apparatus for decoding a video signal with transmition of chroma sampling position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant