CN112448962A - Video anti-aliasing display method and device, computer equipment and readable storage medium

Video anti-aliasing display method and device, computer equipment and readable storage medium

Info

Publication number
CN112448962A
Authority
CN
China
Prior art keywords
pixel
brightness
luminance
video frame
contour edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110128499.4A
Other languages
Chinese (zh)
Other versions
CN112448962B (en)
Inventor
肖正东
陈锡华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Happycast Technology Co Ltd
Original Assignee
Shenzhen Happycast Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Happycast Technology Co Ltd filed Critical Shenzhen Happycast Technology Co Ltd
Priority to CN202110128499.4A priority Critical patent/CN112448962B/en
Publication of CN112448962A publication Critical patent/CN112448962A/en
Application granted granted Critical
Publication of CN112448962B publication Critical patent/CN112448962B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/70 Media network packetisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80 Responding to QoS
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display

Abstract

The invention relates to the technical field of video player rendering, and discloses a video anti-aliasing display method and device, computer equipment and a readable storage medium. During video playing, an anti-aliasing filter operation is performed on each video frame once before it is rendered and displayed on the screen. After a source video frame is obtained by decoding, contour edge pixels in the source video frame are first identified; it is then judged whether the horizontal-direction brightness gradient or the vertical-direction brightness gradient of each contour edge pixel is within a preset brightness gradient range; if so, a corresponding pixel brightness compensation value is obtained according to the pixel brightness mean value of the corresponding surrounding pixels; a new texture for compensating and eliminating the aliasing phenomenon is then drawn based on the pixel brightness compensation values of all the contour edge pixels; finally, the source video frame and the new texture are rendered and displayed through frame buffering, thereby eliminating the aliasing (saw-tooth) phenomenon.

Description

Video anti-aliasing display method and device, computer equipment and readable storage medium
Technical Field
The invention belongs to the technical field of video player rendering, and particularly relates to a video anti-aliasing display method and device, computer equipment and a readable storage medium.
Background
Streaming Media refers to the technology and process of compressing a series of media data, sending the data over the network in segments, and transmitting video and audio over the network for viewing; the technology enables data packets to be sent as a stream. Without this technique, the entire media file would have to be downloaded before use. Streaming can transmit live video or films pre-stored on a server; when a viewer watches such video files, the video data is played immediately by dedicated playback software once it reaches the viewer's computer.
Streaming is mainly realized in two modes: sequential streaming (Progressive Streaming) and real-time streaming (Real Time Streaming). The former is sequential downloading: the viewer downloads the file while watching the online media, but can only watch the part already downloaded, not the part not yet downloaded; that is, the viewer always watches the information transmitted by the server after a delay. The latter means that, provided the connection bandwidth matches, the media can be watched in real time, and the viewer can jump freely forward or backward in the content while watching.
In current streaming media processing, the main approach is to pass the video stream data to the multimedia digital signal decoder MediaCodec, decode it with MediaCodec to obtain video frames and texture information, and finally render the texture information onto the video picture of each video frame in real time through a surface shader, thereby achieving output and display. However, when playing videos (for example, videos captured by different cameras or hosted on different video websites) or mirroring the desktop images of different devices in real time by screen projection, the quality of the captured video sources or of the website video pictures is uneven, so some videos are played and displayed with a severely jagged appearance, which gives audiences a poor viewing experience.
Disclosure of Invention
In order to solve the problem that severe jagging in some video pictures during playing and display gives audiences a poor viewing experience, the invention aims to provide a video anti-aliasing display method and device, a computer device and a readable storage medium.
In a first aspect, the present invention provides a video anti-aliasing display method, including:
decoding video stream data to obtain a source video frame;
identifying at least one contour edge pixel based on pixel intensity in the source video frame;
judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range;
if so, acquiring a pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixel;
drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame;
and rendering and displaying the source video frame and the new texture through frame buffering.
Based on the content of the invention, an anti-aliasing filter operation can be performed on each video frame during playback and before the frame is rendered and displayed on the screen. After the source video frame is obtained by decoding, the contour edge pixels in the source video frame are identified first; it is then judged whether the horizontal-direction brightness gradient or the vertical-direction brightness gradient of each contour edge pixel is within a preset brightness gradient range; if so, a corresponding pixel brightness compensation value is obtained according to the pixel brightness mean value of the corresponding surrounding pixels; a new texture for compensating and eliminating the aliasing phenomenon is then drawn based on the pixel brightness compensation values of all the contour edge pixels; finally, the source video frame and the new texture are rendered and displayed through frame buffering, so that the aliasing phenomenon is eliminated.
In one possible design, decoding video stream data to obtain source video frames includes:
and decoding the video stream data by using the multimedia digital signal decoder MediaCodec and OpenGL ES (OpenGL for Embedded Systems, the embedded-systems subset of the OpenGL image drawing library) to obtain the source video frame.
In one possible design, identifying at least one contour edge pixel based on pixel intensity in the source video frame includes:
calculating to obtain the pixel brightness of each pixel in the source video frame;
respectively determining the minimum value and the maximum value of pixel brightness in the corresponding pixel brightness, the pixel brightness of the left and right adjacent pixels and the pixel brightness of the upper and lower adjacent pixels aiming at each pixel in the source video frame;
calculating a pixel brightness range value according to the corresponding pixel brightness minimum value and the corresponding pixel brightness maximum value aiming at each pixel in the source video frame, wherein the pixel brightness range value is equal to the pixel brightness maximum value minus the corresponding pixel brightness minimum value;
respectively judging, for each pixel in the source video frame, whether the corresponding pixel brightness range value is smaller than the greater of the product of a preset brightness threshold and the corresponding pixel brightness maximum, and a preset brightness threshold coefficient;
if yes, the corresponding pixel is determined to be a contour edge pixel.
In one possible design, calculating the pixel brightness of each pixel in the source video frame includes:
the pixel brightness L of the pixel is calculated according to the following formula:

L = 0.587 × G + 0.299 × R

where G represents the green channel value of the pixel in RGB color mode and R represents the red channel value of the pixel in RGB color mode.
In one possible design, the determining whether the horizontal luminance gradient or the vertical luminance gradient of each of the at least one contour edge pixel is within a preset luminance gradient range includes:
for each contour edge pixel in the at least one contour edge pixel, calculating an edge horizontal-direction sawtooth index Ih and an edge vertical-direction sawtooth index Iv according to the following formulas:

Ih = |(Lul + Ldl) - 2Ll| + 2|(Lu + Ld) - 2Lc| + |(Lur + Ldr) - 2Lr|

Iv = |(Lul + Lur) - 2Lu| + 2|(Ll + Lr) - 2Lc| + |(Ldl + Ldr) - 2Ld|

where Lc represents the pixel brightness of the corresponding contour edge pixel; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; and Lul, Lur, Ldl and Ldr represent the pixel brightness of the corresponding upper-left, upper-right, lower-left and lower-right adjacent pixels;

respectively judging, for each contour edge pixel in the at least one contour edge pixel, whether the corresponding edge horizontal-direction sawtooth index Ih is greater than or equal to the corresponding edge vertical-direction sawtooth index Iv;

if so, judging whether the corresponding horizontal-direction brightness gradient is within the preset brightness gradient range, otherwise judging whether the corresponding vertical-direction brightness gradient is within the preset brightness gradient range.
In one possible design, obtaining the pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixel includes:
calculating the pixel brightness mean value Lavg of the surrounding pixels according to the following formula:

Lavg = [2(Lu + Ld) + 2(Ll + Lr) + (Lul + Ldl) + (Lur + Ldr)] / 12

where Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; and Lul, Ldl, Lur and Ldr represent the pixel brightness of the corresponding upper-left, lower-left, upper-right and lower-right adjacent pixels;

and subtracting the pixel brightness of the corresponding contour edge pixel from the pixel brightness mean value Lavg of the corresponding surrounding pixels to calculate the pixel brightness compensation value.
In one possible design, after a new texture is rendered based on pixel intensity compensation values of all contour edge pixels in the source video frame, the method further includes:
and caching the new texture in a texture buffer of the image drawing library OpenGL.
In a second aspect, the invention provides a video anti-aliasing display device, which comprises a video decoding module, an edge identification module, a gradient judgment module, a compensation value acquisition module, a texture drawing module and a rendering display module which are sequentially connected in communication;
the video decoding module is used for decoding video stream data and acquiring a source video frame;
the edge identification module is used for identifying at least one contour edge pixel according to the pixel brightness in the source video frame;
the gradient judging module is used for judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range;
the compensation value acquisition module is used for obtaining the pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixels when the horizontal-direction brightness gradient or the vertical-direction brightness gradient of the contour edge pixel is within the preset brightness gradient range;
the texture drawing module is used for drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame;
and the rendering display module is used for rendering and displaying the source video frame and the new texture through frame buffering.
In a third aspect, the present invention provides a computer device comprising a memory and a processor, wherein the memory is used for storing a computer program, and the processor is used for reading the computer program and executing the method according to the first aspect or any one of the possible designs of the first aspect.
In a fourth aspect, the present invention provides a readable storage medium having stored thereon instructions which, when executed on a computer, perform the method as set forth in the first aspect or any one of the possible designs of the first aspect.
In a fifth aspect, the present invention provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method as described above in the first aspect or any one of the possible designs of the first aspect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart illustrating a video anti-aliasing display method according to the present invention.
Fig. 2 is a diagram illustrating a positional relationship between a central pixel and peripheral pixels according to the present invention.
FIG. 3 is a comparative example diagram of the antialiasing effect before and after the video antialiasing display method is employed according to the present invention.
FIG. 4 is a schematic structural diagram of a video anti-aliasing display device provided by the invention.
Fig. 5 is a schematic structural diagram of a computer device provided by the present invention.
Detailed Description
The invention is further described with reference to the following figures and specific embodiments. It should be noted that the description of the embodiments is provided to help understanding of the present invention, but the present invention is not limited thereto. Specific structural and functional details disclosed herein are merely illustrative of example embodiments of the invention. This invention may, however, be embodied in many alternate forms and should not be construed as limited to the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments of the present invention.
It should be understood that, for the term "and/or" as may appear herein, it merely describes an association relationship of associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, B exists alone, or both A and B exist. For the term "/and" as may appear herein, which describes another association relationship, two relationships may exist; for example, A /and B may mean: A exists alone, or both A and B exist. In addition, for the character "/" that may appear herein, it generally means that the former and latter associated objects are in an "or" relationship.
It will be understood that when an element is referred to herein as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Conversely, if an element is referred to herein as being "directly connected" or "directly coupled" to another element, no intervening elements are present. In addition, other words used to describe the relationship between elements should be interpreted in a similar manner (e.g., "between" versus "directly between", "adjacent" versus "directly adjacent", etc.).
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used herein, specify the presence of stated features, quantities, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, quantities, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative designs, the functions/acts noted may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality/acts involved.
It should be understood that specific details are provided in the following description to facilitate a thorough understanding of example embodiments. However, it will be understood by those of ordinary skill in the art that the example embodiments may be practiced without these specific details. For example, systems may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, well-known processes, structures and techniques may be shown without unnecessary detail in order to avoid obscuring example embodiments.
As shown in figs. 1 to 3, the video anti-aliasing display method provided in the first aspect of this embodiment may be executed by, but is not limited to, a terminal device based on the Android system that can receive and play real-time streaming media, for example an electronic device such as a smartphone, a tablet computer or a screen-projection television. The video anti-aliasing display method may include, but is not limited to, the following steps S101 to S106.
S101, decoding video stream data to obtain a source video frame.
In step S101, the video stream data may be, but is not limited to, video data transmitted in real time over the Internet from a live audio/video capture device (e.g., a surveillance camera) or from a video server, on the premise of a matching connection bandwidth, so that a user of the terminal device can view it in real time. The specific decoding method is a conventional one. Preferably, the video stream data is decoded by using the multimedia digital signal decoder MediaCodec (the video encoding/decoding tool of the Android system, which is more efficient than the multimedia processing tool FFmpeg because it uses hardware decoding) together with OpenGL ES (OpenGL for Embedded Systems, a subset of the OpenGL three-dimensional graphics API designed for embedded devices such as mobile phones, PDAs and game consoles) to obtain the source video frame. In addition, when the source video frame is decoded, the corresponding texture information can be decoded as well (texture is a common term in computer graphics that covers both the surface texture of an object in the general sense, such as uneven grooves on the object surface, and color patterns on a smooth object surface), so that the texture information can be rendered onto the video picture of the video frame in real time through the frame buffer and a surface shader, achieving output and display.
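For illustration, a minimal Kotlin sketch of this decoding setup follows; it assumes an H.264 (AVC) stream of known dimensions and an already-current EGL context, and the function name is illustrative rather than part of the patent.

```kotlin
import android.graphics.SurfaceTexture
import android.media.MediaCodec
import android.media.MediaFormat
import android.opengl.GLES11Ext
import android.opengl.GLES20
import android.view.Surface

// Sketch: create a hardware H.264 decoder whose output frames land on an
// external-OES OpenGL ES texture (an EGL context must already be current).
fun createDecoderBoundToGlTexture(width: Int, height: Int): Pair<MediaCodec, SurfaceTexture> {
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0])
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)

    // Each decoded frame released for rendering becomes a source video
    // frame available on this SurfaceTexture.
    val surfaceTexture = SurfaceTexture(tex[0])
    val codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height)
    codec.configure(format, Surface(surfaceTexture), null, 0)
    codec.start()
    return codec to surfaceTexture
}
```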
S102, identifying at least one contour edge pixel according to the pixel brightness in the source video frame.
In step S102, a contour edge pixel is a pixel located on the contour edge of an object in the current video picture of the source video frame; as shown in fig. 3, the contour edge of the hammer is pieced together from a plurality of contour edge pixels, and, as the hammer in the left half of fig. 3 shows, such contour edges are highly susceptible to jagging. To ensure the accuracy of identifying contour edge pixels, identifying at least one contour edge pixel according to the pixel brightness in the source video frame preferably includes, but is not limited to, the following steps S1021 to S1025.
And S1021, calculating the pixel brightness of each pixel in the source video frame.
In step S1021, the pixel brightness L of the pixel may be calculated according to, but is not limited to, the following formula:

L = 0.587 × G + 0.299 × R

where G represents the green channel value of the pixel in RGB color mode and R represents the red channel value of the pixel in RGB color mode.
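For illustration, here is a one-line Kotlin sketch of this step; the concrete weights are an assumption (the standard red/green luma weights), since the text only states that the green and red channel values are used.

```kotlin
// Sketch of step S1021: approximate pixel brightness from the red and green
// channels only (weights assumed; blue is ignored because green and red
// dominate perceived brightness).
fun pixelLuminance(r: Float, g: Float): Float = 0.587f * g + 0.299f * r
```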
S1022, aiming at each pixel in the source video frame, the minimum value and the maximum value of the pixel brightness in the corresponding pixel brightness, the pixel brightness of the left and right adjacent pixels and the pixel brightness of the upper and lower adjacent pixels are respectively determined.
In the step S1022, as shown in fig. 2, for the central pixel, the corresponding minimum pixel brightness value is the minimum value among the pixel brightness of the central pixel, the pixel brightness of the left adjacent pixel, the pixel brightness of the right adjacent pixel, the pixel brightness of the upper adjacent pixel and the pixel brightness of the lower adjacent pixel, and the corresponding maximum pixel brightness value is the maximum value among the pixel brightness of the central pixel, the pixel brightness of the left adjacent pixel, the pixel brightness of the right adjacent pixel, the pixel brightness of the upper adjacent pixel and the pixel brightness of the lower adjacent pixel.
And S1023, aiming at each pixel in the source video frame, calculating to obtain a pixel brightness range value according to the corresponding pixel brightness minimum value and the corresponding pixel brightness maximum value, wherein the pixel brightness range value is equal to the pixel brightness maximum value minus the corresponding pixel brightness minimum value.
And S1024, respectively judging, for each pixel in the source video frame, whether the corresponding pixel brightness range value is smaller than the greater of the product of a preset brightness threshold and the corresponding pixel brightness maximum, and a preset brightness threshold coefficient.
In step S1024, the preset brightness threshold is, for example, 0.0112; the preset brightness threshold coefficient is, for example, 0.125.
S1025, if yes, the corresponding pixel is judged to be a contour edge pixel.
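Combining steps S1022 to S1025, a minimal Kotlin sketch of the edge-pixel test might look as follows, using the example constants given above; the function and parameter names are illustrative only.

```kotlin
// Example constants from the text: preset brightness threshold and
// preset brightness threshold coefficient.
const val EDGE_THRESHOLD = 0.0112f
const val EDGE_THRESHOLD_COEFF = 0.125f

// Sketch of steps S1022-S1025 for one pixel, given its own brightness c and
// the brightness of its left/right/upper/lower adjacent pixels.
fun isContourEdgePixel(c: Float, left: Float, right: Float, up: Float, down: Float): Boolean {
    val lumaMin = minOf(minOf(c, left, right), minOf(up, down))   // S1022
    val lumaMax = maxOf(maxOf(c, left, right), maxOf(up, down))
    val range = lumaMax - lumaMin                                  // S1023
    // S1024/S1025: a range below the greater of (threshold x brightness
    // maximum) and the threshold coefficient marks a contour edge pixel,
    // exactly as the text words the test.
    return range < maxOf(EDGE_THRESHOLD * lumaMax, EDGE_THRESHOLD_COEFF)
}
```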
S103, judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range.
In step S103, the horizontal-direction brightness gradient may be determined from the pixel brightness of the left-adjacent pixel, the pixel brightness of the center pixel and the pixel brightness of the right-adjacent pixel by a conventional image brightness gradient calculation method; the vertical-direction brightness gradient may likewise be determined from the pixel brightness of the upper-adjacent pixel, the pixel brightness of the center pixel and the pixel brightness of the lower-adjacent pixel. The preset brightness gradient range can be determined through limited experiments according to the picture quality requirements of the anti-aliasing display effect. As the hammer in the left half of fig. 3 shows, jagging arises where several consecutive pixels have similar or even identical brightness in the horizontal or vertical direction. Therefore, in order to accurately identify this situation, judging whether the horizontal-direction brightness gradient or the vertical-direction brightness gradient of each of the at least one contour edge pixel is within the preset brightness gradient range preferably includes, but is not limited to, the following steps S1031 to S1033.
S1031, for each contour edge pixel in the at least one contour edge pixel, calculating an edge horizontal-direction sawtooth index Ih and an edge vertical-direction sawtooth index Iv according to the following formulas:

Ih = |(Lul + Ldl) - 2Ll| + 2|(Lu + Ld) - 2Lc| + |(Lur + Ldr) - 2Lr|

Iv = |(Lul + Lur) - 2Lu| + 2|(Ll + Lr) - 2Lc| + |(Ldl + Ldr) - 2Ld|

where Lc represents the pixel brightness of the corresponding contour edge pixel; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; and Lul, Lur, Ldl and Ldr represent the pixel brightness of the corresponding upper-left, upper-right, lower-left and lower-right adjacent pixels.

S1032, respectively judging, for each contour edge pixel in the at least one contour edge pixel, whether the corresponding edge horizontal-direction sawtooth index Ih is greater than or equal to the corresponding edge vertical-direction sawtooth index Iv.
S1033, if yes, judging whether the corresponding horizontal direction brightness gradient is located within the preset brightness gradient range, otherwise, judging whether the corresponding vertical direction brightness gradient is located within the preset brightness gradient range.
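For illustration, a Kotlin sketch of steps S1031 to S1033 follows; the doubled weight on the middle term follows the formulas as reconstructed above and, like them, is an assumption where the original formula images are unavailable, and the function name is illustrative.

```kotlin
import kotlin.math.abs

// Sketch of S1031/S1032 on a 3x3 brightness neighbourhood: ul/u/ur are the
// upper row, l/c/r the middle row, dl/d/dr the lower row.
fun isHorizontalDominant(
    ul: Float, u: Float, ur: Float,
    l: Float, c: Float, r: Float,
    dl: Float, d: Float, dr: Float
): Boolean {
    // Edge horizontal-direction sawtooth index Ih.
    val ih = abs((ul + dl) - 2f * l) + 2f * abs((u + d) - 2f * c) + abs((ur + dr) - 2f * r)
    // Edge vertical-direction sawtooth index Iv.
    val iv = abs((ul + ur) - 2f * u) + 2f * abs((l + r) - 2f * c) + abs((dl + dr) - 2f * d)
    // S1032/S1033: when Ih >= Iv, test the horizontal-direction brightness
    // gradient against the preset range; otherwise test the vertical one.
    return ih >= iv
}
```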
And S104, if so, acquiring a pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixel.
In step S104, in order to obtain the pixel brightness compensation value quickly and accurately, obtaining the pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixels preferably includes, but is not limited to, the following steps S1041 to S1042.
S1041, calculating the pixel brightness mean value Lavg of the surrounding pixels according to the following formula:

Lavg = [2(Lu + Ld) + 2(Ll + Lr) + (Lul + Ldl) + (Lur + Ldr)] / 12

where Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; and Lul, Ldl, Lur and Ldr represent the pixel brightness of the corresponding upper-left, lower-left, upper-right and lower-right adjacent pixels.

S1042, subtracting the pixel brightness of the corresponding contour edge pixel from the pixel brightness mean value Lavg of the corresponding surrounding pixels to calculate the pixel brightness compensation value.
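A corresponding Kotlin sketch of steps S1041 and S1042 follows; the 2:1 weighting of axis-aligned over diagonal neighbours and the division by 12 follow the mean formula as reconstructed above and are assumptions.

```kotlin
// Sketch of S1041/S1042: weighted mean of the eight surrounding brightness
// values minus the centre brightness gives the compensation value.
fun pixelBrightnessCompensation(
    c: Float,
    u: Float, d: Float, l: Float, r: Float,
    ul: Float, ur: Float, dl: Float, dr: Float
): Float {
    val mean = (2f * (u + d) + 2f * (l + r) + (ul + dl) + (ur + dr)) / 12f
    return mean - c  // positive values brighten the edge pixel, negative darken it
}
```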
And S105, drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame.
In step S105, since the pixel brightness compensation values of the relevant contour edge pixels have been determined through the foregoing steps S103 to S104, a new texture for compensating and eliminating the aliasing phenomenon can be drawn in a conventional manner based on the obtained pixel brightness compensation values. Furthermore, to make the new texture easy to use in subsequent rendering, it may preferably be cached in a texture buffer of the image drawing library OpenGL, as sketched below.
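As an illustration of this caching step, the following Kotlin sketch allocates an RGBA texture and attaches it to a framebuffer object so the compensation layer can be drawn into it off-screen; the draw call itself is elided and the function name is illustrative.

```kotlin
import android.opengl.GLES20

// Sketch: allocate an RGBA texture and attach it to a framebuffer object so
// the compensation layer can be drawn off-screen and kept for step S106.
fun createCompensationTexture(width: Int, height: Int): Int {
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0])
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
        0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)

    val fbo = IntArray(1)
    GLES20.glGenFramebuffers(1, fbo, 0)
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0])
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0)
    // ... draw the per-pixel brightness compensation into the texture here ...
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0)  // restore default target
    return tex[0]
}
```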
And S106, rendering and displaying the source video frame and the new texture through frame buffering.
Therefore, according to the video anti-aliasing display scheme described in steps S101 to S106, an anti-aliasing filter operation can be performed on each video frame during playback and before the frame is rendered and displayed on the screen. After the source video frame is obtained by decoding, the contour edge pixels in the source video frame are identified first; it is then judged whether the horizontal-direction brightness gradient or the vertical-direction brightness gradient of each contour edge pixel is within the preset brightness gradient range; if so, the corresponding pixel brightness compensation value is obtained according to the pixel brightness mean value of the corresponding surrounding pixels; a new texture for compensating and eliminating the aliasing phenomenon is then drawn based on the pixel brightness compensation values of all the contour edge pixels; finally, the source video frame and the new texture are rendered and displayed through frame buffering, achieving the purpose of eliminating aliasing. As shown in fig. 3, after the video anti-aliasing display method provided by this embodiment is applied to the hammer video picture in the left half area, a significant anti-aliasing effect is achieved.
As shown in fig. 4, a second aspect of this embodiment provides a virtual device for implementing the video anti-aliasing display method of the first aspect or any one of its possible designs, the virtual device comprising a video decoding module, an edge identification module, a gradient judgment module, a compensation value acquisition module, a texture drawing module and a rendering display module which are sequentially connected in communication;
the video decoding module is used for decoding video stream data and acquiring a source video frame;
the edge identification module is used for identifying at least one contour edge pixel according to the pixel brightness in the source video frame;
the gradient judging module is used for judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range;
the compensation value acquisition module is used for obtaining the pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixels when the horizontal-direction brightness gradient or the vertical-direction brightness gradient of the contour edge pixel is within the preset brightness gradient range;
the texture drawing module is used for drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame;
and the rendering display module is used for rendering and displaying the source video frame and the new texture through frame buffering.
For the working process, working details and technical effects of the foregoing apparatus provided in the second aspect of this embodiment, reference may be made to the method described in the first aspect or any one of the possible designs of the first aspect, which is not described herein again.
As shown in fig. 5, a third aspect of this embodiment provides a computer device for executing the video anti-aliasing display method of the first aspect or any one of its possible designs. The computer device includes a memory and a processor which are communicatively connected, wherein the memory is used to store a computer program and the processor is used to read the computer program and execute the video anti-aliasing display method of the first aspect or any one of its possible designs. For example, the memory may include, but is not limited to, a Random-Access Memory (RAM), a Read-Only Memory (ROM), a Flash Memory, a First-In First-Out memory (FIFO) and/or a First-In Last-Out memory (FILO); the processor may be, but is not limited to, a microprocessor of the STM32F105 family. In addition, the computer device may also include, but is not limited to, a power module, a display screen and other necessary components.
For the working process, working details, and technical effects of the foregoing computer device provided in the third aspect of this embodiment, reference may be made to the method in the first aspect or any one of the possible designs in the first aspect, which is not described herein again.
A fourth aspect of this embodiment provides a readable storage medium storing instructions for the video anti-aliasing display method of the first aspect or any one of its possible designs; that is, the readable storage medium has instructions stored thereon which, when executed on a computer, perform the video anti-aliasing display method of the first aspect or any one of its possible designs. The readable storage medium is a carrier for storing data and may include, but is not limited to, a floppy disk, an optical disk, a hard disk, a flash memory, a flash disk and/or a Memory Stick; the computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
For the working process, the working details and the technical effects of the foregoing readable storage medium provided in the fourth aspect of this embodiment, reference may be made to the method in the first aspect or any one of the possible designs in the first aspect, which is not described herein again.
A fifth aspect of the present embodiments provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform a video antialiasing display method as defined in the first aspect or any one of the possible designs of the first aspect. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable devices.
The units described in the above embodiments as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement this without inventive effort.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: modifications may be made to the embodiments described above, or equivalents may be substituted for some of the features described. And such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Finally, it should be noted that the present invention is not limited to the above alternative embodiments, and anyone may derive various other forms of products in light of the present invention. The above detailed description should not be taken as limiting the scope of the invention, which is defined by the claims; the description is to be interpreted accordingly.

Claims (10)

1. A video antialiasing display method, comprising:
decoding video stream data to obtain a source video frame;
identifying at least one contour edge pixel based on pixel intensity in the source video frame;
judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range;
if so, acquiring a pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixel;
drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame;
and rendering and displaying the source video frame and the new texture through frame buffering.
2. The method of claim 1, wherein decoding video stream data to obtain source video frames comprises:
and decoding the video stream data by using the multimedia digital signal decoder MediaCodec and OpenGL ES (OpenGL for Embedded Systems, the embedded-systems subset of the OpenGL image drawing library) to obtain the source video frame.
3. The method of claim 1, wherein identifying at least one contour edge pixel based on pixel intensity in the source video frame comprises:
calculating to obtain the pixel brightness of each pixel in the source video frame;
respectively determining the minimum value and the maximum value of pixel brightness in the corresponding pixel brightness, the pixel brightness of the left and right adjacent pixels and the pixel brightness of the upper and lower adjacent pixels aiming at each pixel in the source video frame;
calculating a pixel brightness range value according to the corresponding pixel brightness minimum value and the corresponding pixel brightness maximum value aiming at each pixel in the source video frame, wherein the pixel brightness range value is equal to the pixel brightness maximum value minus the corresponding pixel brightness minimum value;
respectively judging, for each pixel in the source video frame, whether the corresponding pixel brightness range value is smaller than the greater of the product of a preset brightness threshold and the corresponding pixel brightness maximum, and a preset brightness threshold coefficient;
if yes, the corresponding pixel is determined to be a contour edge pixel.
4. The method of claim 3, wherein calculating the pixel intensity for each pixel in the source video frame comprises:
the pixel brightness L of the pixel is calculated according to the following formula:

L = 0.587 × G + 0.299 × R

where G represents the green channel value of the pixel in RGB color mode and R represents the red channel value of the pixel in RGB color mode.
5. The method of claim 1, wherein determining whether the horizontal luminance gradient or the vertical luminance gradient of each of the at least one contour edge pixel is within a predetermined luminance gradient range comprises:
for each contour edge pixel in the at least one contour edge pixel, calculating an edge horizontal-direction sawtooth index Ih and an edge vertical-direction sawtooth index Iv according to the following formulas:

Ih = |(Lul + Ldl) - 2Ll| + 2|(Lu + Ld) - 2Lc| + |(Lur + Ldr) - 2Lr|

Iv = |(Lul + Lur) - 2Lu| + 2|(Ll + Lr) - 2Lc| + |(Ldl + Ldr) - 2Ld|

where Lc represents the pixel brightness of the corresponding contour edge pixel; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; and Lul, Lur, Ldl and Ldr represent the pixel brightness of the corresponding upper-left, upper-right, lower-left and lower-right adjacent pixels;

respectively judging, for each contour edge pixel in the at least one contour edge pixel, whether the corresponding edge horizontal-direction sawtooth index Ih is greater than or equal to the corresponding edge vertical-direction sawtooth index Iv;
If so, judging whether the corresponding horizontal brightness gradient is located in the preset brightness gradient range, otherwise, judging whether the corresponding vertical brightness gradient is located in the preset brightness gradient range.
6. The method of claim 1, wherein obtaining the pixel intensity compensation value for the corresponding contour edge pixel based on the pixel intensity mean value for the corresponding surrounding pixel comprises:
calculating the pixel brightness mean value Lavg of the surrounding pixels according to the following formula:

Lavg = [2(Lu + Ld) + 2(Ll + Lr) + (Lul + Ldl) + (Lur + Ldr)] / 12

where Lu and Ld represent the pixel brightness of the corresponding upper-adjacent and lower-adjacent pixels; Ll and Lr represent the pixel brightness of the corresponding left-adjacent and right-adjacent pixels; and Lul, Ldl, Lur and Ldr represent the pixel brightness of the corresponding upper-left, lower-left, upper-right and lower-right adjacent pixels;

and subtracting the pixel brightness of the corresponding contour edge pixel from the pixel brightness mean value Lavg of the corresponding surrounding pixels to calculate the pixel brightness compensation value.
7. The method of claim 1, wherein after rendering a new texture based on pixel intensity compensation values for all contour edge pixels in the source video frame, the method further comprises:
and caching the new texture in a texture buffer of the image drawing library OpenGL.
8. A video anti-aliasing display device, characterized by comprising a video decoding module, an edge identification module, a gradient judgment module, a compensation value acquisition module, a texture drawing module and a rendering display module which are sequentially connected in communication;
the video decoding module is used for decoding video stream data and acquiring a source video frame;
the edge identification module is used for identifying at least one contour edge pixel according to the pixel brightness in the source video frame;
the gradient judging module is used for judging whether the horizontal direction brightness gradient or the vertical direction brightness gradient of each contour edge pixel in the at least one contour edge pixel is within a preset brightness gradient range;
the compensation value acquisition module is used for obtaining the pixel brightness compensation value of the corresponding contour edge pixel according to the pixel brightness mean value of the corresponding surrounding pixels when the horizontal-direction brightness gradient or the vertical-direction brightness gradient of the contour edge pixel is within the preset brightness gradient range;
the texture drawing module is used for drawing to obtain a new texture according to the pixel brightness compensation values of all contour edge pixels in the source video frame;
and the rendering display module is used for rendering and displaying the source video frame and the new texture through frame buffering.
9. A computer device comprising a memory and a processor communicatively coupled, wherein the memory is configured to store a computer program and the processor is configured to read the computer program and perform the method of any of claims 1 to 7.
10. A readable storage medium having stored thereon instructions which, when executed on a computer, perform the method of any one of claims 1 to 7.
CN202110128499.4A 2021-01-29 2021-01-29 Video anti-aliasing display method and device, computer equipment and readable storage medium Active CN112448962B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110128499.4A CN112448962B (en) 2021-01-29 2021-01-29 Video anti-aliasing display method and device, computer equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110128499.4A CN112448962B (en) 2021-01-29 2021-01-29 Video anti-aliasing display method and device, computer equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112448962A (en) 2021-03-05
CN112448962B (en) 2021-04-27

Family

ID=74739977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110128499.4A Active CN112448962B (en) 2021-01-29 2021-01-29 Video anti-aliasing display method and device, computer equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112448962B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030214513A1 (en) * 2002-05-14 2003-11-20 Microsoft Corporation Type size dependent anti-aliasing in sub-pixel precision rendering systems
CN104335567A (en) * 2012-05-15 2015-02-04 夏普株式会社 Video-processing device, video-processing method, television receiver, program, and recording medium
CN106204453A (en) * 2016-07-14 2016-12-07 京东方科技集团股份有限公司 The interpolation method of a kind of image and device
CN108881875A (en) * 2018-08-16 2018-11-23 Oppo广东移动通信有限公司 Image white balancing treatment method, device, storage medium and terminal
CN111526362A (en) * 2019-02-01 2020-08-11 华为技术有限公司 Inter-frame prediction method and device
CN111640150A (en) * 2019-09-20 2020-09-08 于贵庆 Video data source analysis system and method
CN110908510A (en) * 2019-11-08 2020-03-24 四川大学 Application method of oblique photography modeling data in immersive display equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463687A (en) * 2022-04-12 2022-05-10 北京云恒科技研究院有限公司 Movement track prediction method based on big data
CN114463687B (en) * 2022-04-12 2022-07-08 北京云恒科技研究院有限公司 Movement track prediction method based on big data
CN114897744A (en) * 2022-07-14 2022-08-12 深圳乐播科技有限公司 Image-text correction method and device
CN114897744B (en) * 2022-07-14 2022-12-09 深圳乐播科技有限公司 Image-text correction method and device
CN116631319A (en) * 2023-05-29 2023-08-22 上海傲显科技有限公司 Screen display compensation method, intelligent terminal and storage medium
CN116631319B (en) * 2023-05-29 2024-05-14 上海傲显科技有限公司 Screen display compensation method, intelligent terminal and storage medium

Also Published As

Publication number Publication date
CN112448962B (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN109618179B (en) Rapid play starting method and device for ultra-high definition video live broadcast
CN109379624B (en) Video processing method and device, electronic equipment and storage medium
CN109168014B (en) Live broadcast method, device, equipment and storage medium
CN112448962B (en) Video anti-aliasing display method and device, computer equipment and readable storage medium
US11184646B2 (en) 360-degree panoramic video playing method, apparatus, and system
US20140092439A1 (en) Encoding images using a 3d mesh of polygons and corresponding textures
CN110868625A (en) Video playing method and device, electronic equipment and storage medium
CN109640167B (en) Video processing method and device, electronic equipment and storage medium
EP3745700A1 (en) Virtual image processing method, image processing system, and storage medium
CN112153082B (en) Method and device for smoothly displaying real-time streaming video picture in android system
CN110012336B (en) Picture configuration method, terminal and device of live interface
CN107454434A (en) Virtual reality net cast method and video playing terminal
CN113243112A (en) Streaming volumetric and non-volumetric video
CN112884665A (en) Animation playing method and device, computer equipment and storage medium
CN104717509A (en) Method and device for decoding video
CN115761090A (en) Special effect rendering method, device, equipment, computer readable storage medium and product
CN110049347B (en) Method, system, terminal and device for configuring images on live interface
CN110120039B (en) Screen detection method, screen detection device, electronic equipment and readable storage medium
CN109658488B (en) Method for accelerating decoding of camera video stream through programmable GPU in virtual-real fusion system
CN113852860A (en) Video processing method, device, system and storage medium
US8994789B2 (en) Digital video signal, a method for encoding of a digital video signal and a digital video signal encoder
WO2016161899A1 (en) Multimedia information processing method, device and computer storage medium
CN113366842A (en) System and method for content layer based video compression
CN111406404A (en) Compression method, decompression method, system and storage medium for obtaining video file
CN112423108B (en) Method and device for processing code stream, first terminal, second terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant