CN116647672A - Video GPU decoding image anomaly detection method and system based on shader - Google Patents


Info

Publication number
CN116647672A
Authority
CN
China
Prior art keywords
video frame
detection
gpu
video
abnormal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211113547.3A
Other languages
Chinese (zh)
Inventor
罗国鸿
王刚
王家宾
薛有义
李康炎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianyi Shilian Technology Co ltd
Original Assignee
Tianyi Digital Life Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianyi Digital Life Technology Co Ltd filed Critical Tianyi Digital Life Technology Co Ltd
Priority to CN202211113547.3A priority Critical patent/CN116647672A/en
Publication of CN116647672A publication Critical patent/CN116647672A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/04 Diagnosis, testing or measuring for television systems or their details for receivers
    • H04N 17/045 Self-contained testing apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/20 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44012 Processing of video elementary streams involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application provides a shader-based method and system for detecting anomalies in images decoded by a video GPU. Before video playback, a video frame detection shader program for abnormal video frame detection is configured into the GPU so that it can run in the GPU rendering pipeline. During playback, each video frame is checked by the video frame detection shader program inside the GPU and only a 1-unit-pixel detection result is returned to the CPU; based on that result, the CPU decides whether an abnormal video frame should be rendered. Because the entire detection flow runs inside the GPU, the application avoids both the heavy time cost of transferring hardware-decoded image frames from GPU video memory to CPU system memory and the CPU load of detecting abnormal image frames, thereby effectively preserving the compatibility of GPU decoding and the user's video playback experience.

Description

Video GPU decoding image anomaly detection method and system based on shader
Technical Field
The application relates to video monitoring and video playing technologies, in particular to a method and a system for detecting image anomalies.
Background
With the development of GPU technology, video encoded with widely used codecs such as H.264 and H.265 can be decoded in the GPU through the hardware decoding capability it provides, which greatly improves decoding efficiency compared with traditional software decoding on the CPU. However, GPU decoding often suffers from compatibility problems caused by encoding complexity, GPU driver defects and the like, and when the video source is affected by network packet loss or uses encoding parameters with poor compatibility, the decoded video frames are frequently abnormal, typically appearing as green, gray, black or similarly colored blocks covering part or all of the picture. To detect and handle these conditions, the decoded video frames would have to be copied from GPU video memory back to system memory for inspection by the CPU, which seriously degrades playback performance.
With the emergence and development of GPU shader technology, GPUs have become far more powerful and now offer general computing capability alongside graphics processing. In computer graphics, a shader is a program developed to run on the GPU: it gives the GPU a programmable drawing pipeline that can be programmed with a shader language instead of the traditional fixed-function pipeline, producing the pixels, vertices, textures and other elements of the final image. Algorithms defined in a shader can also dynamically adjust graphic coordinates and display attributes such as saturation, brightness and contrast, and external programs can modify a shader's behaviour by supplying it with external variables and textures.
Mainstream graphics libraries such as OpenGL and Direct3D support shader technology. Through the programmable graphics pipeline, the GPU runs different shaders at different stages; the pipeline contains at least two kinds of shaders, a vertex shader and a pixel shader. The vertex shader performs geometric operations such as coordinate transformation, projection and clipping on the vertices of the basic geometry fed into the pipeline, while the pixel shader performs calculation, texture mapping, color processing and similar operations on pixels during the rasterization stage to determine attributes such as the color of each rasterized pixel.
Shaders can be applied to the decoding and rendering process of a video GPU. Since video display is two-dimensional, it mainly involves rasterization and is therefore handled chiefly by the pixel shader. What is needed, then, is a method that uses the processing power of the GPU shader to efficiently detect anomalies in image frames after video decoding, so as to solve the display-anomaly and efficiency problems of GPU hardware decoding.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The application provides a method, based on GPU shader technology, for detecting abnormal pictures after video has been decoded by GPU hardware. The entire detection flow is carried out inside the GPU, which avoids both the heavy time cost of transferring hardware-decoded image frames from GPU video memory to CPU system memory and the CPU load of abnormal-frame detection, thereby effectively preserving the compatibility of GPU decoding and the user's video playback experience.
According to one embodiment of the present application, there is provided a method of detecting an abnormal video frame based on a shader, including: (1) configuring a video frame detection shader program for abnormal video frame detection into the GPU; (2) for each video frame, detecting the video frame with the video frame detection shader program in the GPU and returning a 1-unit-pixel detection result to the CPU; and (3) the CPU determining, based on the detection result, whether to render a video frame found to be abnormal.
According to one embodiment of the present application, there is provided a system for detecting an abnormal video frame based on a shader, including: a video frame detection load and run module configured to configure a video frame detection shader program for abnormal video frame detection into the GPU; a video frame detection shader program module configured to detect each video frame and return a 1-unit-pixel detection result to the CPU; and a rendering decision module, located in the CPU, configured to determine, based on the detection result, whether to render a video frame found to be abnormal.
According to another embodiment of the present application, there is provided a computing device for detecting an abnormal video frame based on a shader, including: one or more processors; a memory; and a system as described above.
These and other features and advantages will become apparent upon reading the following detailed description and upon reference to the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
Drawings
So that the manner in which the above recited features of the present application can be understood in detail, a more particular description of the application, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this application and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects.
Fig. 1 shows a prior art system architecture diagram 100 for detecting anomalous video frames based on a CPU.
Fig. 2 illustrates a system architecture diagram 200 for detecting anomalous video frames based on a shader in accordance with an embodiment of the application.
FIG. 3 illustrates a block diagram of a system 300 for detecting anomalous video frames based on a shader in accordance with an embodiment of the application.
FIG. 4 illustrates a flowchart of a method 400 for detecting an anomalous video frame based on a shader according to one embodiment of the application.
Fig. 5 shows a flowchart 500 further describing step 401.
Fig. 6 shows a flowchart 600 further describing step 402.
FIG. 7 illustrates a block diagram of an exemplary computing device 700, according to one embodiment of the application.
Detailed Description
The features of the present application will become more apparent from the detailed description set forth below when taken in conjunction with the drawings.
The following detailed description refers to the accompanying drawings that illustrate exemplary embodiments of the application. The scope of the application is not limited to the embodiments, however, but is defined by the appended claims. Accordingly, embodiments other than those shown in the figures, such as modified versions of the illustrated embodiments, are still encompassed by the present application.
Reference in the specification to "one embodiment," "an example embodiment," etc., means that the embodiment may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the relevant art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
The application describes various techniques for shader-based detection of video GPU decoding image anomalies. In a scenario where video is decoded by the GPU, the decoded video frames are matched, in the GPU shader, against configured anomaly reference textures or color values to detect whether they are abnormal. In application fields such as video monitoring, digital television, live streaming and video on demand, this allows the GPU to be used for efficient decoding while display anomalies caused by network packet loss, GPU driver compatibility problems and the like are caught, video playback performance is unaffected, and the user's playback experience is effectively improved.
Specifically, in the present application, before video playback a video frame detection shader program module for abnormal video frame detection is loaded into the runtime environment, compiled and initialized, so that it is ready to run in the GPU rendering pipeline. During playback, the GPU-decoded video frame and a 1-unit-pixel texture created to hold the detection result are set into the GPU rendering pipeline and the shader program is started. The shader program compares the video frame against an anomaly reference texture or a preset color constant, accumulates the percentage of matching pixels, checks whether that percentage exceeds a threshold, and renders the pass/fail result as a 1-unit-pixel texture in a distinct RGB color. The detection result is returned to the CPU, which decides according to the specific application scenario (for example, by a rendering decision) whether the video frame is output to the screen, and the detection cycle for the next video frame begins, until playback ends.
Fig. 1 shows a prior art system architecture diagram 100 for detecting abnormal video frames based on a CPU. As diagram 100 shows, CPU-based detection mainly includes: extracting video data from the encoded video source in the CPU, passing it to the GPU for decoding, transferring the decoded video frame back to the CPU for detection, having the CPU make a rendering decision based on the detection result, and informing the GPU whether to render.
The existing CPU-based approach therefore relies on the CPU to perform anomaly detection on video decoded by the GPU. Because the CPU cannot directly access data in GPU video memory, the precondition for CPU detection in the existing system architecture 100 is that the video frame data stored in GPU video memory must first be transferred, through the GPU driver, to system memory where the CPU can access it. The data size of one video frame depends on its resolution; for high definition video a single frame can contain more than a million pixels. Since the purpose of decoding video on the GPU is to exploit GPU performance to speed up decoding while avoiding CPU load, the existing architecture clearly runs counter to that purpose.
Fig. 2 illustrates a system architecture diagram 200 for detecting abnormal video frames based on a shader in accordance with an embodiment of the application. Referring to Fig. 2, in the video frame detection loading and running stage, the specified video frame detection GPU shader program is loaded, and the shader together with the image texture resources to be processed is set into the rendering pipeline of the GPU; after the pipeline is started, the shader program executes to complete the detection and the result is returned to the CPU. The video frame detection shader program is responsible, in the loading stage, for preparing the texture resources, color constants and/or thresholds needed during detection and, in the running stage, for performing the GPU-side shader computation on the input video frame and outputting the pass/fail result as a 1-unit-pixel texture.
Therefore, compared with the prior art CPU-based detection of abnormal video frames in Fig. 1, the application uses GPU shader technology so that both video decoding and anomaly detection are carried out in the GPU, avoiding the copying of video frame data from the GPU to the CPU and the associated CPU load; the only data transferred from the GPU to the CPU as the detection result is a single pixel.
FIG. 3 illustrates a block diagram of a system 300 for detecting abnormal video frames based on a shader in accordance with an embodiment of the application. The system 300 combines the video frames output by GPU decoding with GPU shader technology in the GPU rendering pipeline to detect video frame pictures. The system 300 may be implemented on a terminal running an application with video playback capability, for example on a client (e.g., a cell phone, tablet, desktop computer, smart home device, or other terminal used in video monitoring or video playback) and/or on a cloud server (e.g., a server located remotely from the client) running an application with real-time video or video playback functions.
The system 300 generally includes an encoded video data module 301, a video decoding module 302, a video frame detection load and run module 303, a video frame detection shader program module 304, a rendering decision module 305, and a video frame rendering module 306. The encoded video data module 301 and the rendering decision module 305 are implemented in the CPU, while the video decoding module 302, the video frame detection load and run module 303, the video frame detection shader program module 304 and the video frame rendering module 306 are implemented in the GPU. Those skilled in the art will understand that these modules are presented for illustration only; the functionality of one or more of them may be combined into a single module or split into multiple modules, and one or more of them may be implemented in software, hardware, or a combination thereof.
According to one embodiment of the application, the encoded video data module 301 is configured to obtain encoded video data from a video source and pass the encoded video data to the video decoding module 302 running on the GPU. Those skilled in the art will recognize that there are a variety of techniques for encoding video data that are not within the scope of the present application.
According to one embodiment of the application, video decoding module 302 is configured to decode encoded video data into one or more video frames through a decoding program interface of the GPU. Wherein the one or more video frames may be stored in the GPU memory for subsequent abnormal frame detection in the GPU.
According to one embodiment of the application, the video frame detection load and run module 303 is configured to load and initialize a specified video frame detection GPU shader program, execute the GPU shader program after the GPU rendering pipeline is started to complete the detection, and output the detection result.
For example, during loading of the GPU shader program, the video frame detection load and run module 303 sets the specified video frame detection GPU shader program, together with the image texture resources to be processed, into the rendering pipeline of the GPU. For example, the GPU shader program may be associated with the storage location of the video frame in GPU video memory by means of a handle or the like.
For another example, the video frame detection load and run module 303 is further configured to transmit a result (e.g., 1 unit pixel value) after detecting a video frame that needs to be detected to the rendering decision module 305 to further determine whether the video frame needs to be rendered.
According to one embodiment of the application, the video frame detection shader program module 304 is configured to prepare, during the loading phase, the anomaly comparison reference textures, color constants and/or thresholds needed for detection and, during the running phase, to perform the GPU-side shader computation on the input video frame and output the pass/fail detection result as a 1-unit-pixel texture.
According to one embodiment of the application, the anomaly comparison reference texture is the texture against which the video frame is compared during detection. Specifically, the anomaly comparison reference texture may include textures of various solid colors, such as the RGB of a green screen, the RGB of a gray screen, the RGB of a white screen, and/or the RGB of a blue screen. Those skilled in the art will appreciate that other comparison reference texture resources or color constants may also be used, based on the display forms commonly taken by abnormal video frames.
According to one embodiment of the application, the threshold may be the proportion of a video frame's pixels above which the frame is judged abnormal. In particular, different thresholds may be set for different anomaly comparison reference textures, for example one threshold for green-screen RGB, one for gray-screen RGB, one for white-screen RGB and/or one for blue-screen RGB. Such a threshold is therefore the ratio of pixels in the video frame that match pixels in the anomaly comparison reference texture, i.e., the proportion of abnormal pixels in the frame; for example, with a threshold of 0.9, a frame in which 95% of the pixels match the green-screen RGB would be judged abnormal.
According to one embodiment of the application, the detection result may be an RGB value representing that detection passed (e.g., the white RGB value (1, 1, 1)) or an RGB value representing that detection failed (e.g., the black RGB value (0, 0, 0)), which the GPU rendering pipeline renders into the 1-unit-pixel texture.
According to one embodiment of the present application, the rendering decision module 305 is configured to determine, when receiving the result that the video frame detection on the GPU side fails, whether the abnormal video frame needs to be rendered to the display screen according to a preset policy.
According to one embodiment of the application, the preset strategy includes, for example: rendering the abnormal video frame and recording the anomaly; rendering the abnormal video frame and raising an alarm; recording the anomaly without rendering the abnormal video frame (e.g., discarding it); or raising an alarm without rendering the abnormal video frame.
According to one embodiment of the present application, the video frame rendering module 306 is configured to render and output the video frames eventually available for display to a display screen according to the decision of the rendering decision module 305, thereby achieving video playback.
FIG. 4 illustrates a flowchart of a method 400 for detecting an anomalous video frame based on a shader according to one embodiment of the application. The method 400 mainly includes a shader program initialization stage before video playback and an abnormal video frame detection stage during video playback.
In step 401, a video frame detection shader program for outlier video frame detection is configured into the GPU prior to video playback to enable it to run in the GPU rendering pipeline.
In step 402, during video playback, each video frame is detected by the video frame detection shader program in the GPU and a 1-unit-pixel detection result is returned to the CPU; based on that result, the CPU determines whether to render a video frame found to be abnormal.
Fig. 5 shows a flowchart 500 further describing step 401.
In step 501, a video frame detection shader program for video frame detection is loaded into the GPU and initialized. Step 501 may be performed by video frame detection load and run module 303, among other things.
In step 502, an anomaly comparison reference texture and/or color constant for detection is created or loaded. According to one embodiment of the application, an abnormal-pixel ratio threshold used for matching may also be created or loaded. Step 502 may be performed by the video frame detection shader program module 304.
At step 503, the video frame detection shader program code to be run is compiled through a shader program compiler interface provided by the GPU graphics library, thereby enabling the compiled code to run in the GPU rendering pipeline. Step 503 may be performed by video frame detection shader program module 304, among other things.
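By way of illustration only, a minimal host-side sketch of this compilation step is given below, assuming the OpenGL graphics library, a function loader such as GLEW and a current GL context; the function name BuildDetectionProgram and the error handling shown are assumptions introduced here for clarity, not the application's actual implementation.

```cpp
// Sketch: compile and link the video frame detection shader program (OpenGL).
// Assumes GLEW (or another loader) and a current GL context; names are illustrative.
#include <GL/glew.h>
#include <cstdio>

GLuint BuildDetectionProgram(const char* vertexSource, const char* fragmentSource) {
    auto compile = [](GLenum type, const char* src) -> GLuint {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &src, nullptr);
        glCompileShader(shader);
        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
            std::fprintf(stderr, "shader compile error: %s\n", log);
        }
        return shader;
    };
    GLuint vs = compile(GL_VERTEX_SHADER, vertexSource);
    GLuint fs = compile(GL_FRAGMENT_SHADER, fragmentSource);
    GLuint program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);
    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked != GL_TRUE) {
        std::fprintf(stderr, "shader program link error\n");
    }
    // Shader objects are no longer needed once the program is linked.
    glDeleteShader(vs);
    glDeleteShader(fs);
    return program;
}
```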
Fig. 6 shows a flowchart 600 further describing step 402.
In step 601, video is decoded into video frames that are available for GPU rendering pipeline processing.
According to one embodiment of the application, the video in step 601 is encoded video data separated from a video source such as a local media file, a network video stream, or the like.
According to yet another embodiment of the present application, step 601 may be performed by video decoding module 302. The video decoding module 302 decodes the encoded video data separated by the encoded video data module 301 through a hardware decoding program interface provided by the GPU graphics library, and the decoded data is a video frame that can be used for GPU rendering pipeline processing.
In step 602, the video frame obtained in step 601 and the 1-unit pixel texture as a detection output are set to the GPU rendering pipeline and a video frame detection shader program is started. Step 602 may be performed by video frame detection load and run module 303, among other things.
According to one embodiment of the application, the video frame detection load and run module 303 sets the decoded video frame as the input of the GPU rendering pipeline and creates a texture of 1 unit pixel as the pipeline's render target. The video frame detection load and run module 303 then starts the GPU rendering pipeline, after which the GPU runs the loaded and configured video frame detection shader program.
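By way of illustration only, a minimal OpenGL sketch of this setup follows, assuming the decoded frame is already available as a GL texture (for example through decoder/GL interop); the helper SetupDetectionPass and the uniform names uVideoFrame, uRefColor and uThreshold are assumptions introduced here, not names from the application.

```cpp
// Sketch: bind the decoded frame as shader input and a 1x1 texture as render target.
// Assumes `program` was built as in the previous sketch and `videoFrameTex` already
// holds the GPU-decoded frame; all names are illustrative.
#include <GL/glew.h>

GLuint resultTex = 0, fbo = 0;

void SetupDetectionPass(GLuint program, GLuint videoFrameTex,
                        const float refColor[3], float threshold) {
    // 1x1 RGBA texture that will receive the single-pixel detection result.
    glGenTextures(1, &resultTex);
    glBindTexture(GL_TEXTURE_2D, resultTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, resultTex, 0);

    glUseProgram(program);
    // Expose the decoded frame to the shader through a sampler uniform (the "handle" association).
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, videoFrameTex);
    glUniform1i(glGetUniformLocation(program, "uVideoFrame"), 0);
    glUniform3f(glGetUniformLocation(program, "uRefColor"), refColor[0], refColor[1], refColor[2]);
    glUniform1f(glGetUniformLocation(program, "uThreshold"), threshold);

    glViewport(0, 0, 1, 1);  // one output pixel
    // Drawing a full-screen triangle or quad at this point runs the detection shader once.
}
```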
In step 603, the video frame detection shader program performs anomaly detection on the video frame and outputs a detection result of 1 unit pixel texture. Step 603 may be performed by video frame detection shader program module 304, among other things.
According to one embodiment of the present application, the video frame detection shader program matches the color value of each pixel in the input video frame against the anomaly comparison reference texture or a preset color constant, accumulates the percentage of matching pixels, and outputs one of two RGB color values depending on whether that percentage exceeds a preset threshold; the output is rendered into a 1-unit-pixel texture as the pass/fail result. For example, when the matching percentage is above the preset threshold, detection is regarded as failed (an anomaly has occurred) and the black RGB value (0, 0, 0) is output; otherwise detection is regarded as passed and the white RGB value (1, 1, 1) is output. The output is then rendered by the GPU rendering pipeline into the 1-unit-pixel texture as the detection result.
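By way of illustration only, a GLSL fragment shader sketch of this matching logic is shown below, embedded as a C++ string. Because the render target is a single pixel, one shader invocation samples the whole frame; the 64x64 sampling grid and the colour-distance tolerance are assumptions introduced here rather than details taken from the application.

```cpp
// Sketch of the detection fragment shader (GLSL embedded as a C++ string).
// A single invocation (1x1 render target) samples the frame on a 64x64 grid,
// counts pixels close to the reference colour, and writes white (pass) or
// black (fail). Grid size and tolerance are illustrative assumptions.
static const char* kDetectionFragmentSource = R"(
#version 330 core
uniform sampler2D uVideoFrame;   // decoded video frame
uniform vec3  uRefColor;         // anomaly comparison colour, e.g. green-screen RGB
uniform float uThreshold;        // abnormal-pixel ratio threshold, e.g. 0.9
out vec4 fragColor;

void main() {
    const int N = 64;            // sampling grid: N*N samples across the frame
    int matched = 0;
    for (int y = 0; y < N; ++y) {
        for (int x = 0; x < N; ++x) {
            vec2 uv = (vec2(x, y) + 0.5) / float(N);
            vec3 c = texture(uVideoFrame, uv).rgb;
            if (distance(c, uRefColor) < 0.05) {   // colour tolerance
                matched++;
            }
        }
    }
    float ratio = float(matched) / float(N * N);
    // Abnormal frame: ratio above threshold -> black; otherwise white.
    fragColor = (ratio > uThreshold) ? vec4(0.0, 0.0, 0.0, 1.0)
                                     : vec4(1.0, 1.0, 1.0, 1.0);
}
)";
```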
In step 604, the detection result obtained in step 603 is transmitted to the CPU, and it is determined whether to output the video frame to the screen according to a preset policy. Wherein step 604 may be performed by rendering decision module 305.
According to one embodiment of the application, the rendering decision module 305 reads the RGB values in the output 1-unit pixel texture from the GPU rendering pipeline and determines whether an anomaly exists based on the RGB values. Then, whether the video frame needs to be rendered to a display screen or not is determined according to a preset strategy.
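By way of illustration only, a minimal sketch of the CPU-side readback and decision follows, assuming the 1x1 framebuffer from the detection pass is still bound; the helper names and the simple white/black test are assumptions introduced here.

```cpp
// Sketch: read the single-pixel result back on the CPU and decide whether to render.
// Assumes the 1x1 FBO from the detection pass is bound; policy is illustrative.
#include <GL/glew.h>

bool FrameLooksNormal() {
    unsigned char rgba[4] = {0, 0, 0, 0};
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, rgba);  // only 1 pixel crosses to the CPU
    return rgba[0] > 127;  // white => detection passed, black => abnormal frame
}

void OnFrameDetected(bool render_abnormal_frames) {
    if (FrameLooksNormal()) {
        // Hand the frame to the normal rendering path (video frame rendering module).
    } else if (render_abnormal_frames) {
        // Preset policy: still render, but record the anomaly for reporting.
    } else {
        // Preset policy: drop the frame, record/alert, and skip this render pass.
    }
}
```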
According to one embodiment of the application, if the rendering decision module 305 determines that the video frame needs to be rendered to the display screen, the video frame rendering module 306 sets the display screen as the output target of the GPU rendering pipeline and renders the video frame as an input texture.
According to one embodiment of the application, if the rendering decision module 305 determines that the video frame does not need to be rendered to the display screen, the video frame detection load and run module 303 ends this run of the GPU rendering pipeline.
The steps of fig. 4-6 are further illustrated below according to a specific application scenario.
A device adopting the method (such as a client, a platform server or another terminal device) loads the green-screen and gray-screen detection shader programs before playback and, during playback, uses those shader programs to detect the video frames output after GPU decoding. In the detection process the RGB value of each pixel in the video frame is matched against the preset green-screen and gray-screen RGB values, and the proportion of matching pixels is counted. Depending on whether the green-screen or gray-screen proportion exceeds its threshold, the corresponding 1-unit-pixel RGB value is output as the detection result. According to the needs of the application scenario or a preset strategy, the rendering decision for the abnormal case is to discard the abnormal video frame, record and count the number of occurrences, and report to the platform side for follow-up handling.
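By way of illustration only, a sketch of how the green-screen and gray-screen references of this scenario might be configured is given below; the numeric RGB values (e.g. the green that zero-filled YUV buffers commonly convert to) and the 0.9 thresholds are assumptions, not values given in the application.

```cpp
// Sketch: illustrative reference colours and thresholds for the green-screen /
// gray-screen scenario. All numeric values are assumptions for this example.
struct AnomalyReference {
    const char* name;
    float rgb[3];      // normalised RGB used by the detection shader
    float threshold;   // abnormal-pixel ratio above which the frame is flagged
};

static const AnomalyReference kReferences[] = {
    {"green screen", {0.0f, 0.53f, 0.0f}, 0.90f},
    {"gray screen",  {0.5f, 0.5f, 0.5f},  0.90f},
};
```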
Compared with the prior art, the main advantages of the application are:
1. abnormal video frames are detected with the GPU's acceleration capability, based on GPU shader technology, freeing up CPU processing capacity;
2. copying of the GPU-decoded video frame texture to the CPU is avoided, and the detection result is represented by a single 1-unit-pixel RGB value, which effectively preserves overall playback performance;
3. the method is applicable to video monitoring, live streaming, video on demand and other scenarios, provides an efficient means of detecting abnormal video frames, and can improve compatibility and stability by discarding abnormal video frames or automatically falling back to software decoding.
FIG. 7 illustrates a block diagram of an exemplary computing device 700, which is one example of a hardware device (e.g., a terminal capable of implementing system 300) that may be used in connection with aspects of the application, according to one embodiment of the application.
With reference to FIG. 7, a computing device 700 will now be described as one example of a hardware device that may be employed with aspects of the present application. Computing device 700 may be any machine that may be configured to implement processes and/or calculations and may be, but is not limited to, a workstation, a server, a smart device, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, a vehicle computer, or any combination thereof. The various methods/modules/servers/smart devices described above may be implemented, in whole or in part, by computing device 700 or a similar device or system.
Computing device 700 may include components that may be connected to or in communication with a bus 702 via one or more interfaces. For example, computing device 700 may include a bus 702, one or more processors 704, one or more input devices 706, and one or more output devices 708. The one or more processors 704 may be any type of processor and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (e.g., special processing chips). Input device 706 may be any type of device capable of inputting information to a computing device and may include, but is not limited to, a camera, mouse, keyboard, touch screen, microphone, and/or remote controller. Output device 708 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. Computing device 700 may also include or be connected to a non-transitory storage device 710, which may be any device capable of non-transitory data storage and may include, but is not limited to, a disk drive, an optical storage device, a solid state memory, a floppy disk, a flexible disk, a hard disk, a magnetic tape, or any other magnetic medium, an optical disk or any other optical medium, a ROM (read only memory), a RAM (random access memory), a cache memory, and/or any memory chip or cartridge, and/or any other medium from which a computer may read data, instructions, and/or code. The non-transitory storage device 710 may be detachable from an interface. The non-transitory storage device 710 may have data/instructions/code for implementing the methods and steps described above. Computing device 700 may also include a communication device 712. The communication device 712 may be any type of device or system capable of enabling communication with external devices and/or with a network and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as a Bluetooth device, an IEEE 802.11 device, a WiFi device, a WiMax device, a cellular communication device, and/or the like.
Bus 702 can include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Computing device 700 may also include a working memory 714, which may be any type of working memory capable of storing instructions and/or data useful for the operation of processor 704 and may include, but is not limited to, a random access memory and/or a read-only memory device.
Software components may reside in the working memory 714 and include, but are not limited to, an operating system 716, one or more application programs 718, drivers, and/or other data and code. Instructions for implementing the above-described methods and steps of the present application may be included in the one or more application programs 718, and the above-described methods 400, 500, and 600 of the present application may be implemented by the processor 704 reading and executing the instructions of the one or more application programs 718.
It should also be appreciated that variations may be made according to particular needs. For example, custom hardware may be used, and/or particular components may be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. In addition, connections to other computing devices, such as network input/output devices, may be employed. For example, some or all of the disclosed methods and apparatus may be implemented with programmable hardware (e.g., programmable logic circuits including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) using an assembly language or a hardware programming language (e.g., VERILOG, VHDL, C++).
Although aspects of the present application have been described so far with reference to the accompanying drawings, the above-described methods, systems and apparatuses are merely examples, and the scope of the present application is not limited to these aspects but is limited only by the appended claims and equivalents thereof. Various components may be omitted or replaced with equivalent components. In addition, the steps may also be implemented in a different order than described in the present application. Furthermore, the various components may be combined in various ways. It is also important that as technology advances, many of the described components can be replaced by equivalent components that appear later.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application, and are intended to be included within the scope of the appended claims and description.

Claims (10)

1. A method of detecting an anomalous video frame based on a shader, comprising:
(1) Configuring a video frame detection shader program for abnormal video frame detection into the GPU; and
(2) For each video frame, detecting the video frame by the video frame detection shader program in the GPU, and transmitting a detection result of 1 unit pixel back to the CPU; and
(3) The CPU judges whether to render the abnormal video frame with the abnormality based on the detection result.
2. The method of claim 1, wherein step (1) further comprises:
loading the video frame detection shader program into the GPU and initializing the video frame detection shader program;
creating or loading an anomaly comparison reference texture or color constant for detection; and
the video frame detection shader program code is compiled through a shader program compiler interface provided by the GPU graphics library, enabling the compiled code to run in the GPU rendering pipeline.
3. The method of claim 2, wherein the anomaly-comparison reference texture comprises one or more solid-colored textures.
4. The method of claim 3, wherein step (2) further comprises:
matching a color value of each pixel in the video frame with the anomaly comparison reference texture or the color constant;
accumulating the percentage of matching;
outputting different RGB color values based on whether the percentage of matches is above a preset threshold; and
the RGB color values are rendered into a texture of 1 unit pixel as the detection result.
5. The method as recited in claim 4, further comprising:
outputting a black RGB value (0, 0, 0) if the percentage of matching is higher than the preset threshold; and
outputting a white RGB value (1, 1, 1) if the percentage of matching is not higher than the preset threshold.
6. The method of claim 4, wherein step (3) further comprises:
based on the detection result, determining whether the video frame needs to be rendered to a display screen according to a preset strategy;
wherein the preset strategy comprises one of the following: rendering the abnormal video frame and recording an abnormal result; rendering the abnormal video frames and providing alarm processing; recording an abnormal result without rendering the abnormal video frame; the abnormal video frames are not rendered, but rather alert processing is provided.
7. A system for detecting anomalous video frames based on a shader, comprising:
a video frame detection load and run module configured to: configuring a video frame detection shader program for abnormal video frame detection into the GPU;
a video frame detection shader program module configured to: for each video frame, detecting the video frame, and transmitting a detection result of 1 unit pixel back to the CPU; and
a rendering decision module configured in the CPU and configured to: and judging whether to render the abnormal video frame with the abnormality or not based on the detection result.
8. The system of claim 7, wherein the video frame detection shader program module is further configured to:
creating or loading an anomaly comparison reference texture or color constant for detection;
matching a color value of each pixel in the video frame with the anomaly comparison reference texture or the color constant;
accumulating the percentage of matching;
outputting different RGB color values based on whether the percentage of matches is above a preset threshold; and
the RGB color values are rendered into a texture of 1 unit pixel as the detection result.
9. The system of claim 8, wherein the video frame detection shader program module is further configured to:
outputting a black RGB value (0, 0, 0) if the percentage of matching is higher than the preset threshold; and
outputting a white RGB value (1, 1, 1) if the percentage of matching is not higher than the preset threshold.
10. A computing device for detecting anomalous video frames based on a shader, comprising:
one or more processors;
a memory; and
the system of claim 7.
CN202211113547.3A 2022-09-14 2022-09-14 Video GPU decoding image anomaly detection method and system based on shader Pending CN116647672A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211113547.3A CN116647672A (en) 2022-09-14 2022-09-14 Video GPU decoding image anomaly detection method and system based on shader

Publications (1)

Publication Number Publication Date
CN116647672A (en) 2023-08-25

Family

ID=87621767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211113547.3A Pending CN116647672A (en) 2022-09-14 2022-09-14 Video GPU decoding image anomaly detection method and system based on shader

Country Status (1)

Country Link
CN (1) CN116647672A (en)

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20240323
Address after: Unit 1, Building 1, China Telecom Zhejiang Innovation Park, No. 8 Xiqin Street, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province, 311100
Applicant after: Tianyi Shilian Technology Co.,Ltd.
Country or region after: China
Address before: Room 1423, No. 1256 and 1258, Wanrong Road, Jing'an District, Shanghai 200072
Applicant before: Tianyi Digital Life Technology Co.,Ltd.
Country or region before: China