CN115661145A - Cloud application bad frame detection method and device, electronic equipment and storage medium - Google Patents

Cloud application bad frame detection method and device, electronic equipment and storage medium

Info

Publication number
CN115661145A
Authority
CN
China
Prior art keywords
data space
cloud application
image
condition
application image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211658381.3A
Other languages
Chinese (zh)
Other versions
CN115661145B (en)
Inventor
邓宝宽
温健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Haima Cloud Technology Co ltd
Original Assignee
Haima Cloud Tianjin Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haima Cloud Tianjin Information Technology Co Ltd
Priority to CN202211658381.3A
Publication of CN115661145A
Application granted
Publication of CN115661145B
Legal status: Active (current)


Abstract

The application provides a cloud application bad frame detection method and device, an electronic device, and a storage medium. The method includes: acquiring a cloud application image and calculating a pixel value index of a target data space in the cloud application image, where the pixel value index includes at least one of a numerical value change amplitude, a standard deviation, and a variance, and the cloud application image is at least one of an image sent to a terminal by a cloud server, an image decoded by the terminal, and an image displayed by the terminal; and judging, based on the pixel value index of the target data space in the cloud application image, whether the cloud application image is a bad frame using a basic principle judgment condition and/or an exclusion principle judgment condition. In this way, real-time bad frame detection can be realized at the nodes of the whole cloud application image transmission link, while ensuring that the occupation of CPU, memory, and network resources does not affect the operation of the cloud application main process.

Description

Cloud application bad frame detection method and device, electronic equipment and storage medium
Technical Field
The invention relates to the field of cloud application, in particular to a cloud application bad frame detection method and device, electronic equipment and a storage medium.
Background
With the development of cloud application (e.g., cloud gaming) technology, this mode of running applications, which consumes very little performance on the user terminal, has become increasingly accepted by application companies, platforms, and users. Image transmission quality is one of the most important performance indicators of a cloud application system, and high-quality application image transmission is an essential precondition for a player to play normally in the cloud.
At present, various problems in the online system can cause the user side to see bad frame images, which appear as a fully black picture, a fully green picture, or a picture whose contours are visible but covered by a layer of green "skin". These cases can be divided into the following situations: (1) the image is continuously displayed as a monochrome image (black screen, green screen); (2) the image continuously shows contours but is covered by a single color (green skin); (3) the image is continuously garbled (similar to an image decoded from incomplete frame data); (4) the above abnormalities appear occasionally rather than continuously. The position where a bad frame occurs therefore needs to be located accurately. However, the cloud application image transmission link contains many nodes: from application image output and capture to transmission and display, the image data passes through and is processed by multiple nodes, and in theory any of them may fail and cause an image abnormality. Moreover, troubleshooting only after users or customer service agents report a problem is passive, and does not allow problems to be discovered and resolved as early as possible.
In view of this, how to provide a bad frame detection scheme that can realize real-time bad frame detection at the nodes of the whole cloud application image transmission link is a technical problem to be solved urgently.
Disclosure of Invention
Therefore, the embodiment of the application provides a cloud application bad frame detection method and device, an electronic device and a storage medium, which can realize real-time detection of a bad frame of a whole cloud application image transmission link node.
In a first aspect, an embodiment of the present application provides a cloud application bad frame detection method, including:
the method comprises the steps of obtaining a cloud application image, and calculating a pixel value index of a target data space in the cloud application image, wherein the pixel value index comprises: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
and judging whether the cloud application image is a bad frame or not by adopting a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the cloud application image.
In a second aspect, an embodiment of the present application further provides a device for detecting a bad frame of a cloud application, including:
the computing unit is used for acquiring a cloud application image and calculating a pixel value index of a target data space in the cloud application image, wherein the pixel value index comprises: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
and the judging unit is used for judging whether the cloud application image is a bad frame or not by adopting a basic principle judging condition and/or an exclusion principle judging condition based on the pixel value index of the target data space in the cloud application image.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the cloud application bad frame detection method according to the first aspect are performed.
In a fourth aspect, an embodiment of the present application further provides an electronic device, including: the device comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when an electronic device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to execute the steps of the cloud application bad frame detection method according to the first aspect.
In summary, with the cloud application bad frame detection method and apparatus, electronic device, and storage medium provided in the embodiments of the present application, whether a cloud application image is a bad frame is judged using a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the cloud application image. On the one hand, bad frames in cloud application images can be detected accurately; on the other hand, by analyzing the nodes of the whole cloud application image transmission link, cloud server image output, client-side image decoding, and client-side image display are selected as analysis nodes, and bad frame detection is performed on at least one of the three corresponding images (the image sent to the terminal by the cloud server, the image decoded by the terminal, and the image displayed by the terminal), so the position where a bad frame occurs can be located accurately. That is, the present scheme can realize real-time bad frame detection at the nodes of the whole cloud application image transmission link.
Drawings
Fig. 1 is a schematic flowchart of an embodiment of a cloud application bad frame detection method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an embodiment of a cloud application bad frame detection apparatus according to the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Referring to fig. 1, which is a schematic flowchart of an embodiment of a cloud application bad frame detection method provided in an embodiment of the present application, the method may include:
s10, acquiring a cloud application image, and calculating a pixel value index of a target data space in the cloud application image, wherein the pixel value index comprises: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
in this embodiment, it should be noted that, because there are many image transmission nodes in the cloud application system, in order to accurately locate the occurrence position of the image bad frame, a bad frame detection scheme needs to be deployed at each necessary node for detection. For a cloud application platform, bad frame detection schemes can be deployed in: the cloud application instance side, the user side image decoding and the user side image displaying are used for respectively detecting an image sent to the terminal by the cloud server, the image decoded by the terminal and the image displayed by the terminal so as to determine the position of a bad frame problem in application image output, application image transmission or application image display on the user equipment.
In order to detect bad frames in real time without negatively affecting the frame rate and display stability of the cloud application service, when acquiring the cloud application image, a low-priority thread (whose processing priority is lower than that of the cloud application service) can be used to copy one frame of cloud application image data to be detected from the GPU to CPU memory at a time, and the subsequent processing likewise runs on the CPU at low priority. This ensures real-time detection while avoiding any negative effect on the cloud application service.
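As a rough illustration of this arrangement (a sketch only, assuming a Linux-like host where process niceness can be raised, and using a hypothetical analyze_frame() placeholder for the detection work rather than any function defined by this application), the detection side can run in a separate low-priority worker fed one copied frame at a time:

```python
import os
import multiprocessing as mp


def analyze_frame(frame) -> None:
    # Hypothetical placeholder for the pixel value index computation
    # and bad frame judgment described in the following steps.
    pass


def detection_worker(frame_queue: mp.Queue) -> None:
    # Raise the niceness so this worker always yields the CPU to the
    # cloud application main process (Linux-specific behaviour).
    os.nice(19)
    while True:
        frame = frame_queue.get()      # one copied frame at a time
        if frame is None:              # sentinel: stop detection
            break
        analyze_frame(frame)


if __name__ == "__main__":
    queue = mp.Queue(maxsize=1)        # never hold more than one pending frame
    worker = mp.Process(target=detection_worker, args=(queue,), daemon=True)
    worker.start()
    # The capture side would call queue.put(copied_frame) right after copying
    # one frame of image data from GPU memory into CPU memory, then continue
    # serving the cloud application without waiting for the analysis.
    queue.put(None)                    # here we simply shut the worker down
    worker.join(timeout=1.0)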
In order to further reduce the resource consumption caused by the calculation, the acquisition times of cloud application images can be set as needed, for example with fixed or variable time intervals, from which the sampling rule is determined. For example, suppose the acquisition rule is [0, 10, 30]: it means that the first frame of cloud application image appearing 0 seconds after the cloud application starts, the first frame appearing after 10 seconds, and the first frame appearing after 30 seconds are taken as the original image data to be detected. Bad frame detection finishes once all images in the original image data to be detected have been checked.
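A minimal sketch of such a time-based sampler, assuming the acquisition rule is simply a list of offsets in seconds from cloud application start, could look like this:

```python
def make_time_sampler(acquisition_rule):
    """Return a predicate that is True for the first frame seen at or
    after each offset (in seconds) in acquisition_rule, e.g. [0, 10, 30]."""
    pending = sorted(acquisition_rule)

    def should_sample(elapsed_seconds: float) -> bool:
        if pending and elapsed_seconds >= pending[0]:
            pending.pop(0)      # this offset has now been served
            return True
        return False

    return should_sample


# Example: frames arriving 0.03 s, 5 s, 10.02 s, 29 s, 30.01 s and 31 s
# after application start.
sampler = make_time_sampler([0, 10, 30])
for t in [0.03, 5, 10.02, 29, 30.01, 31]:
    print(t, sampler(t))        # True, False, True, False, True, False
```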
Besides frame sampling by time interval, single-frame image data can also be sampled in order to reduce the consumption of system resources. The single-frame image sampling rule is as follows: for each frame of original image data, the bad frame detection scheme may select a portion of the pixels of the whole frame (i.e., sample the whole frame of image data) for calculation, so as to reduce the amount of computation. Specifically, at least one data block can be selected from the whole frame image, and this at least one data block is calculated instead of the whole frame. The image sampling range may be defined as [size, pos1, pos2, ..., posX], where size denotes the size of the selected image data blocks, e.g., size = 8 means an image block of 8 × 8 (both the width and the height of the block are 8 pixels); posX denotes the position, in the whole frame image, of the top-left vertex pixel of the X-th selected image block (X being the number of selected image blocks), e.g., with size = 8, pos1 = 0.3 denotes the coordinates of the top-left vertex pixel of the 1st image block in the whole frame image as (2.4).
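The exact encoding of the pos values is not fully spelled out above, so the following sketch simply assumes each position is given as an explicit (x, y) pixel coordinate of a block's top-left vertex; it is an illustration of block sampling, not the application's own definition:

```python
import numpy as np


def sample_blocks(frame: np.ndarray, size: int, positions):
    """Cut size x size blocks out of a single-channel frame.

    `positions` is assumed here to be a list of (x, y) top-left pixel
    coordinates; the text above leaves the exact encoding open.
    """
    blocks = []
    height, width = frame.shape[:2]
    for x, y in positions:
        if x + size <= width and y + size <= height:
            blocks.append(frame[y:y + size, x:x + size])
    return blocks


# Example: three 8x8 blocks taken from a 1280x720 single-channel plane.
plane = np.zeros((720, 1280), dtype=np.uint8)
blocks = sample_blocks(plane, size=8, positions=[(0, 0), (640, 360), (1272, 712)])
print(len(blocks))  # 3
```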
Time-based frame sampling and single-frame (spatial) sampling can each be enabled as required (one of them or both), so as to adapt to the load on the big data platform. After the image is sampled, subsequent processing uses the sampled image data to calculate the pixel value index of the target data space in the sampled image data, and then judges, from this pixel value index, whether the cloud application image before sampling is a bad frame. The pixel value index may include at least one of the numerical value change amplitude, the standard deviation, and the variance.
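A minimal sketch of computing these three pixel value indices for one data space of the sampled image data (using NumPy purely for illustration) might be:

```python
import numpy as np


def pixel_value_indices(plane: np.ndarray) -> dict:
    """Compute the three pixel value indices for one data space
    (e.g. the Y, U or V plane of the sampled image data)."""
    values = plane.astype(np.float64).ravel()
    return {
        "value_change_amplitude": float(values.max() - values.min()),
        "standard_deviation": float(values.std()),
        "variance": float(values.var()),
    }


# Example on a constant (monochrome) plane: all three indices are 0,
# which is exactly the pattern the basic principle condition flags.
print(pixel_value_indices(np.full((8, 8), 16, dtype=np.uint8)))
```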
S11, judging whether the cloud application image is a bad frame or not by adopting a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the cloud application image.
In this embodiment, when detecting bad frames, at least one of the basic principle judgment condition and the exclusion principle judgment condition may be used to judge whether the cloud application image is a bad frame; specifically, at least one of the two kinds of conditions is selected, and the selected condition(s) are applied to the pixel value index of the target data space in the cloud application image. After detection, the detection result, together with the intermediate results produced during the whole detection process (including the pixel value indices), can be reported to the big data platform for further statistical analysis.
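As an illustration of what such a report could carry (all field names below are assumptions made for the sketch, not an interface defined by this application):

```python
import json
import time


def build_report(node: str, frame_id: int, is_bad: bool, indices: dict) -> str:
    """Package one detection result plus its intermediate results for the
    big data platform. All field names here are illustrative assumptions."""
    report = {
        "node": node,                    # e.g. server output, client decode, client display
        "frame_id": frame_id,
        "timestamp": time.time(),
        "is_bad_frame": is_bad,
        "pixel_value_indices": indices,  # per data space: amplitude / std dev / variance
    }
    return json.dumps(report)


print(build_report("client_decode", 42, True,
                   {"Y": {"amplitude": 0.0, "std": 0.0, "var": 0.0}}))
```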
The cloud application bad frame detection method provided by the embodiments of the present application judges whether a cloud application image is a bad frame using a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the image. On the one hand, bad frames in cloud application images can be detected accurately; on the other hand, by analyzing the nodes of the whole cloud application image transmission link, cloud server image output, client-side image decoding, and client-side image display are selected as analysis nodes, and bad frame detection is performed on at least one of the three corresponding images (the image sent to the terminal by the cloud server, the image decoded by the terminal, and the image displayed by the terminal), so the position where a bad frame occurs can be located accurately. The scheme can therefore realize real-time bad frame detection at the nodes of the whole cloud application image transmission link.
On the basis of the foregoing method embodiment, the target data space may include at least one data space,
the determining, based on the pixel value index of the target data space in the cloud application image, whether the cloud application image is a bad frame by using a basic principle determination condition and/or an exclusion principle determination condition may include:
judging whether a first data space exists in the target data space and meets a first condition, wherein the first condition is that a first numerical value change amplitude of the first data space is smaller than or equal to a first threshold value; and/or a standard deviation of the first data space is less than or equal to a second threshold; and/or the variance of the first data space is less than or equal to a third threshold;
if the first data space in the target data space meets the first condition, determining that the cloud application image is a bad frame.
In this embodiment, it should be noted that the target data space matches the format of the acquired cloud application image. For example, if the acquired cloud application image is in YUV format (Y representing the luma component and U and V the two chroma components), the target data space may be at least one of a Y data space, a U data space, and a V data space; for another example, if the acquired cloud application image is in RGB format (R, G, and B representing red, green, and blue), the target data space may be at least one of an R data space, a G data space, and a B data space.
The first condition belongs to the basic principle judgment conditions. The first data space is any one of the data spaces in the target data space, and each data space in the target data space can have its own corresponding first condition. If there is a data space (denoted the first data space) in the target data space such that this first data space of the cloud application image satisfies its corresponding first condition, the cloud application image is determined to be a bad frame; otherwise, if every data space of the cloud application image fails to satisfy its corresponding first condition, the cloud application image is determined not to be a bad frame. It should be noted that the first condition may be at least one of the following: the first numerical value change amplitude of the first data space is less than or equal to the first threshold, the standard deviation of the first data space is less than or equal to the second threshold, and the variance of the first data space is less than or equal to the third threshold. The first numerical value change amplitude of the first data space may be the change amplitude of the pixel values of the first data space in the cloud application image, the standard deviation of the first data space may be the standard deviation of the pixel values of the first data space in the cloud application image, and the variance of the first data space may be the variance of the pixel values of the first data space in the cloud application image.
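A minimal sketch of this per-space basic principle check (combining the three sub-conditions with a logical AND for simplicity, although the text above also allows and/or combinations) might look like this:

```python
def meets_first_condition(amplitude: float, std_dev: float, variance: float,
                          first_t: float, second_t: float, third_t: float) -> bool:
    """Basic principle check for one data space of one cloud application image.

    Returns True when the value change amplitude, standard deviation and
    variance are all at or below their thresholds (AND combination chosen
    here for illustration only).
    """
    return amplitude <= first_t and std_dev <= second_t and variance <= third_t


# A frozen (monochrome) data space: amplitude, standard deviation and variance
# are all 0, so the first condition is met for any non-negative thresholds.
print(meets_first_condition(0.0, 0.0, 0.0, first_t=10, second_t=5, third_t=25))  # True
```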
On the basis of the foregoing method embodiment, the determining whether there is a first data space in the target data space that satisfies a first condition may include:
judging whether there is a first data space in the target data space that meets a first condition and does not meet a second condition;
if the first data space in the target data space meets the first condition, determining that the cloud application image is a bad frame may include:
if the first data space in the target data space meets the first condition and the first data space does not meet the second condition, determining that the cloud application image is a bad frame.
In this embodiment, it should be noted that the second condition belongs to the exclusion principle judgment conditions, and a corresponding second condition may be set for each data space in the target data space. If there is a data space (denoted the first data space) in the target data space such that this first data space of the cloud application image satisfies its corresponding first condition and does not satisfy its corresponding second condition, the cloud application image is determined to be a bad frame; otherwise, it is determined not to be a bad frame. The second condition corresponding to a data space may be non-null or null; for a data space whose corresponding second condition is null, that data space of the cloud application image is deemed not to satisfy the corresponding second condition.
In this embodiment, by setting the second condition as an exclusion principle judgment condition, bad frames can be detected more accurately than in the foregoing embodiment.
On the basis of the foregoing method embodiment, the second condition may include:
a first exclusion principle judgment condition and/or a second exclusion principle judgment condition;
the first elimination principle determination condition may include: the change amplitude of a second numerical value of a second data space in the cloud application image is smaller than or equal to a fourth threshold, and/or the change amplitude of a third numerical value of the second data space is smaller than or equal to a fifth threshold, wherein the second data space is at least one data space in the target data space;
the second rule-of-exclusion determining condition may include: the first threshold is a negative number and/or the second threshold is a negative number and/or the third threshold is a negative number.
In this embodiment, it should be noted that the second condition may include at least one of the first exclusion principle judgment condition and the second exclusion principle judgment condition. The second data space may be at least one data space in the target data space.
For each type of condition among the second conditions, the condition of that type corresponding to each data space in the target data space can be set, and it may be non-null or null. For any data space in the target data space, if the condition of a given type corresponding to that data space is null, that data space of the cloud application image does not satisfy the condition of that type; and if that data space of the cloud application image satisfies none of the types of condition in its corresponding second condition, the data space is determined not to satisfy its corresponding second condition. For each frame of cloud application image, if there is a data space (denoted the first data space) in the target data space such that this first data space of the image satisfies its corresponding first condition and does not satisfy its corresponding second condition, the cloud application image is determined to be a bad frame; otherwise, it is determined not to be a bad frame.
By setting the second condition to include at least one type of condition, a bad frame in the cloud application image can be more accurately located.
On the basis of the foregoing method embodiment, the first numerical value change amplitude may be the difference between the maximum pixel value and the minimum pixel value of the data space in the cloud application image, the second numerical value change amplitude may be the absolute value of the difference between the maximum pixel value and 128, the third numerical value change amplitude may be the absolute value of the difference between 128 and the minimum pixel value, and the fourth threshold and the fifth threshold may each be half of the first threshold.
In this embodiment, the first threshold, the second threshold, and the third threshold may be set as needed. For example, assuming the cloud application image format is YUV, the basic principle judgment conditions include first condition 1, first condition 2, and first condition 3, and the exclusion principle judgment conditions include the first exclusion principle judgment condition and the second exclusion principle judgment condition, where the first exclusion principle judgment condition includes first exclusion principle judgment condition 1 and first exclusion principle judgment condition 2, and the second exclusion principle judgment condition includes second exclusion principle judgment condition 1, second exclusion principle judgment condition 2, and second exclusion principle judgment condition 3, wherein,
the first condition 1 may be:
Ymax - Ymin <= Ybias_threshold;
Ystandard_deviation <= Ysd_threshold;
Yvariance <= Yvariance_threshold;
in the above formulas, Ymax and Ymin respectively represent the maximum and minimum pixel values of the Y data space in the sampled image data, and Ybias_threshold represents the first threshold corresponding to the Y data space, whose specific value can be set as needed; Ystandard_deviation represents the standard deviation of the pixel values of the Y data space in the sampled image data, and Ysd_threshold represents the second threshold corresponding to the Y data space, whose specific value can be set as needed; Yvariance represents the variance of the pixel values of the Y data space in the sampled image data, and Yvariance_threshold represents the third threshold corresponding to the Y data space, whose specific value can be set as needed.
The first condition 2 may be:
Umax - Umin <= Ubias_threshold;
Ustandard_deviation <= Usd_threshold;
Uvariance <= Uvariance_threshold;
in the above formulas, Umax and Umin respectively represent the maximum and minimum pixel values of the U data space in the sampled image data, and Ubias_threshold represents the first threshold corresponding to the U data space, whose specific value can be set as needed; Ustandard_deviation represents the standard deviation of the pixel values of the U data space in the sampled image data, and Usd_threshold represents the second threshold corresponding to the U data space, whose specific value can be set as needed; Uvariance represents the variance of the pixel values of the U data space in the sampled image data, and Uvariance_threshold represents the third threshold corresponding to the U data space, whose specific value can be set as needed.
The first condition 3 may be:
Vmax - Vmin <= Vbias_threshold;
Vstandard_deviation <= Vsd_threshold;
Vvariance <= Vvariance_threshold;
in the above formulas, Vmax and Vmin respectively represent the maximum and minimum pixel values of the V data space in the sampled image data, and Vbias_threshold represents the first threshold corresponding to the V data space, whose specific value can be set as needed; Vstandard_deviation represents the standard deviation of the pixel values of the V data space in the sampled image data, and Vsd_threshold represents the second threshold corresponding to the V data space, whose specific value can be set as needed; Vvariance represents the variance of the pixel values of the V data space in the sampled image data, and Vvariance_threshold represents the third threshold corresponding to the V data space, whose specific value can be set as needed.
The first exclusion principle judgment condition 1 may be:
|Umax - 128| <= Ubias_threshold/2;
|128 - Umin| <= Ubias_threshold/2;
in the above formulas, Ubias_threshold/2 represents the fourth threshold and the fifth threshold corresponding to the U data space, each being 1/2 of Ubias_threshold.
The first exclusion principle judgment condition 2 may be:
|Vmax - 128| <= Vbias_threshold/2;
|128 - Vmin| <= Vbias_threshold/2;
in the above formulas, Vbias_threshold/2 represents the fourth threshold and the fifth threshold corresponding to the V data space, each being 1/2 of Vbias_threshold.
The second exclusion principle judgment condition 1 may be: Ybias_threshold < 0, Ysd_threshold < 0, Yvariance_threshold < 0.
The second exclusion principle judgment condition 2 may be: Ubias_threshold < 0, Usd_threshold < 0, Uvariance_threshold < 0.
The second exclusion principle judgment condition 3 may be: Vbias_threshold < 0, Vsd_threshold < 0, Vvariance_threshold < 0.
When performing bad frame detection, if there is a data space in the target data space such that this data space of the sampled image data satisfies its corresponding first condition and satisfies neither its corresponding first exclusion principle judgment condition nor its corresponding second exclusion principle judgment condition, the cloud application image corresponding to the sampled image data is determined to be a bad frame; otherwise, it is determined not to be a bad frame. Continuing the example above: if the Y data space of the sampled image data satisfies first condition 1 and does not satisfy second exclusion principle judgment condition 1, the corresponding cloud application image is determined to be a bad frame; or, if the U data space satisfies first condition 2 and satisfies neither first exclusion principle judgment condition 1 nor second exclusion principle judgment condition 2, the corresponding cloud application image is determined to be a bad frame; or, if the V data space satisfies first condition 3 and satisfies neither first exclusion principle judgment condition 2 nor second exclusion principle judgment condition 3, the corresponding cloud application image is determined to be a bad frame. In other words, the exclusion principle judgment conditions are used to exclude the data spaces that satisfy them from the judgment made with the basic principle judgment conditions; for example, if the V data space of the sampled image data satisfies second exclusion principle judgment condition 3, it is not judged whether the V data space satisfies first condition 3, i.e., the data of the V data space is not used for bad frame detection.
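Putting the worked example together, a sketch of the per-frame decision could look as follows. The threshold values used are placeholders rather than values from this application, the sub-conditions are combined here with a logical AND although the text above also allows and/or combinations, and a data space whose thresholds are negative is skipped entirely, per the second exclusion principle judgment condition:

```python
import numpy as np


def space_indices(plane: np.ndarray):
    """Max, min, standard deviation and variance of one sampled data space."""
    v = plane.astype(np.float64).ravel()
    return v.max(), v.min(), v.std(), v.var()


def is_bad_frame(planes: dict, thresholds: dict) -> bool:
    """planes: {"Y": ndarray, "U": ndarray, "V": ndarray} of sampled image data.
    thresholds: {"Y": (bias_t, sd_t, var_t), "U": (...), "V": (...)},
    where a negative threshold disables the whole data space
    (second exclusion principle judgment condition)."""
    for space, plane in planes.items():
        bias_t, sd_t, var_t = thresholds[space]
        if bias_t < 0 or sd_t < 0 or var_t < 0:
            continue  # second exclusion condition: skip this data space
        vmax, vmin, sd, var = space_indices(plane)
        if not (vmax - vmin <= bias_t and sd <= sd_t and var <= var_t):
            continue  # basic principle condition not met for this space
        if space in ("U", "V"):
            # first exclusion condition: chroma frozen around the neutral 128
            if abs(vmax - 128) <= bias_t / 2 and abs(128 - vmin) <= bias_t / 2:
                continue
        return True   # some space meets the basic condition and is not excluded
    return False


# Illustrative use: the U plane is frozen at a single value far from 128,
# with placeholder thresholds of (10, 5, 25) for every data space.
planes = {
    "Y": np.arange(64, dtype=np.uint8).reshape(8, 8),
    "U": np.full((8, 8), 200, dtype=np.uint8),
    "V": np.arange(64, dtype=np.uint8).reshape(8, 8),
}
thresholds = {"Y": (10, 5, 25), "U": (10, 5, 25), "V": (10, 5, 25)}
print(is_bad_frame(planes, thresholds))  # True
```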
Referring to fig. 2, a schematic structural diagram of a cloud application bad frame detection apparatus provided in an embodiment of the present application is shown, including:
a calculating unit 20, configured to acquire a cloud application image and calculate a pixel value index of a target data space in the cloud application image, where the pixel value index includes: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
a determining unit 21, configured to determine whether the cloud application image is a bad frame by using a basic principle determining condition and/or an exclusion principle determining condition based on a pixel value index of a target data space in the cloud application image.
The cloud application bad frame detection apparatus provided by the embodiments of the present application judges whether a cloud application image is a bad frame using a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the image. On the one hand, bad frames in cloud application images can be detected accurately; on the other hand, by analyzing the nodes of the whole cloud application image transmission link, cloud server image output, client-side image decoding, and client-side image display are selected as analysis nodes, and bad frame detection is performed on at least one of the three corresponding images (the image sent to the terminal by the cloud server, the image decoded by the terminal, and the image displayed by the terminal), so the position where a bad frame occurs can be located accurately. The scheme can therefore realize real-time bad frame detection at the nodes of the whole cloud application image transmission link.
On the basis of the foregoing apparatus embodiment, the target data space may include at least one data space,
wherein, the judging unit may include:
the judging subunit is configured to judge whether a first data space exists in the target data space and meets a first condition, where the first condition is that a first numerical variation amplitude of the first data space is less than or equal to a first threshold; and/or a standard deviation of the first data space is less than or equal to a second threshold; and/or the variance of the first data space is less than or equal to a third threshold;
a determining subunit, configured to determine that the cloud application image is a bad frame if the determining subunit determines that the first data space in the target data space meets the first condition.
On the basis of the foregoing apparatus embodiment, the judging subunit may be specifically configured to: judge whether there is a first data space in the target data space that meets a first condition and does not meet a second condition;
the determining subunit may be specifically configured to: determine that the cloud application image is a bad frame if the judging subunit judges that the first data space in the target data space meets the first condition and does not meet the second condition.
The cloud application bad frame detection device provided by the embodiment of the application has the implementation process consistent with that of the cloud application bad frame detection method provided by the embodiment of the application, and the achieved effect is also the same as that of the cloud application bad frame detection method provided by the embodiment of the application, and is not described again here.
The bad frame detection scheme of this application provides a lightweight bad frame detection method that detects image bad frame problems from the perspective of all the nodes of the cloud application transmission link. It enables real-time, accurate bad frame detection and data reporting at the link nodes, makes it easier for the big data platform to discover problems early, and provides the research and development team with strong information support for solving problems as early as possible. The specific beneficial effects are as follows:
1. By detecting and analyzing the cloud application image data before and after encoding, whether an image is a bad frame can be determined accurately; the resource consumption of the detection algorithm is extremely light, causing no impact on the cloud application main service and leaving performance indicators such as cloud application image latency and frame rate unaffected;
2. The method supports bad frame detection and reporting at multiple nodes, and can locate the problem node from the perspective of the whole system.
Verification shows that the scheme can detect continuous single-color and skin bad frames with a detection accuracy of 90% and a miss rate below 5%.
As shown in fig. 3, an electronic device provided in an embodiment of the present application includes: a processor 30, a memory 31 and a bus 32, wherein the memory 31 stores machine-readable instructions executable by the processor 30, when the electronic device is operated, the processor 30 communicates with the memory 31 through the bus 32, and the processor 30 executes the machine-readable instructions to perform the steps of the cloud application bad frame detection method.
Specifically, the memory 31 and the processor 30 can be general-purpose memory and processor, which are not limited to specific embodiments, and the cloud application bad frame detection method can be executed when the processor 30 runs a computer program stored in the memory 31.
Corresponding to the cloud application bad frame detection method, the embodiment of the application further provides a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the steps of the cloud application bad frame detection method are executed.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some communication interfaces, indirect coupling or communication connection between devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A cloud application bad frame detection method is characterized by comprising the following steps:
the method comprises the steps of obtaining a cloud application image, and calculating a pixel value index of a target data space in the cloud application image, wherein the pixel value index comprises: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
and judging whether the cloud application image is a bad frame or not by adopting a basic principle judgment condition and/or an exclusion principle judgment condition based on the pixel value index of the target data space in the cloud application image.
2. The method of claim 1, wherein the target data space comprises at least one data space,
wherein the determining whether the cloud application image is a bad frame by using a basic principle determination condition and/or an exclusion principle determination condition based on the pixel value index of the target data space in the cloud application image includes:
judging whether a first data space exists in the target data space and meets a first condition, wherein the first condition is that a first numerical value change amplitude of the first data space is smaller than or equal to a first threshold value; and/or a standard deviation of the first data space is less than or equal to a second threshold; and/or the variance of the first data space is less than or equal to a third threshold;
if the first data space in the target data space meets the first condition, determining that the cloud application image is a bad frame.
3. The method of claim 2, wherein the determining whether a first data space exists in the target data space that satisfies a first condition comprises:
judging whether there is a first data space in the target data space that meets a first condition and does not meet a second condition;
wherein, if the first data space exists in the target data space and meets the first condition, determining that the cloud application image is a bad frame includes:
if the first data space in the target data space meets the first condition and the first data space does not meet the second condition, determining that the cloud application image is a bad frame.
4. The method of claim 3, wherein the second condition comprises:
a first exclusion principle judgment condition and/or a second exclusion principle judgment condition;
wherein the first exclusion principle judgment condition includes: a second numerical value change amplitude of a second data space in the cloud application image is smaller than or equal to a fourth threshold, and/or a third numerical value change amplitude of the second data space is smaller than or equal to a fifth threshold, wherein the second data space is at least one data space in the target data space;
the second exclusion principle judgment condition includes: the first threshold is a negative number and/or the second threshold is a negative number and/or the third threshold is a negative number.
5. The method of claim 4, wherein the first magnitude of change in the numerical value is a difference between a maximum pixel value and a minimum pixel value in the cloud application image, the second magnitude of change in the numerical value is an absolute value of a difference between a maximum pixel value and 128 in the cloud application image, the third magnitude of change in the numerical value is an absolute value of a difference between 128 and a minimum pixel value in the cloud application image, and the fourth threshold and the fifth threshold are both half of the first threshold.
6. A cloud application bad frame detection device is characterized by comprising:
the computing unit is used for acquiring a cloud application image and performing statistical computation on a pixel value index of a target data space in the cloud application image, wherein the pixel value index comprises: at least one of a numerical value change amplitude, a standard deviation and a variance, wherein the cloud application image is at least one of an image sent to the terminal by a cloud server, an image decoded by the terminal and an image displayed by the terminal;
and the judging unit is used for judging whether the cloud application image is a bad frame or not by adopting a basic principle judging condition and/or an exclusion principle judging condition based on the pixel value index of the target data space in the cloud application image.
7. The apparatus of claim 6, wherein the target data space comprises at least one data space,
wherein, the judging unit comprises:
the judging subunit is configured to judge whether a first data space exists in the target data space and meets a first condition, where the first condition is that a first numerical variation amplitude of the first data space is less than or equal to a first threshold; and/or a standard deviation of the first data space is less than or equal to a second threshold; and/or the variance of the first data space is less than or equal to a third threshold;
a determining subunit, configured to determine that the cloud application image is a bad frame if the determining subunit determines that the first data space in the target data space meets the first condition.
8. The apparatus of claim 7,
the judging subunit is specifically configured to: judge whether there is a first data space in the target data space that meets a first condition and does not meet a second condition;
the determining subunit is specifically configured to: if the judging subunit judges that the first data space in the target data space meets the first condition and the first data space does not meet the second condition, determining that the cloud application image is a bad frame.
9. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the cloud application bad frame detection method of any of claims 1 to 5.
10. An electronic device, comprising: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when an electronic device runs, the processor communicates with the storage medium through the bus, and the processor executes the machine-readable instructions to execute the steps of the cloud application bad frame detection method according to any one of claims 1 to 5.
CN202211658381.3A 2022-12-23 2022-12-23 Cloud application bad frame detection method and device, electronic equipment and storage medium Active CN115661145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211658381.3A CN115661145B (en) 2022-12-23 2022-12-23 Cloud application bad frame detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211658381.3A CN115661145B (en) 2022-12-23 2022-12-23 Cloud application bad frame detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115661145A true CN115661145A (en) 2023-01-31
CN115661145B CN115661145B (en) 2023-03-21

Family

ID=85022181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211658381.3A Active CN115661145B (en) 2022-12-23 2022-12-23 Cloud application bad frame detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115661145B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104469345A (en) * 2014-12-10 2015-03-25 北京理工大学 Video fault diagnosis method based on image processing
US20160196477A1 (en) * 2014-12-30 2016-07-07 Spreadtrum Communications (Tianjin) Co., Ltd. Method and system for filtering image noise out
CN106447656A (en) * 2016-09-22 2017-02-22 江苏赞奇科技股份有限公司 Rendering flawed image detection method based on image recognition
CN112533059A (en) * 2020-11-20 2021-03-19 腾讯科技(深圳)有限公司 Image rendering method and device, electronic equipment and storage medium
CN113256502A (en) * 2020-02-10 2021-08-13 深圳市理邦精密仪器股份有限公司 Ultrasonic image adjusting method, terminal device and readable storage medium
CN113297420A (en) * 2021-04-30 2021-08-24 百果园技术(新加坡)有限公司 Video image processing method and device, storage medium and electronic equipment
CN113660427A (en) * 2021-09-22 2021-11-16 广州网路通电子有限公司 Image analysis system and method applied to video monitoring tester

Also Published As

Publication number Publication date
CN115661145B (en) 2023-03-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240102

Address after: 230031 Room 672, 6/F, Building A3A4, Zhong'an Chuanggu Science Park, No. 900, Wangjiang West Road, High-tech Zone, Hefei, Anhui

Patentee after: Anhui Haima Cloud Technology Co.,Ltd.

Address before: 301700 room 2d25, Building 29, No.89 Heyuan Road, Jingjin science and Technology Valley Industrial Park, Wuqing District, Tianjin

Patentee before: HAIMAYUN (TIANJIN) INFORMATION TECHNOLOGY CO.,LTD.