CN114185784A - Barrage rendering test method and device - Google Patents
- Publication number: CN114185784A
- Application number: CN202111504390.2A
- Authority: CN (China)
- Prior art keywords: image, test, bullet screen, video, tested
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The application discloses a barrage (bullet screen) rendering test method. The method comprises the following steps: playing a video to be tested with the bullet screen turned on; when the video to be tested is played to a preset progress, capturing one video frame as a test image, wherein the video frame comprises at least one bullet screen; calculating a similarity value between the test image and a reference image, wherein the reference image is a video frame captured when the video to be tested is played to the same preset progress without any bullet screen; and if the similarity value is smaller than a first preset value, determining that the bullet screen has been rendered. The application can improve test accuracy.
Description
Technical Field
The application relates to the field of software testing, in particular to a barrage rendering testing method and device.
Background
The rendering capability of the bullet screen depends on various performance characteristics of the terminal device, so the demands on compatibility testing are high; under the rapid iteration of an application program (APP), bullet screen rendering quality is generally ensured through smoke testing. A traditional bullet screen smoke test can check rendering quality by comparing the differences between pictures. However, because bullet screens are published in real time and their distribution density varies across time points, the traditional picture-comparison approach suffers from a certain degree of misjudgment.
Disclosure of Invention
In view of the above, a bullet screen rendering test method, a bullet screen rendering test device, a computer device and a computer-readable storage medium are provided to solve the problem that existing bullet screen rendering test methods are prone to a certain degree of misjudgment.
The application provides a barrage rendering test method, which comprises the following steps:
playing a video to be tested with a bullet screen;
when the video to be tested is played to a preset progress, intercepting a frame of video frame as a test image, wherein the video frame comprises at least one bullet screen;
calculating the similarity value of the test image and a reference image, wherein the reference image is a video frame which is captured when the video to be tested is played to the preset progress and does not contain a bullet screen;
and if the similarity value is smaller than a first preset value, judging that the bullet screen is rendered.
Optionally, the calculating the similarity value between the test image and the reference image comprises:
carrying out gray level processing on the test image and the reference image respectively to obtain a corresponding first gray level image and a corresponding second gray level image;
respectively carrying out discrete cosine transform on the first gray level image and the second gray level image to obtain a corresponding first discrete cosine transform image and a corresponding second discrete cosine transform image;
respectively carrying out binarization processing on pixels in the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first binarized image and a corresponding second binarized image;
and calculating the similarity value of the first binarized image and the second binarized image.
Optionally, the performing gray scale processing on the test image and the reference image respectively to obtain a corresponding first gray scale image and a corresponding second gray scale image includes:
respectively carrying out reduction processing on the test image and the reference image to obtain a corresponding first reduced image and a corresponding second reduced image;
and carrying out gray level processing on the first reduced image and the second reduced image to obtain a corresponding first gray level image and a corresponding second gray level image.
Optionally, the binarizing the pixel images in the first discrete cosine transform image and the second discrete cosine transform image respectively to obtain corresponding first binarized image and second binarized image includes:
respectively cutting preset areas in the upper left corners of the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first matrix image and a corresponding second matrix image;
and carrying out binarization processing on the first matrix image and the second matrix image to obtain a corresponding first binarized image and a corresponding second binarized image.
Optionally, the calculating the similarity value of the first binarized image and the second binarized image comprises:
and calculating the Hamming distance between the first binarized image and the second binarized image, and determining the similarity value between the test image and the reference image according to the calculated Hamming distance.
Optionally, the method further comprises:
if the similarity value is larger than or equal to the first preset value, calculating the ratio of reserved pixels in the test image to obtain a first ratio, wherein the reserved pixels are pixel points of which the pixel values are within a preset interval range;
and judging whether the bullet screen is rendered according to the first proportion and a preset proportion.
Optionally, the determining whether the bullet screen is rendered according to the first percentage and a preset percentage includes:
calculating the ratio of reserved pixels in the reference image to obtain a second ratio, and taking the second ratio as the preset ratio;
if the difference value of the first proportion and the second proportion is larger than or equal to a second preset value, judging that the bullet screen is rendered;
and if the difference value of the first occupation ratio and the second occupation ratio is smaller than the second preset value, judging that the bullet screen is not rendered.
Optionally, the method further comprises:
recording the playing modes of the video to be tested, wherein the playing modes comprise a full screen mode, a half screen mode, a horizontal screen mode and a vertical screen mode;
after the step of capturing a frame of video frame as a test image when the video to be tested is played to the preset progress, the method further comprises the following steps:
when the playing mode of the video to be tested is a half-screen mode, judging whether the playing mode of the video to be tested is a horizontal-screen mode;
if the playing mode of the video to be tested is the horizontal screen mode, cutting the test picture by adopting a first cutting mode to obtain a cut test picture;
and if the playing mode of the video to be tested is the vertical screen mode, cutting the test picture by adopting a second cutting mode to obtain the cut test picture.
The application also provides a bullet screen rendering test device, the bullet screen rendering test device includes:
the playing module is used for playing the video to be tested with the bullet screen;
the intercepting module is used for intercepting a frame of video frame as a test image when the video to be tested is played to the preset progress, and the video frame comprises at least one bullet screen;
the calculation module is used for calculating the similarity value of the test image and a reference image, wherein the reference image is a video frame which is captured when the video to be tested is played to the preset progress and does not contain a bullet screen;
and the judging module is used for judging that the bullet screen is rendered if the similarity value is smaller than a first preset value.
The present application further provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method.
The embodiment is realized by playing a video to be tested with a bullet screen; when the video to be tested is played to a preset progress, intercepting a frame of video frame as a test image, wherein the video frame comprises at least one bullet screen; calculating the similarity value of the test image and a reference image, wherein the reference image is a video frame which is captured when the video to be tested is played to the preset progress and does not contain a bullet screen; and if the similarity value is smaller than a first preset value, judging that the bullet screen is rendered. In the embodiment, the pictures of the same playing progress in the video playing process under the scenes of closing and opening the bullet screen are respectively taken as the reference picture and the test picture to carry out similarity comparison, and then whether the bullet screen is normally rendered or not is judged according to the similarity value, so that the test accuracy is improved.
Drawings
Fig. 1 is an environment schematic diagram of a bullet screen rendering test method according to an embodiment of the present application;
FIG. 2 is a flowchart of an embodiment of a bullet screen rendering test method according to the present application;
FIG. 3 is a flowchart illustrating a detailed process of calculating similarity between the test image and the reference image according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of the refinement of the step of performing gray processing on the test image and the reference image respectively to obtain a corresponding first gray image and a corresponding second gray image in an embodiment of the present application;
fig. 5 is a schematic view of a step refinement flow of performing binarization processing on pixel images in the first discrete cosine transform image and the second discrete cosine transform image respectively to obtain a corresponding first binarized image and a corresponding second binarized image in an embodiment of the present application;
FIG. 6 is a flowchart of another embodiment of a bullet screen rendering test method according to the present application;
fig. 7 is a flowchart illustrating a detailed process of determining whether a bullet screen has been rendered according to the first percentage and a preset percentage in an embodiment of the present application;
FIG. 8 is a flowchart of another embodiment of a bullet screen rendering test method according to the present application;
FIG. 9 is a block diagram of a bullet screen rendering test apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic hardware structure diagram of a computer device for executing a barrage rendering test method according to an embodiment of the present application.
Detailed Description
The advantages of the present application are further illustrated below with reference to the accompanying drawings and specific embodiments.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
In the description of the present application, it should be understood that the numerical references before the steps do not identify the order of performing the steps, but merely serve to facilitate the description of the present application and to distinguish each step, and therefore should not be construed as limiting the present application.
Fig. 1 schematically shows an application environment diagram of a bullet screen rendering test method according to an embodiment of the present application. In an exemplary embodiment, the system of the application environment may include a device under test 10 and a testing device 20 on which a testing tool is installed. The device under test 10 forms a wireless or wired connection with the testing device 20. In the present embodiment, the testing device 20 executes an automated test script through the installed testing tool to test an application program (APP) installed on the device under test 10. The device under test 10 and the testing device 20 may each be a mobile phone, an iPad, a tablet computer, or the like.
Fig. 2 is a schematic flowchart of a bullet screen rendering test method according to an embodiment of the present application. It is to be understood that the flow charts in the embodiments of the present method are not intended to limit the order in which the steps are performed. As can be seen from the figure, the barrage rendering test method provided in this embodiment includes:
and step S20, playing the video to be tested with the bullet screen.
Specifically, when a video to be tested with a bullet screen is played, a video playing application (testing application program) for playing the video to be tested needs to be started first, then a video playing page is entered, and then the video to be tested can be played. It can be understood that when the video to be tested is played, the bullet screen button needs to be opened, so that when the video to be tested is played, the bullet screen can be played at the same time.
In an embodiment, in order to facilitate playing and recording the video to be tested, before playing the video to be tested, a login operation of the application to be tested may be performed, and then the video to be tested is played.
And step S21, when the video to be tested is played to the preset progress, capturing a frame of video frame as a test image, wherein the video frame comprises at least one bullet screen.
Specifically, the preset progress may be flexibly set and adjusted according to actual needs. For example, if the preset progress is 20 seconds, then when the video to be tested has played to the 20th second, the currently played video frame needs to be captured and used as the test image. In this embodiment, since the bullet screen switch is turned on when the video to be tested is played, the captured video frame includes at least one bullet screen.
It should be noted that the video is composed of still pictures that are continuously played, and these still pictures are referred to as video frames.
Step S22, calculating a similarity value between the test image and a reference image, where the reference image is a video frame that is captured when the video to be tested is played to the preset progress and does not include a bullet screen.
Specifically, the reference image is an image used for comparing with the test image, and the reference image can be obtained by capturing a currently played video frame when the video to be tested is played to the preset progress when the bullet screen button is not opened.
It will be appreciated that the reference image may be captured in advance or may be captured during the test.
As an example, when the reference image is captured during the test, after the bullet screen playing mode is turned on and the video to be tested is played to the preset progress, a screenshot instruction is executed synchronously to capture the currently played video picture and obtain the test image. Then the bullet screen playing mode is turned off, the video to be tested is played again, and when the preset progress is reached again, the screenshot instruction is executed synchronously to capture the currently played video picture and obtain the reference image.
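As a rough illustration of this two-pass capture flow, a minimal Python sketch; the helper callables, the sleep-based wait and the 20-second value are illustrative assumptions rather than details from the patent:

```python
import time
from typing import Callable

PRESET_PROGRESS_S = 20  # the example preset progress above (the 20th second)

def capture_frame_at_progress(start_playback: Callable[[], None],
                              take_screenshot: Callable[[], bytes],
                              preset_progress_s: float = PRESET_PROGRESS_S) -> bytes:
    """Start playback and capture the frame shown at the preset progress."""
    start_playback()               # e.g. tap the play button through the test tool
    time.sleep(preset_progress_s)  # wait until the preset progress is reached
    return take_screenshot()       # grab the currently displayed video frame

# Pass 1 (bullet screen switch on)  -> test image
# Pass 2 (bullet screen switch off) -> reference image, at the same progress
```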
Wherein the similarity value is used for representing the similarity of the test image and the reference image.
In an exemplary embodiment, referring to fig. 3, the calculating the similarity value between the test image and the reference image may include:
step S30, performing gray scale processing on the test image and the reference image respectively to obtain a corresponding first gray scale image and a corresponding second gray scale image.
Specifically, the grayscale processing converts a color (RGB) image into a grayscale (GRAY) image. In one embodiment, the grayscale processing of the image may be performed by the following grayscale algorithm:
GRAY = (Red + Green + Blue) / 3, where Red, Green and Blue are the red, green and blue components of a pixel, and GRAY is the resulting gray value of the pixel.
It is understood that the above grayscale algorithm is exemplary, and in other embodiments, other grayscale algorithms may be used, and are not limited in this embodiment.
In this embodiment, a first grayscale image can be obtained by performing grayscale processing on the test image, and a second grayscale image can be obtained by performing grayscale processing on the reference image.
It should be noted that, the second grayscale image may also be obtained by performing grayscale processing on the reference image in advance, and need not be performed in the test process.
In an exemplary embodiment, referring to fig. 4, the performing gray-scale processing on the test image and the reference image respectively to obtain corresponding first gray-scale image and second gray-scale image may include:
step S40, respectively performing reduction processing on the test image and the reference image to obtain a corresponding first reduced image and a corresponding second reduced image.
Specifically, for convenience of subsequent processing, an image reduction algorithm may be adopted to firstly perform reduction processing on the test image and the reference image to obtain a corresponding first reduced image and a corresponding second reduced image. The image reduction algorithm may be a nearest neighbor algorithm, a Bilinear algorithm, etc., and is not limited in this embodiment.
As an example, the test image and the reference image may be reduced to an image of 32x32 pixels.
Step S41, performing grayscale processing on the first and second reduced images to obtain corresponding first and second grayscale images.
Specifically, after the image reduction processing is completed, the obtained first reduced image and the second reduced image are subjected to gray scale processing, so that a first gray scale image and a second gray scale image are obtained.
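A minimal sketch of the reduction and grayscale steps (S40/S41), assuming OpenCV/NumPy; the 32x32 target size and the (R+G+B)/3 averaging follow the examples above, while the nearest-neighbour interpolation is just one of the options mentioned:

```python
import cv2
import numpy as np

def reduce_and_grayscale(image_bgr: np.ndarray, size: int = 32) -> np.ndarray:
    """Reduce an image to size x size pixels, then average its channels to gray."""
    # Nearest-neighbour reduction, one of the reduction algorithms mentioned above.
    small = cv2.resize(image_bgr, (size, size), interpolation=cv2.INTER_NEAREST)
    # GRAY = (Red + Green + Blue) / 3, the grayscale formula given earlier.
    return small.astype(np.float32).mean(axis=2)
```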
Step S31, performing discrete cosine transform on the first grayscale image and the second grayscale image respectively to obtain a corresponding first discrete cosine transform image and a corresponding second discrete cosine transform image.
Specifically, the Discrete Cosine Transform (DCT) is a transform related to the Fourier transform, similar to the Discrete Fourier Transform (DFT) but using only real numbers. A DCT is equivalent to a DFT of roughly twice the length operating on a real even function (since the Fourier transform of a real even function is still real and even), and in some variants the input or output positions are shifted by half a sample (there are 8 standard DCT types, of which 4 are common). The DCT is mainly used for compressing data or images: it converts signals from the spatial domain into the frequency domain and has good decorrelation performance. The DCT itself is lossless, but in fields such as image coding it creates good conditions for subsequent quantization, Huffman coding and the like; because the DCT is symmetric, the inverse DCT can be used at the receiving end after quantization and coding to restore the original image information. After the discrete cosine transform is performed on the original image, the DCT coefficient energy is mainly concentrated in the upper left corner and most of the remaining coefficients are close to zero, which makes the DCT well suited to image compression.
In this embodiment, a first discrete cosine transform image can be obtained by performing the DCT on the first grayscale image, and a second discrete cosine transform image can be obtained by performing the DCT on the second grayscale image.
It should be noted that the DCT may also be performed on the second grayscale image in advance to obtain the second discrete cosine transform image, and need not be performed during the test.
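A one-function sketch of this DCT step, assuming OpenCV's cv2.dct and the 32x32 grayscale image produced in the previous step:

```python
import cv2
import numpy as np

def dct_transform(gray: np.ndarray) -> np.ndarray:
    """2-D discrete cosine transform of a grayscale image (e.g. the 32x32 one above)."""
    # cv2.dct requires a floating-point, single-channel input; the coefficient
    # energy ends up concentrated in the upper-left corner of the result.
    return cv2.dct(np.float32(gray))
```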
In an exemplary embodiment, referring to fig. 5, the performing binarization processing on the pixels in the first and second discrete cosine transform images to obtain corresponding first and second binarized images may include:
step S50, respectively cutting the preset areas in the upper left corners of the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first matrix image and a corresponding second matrix image;
specifically, since the energy of the image coefficient obtained by discrete cosine transform is mainly concentrated in the upper left corner, most of the rest coefficients are close to zero. Therefore, in this embodiment, in order to subsequently save the calculation amount, the preset region in the upper left corner of the first discrete cosine transform image and the second discrete cosine transform image may be clipped, so as to obtain the corresponding first matrix image and second matrix image.
The preset area may be flexibly set and adjusted according to an actual situation, for example, the preset area is an 8x8 pixel area, that is, an 8x8 pixel area at the upper left corner of the first discrete cosine transform image and the second discrete cosine transform image is cropped, so as to obtain the corresponding first matrix image and the second matrix image.
Step S51, performing binarization processing on the first matrix image and the second matrix image to obtain a corresponding first binarized image and a corresponding second binarized image.
Specifically, after the image clipping processing is completed, the first matrix image and the second matrix image obtained by clipping are subjected to binarization processing, so that a corresponding first binarized image and a corresponding second binarized image are obtained.
Step S32, performing binarization processing on the pixels in the first discrete cosine transform image and the second discrete cosine transform image respectively to obtain a corresponding first binarized image and a corresponding second binarized image.
Specifically, binarization processing refers to setting the gray value of each pixel in the image to 0 or 255, so that the whole image exhibits an obvious black-and-white effect; that is, a proper threshold is chosen among the 256 brightness levels to obtain a binarized image that still reflects the overall and local characteristics of the image.
As an example, the threshold may be selected to be 5: when the gray value of a pixel in the image is greater than or equal to 5, the pixel may be assigned the value 1; when the gray value of a pixel is less than 5, the pixel may be assigned the value 0. After the gray values of all pixels have been reassigned, a binarized image reflecting the overall and local characteristics of the image is obtained.
In one embodiment, the threshold may be determined by calculating the mean of all gray pixels in the image, that is, threshold = (pixel value of pixel 1 + pixel value of pixel 2 + … + pixel value of pixel N) / N, where N is the number of pixels in the image. Accordingly, when binarization is performed, for a pixel in the first discrete cosine transform image, if its value is greater than the mean of all pixels in the image, its gray value is set to 1; if its value is less than or equal to the mean, its gray value is set to 0. Similarly, for a pixel in the second discrete cosine transform image, if its value is greater than the mean of all pixels in the image, its gray value is set to 1; if its value is less than or equal to the mean, its gray value is set to 0.
In this embodiment, a first binarized image may be obtained by performing binarization processing on the first discrete cosine transform image, and a second binarized image may be obtained by performing binarization processing on the second discrete cosine transform image.
It should be noted that, the binarization processing may also be performed on the second discrete cosine transform image in advance to obtain the second binarization image, and is not required to be performed in the test process.
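A minimal sketch of the cropping and binarization steps (S50/S51 and S32), assuming NumPy, the 8x8 upper-left block and the mean-threshold rule described above:

```python
import numpy as np

def binarize_dct(dct_img: np.ndarray, block: int = 8) -> np.ndarray:
    """Crop the top-left block x block DCT coefficients and binarize them.

    A coefficient greater than the mean of the cropped block becomes 1,
    otherwise 0, following the mean-threshold rule described above.
    """
    top_left = dct_img[:block, :block]  # region where the DCT energy is concentrated
    return (top_left > top_left.mean()).astype(np.uint8)
```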
Step S33, calculating a similarity value of the first binarized image and the second binarized image.
Specifically, after the first binarized image and the second binarized image are obtained, the similarity between the reference image and the test image may be obtained by calculating the similarity between the two binarized images.
In an exemplary embodiment, the calculating the similarity value of the first binarized image and the second binarized image includes:
and calculating the Hamming distance between the first binarized image and the second binarized image, and determining the similarity value between the test image and the reference image according to the calculated Hamming distance.
Specifically, the Hamming distance is named after Richard Wesley Hamming. In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding characters differ; in other words, it is the number of characters that need to be replaced to convert one string into the other.
In this embodiment, after obtaining the hamming distance, the similarity value can be further calculated.
As an example, assuming that the first binarized image and the second binarized image each include 1000 binarized pixel points and the calculated Hamming distance is 500, the similarity value is 500/1000 = 0.5 = 50%.
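A sketch of the Hamming-distance-based similarity value, following the convention of the worked example above (distance divided by the number of binarized points); NumPy is assumed:

```python
import numpy as np

def similarity_value(bits_a: np.ndarray, bits_b: np.ndarray) -> float:
    """Similarity value derived from the Hamming distance of two binarized images.

    Follows the worked example above: 1000 bits with a Hamming distance of 500
    give a value of 500/1000 = 0.5 = 50%.
    """
    hamming = int(np.count_nonzero(bits_a != bits_b))
    return hamming / bits_a.size
```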
Step S23, if the similarity value is smaller than the first preset value, it is determined that the bullet screen has been rendered.
Specifically, the first preset value may be flexibly set and adjusted according to the actual situation. For example, if the first preset value is 85%, then when the similarity value between the test image and the reference image is less than 85%, it may be determined that the bullet screen is rendered normally; when the similarity value between the test image and the reference image is greater than or equal to 85%, it may be determined that the bullet screen is not rendered normally.
In an exemplary embodiment, to improve the test accuracy, referring to fig. 6, the method further includes:
step S60, if the similarity value is greater than or equal to the first preset value, calculating a ratio of retained pixels in the test image to obtain a first ratio, where the retained pixels are pixels whose pixel values are within a preset interval.
Specifically, the preset interval range can be flexibly set and adjusted according to actual conditions. In practice, considering that the bullet screen rendering color is mostly white, the preset interval range may be [200, 200, 200] to [255, 255, 255].
In an embodiment, when the reserved pixel proportion is calculated, an inRange algorithm may be used to screen the reserved pixel points out of the image, and the proportion of the reserved pixel points among all the pixel points in the image is then calculated. For example, if 4000 reserved pixel points are screened out and the image includes 5000 pixel points, the proportion is 4000/5000 = 0.8 = 80%.
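A sketch of the reserved-pixel ratio, assuming OpenCV's inRange with the preset interval range [200, 200, 200] to [255, 255, 255] given above:

```python
import cv2
import numpy as np

# Preset interval range used above: mostly-white bullet-screen pixels.
LOWER = np.array([200, 200, 200], dtype=np.uint8)
UPPER = np.array([255, 255, 255], dtype=np.uint8)

def reserved_pixel_ratio(image_bgr: np.ndarray) -> float:
    """Ratio of pixels whose values fall inside the preset interval range."""
    mask = cv2.inRange(image_bgr, LOWER, UPPER)       # 255 where the pixel is reserved
    return float(np.count_nonzero(mask)) / mask.size  # e.g. 4000/5000 = 0.8 = 80%
```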
And step S61, judging whether the bullet screen is rendered according to the first proportion and a preset proportion.
Specifically, the preset ratio can be flexibly set and adjusted according to actual conditions. In one embodiment, the preset ratio may be set according to a ratio of the reserved pixels in the reference image, that is, the preset ratio is set to the ratio of the reserved pixels in the reference image. In another embodiment, the preset duty ratio may be directly set to a fixed value, for example, to 80%.
In one embodiment, when the first occupation ratio is greater than the preset occupation ratio, it may be determined that the bullet screen is rendered; when the first occupation ratio is smaller than or equal to the preset occupation ratio, the bullet screen can be judged not to be rendered.
In an exemplary embodiment, referring to fig. 7, determining whether the bullet screen has been rendered according to the first percentage and the preset percentage may include:
step S70, calculating a ratio of the remaining pixels in the reference image to obtain a second ratio, and using the second ratio as the preset ratio.
Specifically, when the reserved pixel proportion is calculated, an inRange algorithm may be used to screen the reserved pixel points out of the image, and the proportion of the reserved pixel points among all the pixel points in the image is then calculated. For example, if 4500 reserved pixel points are screened out of an image containing 5000 pixel points, the proportion is 4500/5000 = 0.9 = 90%.
Step S71, if the difference between the first ratio and the second ratio is greater than or equal to a second preset value, it is determined that the bullet screen has been rendered.
Step S72, if the difference between the first ratio and the second ratio is smaller than the second preset value, it is determined that the bullet screen is not rendered.
Specifically, the second preset value may also be set and adjusted according to an actual situation, for example, the second preset value is 5%.
In this embodiment, considering that the number of bullet screens, and the possibility that the video picture itself contains pixels whose values fall within the preset interval range, may also influence the result, after the first ratio and the second ratio of reserved pixels in the test image and the reference image are obtained, whether bullet screen rendering has taken place in the test image relative to the reference image can be determined by checking whether the difference between the two ratios reaches the second preset value. If the difference is greater than or equal to the second preset value, it can be determined that the bullet screen is rendered normally; if the difference is smaller than the second preset value, it can be determined that the bullet screen is not rendered normally.
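A sketch of the two-stage decision, using the example thresholds above (85% and 5%); treating the difference as first ratio minus second ratio is an assumption:

```python
def barrage_rendered(similarity: float,
                     test_ratio: float,
                     reference_ratio: float,
                     first_preset: float = 0.85,
                     second_preset: float = 0.05) -> bool:
    """Two-stage decision: similarity first, reserved-pixel ratios as a fallback."""
    if similarity < first_preset:                       # stage 1 (step S23)
        return True
    # Stage 2 (steps S70-S72): compare the reserved-pixel ratios.
    return (test_ratio - reference_ratio) >= second_preset
```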
In an exemplary embodiment, in order to eliminate the influence of factors other than the player on the test result, referring to fig. 8, the method further includes:
and step S80, recording the playing modes of the video to be tested, wherein the playing modes comprise a full screen mode, a half screen mode, a horizontal screen mode and a vertical screen mode.
Specifically, when a video to be tested is played through a player, the current playing mode of the video to be tested can be recorded at the same time.
After the step of capturing a frame of video frame as a test image when the video to be tested is played to the preset progress, the method further comprises the following steps:
step S81, when the playing mode of the video to be tested is a half-screen mode, judging whether the playing mode of the video to be tested is a horizontal screen mode.
Specifically, when the playing mode of the video to be tested is the half-screen mode, the video is generally played in only part of the screen, that is, the bullet screen is displayed only in that half-screen region. Therefore, for the subsequent similarity detection, the picture needs to be cropped so as to keep the region containing the bullet screen. However, the landscape and portrait half-screen modes generally give the bullet screen different proportions of the picture. Therefore, to crop the picture properly, it can be further determined whether the current playing mode is the landscape mode or the portrait mode.
It is understood that when the playing mode is the full screen mode, the picture does not need to be cropped.
And step S82, if the playing mode of the video to be tested is the horizontal screen mode, cutting the test picture by adopting a first cutting mode to obtain the cut test picture.
Specifically, the first cutting mode is set according to actual conditions, for example, the first cutting mode is cutting according to 1/3 of the screen height. And after finishing the picture cutting, taking the cut picture as a test picture finally used for comparison.
It can be understood that, if the test picture is cropped, after the reference picture is acquired, the reference picture also needs to be cropped by using the first cropping method, and the cropped reference picture is used as the reference picture for comparison finally.
And step S83, if the playing mode of the video to be tested is the vertical screen mode, cutting the test picture by adopting a second cutting mode to obtain the cut test picture.
Specifically, the second cutting mode is set according to actual conditions, for example, the second cutting mode is cutting according to 1/2 of the screen height.
It can be understood that, if the test picture is cropped, after the reference picture is acquired, the reference picture also needs to be cropped by using the second cropping method, and the cropped reference picture is used as the reference picture for comparison finally.
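A sketch of the mode-dependent cropping, using the example cutting modes above (1/3 of the screen height for the landscape half-screen mode, 1/2 for the portrait one); cropping from the top of the frame is an assumption:

```python
import numpy as np

def crop_half_screen(frame: np.ndarray, landscape: bool) -> np.ndarray:
    """Crop a half-screen frame to the region compared for bullet screens."""
    height = frame.shape[0]
    keep = height // 3 if landscape else height // 2  # 1/3 landscape, 1/2 portrait
    return frame[:keep]
```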
It should be noted that, when all the steps of the present application are implemented, an automated test script may be written in advance by a user, and the automated test script is executed.
In particular, an automation script for testing bullet screen rendering may be written with an automated testing tool. In this embodiment, the automated testing tool is preferably Appium. Appium is an open-source automated testing tool used to automatically test native applications, mobile Web applications and hybrid applications on iOS phones, Android phones and the Windows desktop platform. In addition, Appium is a cross-platform testing tool that allows users to write test scripts against multiple platforms (iOS, Android, Windows) using the same API, reusing code among the iOS, Android and Windows test suites. In this embodiment, when the automation script is written with Appium, it can be written using the pytest framework. pytest is a very mature, full-featured Python testing framework; it is simple, flexible and easy to use, has rich documentation, supports parameterization, allows fine-grained selection of the test cases to be run, and supports both simple unit tests and complex functional tests.
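A minimal pytest/Appium skeleton of such an automated test script; every capability value and the server URL are placeholders (the patent specifies none of them), the test body is only a stub, and the exact client call signatures vary across Appium Python client versions:

```python
import pytest
from appium import webdriver

# Placeholder capabilities: the real device name, app package and activity depend
# on the APP under test, which the patent does not specify.
CAPS = {
    "platformName": "Android",
    "deviceName": "device-under-test",
    "appPackage": "com.example.videoapp",   # hypothetical
    "appActivity": ".MainActivity",         # hypothetical
}

@pytest.fixture
def driver():
    # The exact Remote() arguments vary with the Appium Python client version.
    drv = webdriver.Remote("http://127.0.0.1:4723/wd/hub", CAPS)
    yield drv
    drv.quit()

def test_barrage_rendered(driver):
    # Stub body: play the video with the bullet screen on, capture the test image,
    # repeat with it off for the reference image, then apply the similarity
    # comparison described above.
    assert driver.get_screenshot_as_png()
```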
It should be noted that the barrage refers to a comment directly appearing on the video, and may appear on the video in a special manner of scrolling, staying or even more actions, and is a brief comment sent by a person watching the video.
The bullet screen rendering test refers to testing whether the bullet screen can be normally rendered in a video playing application program (APP).
The Test Case (Test Case) refers to the description of a Test task performed on a specific software product, and embodies Test schemes, methods, techniques and strategies. The contents of the test object, the test environment, the input data, the test steps, the expected results, the test scripts and the like are included, and finally, a document is formed. Simply considered, a test case is a set of test inputs, execution conditions, and expected results tailored for a particular purpose to verify whether a particular software requirement is met.
In this embodiment, the pictures at the same playing progress during video playback, captured with the bullet screen turned off and turned on, are taken as the reference picture and the test picture respectively for similarity comparison, and whether the bullet screen is rendered normally is then determined according to the similarity value, which improves test accuracy. Meanwhile, to further improve test accuracy, the determination combines the two approaches of similarity calculation and pixel proportion. Experimental results show that the first-stage similarity-value test can reach an identification accuracy of 90%; on that basis, second-stage pixel-proportion identification is performed on the remaining 10% of unidentified images, of which 70% can be identified accurately. Therefore, in the bullet screen identification scenario, combining the first-stage and second-stage identification schemes yields an overall image identification accuracy of up to 97%.
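The 97% figure follows from combining the two stages; as a worked form of the numbers reported above:

$0.90 + 0.10 \times 0.70 = 0.90 + 0.07 = 0.97 = 97\%$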
Fig. 9 is a block diagram of an embodiment of a bullet screen rendering test apparatus 90 according to the present application.
In this embodiment, the bullet screen rendering test apparatus 90 includes a series of computer program instructions stored in a memory, and when the computer program instructions are executed by a processor, the bullet screen rendering test function of the embodiments of the present application can be implemented. In some embodiments, based on the specific operations implemented by the computer program instructions, the bullet screen rendering test device 90 may be divided into one or more modules, which may be specifically divided as follows:
the playing module 91 is used for playing the video to be tested with the bullet screen;
the intercepting module 92 is configured to intercept a frame of video frame as a test image when the video to be tested is played to a preset progress, where the video frame includes at least one bullet screen;
a calculating module 93, configured to calculate a similarity value between the test image and a reference image, where the reference image is a video frame that is captured when the video to be tested is played to the preset progress and does not include a bullet screen;
and a determining module 94, configured to determine that the bullet screen is rendered if the similarity value is smaller than the first preset value.
In an exemplary embodiment, the calculating module 93 is further configured to perform gray processing on the test image and the reference image respectively to obtain a corresponding first gray image and a corresponding second gray image; respectively carrying out discrete cosine transform on the first gray level image and the second gray level image to obtain a corresponding first discrete cosine transform image and a corresponding second discrete cosine transform image; respectively carrying out binarization processing on pixels in the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first binarized image and a corresponding second binarized image; and calculating the similarity value of the first binarized image and the second binarized image.
In an exemplary embodiment, the calculating module 93 is further configured to perform a reduction process on the test image and the reference image respectively to obtain a corresponding first reduced image and a corresponding second reduced image; and carrying out gray level processing on the first reduced image and the second reduced image to obtain a corresponding first gray level image and a corresponding second gray level image.
In an exemplary embodiment, the calculating module 93 is further configured to respectively cut preset regions in upper left corners of the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first matrix image and a corresponding second matrix image; and carrying out binarization processing on the first matrix image and the second matrix image to obtain a corresponding first binarized image and a corresponding second binarized image.
In an exemplary embodiment, the calculating module 93 is further configured to calculate hamming distances of the first binarized image and the second binarized image, and determine a similarity value between the test image and the reference image according to the calculated hamming distances.
In an exemplary embodiment, the calculating module 93 is further configured to calculate a ratio of reserved pixels in the test image to obtain a first ratio if the similarity value is greater than or equal to the first preset value, where the reserved pixels are pixel points whose pixel values are within a preset interval.
The determining module 94 is further configured to determine whether the bullet screen has been rendered according to the first percentage and a preset percentage.
In an exemplary embodiment, the determining module 94 is further configured to calculate a ratio of remaining pixels in the reference image, obtain a second ratio, and use the second ratio as the preset ratio; if the difference value of the first proportion and the second proportion is larger than or equal to a second preset value, judging that the bullet screen is rendered; and if the difference value of the first occupation ratio and the second occupation ratio is smaller than the second preset value, judging that the bullet screen is not rendered.
In an exemplary embodiment, the bullet screen rendering test device 90 further includes a recording module.
The recording module is used for recording the playing modes of the video to be tested, and the playing modes comprise a full screen mode, a half screen mode, a horizontal screen mode and a vertical screen mode;
the judging module 94 is further configured to, when the play mode of the video to be tested is the half-screen mode, judge whether the play mode of the video to be tested is the horizontal-screen mode; if the playing mode of the video to be tested is the horizontal screen mode, cutting the test picture by adopting a first cutting mode to obtain a cut test picture; and if the playing mode of the video to be tested is the vertical screen mode, cutting the test picture by adopting a second cutting mode to obtain the cut test picture.
In this embodiment, the pictures at the same playing progress during video playback, captured with the bullet screen turned off and turned on, are taken as the reference picture and the test picture respectively for similarity comparison, and whether the bullet screen is rendered normally is then determined according to the similarity value, which improves test accuracy. Meanwhile, to further improve test accuracy, the determination combines the two approaches of similarity calculation and pixel proportion. Experimental results show that the first-stage similarity-value test can reach an identification accuracy of 90%; on that basis, second-stage pixel-proportion identification is performed on the remaining 10% of unidentified images, of which 70% can be identified accurately. Therefore, in the bullet screen identification scenario, combining the first-stage and second-stage identification schemes yields an overall image identification accuracy of up to 97%.
Fig. 10 schematically shows a hardware architecture diagram of a computer device 10 suitable for implementing the bullet screen rendering test method according to an embodiment of the present application. In the present embodiment, the computer device 10 is a device capable of automatically performing numerical calculation and/or information processing in accordance with instructions that are set or stored in advance. For example, it may be a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server or a cabinet server (including an independent server or a server cluster composed of a plurality of servers). As shown in fig. 10, the computer device 10 includes at least, but is not limited to: a memory 120, a processor 121 and a network interface 122, which may be communicatively connected to each other through a system bus. Wherein:
the memory 120 includes at least one type of computer-readable storage medium, which may be volatile or non-volatile, and particularly, includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 120 may be an internal storage module of the computer device 10, such as a hard disk or a memory of the computer device 10. In other embodiments, the memory 120 may also be an external storage device of the computer device 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the computer device 10. Of course, memory 120 may also include both internal and external memory modules of computer device 10. In this embodiment, the memory 120 is generally used for storing an operating system installed in the computer device 10 and various types of application software, such as program codes of the bullet screen rendering test method. In addition, the memory 120 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 121 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other bullet screen rendering test chip in some embodiments. The processor 121 is generally configured to control the overall operation of the computer device 10, such as performing control and processing related to data interaction or communication with the computer device 10. In this embodiment, the processor 121 is configured to execute the program code stored in the memory 120 or process data.
Network interface 122 may comprise a wireless network interface or a wired network interface, with network interface 122 typically being used to establish communication links between computer device 10 and other computer devices. For example, the network interface 122 is used to connect the computer device 10 to an external terminal via a network, establish a data transmission channel and a communication link between the computer device 10 and the external terminal, and the like. The network may be a wireless or wired network such as an Intranet (Intranet), the Internet (Internet), a Global System of Mobile communication (GSM), Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth (Bluetooth), or Wi-Fi.
It is noted that FIG. 10 only shows a computer device having components 120-122, but it is understood that not all of the shown components are required to be implemented, and that more or fewer components may be implemented instead.
In this embodiment, the bullet screen rendering test method stored in the memory 120 may be divided into one or more program modules and executed by one or more processors (in this embodiment, the processor 121) to complete the present application.
The embodiment of the application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the barrage rendering test method in the embodiment.
In this embodiment, the computer-readable storage medium includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the computer readable storage medium may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. In other embodiments, the computer readable storage medium may be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the computer device. Of course, the computer-readable storage medium may also include both internal and external storage devices of the computer device. In this embodiment, the computer-readable storage medium is generally used for storing an operating system and various types of application software installed in the computer device, for example, the program code of the bullet screen rendering test method in the embodiment, and the like. Further, the computer-readable storage medium may also be used to temporarily store various types of data that have been output or are to be output.
The above-described embodiments of the apparatus are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over at least two network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application. Those of ordinary skill in the art can understand and implement this without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), a Random Access Memory (RAM), or the like.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the embodiments can still be modified, or some or all of the technical features can be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
Claims (11)
1. A barrage rendering test method, characterized in that the method comprises:
playing a video to be tested with a bullet screen;
when the video to be tested is played to a preset progress, intercepting a frame of video frame as a test image, wherein the video frame comprises at least one bullet screen;
calculating the similarity value of the test image and a reference image, wherein the reference image is a video frame which is captured when the video to be tested is played to the preset progress and does not contain a bullet screen;
and if the similarity value is smaller than a first preset value, judging that the bullet screen is rendered.
2. The bullet screen rendering test method of claim 1, wherein said calculating a similarity value between said test image and a reference image comprises:
carrying out gray level processing on the test image and the reference image respectively to obtain a corresponding first gray level image and a corresponding second gray level image;
respectively carrying out discrete cosine transform on the first gray level image and the second gray level image to obtain a corresponding first discrete cosine transform image and a corresponding second discrete cosine transform image;
respectively carrying out binarization processing on pixels in the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first binarized image and a corresponding second binarized image;
and calculating the similarity value of the first binarized image and the second binarized image.
3. The bullet screen rendering test method according to claim 2, wherein the performing gray scale processing on the test image and the reference image respectively to obtain a corresponding first gray scale image and a corresponding second gray scale image comprises:
respectively carrying out reduction processing on the test image and the reference image to obtain a corresponding first reduced image and a corresponding second reduced image;
and carrying out gray level processing on the first reduced image and the second reduced image to obtain a corresponding first gray level image and a corresponding second gray level image.
4. The bullet screen rendering test method according to claim 2, wherein the respectively carrying out binarization processing on pixels in the first discrete cosine transform image and the second discrete cosine transform image to obtain the corresponding first binarized image and the corresponding second binarized image comprises:
respectively cutting preset areas in the upper left corners of the first discrete cosine transform image and the second discrete cosine transform image to obtain a corresponding first matrix image and a corresponding second matrix image;
and carrying out binarization processing on the first matrix image and the second matrix image to obtain a corresponding first binarized image and a corresponding second binarized image.
5. The bullet screen rendering test method of claim 2, wherein said calculating a similarity value of said first binarized image and said second binarized image comprises:
and calculating the Hamming distance between the first binarized image and the second binarized image, and determining the similarity value between the test image and the reference image according to the calculated Hamming distance.
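Claims 2 to 5 describe a perceptual-hash style comparison: shrink both frames, convert them to grayscale, apply a discrete cosine transform, keep the low-frequency block in the upper left corner, binarize it, and measure the Hamming distance between the two binary results. A sketch under stated assumptions follows; the 32x32 shrink size, the 8x8 block, the mean-based binarization rule and the mapping from Hamming distance to a similarity value are illustrative choices the claims leave open.

```python
# pHash-style similarity sketch for claims 2-5; sizes and thresholds are assumptions.
import cv2
import numpy as np

SHRINK_SIZE = (32, 32)  # assumed reduced size for claim 3
BLOCK_SIZE = 8          # assumed "preset area" in the upper left corner for claim 4


def binary_hash(image: np.ndarray) -> np.ndarray:
    reduced = cv2.resize(image, SHRINK_SIZE, interpolation=cv2.INTER_AREA)  # claim 3: reduce
    gray = cv2.cvtColor(reduced, cv2.COLOR_BGR2GRAY)                        # claim 3: grayscale
    dct = cv2.dct(np.float32(gray))                                         # claim 2: DCT
    block = dct[:BLOCK_SIZE, :BLOCK_SIZE]                                   # claim 4: upper-left block
    return (block > block.mean()).astype(np.uint8)                          # assumed binarization rule


def similarity(test_image: np.ndarray, reference_image: np.ndarray) -> float:
    hash_test, hash_reference = binary_hash(test_image), binary_hash(reference_image)
    hamming = int(np.count_nonzero(hash_test != hash_reference))            # claim 5: Hamming distance
    # Assumed mapping: identical hashes give 1.0, completely different hashes give 0.0.
    return 1.0 - hamming / hash_test.size
```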
6. The bullet screen rendering test method according to any one of claims 1 to 5, wherein the method further comprises:
if the similarity value is larger than or equal to the first preset value, calculating the proportion of reserved pixels in the test image to obtain a first proportion, wherein the reserved pixels are pixel points of which the pixel values are within a preset interval range;
and judging whether the bullet screen is rendered according to the first proportion and a preset proportion.
7. The bullet screen rendering test method of claim 6, wherein the judging whether the bullet screen is rendered according to the first proportion and the preset proportion comprises:
calculating the proportion of reserved pixels in the reference image to obtain a second proportion, and taking the second proportion as the preset proportion;
if the difference value between the first proportion and the second proportion is larger than or equal to a second preset value, judging that the bullet screen is rendered;
and if the difference value between the first proportion and the second proportion is smaller than the second preset value, judging that the bullet screen is not rendered.
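Claims 6 and 7 add a fallback for the case where the similarity alone is inconclusive: compare the share of "reserved pixels" (pixels whose values fall inside a preset interval) in the test frame against the same share in the reference frame. A sketch follows, assuming the reserved pixels are picked out on the grayscale frame with a bright value range typical of bullet-screen text; the interval and the second preset value are illustrative assumptions.

```python
# Reserved-pixel proportion fallback for claims 6-7; interval and threshold are assumptions.
import cv2
import numpy as np

PRESET_INTERVAL = (200, 255)  # assumed pixel-value range defining reserved pixels
SECOND_PRESET_VALUE = 0.01    # assumed minimum gap between the two proportions


def reserved_pixel_proportion(image: np.ndarray) -> float:
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    low, high = PRESET_INTERVAL
    mask = (gray >= low) & (gray <= high)
    return float(np.count_nonzero(mask)) / gray.size


def fallback_judgement(test_image: np.ndarray, reference_image: np.ndarray) -> bool:
    first_proportion = reserved_pixel_proportion(test_image)        # claim 6
    second_proportion = reserved_pixel_proportion(reference_image)  # claim 7: preset proportion
    # Noticeably more reserved pixels in the test frame than in the clean
    # reference suggests extra bullet-screen text was drawn on top of the video.
    return (first_proportion - second_proportion) >= SECOND_PRESET_VALUE
```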
8. The bullet screen rendering test method according to any one of claims 1 to 5, wherein the method further comprises:
recording the playing modes of the video to be tested, wherein the playing modes comprise a full screen mode, a half screen mode, a horizontal screen mode and a vertical screen mode;
after the step of intercepting a frame of video frame as a test image when the video to be tested is played to the preset progress, the method further comprises the following steps:
when the playing mode of the video to be tested is the half screen mode, judging whether the playing mode of the video to be tested is the horizontal screen mode;
if the playing mode of the video to be tested is the horizontal screen mode, cutting the test image by adopting a first cutting mode to obtain a cut test image;
and if the playing mode of the video to be tested is the vertical screen mode, cutting the test image by adopting a second cutting mode to obtain the cut test image.
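Claim 8 ties the cropping of the captured frame to the playing mode, so that only the player region is compared when the video occupies part of the screen. The sketch below is only one possible reading: which strip each cutting mode keeps is an assumption, since the claim only distinguishes a first and a second cutting mode.

```python
# Hypothetical play-mode-dependent cropping for claim 8; the crop regions are assumptions.
import numpy as np


def crop_test_image(image: np.ndarray, half_screen: bool, landscape: bool) -> np.ndarray:
    if not half_screen:
        return image  # full-screen playback: compare the whole captured frame
    height = image.shape[0]
    if landscape:
        # first cutting mode (assumed): keep a centered horizontal strip holding the player
        return image[height // 4: 3 * height // 4, :]
    # second cutting mode (assumed): keep the player area at the top of a portrait screen
    return image[: height // 3, :]
```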
9. A bullet screen rendering test device, characterized in that the device comprises:
the playing module is used for playing the video to be tested with the bullet screen;
the intercepting module is used for intercepting a frame of video frame as a test image when the video to be tested is played to a preset progress, wherein the video frame comprises at least one bullet screen;
the calculation module is used for calculating the similarity value of the test image and a reference image, wherein the reference image is a video frame which is captured when the video to be tested is played to the preset progress and does not contain a bullet screen;
and the judging module is used for judging that the bullet screen is rendered if the similarity value is smaller than a first preset value.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of any one of claims 1 to 8 when executing the computer program.
11. A computer-readable storage medium having a computer program stored thereon, characterized in that: the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111504390.2A CN114185784A (en) | 2021-12-10 | 2021-12-10 | Barrage rendering test method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111504390.2A CN114185784A (en) | 2021-12-10 | 2021-12-10 | Barrage rendering test method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114185784A true CN114185784A (en) | 2022-03-15 |
Family
ID=80604264
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111504390.2A Pending CN114185784A (en) | 2021-12-10 | 2021-12-10 | Barrage rendering test method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114185784A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114745596A (en) * | 2022-05-12 | 2022-07-12 | 上海幻电信息科技有限公司 | Group-based barrage processing method and device |
CN114745596B (en) * | 2022-05-12 | 2023-12-29 | 上海幻电信息科技有限公司 | Group-based barrage processing method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210168441A1 (en) | Video-Processing Method, Electronic Device, and Computer-Readable Storage Medium | |
US20140286542A1 (en) | Methods and systems for determining image processing operations relevant to particular imagery | |
CN110139104B (en) | Video decoding method, video decoding device, computer equipment and storage medium | |
CN112102204A (en) | Image enhancement method and device and electronic equipment | |
CN113781356B (en) | Training method of image denoising model, image denoising method, device and equipment | |
CN109389659B (en) | Rendering method and device of mathematical formula in PPT, storage medium and terminal equipment | |
EP3200459A1 (en) | Encoding method, decoding method, apparatus and electronic device | |
CN113436222A (en) | Image processing method, image processing apparatus, electronic device, and storage medium | |
CN114185784A (en) | Barrage rendering test method and device | |
CN110782392A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN111353965A (en) | Image restoration method, device, terminal and storage medium | |
CN112233196B (en) | Live broadcasting room green screen detection method, device, equipment and storage medium | |
CN114222181A (en) | Image processing method, device, equipment and medium | |
CN117376549A (en) | Method, device, equipment and medium for testing time delay of image acquisition system | |
CN114363697B (en) | Video file generation and playing method and device | |
US10764578B2 (en) | Bit rate optimization system and method | |
CN112149745A (en) | Method, device, equipment and storage medium for determining difficult example sample | |
CN113012031A (en) | Image processing method and image processing apparatus | |
CN116631003A (en) | Equipment identification method and device based on P & ID drawing, storage medium and electronic equipment | |
US10719959B2 (en) | Mobile device and a method for texture memory optimization thereof | |
CN112927324B (en) | Data processing method and device of boundary compensation mode of sample point self-adaptive compensation | |
CN115004245A (en) | Target detection method, target detection device, electronic equipment and computer storage medium | |
CN113628192A (en) | Image blur detection method, device, apparatus, storage medium, and program product | |
CN112801932A (en) | Image display method, image display device, electronic equipment and storage medium | |
CN112784246A (en) | Method, device and equipment for verifying slider verification code and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |