CN115834952A - Video frame rate detection method and device based on visual perception

Info

Publication number: CN115834952A
Application number: CN202111088246.5A
Authority: CN (China)
Prior art keywords: video, frame, adjacent, change rate, frames
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 万明阳, 顾泽豪, 关梦龙, 马国俊, 邢斌斌
Current assignee: Beijing Zitiao Network Technology Co Ltd
Original assignee: Beijing Zitiao Network Technology Co Ltd
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority to CN202111088246.5A
Publication of CN115834952A

Abstract

The present disclosure provides a video frame rate detection method and apparatus based on visual perception. The method includes: extracting the presentation timestamp of each frame of a video, and determining adjacent frames of the video according to the timestamps; converting the three RGB channels of each frame of the video into a single grayscale channel, and calculating grayscale pixel values of the single grayscale channel; calculating the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ; calculating the adjacent-frame content change rate of the video from the sum of the grayscale pixel value differences; detecting the relationship between the adjacent-frame change rate of the video and a first stuck threshold; detecting the relationship between the adjacent-frame content change rate of the video and a second stuck threshold; and when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold, judging that the adjacent video frames are not stuck. Whether a video frame is a stuck frame is judged by calculating the visual difference between adjacent frames of the video, so video stutter can be monitored accurately.

Description

Video frame rate detection method and device based on visual perception
Technical Field
The present disclosure relates to the field of video detection, and in particular, to a video frame rate detection method and apparatus based on visual perception, an electronic device, and a storage medium.
Background
Video is now a common part of daily life and entertainment, and when watching videos people frequently encounter stutter: buffering indicators appear or the picture advances frame by frame, which seriously affects the viewing experience. Most people assume the video stutters only because the network is slow, but this is not the whole story. Common causes of video stutter include hardware or software configurations of the device that are too weak, or an outdated player version. During playback the video must be encoded and decoded, and high-definition video puts considerable decoding pressure on the hardware; when decoding slows down, stutter caused by decoding becomes obvious and playback is perceived as stuck.
The video frame rate is the number of frames displayed per second, and it directly determines the fluency of playback. When frames are output slowly or unevenly, the human eye perceives stutter; above a certain threshold (generally 24 fps), with frames output evenly, stutter is generally not perceptible, and the higher the frame rate, the smoother the video. However, when current video editing software adds filters, watermarks, and the like to a video, it may transcode to a designated frame rate by duplicating frames. The nominal frame rate is then high, yet the human eye still perceives stutter, and existing stutter detection techniques cannot monitor this situation or accurately measure whether the video stutters.
Disclosure of Invention
In order to solve at least the above technical problems and improve the accuracy and efficiency of video stutter detection, embodiments of the present invention provide a video frame rate detection method and apparatus based on visual perception, an electronic device, and a storage medium.
According to a first aspect of the present invention, an embodiment of the present invention provides a video frame rate detection method based on visual perception, including:
extracting the presentation timestamp of each frame of the video, and determining adjacent frames of the video according to the timestamps;
converting the three RGB channels of each frame of the video into a single grayscale channel, and calculating grayscale pixel values of the single grayscale channel;
calculating the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ;
calculating the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences;
detecting a relationship between the adjacent-frame change rate of the video and a first stuck threshold;
detecting a relationship between the adjacent-frame content change rate of the video and a second stuck threshold; and
when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold, judging that the adjacent video frames are not stuck.
Further, the timestamp of each frame of the video is the time point at which the frame is displayed after the video frame is decoded and is used to calculate the playing time of the video frame, and the adjacent frames of the video are determined according to the display time point and the playing time of each frame.
Further, the three RGB channels of each frame of the video are converted into a single grayscale channel, and the grayscale pixel value of the single grayscale channel is calculated with the conversion formula:
GRAY=R*0.299+G*0.587+B*0.114;
where GRAY is the grayscale pixel value of the single grayscale channel, R is the red channel component, G is the green channel component, and B is the blue channel component.
Further, the adjacent-frame change rate is the proportion of pixel positions whose grayscale values differ between corresponding positions of adjacent video frames relative to the total number of pixels. The specific calculation formula is:

$$\mathrm{ChangeRate} = \frac{\sum_{i=1}^{H}\sum_{j=1}^{W}\mathbf{1}\left(p_{ij} \neq p'_{ij}\right)}{H \times W}$$

where H and W denote the height and width of the video frame, $p_{ij}$ and $p'_{ij}$ denote the grayscale pixel values at coordinates (i, j) of the two adjacent frames, and $\mathbf{1}(\cdot)$ is the indicator function; the adjacent-frame change rate ranges between 0 and 1.
Further, the adjacent-frame content change rate is the ratio of the sum of the grayscale value differences at corresponding positions of adjacent video frames to the number of positions whose grayscale values differ. The specific calculation formula is:

$$\mathrm{ContentChangeRate} = \frac{\sum_{i=1}^{H}\sum_{j=1}^{W}\left|p_{ij} - p'_{ij}\right|}{256 \times \sum_{i=1}^{H}\sum_{j=1}^{W}\mathbf{1}\left(p_{ij} \neq p'_{ij}\right)}$$

where H and W denote the height and width of the video frame, and $p_{ij}$ and $p'_{ij}$ denote the grayscale pixel values at coordinates (i, j) of the two adjacent frames; 256 appears in the denominator because there are 256 possible grayscale values, so the adjacent-frame content change rate ranges between 0 and 1.
Further, the first stuck threshold is set to 0.01 to 0.1, and the second stuck threshold is set to 0.1 to 0.5.
Further, the method further comprises:
when the adjacent-frame change rate of the video is smaller than the first stuck threshold and/or the adjacent-frame content change rate of the video is smaller than the second stuck threshold, judging that the adjacent video frames are stuck; and
when the adjacent video frames are judged to be stuck, discarding the video frame.
Further, the method further comprises:
calculating a video perceived frame rate of the video;
wherein the video perceived frame rate is the number of non-stuck frames of the video divided by the playing time of the last non-stuck frame.
Further, the method further comprises:
and recording the video with an on-screen stopwatch for timing, comparing the perceived frame rate of the recorded video with its actual frame rate, and verifying the perceived frame rate of the recorded video.
According to a second aspect of the present invention, an embodiment of the present disclosure provides a video frame rate detection apparatus based on visual perception, including:
the extraction module is used for extracting the presentation timestamp of each frame of the video and determining adjacent frames of the video according to the timestamps;
the conversion module is used for converting the three RGB channels of each frame of the video into a single grayscale channel and calculating the grayscale pixel values of the single grayscale channel;
the first calculation module is used for calculating the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ;
the second calculation module is used for calculating the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences;
the first detection module is used for detecting the relationship between the adjacent-frame change rate of the video and a first stuck threshold;
the second detection module is used for detecting the relationship between the adjacent-frame content change rate of the video and a second stuck threshold; and
the judging module is used for judging that the adjacent video frames are not stuck when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold.
According to a third aspect of the present invention, an embodiment of the present disclosure provides an electronic device, including:
a memory for storing computer readable instructions; and
a processor configured to execute the computer readable instructions to enable the electronic device to implement the method of any of the first aspect.
According to a fourth aspect of the present invention, there is provided a computer-readable storage medium storing a program which, when executed, is capable of implementing the method of any one of the first aspects.
The embodiment of the disclosure discloses a video frame rate detection method and apparatus based on visual perception, an electronic device, and a computer-readable storage medium. The method includes: extracting the presentation timestamp of each frame of the video, and determining adjacent frames of the video according to the timestamps; converting the three RGB channels of each frame of the video into a single grayscale channel, and calculating grayscale pixel values of the single grayscale channel; calculating the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ; calculating the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences; detecting the relationship between the adjacent-frame change rate of the video and a first stuck threshold; detecting the relationship between the adjacent-frame content change rate of the video and a second stuck threshold; and when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold, judging that the adjacent video frames are not stuck. According to the video frame rate detection method, whether a video frame is a stuck frame is judged by calculating the visual difference between adjacent frames of the video, and finally the number of non-stuck frames is divided by the playing time of the last non-stuck frame to obtain the video perceived frame rate, so video stutter can be monitored accurately; a verification scheme for the perceived frame rate algorithm is also provided.
The foregoing description is only an overview of the technical solutions of the present disclosure. In order that the technical means of the present disclosure may be understood more clearly and implemented in accordance with this specification, and in order to make the above and other objects, features, and advantages of the present disclosure more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic flowchart of a video frame rate detection method based on visual perception according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of an original image of a certain frame in a video according to an embodiment of the disclosure;
fig. 3 is a diagram illustrating an R channel component of a frame in a video according to an embodiment of the present disclosure;
fig. 4 is a diagram illustrating G channel components of a frame in a video according to an embodiment of the disclosure;
fig. 5 is a schematic diagram of a B channel component of a frame in a video according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a video frame rate detection apparatus based on visual perception according to another embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to another embodiment of the disclosure.
Detailed Description
In order that the technical contents of the present disclosure can be more clearly described, the following further description is made in conjunction with specific embodiments.
The exemplary embodiments described below do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand them to mean "one or more" unless the context clearly indicates otherwise.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The disclosed embodiments are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a video frame rate detection method based on visual perception according to an embodiment of the present disclosure. The method provided by this embodiment may be executed by a video frame rate detection apparatus based on visual perception; the apparatus may be implemented as software or as a combination of software and hardware, and may be integrated in a device of a video frame rate detection system based on visual perception, such as a terminal device. As shown in fig. 1, the method comprises the following steps:
step S101: and extracting a time stamp played by each frame of the video, and determining adjacent frames of the video according to the time stamp.
In step S101, when analyzing a video file, a frame of data is labeled, and a Presentation Time Stamp (PTS) of each frame of the video is extracted. The video playing time stamp is a time point of each frame display after the video frame is decoded, the user calculates the playing time of the video frame, the display time of each frame of the video is not constant but dynamically changed, therefore, the time of the video frame calculated through the playing time stamp is more accurate, and the adjacent frames of the video are determined according to the time point and the playing time of each frame. The embodiment of the disclosure adopts Opencv to read frame data in a video, where the video has a timestamp of video frame data, and the timestamp obtained by Opencv lets a user know that the frame data is at the position of a video file, and can determine adjacent frames of the video according to the timestamp, determine the sequence of the adjacent frames and the playing time of each frame, and through Opencv, the frame rate of each video can be obtained:
import cv2

# Open the video file (videoFile is a path string) and query its nominal frame rate.
video = cv2.VideoCapture(videoFile)
fps = video.get(cv2.CAP_PROP_FPS)
A per-frame timestamp can also be approximated by adding 1000/fps milliseconds for each frame of the video, but this assumes that the frame rate remains stable throughout the recording.
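As a concrete illustration, the presentation timestamp of every frame can be collected as follows. This is a minimal sketch, assuming the OpenCV backend exposes per-frame positions through CAP_PROP_POS_MSEC; the function name frame_timestamps is illustrative, not from this disclosure:

import cv2

def frame_timestamps(video_file):
    # Return (frame_index, timestamp_ms) for every decoded frame.
    # CAP_PROP_POS_MSEC reports the position of the frame that will be
    # read next; exact semantics can vary slightly between backends.
    cap = cv2.VideoCapture(video_file)
    stamps = []
    index = 0
    while True:
        ts_ms = cap.get(cv2.CAP_PROP_POS_MSEC)  # timestamp of the upcoming frame
        ok, _frame = cap.read()
        if not ok:
            break
        stamps.append((index, ts_ms))
        index += 1
    cap.release()
    return stamps

Consecutive entries of this list give the adjacent frames of the video together with their playing times.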
Step S102: converting the three RGB channels of each frame of the video into a single grayscale channel, and calculating the grayscale pixel values of the single grayscale channel.
In step S102, a 24-bit RGB image, also called a full-color image, has three channels: R (red), G (green), and B (blue). Figs. 2-5 illustrate an example of converting the three RGB channels of a video frame into a single GRAY channel.
As shown in fig. 2, an original frame of the video is shown. First, consider a color image containing the R, G, and B channels (an orange object on a blue background; in this publication the figure is reproduced in grayscale), and mark a sample point on it (the red x in the figure, shown as gray or black in this publication):
get_grayval(Image, Row, Column, RGBGrayval) obtains the grayscale pixel values at the marked point, with the result:
RGBGrayval = [19, 22, 37].
Next, as shown in figs. 3-5, the three RGB channels are separated and the grayscale pixel value of each channel is obtained. The three-channel separation is performed with decompose3, and the grayscale pixel values of the three channels are obtained separately:
get_grayval(R, Row, Column, RGrayval)
RGrayval = 19 (see fig. 3)
get_grayval(G, Row, Column, GGrayval)
GGrayval = 22 (see fig. 4)
get_grayval(B, Row, Column, BGrayval)
BGrayval = 37 (see fig. 5)
It follows that the grayscale pixel values of the three-channel image are a combination of the grayscale pixel values of the three single channels. Grayscale pixel values range from 0 to 255 in each channel; the larger the value, the brighter the image appears, and the smaller the value, the darker. Wherever a color dominates a region of the three-channel image, that color component is larger there, and that region appears brighter in the corresponding single channel. For example, since the background of the original image is mostly blue, the background appears brighter in the blue channel while the orange object appears darker.
In addition, a single-channel grayscale image is obtained by combining the R, G, and B components in certain proportions, and there is more than one way to compute it.
In this embodiment, each RGB channel component has a corresponding weight. The three RGB channels of each frame of the video are converted into a single grayscale channel, and the grayscale pixel value of the single grayscale channel is calculated with the conversion formula:
GRAY=R*0.299+G*0.587+B*0.114;
where GRAY is the grayscale pixel value of the single grayscale channel, R is the red channel component, G is the green channel component, and B is the blue channel component.
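As an illustration, a minimal sketch of this weighted conversion with OpenCV and NumPy follows (the helper name to_gray is illustrative, not from this disclosure). Note that OpenCV stores frames in BGR channel order, and that cv2.COLOR_BGR2GRAY applies the same 0.299/0.587/0.114 weights:

import cv2
import numpy as np

def to_gray(frame_bgr):
    # Split the BGR frame and apply the weights from the formula above.
    b, g, r = cv2.split(frame_bgr.astype(np.float32))
    gray = r * 0.299 + g * 0.587 + b * 0.114
    return np.clip(gray, 0, 255).astype(np.uint8)

# Equivalent built-in conversion:
# gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)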
Step S103: calculating the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ.
In step S103, if adjacent frames in the video are continuous, the viewer does not perceive stutter; if frames are skipped or adjacent frames differ too much, the video feels stuck. Here the concept of the adjacent-frame change rate is introduced, which bears a definite relationship to video stutter.
The adjacent-frame change rate is the proportion of pixel positions whose grayscale values differ between corresponding positions of adjacent video frames relative to the total number of pixels. The specific calculation formula is:

$$\mathrm{ChangeRate} = \frac{\sum_{i=1}^{H}\sum_{j=1}^{W}\mathbf{1}\left(p_{ij} \neq p'_{ij}\right)}{H \times W}$$

where H and W denote the height and width of the video frame, $p_{ij}$ and $p'_{ij}$ denote the grayscale pixel values at coordinates (i, j) of the two adjacent frames, and $\mathbf{1}(\cdot)$ is the indicator function; the adjacent-frame change rate ranges between 0 and 1.
The larger the adjacent-frame change rate, the more pixels change between the adjacent frame images, and the more easily the human eye perceives the change between video frames.
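A minimal sketch of this computation, assuming two uint8 grayscale frames of equal size (the function name is illustrative):

import numpy as np

def adjacent_frame_change_rate(gray_prev, gray_curr):
    # Fraction of pixel positions whose grayscale values differ
    # between the two adjacent frames; result lies in [0, 1].
    h, w = gray_prev.shape
    changed = np.count_nonzero(gray_prev != gray_curr)
    return changed / (h * w)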
Step S104: calculating the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences.
In step S104, the adjacent-frame content change rate differs from the adjacent-frame change rate of step S103: the adjacent-frame content change rate is the ratio of the sum of the grayscale value differences at corresponding positions of adjacent video frames to the number of positions whose grayscale values differ, so its denominator is 256 times the number of differing pixel positions rather than the total number of pixels.
Specifically, the adjacent-frame content change rate is calculated as:

$$\mathrm{ContentChangeRate} = \frac{\sum_{i=1}^{H}\sum_{j=1}^{W}\left|p_{ij} - p'_{ij}\right|}{256 \times \sum_{i=1}^{H}\sum_{j=1}^{W}\mathbf{1}\left(p_{ij} \neq p'_{ij}\right)}$$

where H and W denote the height and width of the video frame, and $p_{ij}$ and $p'_{ij}$ denote the grayscale pixel values at coordinates (i, j) of the two adjacent frames.
The denominator is multiplied by 256 because there are 256 possible grayscale values, so the adjacent-frame content change rate ranges from 0 to 1. The larger this rate, the greater the change in image content between adjacent frames, and the more easily the human eye perceives the change between video frames.
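A minimal sketch of this computation, assuming two uint8 grayscale frames of equal size; treating identical frames as zero change is an added guard against division by zero, not stated in the text:

import numpy as np

def adjacent_frame_content_change_rate(gray_prev, gray_curr):
    # Sum of absolute grayscale differences, normalized by 256 times
    # the number of differing pixel positions; result lies in [0, 1].
    diff = np.abs(gray_prev.astype(np.int16) - gray_curr.astype(np.int16))
    changed = np.count_nonzero(diff)
    if changed == 0:
        return 0.0  # assumption: identical frames count as zero change
    return float(diff.sum()) / (256 * changed)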
Step S105: a relationship of adjacent frame rate of change of the video to a first stuck threshold is detected.
In step S105, the condition that the adjacent video frame is not a pause frame is that the adjacent frame change rate and the adjacent frame content change rate must satisfy a certain threshold value at the same time, otherwise, the frame is discarded. Here, first, a relationship between a change rate of adjacent frames of the video and a first stuck threshold is detected, the first stuck threshold is set to be between 0.01 and 0.1, for example, 0.01, 0.02 or 0.05, and taking 0.02 as an example, it is required to satisfy that the change rate of adjacent frames exceeds 0.02, and based on this, it is further simultaneously satisfied that the change rate of content of adjacent frames exceeds a second stuck threshold, it is determined that the adjacent video frames are not stuck; otherwise, as long as at least one of the adjacent frame change rate or the adjacent frame content change rate is smaller than the corresponding first stuck threshold or the second stuck threshold, the adjacent video frame is determined to be stuck, and the stuck frame is discarded.
Step S106: detecting a relationship between adjacent frame content change rates of the video and a second stuck threshold.
In step S106, after detecting the relationship between the change rate of the adjacent frame of the video and the first pause threshold in step S105, the relationship between the change rate of the content of the adjacent frame of the video and the second pause threshold is further detected. The condition that the adjacent video frames are not stuck frames is that the adjacent frame change rate and the adjacent frame content change rate must simultaneously meet a certain threshold value, so that the stuck between the adjacent frames can be judged. When detecting the relationship between the content change rate of the adjacent frames of the video and the second stuck threshold, setting the second stuck threshold to be between 0.1 and 0.5, such as 0.1, 0.15 or 0.3, taking 0.15 as an example, it is required to satisfy that the content change rate of the adjacent frames exceeds 0.15, but only satisfying the condition can not determine that the adjacent frames are not stuck, and it is required to satisfy that the content change rate of the adjacent frames exceeds the first stuck threshold at the same time to determine that the adjacent frames are not stuck; otherwise, as long as at least one of the adjacent frame change rate or the adjacent frame content change rate is smaller than the corresponding first stuck threshold or the second stuck threshold, the adjacent video frame is determined to be stuck, and the stuck frame is discarded.
Step S107: when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold, judging that the adjacent video frames are not stuck.
In step S107, adjacent video frames are judged not stuck only when the adjacent-frame change rate and the adjacent-frame content change rate both meet their thresholds at the same time, that is, when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold. The first stuck threshold is set between 0.01 and 0.1, for example 0.01, 0.02, or 0.05; taking 0.02 as an example, the adjacent-frame change rate must exceed 0.02. The second stuck threshold is set between 0.1 and 0.5, for example 0.1, 0.15, or 0.3; the embodiment of the present disclosure takes 0.15 as an example. If the adjacent-frame change rate exceeds 0.02 and, at the same time, the adjacent-frame content change rate exceeds 0.15, the adjacent frames are judged not stuck. Otherwise, as long as at least one of the two rates is below its corresponding stuck threshold, the adjacent video frames are judged stuck and the stuck frame is discarded. Specifically, when the adjacent-frame change rate of the video is less than 0.02 and/or the adjacent-frame content change rate of the video is less than 0.15, the adjacent video frames are judged stuck, and a video frame judged stuck is discarded.
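Combining the two rates, the per-frame decision described in this step can be sketched as follows. The threshold values 0.02 and 0.15 are the examples used above; the function name is illustrative:

def is_stuck(change_rate, content_rate,
             first_stuck_threshold=0.02, second_stuck_threshold=0.15):
    # Adjacent frames are judged not stuck only when BOTH rates
    # reach their thresholds; otherwise the frame counts as stuck.
    return not (change_rate >= first_stuck_threshold
                and content_rate >= second_stuck_threshold)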
In addition, the embodiment of the present disclosure further judges the stutter of video frames through the video perceived frame rate. The temporal sensitivity and resolution of human vision vary with the type and characteristics of the visual stimulus and differ between individuals. The video frame rate is the number of video frames per second and directly determines the fluency of playback; the video perceived frame rate is the frame rate as perceived by the human eye. The video stutter addressed in the embodiment of the present disclosure is perceived and verified by human eyes: the larger the perceived frame rate, the smoother the video; conversely, the smaller the perceived frame rate, the more obvious the stutter. The human eye perceives stutter when frames are output slowly and unevenly; above a certain threshold (generally 24 fps), with frames output evenly, stutter is generally not perceptible, and the higher the frame rate, the smoother the video. The human visual system can process 10 to 12 images per second and perceive them individually, while higher rates are perceived as motion. Most study participants consider modulated light (e.g., a computer display) stable when the rate is above 50 Hz to 90 Hz; this perception of modulated light as steady is known as the flicker fusion threshold. However, when the modulated light is non-uniform and contains an image, the flicker fusion threshold can be much higher, in the hundreds of hertz. With regard to image recognition, people have been found to recognize a specific image in an uninterrupted series of different images when each lasts as little as 13 milliseconds. Persistence of vision sometimes causes a very short single-millisecond visual stimulus to have a perceived duration of between 100 and 400 milliseconds, and multiple very short stimuli are sometimes perceived as a single stimulus; for example, a 10-millisecond green flash followed by a 10-millisecond red flash is perceived as a single yellow flash.
Because the end of a video may contain a black screen or other still pictures, in order to calculate the video perceived frame rate accurately, the final perceived frame rate algorithm result is the number of non-stuck frames of the video divided by the playing time of the last non-stuck frame.
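A minimal sketch of this final calculation, assuming frames is a list of (timestamp_seconds, is_stuck) pairs in playback order (the name perceived_frame_rate is illustrative):

def perceived_frame_rate(frames):
    # Number of non-stuck frames divided by the playing time of the
    # last non-stuck frame, so still pictures at the end are ignored.
    smooth_times = [ts for ts, stuck in frames if not stuck]
    if not smooth_times or smooth_times[-1] <= 0:
        return 0.0
    return len(smooth_times) / smooth_times[-1]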
The embodiment of the disclosure provides two perceived frame rate verification schemes:
1. Calculate the perceived frame rate of the video, where the perceived frame rate is the number of non-stuck frames divided by the playing time of the last non-stuck frame. Record the video with an on-screen stopwatch for timing, compare the perceived frame rate of the recorded video with its actual frame rate, and thereby verify the perceived frame rate. Specifically, to verify that the perceived frame rate algorithm is unaffected when the video content changes continuously, a running stopwatch is recorded at 60 fps; the actual frame rate of the recorded MP4 video is 60 fps, and the computed perceived frame rate is 60 fps.
2. To verify that the perceived frame rate algorithm can accurately calculate the perceived frame rate when the video content changes only slightly, a 10 fps recording is transcoded to 30 fps. The recorded video is 1080p_10fps_h264_0.mp4; the actual frame rate of the video is 30 fps, and the computed perceived frame rate is 10.1 fps.
Fig. 6 is a schematic diagram of a video frame rate detection apparatus based on visual perception according to another embodiment of the disclosure. The video frame rate detection device based on visual perception comprises:
an extraction module 601, configured to extract the presentation timestamp of each frame of a video and determine adjacent frames of the video according to the timestamps;
a conversion module 602, configured to convert the three RGB channels of each frame of the video into a single grayscale channel and calculate the grayscale pixel values of the single grayscale channel;
a first calculation module 603, configured to calculate the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ;
a second calculation module 604, configured to calculate the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences;
a first detection module 605, configured to detect the relationship between the adjacent-frame change rate of the video and a first stuck threshold;
a second detection module 606, configured to detect the relationship between the adjacent-frame content change rate of the video and a second stuck threshold; and
a judging module 607, configured to judge that the adjacent video frames are not stuck when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold.
Wherein:
the extracting module 601 is configured to extract a timestamp for playing each frame of the video, and determine adjacent frames of the video according to the timestamp.
The video playing time stamp is a time point of each frame display after the video frame is decoded, the user calculates the playing time of the video frame, the display time of each frame of the video is not constant but dynamically changed, therefore, the time of calculating the video frame through the playing time stamp is more accurate, and the adjacent frame of the video is determined according to the time point and the playing time of each frame. The method and the device for obtaining the frame rate of the video have the advantages that the Opencv is adopted to read the frame data in the video, the video is provided with the time stamp of the video frame data, the time stamp obtained by the Opencv enables a user to know the position of the frame data in a video file, and the frame rate of each video can be obtained through the Opencv.
The conversion module 602 is configured to convert the RGB three channels of each frame of the video into a single grayscale channel, and calculate a grayscale pixel point value of the single grayscale channel.
A color video frame has three channels: R (red), G (green), and B (blue). Figs. 2-5 illustrate an example of converting the three RGB channels of a video frame into a single GRAY channel. First, a sample point is marked on a color image containing the R, G, and B channels (an orange object on a blue background; the red x in the figure, shown in grayscale in this publication). Next, as shown in figs. 3-5, the three RGB channels are separated and the grayscale pixel values are obtained; the grayscale pixel values of the three-channel image are a combination of the grayscale pixel values of the three single channels. Grayscale pixel values range from 0 to 255 in each channel; the larger the value, the brighter the image appears, and the smaller the value, the darker. Wherever a color dominates a region of the three-channel image, that color component is larger there, and that region appears brighter in the corresponding single channel. In this embodiment, each RGB channel component has a corresponding weight; the three RGB channels of each frame of the video are converted into a single grayscale channel, and the grayscale pixel value of the single grayscale channel is calculated with the conversion formula:
GRAY=R*0.299+G*0.587+B*0.114;
where GRAY is the grayscale pixel value of the single grayscale channel, R is the red channel component, G is the green channel component, and B is the blue channel component.
The first calculation module 603 is configured to calculate the adjacent-frame change rate of the video from the proportion of grayscale pixel values that differ.
If adjacent frames in the video are continuous, the viewer does not perceive stutter; if frames are skipped or adjacent frames differ too much, the video feels stuck. The adjacent-frame change rate is the proportion of pixel positions whose grayscale values differ between corresponding positions of adjacent frames relative to the total number of pixels, and it ranges between 0 and 1. The larger the adjacent-frame change rate, the more pixels change between the adjacent frame images, and the more easily the human eye perceives the change between video frames.
The second calculation module 604 is configured to calculate the adjacent-frame content change rate of the video from the ratio based on the sum of the grayscale pixel value differences.
The adjacent-frame content change rate is the ratio of the sum of the grayscale value differences at corresponding positions of adjacent video frames to the number of positions whose values differ, and it ranges between 0 and 1. The larger this rate, the greater the change in image content between adjacent frames, and the more easily the human eye perceives the change between video frames.
A first detecting module 605, configured to detect a relationship between a rate of change of adjacent frames of the video and a first stuck threshold.
Here, the relationship between the adjacent-frame change rate of the video and the first stuck threshold is detected first. The first stuck threshold is set between 0.01 and 0.1, for example 0.01, 0.02, or 0.05. Taking 0.02 as an example, the adjacent-frame change rate must exceed 0.02; on that basis, if the adjacent-frame content change rate also exceeds the second stuck threshold, the adjacent video frames are judged not stuck. Otherwise, as long as at least one of the adjacent-frame change rate or the adjacent-frame content change rate is below its corresponding stuck threshold, the adjacent video frames are judged stuck, and the stuck frame is discarded.
A second detecting module 606, configured to detect a relationship between a content change rate of adjacent frames of the video and a second stuck threshold.
After the relationship between the adjacent-frame change rate and the first stuck threshold has been detected, the relationship between the adjacent-frame content change rate of the video and the second stuck threshold is further detected. Adjacent video frames are judged not stuck only when the adjacent-frame change rate and the adjacent-frame content change rate both meet their thresholds at the same time. The second stuck threshold is set between 0.1 and 0.5, for example 0.1, 0.15, or 0.3. Taking 0.15 as an example, the adjacent-frame content change rate must exceed 0.15, but this condition alone cannot establish that the adjacent frames are not stuck; the adjacent-frame change rate must simultaneously exceed the first stuck threshold. Otherwise, as long as at least one of the two rates is below its corresponding stuck threshold, the adjacent video frames are judged stuck, and the stuck frame is discarded.
The determining module 607 is configured to determine that the adjacent video frame is not stuck when the adjacent frame change rate is greater than or equal to the first stuck threshold and the adjacent frame content change rate is greater than or equal to the second stuck threshold.
Adjacent video frames are judged not stuck only when the adjacent-frame change rate and the adjacent-frame content change rate both meet their thresholds at the same time, that is, when the adjacent-frame change rate is greater than or equal to the first stuck threshold and the adjacent-frame content change rate is greater than or equal to the second stuck threshold. The first stuck threshold is set between 0.01 and 0.1, for example 0.01, 0.02, or 0.05; taking 0.02 as an example, the adjacent-frame change rate must exceed 0.02. The second stuck threshold is set between 0.1 and 0.5, for example 0.1, 0.15, or 0.3; the embodiment of the present disclosure takes 0.15 as an example. If the adjacent-frame change rate exceeds 0.02 and, at the same time, the adjacent-frame content change rate exceeds 0.15, the adjacent frames are judged not stuck. Otherwise, as long as at least one of the two rates is below its corresponding stuck threshold, the adjacent video frames are judged stuck and the stuck frame is discarded. Specifically, when the adjacent-frame change rate of the video is less than 0.02 and/or the adjacent-frame content change rate of the video is less than 0.15, the adjacent video frames are judged stuck, and a video frame judged stuck is discarded.
The judging module 607 is further configured to judge that the adjacent video frames are stuck when the adjacent-frame change rate of the video is smaller than the first stuck threshold and/or the adjacent-frame content change rate of the video is smaller than the second stuck threshold;
and to discard the video frame when the adjacent video frames are judged to be stuck.
The video frame rate detection device based on visual perception further comprises:
and the visual perception frame rate calculation module is used for calculating the video perception frame rate of the video, wherein the video perception frame rate is the number of frames which are not blocked by the video divided by the playing time of the last frame which is not blocked by the video.
The video frame rate detection device based on visual perception further comprises:
and the verification module is used for recording the video in a recording timing mode, comparing the sensing frame rate and the actual frame rate of the recorded video and verifying the sensing frame rate of the recorded video.
The apparatus shown in fig. 6 can perform the method of the embodiment shown in fig. 1, and reference may be made to the related description of the embodiment shown in fig. 1 for a part of this embodiment that is not described in detail. The implementation process and technical effect of the technical solution refer to the description in the embodiment shown in fig. 1, and are not described herein again.
Referring now to FIG. 7, shown is a block diagram of an electronic device 700 suitable for use in implementing another embodiment of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 700 may include a processing means (e.g., central processing unit, graphics processor, etc.) 701 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from storage 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the electronic apparatus 700 are also stored. The processing apparatus 701, the ROM 702, and the RAM 703 are connected to each other via a communication line 704. An input/output (I/O) interface 705 is also connected to the communication line 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication means 709 may allow the electronic device 700 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device 700 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be alternatively implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication means 709, or may be installed from the storage means 708, or may be installed from the ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the video frame rate detection method in the above embodiments.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the preceding first aspects.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer instructions for causing a computer to perform the method of any of the foregoing first aspects.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to technical solutions formed by the particular combination of features described above, but also encompasses other technical solutions formed by any combination of the features described above or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by substituting the features described above with (but not limited to) features having similar functions disclosed in this disclosure.

Claims (12)

1. A video frame rate detection method based on visual perception is characterized by comprising the following steps:
extracting a time stamp played by each frame of the video, and determining adjacent frames of the video according to the time stamp;
converting RGB three channels of each frame of the video into a gray single channel, and calculating gray pixel point values of the gray single channel;
calculating the adjacent frame change rate of the video according to the proportion of gray pixel point values that differ between the frames;
calculating the adjacent frame content change rate of the video according to the sum of the differences of the gray pixel point values;
detecting the relationship between the adjacent frame change rate of the video and a first stuck threshold;
detecting the relationship between the adjacent frame content change rate of the video and a second stuck threshold; and
when the adjacent frame change rate is greater than or equal to the first stuck threshold and the adjacent frame content change rate is greater than or equal to the second stuck threshold, determining that the adjacent video frames are not stuck.
2. The method of claim 1, wherein the timestamp at which each frame of the video is played is the time point at which the frame is displayed after being decoded, the time point being used to calculate the playing time of the video frame, and the adjacent frames of the video are determined according to the time point and the playing time of each frame.
3. The method according to claim 1, wherein the RGB three channels of each frame of the video are converted into a gray single channel, and the gray pixel point value of the gray single channel is calculated by the following formula:
GRAY = R × 0.299 + G × 0.587 + B × 0.114;
wherein GRAY is the gray pixel point value of the gray single channel, R is the red channel component, G is the green channel component, and B is the blue channel component.
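For illustration only and not part of the claims: a minimal Python/NumPy sketch of the grayscale conversion in claim 3, assuming each decoded frame is an H×W×3 uint8 RGB array; the function name to_gray is ours, not the patent's.

```python
import numpy as np

def to_gray(frame_rgb: np.ndarray) -> np.ndarray:
    # Weighted sum of the R, G, B channel components, per the claimed formula
    r = frame_rgb[..., 0].astype(np.float64)
    g = frame_rgb[..., 1].astype(np.float64)
    b = frame_rgb[..., 2].astype(np.float64)
    gray = r * 0.299 + g * 0.587 + b * 0.114
    # Gray pixel point values lie in 0..255 (256 possible values)
    return gray.astype(np.uint8)
```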
4. The method according to claim 1, wherein the adjacent frame change rate is the ratio of the number of corresponding positions at which the gray pixel point values of adjacent frames of the video are unequal to the total number of pixels, and the specific calculation formula is as follows:
$$\text{adjacent frame change rate} = \frac{1}{H \times W} \sum_{i=1}^{H} \sum_{j=1}^{W} \mathbf{1}\left(p_{ij} \neq p'_{ij}\right)$$
where H and W denote the height and width of the video frame, $p_{ij}$ and $p'_{ij}$ denote the gray pixel point values at image coordinates (i, j) of the two adjacent frames respectively, and $\mathbf{1}(\cdot)$ equals 1 when its condition holds and 0 otherwise; the adjacent frame change rate ranges between 0 and 1.
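A sketch (again illustrative, not claimed code) of the adjacent frame change rate of claim 4, under the same array assumptions as above: count the corresponding positions whose gray pixel point values differ and divide by the total pixel count H × W.

```python
import numpy as np

def adjacent_frame_change_rate(gray_a: np.ndarray, gray_b: np.ndarray) -> float:
    # Fraction of corresponding positions whose gray values are unequal; in [0, 1]
    h, w = gray_a.shape
    return float(np.count_nonzero(gray_a != gray_b)) / (h * w)
```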
5. The method according to claim 1, wherein the adjacent frame content change rate is the ratio of the sum of the differences of the gray pixel point values at corresponding positions of adjacent frames of the video to the number of corresponding positions at which the gray pixel point values are unequal, and the specific calculation formula is as follows:
$$\text{adjacent frame content change rate} = \frac{\sum_{i=1}^{H} \sum_{j=1}^{W} \left|p_{ij} - p'_{ij}\right|}{256 \times \sum_{i=1}^{H} \sum_{j=1}^{W} \mathbf{1}\left(p_{ij} \neq p'_{ij}\right)}$$
where H and W denote the height and width of the video frame, and $p_{ij}$ and $p'_{ij}$ denote the gray pixel point values at image coordinates (i, j) of the two adjacent frames respectively; the factor 256 appears in the denominator because there are 256 possible gray pixel values, so the adjacent frame content change rate ranges between 0 and 1.
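Likewise for claim 5, a sketch of the adjacent frame content change rate: the sum of absolute gray differences, scaled by 256 times the number of unequal positions so the result stays in [0, 1]. Returning 0 when the frames are identical is our assumption; the claim does not define that case.

```python
import numpy as np

def adjacent_frame_content_change_rate(gray_a: np.ndarray, gray_b: np.ndarray) -> float:
    diff_mask = gray_a != gray_b
    n_unequal = int(np.count_nonzero(diff_mask))
    if n_unequal == 0:
        return 0.0  # identical frames; the claim leaves this case undefined
    # Cast to a signed type before subtracting to avoid uint8 wraparound
    abs_diff = np.abs(gray_a.astype(np.int64) - gray_b.astype(np.int64))
    return float(abs_diff[diff_mask].sum()) / (256.0 * n_unequal)
```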
6. The method of claim 1, wherein the first stuck threshold is set between 0.01 and 0.1, and the second stuck threshold is set between 0.1 and 0.5.
7. The method of claim 1, further comprising:
when the adjacent frame change rate of the video is smaller than the first stuck threshold and/or the adjacent frame content change rate of the video is smaller than the second stuck threshold, determining that the adjacent video frames are stuck;
and when the adjacent video frames are determined to be stuck, discarding the stuck video frame.
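Combining claims 1, 6, and 7, a sketch of the stuck decision; the default thresholds 0.05 and 0.3 are merely example values picked from the ranges recited in claim 6, not values the patent mandates.

```python
def is_stuck(change_rate: float, content_change_rate: float,
             first_stuck_threshold: float = 0.05,
             second_stuck_threshold: float = 0.3) -> bool:
    # Not stuck only when BOTH rates reach their thresholds (claim 1);
    # stuck when either rate falls below its threshold (claim 7)
    return (change_rate < first_stuck_threshold
            or content_change_rate < second_stuck_threshold)
```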
8. The method of claim 1, further comprising:
calculating a video perception frame rate of the video;
wherein the video perception frame rate is the number of non-stuck frames of the video divided by the playing time of the last non-stuck frame.
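A sketch of the video perception frame rate of claim 8; the parameter names are illustrative.

```python
def video_perception_frame_rate(non_stuck_frame_count: int,
                                last_non_stuck_play_time_s: float) -> float:
    # Number of non-stuck frames divided by the playing time (in seconds)
    # of the last non-stuck frame
    if last_non_stuck_play_time_s <= 0:
        raise ValueError("playing time must be positive")
    return non_stuck_frame_count / last_non_stuck_play_time_s
```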
9. The method of claim 8, further comprising:
recording the video in a timed-recording mode, comparing the perception frame rate of the recorded video with its actual frame rate, and thereby verifying the perception frame rate of the recorded video.
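One plausible reading of the verification in claim 9, sketched below: record the video for a known duration, compute the actual frame rate from the recording, and check the perception frame rate against it. The tolerance value is our assumption; the claim does not specify one.

```python
def verify_perception_frame_rate(perception_fps: float,
                                 recorded_frame_count: int,
                                 recorded_duration_s: float,
                                 tolerance_fps: float = 1.0) -> bool:
    # Actual frame rate of the timed recording
    actual_fps = recorded_frame_count / recorded_duration_s
    return abs(perception_fps - actual_fps) <= tolerance_fps
```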
10. A video frame rate detection device based on visual perception, comprising:
the extraction module is used for extracting a time stamp played by each frame of the video and determining adjacent frames of the video according to the time stamp;
the conversion module is used for converting the RGB three channels of each frame of the video into a gray single channel and calculating a gray pixel point value of the gray single channel;
the first calculation module is used for calculating the adjacent frame change rate of the video according to the proportion of gray pixel point values that differ between the frames;
the second calculation module is used for calculating the adjacent frame content change rate of the video according to the sum of the differences of the gray pixel point values;
the first detection module is used for detecting the relationship between the adjacent frame change rate of the video and a first stuck threshold;
the second detection module is used for detecting the relationship between the adjacent frame content change rate of the video and a second stuck threshold; and
the determination module is used for determining that the adjacent video frames are not stuck when the adjacent frame change rate is greater than or equal to the first stuck threshold and the adjacent frame content change rate is greater than or equal to the second stuck threshold.
11. An electronic device, comprising:
a memory for storing computer readable instructions; and
a processor configured to execute the computer-readable instructions to cause the electronic device to implement the method according to any one of claims 1-9.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a program which, when executed, is capable of implementing the method according to any one of claims 1-9.
CN202111088246.5A 2021-09-16 2021-09-16 Video frame rate detection method and device based on visual perception Pending CN115834952A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111088246.5A CN115834952A (en) 2021-09-16 2021-09-16 Video frame rate detection method and device based on visual perception

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111088246.5A CN115834952A (en) 2021-09-16 2021-09-16 Video frame rate detection method and device based on visual perception

Publications (1)

Publication Number Publication Date
CN115834952A (en) 2023-03-21

Family

ID=85515135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111088246.5A Pending CN115834952A (en) 2021-09-16 2021-09-16 Video frame rate detection method and device based on visual perception

Country Status (1)

Country Link
CN (1) CN115834952A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117255222A (en) * 2023-11-20 2023-12-19 上海科江电子信息技术有限公司 Digital television monitoring method, system and application

Similar Documents

Publication Publication Date Title
CN114584849B (en) Video quality evaluation method, device, electronic equipment and computer storage medium
US10957024B2 (en) Real time tone mapping of high dynamic range image data at time of playback on a lower dynamic range display
US20150213586A1 (en) Image processing apparatus, image processing method, display apparatus, and control method for display apparatus
US9204086B2 (en) Method and apparatus for transmitting and using picture descriptive information in a frame rate conversion processor
US20170127011A1 (en) Semiconductor integrated circuit, display device provided with same, and control method
EP2599310B1 (en) Method and apparatus for measuring video quality
CN102572502B (en) Selecting method of keyframe for video quality evaluation
US20120044359A1 (en) Frame rate measurement
WO2018012729A1 (en) Display device and text recognition method for display device
KR100719841B1 (en) Method for creation and indication of thumbnail view
CN112788329A (en) Video static frame detection method and device, television and storage medium
CN115834952A (en) Video frame rate detection method and device based on visual perception
US8797308B2 (en) Method of driving display apparatus and driving circuit for display apparatus using the same
US11221477B2 (en) Communication apparatus, communication system, and data communication method
CN115439660A (en) Detection method, detection device, electronic equipment and medium
CN111541940B (en) Motion compensation method and device for display equipment, television and storage medium
CN114979652A (en) Video processing method and device, electronic equipment and storage medium
EP3998770A1 (en) Image processing device and image processing method
US20130120549A1 (en) Display processing apparatus and display processing method
WO2019196573A1 (en) Streaming media transcoding method and apparatus, and computer device and readable medium
Lu et al. Perceptual quality evaluation on periodic frame-dropping video
US8432976B2 (en) System and method for detecting field order of video sequences
CN111179317A (en) Interactive teaching system and method
US20150269904A1 (en) Image processing device and method thereof
US11908340B2 (en) Magnification enhancement of video for visually impaired viewers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination