CN107360474A - Video stall-frame detection method based on local texture features and global brightness features - Google Patents
- Publication number
- CN107360474A (application CN201710710941.8A)
- Authority
- CN
- China
- Prior art keywords
- frame
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/647—Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
- H04N21/64723—Monitoring of network processes or resources, e.g. monitoring of network load
- H04N21/6473—Monitoring network processes errors
- H04N21/64738—Monitoring network characteristics, e.g. bandwidth, congestion level
Abstract
The present invention discloses a video stall-frame detection method based on local texture features and global brightness features, relating to the field of image and video processing. The method comprises the following steps. S1: extract the local texture feature and the global brightness feature of each frame of a network video. S2: detect stall frames according to changes in the texture and brightness features of the video sequence. The invention solves the following problem: when access to network parameters is restricted, the user terminal cannot directly determine the position and time of stall frames from the received network stream.
Description
Technical field
The present invention relates to the field of image and video processing, and in particular to a video stall-frame detection method based on local texture features and global brightness features.
Background technology
In recent years, with the continuous progress of network technology and intelligent video capture devices, streaming video has grown rapidly in scale and has become an important medium for daily communication and information acquisition. Clear and smooth playback is an essential guarantee that users can obtain information from video comfortably and accurately. However, under the influence of limited bandwidth, noise interference, compression, and other post-processing operations, streaming video quality degrades to varying degrees, which substantially lowers the user experience of video delivery services and may even cause key information to be lost or misjudged. Developing an effective stall-frame detection method for streaming video, one that promptly captures and reports changes in network delay and transmission errors, has therefore become a key problem that the network video service industry urgently needs to solve.
Traditional video stall-frame detection methods rely on parsing network parameters (such as packet loss rate, network throughput, and buffer variation). However, in application environments where network data are encrypted and access rights are limited, the user terminal cannot directly determine the position and time of network stalls from the received network stream, which greatly hinders the monitoring of, and feedback on, network transmission conditions.
Summary of the invention
The object of the present invention is as follows. Traditional video quality assessment methods suffer from the problem that the user terminal cannot directly determine the position and time of network stalls from the received network stream. To solve this problem, the present invention provides a streaming-video stall-frame detection method based on local texture and global brightness.
The technical scheme of the present invention is as follows:
A video stall-frame detection method based on local texture features and global brightness features comprises the following steps:
S1: extract the local texture feature and the global brightness feature of each frame of a network video;
S2: detect stall frames according to changes in the texture and brightness features of the video sequence;
S21: compute a sequence number for each frame based on its local texture feature;
S22: compute a sequence number for each frame based on its global brightness feature;
S23: for the local texture feature, when more than a certain number of video frames with identical sequence numbers occur consecutively, and the standard deviation of the local texture features of these frames is below a certain threshold, these frames may be stall frames;
for the global brightness feature, when more than a certain number of video frames with identical sequence numbers occur consecutively, and the standard deviation of the global brightness features of these frames is below a certain threshold, these frames may be stall frames;
S24: combine the two results of S23 to obtain the final stall-frame decision.
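The overall flow of steps S1 through S24 can be sketched as follows. This is a minimal illustration, not the patent's implementation: function names, the threshold values, and the stubbed feature values are all assumptions for the sake of a runnable example.

```python
def sequence_numbers(values, threshold):
    """S21/S22 (simplified): assign a new sequence number only when the
    feature jumps by more than the threshold relative to the last jump."""
    nums, base = [1], values[0]
    for v in values[1:]:
        if abs(v - base) > threshold:
            nums.append(nums[-1] + 1)
            base = v
        else:
            nums.append(nums[-1])
    return nums

def decide_stalls(nums, min_run):
    """S23 (simplified): flag frames whose sequence number repeats for
    at least min_run consecutive frames."""
    flags, start = [0] * len(nums), 0
    for end in range(1, len(nums) + 1):
        if end == len(nums) or nums[end] != nums[start]:
            if end - start >= min_run:
                for k in range(start, end):
                    flags[k] = 1
            start = end
    return flags

# Stubbed per-frame (texture, brightness) features: four repeated
# frames (a stall) followed by one changed frame.
feats = [(0.2, 0.5)] * 4 + [(0.8, 0.9)]
alphas = [a for a, _ in feats]
betas = [b for _, b in feats]
s_v = decide_stalls(sequence_numbers(alphas, 0.1), min_run=3)
s_b = decide_stalls(sequence_numbers(betas, 0.1), min_run=3)
stalls = [int(v and b) for v, b in zip(s_v, s_b)]  # S24: logical AND
print(stalls)  # [1, 1, 1, 1, 0]
```

The design follows the document's two-channel structure: texture and brightness each produce their own decision, and a frame is reported only when both channels agree.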
Specifically, S1 comprises the following steps:
S11: compute the number G_i of boundary pixels whose brightness value is higher than that of the center pixel:

T(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}    (1)

G_i = \sum_{j=1}^{4} T(I_j - I_i)    (2)

where T(x) denotes the unit step function, and I_i and I_j denote the brightness value of the center pixel of the 3 × 3 local block centered on the i-th pixel and the brightness value of its j-th boundary pixel, respectively; the number of boundary pixels is 4.
S12: compute the local texture feature α_m:

\delta(x) = \begin{cases} 1, & x = 0 \\ 0, & x \ne 0 \end{cases}    (3)

\alpha_m = \frac{\sum_{i=1}^{N} \delta(G_i)}{N}    (4)

where α_m is the local texture feature of the m-th frame, δ(x) denotes the unit impulse function, and N denotes the number of pixels in the video frame.
S13: compute the global brightness feature β_m:

\beta_m = \frac{\sum_{i=1}^{N} I_i}{N}    (5)

S14: normalize α_m and β_m to [0, 1] according to:

\alpha_m = \frac{\alpha_m - \alpha_{\min}}{\alpha_{\max} - \alpha_{\min}}, \quad \beta_m = \frac{\beta_m - \beta_{\min}}{\beta_{\max} - \beta_{\min}}    (6)

where α_min = min{α_1, α_2, ..., α_M}, α_max = max{α_1, α_2, ..., α_M} (and likewise for β), and M denotes the number of frames.
Further, the specific steps of S2 include:
S21: compute the sequence number l_v(m) of the m-th frame based on the texture feature:

l_v(m) = \begin{cases} l_v(m-1) + 1, & |\alpha_m - \alpha_{base}| > t_v \\ l_v(m-1), & \text{otherwise} \end{cases}    (7)

where α_base is the baseline texture feature and t_v = 1 is the texture-feature threshold; for the initial frame, l_v(1) = 1 and α_base = α_1.
In subsequent frames, the baseline texture feature is updated with changes in the texture-feature amplitude according to:

\alpha_{base} = \begin{cases} \alpha_m, & |\alpha_m - \alpha_{base}| > t_v \\ \alpha_{base}, & \text{otherwise} \end{cases}    (8)

S22: compute the sequence number of the m-th frame (m > 1) based on the global brightness feature, where β_base1 and β_base2 are two baseline brightness features for the brightness of the m-th frame; for the initial frame, β_base1 = β_1 + t_b2 + 1 and β_base2 = β_1, and t_b1 = 5 and t_b2 = 1 denote two global brightness thresholds.
In subsequent frames, β_base1 and β_base2 are updated with changes in the brightness amplitude.
S23: determine the stall-frame result according to the texture-based stall decision function s_v(m) and the brightness-based stall decision function s_b(m), where s_v(m) = 1 indicates that, judged by the texture feature, the m-th frame is identified as a stall frame, and s_v(m) = 0 indicates that it is not; the same definition applies to s_b(m). Here u denotes any video sequence number in l_v(m) or l_b(m), Ω denotes the set of all video frames in l_v(m) or l_b(m) whose sequence number equals u, std(·) denotes the standard-deviation operator, and T_c denotes the threshold on the number of consecutive occurrences of video frames with the same sequence number, determined by the frame rate f; for example, if the video frame rate is 30 frames per second, T_c is set to 30. For l_v(m), when the number of consecutive video frames with the same sequence number exceeds the threshold T_c = f, where f denotes the video frame rate, these frames may be stall frames. S(m) = s_v(m) ∧ s_b(m) denotes the final stall-frame decision, where the symbol ∧ denotes the logical AND operation.
With the above scheme, the beneficial effects of the present invention are as follows:
(1) The present invention proposes a method that detects stall frames using image texture and brightness, eliminating the dependence on network traffic data.
(2) When computing stall frames, if the time interval between two neighboring stall segments is smaller than a given threshold, the two segments are merged. This prevents a long stall caused by delay at consecutive positions from being mistakenly split into multiple short segments, and makes the stall-frame computation more accurate.
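The merging rule in (2) can be sketched as follows. This is a minimal illustration: segments are represented here as (start, end) frame indices, and the gap threshold `max_gap` is an assumed parameter, since the document does not fix its value.

```python
def merge_stall_segments(segments, max_gap):
    """Merge neighboring stall segments whose gap (in frames) is smaller
    than max_gap, so that one long stall is not reported as several
    short ones."""
    merged = []
    for start, end in sorted(segments):
        if merged and start - merged[-1][1] < max_gap:
            # Gap too small: extend the previous segment.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_stall_segments([(10, 40), (45, 70), (200, 230)], max_gap=15))
# [(10, 70), (200, 230)]
```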
(3) The present invention has been tested on the LIVE Mobile Video Stall Database-I of the University of Texas. The results show that the stall-frame detection method constructed in the present invention can accurately capture the position and time of network delays in real time.
(4) The present invention normalizes the local texture feature α_m and the global brightness feature β_m, because changes in video content cause the inter-frame texture and brightness gradients to vary. For example, the brightness gradient during a stall is larger in a video shot on a sunny day, whereas it is smaller in a video shot on a cloudy day. It is therefore difficult to identify stall frames with a single unified threshold. Normalization avoids the brightness and texture-gradient fluctuations caused by such changes in video content.
Brief description of the drawings
Fig. 1 is the algorithm flow chart of the embodiment of the present invention;
Fig. 2 is the neighboring-pixel relation diagram used to describe the local texture in the embodiment of the present invention.
Embodiment
The technical scheme in the embodiments of the present invention is described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention fall within the scope of protection of the present invention.
The video stall-frame detection method based on local texture features and global brightness features in this embodiment comprises the following steps:
S1: extract the local texture feature and the global brightness feature of each frame of the network video. The specific steps of S1 are:
S11: compute the number G_i of boundary pixels whose brightness value is higher than that of the center pixel:

T(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}    (1)

G_i = \sum_{j=1}^{4} T(I_j - I_i)    (2)

where T(x) denotes the unit step function and, as shown in Fig. 2, I_i and I_j denote the brightness value of the center pixel of the 3 × 3 local block centered on the i-th pixel and the brightness value of its j-th boundary pixel, respectively; the number of boundary pixels is 4.
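Step S11 can be sketched as follows. This is a minimal illustration under two stated assumptions: the 4 boundary pixels are taken to be the up, down, left, and right neighbors (consistent with a 3 × 3 block but not spelled out in the text), and the one-pixel image border is skipped. The function name is illustrative.

```python
import numpy as np

def boundary_count(gray):
    """For each interior pixel i, count the 4-neighbors (up, down, left,
    right) whose brightness is at least that of the center pixel, i.e.
    G_i = sum_j T(I_j - I_i) with T the unit step function of (1)-(2)."""
    c = gray[1:-1, 1:-1].astype(np.int32)          # center pixels I_i
    neighbors = (gray[:-2, 1:-1], gray[2:, 1:-1],  # up, down
                 gray[1:-1, :-2], gray[1:-1, 2:])  # left, right
    g = np.zeros_like(c)
    for n in neighbors:
        g += (n.astype(np.int32) - c >= 0)         # T(I_j - I_i)
    return g

gray = np.array([[10, 10, 10],
                 [10, 20, 10],
                 [10, 10, 10]], dtype=np.uint8)
print(boundary_count(gray))  # [[0]] : no neighbor is as bright as 20
```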
S12: compute the local texture feature α_m:

\delta(x) = \begin{cases} 1, & x = 0 \\ 0, & x \ne 0 \end{cases}    (3)

\alpha_m = \frac{\sum_{i=1}^{N} \delta(G_i)}{N}    (4)

where α_m is the local texture feature of the m-th frame, δ(x) denotes the unit impulse function, and N denotes the number of pixels in the video frame.
S13: compute the global brightness feature β_m:

\beta_m = \frac{\sum_{i=1}^{N} I_i}{N}    (5)

S14: normalize α_m and β_m to [0, 1] according to:

\alpha_m = \frac{\alpha_m - \alpha_{\min}}{\alpha_{\max} - \alpha_{\min}}, \quad \beta_m = \frac{\beta_m - \beta_{\min}}{\beta_{\max} - \beta_{\min}}    (6)

where α_min = min{α_1, α_2, ..., α_M}, α_max = max{α_1, α_2, ..., α_M} (and likewise for β), and M denotes the number of frames.
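Steps S12 through S14 can be sketched as follows. This is a minimal illustration under stated assumptions: the per-pixel counts G_i are computed only over interior pixels (so N here is the interior pixel count, slightly smaller than the full frame), and the function names are illustrative.

```python
import numpy as np

def frame_features(gray):
    """S12-S13: local texture alpha_m = fraction of pixels with G_i = 0
    (formula (4), delta(G_i) averaged over pixels); global brightness
    beta_m = mean brightness (formula (5))."""
    c = gray[1:-1, 1:-1].astype(np.int32)
    g = sum((n.astype(np.int32) - c >= 0).astype(np.int32)
            for n in (gray[:-2, 1:-1], gray[2:, 1:-1],
                      gray[1:-1, :-2], gray[1:-1, 2:]))
    alpha = float(np.mean(g == 0))   # delta(G_i) averaged over N pixels
    beta = float(gray.mean())        # mean brightness
    return alpha, beta

def normalize(xs):
    """S14: min-max normalization of a feature sequence to [0, 1]."""
    xs = np.asarray(xs, dtype=np.float64)
    span = xs.max() - xs.min()
    return (xs - xs.min()) / span if span > 0 else np.zeros_like(xs)

frames = [np.full((8, 8), v, dtype=np.uint8) for v in (60, 60, 60, 120)]
alphas, betas = zip(*(frame_features(f) for f in frames))
print(normalize(betas))  # the three stalled frames share one value, 0
```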
S2: detect stall frames according to the texture and brightness features of each frame. The specific steps of S2 include:
S21: compute the sequence number l_v(m) of the m-th frame based on the local texture feature:

l_v(m) = \begin{cases} l_v(m-1) + 1, & |\alpha_m - \alpha_{base}| > t_v \\ l_v(m-1), & \text{otherwise} \end{cases}    (7)

where α_base is the baseline texture feature and t_v = 1 is the local texture-feature threshold; for the initial frame, l_v(1) = 1 and α_base = α_1.
The meaning of formula (7) is that the image sequence number is incremented only when the difference between the texture feature of the current frame and the baseline texture feature exceeds the threshold t_v.
In subsequent frames, the baseline texture feature is updated with changes in the texture-feature amplitude according to:

\alpha_{base} = \begin{cases} \alpha_m, & |\alpha_m - \alpha_{base}| > t_v \\ \alpha_{base}, & \text{otherwise} \end{cases}    (8)

The meaning of formula (8) is that the baseline texture feature is updated only when the difference between the texture feature of the current frame and the baseline texture feature exceeds the threshold t_v.
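Formulas (7) and (8) together amount to a simple change-triggered labeller: the sequence number and the baseline move together, and only when the feature jumps. A sketch (the function name is illustrative, and the threshold value used in the example is illustrative rather than the document's t_v = 1):

```python
def texture_sequence_numbers(alphas, t_v=0.1):
    """Formulas (7)-(8): increment the sequence number, and refresh the
    baseline, only when the texture feature differs from the baseline
    by more than t_v; otherwise the frame keeps the previous number."""
    l_v = [1]                      # l_v(1) = 1
    base = alphas[0]               # alpha_base = alpha_1
    for a in alphas[1:]:
        if abs(a - base) > t_v:    # formula (7), first branch
            l_v.append(l_v[-1] + 1)
            base = a               # formula (8): update the baseline
        else:
            l_v.append(l_v[-1])    # formula (7), second branch
    return l_v

# A stall: frames 2-4 repeat the texture value of frame 1.
print(texture_sequence_numbers([0.5, 0.5, 0.5, 0.5, 0.9]))  # [1, 1, 1, 1, 2]
```

A run of identical sequence numbers is exactly what S23 later tests for, which is why the "otherwise" branch repeats the previous number instead of assigning a fresh one.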
S22: compute the sequence number of the m-th frame (m > 1) based on the global brightness feature, where β_base1 and β_base2 are two baseline brightness features for the brightness of the m-th frame; for the initial frame, β_base1 = β_1 + t_b2 + 1 and β_base2 = β_1, and t_b1 = 5 and t_b2 = 1 denote two global brightness thresholds. (In the threshold subscripts, v marks the texture-feature threshold, while b1 and b2 mark the two-level brightness thresholds.)
The meaning of formula (9) is that if β_base1 exceeds the brightness of the current frame by more than t_b1, and the difference between β_base2 and the brightness feature of the current frame is less than t_b2, the two baseline brightness features remain unchanged; otherwise, the sequence number of the current image is incremented.
In subsequent frames, β_base1 and β_base2 are updated with changes in the brightness amplitude.
The meaning of formulas (10) and (11) is that if β_base1 exceeds the brightness of the current frame by more than t_b1, and the difference between β_base2 and the brightness feature of the current frame is less than t_b2, the two baseline brightness features remain unchanged; otherwise, β_base1 and β_base2 are updated in the order of formulas (10) and (11).
S23: determine the stall-frame result according to the texture-based stall decision function s_v(m) and the brightness-based stall decision function s_b(m). After l_v(m) and l_b(m) are obtained, when the number of consecutive video frames with the same sequence number exceeds the threshold T_c = f, where f denotes the video frame rate, these frames may be stall frames. Here s_v(m) = 1 indicates that, judged by the texture feature, the m-th frame is identified as a stall frame, and s_v(m) = 0 indicates that it is not; the same definition applies to s_b(m).
In formulas (12) and (13), u denotes any video sequence number in l_v(m) or l_b(m), Ω denotes the set of all video frames in l_v(m) or l_b(m) whose sequence number equals u, std(·) denotes the standard-deviation operator, and T_c denotes the threshold on the number of consecutive occurrences of video frames with the same sequence number, determined by the frame rate f; for example, if the video frame rate is 30 frames per second, T_c is set to 30.
The meaning of formulas (12) and (13) is that when video frames with identical sequence numbers occur consecutively more than T_c times, and the texture or brightness standard deviation of these frames is less than the threshold t_s = 0.35, these frames are determined to be stall frames.
S(m) = s_v(m) ∧ s_b(m) denotes the final stall-frame decision, where the symbol ∧ denotes the logical AND operation; that is, S(m) is set to 1 only when both the texture and the brightness decisions identify the m-th frame as a stall frame.
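The decision of S23, for one feature channel, can be sketched as follows. Formulas (12) and (13) are not reproduced in this text, so this is a hedged illustration of the stated meaning only: a frame is flagged when its sequence number is shared by at least T_c consecutive frames whose feature standard deviation falls below t_s. Function names are illustrative.

```python
import statistics

def stall_flags(seq_nums, feats, T_c=30, t_s=0.35):
    """Flag frames whose sequence number repeats for at least T_c
    consecutive frames and whose features vary with std < t_s."""
    flags, start = [0] * len(seq_nums), 0
    for end in range(1, len(seq_nums) + 1):
        if end == len(seq_nums) or seq_nums[end] != seq_nums[start]:
            run = feats[start:end]             # one run of equal numbers
            if end - start >= T_c and statistics.pstdev(run) < t_s:
                for k in range(start, end):
                    flags[k] = 1
            start = end
    return flags

def final_decision(s_v, s_b):
    """S(m) = s_v(m) AND s_b(m): both channels must agree."""
    return [int(a and b) for a, b in zip(s_v, s_b)]

# 30 frames sharing sequence number 1 with constant features = a stall.
flags = stall_flags([1] * 30 + [2], [0.5] * 31)
print(sum(flags))  # 30 frames flagged
```

Requiring a low standard deviation in addition to the run length is what separates a genuine stall (frozen, near-identical frames) from slow but real content change that happens not to cross the sequence-number threshold.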
The above is only a preferred embodiment of the present invention and does not limit the present invention in any form. Any person skilled in the art may, without departing from the scope of the technical scheme of the present invention, make reasonable changes and modifications to the technical scheme using the methods and technical content disclosed above. Therefore, any simple modification, equivalent variation, or refinement of the above embodiments that does not depart from the content of the method and technical scheme of the present invention still falls within the scope of protection of the technical scheme of the present invention.
Claims (3)
1. A video stall-frame detection method based on local texture features and global brightness features, characterized by comprising the following steps:
S1: extract the local texture feature and the global brightness feature of each frame of a network video;
S2: detect stall frames according to changes in the local texture feature and the global brightness feature of the video sequence;
S21: compute a sequence number for each frame based on its local texture feature;
S22: compute a sequence number for each frame based on its global brightness feature;
S23: for the local texture feature, when more than a certain number of video frames with identical sequence numbers occur consecutively, and the standard deviation of the local texture features of these frames is below a certain threshold, these frames may be stall frames;
for the global brightness feature, when more than a certain number of video frames with identical sequence numbers occur consecutively, and the standard deviation of the global brightness features of these frames is below a certain threshold, these frames may be stall frames;
S24: combine the two results of S23 to obtain the final stall-frame decision.
2. The video stall-frame detection method based on local texture features and global brightness features according to claim 1, characterized in that the specific steps of S1 are:
S11: compute the number G_i of boundary pixels whose brightness value is higher than that of the center pixel:
T(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}    (1)

G_i = \sum_{j=1}^{4} T(I_j - I_i)    (2)
where T(x) denotes the unit step function, and I_i and I_j denote the brightness value of the center pixel of the 3 × 3 local block centered on the i-th pixel and the brightness value of its j-th boundary pixel, respectively; the number of boundary pixels is 4;
S12: compute the local texture feature α_m:
\delta(x) = \begin{cases} 1, & x = 0 \\ 0, & x \ne 0 \end{cases}    (3)

\alpha_m = \frac{\sum_{i=1}^{N} \delta(G_i)}{N}    (4)
where α_m is the local texture feature of the m-th frame, δ(x) denotes the unit impulse function, and N denotes the number of pixels in the video frame;
S13: compute the global brightness feature β_m:
\beta_m = \frac{\sum_{i=1}^{N} I_i}{N}    (5)
S14: normalize α_m and β_m to [0, 1] according to:
\alpha_m = \frac{\alpha_m - \alpha_{\min}}{\alpha_{\max} - \alpha_{\min}}, \quad \beta_m = \frac{\beta_m - \beta_{\min}}{\beta_{\max} - \beta_{\min}}    (6)
where α_min = min{α_1, α_2, ..., α_M}, α_max = max{α_1, α_2, ..., α_M} (and likewise for β), and M denotes the number of frames.
3. The video stall-frame detection method based on local texture features and global brightness features according to claim 2, characterized in that the specific steps of S2 include:
S21: compute the sequence number l_v(m) of the m-th frame based on the texture feature:
l_v(m) = \begin{cases} l_v(m-1) + 1, & |\alpha_m - \alpha_{base}| > t_v \\ l_v(m-1), & \text{otherwise} \end{cases}    (7)
where α_base is the baseline texture feature and t_v = 1 is the texture-feature threshold; for the initial frame, l_v(1) = 1 and α_base = α_1;
In subsequent frames, the baseline texture feature is updated with changes in the texture-feature amplitude according to:
\alpha_{base} = \begin{cases} \alpha_m, & |\alpha_m - \alpha_{base}| > t_v \\ \alpha_{base}, & \text{otherwise} \end{cases}    (8)
S22: compute the sequence number l_b(m) of the m-th frame (m > 1) based on the global brightness feature, where β_base1 and β_base2 are two baseline global brightness features for the global brightness feature of the m-th frame; for the initial frame, β_base1 = β_1 + t_b2 + 1 and β_base2 = β_1, and t_b1 = 5 and t_b2 = 1 denote two global brightness thresholds;
In subsequent frames, β_base1 and β_base2 are updated with changes in the global brightness amplitude;
S23: determine the stall-frame result according to the texture-based stall decision function s_v(m) and the global-brightness-based stall decision function s_b(m), where s_v(m) = 1 indicates that, judged by the texture feature, the m-th frame is identified as a stall frame, and s_v(m) = 0 indicates that it is not; s_b(m) = 1 indicates that, judged by the global brightness feature, the m-th frame is identified as a stall frame, and s_b(m) = 0 indicates that it is not;
where u denotes any video sequence number in l_v(m) or l_b(m), Ω denotes the set of all video frames in l_v(m) or l_b(m) whose sequence number equals u, std(·) denotes the standard-deviation operator, and T_c denotes the threshold on the number of consecutive occurrences of video frames with the same sequence number, determined by the frame rate f; for l_v(m), when the number of consecutive video frames with the same sequence number exceeds the threshold T_c = f, where f denotes the video frame rate, these frames may be stall frames; for l_b(m), when the number of consecutive video frames with the same sequence number exceeds the threshold T_c = f, these frames may be stall frames;
S(m) = s_v(m) ∧ s_b(m) denotes the final stall-frame decision, where the symbol ∧ denotes the logical AND operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710710941.8A CN107360474A (en) | 2017-08-18 | 2017-08-18 | Video sluggishness frame detection method based on Local textural feature and global brightness |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107360474A true CN107360474A (en) | 2017-11-17 |
Family
ID=60287453
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710710941.8A Pending CN107360474A (en) | 2017-08-18 | 2017-08-18 | Video sluggishness frame detection method based on Local textural feature and global brightness |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107360474A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108509931A (en) * | 2018-04-11 | 2018-09-07 | 河南工学院 | Football featured videos method for catching and system |
CN108509931B (en) * | 2018-04-11 | 2021-06-01 | 河南工学院 | Football wonderful video capturing method and system |
CN110610145A (en) * | 2019-08-28 | 2019-12-24 | 电子科技大学 | Behavior identification method combined with global motion parameters |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
20171117 | RJ01 | Rejection of invention patent application after publication | |