CN103699886B - Video real-time comparison method - Google Patents


Info

Publication number
CN103699886B
CN103699886B (application CN201310705643.1A)
Authority
CN
China
Prior art keywords
frame
video
image
vector
block
Prior art date
Legal status
Active
Application number
CN201310705643.1A
Other languages
Chinese (zh)
Other versions
CN103699886A (en)
Inventor
王宗超
贾凡
兰波
党静雅
张丽君
伊然
张化良
熊永革
宗丽娜
李新生
Current Assignee
Beijing Aerospace Measurement and Control Technology Co Ltd
Original Assignee
Beijing Aerospace Measurement and Control Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Aerospace Measurement and Control Technology Co Ltd
Priority to CN201310705643.1A
Publication of CN103699886A
Application granted
Publication of CN103699886B
Legal status: Active (current)
Anticipated expiration


Abstract

The invention discloses a video real-time comparison method in which frame synchronization is first performed using the temporal features of the video images, and the videos are then compared in real time using image texture features. The method is algorithmically efficient, computationally fast, low in hardware consumption and highly real-time, and lends itself to large-scale, multi-channel video comparison. Although aimed at IPTV (Internet Protocol Television) supervision, the method can also be applied to video feature extraction, comparison, supervision and similar tasks in other services. In addition, the method can be implemented on DSP (digital signal processor) hardware, and the corresponding algorithms and solutions can also be implemented in other ways, such as on servers.

Description

Video real-time comparison method
Technical field
The present invention relates to the technical field of video image processing, and more particularly to a video real-time comparison method.
Background technology
With the development of information technology, broadband transmission speeds and video processing capabilities are improving rapidly. At the same time, as the broadband telecommunication network, the digital broadcast television network and the Internet gradually converge, large-scale video acquisition, transmission, processing and monitoring are demanded and applied ever more widely. IPTV (Internet Protocol Television) is a typical application of large-scale Internet video transmission after network convergence. Such applications are characterized by large transmission scale, many transmission links and high real-time requirements, but they are difficult to supervise.
IPTV supervision requires real-time monitoring of multi-channel video at different nodes, which calls for a method of video feature extraction and real-time comparison. Such a method analyzes the video content of different nodes in real time, extracts video image features, compares and supervises the video content in real time, confirms that the transmitted video content is consistent, and prevents the video content from being replaced or tampered with. Existing video comparison methods are complicated, computationally heavy, hardware-hungry, poorly real-time and offer little parallelism; image matching and comparison take too long, so large-scale real-time video comparison is not possible.
Content of the invention
The technical problem to be solved by the present invention is to provide a video real-time comparison method that is fast, low in hardware consumption and highly real-time, and that lends itself to large-scale, multi-channel video comparison.
The technical solution adopted by the present invention is a video real-time comparison method including:
Step 1: frame synchronization is performed on two video image streams based on image temporal features, the two video image streams coming from different transmission/collection points and having the same program source;
Step 2: the image texture features of the synchronized two video image streams are compared in real time to determine the difference between the two video image streams.
Further, before the frame synchronization, step 1 also includes:
numbering the image frames of each of the two video image streams in chronological order.
Further, step 1 specifically includes:
S1: consecutive image frames of a sampling length are selected synchronously from the two video image streams, and consecutive image frames of a comparison length are selected from the middle of the consecutive image frames of the first video image stream, referred to as the first-channel comparison frames;
S2: from the sampling-length consecutive image frames of the second video image stream, consecutive image frames of the comparison length are selected starting from the first frame, referred to as the second-channel comparison frames;
S3: the first-channel comparison frames are compared with the second-channel comparison frames; the comparison checks whether the sum of the differences between the temporal feature vectors of corresponding frames is less than a first threshold; if so, the first-channel comparison frames and the second-channel comparison frames are synchronized frames; otherwise, consecutive image frames of the comparison length are selected from the sampling-length consecutive image frames of the second video image stream starting from the second frame, referred to as the second-channel comparison frames, and step S3 is repeated, and so on, until the synchronized frames of the first and second video image streams are found.
Further, for a video image stream in YUV (luminance and chrominance) format, the temporal feature vector of a frame is determined as follows:
A1: the frame is divided into m blocks, and the statistical averages Ymn, Umn and Vmn of the Y, U and V components of each block are calculated, where m denotes the block number and n denotes the frame number;
A2: the values Ymn, Umn and Vmn of each block are compared to determine the largest component; when the maximum is Ymn the feature value is recorded as 1, when the maximum is Umn it is recorded as 0, and when the maximum is Vmn it is recorded as -1; this feature value is taken as the color-space feature vector value Amn of the m-th block;
A3: the Y-component data mean Ymn of the m-th block of the n-th frame is compared with the Y-component data mean Ym(n+1) of the m-th block of the (n+1)-th frame; if Ymn > Ym(n+1) the feature value is recorded as 1, if Ymn = Ym(n+1) it is recorded as 0, and if Ymn < Ym(n+1) it is recorded as -1; this feature value is taken as the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame; step A3 may also determine the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame from the U-component or V-component data;
A4: the color-space feature vector values Amn and the adjacent-frame brightness vector values Bmn of all blocks of the n-th frame are calculated to obtain the temporal feature vector of the n-th frame image.
Further, if the transfer rate of the image frames is v and the maximum time delay between the two video streams is t, the sampling length is greater than or equal to 2vt, and the comparison length is set in the range of 10 to 50 frames.
Further, the sampling length is equal to 4vt and the comparison length is 25 frames.
Further, step 2 specifically includes:
B1: the difference between the texture feature vectors of corresponding frames of the synchronized two video image streams is calculated in real time;
B2: it is judged whether the difference between the texture feature vectors is less than a second threshold; if so, the two video images are judged to be well consistent; otherwise, the two video images are judged to be inconsistent.
Further, for a video image stream in YUV format, the texture feature vector at least includes a Y-component texture feature vector.
Further, in step A1, for the Y-component texture feature vector included in the texture feature vector, the Y-component texture feature vector of a frame in each video image stream is determined as follows:
C1: the frame is divided into M × N blocks, where M is the number of blocks in the vertical direction and N is the number of blocks in the horizontal direction, M ≤ 2N;
C2: the Y-component data mean of each block is calculated, where i denotes the block-number variable and 1 ≤ i ≤ M × N; the Y-component data mean of the (i-1)-th block is subtracted from the Y-component data mean of the i-th block to obtain the inter-block texture feature value of the (i-1)-th block, and the Y-component data mean of the 1st block is subtracted from the Y-component data mean of the last block to obtain the inter-block texture feature value of the last block; the inter-block texture feature values of all blocks form the Y-component texture feature vector of the frame.
By adopting the above technical solution, the present invention has at least the following advantages:
In the video real-time comparison method of the present invention, frame synchronization is first performed using the temporal features of the video images, and the video images are then compared in real time using image texture features. The method is algorithmically efficient, computationally fast, low in hardware consumption and highly real-time, and lends itself to large-scale, multi-channel video comparison. Besides IPTV supervision, the method can also be applied to video feature extraction, comparison, supervision and similar tasks in other services, and it can be implemented either on DSP hardware or in other ways, such as on servers.
Brief description of the drawings
Fig. 1 is a flow chart of the video real-time comparison method of the first embodiment of the present invention;
Fig. 2 is a flow chart of the video real-time comparison method of the second embodiment of the present invention;
Fig. 3(a) shows the quantities calculated for the n-th frame during video temporal feature extraction in the second embodiment of the present invention;
Fig. 3(b) shows the quantities calculated for the (n+1)-th frame during video temporal feature extraction in the second embodiment of the present invention;
Fig. 3(c) shows the temporal feature vector Tn calculated for the n-th frame during video temporal feature extraction in the second embodiment of the present invention;
Fig. 3(d) shows the temporal feature curve calculated for the n-th frame during video temporal feature extraction in the second embodiment of the present invention;
Fig. 4 is a schematic diagram of the video image frame synchronization process of the second embodiment of the present invention;
Fig. 5(a) shows the block calculations during video image texture feature extraction in the second embodiment of the present invention;
Fig. 5(b) shows the texture feature vector of the n-th frame during video image texture feature extraction in the second embodiment of the present invention;
Fig. 6 is a schematic diagram of the real-time video image comparison of the second embodiment of the present invention.
Specific embodiments
To further explain the technical means and effects adopted by the present invention to achieve its intended purpose, the present invention is described in detail below with reference to the accompanying drawings and preferred embodiments.
First embodiment of the present invention: a video real-time comparison method which, as shown in Fig. 1, includes the following steps:
Step S101: for the two video image streams, the image frames are numbered in chronological order; the two video image streams come from different transmission/collection points and have the same program source.
Step S102: frame synchronization is performed on the two video image streams based on image temporal features.
Specifically, step S102 includes:
S1: consecutive image frames of a sampling length are selected synchronously from the two video image streams, and consecutive image frames of a comparison length are selected from the middle of the consecutive image frames of the first video image stream, referred to as the first-channel comparison frames;
S2: from the sampling-length consecutive image frames of the second video image stream, consecutive image frames of the comparison length are selected starting from the first frame, referred to as the second-channel comparison frames;
S3: the first-channel comparison frames are compared with the second-channel comparison frames; the comparison checks whether the sum of the differences between the temporal feature vectors of corresponding frames is less than a first threshold; if so, the first-channel comparison frames and the second-channel comparison frames are synchronized frames, that is, the first-channel comparison frames of the first video image stream are synchronized with the second-channel comparison frames of the second video image stream; otherwise, consecutive image frames of the comparison length are selected from the sampling-length consecutive image frames of the second video image stream starting from the second frame, referred to as the second-channel comparison frames, and step S3 is repeated, and so on, until the synchronized frames of the first and second video image streams are found.
Specifically, let the transfer rate of the image frames be v (in frames per second) and the maximum time delay between the two video streams be t (in seconds); the sampling length is greater than or equal to 2vt, and the comparison length is set in the range of 10 to 50 frames. Preferably, the sampling length is equal to 4vt and the comparison length is 25 frames.
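As an illustration of this synchronization search, the following is a minimal Python/NumPy sketch. It assumes the temporal feature vectors of both streams have already been computed (their construction is described next), one fixed-length vector per frame, and it sums squared differences over the comparison window as in the second embodiment below (the general description above speaks simply of a difference sum); the function name and signature are illustrative assumptions, not part of the patented method.

```python
import numpy as np

def find_frame_offset(feat_a, feat_b, compare_len=25, first_threshold=1.0):
    """Sliding-window frame synchronization search (sketch).

    feat_a, feat_b: arrays of shape (num_frames, dim) holding one temporal
    feature vector per frame, sampled over the sampling length (e.g. 4*v*t).
    Returns the frame-number offset of stream A relative to stream B, or None.
    """
    # Comparison window taken from the middle of the first stream's sample (S1).
    start_a = (len(feat_a) - compare_len) // 2
    window_a = feat_a[start_a:start_a + compare_len]

    # Slide a window of the same length over the second stream, frame by frame (S2, S3).
    for start_b in range(len(feat_b) - compare_len + 1):
        window_b = feat_b[start_b:start_b + compare_len]
        diff = np.sum((window_a - window_b) ** 2)  # summed over the whole window
        if diff < first_threshold:
            return start_a - start_b               # offset between frame numbers
    return None                                    # no synchronized position found
```

With, for example, v = 25 frames per second and a maximum delay of t = 2 s, the preferred values above give a sampling length of 4vt = 200 frames and a comparison length of 25 frames.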
Further, for a video image stream in YUV format, the temporal feature vector of a frame is determined as follows:
A1: the frame is divided into m blocks, and the statistical averages Ymn, Umn and Vmn of the Y, U and V components of each block are calculated, where m denotes the block number and n denotes the frame number;
A2: the values Ymn, Umn and Vmn of each block are compared to determine the largest component; when the maximum is Ymn the feature value is recorded as 1, when the maximum is Umn it is recorded as 0, and when the maximum is Vmn it is recorded as -1; this feature value is taken as the color-space feature vector value Amn of the m-th block;
A3: the Y-component data mean Ymn of the m-th block of the n-th frame is compared with the Y-component data mean Ym(n+1) of the m-th block of the (n+1)-th frame; if Ymn > Ym(n+1) the feature value is recorded as 1, if Ymn = Ym(n+1) it is recorded as 0, and if Ymn < Ym(n+1) it is recorded as -1; this feature value is taken as the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame; step A3 may also determine the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame from the U-component or V-component data;
A4: the color-space feature vector values Amn and the adjacent-frame brightness vector values Bmn of all blocks of the n-th frame are calculated to obtain the temporal feature vector of the n-th frame image.
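A minimal Python/NumPy sketch of steps A1 to A4 follows. It assumes each frame is supplied as Y, U and V planes of equal size and that the frame is cut into a square grid of blocks (2 × 2 = 4 blocks, as in the second embodiment); the helper names, the square grid and the 1/0/-1 assignments follow the reconstruction above and are illustrative assumptions rather than the exact published implementation.

```python
import numpy as np

def block_means(plane, blocks_per_side):
    """Mean of each block when the plane is cut into a blocks_per_side x
    blocks_per_side grid (A1). Returns a flat array of block means."""
    h, w = plane.shape
    bh, bw = h // blocks_per_side, w // blocks_per_side
    means = [plane[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
             for r in range(blocks_per_side) for c in range(blocks_per_side)]
    return np.array(means)

def temporal_feature_vector(yuv_n, yuv_next, blocks_per_side=2):
    """Temporal feature vector T_n of frame n (steps A1-A4).

    yuv_n, yuv_next: tuples (Y, U, V) of 2-D arrays for frames n and n+1.
    """
    y_n, u_n, v_n = (block_means(p, blocks_per_side) for p in yuv_n)
    y_next = block_means(yuv_next[0], blocks_per_side)

    # A2: color-space feature A_mn, coded 1 / 0 / -1 for Y / U / V being largest.
    code = np.array([1, 0, -1])
    a_mn = code[np.argmax(np.stack([y_n, u_n, v_n]), axis=0)]

    # A3: adjacent-frame brightness feature B_mn from the Y means of frames
    # n and n+1 (sign of the change, as reconstructed above).
    b_mn = np.sign(y_n - y_next).astype(int)

    # A4: concatenate all A_mn and B_mn into the temporal feature vector.
    return np.concatenate([a_mn, b_mn])
```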
Step S103: the image texture features of the synchronized two video image streams are compared in real time to determine the difference between the two video image streams.
Specifically, step S103 includes:
B1: the difference between the texture feature vectors of corresponding frames of the synchronized two video image streams is calculated in real time;
B2: it is judged whether the difference between the texture feature vectors is less than a second threshold; if so, the two video images are judged to be well consistent; otherwise, the two video images are judged to be inconsistent.
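Steps B1 and B2 reduce to a per-frame threshold test on the texture feature vectors. A minimal sketch, assuming the two streams are already frame-synchronized and applying the element-wise squared-difference test used in the second embodiment below (the function name is an illustrative assumption), is:

```python
import numpy as np

def frames_consistent(q_a, q_b, second_threshold=1.0):
    """B1/B2: compare the texture feature vectors of corresponding frames.

    q_a, q_b: texture feature vectors Q_n of the two streams for one frame.
    Returns True when the squared element-wise differences all fall below
    the second threshold, i.e. the two images are judged consistent.
    """
    delta = q_a - q_b                                     # B1: difference vector
    return bool(np.all(delta ** 2 < second_threshold))   # B2: threshold test
```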
Specifically, for a video image stream in YUV format, the texture feature vector in step A1 at least includes the Y-component texture feature vector; that is, it may contain only the Y-component texture feature vector, or the texture feature vectors of the Y and U components, or of the Y and V components, or of all three components at once.
Taking the Y-component texture feature vector in step A1 as an example, its determination is described in detail below. For the Y-component texture feature vector included in the texture feature vector, the Y-component texture feature vector of a frame in each video image stream is determined as follows:
C1: the frame is divided into M × N blocks, where M is the number of blocks in the vertical direction and N is the number of blocks in the horizontal direction, M ≤ 2N;
C2: the Y-component data mean of each block is calculated, where i denotes the block-number variable and 1 ≤ i ≤ M × N; the Y-component data mean of the (i-1)-th block is subtracted from the Y-component data mean of the i-th block to obtain the inter-block texture feature value of the (i-1)-th block, and the Y-component data mean of the 1st block is subtracted from the Y-component data mean of the last block to obtain the inter-block texture feature value of the last block; the inter-block texture feature values of all blocks form the Y-component texture feature vector of the frame.
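A minimal Python/NumPy sketch of steps C1 and C2, assuming the Y plane is cut into an M × N grid with blocks numbered row by row (4 × 4 = 16 blocks by default, as in the second embodiment); the function name and block numbering are illustrative assumptions, while the differencing (each block's mean subtracted from the next block's, and the first block's mean subtracted from the last block's) follows the description above:

```python
import numpy as np

def texture_feature_vector(y_plane, m_vert=4, n_horiz=4):
    """Y-component texture feature vector Q_n of one frame (C1-C2)."""
    h, w = y_plane.shape
    bh, bw = h // m_vert, w // n_horiz
    # C1: block the frame M x N and take the Y mean of every block,
    # numbered row by row (1 .. M*N).
    means = np.array([y_plane[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
                      for r in range(m_vert) for c in range(n_horiz)])
    # C2: differences of adjacent block means; the last feature is the
    # last block's mean minus the first block's mean.
    feats = np.empty_like(means)
    feats[:-1] = means[1:] - means[:-1]   # C_mn = mean[m+1] - mean[m]
    feats[-1] = means[-1] - means[0]      # last block minus first block
    return feats
```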
Second embodiment of the present invention: a video real-time comparison method. This embodiment can be regarded as an application example in which the method of the first embodiment is applied to processing YUV-format video images.
Fig. 2 is the flow chart of the video real-time comparison method of this embodiment. As shown in Fig. 2, after the video images are input, the video image streams are first preprocessed to obtain video frame images in YUV format, and each frame is numbered in chronological order to facilitate the subsequent video feature extraction. Temporal feature extraction is then performed on the videos of the same content source collected at different nodes to obtain the corresponding temporal feature vector curves. The temporal feature information of the two videos of the same program source collected at different nodes is compared to obtain the difference in frame position numbers between the two input videos, and the two videos are frame-synchronized. After frame synchronization, the texture features of every frame of each video are extracted in real time. Based on the texture features, the frame-synchronized videos of the same content source are compared frame by frame in real time, and the comparison result is output.
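As a compact illustration of this overall flow, the hypothetical helper functions sketched in the first embodiment above can be wired together roughly as follows; the driver function and its parameters are assumptions for illustration, and the preprocessing that decodes each frame into YUV planes is assumed to happen elsewhere.

```python
import numpy as np

def compare_streams(frames_a, frames_b, v=25, t=2.0, compare_len=25):
    """End-to-end sketch: temporal features -> frame sync -> texture comparison.

    frames_a, frames_b: lists of (Y, U, V) planes, numbered in time order.
    Reuses temporal_feature_vector, find_frame_offset, texture_feature_vector
    and frames_consistent from the sketches above.
    """
    sample_len = int(4 * v * t)   # preferred sampling length of 4*v*t frames

    # Temporal feature curves over the sampling window of each stream.
    feat_a = np.array([temporal_feature_vector(frames_a[i], frames_a[i + 1])
                       for i in range(min(sample_len, len(frames_a) - 1))])
    feat_b = np.array([temporal_feature_vector(frames_b[i], frames_b[i + 1])
                       for i in range(min(sample_len, len(frames_b) - 1))])

    # Frame synchronization: frame-number offset between the two streams.
    offset = find_frame_offset(feat_a, feat_b, compare_len)
    if offset is None:
        return None

    # Real-time texture comparison of corresponding (synchronized) frames.
    results = []
    for n, frame_a in enumerate(frames_a):
        m = n - offset
        if 0 <= m < len(frames_b):
            q_a = texture_feature_vector(frame_a[0])       # Y plane of stream A
            q_b = texture_feature_vector(frames_b[m][0])   # Y plane of stream B
            results.append(frames_consistent(q_a, q_b))
    return results
```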
Fig. 3(a), (b), (c) and (d) illustrate the video temporal feature extraction process of this embodiment. As shown in Fig. 3(a), each frame is divided into four blocks numbered 1, 2, 3 and 4, and the statistical averages Ymn, Umn and Vmn of the Y, U and V components of each block are calculated, where m denotes the block number and n denotes the image frame sequence number. The values Ymn, Umn and Vmn of each block are compared to determine the largest component: when the maximum is Ymn the feature value is recorded as 1, when the maximum is Umn as 0, and when the maximum is Vmn as -1; this feature value is taken as the color-space feature vector value Amn of the m-th block. The Y-component mean Ymn of the m-th block of the n-th frame is compared with the Y-component mean Ym(n+1) of the m-th block of the (n+1)-th frame: if Ymn > Ym(n+1) the feature value is recorded as 1, if Ymn = Ym(n+1) as 0, and if Ymn < Ym(n+1) as -1; this feature value is taken as the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame. As shown in Fig. 3(c), the color-space feature vector values Amn and adjacent-frame brightness vector values Bmn of blocks 1, 2, 3 and 4 of the n-th frame are calculated to obtain the temporal feature vector Tn of the n-th frame; arranging the temporal feature vectors in frame order gives the temporal feature curve of this video stream, as shown in Fig. 3(d). If a comparison with higher discrimination is needed, the number of image blocks m can be increased to obtain higher-dimensional feature vectors.
Fig. 4 is a schematic diagram of the video image frame synchronization process of this embodiment. Frame synchronization determines the image delay time, or frame offset, between the two collected videos of the same content source. The image frames of the two videos of the same content source to be compared are first numbered in chronological order to generate frame sequence numbers. The temporal feature curves of the two videos are sampled with a sampling length of 200 frames, and a comparison curve of 25 frames (comprising the eight component vector curves) is taken from the middle of the temporal feature curve of the first video. This comparison curve is then compared frame by frame against the sampled temporal feature curve of the second video over the same length: the sum of the squared differences between the temporal feature vectors of the 25 consecutive frames is calculated, and when this sum of squared differences is less than threshold 1, this part of the two curves is judged to be identical. The difference between the starting frame numbers of the two comparison curves is then calculated to obtain the frame sequence offset between the two videos, completing frame synchronization so that the videos can be further compared in real time. The above sampling length of 200 frames is determined on the assumption that the delay between the two videos is less than 2 s; the sampling length of the temporal feature curve and the length of the comparison curve can be lengthened or shortened according to the delay between the two videos.
Fig. 5(a) and (b) illustrate the block calculations and the n-th-frame texture feature vector, respectively, in the video image texture feature extraction of this embodiment. As shown in Fig. 5(a), each frame is divided into 16 blocks numbered 1, 2, ..., 16, and the Y-component data mean Ymn of each block is calculated (taking integer values under DSP hardware conditions helps improve efficiency), where m denotes the block number and n denotes the image frame sequence number. As shown in Fig. 5(b), the Y-component data mean Ymn of the m-th block is subtracted from the Y-component data mean Y(m+1)n of the (m+1)-th block to obtain the inter-block texture feature value Cmn of the m-th block, and the Y-component data mean Y1n of the 1st block is subtracted from the Y-component data mean Y16n of the 16th block to obtain the 16th feature value C16n; the feature values Cmn of blocks 1 to 16 form the texture feature vector Qn of the n-th frame. The blocking of each frame can be adjusted: the more blocks per frame, the higher the discrimination of the texture features, and dividing each frame into 16 blocks is an appropriate trade-off between computation speed and comparison effectiveness.
Fig. 6 is a schematic diagram of the real-time video image comparison of this embodiment. After frame synchronization is completed, the texture features Qn and Qn' of the two videos of the same content source to be compared are extracted in real time, and the difference of the texture feature vectors of corresponding frames is calculated to obtain the vector ΔQn. Each element of ΔQn is squared and compared with a set threshold: values below the threshold indicate that the two images are well consistent, while values above the threshold indicate that the two videos differ, giving the real-time comparison result of the two videos. When Ymn in Fig. 5(a) takes rounded integer values, a value less than threshold 1 indicates that the two video images of this frame are completely consistent, and a value equal to or greater than threshold 1 indicates that the two video images of this frame differ.
The embodiment of the present invention mainly processes and compares video frame images in YUV format. Since there is a definite conversion relationship between the YUV and RGB formats, the method of the embodiment of the present invention can also be used to process and compare video frame images in RGB format; the principle is similar.
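For reference, a minimal sketch of one common such conversion (the full-range BT.601 / JPEG-style matrix; a given system may use a different matrix or value range, so this is an assumption for illustration) is shown below; after conversion, the same feature extraction applies unchanged.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Full-range BT.601 (JPEG-style) RGB -> Y, Cb, Cr conversion sketch.
    rgb: array of shape (H, W, 3) with values in 0..255."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```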
The present invention is simple, efficient, real-time and highly accurate in comparison, and is suitable for large-scale, multi-channel real-time online video comparison.
The above description of specific embodiments provides a deeper and more specific understanding of the technical means and effects adopted by the present invention to achieve its intended purpose; however, the accompanying drawings are provided for reference and discussion only and are not intended to limit the present invention.

Claims (7)

1. A video real-time comparison method, characterized in that it includes:
Step 1: frame synchronization is performed on two video image streams based on image temporal features, the two video image streams coming from different transmission/collection points and having the same program source;
Step 2: the image texture features of the synchronized two video image streams are compared in real time to determine the difference between the two video image streams;
Step 1 specifically includes:
S1: consecutive image frames of a sampling length are selected synchronously from the two video image streams, and consecutive image frames of a comparison length are selected from the middle of the consecutive image frames of the first video image stream, referred to as the first-channel comparison frames;
S2: from the sampling-length consecutive image frames of the second video image stream, consecutive image frames of the comparison length are selected starting from the first frame, referred to as the second-channel comparison frames;
S3: the first-channel comparison frames are compared with the second-channel comparison frames; the comparison checks whether the sum of the differences between the temporal feature vectors of corresponding frames is less than a first threshold; if so, the first-channel comparison frames and the second-channel comparison frames are synchronized frames; otherwise, consecutive image frames of the comparison length are selected from the sampling-length consecutive image frames of the second video image stream starting from the second frame, referred to as the second-channel comparison frames, and step S3 is repeated, and so on, until the synchronized frames of the first and second video image streams are found;
For a video image stream in YUV (luminance and chrominance) format, the temporal feature vector of a frame is determined as follows:
A1: the frame is divided into m blocks, and the statistical averages Ymn, Umn and Vmn of the Y, U and V components of each block are calculated, where m denotes the block number and n denotes the frame number;
A2: the values Ymn, Umn and Vmn of each block are compared to determine the largest component; when the maximum is Ymn the feature value is recorded as 1, when the maximum is Umn it is recorded as 0, and when the maximum is Vmn it is recorded as -1; this feature value is taken as the color-space feature vector value Amn of the m-th block;
A3: the Y-component data mean Ymn of the m-th block of the n-th frame is compared with the Y-component data mean Ym(n+1) of the m-th block of the (n+1)-th frame; if Ymn > Ym(n+1) the feature value is recorded as 1, if Ymn = Ym(n+1) it is recorded as 0, and if Ymn < Ym(n+1) it is recorded as -1; this feature value is taken as the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame; step A3 may also determine the adjacent-frame brightness vector value Bmn of the m-th block of the n-th frame from the U-component or V-component data;
A4: the color-space feature vector values Amn and the adjacent-frame brightness vector values Bmn of all blocks of the n-th frame are calculated to obtain the temporal feature vector of the n-th frame image.
2. The video real-time comparison method according to claim 1, characterized in that step 1 also includes, before the frame synchronization:
numbering the image frames of each of the two video image streams in chronological order.
3. The video real-time comparison method according to claim 1, characterized in that, if the transfer rate of the image frames is v and the maximum time delay between the two video streams is t, the sampling length is greater than or equal to 2vt, and the comparison length is set in the range of 10 to 50 frames.
4. The video real-time comparison method according to claim 3, characterized in that the sampling length is equal to 4vt and the comparison length is 25 frames.
5. The video real-time comparison method according to claim 1, characterized in that step 2 specifically includes:
B1: the difference between the texture feature vectors of corresponding frames of the synchronized two video image streams is calculated in real time;
B2: it is judged whether the difference between the texture feature vectors is less than a second threshold; if so, the two video images are judged to be well consistent; otherwise, the two video images are judged to be inconsistent.
6. The video real-time comparison method according to claim 5, characterized in that, for a video image stream in YUV format, the texture feature vector at least includes a Y-component texture feature vector.
7. The video real-time comparison method according to claim 6, characterized in that, in step A1, for the Y-component texture feature vector included in the texture feature vector, the Y-component texture feature vector of a frame in each video image stream is determined as follows:
C1: the frame is divided into M × N blocks, where M is the number of blocks in the vertical direction and N is the number of blocks in the horizontal direction, M ≤ 2N;
C2: the Y-component data mean of each block is calculated, where i denotes the block-number variable and 1 ≤ i ≤ M × N; the Y-component data mean of the (i-1)-th block is subtracted from the Y-component data mean of the i-th block to obtain the inter-block texture feature value of the (i-1)-th block, and the Y-component data mean of the 1st block is subtracted from the Y-component data mean of the last block to obtain the inter-block texture feature value of the last block; the inter-block texture feature values of all blocks form the Y-component texture feature vector of the frame.
CN201310705643.1A 2013-12-19 2013-12-19 Video real-time comparison method Active CN103699886B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310705643.1A CN103699886B (en) 2013-12-19 2013-12-19 Video real-time comparison method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310705643.1A CN103699886B (en) 2013-12-19 2013-12-19 Video real-time comparison method

Publications (2)

Publication Number Publication Date
CN103699886A CN103699886A (en) 2014-04-02
CN103699886B (en) 2017-02-08

Family

ID=50361409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310705643.1A Active CN103699886B (en) 2013-12-19 2013-12-19 Video real-time comparison method

Country Status (1)

Country Link
CN (1) CN103699886B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105187883B (en) * 2015-09-11 2018-05-29 广东威创视讯科技股份有限公司 A kind of data processing method and client device
CN106028147B (en) * 2016-06-23 2019-05-28 北京华兴宏视技术发展有限公司 Vision signal monitoring method and vision signal monitor system
CN106803991A (en) * 2017-02-14 2017-06-06 北京时间股份有限公司 Method for processing video frequency and device
CN110740251B (en) * 2018-07-20 2021-09-24 杭州海康机器人技术有限公司 Multi-camera synchronous stream taking method, device and system
CN112291593B (en) * 2020-12-24 2021-03-23 湖北芯擎科技有限公司 Data synchronization method and data synchronization device
CN112580577B (en) * 2020-12-28 2023-06-30 出门问问(苏州)信息科技有限公司 Training method and device for generating speaker image based on facial key points

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5294981A (en) * 1993-07-13 1994-03-15 Pacific Pay Video Limited Television video synchronization signal monitoring system and method for cable television system
JP2906332B2 (en) * 1995-12-27 1999-06-21 日本テレビ放送網株式会社 Telecine signal conversion method and up-converter
US8537892B2 (en) * 2008-04-14 2013-09-17 New Jersey Institute Of Technology Detection of double video compression using first digit based statistics
CN102905054B (en) * 2012-10-23 2017-11-21 上海佰贝科技发展有限公司 A kind of video synchronization method compared based on image multi-dimensional characteristic value
CN103067778B (en) * 2013-01-06 2015-12-23 北京华兴宏视技术发展有限公司 Data monitoring system and data monitoring method

Also Published As

Publication number Publication date
CN103699886A (en) 2014-04-02

Similar Documents

Publication Publication Date Title
CN103699886B (en) Video real-time comparison method
CN106412626B (en) A kind of processing method and processing device of live video
WO2018161775A1 (en) Neural network model training method, device and storage medium for image processing
CN103379351B (en) A kind of method for processing video frequency and device
US20070280129A1 (en) System and method for calculating packet loss metric for no-reference video quality assessment
CN103955930B (en) Motion parameter estimation method based on gray integral projection cross-correlation function characteristics
CN103258332A (en) Moving object detection method resisting illumination variation
CN107959848A (en) Universal no-reference video quality evaluation algorithms based on Three dimensional convolution neutral net
CN103248906A (en) Method and system for acquiring depth map of binocular stereo video sequence
CN110765880A (en) Light-weight video pedestrian heavy identification method
CN102509311B (en) Motion detection method and device
CN106572387A (en) Video sequence alignment method and video sequence alignment system
CN104715470B (en) A kind of klt Corner Detections device and method
WO2022179251A1 (en) Image processing method and apparatus, electronic device, and storage medium
CN108090145A (en) A kind of dynamic network side sampling and its method for visualizing
CN111178503A (en) Mobile terminal-oriented decentralized target detection model training method and system
CN104580978B (en) A kind of video detection and processing method, device
CN104378575B (en) The method of image signal transmission and device
US9264688B2 (en) Video processing method and apparatus for use with a sequence of stereoscopic images
CN107770595B (en) A method of it being embedded in real scene in virtual scene
CN105187688B (en) The method and system that a kind of real-time video and audio to mobile phone collection synchronizes
CN104618680B (en) A kind of compression method of video resolution
CN102968773A (en) Method of image noise reduction
CN104182931A (en) Super resolution method and device
US9124869B2 (en) Systems and methods for video denoising

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant