CN102999759A - Optical flow based vehicle motion state estimation method


Info

Publication number
CN102999759A
Authority
CN
China
Legal status: Granted
Application number
CN2012104420826A
Other languages
Chinese (zh)
Other versions
CN102999759B (en)
Inventor
李旭
吴明明
Current Assignee
Southeast University
Original Assignee
Southeast University
Application filed by Southeast University
Priority to CN201210442082.6
Publication of CN102999759A
Application granted
Publication of CN102999759B
Status: Expired - Fee Related
Anticipated expiration

Classification: Image Analysis

Abstract

The invention discloses an optical flow based vehicle motion state estimation method, applicable to estimating the motion of vehicles travelling at low speed on flat asphalt pavement in a road traffic environment. The method includes: mounting a high-precision downward-looking monocular camera at the center of the rear axle of a vehicle and obtaining the camera parameters by a calibration algorithm; preprocessing the acquired image sequence by histogram equalization so as to highlight the corner features of the asphalt pavement and to reduce the adverse effects of pavement conditions and illumination variation; detecting the corner features of the pavement in real time with the efficient Harris corner detection algorithm; matching and tracking corners between consecutive frames with the Lucas-Kanade optical flow algorithm, and further refining the matched corners with the RANSAC (random sample consensus) algorithm to obtain more accurate optical flow information; and finally reconstructing the real-time motion parameters of the vehicle in the vehicle carrier coordinate system, such as longitudinal velocity, transverse velocity and sideslip angle, thereby realizing high-precision estimation of the vehicle's motion over the ground.

Description

An optical flow based vehicle motion state estimation method
Technical field
The present invention relates to an optical flow based vehicle motion state estimation method, and belongs to the field of autonomous driving and driver assistance.
Background art
State estimation during vehicle driving is one of the key technologies for realizing autonomous driving and driver assistance. It aims to determine important state variables of a moving vehicle such as longitudinal velocity, transverse velocity and sideslip angle, and has great practical value for active safety control systems such as the anti-lock braking system (ABS), the electronic stability program (ESP) and four-wheel steering (4WS) stability control. Whether these active safety control systems can work properly depends to a great extent on obtaining the vehicle state parameters accurately and in real time.
At present, in the field of automotive active safety, methods for estimating the motion state of a vehicle fall mainly into three classes. The first class is indirect measurement, which uses the output of existing on-board sensors such as an inertial navigation system (INS) or wheel speed sensors and obtains the relevant vehicle states by simple calculation. However, sensor bias accumulates over time, so the estimation accuracy of the vehicle state gradually degrades. The second class is direct measurement, which obtains the motion state parameters of the vehicle directly with high-precision equipment such as a high-precision Global Positioning System (GPS). This approach measures accurately but is expensive, and in areas where the signal is blocked, such as tunnels and regions shadowed by tall buildings, GPS cannot output effectively and its positioning accuracy degrades. The third class is based on vehicle models: the motion of the vehicle is modelled kinematically or dynamically, information from corresponding on-board sensors (such as wheel speed sensors, gyroscopes and accelerometers) is used as observations, and a suitable filtering and estimation algorithm realizes the estimation of the driving state. This approach requires modelling the whole vehicle, or even each tire, separately, and when the models or model parameters are inaccurate the estimation error is large.
Summary of the invention
To overcome the limitations of the above methods, for vehicles travelling at 0-15 km/h on flat asphalt pavement, the present invention proposes an optical flow based vehicle motion state measurement method that is low-cost, real-time, strongly adaptable to its environment and relatively accurate.
To achieve these goals, the following technical scheme is adopted:
Step 1: Install a high-precision monocular camera at the center of the rear axle of the vehicle, with its optical axis perpendicular to the vehicle's plane of motion and pointing down toward the ground, at a vertical height of 20-40 cm, to acquire the sequence of changing pavement images. Define the vehicle carrier coordinate system: its origin o is at the vehicle's center of mass, the x axis lies along the longitudinal axis of the vehicle and points in the direction of forward travel, the z axis is perpendicular to the plane of motion and points toward the ground, and the y axis is determined by the right-hand rule. Define the pavement image coordinate system: its origin o_p is the lower-left corner of the image, the longitudinal axis u is parallel to the vehicle carrier x axis, and the transverse axis v is parallel to the vehicle carrier y axis.
Step 2: Calibrate the camera with a classical calibration method to obtain the intrinsic parameters of the camera, namely the coordinates (o_u, o_v), in the image coordinate system, of the vertical projection of the optical axis onto the image plane, the u-axis scale factor α, the v-axis scale factor β, and the skew γ of the u and v coordinate axes, together with the extrinsic parameters: the 3×3 rotation matrix r_cam of the camera with respect to the world coordinate system and the 3×1 translation vector t_cam of the camera with respect to the world coordinate system.
Step 3: Initialize the image sequence pointer i, setting i = 1.
Step 4: Take the i-th and (i+1)-th pavement images and preprocess each of them. The preprocessing comprises computing the gray-level histogram function of the image and performing equalization.
The gray-level histogram function of the image is computed as follows:
First convert the image to grayscale and normalize the gray values of all its pixels, then compute the gray-level histogram function of the image:

$$\mathrm{pro}(r_k) = \frac{n_k}{n}, \quad k = 0, 1, 2, \ldots, 255, \quad 0 \le r_k \le 1 \qquad (1)$$

where n is the total number of pixels in the image, r_k is the k-th gray level, n_k is the number of pixels in the image whose gray level is r_k, and pro(r_k) is the probability that gray level r_k occurs. In rectangular coordinates, the plot of pro(r_k) against r_k is the gray-level histogram.
The equalization uses the cumulative gray distribution function of the image:

$$s_k = \sum_{l=0}^{k} \mathrm{pro}(r_l), \quad 0 \le r_l \le 1, \quad l = 0, 1, 2, \ldots, k, \quad k = 0, 1, 2, \ldots, 255 \qquad (2)$$

where s_k is the cumulative distribution function at the k-th gray level and r_l is the l-th gray level. The transformed gray level is

$$r'_k = \mathrm{INT}[(L-1)s_k + 0.5], \quad k = 0, 1, 2, \ldots, 255 \qquad (3)$$

where L = 256, INT is the rounding-to-integer function, and r'_k is the gray level r_k after transformation.
Step 5: Perform Harris corner detection on the preprocessed i-th frame. Harris corner detection comprises computing, at every point of the i-th frame, the gray-level variation and the corner response function, and then applying the corner decision.
The gray-level variation is computed as follows:

$$E(u_0, v_0) = \sum [I(u_0+\Delta u, v_0+\Delta v) - I(u_0, v_0)]^2, \quad (u_0+\Delta u, v_0+\Delta v) \in w(u_0, v_0, r_{ad}) \qquad (4)$$

where E(u_0, v_0) is the gray-level variation at an arbitrary point (u_0, v_0) of the i-th frame; Δu and Δv are the displacements of the point (u_0, v_0) along the u and v axes; I(u_0, v_0) is the gray value at (u_0, v_0); I(u_0+Δu, v_0+Δv) is the gray value at (u_0+Δu, v_0+Δv); and the window function $w(u_0, v_0, r_{ad}) = \{(u, v) \mid (u-u_0)^2 + (v-v_0)^2 \le r_{ad}^2\}$ is a circular window centered at (u_0, v_0) with radius r_{ad}. Expanding I(u_0+Δu, v_0+Δv) as a Taylor series and neglecting higher-order terms gives:

$$I(u_0+\Delta u, v_0+\Delta v) \approx I(u_0, v_0) + \Delta u\, I_u(u_0, v_0) + \Delta v\, I_v(u_0, v_0) \qquad (5)$$

where $I_u(u_0, v_0) = \partial I(u_0, v_0)/\partial u$ and $I_v(u_0, v_0) = \partial I(u_0, v_0)/\partial v$ are the gradients of I(u_0, v_0) along the u and v axes. Combining the two formulas above gives:

$$E(u_0, v_0) = [\Delta u, \Delta v]\, M \begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix}, \qquad M = \sum \begin{bmatrix} I_u^2(u_0, v_0) & I_u(u_0, v_0) I_v(u_0, v_0) \\ I_u(u_0, v_0) I_v(u_0, v_0) & I_v^2(u_0, v_0) \end{bmatrix} \qquad (6)$$

where the matrix M is the autocorrelation matrix of the image window around the point (u_0, v_0). Diagonalizing the symmetric matrix M by a similarity transformation and substituting into relation (6) gives:

$$E(u_0, v_0) = [\Delta u, \Delta v]\, R^{-1} \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} R \begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix} \qquad (7)$$

where R is the rotation factor, R^{-1} is its inverse, and λ_1, λ_2 are the eigenvalues of the matrix M.
The corner response function CRF is obtained as follows:

$$\mathrm{CRF}(u_0, v_0) = \det(M) - \eta \cdot [\mathrm{trace}(M)]^2 \qquad (8)$$

where CRF(u_0, v_0) is the corner response function at (u_0, v_0); det(M) = λ_1 λ_2 is the determinant of the matrix M; trace(M) = λ_1 + λ_2 is the trace of M; and the scale factor η is an empirical value, η = 0.04.
The corner decision is as follows:
When the corner response function CRF at a point exceeds a threshold, the point is taken to be a corner and its pixel coordinates are obtained; the threshold is set to 18.
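As a hedged illustration of this detector (not the patent's implementation), the Python sketch below computes the response (8) with Sobel gradients and a square 5×5 window standing in for the circular window w; the response scale depends on how the gradients are normalized, so whether the patent's threshold of 18 applies directly depends on that normalization, and the threshold is left as a parameter here.

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def harris_corners(gray: np.ndarray, thresh: float, eta: float = 0.04) -> np.ndarray:
    """Return (row, col) points whose corner response function (8) exceeds thresh."""
    img = gray.astype(np.float64)
    iu = sobel(img, axis=1)   # gradient I_u along the u (column) axis
    iv = sobel(img, axis=0)   # gradient I_v along the v (row) axis
    # Entries of the autocorrelation matrix M of eq. (6), summed over a local window.
    m_uu = uniform_filter(iu * iu, size=5)
    m_vv = uniform_filter(iv * iv, size=5)
    m_uv = uniform_filter(iu * iv, size=5)
    # Corner response CRF = det(M) - eta * trace(M)^2, eq. (8).
    crf = (m_uu * m_vv - m_uv ** 2) - eta * (m_uu + m_vv) ** 2
    return np.argwhere(crf > thresh)
```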
Step 6: Use the Lucas-Kanade optical flow method to match and track, in the (i+1)-th frame, the coordinates of all corners detected by the Harris algorithm in the i-th frame. Let an arbitrary corner G of the i-th frame have coordinates (u_i, v_i) in that frame. Lucas-Kanade optical flow matching and tracking of corner G comprises the following steps:
First, establish the constraint equation between the i-th and (i+1)-th frames:

$$I(u_{i+1}, v_{i+1}, t_{i+1}) = I(u_i+\Delta u_i, v_i+\Delta v_i, t_i+\Delta t) = I(u_i, v_i, t_i) \qquad (9)$$

where t_i and t_{i+1} are the moments at which the i-th and (i+1)-th frames are taken; the time interval Δt = t_{i+1} − t_i is fixed, with typical values of 1/30 s, 1/25 s or 1/15 s; Δu_i and Δv_i are the displacements of corner G along the u and v axes within the time Δt; (u_i, v_i) and (u_{i+1}, v_{i+1}) are the coordinates of G in the i-th and (i+1)-th frames; and I(u_i, v_i, t_i) and I(u_{i+1}, v_{i+1}, t_{i+1}) are the gray values of corner G at t_i and t_{i+1}.
Then expand I(u_i+Δu_i, v_i+Δv_i, t_i+Δt) as a Taylor series, neglecting higher-order terms:

$$I(u_i+\Delta u_i, v_i+\Delta v_i, t_i+\Delta t) \approx I(u_i, v_i, t_i) + I_u \Delta u_i + I_v \Delta v_i + I_t \Delta t \qquad (10)$$

where $I_u = \partial I/\partial u$ and $I_v = \partial I/\partial v$ are the partial derivatives of the image gray level along the u and v axes, and $I_t = \partial I/\partial t$ is the partial derivative of the image gray level with respect to time t.
From relations (9) and (10):

$$I_u \frac{\Delta u_i}{\Delta t} + I_v \frac{\Delta v_i}{\Delta t} + I_t = 0 \qquad (11)$$

Next, set up the equations over the 5×5 pixel neighborhood of corner G at time t_i, forming the system:

$$Hc = g \qquad (12)$$

where

$$H = \begin{bmatrix} I_u(u_{i,1}, v_{i,1}, t_i) & I_v(u_{i,1}, v_{i,1}, t_i) \\ I_u(u_{i,2}, v_{i,2}, t_i) & I_v(u_{i,2}, v_{i,2}, t_i) \\ \vdots & \vdots \\ I_u(u_{i,25}, v_{i,25}, t_i) & I_v(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}, \quad c = \begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix}, \quad g = \begin{bmatrix} -I_t(u_{i,1}, v_{i,1}, t_i) \\ -I_t(u_{i,2}, v_{i,2}, t_i) \\ \vdots \\ -I_t(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}$$

H is the 25×2 matrix composed of the gray gradients of the 5×5 neighborhood pixels, c is the 2×1 column vector of the optical flow formed by the displacement between the two adjacent frames, and g is the 25×1 column vector composed of the partial derivatives of the gray level with respect to time.
Finally, solve the system in the least-squares sense; the least-squares solution is the motion of the central point G.
Multiplying both sides of equation (12) on the left by the transpose of H:

$$H^T H c = H^T g \qquad (13)$$

where H^T is the transpose of H. The solution of the equation, i.e. the optical flow formed between the two frames, is:

$$\begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix} = (H^T H)^{-1} H^T g \qquad (14)$$

where (H^T H)^{-1} is the inverse of the matrix H^T H. The coordinates (u_{i+1}, v_{i+1}) of corner G in the (i+1)-th frame are then obtained from:

$$\begin{bmatrix} u_{i+1} \\ v_{i+1} \end{bmatrix} = \begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix} \Delta t + \begin{bmatrix} u_i \\ v_i \end{bmatrix} \qquad (15)$$
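For comparison, a hedged sketch using OpenCV's Lucas-Kanade tracker, which solves the least-squares system (12)-(14) over a local window; winSize=(5, 5) mirrors the 5×5 neighborhood, and maxLevel=0 disables the image pyramid so the call matches the single-level formulation above. Function and parameter choices beyond that are our own.

```python
import cv2
import numpy as np

def track_corners(frame_i: np.ndarray, frame_i1: np.ndarray,
                  corners_i: np.ndarray):
    """Track corners (N,1,2) float32 from frame i into frame i+1 by Lucas-Kanade."""
    pts_i1, status, _err = cv2.calcOpticalFlowPyrLK(
        frame_i, frame_i1, corners_i, None,
        winSize=(5, 5),   # the 5x5 neighborhood of eq. (12)
        maxLevel=0)       # single level: plain (non-pyramidal) LK
    ok = status.ravel() == 1
    # Keep only corners whose flow solution converged; pts_i1 realizes eq. (15).
    return corners_i[ok], pts_i1[ok]
```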
Step 7: Use the random sample consensus (RANSAC) algorithm to optimize all the matched corners obtained in Step 6:
First, establish the model Q, the projective transformation matrix between the i-th and (i+1)-th frames:

$$\xi U_{i+1} = Q U_i \qquad (16)$$

where

$$U_{i+1} = \begin{bmatrix} u_{i+1} \\ v_{i+1} \\ 1 \end{bmatrix}, \quad U_i = \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}, \quad Q = \begin{bmatrix} q_0 & q_1 & q_2 \\ q_3 & q_4 & q_5 \\ q_6 & q_7 & q_8 \end{bmatrix}$$

U_i and U_{i+1} are the homogeneous vectors formed by the coordinates, in the i-th and (i+1)-th frames, of an arbitrary matched corner G among all matched corners obtained in Step 6; Q is the projective transformation matrix between the two frames; and ξ is a scale factor, taken as ξ = 1. The submatrix $\begin{bmatrix} q_0 & q_1 \\ q_3 & q_4 \end{bmatrix}$ represents scaling, rotation, reflection and shear of the image; the vector [q_2, q_5]^T represents translation of the image; the vector [q_6, q_7] represents the projective component; and q_8 represents an overall stretching of the image. Selecting 4 pairs of matched corners yields 8 linear equations; together with the constraint on Q itself, this gives 9 equations from which Q is estimated. The re-estimation steps are:
(I) Initialize the iteration counter m = 1 and set a maximum number of iterations m̂.
(II) Randomly choose 4 pairs of matched corners from all matched corners obtained in Step 6 as a sample set S_m; the complement of the sample after extraction is S̄_m. Substitute the points of S_m into relation (16) to obtain an initial Q_m.
(III) Using the vector U_i formed by the i-th-frame coordinates of each matched corner in the complement S̄_m, together with the Q_m obtained in (II), evaluate the error function:

$$F = d_{Euc}^2(U_{i+1}, Q_m U_i) + d_{Euc}^2(U_i, Q_m^{-1} U_{i+1}) \qquad (17)$$

where Q_m^{-1} is the inverse of Q_m, d_{Euc}(U_{i+1}, Q_m U_i) is the three-dimensional Euclidean distance between the points U_{i+1} and Q_m U_i, and d_{Euc}(U_i, Q_m^{-1} U_{i+1}) is the three-dimensional Euclidean distance between the points U_i and Q_m^{-1} U_{i+1}. The error threshold is set to 3: a matched pair whose error function F is below this threshold is an inlier and is added to the inlier set; otherwise it is an outlier.
(IV) The inlier set and the sample set S_m together form the consensus set S*_m. If the number of inliers in S*_m is at least 60 percent of the total number of matched corners, the projective transformation matrix is considered correct: S*_m is substituted as sample set into relation (16) and least squares is used to recompute a new model Q'_m as the refined projective transformation matrix. If the number of inliers in S*_m is below 60 percent of the total, the trial fails and Q_m = 0.
(V) If m < m̂, set m = m + 1 and repeat (II) through (V); if m = m̂, the sampling iteration ends. After sampling, choose from the resulting set of new models the consensus set S*_max with the largest number of samples as the optimized matched corner set, and take the correspondingly computed model Q'_max as the final transformation matrix Q; the algorithm then ends.
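In practice, this step corresponds closely to OpenCV's RANSAC homography estimation; the sketch below is a hedged equivalent using the patent's error threshold of 3, not a transcription of iterations (I)-(V), and the function name is our own.

```python
import cv2
import numpy as np

def ransac_filter(pts_i: np.ndarray, pts_i1: np.ndarray):
    """Keep only matched corner pairs consistent with one projective transform Q."""
    # findHomography runs RANSAC internally: random 4-point samples, a
    # reprojection-error threshold (3 pixels here), and a consensus check.
    Q, inlier_mask = cv2.findHomography(pts_i, pts_i1, cv2.RANSAC, 3.0)
    ok = inlier_mask.ravel() == 1
    return Q, pts_i[ok], pts_i1[ok]
```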
Step 8: Process the optimized matched corner set obtained in Step 7 with the frame difference method and compute the motion state of the vehicle: take the matched corners retained after RANSAC optimization and difference their coordinates between the i-th and (i+1)-th frames to obtain the pixel displacements; then, combining the intrinsic and extrinsic camera parameters obtained in Step 2, compute the state quantities of the vehicle in the vehicle carrier coordinate system.
The frame difference method computes the vehicle motion state as follows:
1. Let the number of samples in the matched corner set after RANSAC optimization be ĵ. Initialize the sample counter j, setting j = 1.
2. Difference the coordinates of the j-th optimized matched corner between the (i+1)-th and i-th frames to obtain its pixel displacement Δp(u_{i,j}, v_{i,j}) over the interval between shooting the i-th and (i+1)-th frames, where u_{i,j} and v_{i,j} are the u- and v-axis coordinates, in the i-th frame, of the j-th optimized matched corner.
3. Using the camera parameters calibrated in Step 2, invert the projection of the pixel displacement Δp(u_{i,j}, v_{i,j}) to obtain the physical displacement of the j-th optimized match over the interval between shooting the i-th and (i+1)-th frames.
4. Over the interval between shooting the i-th and (i+1)-th frames, the group of vehicle running state parameters computed from the j-th optimized match is:

$$v_{i,j} = \frac{\Delta P(x_{i,j}, y_{i,j})}{\Delta t} = \frac{P(x_{i+1,j}, y_{i+1,j}) - P(x_{i,j}, y_{i,j})}{t_{i+1} - t_i}, \quad v_{\vec{x},i,j} = \frac{\Delta x_{i,j}}{\Delta t} = \frac{x_{i+1,j} - x_{i,j}}{t_{i+1} - t_i}, \quad v_{\vec{y},i,j} = \frac{\Delta y_{i,j}}{\Delta t} = \frac{y_{i+1,j} - y_{i,j}}{t_{i+1} - t_i} \qquad (18)$$

where t_i and t_{i+1} are the moments at which the i-th and (i+1)-th frames are taken; P(x_{i,j}, y_{i,j}) and P(x_{i+1,j}, y_{i+1,j}) are the physical coordinates of the vehicle at t_i and t_{i+1} obtained from the j-th optimized matched corner; ΔP(x_{i,j}, y_{i,j}) is the physical displacement of the vehicle from t_i to t_{i+1}; x_{i,j} and x_{i+1,j} are the longitudinal coordinates, and y_{i,j} and y_{i+1,j} the transverse coordinates, of the vehicle at t_i and t_{i+1}; Δx_{i,j} and Δy_{i,j} are the longitudinal and transverse displacements of the vehicle from t_i to t_{i+1}; v_{i,j} is the instantaneous speed of the vehicle from t_i to t_{i+1}; and v_{x,i,j} and v_{y,i,j} are the longitudinal and transverse velocities of the vehicle from t_i to t_{i+1}, all obtained from the j-th optimized matched corner.
5. If j < ĵ, set j = j + 1 and repeat items 2 through 5; if j = ĵ, the iteration ends.
Average the ĵ groups of vehicle motion parameters obtained by the frame difference method, and take the average as the motion parameters of the vehicle from t_i to t_{i+1}:

$$v_i = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{i,j}, \quad v_{\vec{x},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{x},i,j}, \quad v_{\vec{y},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{y},i,j}, \quad \beta_i = \arctan\left(\frac{v_{\vec{y},i}}{v_{\vec{x},i}}\right) \qquad (19)$$

where v_i is the instantaneous speed of the vehicle from t_i to t_{i+1}; v_{x,i} and v_{y,i} are the longitudinal and transverse velocities of the vehicle from t_i to t_{i+1}; β_i is the sideslip angle of the vehicle from t_i to t_{i+1}; and arctan() is the arctangent function.
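A simplified Python sketch of Step 8 under one extra assumption: because the camera looks straight down from a fixed height, the calibrated inverse projection of Step 2 is approximated here by a constant metres-per-pixel scale m_per_px. All names are illustrative, not the patent's.

```python
import numpy as np

def motion_state(pts_i, pts_i1, dt: float, m_per_px: float):
    """Frame-difference estimate of speed, v_x, v_y and sideslip angle (eqs. 18-19)."""
    # Pixel displacements of every optimized matched corner (u along x, v along y).
    dpix = (pts_i1 - pts_i).reshape(-1, 2)
    # Physical displacements on the road plane; the fixed scale stands in for
    # the calibrated inverse projection (an assumption, see lead-in).
    dx = dpix[:, 0] * m_per_px
    dy = dpix[:, 1] * m_per_px
    vx = np.mean(dx / dt)                 # longitudinal velocity, eq. (19)
    vy = np.mean(dy / dt)                 # transverse velocity, eq. (19)
    v = np.mean(np.hypot(dx, dy) / dt)    # instantaneous speed
    beta = np.arctan2(vy, vx)             # sideslip angle beta_i
    return v, vx, vy, beta
```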
Step 9: Process the acquired pavement images in chronological order by recursing through Steps 3 to 9, i.e. set the image sequence pointer i = i + 1 and repeat the calculation of Steps 3 to 9, obtaining the longitudinal velocity, transverse velocity and sideslip angle of the vehicle at every moment.
Beneficial effects
1. The optical flow based vehicle motion state estimation method proposed by the present invention adopts an efficient corner detection method, a matching and tracking method, and the RANSAC optimization algorithm, and therefore has the advantages of good real-time performance and relatively high measurement accuracy.
2. The image preprocessing adopted by the method reduces the adverse effects of road conditions and illumination variation and strengthens the environmental adaptability of the algorithm.
3. The method can measure important state variables such as the longitudinal vehicle speed, the transverse vehicle speed and the sideslip angle relatively accurately, and these state parameters can serve as input signals for systems such as vehicle active safety, integrated positioning and obstacle detection.
Description of drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 is the original current-frame image obtained by the camera in the test;
Fig. 3 is the current-frame image after image preprocessing in the test;
Fig. 4 shows the corners detected in the current frame by the Harris corner detection method;
Fig. 5 is the corner matching diagram traced between the current frame and the next frame by the Lucas-Kanade corner matching method;
Fig. 6 is the corner matching diagram between the current frame and the next frame after RANSAC optimization.
Embodiment
State estimation during vehicle driving is one of the key technologies for realizing autonomous driving and driver assistance. It aims to determine important state variables of a moving vehicle such as longitudinal velocity, transverse velocity and sideslip angle, and has great practical value for active safety control systems such as the anti-lock braking system (ABS), the electronic stability program (ESP) and four-wheel steering (4WS) stability control. Whether these active safety control systems can work properly depends to a great extent on obtaining the vehicle state parameters accurately and in real time.
At present, in the field of automotive active safety, methods for estimating the motion state of a vehicle fall mainly into three classes. The first class is indirect measurement, which uses the output of existing on-board sensors such as an inertial navigation system (INS) or wheel speed sensors and obtains the relevant vehicle states by simple calculation; however, sensor bias accumulates over time, so the estimation accuracy degrades. The second class is direct measurement, which obtains the motion state parameters directly with high-precision equipment such as a high-precision GPS; this measures accurately but is expensive, and in signal-blocked areas such as tunnels and regions shadowed by tall buildings GPS cannot output effectively and its positioning accuracy degrades. The third class is based on vehicle models: the motion of the vehicle is modelled kinematically or dynamically, information from on-board sensors (such as wheel speed sensors, gyroscopes and accelerometers) is used as observations, and a suitable filtering and estimation algorithm realizes the estimation of the driving state; this requires modelling the whole vehicle or even each tire separately, and when the models or model parameters are inaccurate the estimation error is large.
For vehicles travelling at 0-15 km/h on flat asphalt pavement in a road traffic environment, the present invention proposes an optical flow based vehicle motion state measurement method with the advantages of low cost, good real-time performance, strong environmental adaptability and relatively high accuracy. The following reasonable assumptions are made:
(1) the asphalt pavement travelled on is approximately level;
(2) the vehicle is taken to move in a straight line within the interval between consecutive frames;
(3) the light intensity is taken to be constant within the interval between consecutive frames.
The idea of the present invention is further described in conjunction with the flow chart shown in Fig. 1:
Step 1: Install a high-precision monocular camera at the center of the rear axle of the vehicle, with its optical axis perpendicular to the vehicle's plane of motion and pointing down toward the ground, at a vertical height of 20-40 cm, to acquire the sequence of changing pavement images. Define the vehicle carrier coordinate system: its origin o is at the vehicle's center of mass, the x axis lies along the longitudinal axis of the vehicle and points in the direction of forward travel, the z axis is perpendicular to the plane of motion and points toward the ground, and the y axis is determined by the right-hand rule. Define the pavement image coordinate system: its origin o_p is the lower-left corner of the image, the longitudinal axis u is parallel to the vehicle carrier x axis, and the transverse axis v is parallel to the vehicle carrier y axis.
Step 2: Calibrate the camera to obtain its intrinsic and extrinsic parameters. To improve the real-time performance of the whole algorithm flow while guaranteeing calibration accuracy, the present invention adopts Zhang Zhengyou's classical camera calibration method [reference: Z. Zhang. A flexible new technique for camera calibration [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334]. Zhang's classical calibration algorithm calibrates the camera simply by photographing a calibration board from different directions several times (more than three), without needing to know how the board moves.
According to practical needs, the calibration board used in the present invention is drawn with a plotter on a blank A4 sheet: 4×4 evenly distributed black squares of equal size, pasted onto a hard plastic sheet. The side of each black square is 30 mm and the spacing is 20 mm, so the whole board has 64 square vertices (i.e. corners). For simplicity, at calibration time the world coordinate system OXYZ is set up directly on the board: the origin O is at the lower-left square vertex of the board, the X and Y axes are parallel to two sides of the squares, and the Z axis is determined by the right-hand rule. The Z coordinate of the 64 corners is thus 0 (for simplicity, the Z coordinate of the points is ignored), and the X, Y coordinates are easily determined. Let an arbitrary point of the two-dimensional image obtained by the camera have coordinates p(u, v) in the image coordinate system; the relation between a point P(x, y) on the board and its projection p(u, v) is expressed as:
$$a \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A [r_{cam,1}, r_{cam,2}, r_{cam,3}, t_{cam}] \begin{bmatrix} x \\ y \\ 0 \\ 1 \end{bmatrix} = A [r_{cam,1}, r_{cam,2}, t_{cam}] \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \qquad (1)$$

where a is an arbitrary scale factor; the present invention uses no scaling, so a = 1. [u, v, 1]^T is the homogeneous vector formed by p(u, v), the projection onto the image plane of the corresponding board point; [x, y, 0, 1]^T is the augmented vector of P(x, y); [x, y, 1]^T is the homogeneous vector formed by the board-plane point P(x, y); r_{cam,1}, r_{cam,2}, r_{cam,3} are the columns of the 3×3 rotation matrix of the camera with respect to the world coordinate system; t_cam is the 3×1 translation vector of the camera with respect to the world coordinate system; and

$$A = \begin{bmatrix} \alpha & \gamma & o_u \\ 0 & \beta & o_v \\ 0 & 0 & 1 \end{bmatrix}$$

is the camera intrinsic matrix. Since the optical axis of the camera is perpendicular to the ground, (o_u, o_v) is the coordinate, in the image coordinate system, of the vertical projection of the optical axis onto the image plane; α and β are the scale factors of the image u and v axes; γ is the skew of the two coordinate axes.
From the properties of a rotation matrix (its column vectors are mutually orthogonal and each has unit norm, i.e. $r_{cam,1}^T r_{cam,2} = 0$ and $\|r_{cam,1}\| = \|r_{cam,2}\| = 1$), each image yields the following two basic constraints on the intrinsic matrix. Throughout the present invention, the superscript −1 denotes the matrix inverse and the superscript T the matrix transpose.

$$b_1^T (A^{-1})^T A^{-1} b_2 = 0 \qquad (2)$$

$$b_1^T (A^{-1})^T A^{-1} b_1 = b_2^T (A^{-1})^T A^{-1} b_2 \qquad (3)$$
Let

$$B = \begin{bmatrix} b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23} \\ b_{31} & b_{32} & b_{33} \end{bmatrix} = [b_1, b_2, b_3] = a A [r_{cam,1}, r_{cam,2}, t_{cam}]$$

where B is a 3×3 projection matrix, and define the 3×3 matrix:

$$D = (A^{-1})^T A^{-1} = \begin{bmatrix} d_{11} & d_{12} & d_{13} \\ d_{12} & d_{22} & d_{23} \\ d_{13} & d_{23} & d_{33} \end{bmatrix} = \begin{bmatrix} \frac{1}{\alpha^2} & -\frac{\gamma}{\alpha^2\beta} & \frac{o_v\gamma - o_u\beta}{\alpha^2\beta} \\ -\frac{\gamma}{\alpha^2\beta} & \frac{\gamma^2}{\alpha^2\beta^2} + \frac{1}{\beta^2} & -\frac{\gamma(o_v\gamma - o_u\beta)}{\alpha^2\beta^2} - \frac{o_v}{\beta^2} \\ \frac{o_v\gamma - o_u\beta}{\alpha^2\beta} & -\frac{\gamma(o_v\gamma - o_u\beta)}{\alpha^2\beta^2} - \frac{o_v}{\beta^2} & \frac{(o_v\gamma - o_u\beta)^2}{\alpha^2\beta^2} + \frac{o_v^2}{\beta^2} + 1 \end{bmatrix}$$

Since D^T = D, D is symmetric, so its elements can be rearranged into a 6-dimensional vector D̃, of the form:

$$\tilde{D} = [d_{11}, d_{12}, d_{22}, d_{13}, d_{23}, d_{33}]^T \qquad (4)$$
The quadratic forms in the 1st and 2nd column vectors of B can then be expressed as:

$$b_1^T (A^{-1})^T A^{-1} b_1 = b_1^T D b_1 = v_{11}^T \tilde{D}$$
$$b_1^T (A^{-1})^T A^{-1} b_2 = b_1^T D b_2 = v_{12}^T \tilde{D} \qquad (5)$$
$$b_2^T (A^{-1})^T A^{-1} b_2 = b_2^T D b_2 = v_{22}^T \tilde{D}$$

where, writing b_j = [b_{j1}, b_{j2}, b_{j3}]^T,

$$v_{ij} = [b_{i1}b_{j1},\ b_{i1}b_{j2} + b_{i2}b_{j1},\ b_{i2}b_{j2},\ b_{i3}b_{j1} + b_{i1}b_{j3},\ b_{i3}b_{j2} + b_{i2}b_{j3},\ b_{i3}b_{j3}]^T$$

The constraint equations (2), (3) are then equivalent to:

$$\begin{bmatrix} v_{12}^T \\ (v_{11} - v_{22})^T \end{bmatrix} \tilde{D} = 0 \qquad (6)$$

where the coefficient matrix is a 2×6 matrix, i.e. every picture yields a system of 2 equations. With 6 unknowns, at least 3 pictures are needed to solve for the matrix D, from which the camera intrinsic matrix A can be obtained. Once the intrinsic matrix A is determined, the extrinsic parameters corresponding to each calibration board image are:
$$r_{cam,1} = A^{-1} b_1 / \|A^{-1} b_1\|, \quad r_{cam,2} = A^{-1} b_2 / \|A^{-1} b_1\|, \quad r_{cam,3} = r_{cam,1} \times r_{cam,2}, \quad t_{cam} = A^{-1} b_3 / \|A^{-1} b_1\| \qquad (7)$$

where $\|A^{-1} b_1\|$ is the norm of the matrix $A^{-1} b_1$ and × denotes the vector cross product.
Because the intrinsic parameters of the camera are determined by its internal structure and are independent of external factors, several (≥3) good calibration board images are first taken in a favorable indoor environment; the intrinsic and extrinsic parameters obtained above are used as initial values and refined under the maximum likelihood criterion. The refined results serve as the final intrinsic and extrinsic parameters.
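For reference, a hedged OpenCV sketch of this calibration step. Here detect_board_corners is a hypothetical user-supplied routine (OpenCV's chessboard finder expects a different board pattern than the 4×4-squares board described above), while cv2.calibrateCamera performs the closed-form Zhang solution plus the maximum-likelihood refinement.

```python
import numpy as np
import cv2

def board_object_points() -> np.ndarray:
    """World coordinates (Z = 0, metres) of the 64 vertices of the 4x4 board:
    30 mm squares with 20 mm gaps, origin at the lower-left vertex."""
    coords = []
    for row in range(4):
        for col in range(4):
            x0, y0 = col * 0.050, row * 0.050   # 30 mm side + 20 mm gap = 50 mm pitch
            for dx, dy in ((0, 0), (0.030, 0), (0.030, 0.030), (0, 0.030)):
                coords.append((x0 + dx, y0 + dy, 0.0))
    return np.array(coords, dtype=np.float32)   # 64 points

def calibrate(images, detect_board_corners):
    """Zhang-style calibration from >=3 views; detect_board_corners is hypothetical."""
    obj_pts, img_pts = [], []
    for img in images:
        corners = detect_board_corners(img)     # expected (64,1,2) float32
        obj_pts.append(board_object_points())
        img_pts.append(corners)
    h, w = images[0].shape[:2]
    # Closed-form solution followed by maximum-likelihood (reprojection) refinement.
    rms, A, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
    return A, dist, rvecs, tvecs                # intrinsic matrix A, extrinsics per view
```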
Step 3: To process the image sequence in chronological order, define the image sequence pointer i and set i = 1.
Step 4: Take the i-th and (i+1)-th pavement images and preprocess each of them. Fig. 2 shows the current-frame (i-th) original image of the road surface obtained by the camera in the test; as Fig. 2 shows, the corners of the original image are not distinct and the feature information is sparse. To highlight the corner features of the asphalt pavement and to reduce the adverse effects of road conditions and illumination variation, image enhancement is applied to the acquired image sequence. Histogram equalization is an important image enhancement algorithm, widely used in fields such as image compression, image segmentation and image recognition; it can purposefully reshape the gray-level distribution of the histogram so that gray levels are spread uniformly over the whole gray range, and its processing speed is fast enough to satisfy real-time requirements.
The histogram equalization comprises computing the gray-level histogram function of the image and performing equalization.
The gray-level histogram function of the image is computed as follows:
First convert the image to grayscale and normalize the gray values of all its pixels, then compute the gray-level histogram function of the image:

$$\mathrm{pro}(r_k) = \frac{n_k}{n}, \quad k = 0, 1, 2, \ldots, 255, \quad 0 \le r_k \le 1 \qquad (8)$$

where n is the total number of pixels in the image, r_k is the k-th gray level, n_k is the number of pixels with gray level r_k, and pro(r_k) is the probability that gray level r_k occurs; in rectangular coordinates, the plot of pro(r_k) against r_k is the gray-level histogram. This histogram reflects the statistical relationship between each gray level and its frequency of occurrence in the image, and describes the average brightness and the contrast of the whole image.
The equalization uses the cumulative gray distribution function of the image to map its gray-level distribution onto a wider, more uniform one:

$$s_k = \sum_{l=0}^{k} \mathrm{pro}(r_l), \quad 0 \le r_l \le 1, \quad l = 0, 1, 2, \ldots, k, \quad k = 0, 1, 2, \ldots, 255 \qquad (9)$$

where s_k is the cumulative distribution function at the k-th gray level and r_l is the l-th gray level. The transformed gray level is:

$$r'_k = \mathrm{INT}[(L-1)s_k + 0.5], \quad k = 0, 1, 2, \ldots, 255 \qquad (10)$$

where L = 256, INT is the rounding-to-integer function, and r'_k is the gray level r_k after transformation. Fig. 3 shows the effect of preprocessing the original current-frame pavement image in the test: compared with Fig. 2, the contrast of the image is enhanced and the corner features are more distinct, which benefits the subsequent corner detection.
Step 5: Perform Harris corner detection on the preprocessed i-th frame, and use the Lucas-Kanade optical flow method to match and track, in the (i+1)-th frame, the coordinates of all corners detected by the Harris algorithm in the i-th frame.
A corner, also called a feature point, is an important local feature of an image; it reflects much of the information of the image contour and is also a key factor affecting detection performance. The most commonly used corner detection methods at present include the Harris corner detection algorithm, the SUSAN algorithm, and Canny-operator corner detection. The SUSAN algorithm has strong noise immunity, but when the target edges in the image are blurred its corner extraction is inaccurate; the Canny operator has the advantages of a high signal-to-noise ratio and high detection precision, but easily loses edge details; the Harris algorithm is invariant to translation and rotation of the image gray levels and is fast.
Converting the displacement association that the detected corners form between two consecutive frames into a velocity association yields the "optical flow": the instantaneous velocity of the pixel motion, on the observed imaging surface, of objects moving in space. From the temporal variation and correlation of the pixel intensity data in the image sequence, the "motion" of each pixel position can be determined, so that a motion field that cannot be obtained directly is approximated from the image sequence.
Commonly used optical flow tracking and matching methods include the Horn-Schunck dense optical flow method, the scale-invariant feature transform (SIFT), and the Lucas-Kanade matching algorithm. Among these, the Horn-Schunck dense method is a global optical flow computation: it yields accurate flow estimates but has poor real-time performance. SIFT has many advantages, such as invariance to rotation, scale change and brightness change, but the algorithm is complex and its processing speed falls far short of what real-time vehicle motion parameter estimation requires. Lucas-Kanade is one of the important sparse optical flow methods; it needs only the local information in a small window around each corner, so it is fast, and it suits situations where brightness changes little and the inter-frame motion is small and coherent.
In summary, the Harris corner detection algorithm and the Lucas-Kanade matching algorithm can satisfy the real-time and accuracy requirements of measuring the motion parameters of a vehicle travelling at 0-15 km/h on flat asphalt pavement in a road traffic environment. The present invention therefore uses the Harris algorithm to find the corners of the i-th frame, then uses the Lucas-Kanade optical flow matching algorithm to track, in the (i+1)-th frame, the coordinates of the points matching the corners of the i-th frame, and from these obtains the motion state parameters of the vehicle over the ground.
The Harris algorithm regards a point as a corner if a small offset of the point in any direction causes an acute change of gray level. The gray-level variation of the image gray function at an arbitrary point (u_0, v_0) along a given direction is defined as:

$$E(u_0, v_0) = \sum [I(u_0+\Delta u, v_0+\Delta v) - I(u_0, v_0)]^2, \quad (u_0+\Delta u, v_0+\Delta v) \in w(u_0, v_0, r_{ad}) \qquad (11)$$

where E(u_0, v_0) is the gray-level variation at the point (u_0, v_0) of the i-th frame; Δu and Δv are the displacements of the pixel along the u and v axes; I(u_0, v_0) and I(u_0+Δu, v_0+Δv) are the gray values at (u_0, v_0) and (u_0+Δu, v_0+Δv); and the window function $w(u_0, v_0, r_{ad}) = \{(u, v) \mid (u-u_0)^2 + (v-v_0)^2 \le r_{ad}^2\}$ is a circular window centered at (u_0, v_0) with radius r_{ad}. Expanding I(u_0+Δu, v_0+Δv) as a Taylor series and neglecting higher-order terms gives:

$$I(u_0+\Delta u, v_0+\Delta v) \approx I(u_0, v_0) + \Delta u\, I_u(u_0, v_0) + \Delta v\, I_v(u_0, v_0) \qquad (12)$$

where $I_u(u_0, v_0) = \partial I(u_0, v_0)/\partial u$ and $I_v(u_0, v_0) = \partial I(u_0, v_0)/\partial v$ are the gradients of I(u_0, v_0) along the u and v axes. Combining the two formulas above gives:

$$E(u_0, v_0) = [\Delta u, \Delta v]\, M \begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix}, \qquad M = \sum \begin{bmatrix} I_u^2(u_0, v_0) & I_u(u_0, v_0) I_v(u_0, v_0) \\ I_u(u_0, v_0) I_v(u_0, v_0) & I_v^2(u_0, v_0) \end{bmatrix} \qquad (13)$$

where the matrix M is the autocorrelation matrix of the image window around the point (u_0, v_0); for an image, the autocorrelation function characterizes how strongly the local image gray level varies. Diagonalizing the symmetric matrix M by a similarity transformation and substituting into relation (13) gives:

$$E(u_0, v_0) = [\Delta u, \Delta v]\, R^{-1} \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} R \begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix} \qquad (14)$$

where R is the rotation factor, R^{-1} is its inverse, and λ_1, λ_2 are the eigenvalues of the matrix M, which reflect the curvature of the image surface along the two principal axis directions at the corner. If and only if both eigenvalues are large does an offset in any direction cause a violent gray-level change; the corner response function CRF satisfying the Harris definition is expressed as:

$$\mathrm{CRF}(u_0, v_0) = \det(M) - \eta \cdot [\mathrm{trace}(M)]^2 \qquad (15)$$

where CRF(u_0, v_0) is the corner response function at (u_0, v_0); det(M) = λ_1 λ_2 is the determinant of the matrix M; trace(M) = λ_1 + λ_2 is the trace of M; and the scale factor η is an empirical value, η = 0.04. When the CRF of a point exceeds a given threshold (usually taken as 18), the point is taken to be a corner and its pixel coordinates are obtained. The corners detected by the Harris algorithm in the current frame are the red marked points in Fig. 4.
Then the Lucas-Kanade optical flow method is used to track, in the (i+1)-th frame, the coordinates of all corners detected by the Harris algorithm in the i-th frame. Lucas-Kanade is a two-frame differential optical flow estimation algorithm based on the following three hypotheses:
(1) Brightness constancy. A pixel of a target in the image scene keeps the same appearance as it moves from frame to frame; for grayscale images this assumes the pixel brightness does not change while it is tracked frame by frame.
(2) Temporal continuity, or "small motion". In practice, the temporal change is small relative to the image motion, so the motion of the target between frames is comparatively small.
(3) Spatial coherence. Neighboring points on the same surface in the scene have similar motion, and their projections onto the image plane are also in a neighboring region.
Let an arbitrary corner G among all corners detected by the Harris algorithm in the i-th frame have coordinates (u_i, v_i) in that frame. Lucas-Kanade optical flow matching and tracking of corner G comprises the following steps:
First, establish the constraint equation between the i-th and (i+1)-th frames:

$$I(u_{i+1}, v_{i+1}, t_{i+1}) = I(u_i+\Delta u_i, v_i+\Delta v_i, t_i+\Delta t) = I(u_i, v_i, t_i) \qquad (16)$$

where t_i and t_{i+1} are the moments at which the i-th and (i+1)-th frames are taken; the time interval Δt = t_{i+1} − t_i depends on the refresh frequency of the acquired images, with typical values of 1/30 s, 1/25 s, 1/15 s, etc.; Δu_i and Δv_i are the displacements of corner G along the u and v axes from t_i to t_{i+1}; (u_i, v_i) and (u_{i+1}, v_{i+1}) are the coordinates of G in the i-th and (i+1)-th frames; and I(u_i, v_i, t_i) and I(u_{i+1}, v_{i+1}, t_{i+1}) are the gray values of corner G at t_i and t_{i+1}.
Then expand I(u_i+Δu_i, v_i+Δv_i, t_i+Δt) as a Taylor series, neglecting higher-order terms:

$$I(u_i+\Delta u_i, v_i+\Delta v_i, t_i+\Delta t) \approx I(u_i, v_i, t_i) + I_u \Delta u_i + I_v \Delta v_i + I_t \Delta t \qquad (17)$$

where $I_u = \partial I/\partial u$ and $I_v = \partial I/\partial v$ are the partial derivatives of the image gray level along the u and v axes, and $I_t = \partial I/\partial t$ is the partial derivative of the image gray level with respect to time t.
From relations (16) and (17):

$$I_u \frac{\Delta u_i}{\Delta t} + I_v \frac{\Delta v_i}{\Delta t} + I_t = 0 \qquad (18)$$

Next, set up the equations over the 5×5 pixel neighborhood of corner G at time t_i, forming the system:

$$Hc = g \qquad (19)$$

where

$$H = \begin{bmatrix} I_u(u_{i,1}, v_{i,1}, t_i) & I_v(u_{i,1}, v_{i,1}, t_i) \\ I_u(u_{i,2}, v_{i,2}, t_i) & I_v(u_{i,2}, v_{i,2}, t_i) \\ \vdots & \vdots \\ I_u(u_{i,25}, v_{i,25}, t_i) & I_v(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}, \quad c = \begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix}, \quad g = \begin{bmatrix} -I_t(u_{i,1}, v_{i,1}, t_i) \\ -I_t(u_{i,2}, v_{i,2}, t_i) \\ \vdots \\ -I_t(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}$$

H is the 25×2 matrix composed of the gray gradients of the 5×5 neighborhood pixels of corner G, c is the 2×1 column vector of the optical flow formed by the displacement between the two adjacent frames, and g is the 25×1 column vector composed of the partial derivatives of the gray level with respect to time.
Finally, solve the system in the least-squares sense; the least-squares solution is the motion of the central point G.
Multiplying both sides of equation (19) on the left by the transpose of H:

$$H^T H c = H^T g \qquad (20)$$

The solution of the equation, i.e. the optical flow formed between the two frames, is:

$$\begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix} = (H^T H)^{-1} H^T g \qquad (21)$$

The coordinates (u_{i+1}, v_{i+1}) of corner G in the (i+1)-th frame are then obtained from:

$$\begin{bmatrix} u_{i+1} \\ v_{i+1} \end{bmatrix} = \begin{bmatrix} \frac{\Delta u_i}{\Delta t} \\ \frac{\Delta v_i}{\Delta t} \end{bmatrix} \Delta t + \begin{bmatrix} u_i \\ v_i \end{bmatrix} \qquad (22)$$
The tracking effect of the Lucas-Kanade algorithm is shown in Fig. 5: the upper half of Fig. 5 is the current-frame pavement image, the lower half is the next-frame pavement image, and the points connected by red straight lines are the matched corners tracked by the Lucas-Kanade algorithm.
Step 6: Use the RANSAC algorithm to optimize the corners matched in Step 5. Completing the corner matching does not mark the end of the algorithm: after the initial Lucas-Kanade matching, two factors mainly affect the correctness and precision of the matches. On the one hand, the initial matching cannot fully guarantee that the resulting corners are correct; on the other hand, the corner locations extracted from the image are not entirely accurate. The present invention therefore introduces the robust RANSAC algorithm to reject the corners of Step 5 with larger matching errors. Given a set of sample data containing abnormal data, this algorithm computes the parameters of a mathematical model of the data and obtains the valid sample data. In the present setting, the valid sample data are the "inliers" after optimization: using the inherent constraints of the matched corner set, wrong matches and matches with larger errors can be rejected, eliminating the influence of "outliers", guaranteeing precision, and improving the quality of the matched corners and the robustness of the algorithm. The RANSAC algorithm specifically takes the following steps:
First, establish the model Q, the projective transformation matrix between the i-th and (i+1)-th frames:

$$\xi U_{i+1} = Q U_i \qquad (23)$$

where

$$U_{i+1} = \begin{bmatrix} u_{i+1} \\ v_{i+1} \\ 1 \end{bmatrix}, \quad U_i = \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}, \quad Q = \begin{bmatrix} q_0 & q_1 & q_2 \\ q_3 & q_4 & q_5 \\ q_6 & q_7 & q_8 \end{bmatrix}$$

U_i and U_{i+1} are the homogeneous vectors formed by the coordinates of an arbitrary matched corner G in the i-th and (i+1)-th frames, Q is the projective transformation matrix between the two frames, and ξ is a scale factor, taken as ξ = 1. The submatrix $\begin{bmatrix} q_0 & q_1 \\ q_3 & q_4 \end{bmatrix}$ represents scaling, rotation, reflection and shear of the image; the vector [q_2, q_5]^T represents translation of the image; the vector [q_6, q_7] represents the projective component; and q_8 represents an overall stretching of the image. Q has 8 degrees of freedom, so selecting 4 pairs of matched corners yields 8 linear equations; together with the constraint on Q itself, this gives 9 equations from which Q can be estimated. The re-estimation steps are:
(I) Initialize the iteration counter m = 1 and set a maximum number of iterations m̂.
(II) Randomly choose 4 pairs of matched corners from the matched corner pairs as a sample set S_m; the complement of the sample after extraction is S̄_m. Substitute the points of S_m into relation (23) to obtain an initial Q_m.
(III) Using the vector U_i formed by the i-th-frame coordinates of each matched corner in the complement S̄_m, together with the Q_m obtained in (II), evaluate the error function:

$$F = d_{Euc}^2(U_{i+1}, Q_m U_i) + d_{Euc}^2(U_i, Q_m^{-1} U_{i+1}) \qquad (24)$$

where Q_m^{-1} is the inverse of Q_m, d_{Euc}(U_{i+1}, Q_m U_i) is the three-dimensional Euclidean distance between the points U_{i+1} and Q_m U_i, and d_{Euc}(U_i, Q_m^{-1} U_{i+1}) is the three-dimensional Euclidean distance between the points U_i and Q_m^{-1} U_{i+1}. The error threshold is set to 3: a match whose error function value is below this threshold is regarded as an inlier and added to the inlier set; otherwise it is an outlier.
(IV) The inlier set and the sample set S_m together form the consensus set S*_m. If the number of inliers in S*_m is at least 60 percent of the total number of matched corners, the projective transformation matrix is considered correct: S*_m is substituted as sample set into relation (23) and least squares is used to recompute a new model Q'_m as the refined projective transformation matrix. If the number of inliers in S*_m is below 60 percent of the total, the trial fails and Q_m = 0.
(V) If m < m̂, set m = m + 1 and repeat (II) through (V); if m = m̂, the sampling iteration ends. After sampling, choose from the resulting set of new models the consensus set S*_max with the largest number of samples as the optimized matched corner set, and take the correspondingly computed model Q'_max as the final transformation matrix Q; the algorithm then ends.
The effect after RANSAC optimization is shown in Fig. 6: the point pairs connected by red straight lines in Fig. 6 are the matches after optimization, with the apparent mismatches and the larger-error matches of Fig. 5 rejected. The high reliability and high accuracy of the optimized matched corner set therefore further improve the precision of the real-time estimation of the vehicle motion parameters.
Step 7: Process the optimized matched corner set obtained in Step 6 with the frame difference method and compute the motion state of the vehicle. Take the matched corners retained after RANSAC optimization and difference their coordinates between the i-th and (i+1)-th frames to obtain the pixel displacements; then, combining the intrinsic and extrinsic camera parameters obtained in Step 2, compute the state quantities of the vehicle in the vehicle carrier coordinate system.
Described frame difference method is calculated state of motion of vehicle, and the concrete grammar of employing is as follows:
1. the matching angle point set sample size of establishing behind the consistent RANSAC algorithm optimization of stochastic sampling is
Figure BDA00002368243700141
Initialization sample number variable j makes j=1;
2. j the coupling angle point that extracts behind the consistent RANSAC algorithm optimization of stochastic sampling is respectively p (u at the coordinate of i+1 frame and i two field picture I+1, j, v I+1, j) and p (u I, j, v I, j), wherein, u I, j, v I, jRespectively u axle, the v axial coordinate of the coupling angle point of j behind the consistent RANSAC algorithm optimization of stochastic sampling in the i two field picture in the i two field picture, with p (u I+1, j, v I+1, j) and p (u I, j, v I, j) homogeneous vectors [u that consists of respectively I+1, j, v I+1, j, 1] T, [u I, j, v I, j, 1] TSubstitution relational expression (1) is carried out the coordinate inverse:
x i + 1 , j y i + 1 , j 1 = ( A [ r cam 1 , r cam 1 , t cam ] ) - 1 u i + 1 , j v i + 1 , j 1 , x i , j y i , j 1 = ( [ r cam 1 , r cam 1 , t cam ] ) - 1 u i , j v i , j 1 - - - ( 25 )
Wherein, [x I+1, j, y I+1, j, 1] T, [x I, j,y I, j, 1] TRespectively that j coupling angle point after optimizing is in the homogeneous vectors of the physical coordinates formation of i+1 frame and i two field picture;
3. video camera is taken the i two field picture within the time interval of taking the i+1 two field picture, by j after optimizing each parameter of mating one group of travel condition of vehicle that angle point calculates in real time:
v i , j = ΔP ( x i , j , y i , j ) Δt = P ( x i + 1 , j , y i + 1 , j ) - P ( x i , j , y i , j ) t i + 1 - t i
v x → i , j = Δ x i , j Δt = x i + 1 , j - x i , j t i + 1 - t i - - - ( 26 )
v y → i , j = Δ y i , j Δt = y i + 1 , j - y i , j t i + 1 - t i
Wherein, arctan () represents arctan function, t i, t I+1Dividing is the moment of taking i frame, i+1 two field picture, P (x I, j, y I, j), P (x I+1, j, y I+1, j) be respectively by j after optimizing mate that angle point obtains at t iThe moment and t I+1The coordinate of moment vehicle, Δ P (x I, j, y I, j) be by j after optimizing mate that angle point obtains from t iThe time be carved into t I+1The constantly displacement of vehicle, x I, j, x I+1, jBe respectively by j after optimizing mate that angle point obtains at t iThe moment and t I+1The ordinate of orthogonal axes value of moment vehicle, y I, j, y I+1, jBe respectively by j after optimizing mate that angle point obtains at t iThe moment and t I+1The transverse axis coordinate figure Δ x of moment vehicle I, j, Δ y I, jBe respectively by j after optimizing mate that angle point obtains from t iThe time be carved into t I+1The constantly longitudinal axis, the transverse axis displacement of vehicle, v I, jBe by j after optimizing mate that angle point obtains from t iThe time be carved into t I+1The instantaneous velocity of moment vehicle, Be respectively by j after optimizing mate that angle point obtains from t iThe time be carved into t I+1The longitudinal axis, the X direction instantaneous velocity of moment vehicle;
4. If $j < \hat{j}$, set j = j + 1 and repeat items 2 through 4 in order; if $j = \hat{j}$, the iteration ends.
The $\hat{j}$ groups of vehicle motion parameters obtained by the above frame-difference method are averaged, and the average is taken as the vehicle's motion parameters from $t_i$ to $t_{i+1}$:

$$v_i = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{i,j}, \qquad v_{\vec{x},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{x},i,j}, \qquad v_{\vec{y},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{y},i,j}, \qquad \beta_i = \arctan\left( \frac{v_{\vec{y},i}}{v_{\vec{x},i}} \right) \tag{27}$$
where arctan(·) denotes the arctangent function, $v_i$ is the instantaneous speed of the vehicle from $t_i$ to $t_{i+1}$, $v_{\vec{x},i}$ and $v_{\vec{y},i}$ are respectively its longitudinal and lateral velocities from $t_i$ to $t_{i+1}$, and $\beta_i$ is its side slip angle from $t_i$ to $t_{i+1}$;
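By way of illustration only, the frame-difference computation above maps directly to a few lines of array code: back-project the RANSAC-optimized pixel coordinates through the inverse projection of Eq. (25), difference the physical coordinates per Eq. (26), and average per Eq. (27). This is a minimal sketch, not the patented implementation: the function and argument names are invented, `P_inv` is assumed to be the pre-computed matrix $(A[r_{cam}^1, r_{cam}^2, t_{cam}])^{-1}$, and `np.arctan2` is used in place of plain arctan for quadrant robustness.

```python
import numpy as np

def vehicle_state(pts_i: np.ndarray, pts_i1: np.ndarray,
                  P_inv: np.ndarray, dt: float = 1 / 30):
    """Frame-difference state estimate following Eqs. (25)-(27).

    pts_i, pts_i1: (N, 2) pixel coordinates of the RANSAC-optimized matched
    corners in frames i and i+1; P_inv: assumed 3x3 inverse projection,
    (A [r_cam^1, r_cam^2, t_cam])^-1, mapping pixels to road-plane metres.
    """
    ones = np.ones((len(pts_i), 1))
    Xi = (P_inv @ np.hstack([pts_i, ones]).T).T    # Eq. (25), frame i
    Xi1 = (P_inv @ np.hstack([pts_i1, ones]).T).T  # Eq. (25), frame i+1
    Xi = Xi[:, :2] / Xi[:, 2:]                     # homogeneous normalization
    Xi1 = Xi1[:, :2] / Xi1[:, 2:]
    vel = (Xi1 - Xi) / dt                          # Eq. (26), per corner j
    vx, vy = vel.mean(axis=0)                      # Eq. (27), average over j
    v = np.linalg.norm(vel, axis=1).mean()         # mean instantaneous speed
    beta = np.arctan2(vy, vx)                      # side slip angle beta_i
    return v, vx, vy, beta
```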
Step 8: process the road-surface images obtained in chronological order by repeating Steps 3 through 8 recursively; that is, set the image-sequence pointer i = i + 1 and repeat the computations of Steps 3 through 8, obtaining the longitudinal velocity, lateral velocity and side slip angle of the vehicle at every instant.

Claims (1)

1. An optical-flow-based vehicle motion state estimation method, characterized in that it comprises the following steps:
Step 1: install a high-precision monocular camera at the center of the vehicle's rear axle, its optical axis perpendicular to the vehicle's running plane and pointing down at the ground, at a vertical height of 20–40 centimetres, to acquire the changing road-surface image sequence; define the vehicle carrier coordinate system, whose origin o lies at the vehicle's centre of mass, whose x axis runs along the vehicle's longitudinal axis in the direction of forward travel, whose z axis is perpendicular to the running plane and points down toward the ground, and whose y axis is determined by the right-hand rule; define the road-surface image coordinate system, whose origin $o_p$ is the bottom-left vertex of the image, whose longitudinal axis u is parallel to the carrier x axis, and whose transverse axis v is parallel to the carrier y axis;
Step 2: calibrate the camera using a classical camera calibration method to obtain the intrinsic parameters of the camera — the coordinates $(o_u, o_v)$, in the image coordinate system, of the vertical projection of the optical axis onto the image plane, the scale factor α along the image u axis, the scale factor β along the image v axis, and the skew γ between the u and v coordinate axes — and its extrinsic parameters: the 3 × 3 rotation matrix $r_{cam}$ and the 3 × 1 translation vector $t_{cam}$ of the camera with respect to the world coordinate system;
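For concreteness, the "classical camera calibration method" of this step can be realized with a Zhang-style chessboard calibration; the sketch below is one such realization and not the patent's specific procedure. The board dimensions, square size and image folder are assumptions, and OpenCV's `calibrateCamera` returns the intrinsic matrix and per-view extrinsics that play the roles of A, $r_{cam}$ and $t_{cam}$ here.

```python
import glob
import cv2
import numpy as np

# Hypothetical 9x6 inner-corner chessboard; the square size (metres),
# folder name and file pattern are assumptions of this sketch.
PATTERN, SQUARE = (9, 6), 0.025
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# A plays the role of the intrinsic matrix (principal point, scale
# factors, skew); each rvec/tvec pair gives extrinsics r_cam, t_cam.
rms, A, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix A:\n", A)
```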
Step 3: initialize the image-sequence pointer i, setting i = 1;
Step 4: take the i-th and (i+1)-th road-surface images and pre-process each of them; the image pre-processing comprises computing the image's grayscale histogram function and performing histogram equalization;
The grayscale histogram function of the image is computed as follows: first convert the image to grayscale and normalize the gray values of all its pixels, then compute the image's grayscale histogram function:
$$pro(r_k) = \frac{n_k}{n}, \qquad k = 0, 1, 2, \ldots, 255, \quad 0 \le r_k \le 1 \tag{1}$$
where n is the total number of image pixels, $r_k$ is the k-th gray level, $n_k$ is the number of pixels in the image whose gray level is $r_k$, and $pro(r_k)$ is the probability of gray level $r_k$ occurring; the plot of $r_k$ against $pro(r_k)$ in rectangular coordinates is the grayscale histogram;
The equalization uses the cumulative gray distribution function of the image:

$$s_k = \sum_{l=0}^{k} pro(r_l), \qquad 0 \le r_l \le 1, \quad l = 0, 1, 2, \ldots, k, \quad k = 0, 1, 2, \ldots, 255 \tag{2}$$
where $s_k$ is the cumulative distribution function at the k-th gray level and $r_l$ is the l-th gray level; the transformed gray level is

$$r'_k = INT[(L-1)s_k + 0.5], \qquad k = 0, 1, 2, \ldots, 255 \tag{3}$$

where L = 256, INT[·] is the integer-rounding function, and $r'_k$ is the gray level $r_k$ after transformation;
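The equalization of Eqs. (1)–(3) reduces to a few lines of array code; the following is a minimal sketch, assuming an 8-bit grayscale input (the function name is invented for illustration):

```python
import numpy as np

def equalize(gray: np.ndarray) -> np.ndarray:
    """Histogram equalization following Eqs. (1)-(3):
    pro(r_k) = n_k / n, s_k = cumulative sum, r'_k = INT[(L-1)s_k + 0.5]."""
    L = 256
    n = gray.size
    hist = np.bincount(gray.ravel(), minlength=L)       # n_k per gray level
    pro = hist / n                                      # Eq. (1)
    s = np.cumsum(pro)                                  # Eq. (2)
    lut = np.floor((L - 1) * s + 0.5).astype(np.uint8)  # Eq. (3)
    return lut[gray]                                    # remap every pixel
```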
Step 5: perform Harris corner detection on the pre-processed i-th image; the Harris corner detection comprises computing the grayscale variation and the corner response function at every point of the i-th image, then performing the corner test;
The grayscale variation is computed as follows:

$$E(u_0, v_0) = \sum \left[ I(u_0 + \Delta u, v_0 + \Delta v) - I(u_0, v_0) \right]^2, \qquad (u_0 + \Delta u, v_0 + \Delta v) \in w(u_0, v_0, r_{ad}) \tag{4}$$

where $E(u_0, v_0)$ is the grayscale variation at an arbitrary point $(u_0, v_0)$ of the i-th image; Δu, Δv are respectively the displacements of $(u_0, v_0)$ along the u and v axes; $I(u_0, v_0)$ is the gray value of $(u_0, v_0)$ and $I(u_0 + \Delta u, v_0 + \Delta v)$ that of $(u_0 + \Delta u, v_0 + \Delta v)$; and the window function $w(u_0, v_0, r_{ad}) = \{(u, v) \mid (u - u_0)^2 + (v - v_0)^2 \le r_{ad}^2\}$ is a circular window of radius $r_{ad}$ centred on $(u_0, v_0)$; expanding $I(u_0 + \Delta u, v_0 + \Delta v)$ as a Taylor series and neglecting the higher-order terms gives:
$$I(u_0 + \Delta u, v_0 + \Delta v) \approx I(u_0, v_0) + \Delta u \cdot I_u(u_0, v_0) + \Delta v \cdot I_v(u_0, v_0) \tag{5}$$

where $I_u(u_0, v_0) = \frac{\partial I(u_0, v_0)}{\partial u}$ and $I_v(u_0, v_0) = \frac{\partial I(u_0, v_0)}{\partial v}$ are respectively the gradients of $I(u_0, v_0)$ along the u and v axes; from the two formulas above one obtains:
$$E(u_0, v_0) = [\Delta u, \Delta v] \, M \begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix}, \qquad M = \sum \begin{bmatrix} I_u^2(u_0, v_0) & I_u(u_0, v_0) I_v(u_0, v_0) \\ I_u(u_0, v_0) I_v(u_0, v_0) & I_v^2(u_0, v_0) \end{bmatrix} \tag{6}$$

where the matrix M is the autocorrelation matrix of the image window around the arbitrary point $(u_0, v_0)$; similarity-diagonalizing the symmetric matrix M and substituting into relation (6) gives:

$$M = R^{-1} \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} R \tag{7}$$

where R is a rotation matrix, $R^{-1}$ is the inverse of R, and $\lambda_1$, $\lambda_2$ are the eigenvalues of the matrix M;
The corner response function CRF is obtained as follows:

$$CRF(u_0, v_0) = \det(M) - \eta \cdot [\mathrm{trace}(M)]^2 \tag{8}$$

where $CRF(u_0, v_0)$ is the corner response function at the arbitrary point $(u_0, v_0)$; $\det(M) = \lambda_1 \lambda_2$ is the determinant of the matrix M; $\mathrm{trace}(M) = \lambda_1 + \lambda_2$ is the trace of the matrix M; and the scale factor η is an empirical value, η = 0.04;
The corner test is as follows: when the corner response function CRF of a point exceeds the threshold, the point is taken to be a corner and its pixel coordinates are recorded; the threshold is taken as 18;
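A minimal sketch of this detector follows, built directly from Eqs. (6)–(8) with η = 0.04. The Sobel gradient operator, the 5 × 5 Gaussian window standing in for the circular window, and the function name are assumptions of the sketch; note also that the numerical scale of the threshold depends on how the gradients are normalized, so the value 18 only transfers if the same normalization is used.

```python
import cv2
import numpy as np

def harris_corners(gray: np.ndarray, eta: float = 0.04,
                   thresh: float = 18.0) -> np.ndarray:
    """Harris detection per Eqs. (6)-(8): build the autocorrelation matrix M
    from image gradients, then CRF = det(M) - eta * trace(M)^2."""
    g = gray.astype(np.float32)
    Iu = cv2.Sobel(g, cv2.CV_32F, 1, 0, ksize=3)   # I_u: gradient along u
    Iv = cv2.Sobel(g, cv2.CV_32F, 0, 1, ksize=3)   # I_v: gradient along v
    # Window the gradient products to form the entries of M.
    Iuu = cv2.GaussianBlur(Iu * Iu, (5, 5), 1.0)
    Ivv = cv2.GaussianBlur(Iv * Iv, (5, 5), 1.0)
    Iuv = cv2.GaussianBlur(Iu * Iv, (5, 5), 1.0)
    det = Iuu * Ivv - Iuv * Iuv                    # det(M) = lambda1*lambda2
    tr = Iuu + Ivv                                 # trace(M) = lambda1+lambda2
    crf = det - eta * tr * tr                      # Eq. (8)
    v, u = np.nonzero(crf > thresh)                # corner test
    return np.stack([u, v], axis=1).astype(np.float32)
```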
Step 6: use the Lucas-Kanade optical flow method to match and track, in the (i+1)-th image, the coordinates of all the corners detected by the Harris algorithm in the i-th image; an arbitrary corner G of the i-th image has coordinates $(u_i, v_i)$ there, and the Lucas-Kanade optical-flow matching and tracking of corner G comprises the following steps:
First, establish the constraint equation between the i-th and (i+1)-th images:

$$I(u_{i+1}, v_{i+1}, t_{i+1}) = I(u_i + \Delta u_i, v_i + \Delta v_i, t_i + \Delta t) = I(u_i, v_i, t_i) \tag{9}$$

where $t_i$, $t_{i+1}$ are respectively the capture times of the i-th and (i+1)-th images; the time interval $\Delta t = t_{i+1} - t_i$ is a fixed value, with typical values 1/30 s, 1/25 s and 1/15 s; $\Delta u_i$, $\Delta v_i$ are respectively the displacements of corner G along the u and v axes within the time Δt; $(u_i, v_i)$ and $(u_{i+1}, v_{i+1})$ are the coordinates of corner G in the i-th and (i+1)-th images; and $I(u_i, v_i, t_i)$, $I(u_{i+1}, v_{i+1}, t_{i+1})$ are the gray values of corner G at times $t_i$ and $t_{i+1}$;
Then expand $I(u_i + \Delta u_i, v_i + \Delta v_i, t_i + \Delta t)$ as a Taylor series, neglecting the higher-order terms:

$$I(u_i + \Delta u_i, v_i + \Delta v_i, t_i + \Delta t) \approx I(u_i, v_i, t_i) + I_u \cdot \Delta u_i + I_v \cdot \Delta v_i + I_t \cdot \Delta t \tag{10}$$

where $I_u = \frac{\partial I}{\partial u}$ is the partial derivative of the image gray level along the u axis, $I_v = \frac{\partial I}{\partial v}$ is its partial derivative along the v axis, and $I_t = \frac{\partial I}{\partial t}$ is its partial derivative with respect to time t;
From relations (9) and (10) one obtains:

$$I_u \cdot \frac{\Delta u_i}{\Delta t} + I_v \cdot \frac{\Delta v_i}{\Delta t} + I_t = 0 \tag{11}$$
Then set up the system of equations over the 5 × 5 pixel neighbourhood of corner G at time $t_i$:

$$Hc = g \tag{12}$$

where

$$H = \begin{bmatrix} I_u(u_{i,1}, v_{i,1}, t_i) & I_v(u_{i,1}, v_{i,1}, t_i) \\ I_u(u_{i,2}, v_{i,2}, t_i) & I_v(u_{i,2}, v_{i,2}, t_i) \\ \vdots & \vdots \\ I_u(u_{i,25}, v_{i,25}, t_i) & I_v(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}, \qquad c = \begin{bmatrix} \dfrac{\Delta u_i}{\Delta t} \\ \dfrac{\Delta v_i}{\Delta t} \end{bmatrix}, \qquad g = \begin{bmatrix} -I_t(u_{i,1}, v_{i,1}, t_i) \\ -I_t(u_{i,2}, v_{i,2}, t_i) \\ \vdots \\ -I_t(u_{i,25}, v_{i,25}, t_i) \end{bmatrix}$$

H is the 25 × 2 matrix formed by the gray gradients of the 25 neighbourhood pixels, c is the 2 × 1 column vector formed by the optical flow, i.e. the displacement between the two adjacent frames per unit time, and g is the 25 × 1 column vector formed by the partial derivatives of the gray level with respect to time;
Finally, solve the system in the least-squares sense to obtain the motion of the central point G: multiply both sides of equation (12) on the left by the transpose of the H matrix:

$$H^T H c = H^T g \tag{13}$$

where $H^T$ is the transpose of H; the solution of the equations, i.e. the optical flow formed between the two frames, is:

$$\begin{bmatrix} \dfrac{\Delta u_i}{\Delta t} \\ \dfrac{\Delta v_i}{\Delta t} \end{bmatrix} = (H^T H)^{-1} H^T g \tag{14}$$

where $(H^T H)^{-1}$ is the inverse of the matrix $H^T H$; the coordinates $(u_{i+1}, v_{i+1})$ of corner G in the (i+1)-th image are then obtained from:

$$\begin{bmatrix} u_{i+1} \\ v_{i+1} \end{bmatrix} = \begin{bmatrix} \dfrac{\Delta u_i}{\Delta t} \\ \dfrac{\Delta v_i}{\Delta t} \end{bmatrix} \Delta t + \begin{bmatrix} u_i \\ v_i \end{bmatrix} \tag{15}$$
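A minimal single-corner sketch of this 5 × 5 least-squares step is given below, following Eqs. (11)–(15) directly; a production tracker would use a pyramidal variant such as OpenCV's `calcOpticalFlowPyrLK`. The central-difference gradients, the function name and the default Δt are assumptions of the sketch, and the corner is assumed to lie away from the image border.

```python
import numpy as np

def lk_track(I0: np.ndarray, I1: np.ndarray, corner: tuple,
             dt: float = 1 / 30) -> tuple:
    """Single-corner Lucas-Kanade step per Eqs. (11)-(15), 5x5 window."""
    u, v = int(round(corner[0])), int(round(corner[1]))
    f0, f1 = I0.astype(np.float32), I1.astype(np.float32)
    Iu = (np.roll(f0, -1, axis=1) - np.roll(f0, 1, axis=1)) / 2  # dI/du
    Iv = (np.roll(f0, -1, axis=0) - np.roll(f0, 1, axis=0)) / 2  # dI/dv
    It = f1 - f0                                  # dI/dt over one interval
    win = (slice(v - 2, v + 3), slice(u - 2, u + 3))             # 5x5 window
    H = np.stack([Iu[win].ravel(), Iv[win].ravel()], axis=1)     # 25x2 matrix
    g = -It[win].ravel() / dt                                    # 25x1 vector
    flow, *_ = np.linalg.lstsq(H, g, rcond=None)                 # Eq. (14)
    return corner[0] + flow[0] * dt, corner[1] + flow[1] * dt    # Eq. (15)
```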
Step 7: use the RANSAC (random sample consensus) algorithm to optimize all the matched corners obtained in Step 6:

First, build the model Q of the projective transformation matrix between the i-th and (i+1)-th images:

$$\xi U_{i+1} = Q U_i \tag{16}$$

where

$$U_{i+1} = \begin{bmatrix} u_{i+1} \\ v_{i+1} \\ 1 \end{bmatrix}, \qquad U_i = \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}, \qquad Q = \begin{bmatrix} q_0 & q_1 & q_2 \\ q_3 & q_4 & q_5 \\ q_6 & q_7 & q_8 \end{bmatrix}$$

$U_i$ and $U_{i+1}$ are respectively the homogeneous vectors formed by the coordinates, in the i-th and (i+1)-th images, of an arbitrary matched corner G among all the matched corners obtained in Step 6; Q is the projective transformation matrix between the i-th and (i+1)-th images; and ξ is a scale factor, taken as ξ = 1. The matrix $\begin{bmatrix} q_0 & q_1 \\ q_3 & q_4 \end{bmatrix}$ formed by $q_0$, $q_1$, $q_3$, $q_4$ describes scaling, rotation, reflection and shear of the image; the vector $[q_6, q_7]$ formed by $q_6$, $q_7$ describes translation of the image; the vector $\begin{bmatrix} q_2 \\ q_5 \end{bmatrix}$ formed by $q_2$, $q_5$ describes the projective component; and $q_8$ describes stretching of the image. Selecting 4 pairs of matched corners yields 8 linear equations which, together with the self-constraint condition of the projective transformation matrix Q, form 9 equations from which Q is estimated; the estimation proceeds as follows:
(I) initialize the iteration counter m = 1, and set the maximum number of iterations $\hat{m}$;

(II) randomly choose 4 pairs of matched corners from all the matched corners obtained in Step 6 as a sample set $S_m$, the complement after extraction being $\bar{S}_m$; substitute the points of the sample set $S_m$ into relation (16) to obtain an initial $Q_m$;
(III) use the homogeneous vector $U_i$, formed by the coordinates in the i-th image of an arbitrary matched corner of the complement $\bar{S}_m$, together with the $Q_m$ obtained in (II), to evaluate the error function:

$$F = d^2_{Euc}(U_{i+1}, Q_m U_i) + d^2_{Euc}(U_i, Q_m^{-1} U_{i+1}) \tag{17}$$

where $Q_m^{-1}$ is the inverse of $Q_m$, $d_{Euc}(U_{i+1}, Q_m U_i)$ is the three-dimensional Euclidean distance between the points $U_{i+1}$ and $Q_m U_i$, and $d_{Euc}(U_i, Q_m^{-1} U_{i+1})$ is the three-dimensional Euclidean distance between the points $U_i$ and $Q_m^{-1} U_{i+1}$; the error threshold is taken as 3: a matched point whose error function F is below this threshold is an inlier and is added to the inlier set, otherwise it is an outlier;
(IV) the inlier set and the sample set $S_m$ together form the consensus set $S^*_m$; if the inliers in $S^*_m$ number at least 60 percent of the total matched corners, the projective transformation matrix is considered correctly obtained, and $S^*_m$ is substituted into relation (16) as the sample set, a new model $Q'_m$ being recomputed by least squares as the refined model of the projective transformation matrix; if the inliers in $S^*_m$ number less than 60 percent of the total matched corners, the algorithm fails for this sample and $Q_m = 0$ is set;
(V) if $m < \hat{m}$, set m = m + 1 and repeat (II) through (V) in order; if $m = \hat{m}$, the sampling iteration ends; after sampling, the consensus set $S^*_{max}$ with the largest sample count is chosen from the set of new models as the optimized matched corner set, and the corresponding computed model $Q'_{max}$ is taken as the final transformation matrix Q; the algorithm ends;
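In practice this step can lean on a library RANSAC homography fit; the sketch below is one such stand-in rather than the exact procedure of the claim — OpenCV's `findHomography` scores candidates by forward reprojection error rather than the bidirectional error of Eq. (17) — and the function name and array layout are assumptions.

```python
import cv2
import numpy as np

def ransac_filter(pts_i: np.ndarray, pts_i1: np.ndarray, thresh: float = 3.0):
    """RANSAC optimization of the matched corners: fit a projective
    transformation Q between frame i and frame i+1, keep only inliers.
    pts_i, pts_i1: (N, 2) arrays of matched corner coordinates."""
    Q, mask = cv2.findHomography(pts_i, pts_i1, cv2.RANSAC, thresh)
    if Q is None:
        return None, None, None                 # fit failed outright
    inliers = mask.ravel().astype(bool)
    if inliers.sum() < 0.6 * len(pts_i):        # the claim's 60% consensus test
        return None, None, None                 # sampling failed
    return Q, pts_i[inliers], pts_i1[inliers]
```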
Step 8: apply the frame-difference method to the optimized matched corner set obtained in Step 7 to compute the motion state of the vehicle: each matched corner retained by the RANSAC optimization is extracted, and its coordinates in the i-th image are differenced with its coordinates in the (i+1)-th image to obtain its pixel displacement; combined with the camera intrinsic and extrinsic parameters obtained in step 2, the state quantities of the vehicle in the vehicle carrier coordinate system are computed;
The frame-difference method computes the vehicle motion state as follows:

1. let the sample size of the matched corner set after RANSAC optimization be $\hat{j}$, and initialize the sample counter j, setting j = 1;

2. difference the coordinates of the j-th optimized matched corner in the (i+1)-th and i-th images to obtain the pixel displacement $\Delta p(u_{i,j}, v_{i,j})$ of the j-th optimized matched corner over the time interval between the capture of the i-th and (i+1)-th images, where $u_{i,j}$, $v_{i,j}$ are respectively the u- and v-axis coordinates, in the i-th image, of the j-th RANSAC-optimized matched corner;

3. combining the camera parameters calibrated in step 2, back-project the pixel displacement $\Delta p(u_{i,j}, v_{i,j})$ to obtain the physical displacement of the j-th optimized matched point over the time interval between the capture of the i-th and (i+1)-th images;
4. within the time interval between the camera capturing the i-th and (i+1)-th images, each parameter of one group of vehicle running-state parameters is computed from the j-th optimized matched point:

$$v_{i,j} = \frac{\Delta P(x_{i,j}, y_{i,j})}{\Delta t} = \frac{P(x_{i+1,j}, y_{i+1,j}) - P(x_{i,j}, y_{i,j})}{t_{i+1} - t_i}$$

$$v_{\vec{x},i,j} = \frac{\Delta x_{i,j}}{\Delta t} = \frac{x_{i+1,j} - x_{i,j}}{t_{i+1} - t_i} \tag{18}$$

$$v_{\vec{y},i,j} = \frac{\Delta y_{i,j}}{\Delta t} = \frac{y_{i+1,j} - y_{i,j}}{t_{i+1} - t_i}$$

where $t_i$, $t_{i+1}$ are respectively the capture times of the i-th and (i+1)-th images; $P(x_{i,j}, y_{i,j})$, $P(x_{i+1,j}, y_{i+1,j})$ are respectively the physical coordinates of the vehicle at $t_i$ and $t_{i+1}$ obtained from the j-th optimized matched corner; $\Delta P(x_{i,j}, y_{i,j})$ is the physical displacement of the vehicle from $t_i$ to $t_{i+1}$ obtained from the j-th optimized matched corner; $x_{i,j}$, $x_{i+1,j}$ are the longitudinal-axis coordinates and $y_{i,j}$, $y_{i+1,j}$ the lateral-axis coordinates of the vehicle at $t_i$ and $t_{i+1}$; $\Delta x_{i,j}$, $\Delta y_{i,j}$ are respectively the longitudinal and lateral displacements of the vehicle from $t_i$ to $t_{i+1}$; $v_{i,j}$ is the instantaneous speed of the vehicle from $t_i$ to $t_{i+1}$; and $v_{\vec{x},i,j}$, $v_{\vec{y},i,j}$ are respectively its longitudinal and lateral velocities from $t_i$ to $t_{i+1}$;
5. if $j < \hat{j}$, set j = j + 1 and repeat items 2 through 5 in order; if $j = \hat{j}$, the iteration ends;
the $\hat{j}$ groups of vehicle motion parameters obtained by the above frame-difference method are averaged, and the average is taken as the vehicle's motion parameters from $t_i$ to $t_{i+1}$:

$$v_i = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{i,j}, \qquad v_{\vec{x},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{x},i,j}, \qquad v_{\vec{y},i} = \frac{1}{\hat{j}} \sum_{j=1}^{\hat{j}} v_{\vec{y},i,j}, \qquad \beta_i = \arctan\left( \frac{v_{\vec{y},i}}{v_{\vec{x},i}} \right) \tag{19}$$
where arctan(·) denotes the arctangent function, $v_i$ is the instantaneous speed of the vehicle from $t_i$ to $t_{i+1}$, $v_{\vec{x},i}$ and $v_{\vec{y},i}$ are respectively its longitudinal and lateral velocities from $t_i$ to $t_{i+1}$, and $\beta_i$ is its side slip angle from $t_i$ to $t_{i+1}$;
Step 9: process the road-surface images obtained in chronological order by repeating steps 3 through 9 recursively; that is, set the image-sequence pointer i = i + 1 and repeat the computations of steps 3 through 9, obtaining the longitudinal velocity, lateral velocity and side slip angle of the vehicle at every instant.
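Tying the sketches above together, the recursion of this final step amounts to a loop over consecutive frame pairs. The driver below is a hypothetical composition of the earlier sketches (`equalize`, `harris_corners`, `lk_track`, `ransac_filter`, `vehicle_state`); the video source and `P_inv` are supplied by the caller.

```python
import cv2
import numpy as np

def run(video_path: str, P_inv: np.ndarray, dt: float = 1 / 30) -> None:
    """Steps 3-9 as a recursion over consecutive frame pairs (sketch only)."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    while ok:
        ok, cur = cap.read()
        if not ok:
            break
        # Step 4: pre-process both frames by histogram equalization.
        g0 = equalize(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))
        g1 = equalize(cv2.cvtColor(cur, cv2.COLOR_BGR2GRAY))
        # Steps 5-6: detect corners in frame i, track them into frame i+1.
        corners = harris_corners(g0)
        if len(corners) >= 4:
            tracked = np.array([lk_track(g0, g1, tuple(c), dt)
                                for c in corners], dtype=np.float32)
            # Steps 7-8: RANSAC-filter the matches, then estimate the state.
            Q, in0, in1 = ransac_filter(corners, tracked)
            if Q is not None:
                v, vx, vy, beta = vehicle_state(in0, in1, P_inv, dt)
                print(f"v={v:.3f}  vx={vx:.3f}  vy={vy:.3f}  beta={beta:.4f}")
        prev = cur  # Step 9: i = i + 1
```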
CN201210442082.6A 2012-11-07 2012-11-07 A kind of state of motion of vehicle method of estimation based on light stream Expired - Fee Related CN102999759B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210442082.6A CN102999759B (en) 2012-11-07 2012-11-07 A kind of state of motion of vehicle method of estimation based on light stream

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210442082.6A CN102999759B (en) 2012-11-07 2012-11-07 A kind of state of motion of vehicle method of estimation based on light stream

Publications (2)

Publication Number Publication Date
CN102999759A true CN102999759A (en) 2013-03-27
CN102999759B CN102999759B (en) 2015-10-07

Family

ID=47928308

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210442082.6A Expired - Fee Related CN102999759B (en) 2012-11-07 2012-11-07 A kind of state of motion of vehicle method of estimation based on light stream

Country Status (1)

Country Link
CN (1) CN102999759B (en)

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KICHUN JO et al.: "Distributed Vehicle State Estimation System Using Information Fusion of GPS and In-vehicle Sensors for Vehicle Localization", 2011 14th International IEEE Conference on Intelligent Transportation Systems *
YU Zhuoping, GAO Xiaojie: "A Review of Vehicle State Estimation During Driving", Journal of Mechanical Engineering *
ZONG Changfu et al.: "Application of Information Fusion Based on Extended Kalman Filtering to Vehicle State Estimation", Journal of Mechanical Engineering *

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105579289B (en) * 2013-08-19 2018-01-19 金泰克斯公司 Imaging system and method with displacement detection
CN105579289A (en) * 2013-08-19 2016-05-11 金泰克斯公司 Imaging system and method with ego motion detection
US9721172B2 (en) 2013-08-19 2017-08-01 Gentex Corporation Imaging system and method with ego motion detection
CN103606146A (en) * 2013-11-05 2014-02-26 南京理工大学 Corner point detection method based on circular target
CN103606146B (en) * 2013-11-05 2016-08-17 南京理工大学 A kind of angular-point detection method based on circular target
CN103759670A (en) * 2014-01-06 2014-04-30 四川虹微技术有限公司 Object three-dimensional information acquisition method based on digital close range photography
CN103759670B (en) * 2014-01-06 2016-09-28 四川虹微技术有限公司 A kind of object dimensional information getting method based on numeral up short
CN103745474A (en) * 2014-01-21 2014-04-23 南京理工大学 Image registration method based on inertial sensor and camera
CN103745474B (en) * 2014-01-21 2017-01-18 南京理工大学 Image registration method based on inertial sensor and camera
CN105096337A (en) * 2014-05-23 2015-11-25 南京理工大学 Image global motion compensation method based on hardware platform of gyroscope
CN105096337B (en) * 2014-05-23 2018-05-01 南京理工大学 A kind of image global motion compensation method based on gyroscope hardware platform
CN104463867A (en) * 2014-12-08 2015-03-25 南华大学 Rapid SIFT extraction method based on information quantity
CN104568458A (en) * 2014-12-12 2015-04-29 中山大学 Method and device for measuring motion slip angle of mass center of vehicle
CN104537691B (en) * 2014-12-30 2015-09-23 中国人民解放军国防科学技术大学 Based on piecemeal in the same way speed add up optical flow field segmentation moving target detecting method
CN104537691A (en) * 2014-12-30 2015-04-22 中国人民解放军国防科学技术大学 Moving target detecting method for optical flow field segmentation based on partitioned homodromous speed accumulation
US10867401B2 (en) 2015-02-16 2020-12-15 Application Solutions (Electronics and Vision) Ltd. Method and device for the estimation of car ego-motion from surround view images
CN107111879B (en) * 2015-02-16 2021-01-22 应用解决方案(电子及视频)有限公司 Method and apparatus for estimating vehicle's own motion by panoramic looking-around image
CN107111879A (en) * 2015-02-16 2017-08-29 应用解决方案(电子及视频)有限公司 Pass through the method and apparatus of panoramic looking-around Image estimation vehicle displacement
CN104853064B (en) * 2015-04-10 2018-04-17 海视英科光电(苏州)有限公司 Electronic image stabilization method based on thermal infrared imager
CN104853064A (en) * 2015-04-10 2015-08-19 海视英科光电(苏州)有限公司 Electronic image-stabilizing method based on infrared thermal imager
CN105137468A (en) * 2015-09-24 2015-12-09 北京理工大学 Photoelectric type automobile continuous navigation data acquiring device and method in GPS blind area environment
CN105550981A (en) * 2015-11-27 2016-05-04 中山大学 Image registration and splicing method on the basis of Lucas-Kanade algorithm
CN106981074B (en) * 2016-01-15 2021-05-25 Aptiv技术有限公司 Method and apparatus for estimating an orientation of a camera relative to a surface
CN106981074A (en) * 2016-01-15 2017-07-25 戴尔菲技术公司 For estimating method and apparatus of the camera relative to the orientation on road surface
CN106022263B (en) * 2016-05-19 2019-07-09 西安石油大学 A kind of wireless vehicle tracking of fusion feature matching and optical flow method
CN106022263A (en) * 2016-05-19 2016-10-12 西安石油大学 Vehicle tracking method in fusion with feature matching and optical flow method
CN109952600A (en) * 2016-06-30 2019-06-28 奥克托信息技术股份公司 A method of for estimating the running time of vehicle based on the determination of the state of vehicle
CN109952600B (en) * 2016-06-30 2021-09-24 奥克托信息技术股份公司 Method for estimating a travel time of a vehicle based on a determination of a state of the vehicle
CN106210448B (en) * 2016-07-22 2020-08-18 深圳市智慧城市建设运行管理有限公司 Video image jitter elimination processing method
CN106210448A (en) * 2016-07-22 2016-12-07 恒业智能信息技术(深圳)有限公司 A kind of video image dithering Processing for removing method
CN108022429A (en) * 2016-11-04 2018-05-11 浙江大华技术股份有限公司 A kind of method and device of vehicle detection
CN108022429B (en) * 2016-11-04 2021-08-27 浙江大华技术股份有限公司 Vehicle detection method and device
CN108162858A (en) * 2016-12-07 2018-06-15 杭州海康威视数字技术股份有限公司 Vehicle-mounted monitoring apparatus and its method
CN108162858B (en) * 2016-12-07 2020-05-29 杭州海康威视数字技术股份有限公司 Vehicle-mounted monitoring device and method thereof
CN108229247A (en) * 2016-12-14 2018-06-29 贵港市瑞成科技有限公司 A kind of mobile vehicle detection method
CN106980829B (en) * 2017-03-17 2019-09-20 苏州大学 Abnormal behaviour automatic testing method of fighting based on video analysis
CN106980829A (en) * 2017-03-17 2017-07-25 苏州大学 Abnormal behaviour automatic testing method of fighting based on video analysis
CN107123128B (en) * 2017-04-24 2019-07-16 南京邮电大学 A kind of state of motion of vehicle estimation method guaranteeing accuracy
CN107123128A (en) * 2017-04-24 2017-09-01 南京邮电大学 A kind of state of motion of vehicle method of estimation for ensureing accuracy
CN107133986A (en) * 2017-04-26 2017-09-05 武汉科技大学 A kind of camera calibration method based on two-dimensional calibrations thing
CN107133986B (en) * 2017-04-26 2019-08-20 武汉科技大学 A kind of camera calibration method based on two-dimensional calibrations object
CN107044855A (en) * 2017-05-05 2017-08-15 南京信息工程大学 A kind of inertial measuring unit and method based on camera array
CN107273801B (en) * 2017-05-15 2021-11-30 南京邮电大学 Method for detecting abnormal points by video multi-target tracking
CN107273801A (en) * 2017-05-15 2017-10-20 南京邮电大学 A kind of method of video multi-target tracing detection abnormity point
CN109309785B (en) * 2017-07-28 2021-06-22 松下电器(美国)知识产权公司 Imaging control device and imaging control method
CN109309785A (en) * 2017-07-28 2019-02-05 松下电器(美国)知识产权公司 Imaging control device and filming control method
CN107886530A (en) * 2017-12-11 2018-04-06 哈尔滨理工大学 A kind of improved image registration algorithm based on SIFT feature
CN109958311A (en) * 2017-12-25 2019-07-02 大连楼兰科技股份有限公司 Apply the vehicle heading angle detection system in parking lot
CN109961476A (en) * 2017-12-25 2019-07-02 大连楼兰科技股份有限公司 The localization method of the underground parking of view-based access control model
CN108638053A (en) * 2018-04-03 2018-10-12 珠海市微半导体有限公司 A kind of detection method and its antidote of robot skidding
CN109189058A (en) * 2018-07-18 2019-01-11 深圳市海梁科技有限公司 A kind of multi-wavelength lacquer painting, dynamic light stream line walking navigation system and automatic driving vehicle
CN109189058B (en) * 2018-07-18 2021-10-15 深圳市海梁科技有限公司 Multi-wavelength paint surface and dynamic optical flow line patrol navigation system and unmanned vehicle
CN109684996A (en) * 2018-12-22 2019-04-26 北京工业大学 Real-time vehicle based on video passes in and out recognition methods
CN109727273A (en) * 2018-12-29 2019-05-07 北京茵沃汽车科技有限公司 A kind of Detection of Moving Objects based on vehicle-mounted fisheye camera
CN109903309B (en) * 2019-01-07 2023-05-12 南京华科广发通信科技有限公司 Robot motion information estimation method based on angular optical flow method
CN109903309A (en) * 2019-01-07 2019-06-18 山东笛卡尔智能科技有限公司 A kind of robot motion's information estimating method based on angle optical flow method
CN109978801A (en) * 2019-03-25 2019-07-05 联想(北京)有限公司 A kind of image processing method and image processing apparatus
CN109978801B (en) * 2019-03-25 2021-11-16 联想(北京)有限公司 Image processing method and image processing device
CN110349415B (en) * 2019-06-26 2021-08-20 江西理工大学 Driving speed measuring method based on multi-scale transformation
CN110349415A (en) * 2019-06-26 2019-10-18 江西理工大学 A kind of running speed measurement method based on multi-scale transform
CN110288702B (en) * 2019-06-28 2021-05-18 联想(北京)有限公司 Data processing method and electronic equipment
CN110288702A (en) * 2019-06-28 2019-09-27 联想(北京)有限公司 A kind of data processing method and electronic equipment
CN111462166A (en) * 2020-03-31 2020-07-28 武汉卓目科技有限公司 Video image stabilization method and system based on histogram equalization optical flow method
CN111915532B (en) * 2020-08-07 2022-02-11 北京字节跳动网络技术有限公司 Image tracking method and device, electronic equipment and computer readable medium
CN111915532A (en) * 2020-08-07 2020-11-10 北京字节跳动网络技术有限公司 Image tracking method and device, electronic equipment and computer readable medium
CN112683281B (en) * 2021-03-11 2021-06-04 之江实验室 Automatic driving vehicle joint positioning method based on vehicle kinematics
CN112683281A (en) * 2021-03-11 2021-04-20 之江实验室 Automatic driving vehicle joint positioning method based on vehicle kinematics
CN112699854A (en) * 2021-03-22 2021-04-23 亮风台(上海)信息科技有限公司 Method and device for identifying stopped vehicle
CN114782447A (en) * 2022-06-22 2022-07-22 小米汽车科技有限公司 Road surface detection method, device, vehicle, storage medium and chip
CN114782447B (en) * 2022-06-22 2022-09-09 小米汽车科技有限公司 Road surface detection method, device, vehicle, storage medium and chip
CN117173158A (en) * 2023-10-25 2023-12-05 深圳市德海威实业有限公司 Intelligent detection method and system for quality of precise connector
CN117173158B (en) * 2023-10-25 2024-01-30 深圳市德海威实业有限公司 Intelligent detection method and system for quality of precise connector

Also Published As

Publication number Publication date
CN102999759B (en) 2015-10-07

Similar Documents

Publication Publication Date Title
CN102999759B (en) A kind of state of motion of vehicle method of estimation based on light stream
EP3735675B1 (en) Image annotation
CN106256606B (en) A kind of lane departure warning method based on vehicle-mounted binocular camera
CN105300403B (en) A kind of vehicle mileage calculating method based on binocular vision
CN109359409A (en) A kind of vehicle passability detection system of view-based access control model and laser radar sensor
CN104700414A (en) Rapid distance-measuring method for pedestrian on road ahead on the basis of on-board binocular camera
CN109727273B (en) Moving target detection method based on vehicle-mounted fisheye camera
CN103325108A (en) Method for designing monocular vision odometer with light stream method and feature point matching method integrated
CN104282020A (en) Vehicle speed detection method based on target motion track
CN104183127A (en) Traffic surveillance video detection method and device
CN103686083B (en) Real-time speed measurement method based on vehicle-mounted sensor video streaming matching
CN104180818A (en) Monocular vision mileage calculating device
Kellner et al. Road curb detection based on different elevation mapping techniques
US20190180121A1 (en) Detection of Objects from Images of a Camera
CN112991369A (en) Method for detecting overall dimension of running vehicle based on binocular vision
CN114018248A (en) Odometer method and map building method integrating coded disc and laser radar
CN105512641A (en) Method for using laser radar scanning method to calibrate dynamic pedestrians and vehicles in video in snowing or raining state
CN104537649A (en) Vehicle steering judgment method and system based on image ambiguity comparison
JP7380824B2 (en) Vehicle state estimation method, vehicle state estimation device, and vehicle state estimation program
CN114719873B (en) Low-cost fine map automatic generation method and device and readable medium
CN110176022B (en) Tunnel panoramic monitoring system and method based on video detection
CN105300390A (en) Method and device for determining moving trace of obstacle
CN109145805B (en) Moving target detection method and system under vehicle-mounted environment
CN113221739B (en) Monocular vision-based vehicle distance measuring method
Chun-Zhao et al. Drivable road boundary detection for intelligent vehicles based on stereovision with plane-induced homography

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151007

Termination date: 20191107