CN108280847A - Vehicle motion trajectory estimation method - Google Patents

Vehicle motion trajectory estimation method

Info

Publication number
CN108280847A
CN108280847A CN201810049413.7A CN201810049413A
Authority
CN
China
Prior art keywords
vehicle
point
angle
image
angle point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201810049413.7A
Other languages
Chinese (zh)
Inventor
王波
王艳明
胡振程
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Software Technology (shanghai) Co Ltd
Original Assignee
New Software Technology (shanghai) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Software Technology (shanghai) Co Ltd filed Critical New Software Technology (shanghai) Co Ltd
Priority to CN201810049413.7A priority Critical patent/CN108280847A/en
Publication of CN108280847A publication Critical patent/CN108280847A/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/269 Analysis of motion using gradient-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30241 Trajectory

Abstract

The present invention relates to a vehicle motion trajectory estimation method comprising the following steps: S1, obtaining a surround-view image of the vehicle body; S2, detecting corner points in the surround-view image; S3, tracking the corner points with the LK optical flow method to obtain the tracked position of each corner point in the next frame; S4, obtaining first motion information of the vehicle from on-board sensors and a vehicle motion model; S5, screening the corner points based on the first motion information; S6, performing a second screening of the corner points to obtain a best matrix model, and computing second motion information of the vehicle from that matrix model; S7, fusing the first motion information and the second motion information with a Kalman filter to obtain the motion trajectory of the vehicle. The vehicle motion trajectory estimation method of the present invention guarantees that the estimated trajectory has high accuracy at both high and low vehicle speeds.

Description

Vehicle motion trajectory estimation method
Technical field
The present invention relates to the field of vehicle motion trajectory estimation, and more particularly to a vehicle motion trajectory estimation method.
Background technology
Currently, vehicle motion trajectories are estimated in two main ways: with an image optical flow method, or with on-board sensors. The optical-flow-based approach estimates motion from images and, via the mapping between image coordinates and world coordinates, can estimate the vehicle trajectory accurately within a certain range of precision and stability. However, when the vehicle speed is high, the accuracy of optical-flow-based estimation degrades, so the estimated trajectory error becomes large.
The approach that obtains wheel speed, steering wheel angle, and similar signals from on-board sensors and then estimates the trajectory from a vehicle motion model also has the following defects:
(1) The accuracy of the estimated trajectory cannot be guaranteed: if the sensor precision is low, or the sensors have aged, the estimated trajectory will be inaccurate.
(2) The trajectory contains a certain amount of noise: because of the characteristics of the sensors themselves and environmental factors, the collected data are not reliable and stable, and the random noise disturbances that are usually present degrade the estimation result.
(3) When the vehicle speed is below 5 km/h, the applicability conditions of the vehicle motion model are no longer satisfied, so the trajectory estimated at low speed is inaccurate.
(4) During large turns, accumulated sensor errors cause the trajectory estimated by the vehicle model to deviate substantially from the actual position.
Summary of the invention
The object of the present invention is to solve the above technical problems by providing a vehicle motion trajectory estimation method with high accuracy and a wide range of applicability.
To achieve the above object, the present invention provides a vehicle motion trajectory estimation method comprising the following steps: S1, obtaining a surround-view image of the vehicle body; S2, detecting corner points in the surround-view image; S3, tracking the corner points with the LK optical flow method to obtain the tracked position of each corner point in the next frame; S4, obtaining first motion information of the vehicle from on-board sensors and a vehicle motion model; S5, screening the corner points based on the first motion information; S6, performing a second screening of the corner points to obtain a best matrix model, and computing second motion information of the vehicle from that matrix model; S7, fusing the first motion information and the second motion information with a Kalman filter to obtain the motion trajectory of the vehicle.
Preferably, step S2 includes: S21, computing the absolute pixel differences between a pixel under test and multiple pixels on a circle of predetermined radius; S22, if a predetermined number of these absolute differences exceed a threshold, taking the pixel under test as a feature point; S23, determining whether the feature point is the only feature point in the neighborhood centered on it, and if so, taking it as a corner point.
Preferably, step S2 further includes: if there are multiple feature points in the neighborhood centered on the feature point, computing a score for each feature point, the score being the sum of the absolute pixel differences between the feature point and the multiple pixels; if the feature point has the largest score, taking it as the corner point.
Preferably, step S4 includes: obtaining the steering wheel angle and wheel speed of the vehicle from the on-board sensors; computing the turning radius of the vehicle from the vehicle motion model and the steering wheel angle; and computing the travel distance and drift angle of the vehicle from the turning radius, the steering wheel angle, and the wheel speed.
Preferably, after the travel distance and drift angle of the vehicle are computed, they are converted into an image translation and rotation according to the relationship between the world coordinate system and the image coordinate system.
Preferably, step S5 includes: setting a predetermined value based on the image translation and rotation; estimating, with the vehicle motion model, the location of each corner point in the next frame; determining whether the tracked point lies within the circular region centered on the estimated location with the predetermined value as radius; and retaining the corner point if the tracked point lies within that region, otherwise deleting it.
Preferably, after step S5 and before step S6, the method may further include a second screening of the remaining corner points using LK optical flow tracking, which includes: using a forward LK optical flow tracking pass, determining the forward-tracked position in the current frame of each corner point in the previous frame; using a backward LK optical flow tracking pass, determining the backward-tracked position of that forward-tracked point in the previous frame; and computing the distance between the original corner point in the previous frame and its backward-tracked position, retaining the corner point if the distance is less than a predetermined threshold.
Preferably, in step S6 the screened corner points are further screened with the RANSAC algorithm, which includes: randomly selecting 3 pairs of matched corner points, whose points are not collinear, from the current and previous frame images to obtain a transformation matrix model; computing the projection error of all other corner points against the transformation matrix model, and adding each corner pair whose projection error is below a set threshold to the inlier set of that model; reselecting 3 pairs of matched corner points to obtain a new transformation matrix model, and again adding each corner pair whose projection error is below the set threshold to the inlier set of that model; repeating the selection of matched corner points and the projection-error computation to obtain multiple corresponding inlier sets; selecting the inlier set containing the most corner points as the optimal inlier set; and taking the transformation matrix model corresponding to the optimal inlier set as the best matrix model.
Preferably, the best matrix model obtained by the RANSAC algorithm is:
From the best matrix model H and the coordinates (x_c, y_c) of the rear-axle midpoint of the vehicle in the surround-view image, the vehicle rotation angle δ and the horizontal travel distance d_x and vertical travel distance d_y of the vehicle in the surround-view image are computed.
Combining the time difference Δt between the two frames and the actual distance pixel_d represented by each pixel of the surround-view image, the travel distance D and movement velocity V of the vehicle are computed:
Preferably, step S7 includes: establishing the state parameters of the vehicle from the first motion information and the second motion information respectively; setting the matrix parameters of the Kalman filter fusion equations; and substituting the state parameters of the vehicle into the Kalman filter fusion equations to compute the motion trajectory of the vehicle.
According to the vehicle motion trajectory estimation method of the present invention, the motion information obtained from the on-board sensors yields the travel distance and drift angle of the vehicle, which are converted into an image translation and rotation; corner detection and two rounds of screening then yield the best matrix model, from which the second motion information of the vehicle, the image motion information, is computed; finally, the two sets of motion information are fused with a Kalman filter to obtain the motion trajectory of the vehicle. Compared with prior-art methods that use on-board sensors alone or the image optical flow method alone to estimate the trajectory, the two approaches complement each other well and avoid each other's shortcomings, so the vehicle trajectory can be estimated with high accuracy at both high and low speeds.
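The patent specifies the Kalman fusion of step S7 only at the level of its matrix parameters, which are not reproduced here. As a hedged illustration of the underlying idea, the following sketch fuses two estimates of the same quantity (for example, a sensor-derived and an image-derived travel distance) by inverse-variance weighting, which is the scalar Kalman update for a static state; the variances `var1` and `var2` are assumed inputs, not values from the patent.

```python
def kalman_fuse(x1, var1, x2, var2):
    """Fuse two noisy estimates of one quantity (scalar Kalman update).

    x1/var1: first estimate and its variance (e.g. from on-board sensors)
    x2/var2: second estimate and its variance (e.g. from the image model)
    """
    k = var1 / (var1 + var2)        # Kalman gain
    x = x1 + k * (x2 - x1)          # fused estimate
    var = (1.0 - k) * var1          # fused variance
    return x, var
```

The fused variance is always smaller than either input variance, which is the formal sense in which combining the sensor-based and image-based estimates outperforms either one alone.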
Description of the drawings
To explain the embodiments of the invention or the prior-art solutions more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart schematically showing the vehicle motion trajectory estimation method according to the invention;
Fig. 2 schematically shows the camera arrangement on a vehicle according to the invention;
Fig. 3 schematically shows a surround-view image of the vehicle body;
Fig. 4 is a flowchart schematically showing the corner detection method according to the invention;
Fig. 5 is a diagram schematically illustrating the corner detection method;
Fig. 6(a) and Fig. 6(b) schematically show the two-track and single-track motion models of a vehicle;
Fig. 7 schematically shows how the invention computes the motion information of the vehicle with the single-track motion model;
Fig. 8 is a flowchart schematically showing the corner screening method;
Fig. 9 is a diagram schematically illustrating the corner screening method;
Fig. 10 schematically shows corner screening with the LK optical flow tracking method.
Detailed description of the embodiments
The description of the embodiments should be read together with the corresponding drawings, which form part of the complete specification. In the drawings, shapes or thicknesses may be exaggerated to simplify or facilitate labeling. Parts of each structure in the drawings are described individually; it should be noted that elements not shown in the figures or not described in words are of a form known to those of ordinary skill in the art.
In the description of the embodiments herein, any references to direction and orientation are for convenience of description only and must not be understood as limiting the scope of the invention. The description of preferred embodiments below may involve combinations of features, which may exist individually or in combination; the invention is not limited to the preferred embodiments in particular. The scope of the invention is defined by the claims.
Fig. 1 is a flowchart schematically showing the vehicle motion trajectory estimation method according to the invention. As shown in Fig. 1, the method includes the following steps: S1, obtaining a surround-view image of the vehicle body; S2, detecting corner points in the surround-view image; S3, tracking the corner points with the LK optical flow method to obtain the tracked position of each corner point in the next frame; S4, obtaining first motion information of the vehicle from on-board sensors and a vehicle motion model; S5, screening the corner points based on the first motion information; S6, performing a second screening of the corner points to obtain a best matrix model and computing second motion information of the vehicle from it; S7, fusing the first motion information and the second motion information with a Kalman filter to obtain the motion trajectory of the vehicle.
In the method for the invention, the vehicle body for obtaining vehicle in step sl first looks around image.Specifically, vehicle is obtained Vehicle body look around image and need through multiple cameras on vehicle body come the image of collection vehicle surrounding, so and to acquisition To image carry out calibration splicing fusion and obtain vehicle body and look around image.
Fig. 2 schematically shows the camera arrangement on a vehicle according to the invention. Fig. 3 schematically shows a surround-view image of the vehicle body.
As shown in Fig. 2, four cameras installed around the vehicle body collect images of the surroundings, where L is the left camera, F the front camera, R the right camera, and B the rear camera. The images collected by the four cameras are corrected with the calibrated distortion parameters, features are extracted from the images, and the multiple images are stitched and fused to generate the surround-view image. For example, as shown in Fig. 3, the generated surround-view image is a top view of the surroundings of the vehicle body. Note that mounting multiple cameras and generating a surround-view image can be done in various ways known in the prior art, which are not described in detail here.
After the surround-view image is obtained, the corner points in it are detected. According to one embodiment of the invention, the FAST corner detection method may be used to detect the corner points in the surround-view image.
Fig. 4 is a flowchart schematically showing corner detection with the FAST method according to the invention. Fig. 5 is a diagram schematically illustrating the FAST corner detection method.
As shown in Fig. 4, detecting corner points in the surround-view image with the FAST method may include: S21, computing the absolute pixel differences between the pixel under test and multiple pixels on a circle of predetermined radius; S22, if a predetermined number of these absolute differences exceed a threshold, taking the pixel under test as a feature point; S23, determining whether the feature point is the only feature point in its neighborhood, and if so, taking it as a corner point. The FAST corner detection method is illustrated below with Fig. 5 as an example.
Specifically, as shown in Fig. 5, the circular neighborhood of radius 3 (the radius is configurable as needed) centered on the pixel under test p contains 16 pixels (p1-p16). A threshold is set, and the absolute pixel difference between each of p1-p16 and the pixel under test p is computed. If at least 9 of the 16 absolute differences exceed the set threshold, the pixel p is taken as a feature point; otherwise it is not a feature point, and the next pixel is tested.
In a concrete corner detection, the absolute pixel differences between p and p1, p9 may be computed first: if both values are below the threshold, p is not a corner. If at least one of the two values exceeds the threshold, the absolute pixel differences between p and p1, p9, p5, p13 are computed; if at least three of these exceed the threshold, the absolute pixel differences between p and all of p1-p16 are computed. If at least 9 of the absolute differences exceed the threshold, p is determined to be a feature point.
After a feature point is determined, it is also necessary to check whether there are multiple feature points in the neighborhood (for example, 3×3 or 5×5) centered on pixel p. If there are, a score is computed for each feature point, and the pixel p is taken as a corner point only if its score is the largest. Specifically, the score of a feature point is the sum of the absolute pixel differences between the feature point and the multiple pixels in its neighborhood, for example the sum of the absolute differences between p and p1-p16. If p is the only feature point in the neighborhood centered on it, it is taken as a corner point directly.
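The feature-point test and score described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 16 radius-3 circle offsets are the standard FAST Bresenham circle, the threshold and image are made up, and the simplified "at least 9 differences exceed the threshold" rule follows the text here (the original FAST detector additionally requires those pixels to be contiguous on the circle).

```python
# 16 pixel offsets on the radius-3 circle around the pixel under test (p1-p16).
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_feature_point(img, x, y, thresh=20, required=9):
    """Steps S21-S22: the pixel (x, y) is a feature point if at least
    `required` of the 16 circle pixels differ from it by more than `thresh`."""
    p = img[y][x]
    diffs = [abs(img[y + dy][x + dx] - p) for dx, dy in CIRCLE]
    return sum(d > thresh for d in diffs) >= required

def corner_score(img, x, y):
    """Step S23 tie-break score: sum of absolute differences to p1-p16;
    among neighboring feature points, the one with the largest score wins."""
    p = img[y][x]
    return sum(abs(img[y + dy][x + dx] - p) for dx, dy in CIRCLE)
```

On a synthetic image containing a bright square, the square's corner passes the test while interior and edge pixels fail, which is exactly the behavior the screening in step S2 relies on.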
Then, in step S4, the first motion information of the vehicle is obtained from the on-board sensors and the vehicle motion model. From the first motion information, the travel distance and drift angle of the vehicle are obtained. The on-board sensors may include a steering wheel angle sensor and a speed sensor. Specifically, step S4 may include: obtaining the steering wheel angle and speed of the vehicle from the steering wheel angle sensor and the speed sensor; computing the turning radius of the vehicle from the vehicle motion model, the steering wheel angle, and the speed; and computing the travel distance and drift angle of the vehicle from the obtained turning radius, the steering wheel angle, and the speed.
This is described in detail with reference to Fig. 6 and Fig. 7. Fig. 6 schematically shows the two-track and single-track motion models of a vehicle. Fig. 7 schematically shows how the invention computes the motion of the vehicle with the single-track motion model.
In the present embodiment, the vehicle motion model is a single-track model. In the two-track motion model shown in Fig. 6(a), the two front wheels are approximated by a single wheel at their midpoint (at W/2 in the figure, where W is the track width between the left and right wheels), which is taken as the front wheel of the vehicle; the two rear wheels are likewise approximated by a single wheel at their midpoint, which is taken as the rear wheel. This yields the single-track model shown in Fig. 6(b), where L is the wheelbase between the front and rear wheels.
Fig. 7 shows the single-track model of the vehicle at times k and k+1. R1 and R2 in the figure are the turning radii of the rear and front wheels respectively; the dashed box is the single-track model position of the vehicle at time k+1, and the solid box its position at time k; δ is the steering angle and γ the drift angle of the vehicle.
The goal is to compute the travel distance and drift angle of the vehicle, i.e. the distance from vehicle position (x, y)_k to vehicle position (x, y)_{k+1} and the value of the drift angle γ. First the turning radii R2 and R1 of the front and rear wheels are computed:
Then the travel distances dx, dy and the body drift angle γ of the vehicle are computed from the obtained turning radius, steering wheel angle, and speed, as follows:
γ=v*dt/R2
where v is the vehicle speed, dt the motion time, d the travel distance of the vehicle, dx the travel distance in the x direction, and dy the travel distance in the y direction.
After the travel distance and drift angle of the vehicle are computed, they are converted into an image translation and rotation according to the correspondence between the world coordinate system and the image coordinate system. Specifically, the correspondence between the two coordinate systems is first made explicit, i.e. the actual distance a represented by each pixel of the calibrated image, and the corresponding image translation Dx, Dy and rotation angle θ are then computed:
Dx=dx/a
Dy=dy/a
θ=γ
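The single-track step and the world-to-image conversion above can be sketched as follows. The turning-radius images of the patent are not reproduced here, so the formulas R2 = L/sin(δ) and R1 = L/tan(δ) are assumptions taken from standard single-track geometry (they are consistent with γ = v·dt/R2 above), and the decomposition of d into dx, dy is likewise an assumption.

```python
import math

def single_track_step(v, dt, delta, L):
    """One single-track-model step.

    v: speed, dt: time step, delta: steering angle (rad), L: wheelbase.
    Returns (R1, R2, gamma, dx, dy) per the formulas in the text, with
    the radii and the dx/dy split assumed as noted above.
    """
    R2 = L / math.sin(delta)     # front-wheel turning radius (assumed)
    R1 = L / math.tan(delta)     # rear-wheel turning radius (assumed)
    gamma = v * dt / R2          # body drift angle, as in the text
    d = v * dt                   # distance travelled during dt
    dx = d * math.cos(gamma)     # travel in the x direction (assumed split)
    dy = d * math.sin(gamma)     # travel in the y direction (assumed split)
    return R1, R2, gamma, dx, dy

def world_to_image(dx, dy, gamma, a):
    """Dx = dx/a, Dy = dy/a, theta = gamma, with a the actual distance
    represented by one pixel of the calibrated image."""
    return dx / a, dy / a, gamma
```

For a small steering angle the rear-wheel radius R1 is slightly smaller than R2, and dx² + dy² recovers d², which is a quick sanity check on the decomposition.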
After the first motion information of the vehicle is obtained from the on-board sensors and the vehicle motion model, the previously detected corner points can be screened. This is described in detail with reference to Fig. 8 and Fig. 9.
Fig. 8 is a flowchart schematically showing the corner screening method, and Fig. 9 is a diagram illustrating it. The screening process shown in Fig. 8 and Fig. 9 is the first screening of the corner points.
As shown in Fig. 8, the first screening of the corner points may include: S51, setting a predetermined value based on the image translation and rotation; S52, estimating, with the vehicle motion model, the location of each corner point in the next frame; S53, determining whether the tracked point obtained in step S3 lies within the circular region centered on the estimated location with the predetermined value as radius; S54, retaining the corner point if the tracked point lies within the region, and deleting it otherwise. The corner screening method is illustrated below with Fig. 9 as an example. In step S51, those skilled in the art set the predetermined value for screening the corner points based on factors such as noise (fluctuation) and experience.
Specifically, as shown in Fig. 9, P0 is a corner point of the previous frame, r is the predetermined value set from the image translation and rotation, P1 is the location of the corner P0 in the next frame estimated by the vehicle motion model, and P2 is the tracked point obtained in step S3. Whether the tracked point P2 lies within the circle of radius r centered on P1 is then determined. As shown in Fig. 9, the tracked point P2 is not in that region, so the corner P0 is deleted; if the tracked point P2 were in the region, the corner P0 would be retained.
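The Fig. 9 test can be sketched as follows; this is a minimal illustration with made-up point data, assuming predicted locations (P1) and tracked points (P2) have already been computed per steps S52 and S3.

```python
import math

def first_screening(corners, predicted, tracked, r):
    """Step S5 sketch: keep a corner only if its tracked point (P2) lies
    within radius r of the motion-model-predicted point (P1)."""
    kept = []
    for corner, (px, py), (tx, ty) in zip(corners, predicted, tracked):
        if math.hypot(tx - px, ty - py) <= r:
            kept.append(corner)
    return kept
```

The radius r absorbs the expected tracking noise; too small a value discards good corners, too large a value lets mistracked ones through.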
Thereafter, in the vehicle motion trajectory estimation method according to the invention, the corner points are screened a second time to obtain the best matrix model, thereby improving the accuracy of the trajectory estimation. Of course, to further improve the accuracy of the best matrix model obtained from the corner points, the corner points may first be screened with other methods before the second screening; that is, the corner points may be screened multiple times after step S5 and before step S6.
Referring to Fig. 10, for example, the corner points may be screened with the LK optical flow tracking method after step S5 and before step S6. The detailed procedure may be: first, using a pyramidal forward LK optical flow pass, determine the forward-tracked position (e.g. T01 in Fig. 10) in the current frame of a corner point in the previous frame (e.g. T0 in Fig. 10); then, using a pyramidal backward LK optical flow pass, determine the backward-tracked position (e.g. T10 in Fig. 10) of the forward-tracked point (T01) in the previous frame; finally, compute the distance between the corner point (T0) and the backward-tracked point (T10). If the distance between the two is less than a predetermined threshold d, the corner point is retained for the next step; if it exceeds d, the corner point is removed. Note that obtaining forward- and backward-tracked points with the LK optical flow algorithm is well known to those skilled in the art and is not repeated here.
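The forward-backward consistency check itself can be sketched as follows. In practice the forward and backward tracks would come from a pyramidal LK implementation (e.g. OpenCV's `calcOpticalFlowPyrLK`); here the tracked positions are assumed to be given already, so only the T0-versus-T10 distance test is shown.

```python
import math

def forward_backward_check(orig_pts, back_pts, d):
    """Keep the indices of corners whose backward-tracked position (T10)
    lies within distance d of the original corner (T0).

    orig_pts[i]: corner in the previous frame (T0)
    back_pts[i]: its forward-then-backward tracked position (T10)
    """
    kept = []
    for i, ((x0, y0), (x1, y1)) in enumerate(zip(orig_pts, back_pts)):
        if math.hypot(x1 - x0, y1 - y0) < d:
            kept.append(i)
    return kept
```

A corner whose round trip does not return close to its starting point was almost certainly mistracked in at least one direction, which is why this filter removes it before the RANSAC stage.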
After the corner points have been screened based on the first motion information of the vehicle, or based on the first motion information plus LK optical flow tracking, they are screened again with the RANSAC algorithm, which may include the following steps: randomly select 3 pairs of matched corner points, whose points are not collinear, from the current and previous frame images to obtain a transformation matrix model; compute the projection error of all other corner points against the transformation matrix model, and add each corner pair whose projection error is below the set threshold to the inlier set of that model; select 3 new pairs of matched corner points to obtain a new transformation matrix model, and again add each corner pair whose projection error is below the set threshold to the inlier set of that model; repeat the selection of matched corner points and the projection-error computation to obtain multiple corresponding inlier sets; select the inlier set containing the most corner points as the optimal inlier set, and take the transformation matrix model corresponding to the optimal inlier set as the best matrix model.
In general, the second screening of the corner points searches for an optimal transformation matrix model with the RANSAC algorithm such that the number of corner points satisfying the model is maximal. Specifically, the transformation matrix H is defined as follows:
Suppose the forward-tracked corner coordinates in the current frame are (x', y') and the matched corner coordinates in the previous frame are (x, y); then:
As the matrix above shows, a pair of matched corner points yields 2 equations, but the matrix has 6 unknown parameters, so at least 3 matching pairs are needed; the transformation matrix H can be solved from 3 matching pairs. The other matching corner points that survived the earlier LK optical flow screening are then substituted into the matrix to compute the projection error according to the following relation:
where t is the set threshold; a corner pair satisfying the relation is added to the inlier set. The selection of corner points and the projection-error computation are then repeated to obtain multiple inlier sets. By comparing the numbers of corner points in the inlier sets, the inlier set containing the most corner points is taken as the optimal inlier set. For example, if the inlier set of the matrix model H1 obtained from some 3 corner pairs contains the most corner points, then H1 is the best matrix model. Note that obtaining the matrix model H with the RANSAC algorithm is well known to those skilled in the art and is not repeated here.
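The RANSAC loop described above (3 non-collinear pairs, a 6-parameter model, inlier counting against a projection-error threshold) can be sketched in pure Python as follows. This is an illustrative sketch, not the patent's code: the function names, the threshold, the iteration count, and the Cramer's-rule solver are all choices made here.

```python
import math
import random

def _det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def _solve3(M, v):
    """Cramer's rule for a 3x3 system; returns None for a collinear sample."""
    D = _det3(M)
    if abs(D) < 1e-12:
        return None
    sol = []
    for col in range(3):
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = v[r]
        sol.append(_det3(Mc) / D)
    return sol

def fit_affine(pairs):
    """Exact 6-parameter model (r1..r6) from 3 non-collinear point pairs."""
    M = [[x, y, 1.0] for (x, y), _ in pairs]
    row_x = _solve3(M, [xp for _, (xp, _yp) in pairs])   # r1, r2, r3
    row_y = _solve3(M, [yp for _, (_xp, yp) in pairs])   # r4, r5, r6
    return None if row_x is None or row_y is None else row_x + row_y

def projection_error(H, src, dst):
    r1, r2, r3, r4, r5, r6 = H
    x, y = src
    return math.hypot(r1 * x + r2 * y + r3 - dst[0],
                      r4 * x + r5 * y + r6 - dst[1])

def ransac_affine(matches, thresh=2.0, iters=100, seed=0):
    """Return (best model, optimal inlier set) over random 3-pair samples."""
    rng = random.Random(seed)
    best_H, best_inliers = None, []
    for _ in range(iters):
        H = fit_affine(rng.sample(matches, 3))
        if H is None:
            continue                      # collinear sample, reject
        inliers = [m for m in matches
                   if projection_error(H, m[0], m[1]) < thresh]
        if len(inliers) > len(best_inliers):
            best_H, best_inliers = H, inliers
    return best_H, best_inliers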
Thereafter, the second motion information of the vehicle is calculated from the best matrix model obtained by the RANSAC screening of corner points. From the best matrix model H and the coordinates (xc, yc) of the rear-axle midpoint of the vehicle body in the surround-view image, the vehicle steering angle δ, the horizontal movement distance dx and the vertical movement distance dy of the vehicle in the surround-view image can be calculated.
Specifically, it is known that during motion a vehicle turns about the midpoint of its two rear wheels (the rear-axle midpoint). Since the size of the vehicle model in the surround-view image has a fixed correspondence with the actual size of the vehicle — the track of the two rear wheels in the surround-view image and the actual rear-wheel track are in a fixed proportion — the coordinates (xc, yc) of the rear-axle centre of the vehicle in the surround-view image can be obtained.
In addition, the positional relation between the previous frame and the current frame of the surround-view image can also be expressed by the following transformation-matrix model H1:
Assume that in the surround-view image the steering angle of the vehicle is δ, the horizontal movement distance of the vehicle is dx, and the vertical movement distance is dy (note: all distances here are pixel distances). Modelling the frame-to-frame motion as a rotation by δ about the rear-axle midpoint (xc, yc) followed by the translation (dx, dy), we have:
x1 = scale·cos(δ)
x2 = −scale·sin(δ)
x3 = (1 − x1)·xc − x2·yc + dx
x4 = scale·sin(δ)
x5 = scale·cos(δ)
x6 = −x4·xc + (1 − x5)·yc + dy
In the six formulas above, scale is a scale-conversion factor. Comparing H and H1, it can be seen that the relation between the previous frame and the current frame of the surround-view image is in fact solved directly by the RANSAC algorithm: x1 … x6 are equal to r1 … r6 respectively. Therefore, r1–r6 can be substituted into the formulas above to calculate the vehicle steering angle δ, the horizontal movement distance dx and the vertical movement distance dy of the vehicle in the surround-view image.
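Under this rotation-about-the-rear-axle model, δ, dx and dy — and from them D and V — can be recovered from r1–r6 as in the sketch below. The function names are illustrative, and the derivation assumes the model is (close to) a rigid motion, i.e. scale ≈ 1.

```python
import math

def decompose_motion(model, xc, yc):
    """Recover (delta, dx, dy) from r1..r6, assuming the model is a
    rotation by delta about the rear-axle midpoint (xc, yc) followed
    by a pixel translation (dx, dy), as in the H1 formulas above."""
    r1, r2, r3, r4, r5, r6 = model
    delta = math.atan2(r4, r1)            # vehicle steering angle (radians)
    dx = r3 - (1.0 - r1) * xc + r2 * yc   # horizontal pixel displacement
    dy = r6 + r4 * xc - (1.0 - r5) * yc   # vertical pixel displacement
    return delta, dx, dy

def distance_and_speed(dx, dy, pixel_d, dt):
    """Movement distance D and speed V from the pixel displacement,
    the real distance per pixel pixel_d and the frame interval dt."""
    D = math.hypot(dx, dy) * pixel_d
    return D, D / dt
```

Inverting the six H1 formulas term by term gives exactly these expressions, so a round trip (build H1 from known δ, dx, dy, then decompose) returns the original values.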
Thereafter, by combining the time interval Δt between the two frames and the actual distance pixel_d represented by each pixel in the surround-view image, the movement distance D and the movement speed V of the vehicle can be calculated:
D = √(dx² + dy²)·pixel_d
V = D / Δt
In addition, it should be pointed out that besides computing the steering-angle information of the vehicle from the best matrix model, it can also be obtained from the corner points in the optimal inlier set, as follows: choose two widely separated corner points in the previous frame, e.g. A(x0, y0) and B(x1, y1); if the distance AB between the two corners exceeds a predetermined value d, calculate the angle α of the line AB. At the same time calculate the angle β of the corresponding line A'B' in the current frame, where A' is the forward-tracked corner matched to A and B' is the forward-tracked corner matched to B. The steering angle of the vehicle is δ = |β − α|. When there are several widely separated corner pairs AB, several values of δ can be obtained; the main processing method is to take the weighted average of these values and use the average as the final steering angle.
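This corner-pair angle method can be sketched as follows. Uniform weighting of the per-pair angles and the minimum separation d_min are assumptions for illustration — the patent only specifies a weighted average and a predetermined distance d.

```python
import math

def steering_from_pairs(prev_pts, curr_pts, d_min=20.0):
    """Estimate |steering angle| from matched corners: for each pair of
    previous-frame corners A, B farther apart than d_min, compare the
    direction of line AB with that of the tracked line A'B', then
    average the angle differences (uniform weights assumed)."""
    deltas = []
    for i in range(len(prev_pts)):
        for j in range(i + 1, len(prev_pts)):
            (x0, y0), (x1, y1) = prev_pts[i], prev_pts[j]
            if math.hypot(x1 - x0, y1 - y0) <= d_min:
                continue  # skip corner pairs that are too close together
            (u0, v0), (u1, v1) = curr_pts[i], curr_pts[j]
            alpha = math.atan2(y1 - y0, x1 - x0)  # angle of line AB
            beta = math.atan2(v1 - v0, u1 - u0)   # angle of line A'B'
            # wrap the difference into (-pi, pi] before taking |.|
            diff = (beta - alpha + math.pi) % (2.0 * math.pi) - math.pi
            deltas.append(abs(diff))
    return sum(deltas) / len(deltas) if deltas else None
```

Rotating all corners by a common angle rotates every line AB by that angle, so each surviving pair votes for the same δ and the average is robust to a few noisy pairs.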
It should be specially noted that, in the technical solution of the invention, the number of corner points in the previously detected images decreases as the vehicle travels, because previously detected corners may no longer appear in the next frame image. Therefore, a threshold on the number of corner points can be preset: when the corner count in a frame falls below the threshold, the operations of corner detection and corner screening are performed again — while retaining the existing corners — so as to add new corners and thereby ensure the accuracy of the motion-parameter estimation.
Finally, the first motion information obtained in step S4 and the second motion information obtained in step S6 are fused by Kalman filtering to obtain the movement trajectory of the vehicle. Kalman-filter fusion mainly consists of two parts, a prior part and a posterior part. In the embodiment of the invention, the data of the prior part are obtained from the vehicle motion model, i.e. the first motion information described above, and the data of the posterior part are obtained from the corner points, i.e. the second motion information described above.
Specifically, this may include: establishing the state parameters of the vehicle from the first motion information and the second motion information respectively; setting the matrix parameters of the Kalman-filter fusion equations (for example the state-transition matrix, the observation matrix, the covariance matrix of the predicted estimate, the process-noise covariance matrix, the measurement-noise covariance matrix, etc.); and substituting the state parameters of the vehicle into the Kalman-filter fusion equations to calculate the movement trajectory of the vehicle.
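A minimal scalar sketch of this prior/posterior fusion is shown below: the vehicle-model displacement drives the prediction step and the corner-derived position is the measurement. The noise variances q and r are illustrative assumptions; the patent leaves the matrix parameters to the implementer.

```python
class KalmanFuser1D:
    """Minimal scalar Kalman filter fusing a motion-model prediction
    (prior) with an image-derived position measurement (posterior)."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.1):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process / measurement noise variances

    def step(self, model_displacement, image_position):
        # Predict: advance the state with the vehicle-model displacement.
        self.x += model_displacement
        self.p += self.q
        # Update: correct with the position measured from corner tracking.
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (image_position - self.x)
        self.p *= (1.0 - k)
        return self.x
```

In the full method the state is a vector (e.g. position, heading, speed) with the matrices listed above; running one such scalar filter per state component is the simplest possible stand-in for that setup.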
Note that the matrix parameters, equations and specific fusion calculations used in Kalman-filter fusion are well known to those skilled in the art and are therefore not described further.
According to the vehicle movement trajectory estimation method of the present invention, the vehicle motion information acquired by the on-board sensors yields the movement distance and drift angle of the vehicle, which are converted into the movement amount and rotation angle of the image; corner detection, screening and secondary screening then yield the best matrix model, from which the second motion information of the vehicle — the motion information of the image — is calculated; finally, the image movement amount and rotation angle are fused with the image motion information to obtain the movement trajectory of the vehicle. Compared with prior-art methods that use on-board sensors alone or the image optical-flow method alone to estimate the vehicle trajectory, the two methods complement each other well and avoid the shortcomings of each method, so that the trajectory of the vehicle can be estimated with higher precision whether the vehicle is travelling at high or low speed.
The foregoing is merely a description of the preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. A vehicle movement trajectory estimation method, characterized in that the vehicle movement trajectory estimation method comprises the following steps:
S1, acquiring a surround-view image of the body of a vehicle;
S2, detecting corner points in the surround-view image;
S3, tracking the corner points with the LK optical-flow tracking method to obtain tracked points of the corner points in the next frame image;
S4, obtaining first motion information of the vehicle through on-board sensors and a vehicle motion model;
S5, screening the corner points based on the first motion information;
S6, performing secondary screening on the corner points to obtain a best matrix model, and calculating second motion information of the vehicle through the matrix model;
S7, fusing the first motion information and the second motion information by Kalman filtering to obtain the movement trajectory of the vehicle.
2. The vehicle movement trajectory estimation method according to claim 1, characterized in that step S2 comprises:
S21, calculating the absolute pixel differences between a pixel under test and a plurality of pixels on a predetermined radius around it;
S22, if a predetermined number of the absolute pixel differences are greater than a threshold, taking the pixel under test as a feature point;
S23, judging whether the feature point is the only feature point in a neighbourhood centred on the feature point; if there is only this one feature point, taking the feature point as a corner point.
3. The vehicle movement trajectory estimation method according to claim 2, characterized in that step S2 further comprises:
if there are multiple feature points in the neighbourhood centred on the feature point, calculating a score for each feature point, the score being the sum of the absolute pixel differences between the feature point and the plurality of pixels;
if the score of the feature point is the largest, taking the feature point as the corner point.
4. The vehicle movement trajectory estimation method according to claim 1, characterized in that step S4 comprises:
obtaining the steering-wheel angle and speed information of the vehicle through the on-board sensors;
calculating the turning radius of the vehicle based on the vehicle motion model and the steering-wheel angle;
calculating the movement distance and drift angle of the vehicle based on the turning radius, the steering-wheel angle and the speed information.
5. The vehicle movement trajectory estimation method according to claim 4, characterized in that after the movement distance and drift angle of the vehicle are calculated, the movement distance and drift angle of the vehicle are converted into the movement amount and rotation angle of the image according to the relation between the world coordinate system and the image coordinate system.
6. The vehicle movement trajectory estimation method according to claim 5, characterized in that step S5 comprises:
setting a predetermined value based on the movement amount and rotation angle of the image;
estimating, through the vehicle motion model, the location point of a corner point in the next frame image;
determining whether the tracked point lies within a region centred on the location point with the predetermined value as radius;
if the tracked point lies within the region, retaining the corner point; otherwise deleting the corner point.
7. The vehicle movement trajectory estimation method according to claim 1, characterized in that after step S5 and before step S6, the vehicle movement trajectory estimation method may further comprise screening the screened corner points with the LK optical-flow tracking method, including:
determining, with the LK optical-flow forward-tracking algorithm, the forward-tracked corner point in the current frame image of a corner point in the previous frame;
determining, with the LK optical-flow backward-tracking algorithm, the backward-tracked corner point of the forward-tracked corner point in the previous frame;
calculating the distance between the corner point and the backward-tracked corner point in the previous frame, and retaining the corner point if the distance is less than a predetermined threshold.
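This forward-backward consistency check can be sketched as follows. The callables track_fwd and track_bwd stand in for the LK optical-flow calls (e.g. cv2.calcOpticalFlowPyrLK in a real implementation), and max_err is an illustrative threshold.

```python
import math

def forward_backward_filter(points, track_fwd, track_bwd, max_err=1.0):
    """Keep a previous-frame corner only if tracking it forward to the
    current frame and then backward again returns to within max_err
    pixels of where it started."""
    kept = []
    for p in points:
        q = track_fwd(p)            # previous frame -> current frame
        if q is None:
            continue                # forward track lost
        p_back = track_bwd(q)       # current frame -> previous frame
        if p_back is None:
            continue                # backward track lost
        if math.hypot(p_back[0] - p[0], p_back[1] - p[1]) < max_err:
            kept.append((p, q))     # consistent track: keep the pair
    return kept
```

A corner whose backward track lands far from its starting position was almost certainly mistracked in at least one direction, which is why the claim discards it.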
8. The vehicle movement trajectory estimation method according to claim 1 or claim 7, characterized in that, in step S6, secondary screening is performed on the screened corner points with the RANSAC algorithm, including:
randomly selecting 3 pairs of matched corner points from the current frame image and the previous frame image, the 3 pairs of corner points being non-collinear, to obtain a transformation-matrix model;
calculating the projection errors of all other corner points against the transformation-matrix model, and if a projection error is less than a set threshold, adding the corresponding corner pair to the inlier set of the transformation-matrix model;
reselecting 3 pairs of matched corner points to obtain a new transformation-matrix model, calculating the projection errors of all other corner points against that transformation-matrix model, and if a projection error is less than the set threshold, adding the corresponding corner pair to the inlier set of that transformation-matrix model;
repeating the above steps of selecting matched corner points and calculating projection errors to obtain multiple corresponding inlier sets;
selecting, among the multiple inlier sets, the inlier set containing the most corner points as the optimal inlier set, and taking the transformation-matrix model corresponding to the optimal inlier set as the best matrix model.
9. The vehicle movement trajectory estimation method according to claim 8, characterized in that the best matrix model obtained by the RANSAC algorithm is H = [[r1, r2, r3], [r4, r5, r6]], which maps a previous-frame corner (x, y) to x' = r1·x + r2·y + r3, y' = r4·x + r5·y + r6;
the vehicle steering angle δ, the horizontal movement distance dx and the vertical movement distance dy of the vehicle in the surround-view image are calculated from the best matrix model H and the coordinates (xc, yc) of the rear-axle midpoint of the vehicle in the surround-view image;
the movement distance D and movement speed V of the vehicle are calculated by combining the time difference Δt between the two frame images and the actual distance pixel_d represented by each pixel in the surround-view image: D = √(dx² + dy²)·pixel_d, V = D/Δt.
10. The vehicle movement trajectory estimation method according to claim 1, characterized in that step S7 comprises:
establishing the state parameters of the vehicle from the first motion information and the second motion information respectively;
setting the matrix parameters of the Kalman-filter fusion equations, and substituting the state parameters of the vehicle into the Kalman-filter fusion equations to calculate the movement trajectory of the vehicle.
CN201810049413.7A 2018-01-18 2018-01-18 A kind of vehicle movement track method of estimation Withdrawn CN108280847A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810049413.7A CN108280847A (en) 2018-01-18 2018-01-18 A kind of vehicle movement track method of estimation


Publications (1)

Publication Number Publication Date
CN108280847A true CN108280847A (en) 2018-07-13

Family

ID=62803911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810049413.7A Withdrawn CN108280847A (en) 2018-01-18 2018-01-18 A kind of vehicle movement track method of estimation

Country Status (1)

Country Link
CN (1) CN108280847A (en)


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109239752A (en) * 2018-09-29 2019-01-18 重庆长安汽车股份有限公司 Vehicle positioning system
CN110969574A (en) * 2018-09-29 2020-04-07 广州汽车集团股份有限公司 Vehicle-mounted panoramic map creation method and device
CN109345576A (en) * 2018-09-30 2019-02-15 西南政法大学 Vehicle Speed identification method and system
CN109345576B (en) * 2018-09-30 2022-09-06 西南政法大学 Vehicle running speed identification method and system
CN109655823A (en) * 2018-12-30 2019-04-19 北京经纬恒润科技有限公司 The tracking and device of target
CN110070565A (en) * 2019-03-12 2019-07-30 杭州电子科技大学 A kind of ship trajectory predictions method based on image superposition
CN112862856A (en) * 2019-11-27 2021-05-28 深圳市丰驰顺行信息技术有限公司 Method, device and equipment for identifying illegal vehicle and computer readable storage medium
CN111914627A (en) * 2020-06-18 2020-11-10 广州杰赛科技股份有限公司 Vehicle identification and tracking method and device
CN112506195A (en) * 2020-12-02 2021-03-16 吉林大学 Vehicle autonomous positioning system and positioning method based on vision and chassis information
CN113096407A (en) * 2021-02-27 2021-07-09 惠州华阳通用电子有限公司 Height-limiting channel vehicle anti-collision method and device
CN113096407B (en) * 2021-02-27 2022-10-11 惠州华阳通用电子有限公司 Height-limiting channel vehicle anti-collision method and device
CN113188848A (en) * 2021-04-12 2021-07-30 攸太科技(台州)有限公司 Urine tracking method

Similar Documents

Publication Publication Date Title
CN108280847A (en) A kind of vehicle movement track method of estimation
CN108257092A (en) A kind of vehicle body looks around image base display methods
US11554717B2 (en) Vehicular vision system that dynamically calibrates a vehicular camera
US10452999B2 (en) Method and a device for generating a confidence measure for an estimation derived from images captured by a camera mounted on a vehicle
CN104854637B (en) Moving object position attitude angle estimating device and moving object position attitude angle estimating method
US9171225B2 (en) Device, method, and recording medium for detecting and removing mistracked points in visual odometry systems
CN104183127B (en) Traffic surveillance video detection method and device
CN108196285B (en) Accurate positioning system based on multi-sensor fusion
CN109813335B (en) Calibration method, device and system of data acquisition system and storage medium
CN107229908A (en) A kind of method for detecting lane lines
CN107750364A (en) Detected using the road vertically profiling of stable coordinate system
CN107402012A (en) A kind of Combinated navigation method of vehicle
WO1997025700A1 (en) Traffic congestion measuring method and apparatus and image processing method and apparatus
CN108198248A (en) A kind of vehicle bottom image 3D display method
JP2002197469A (en) Device for detecting traffic lane
DE10394295T5 (en) Distance calculation device and calculation program
Ruotsalainen et al. Visual-aided two-dimensional pedestrian indoor navigation with a smartphone
JP5834933B2 (en) Vehicle position calculation device
CN107792070A (en) The drive assistance device of vehicle
CN108256484A (en) A kind of vehicle movement parameter evaluation method
JP5891802B2 (en) Vehicle position calculation device
JP2003085685A (en) Vehicle traveling track observing device and method using a plurality of video cameras
JP6611334B2 (en) Lane marking recognition system
CN107918763A (en) Method for detecting lane lines and system
JP7127607B2 (en) Rail curvature estimator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201203 Shanghai Pudong New Area free trade trial area, 1 spring 3, 400 Fang Chun road.

Applicant after: Shanghai Sen Sen vehicle sensor technology Co., Ltd.

Address before: 201210 301B room 560, midsummer Road, Pudong New Area Free Trade Zone, Shanghai

Applicant before: New software technology (Shanghai) Co., Ltd.

WW01 Invention patent application withdrawn after publication

Application publication date: 20180713
