CN102184535A - Method for detecting boundary of lane where vehicle is - Google Patents

Method for detecting boundary of lane where vehicle is

Info

Publication number
CN102184535A
CN102184535A (application CN 201110094319, also published as CN102184535B)
Authority
CN
China
Prior art keywords
lane boundary
vehicle
lane
parameter vector
alpha
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110094319
Other languages
Chinese (zh)
Other versions
CN102184535B (en)
Inventor
陈勇
何明一
张易凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN 201110094319 priority Critical patent/CN102184535B/en
Publication of CN102184535A publication Critical patent/CN102184535A/en
Application granted granted Critical
Publication of CN102184535B publication Critical patent/CN102184535B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for detecting the boundary of the lane in which a vehicle is traveling, comprising the steps of: first, adaptively extracting edge points from the part of a road image below the horizon according to pixel gradient magnitudes, and computing the edge direction; then, according to the extracted edge points, the edge directions, and a lane boundary projection model, searching for and locating the lane boundary in the parameter vector space of the lane boundary projection model by particle swarm optimization; and finally, computing the plane geometry of the lane in which the vehicle is traveling, the position of the vehicle in the lane, and the deviation angle from the located lane boundary projection model parameter values using parameter calculation formulas. The method reflects the actual lane boundary curve more accurately, adapts well to changing environments, strengthens the robustness of lane detection, and offers high reliability and strong resistance to interference.

Description

Method for detecting the boundary of the lane in which a vehicle is traveling
Technical field
The invention belongs to the technical field of environment perception for intelligent vehicles, and relates to a lane boundary detection method.
Background technology
In intelligent transportation systems, lane detection is an important component of environment perception for intelligent vehicles, and is mainly used for intelligent cruise control, lateral vehicle control, lane departure warning, autonomous driving, and so on. Vision is the environment perception mode most used by human drivers; most vehicles travel on structured roads, and structured roads are designed for human vision, so visual perception is one of the most effective and important technical means of lane detection.
Existing vision-based lane detection methods for structured roads can be summarized as three parts: defining a lane model, extracting lane features, and locating the lane from the extracted features under the constraints of the lane model. The lane models defined in the prior art mainly include straight lines, quadratic polynomials, splines, circular arcs, etc., but none of these models fully corresponds to the actual lane curve; they can only partially or approximately reflect the lane structure. They therefore cannot accurately detect lane boundaries of different shapes, and even when a lane is detected they cannot accurately estimate the position and direction of the vehicle in the lane or the plane geometry of the lane, such as the lane curvature and its rate of change, even though these data are very important for autonomous driving, intelligent cruise control, and decision making. As lane features, the prior art mainly uses the edges of lane markings, gradients, and the texture and color of the road surface. Gradient, texture, and color features are easily affected by shadows, illumination, and weather changes, whereas the edge feature of lane markings is relatively stable and more resistant to interference. However, in existing lane detection methods the edge threshold used for edge extraction is fixed; as road illumination changes, the lane marking edges may fail to be extracted, so the lane cannot be detected. For the extracted edges, existing methods usually fit the edge points by classification, clustering, neural networks, and similar techniques to locate the lane boundary. Since other edges are also extracted in the process of extracting lane marking edges, interference must be filtered out before fitting, otherwise the accurate location of the lane boundary is affected; but filtering in turn increases the complexity of the lane detection method, and the prior art offers no effective solution to this problem.
On the other hand, the road conditions encountered by a moving vehicle are complex and changeable, so lane detection must adapt to various lane markings, road illumination conditions, and complex environments. This complexity and variation shows in three respects. The first concerns the appearance of the lane markings: the lane alignment may be a straight line, a circular arc, or a clothoid; the markings may be white or yellow; and the marking type may be a single solid line, a double solid line, a dashed line, etc. The second concerns road illumination: shadows of roadside buildings, branches and leaves, and other vehicles; the brightness of lane markings under different weather and lighting conditions; glare caused by strong sunlight reflection; and markings blurred by wear, all of which affect the visibility of the lane markings. The third concerns the road surface: damage and cracks, differences in surface material, and occlusion by other vehicles on the road, all of which interfere with the detection of lane markings. Lane detection under such complex and adverse conditions is very difficult, and the prior art cannot fully solve it.
A search of the prior art literature finds the paper by Y. Zhou et al., "A robust lane detection and tracking method based on computer vision", published in Measurement Science & Technology, vol. 17, 2006, which adopts projection models of straight and circular-arc lanes and uses the pixel gradient feature and a tabu search algorithm to locate the lane boundary. Although that method handles various lane markings and detects lanes under road shadows, surface damage and cracks, vehicle occlusion, and similar conditions, it still has shortcomings. First, it can only detect straight and circular-arc lane boundaries, and cannot accurately detect clothoid-shaped lane boundaries. Second, it cannot fully estimate the plane geometry of the lane. Third, it cannot effectively detect lane boundaries whose markings have been blurred by wear. Fourth, it cannot detect lane boundaries under dusk lighting conditions.
Summary of the invention
To overcome the inability of the prior art to detect the lane boundary under complex and adverse conditions and to provide the plane geometry of the lane and the position and direction of the vehicle in the lane, the invention provides a method for detecting the boundary of the lane in which a vehicle is traveling. The method detects the boundary of the vehicle's lane not only in ordinary road environments but also in complex and adverse ones, such as road shadows, cracks, other vehicles, sunlight reflection, dim light, and worn lane markings, and it computes the geometric structure parameter values of the lane and the position and direction of the vehicle in the lane.
The technical solution adopted by the invention to solve the technical problem comprises the following steps:
Step 1: adaptively extract edge points from the part of the road image below the horizon according to the pixel gradient magnitudes, and compute the edge direction;
Step 2: according to the extracted edge points, the edge directions, and the lane boundary projection model, search for and locate the lane boundary in the parameter vector space of the lane boundary projection model using particle swarm optimization;
Step 3: compute the plane geometry of the lane in which the vehicle is traveling and the position and deviation angle of the vehicle in the lane from the located lane boundary projection model parameter values using the parameter calculation formulas.
Step 1 comprises the following steps:
Step 1.1: determine the horizon position in the image by computing

$$r_H = r_O - \frac{f_c \tan\alpha}{d_y}$$

where r_O is the ordinate of the image center, d_y is the physical size of a pixel in the vertical direction, f_c is the focal length of the on-board camera, and α is the pitch angle of the camera; the horizon (j, r_H), j = 0, 1, …, N, where N is the image width in pixels, divides the image into an upper part and a lower part;
Step 1.2: compute the gradient magnitude of each pixel in the part of the image below the horizon

$$G_m(c,r) = \sqrt{G_x^2(c,r) + G_y^2(c,r)}$$

where (c, r) are the coordinates of a pixel below the horizon, and G_x(c, r) and G_y(c, r) are the gradient magnitudes in the horizontal and vertical directions, computed respectively as

$$G_x(c,r) = f(c+1,r-1) + 2f(c+1,r) + f(c+1,r+1) - f(c-1,r-1) - 2f(c-1,r) - f(c-1,r+1)$$
$$G_y(c,r) = f(c-1,r+1) + 2f(c,r+1) + f(c+1,r+1) - f(c-1,r-1) - 2f(c,r-1) - f(c+1,r-1)$$

where f(c, r) is the pixel value at point (c, r) below the horizon;
Step 1.3: compute the edge extraction threshold

$$G_{mth} = \frac{w_G}{(M - r_H + 1)\times N}\sum_{r=r_H}^{M}\sum_{c=1}^{N} G_m(c,r)$$

where M and N are the image height and width in pixels, and w_G is a coefficient with value range 0.1 ≤ w_G ≤ 1.5;
Step 1.4: compare the gradient magnitude of each pixel below the horizon with the edge threshold; if the gradient magnitude is greater than the edge threshold, the pixel is an edge point and its edge direction is computed as θ(c, r) = arctan[G_y(c, r)/G_x(c, r)].
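For illustration, the following is a minimal Python sketch of steps 1.2 to 1.4, assuming a grayscale image stored as a NumPy array indexed as img[row, column]; the function name extract_edges is ours, and arctan2 replaces the guarded arctan quotient of step 1.4 as an implementation convenience, not as part of the claimed method.

```python
import numpy as np

def extract_edges(img, r_H, w_G=0.6):
    """Adaptive edge extraction below the horizon (steps 1.2-1.4).

    img: grayscale road image, 2-D array indexed img[r, c].
    r_H: horizon row from step 1.1.
    w_G: threshold coefficient, 0.1 <= w_G <= 1.5 (0.6 in the embodiment).
    Returns an edge mask and the edge-direction map for the sub-horizon region.
    """
    roi = img[r_H:, :].astype(np.float64)
    Gx = np.zeros_like(roi)
    Gy = np.zeros_like(roi)
    # 3x3 gradient kernels exactly as written in step 1.2 (f(c+1, r-1) etc.).
    Gx[1:-1, 1:-1] = (roi[:-2, 2:] + 2 * roi[1:-1, 2:] + roi[2:, 2:]
                      - roi[:-2, :-2] - 2 * roi[1:-1, :-2] - roi[2:, :-2])
    Gy[1:-1, 1:-1] = (roi[2:, :-2] + 2 * roi[2:, 1:-1] + roi[2:, 2:]
                      - roi[:-2, :-2] - 2 * roi[:-2, 1:-1] - roi[:-2, 2:])
    Gm = np.hypot(Gx, Gy)        # gradient magnitude G_m(c, r)
    G_mth = w_G * Gm.mean()      # step 1.3: mean gradient over the region, scaled
    edges = Gm > G_mth           # step 1.4: threshold comparison
    theta = np.arctan2(Gy, Gx)   # edge direction theta(c, r)
    return edges, theta
```

Here Gm.mean() approximates the normalized double sum of step 1.3 over the sub-horizon region (the one-pixel border is left at zero).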
Step 2 comprises the following steps:
Step 2.1: the projection curve equations of the left and right lane boundary lines, i.e., the lane boundary curve equations in the image, also called the left and right lane boundary projection models, are respectively

$$b_{1L}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$
$$b_{1R}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$

where c is the pixel abscissa and b_{1L}, b_{1R}, b_0, b_{-1}, and b_{-2} are the parameters of the lane boundary projection curves; define B = (b_{1L}, b_{1R}, b_0, b_{-1}, b_{-2})^T as the lane boundary projection model parameter vector. Set the feasible region of B as (−5, 0, 0, −2000, −3000)^T < B < (0, 5, 300, 2000, 3000)^T, and set the particle swarm size m, the maximum particle flying velocity V_max = (v_{1Lmax}, v_{1Rmax}, v_{0max}, v_{−1max}, v_{−2max})^T, and the maximum number of search iterations Iter_max, where the value range of V_max is (1, 1, 100, 1500, 2000)^T ≤ V_max ≤ (5, 5, 500, 2500, 3000)^T, the range of m is 10 ≤ m ≤ 60, and the range of Iter_max is 30 ≤ Iter_max ≤ 100;
Step 2.2: randomly initialize each parameter vector particle's position B_i and velocity V_i within the feasible region of the lane boundary projection model parameter vector B, where i is the index of the parameter vector particle; set the historical best position of each particle P_i = B_i; then compute the lane boundary curve confidence F(B) of each particle from the edge points and edge directions, compare the confidences of all particles, and obtain the historical best position G of the parameter vector swarm as the position of maximum confidence;
Step 2.3: update each parameter vector particle's velocity V_i and position B_i as follows:

$$V_i(k+1) = w(k)V_i(k) + c_1 r_1\left[P_i(k) - B_i(k)\right] + c_2 r_2\left[G(k) - B_i(k)\right]$$
$$B_i(k+1) = B_i(k) + V_i(k+1)$$

where k is the search iteration count; V_i(k), B_i(k), and P_i(k) are the velocity, position, and historical best position of the i-th parameter vector particle at the k-th iteration, and G(k) is the historical best position of the swarm; w(k) is the inertia weight; c_1 and c_2 are constants with value range (0, 4]; r_1 and r_2 are random numbers on the interval (0, 1); and i = 1, 2, …, m;
Step 2.4: check each parameter vector particle's velocity and position; limit particles exceeding the maximum velocity by setting their velocity to the maximum, and return out-of-bounds particles by resetting their positions randomly within the feasible region;
Step 2.5: compute the lane boundary curve confidence F(B) of each parameter vector particle from the edge points and edge directions, and update by comparison the historical best position P_i of each particle and the historical best position G of the parameter vector swarm;
Step 2.6: compare the search iteration count k with the maximum iteration count Iter_max; if k < Iter_max, go to step 2.3, otherwise go to step 2.7;
Step 2.7: output the lane boundary curve corresponding to the historical best position G of the parameter vector swarm.
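As an illustration of steps 2.2 to 2.7, a minimal particle swarm search over the parameter vector space is sketched below in Python. The bounds and settings follow step 2.1 (with the embodiment values described later); the confidence callable F(B) is supplied separately, and the linearly decreasing inertia weight w(k) is an assumption, since the source does not reproduce its formula.

```python
import numpy as np

# Feasible region of B and embodiment settings from step 2.1.
B_LO = np.array([-5.0, 0.0, 0.0, -2000.0, -3000.0])
B_HI = np.array([0.0, 5.0, 300.0, 2000.0, 3000.0])
V_MAX = np.array([5.0, 5.0, 300.0, 600.0, 600.0])
M_SWARM, ITER_MAX, C1, C2 = 30, 50, 2.0, 2.0

def pso_search(confidence, rng=np.random.default_rng(0)):
    """Search for the lane boundary parameter vector B by PSO (steps 2.2-2.7).

    confidence: callable F(B) scoring a parameter vector against the edge map.
    Returns G, the historical best position of the swarm.
    """
    B = rng.uniform(B_LO, B_HI, size=(M_SWARM, 5))     # positions B_i
    V = rng.uniform(-V_MAX, V_MAX, size=(M_SWARM, 5))  # velocities V_i
    P = B.copy()                                       # personal bests P_i
    Pf = np.array([confidence(b) for b in B])
    G = P[np.argmax(Pf)].copy()                        # swarm best G
    for k in range(ITER_MAX):
        w = 0.9 - 0.5 * k / ITER_MAX                   # assumed inertia schedule w(k)
        r1, r2 = rng.random((M_SWARM, 1)), rng.random((M_SWARM, 1))
        V = w * V + C1 * r1 * (P - B) + C2 * r2 * (G - B)
        V = np.clip(V, -V_MAX, V_MAX)                  # step 2.4: velocity limit
        B = B + V
        out = (B < B_LO) | (B > B_HI)                  # step 2.4: random return of strays
        rand = rng.uniform(np.broadcast_to(B_LO, B.shape),
                           np.broadcast_to(B_HI, B.shape))
        B[out] = rand[out]
        f = np.array([confidence(b) for b in B])       # step 2.5: update bests
        better = f > Pf
        P[better], Pf[better] = B[better], f[better]
        G = P[np.argmax(Pf)].copy()
    return G                                           # step 2.7: best boundary found
```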
The lane boundary curve confidence F(B) is computed as

$$F(B) = \exp\!\left(-\frac{[(b_{1R}-b_{1L})-\mu_W]^2}{2\sigma_W^2}\right)\sum_{(c,r)\in U}\exp\!\left(-\frac{D^2(c,r)}{2\sigma_D^2}\right)\exp\!\left(-\frac{\varphi^2(c,r)}{2\sigma_\varphi^2}\right)$$

where D(c, r) is the distance from edge point (c, r) to the lane boundary curve, φ(c, r) is the angle between the edge direction at (c, r) and the lane boundary curve, and U is the neighborhood of the lane boundary curve, whose radius has value range [2, 30]; μ_W and σ_W² are the mean and variance of (b_{1R} − b_{1L}), and σ_D² and σ_φ² are the variances of D(c, r) and φ(c, r), with value ranges determined by the lane width W_lane, the physical pixel sizes d_x and d_y in the horizontal and vertical directions, and the height h_c of the on-board camera above the ground.
In the computation of the lane boundary curve confidence F(B), the distance D(c, r) from edge point (c, r) to the lane boundary curve and the angle φ(c, r) between the edge direction at (c, r) and the lane boundary curve are computed from the left and right boundary components, each edge point being evaluated against the nearer boundary:

$$D(c,r) = \min\{D_L(c,r),\,D_R(c,r)\},\qquad \varphi(c,r) = \begin{cases}\theta(c,r)-\psi_L, & D_L(c,r) \le D_R(c,r)\\ \theta(c,r)-\psi_R, & \text{otherwise}\end{cases}$$

where

$$D_L(c,r) = \left|\left[b_{1L}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]\cos\psi_L\right|$$
$$D_R(c,r) = \left|\left[b_{1R}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]\cos\psi_R\right|$$

and ψ_L and ψ_R are the normal directions of the left and right lane boundary curves:

$$\psi_L = \arctan\!\left[-b_{1L}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
$$\psi_R = \arctan\!\left[-b_{1R}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
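A sketch of this confidence evaluation in Python follows, assuming the Gaussian form and the nearer-boundary association reconstructed above; the variance values and the factory-function layout are illustrative choices, with the neighborhood radius of 15 pixels and μ_W = 1.3 taken from the embodiment described later.

```python
import numpy as np

def make_confidence(edge_r, edge_c, edge_theta, r_H, mu_W=1.3,
                    var_W=0.05, var_D=9.0, var_phi=0.09, radius=15):
    """Build F(B) from the edge points of step 1.

    edge_r, edge_c, edge_theta: edge-point rows, columns, and directions.
    The variances are illustrative assumptions within the ranges in the text.
    """
    keep = edge_r > r_H                      # rows strictly below the horizon
    dr = (edge_r[keep] - r_H).astype(np.float64)
    col = edge_c[keep].astype(np.float64)
    th = edge_theta[keep]

    def F(B):
        b1L, b1R, b0, bm1, bm2 = B
        common = b0 + bm1 / dr + bm2 / dr**2
        # Normal directions psi_L, psi_R of the two boundary curves.
        psi_L = np.arctan(-b1L + bm1 / dr**2 + 2 * bm2 / dr**3)
        psi_R = np.arctan(-b1R + bm1 / dr**2 + 2 * bm2 / dr**3)
        D_L = np.abs((b1L * dr + common - col) * np.cos(psi_L))
        D_R = np.abs((b1R * dr + common - col) * np.cos(psi_R))
        use_L = D_L <= D_R                   # associate each point with the nearer curve
        D = np.where(use_L, D_L, D_R)
        phi = th - np.where(use_L, psi_L, psi_R)
        near = D <= radius                   # neighborhood U of the boundary curves
        score = np.sum(np.exp(-D[near] ** 2 / (2 * var_D))
                       * np.exp(-phi[near] ** 2 / (2 * var_phi)))
        width = np.exp(-((b1R - b1L) - mu_W) ** 2 / (2 * var_W))
        return width * score

    return F
```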
Step 3 comprises the following steps:
From the historical best position G of the parameter vector swarm, compute the curvature C_0 of the vehicle's lane and the rate of change C_1 of the lane curvature respectively as:

$$C_0 = \frac{2d_xd_y\cos^3\alpha}{f_c^3 h_c}\left(b_{-1}f_c + \frac{3}{2}b_{-2}d_y\sin 2\alpha\right)$$
$$C_1 = \frac{6b_{-2}d_xd_y^2\cos^5\alpha}{f_c^3 h_c^2}$$

and, from the historical best position G of the parameter vector swarm, compute the deviation angle β of the vehicle in the lane and the distances d_L and d_R from the vehicle to the left and right lane boundary lines respectively as:

$$\beta = -\frac{d_x\cos\alpha}{f_c^3}\left[(b_0-c_O)f_c^2 + b_{-1}f_cd_y\sin 2\alpha + \frac{3}{4}b_{-2}d_y^2\sin^2 2\alpha\right]$$
$$d_L = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1L}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$
$$d_R = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1R}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$

where γ is the roll angle of the on-board camera and c_O is the abscissa of the image center.
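A direct transcription of these step 3 formulas into Python is sketched below; the function name and argument list are ours, and all camera parameters are assumed to come from calibration.

```python
import numpy as np

def lane_geometry(G, fc, dx, dy, alpha, h_c, c_O, gamma=0.0):
    """Recover C0, C1, beta, d_L, d_R from the best parameter vector
    G = (b1L, b1R, b0, bm1, bm2) found in step 2."""
    b1L, b1R, b0, bm1, bm2 = G
    s2a, ca = np.sin(2 * alpha), np.cos(alpha)
    C0 = (2 * dx * dy * ca**3 / (fc**3 * h_c)) * (bm1 * fc + 1.5 * bm2 * dy * s2a)
    C1 = 6 * bm2 * dx * dy**2 * ca**5 / (fc**3 * h_c**2)
    beta = -(dx * ca / fc**3) * ((b0 - c_O) * fc**2 + bm1 * fc * dy * s2a
                                 + 0.75 * bm2 * dy**2 * s2a**2)
    # Shared bracket terms of the d_L and d_R formulas.
    common = (0.5 * (b0 - c_O) * fc**2 * dy * s2a
              + 0.25 * bm1 * fc * dy**2 * s2a**2
              + 0.125 * bm2 * dy**3 * s2a**3)
    scale = dx * h_c / (fc**3 * dy * ca)
    d_L = -gamma * h_c + scale * (b1L * fc**3 + common)
    d_R = -gamma * h_c + scale * (b1R * fc**3 + common)
    return C0, C1, beta, d_L, d_R
```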
The beneficial effects of the invention are:
1) The lane model of the invention matches the projection in the image of the lane horizontal alignment prescribed by structured-road design specifications; it takes full account of the geometric features of lane markings and therefore reflects the actual lane boundary curve more accurately;
2) The invention extracts edges and edge directions only from the ground portion of the image, and the edge threshold adapts itself through steps 1.2 and 1.3, so the method copes with road environments of different illumination and brightness, such as sunny days, cloudy days, night, glare, and worn lane markings, and has good environmental adaptability;
3) Based on distance and direction, the invention evaluates only edge points that bear a certain similarity to lane markings, effectively avoiding interference from non-lane-marking edges such as shadows, road surface cracks, and other vehicles; this strengthens the robustness of lane detection, and the method has high reliability and strong resistance to interference.
The invention is described further below with reference to the drawings and an embodiment.
Description of drawings
Fig. 1 is a schematic diagram of the world coordinate system and the camera coordinate system for the lane markings described in the embodiment of the invention;
Fig. 2 is a schematic diagram of the image coordinate system and the pixel coordinate system for the lane markings described in the embodiment;
Fig. 3 is the overall flowchart of the lane boundary detection method of the embodiment;
Fig. 4 is the flowchart of extracting edge points from the part of the road image below the horizon and computing the edge direction, as described in the embodiment;
Fig. 5 is the flowchart of searching for and locating the lane boundary in the parameter vector space of the lane boundary projection model from the extracted edge points, edge directions, and lane boundary projection model, as described in the embodiment;
Fig. 6 shows the lane boundary detection result of the embodiment with shadows on the road surface;
Fig. 7 shows the lane boundary detection result of the embodiment with occlusion by other vehicles on the road;
Fig. 8 shows the lane boundary detection result of the embodiment with cracks, shadows, and worn lane markings on the road surface;
Fig. 9 shows the lane boundary detection result of the embodiment against backlight and at dusk.
Embodiment
In the embodiment of the invention a video camera is mounted at the front of the roof centerline of the vehicle, with the lens facing directly ahead. After the camera parameters have been calibrated, the vehicle drives along the lane, the on-board camera captures images of the road ahead in real time, and the method of the invention detects the boundary of the vehicle's own lane in the road images.
For lane boundary detection on structured roads, a suitable lane boundary model can be adopted. A lane boundary model that matches the real boundary not only improves the accuracy of lane boundary detection but also makes it possible to estimate the plane geometry of the lane, such as the lane curvature and its rate of change, and the position and direction of the vehicle in the lane. The principle of the lane boundary model of the embodiment is explained as follows.
According to highway route design specifications, the horizontal alignment of a highway is composed of three elements: straight lines, circular curves, and clothoids. Since the road surface is essentially a plane within visual range, the lane curvature C as a function of the lane length l can be expressed as

$$C(l) = C_0 + C_1 l$$
where C_0 is the curvature of the lane at the viewpoint and C_1 is the rate of change of the lane curvature. Depending on the values of C_0 and C_1, the above relation can represent a clothoid as well as a straight line or a circular arc. As shown in Fig. 1, establish the world coordinate system O_wX_wY_wZ_w, in which the coordinate plane O_wX_wZ_w is parallel to the road surface and the axis O_wZ_w is parallel to the tangent of the lane marking at the point corresponding to the viewpoint; the origin O_w is at the same height as the origin O_c of the on-board camera coordinate system O_cX_cY_cZ_c, namely at height h_c above the road surface. Considering that the vehicle drives along the lane and the angle between the vehicle direction and the lane direction is small, the left lane boundary line satisfies the lane equation

$$x_w = \frac{1}{2}C_0 l^2 + \frac{1}{6}C_1 l^3,\qquad y_w = h_c,\qquad z_w = l$$
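To make the road model concrete, the following small Python sketch samples the left boundary in world coordinates under the stated flat-road assumption; the curvature values, section length, and step are illustrative only.

```python
import numpy as np

def lane_boundary_world(C0, C1, h_c, length=60.0, step=0.5):
    """Sample the left lane boundary from the clothoid model C(l) = C0 + C1*l."""
    l = np.arange(0.0, length, step)
    x_w = C0 * l**2 / 2 + C1 * l**3 / 6   # lateral offset x_w(l)
    y_w = np.full_like(l, h_c)            # constant height of the road plane
    z_w = l                               # distance along the lane direction
    return x_w, y_w, z_w

# e.g. a gentle clothoid seen from a camera 1.4 m above the road:
x_w, y_w, z_w = lane_boundary_world(C0=1e-3, C1=5e-5, h_c=1.4)
```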
As shown in Fig. 2, establish the pixel coordinate system with origin O_1, where O_c is the image center. Let d_x and d_y be the physical sizes of a pixel along the x and y axes, β the angle between the vehicle direction and the lane direction, and d_L the distance vector from the projection of the image center onto the road surface to the left lane boundary line. Considering that the roll angle γ of the on-board camera, the lane curvature, and the rate of change of the lane curvature are all small, a coordinate transformation applied to the above equation gives the left lane boundary curve equation in the pixel coordinate system, also called the left lane boundary projection model:

$$b_{1L}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$
where

$$b_{1L} = \frac{d_y\cos\alpha}{d_x h_c}\left(d_L + \gamma h_c + \beta h_c\tan\alpha + \frac{1}{2}C_0 h_c^2\tan^2\alpha - \frac{1}{6}C_1 h_c^3\tan^3\alpha\right)$$
$$b_0 = \frac{f_c}{d_x\cos\alpha}\left(-\beta - C_0 h_c\tan\alpha + \frac{1}{2}C_1 h_c^2\tan^2\alpha\right) + c_O$$
$$b_{-1} = \frac{f_c^2 h_c(C_0 - C_1 h_c\tan\alpha)}{2d_xd_y\cos^3\alpha}$$
$$b_{-2} = \frac{f_c^3 h_c^2 C_1}{6d_xd_y^2\cos^5\alpha}$$
$$r_H = r_O - \frac{f_c\tan\alpha}{d_y}$$
Similarly, the right lane boundary curve equation, also called the right lane boundary projection model, is obtained:

$$b_{1R}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$

where

$$b_{1R} = \frac{d_y\cos\alpha}{d_x h_c}\left(d_R + \gamma h_c + \beta h_c\tan\alpha + \frac{1}{2}C_0 h_c^2\tan^2\alpha - \frac{1}{6}C_1 h_c^3\tan^3\alpha\right)$$

and d_R is the distance vector from the projection of the image center onto the road surface to the right lane boundary line. Define B = (b_{1L}, b_{1R}, b_0, b_{-1}, b_{-2})^T as the lane boundary projection model parameter vector; the lane markings in the road image are then determined by B, whose value range is (−5, 0, 0, −2000, −3000)^T < B < (0, 5, 300, 2000, 3000)^T.
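Given a parameter vector B, the two projection curves can be evaluated directly; the following sketch (helper name ours) returns the column coordinate c(r) of each boundary for the image rows below the horizon, which is also how a detected boundary would be drawn back into the image.

```python
import numpy as np

def boundary_columns(B, r_H, rows):
    """Evaluate the left/right projection curves c(r) for rows r > r_H.

    B = (b1L, b1R, b0, b_minus1, b_minus2); rows is an array of pixel rows.
    """
    b1L, b1R, b0, bm1, bm2 = B
    dr = rows - r_H
    common = b0 + bm1 / dr + bm2 / dr**2   # shared terms of both curve equations
    c_left = b1L * dr + common
    c_right = b1R * dr + common
    return c_left, c_right
```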
Unlike the lane models used in existing lane detection methods, the lane boundary projection model of the embodiment is a faithful description of the actual lane boundary shape. Using it not only improves the accuracy of lane detection and effectively suppresses interference from non-lane-marking edges, but also allows the lane curvature C_0, the rate of change of the lane curvature C_1, the angle β between the vehicle direction and the lane direction, and the distances d_L and d_R from the vehicle to the left and right lane boundary lines to be estimated from the lane boundary projection model parameter values, computed respectively as follows:

$$C_0 = \frac{2d_xd_y\cos^3\alpha}{f_c^3 h_c}\left(b_{-1}f_c + \frac{3}{2}b_{-2}d_y\sin 2\alpha\right)$$
$$C_1 = \frac{6b_{-2}d_xd_y^2\cos^5\alpha}{f_c^3 h_c^2}$$
$$\beta = -\frac{d_x\cos\alpha}{f_c^3}\left[(b_0-c_O)f_c^2 + b_{-1}f_cd_y\sin 2\alpha + \frac{3}{4}b_{-2}d_y^2\sin^2 2\alpha\right]$$
$$d_L = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1L}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$
$$d_R = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1R}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$
With the lane boundary projection model described above, the overall flowchart of the embodiment of the invention is shown in Fig. 3; the implementation is as follows:
Step 1: adaptively extract edge points from the part of the road image below the horizon according to the pixel gradient magnitudes, and compute the edge direction. Specifically, the flowchart is shown in Fig. 4, and the implementation and principle are as follows:
Step 1.1: divide the image into upper and lower parts at the horizon. The visual information of the lane exists only on the road surface, and the horizon in the image can be computed in advance from the intrinsic and extrinsic parameters of the on-board camera; to reduce computation and avoid processing useless information in the image, lane detection only needs to process the road-related information, i.e., the part of the image below the horizon. Specifically, compute

$$r_H = r_O - \frac{f_c\tan\alpha}{d_y}$$

where r_O is the ordinate of the image center, d_y is the physical size of a pixel in the vertical direction, f_c is the focal length of the on-board camera, and α is the pitch angle of the camera; the horizon (j, r_H), j = 0, 1, …, N, where N is the image width in pixels, divides the image into an upper part and a lower part;
Step 1.2: compute the gradient magnitude G_m(c, r) of each pixel in the part of the image below the horizon, where (c, r) are the coordinates of a pixel below the horizon. Specifically, for pixels with r ≥ r_H, use the isotropic Sobel operator to compute the gradient magnitudes G_x(c, r) and G_y(c, r) in the horizontal and vertical directions, and then the gradient magnitude G_m(c, r); the computations are respectively:

$$G_x(c,r) = f(c+1,r-1) + 2f(c+1,r) + f(c+1,r+1) - f(c-1,r-1) - 2f(c-1,r) - f(c-1,r+1)$$
$$G_y(c,r) = f(c-1,r+1) + 2f(c,r+1) + f(c+1,r+1) - f(c-1,r-1) - 2f(c,r-1) - f(c+1,r-1)$$
$$G_m(c,r) = \sqrt{G_x^2(c,r) + G_y^2(c,r)}$$
Step 1.3: compute the edge threshold G_mth. An edge threshold must be set in order to extract the lane marking edges; because of changes in weather, sunlight, and environment, an edge extraction method with a built-in fixed threshold cannot adapt to all variations: when the threshold is too high, the edges of worn lane markings and of markings in dim environments cannot be extracted, and when it is too low, noise and interference are mistakenly extracted as edges. Different road images therefore need different edge thresholds, computed as:

$$G_{mth} = \frac{w_G}{(M - r_H + 1)\times N}\sum_{r=r_H}^{M}\sum_{c=1}^{N} G_m(c,r)$$

where M and N are the image height and width in pixels, and w_G is a coefficient with value range 0.1 ≤ w_G ≤ 1.5; specifically, w_G = 0.6;
Step 1.4: compare the gradient magnitude G_m(c, r) of each pixel below the horizon with the edge threshold G_mth; if it is greater than the edge threshold, the pixel is an edge point and its edge direction is computed, specifically θ(c, r) = arctan[G_y(c, r)/G_x(c, r)];
Step 2: according to the extracted edge points, the edge directions, and the lane boundary projection model, search for and locate the lane boundary in the parameter vector space of the lane boundary projection model using particle swarm optimization. Specifically, the flowchart is shown in Fig. 5, and the implementation is as follows:
Step 2.1: set the feasible region of the lane boundary projection model parameter vector B, the particle swarm size m, the maximum particle flying velocity V_max = (v_{1Lmax}, v_{1Rmax}, v_{0max}, v_{−1max}, v_{−2max})^T, and the maximum number of search iterations Iter_max, where the value range of V_max is (1, 1, 100, 1500, 2000)^T ≤ V_max ≤ (5, 5, 500, 2500, 3000)^T, the range of m is 10 ≤ m ≤ 60, and the range of Iter_max is 30 ≤ Iter_max ≤ 100; specifically, (−5, 0, 0, −2000, −3000)^T < B < (0, 5, 300, 2000, 3000)^T, V_max = (5, 5, 300, 600, 600)^T, m = 30, and Iter_max = 50;
Step 2.2: randomly initialize each parameter vector particle's position B_i and velocity V_i within the feasible region of the lane boundary projection model parameter vector, where i is the particle index; set the historical best position of each particle P_i = B_i; then compute the lane boundary curve confidence F(B) of each particle from the edge point distances and edge directions, compare the confidences of all particles, and obtain the historical best position G of the parameter vector swarm as the position of maximum confidence. Specifically, the confidence F(B) is computed as

$$F(B) = \exp\!\left(-\frac{[(b_{1R}-b_{1L})-\mu_W]^2}{2\sigma_W^2}\right)\sum_{(c,r)\in U}\exp\!\left(-\frac{D^2(c,r)}{2\sigma_D^2}\right)\exp\!\left(-\frac{\varphi^2(c,r)}{2\sigma_\varphi^2}\right)$$

where D(c, r) is the distance from edge point (c, r) to the lane boundary curve; φ(c, r) is the angle between the edge direction at (c, r) and the lane boundary curve; U is the neighborhood of the lane boundary curve, whose radius has value range [2, 30] (specifically, the radius of U is 15); μ_W and σ_W² are the mean and variance of (b_{1R} − b_{1L}); and σ_D² and σ_φ² are the variances of D(c, r) and φ(c, r), with value ranges determined by the lane width W_lane, the physical pixel sizes d_x and d_y in the horizontal and vertical directions, and the height h_c of the on-board camera above the ground; specifically, μ_W = 1.3, and fixed values within those ranges are taken for the variances;
The principle of the above computation of the lane boundary curve confidence F(B) is as follows. According to highway engineering technical standards, the width of a structured lane is essentially constant and can serve as a constraint for lane detection; since observations of the lane width follow a normal distribution, the first factor on the right-hand side of the confidence formula can be constructed. For an edge point (c, r) in the image, whether it belongs to the lane boundary curve depends on two attributes, its distance D(c, r) to the lane marking and its edge direction θ(c, r), so the probability that the edge point belongs to the lane marking is proportional to

$$\exp\!\left(-\frac{D^2(c,r)}{2\sigma_D^2}\right)\exp\!\left(-\frac{\varphi^2(c,r)}{2\sigma_\varphi^2}\right)$$

Observing the edge points in the neighborhood U of the lane boundary curve yields the summation on the right-hand side of the formula. The lane boundary curve confidence F(B) is therefore proportional to the similarity between the lane boundary curve with parameter vector B and the lane markings in the image.
In the computation of the lane boundary curve confidence F(B), D²(c, r) and φ²(c, r) are as shown in Fig. 2; each edge point is evaluated against the nearer boundary, so that specifically

$$D^2(c,r) = \min\{D_L^2(c,r),\,D_R^2(c,r)\},\qquad \varphi^2(c,r) = \begin{cases}\varphi_L^2(c,r), & D_L(c,r) \le D_R(c,r)\\ \varphi_R^2(c,r), & \text{otherwise}\end{cases}$$

where

$$D_L^2(c,r) = \left[b_{1L}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]^2\cos^2\psi_L$$
$$D_R^2(c,r) = \left[b_{1R}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]^2\cos^2\psi_R$$
$$\varphi_L^2(c,r) = \left[\theta(c,r)-\psi_L\right]^2,\qquad \varphi_R^2(c,r) = \left[\theta(c,r)-\psi_R\right]^2$$

and ψ_L, ψ_R are the normal directions of the lane boundary curves, computed specifically as

$$\psi_L = \arctan\!\left[-b_{1L}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
$$\psi_R = \arctan\!\left[-b_{1R}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
Step 2.3: update each parameter vector particle's velocity V_i and position B_i according to

$$V_i(k+1) = w(k)V_i(k) + c_1 r_1\left[P_i(k) - B_i(k)\right] + c_2 r_2\left[G(k) - B_i(k)\right]$$
$$B_i(k+1) = B_i(k) + V_i(k+1)$$

where k is the search iteration count; V_i(k), B_i(k), and P_i(k) are the velocity, position, and historical best position of the i-th particle at the k-th iteration, and G(k) is the historical best position of the swarm; w(k) is the inertia weight; c_1 and c_2 are constants with value range (0, 4], specifically c_1 = c_2 = 2; r_1 and r_2 are random numbers on the interval (0, 1); and i = 1, 2, …, m.
Step 2.4: check each parameter vector particle's velocity and position; limit particles exceeding the maximum velocity by setting their velocity to the maximum, and return out-of-bounds particles by resetting their positions randomly within the feasible region. The principle is that, in the course of self-cognition and social learning, a particle's behavior may become excessive, its velocity too high or its position outside the feasible region; this reduces the search efficiency of the swarm, so the excessive behavior must be corrected to improve global search efficiency;
Step 2.5: compute the lane boundary curve confidence F(B) of each parameter vector particle from the edge points and edge directions, and update by comparison the historical best position P_i of each particle and the historical best position G of the parameter vector swarm;
Step 2.6: compare the search iteration count k with the maximum iteration count Iter_max; if k < Iter_max, go to step 2.3, otherwise go to step 2.7;
Step 2.7: output the lane boundary curve corresponding to the historical best position of the parameter vector swarm; specifically, draw the lane markings in the detected road image according to the historical best position of the swarm and the lane boundary projection model;
Step 3: compute the plane geometry of the vehicle's lane and the position and deviation angle of the vehicle in the lane from the historical best position G of the parameter vector swarm using the lane structure and vehicle heading parameter formulas. Specifically, according to the correspondence given above between the lane boundary projection model parameters and C_0, C_1, β, d_L, and d_R, compute from the historical best position G the curvature C_0 and curvature rate of change C_1 of the vehicle's lane, the distances d_L and d_R from the vehicle to the left and right lane boundaries, and the deviation angle β:

$$C_0 = \frac{2d_xd_y\cos^3\alpha}{f_c^3 h_c}\left(b_{-1}f_c + \frac{3}{2}b_{-2}d_y\sin 2\alpha\right)$$
$$C_1 = \frac{6b_{-2}d_xd_y^2\cos^5\alpha}{f_c^3 h_c^2}$$
$$\beta = -\frac{d_x\cos\alpha}{f_c^3}\left[(b_0-c_O)f_c^2 + b_{-1}f_cd_y\sin 2\alpha + \frac{3}{4}b_{-2}d_y^2\sin^2 2\alpha\right]$$
$$d_{L,R} = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1L,1R}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$
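Putting the stages together, the overall flow of Fig. 3 can be sketched with the hypothetical helpers introduced above (extract_edges, make_confidence, pso_search, lane_geometry); the calibration numbers here are placeholders, not values from the patent.

```python
import numpy as np

# Placeholder calibration: focal length, pixel sizes, pitch, roll, camera
# height, and image-center coordinates (illustrative values only).
fc, dx, dy = 8e-3, 8e-6, 8e-6
alpha, gamma, h_c, r_O, c_O = 0.02, 0.0, 1.4, 240, 320

img = np.zeros((480, 640))                    # stand-in for a captured road frame
r_H = int(r_O - fc * np.tan(alpha) / dy)      # step 1.1: horizon row
edges, theta = extract_edges(img, r_H)        # step 1: adaptive edge extraction
er, ec = np.nonzero(edges)                    # edge coordinates within the ROI
F = make_confidence(er + r_H, ec, theta[er, ec], r_H)
G = pso_search(F)                             # step 2: PSO over the parameter space
C0, C1, beta, d_L, d_R = lane_geometry(G, fc, dx, dy, alpha, h_c, c_O, gamma)
```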
Figs. 6 to 9 show the lane detection results of the lane boundary detection method of the embodiment under varying environmental conditions. As the results show, the invention not only adapts to various lane alignments, lane markings, and changes in weather and illumination, but also effectively reduces the influence of shadows, glare, other vehicles, road surface cracks, and worn lane markings on lane boundary detection; it locates the lane boundary accurately, and from it the plane geometry of the lane and the position and direction of the vehicle in the lane can be estimated.

Claims (6)

1. A method for detecting the boundary of the lane in which a vehicle is traveling, characterized by comprising the steps of:
Step 1: adaptively extracting edge points from the part of the road image below the horizon according to pixel gradient magnitudes, and computing the edge direction;
Step 2: according to the extracted edge points, the edge directions, and the lane boundary projection model, searching for and locating the lane boundary in the parameter vector space of the lane boundary projection model using particle swarm optimization;
Step 3: computing the plane geometry of the lane in which the vehicle is traveling and the position and deviation angle of the vehicle in the lane from the located lane boundary projection model parameter values using the parameter calculation formulas.
2. The method for detecting the boundary of the lane in which a vehicle is traveling according to claim 1, characterized in that step 1 comprises the following steps:
Step 1.1: determine the horizon position in the image by computing

$$r_H = r_O - \frac{f_c\tan\alpha}{d_y}$$

where r_O is the ordinate of the image center, d_y is the physical size of a pixel in the vertical direction, f_c is the focal length of the on-board camera, and α is the pitch angle of the camera; the horizon (j, r_H), j = 0, 1, …, N, where N is the image width in pixels, divides the image into an upper part and a lower part;
Step 1.2: compute the gradient magnitude of each pixel in the part of the image below the horizon

$$G_m(c,r) = \sqrt{G_x^2(c,r) + G_y^2(c,r)}$$

where (c, r) are the coordinates of a pixel below the horizon, and G_x(c, r) and G_y(c, r) are the gradient magnitudes in the horizontal and vertical directions, computed respectively as

$$G_x(c,r) = f(c+1,r-1) + 2f(c+1,r) + f(c+1,r+1) - f(c-1,r-1) - 2f(c-1,r) - f(c-1,r+1)$$
$$G_y(c,r) = f(c-1,r+1) + 2f(c,r+1) + f(c+1,r+1) - f(c-1,r-1) - 2f(c,r-1) - f(c+1,r-1)$$

where f(c, r) is the pixel value at point (c, r) below the horizon;
Step 1.3: compute the edge extraction threshold

$$G_{mth} = \frac{w_G}{(M - r_H + 1)\times N}\sum_{r=r_H}^{M}\sum_{c=1}^{N} G_m(c,r)$$

where M and N are the image height and width in pixels, and w_G is a coefficient with value range 0.1 ≤ w_G ≤ 1.5;
Step 1.4: compare the gradient magnitude of each pixel below the horizon with the edge threshold; if the gradient magnitude is greater than the edge threshold, the pixel is an edge point and its edge direction is computed as θ(c, r) = arctan[G_y(c, r)/G_x(c, r)].
3. The method for detecting the boundary of the lane in which a vehicle is traveling according to claim 1, characterized in that step 2 comprises the following steps:
Step 2.1: the left and right lane boundary projection models are respectively

$$b_{1L}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$
$$b_{1R}(r-r_H) + b_0 + b_{-1}(r-r_H)^{-1} + b_{-2}(r-r_H)^{-2} - c = 0$$

where c is the pixel abscissa and b_{1L}, b_{1R}, b_0, b_{-1}, and b_{-2} are the parameters of the lane boundary projection curves; define B = (b_{1L}, b_{1R}, b_0, b_{-1}, b_{-2})^T as the lane boundary projection model parameter vector, set the feasible region of B as (−5, 0, 0, −2000, −3000)^T < B < (0, 5, 300, 2000, 3000)^T, and set the particle swarm size m, the maximum particle flying velocity V_max = (v_{1Lmax}, v_{1Rmax}, v_{0max}, v_{−1max}, v_{−2max})^T, and the maximum number of search iterations Iter_max, where the value range of V_max is (1, 1, 100, 1500, 2000)^T ≤ V_max ≤ (5, 5, 500, 2500, 3000)^T, the range of m is 10 ≤ m ≤ 60, and the range of Iter_max is 30 ≤ Iter_max ≤ 100;
Step 2.2: randomly initialize each parameter vector particle's position B_i and velocity V_i within the feasible region of B, where i is the particle index; set the historical best position of each particle P_i = B_i; then compute the lane boundary curve confidence F(B) of each particle from the edge points and edge directions, compare the confidences of all particles, and obtain the historical best position G of the swarm as the position of maximum confidence;
Step 2.3: update each parameter vector particle's velocity V_i and position B_i as follows:

$$V_i(k+1) = w(k)V_i(k) + c_1 r_1\left[P_i(k) - B_i(k)\right] + c_2 r_2\left[G(k) - B_i(k)\right]$$
$$B_i(k+1) = B_i(k) + V_i(k+1)$$

where k is the search iteration count; V_i(k), B_i(k), and P_i(k) are the velocity, position, and historical best position of the i-th parameter vector particle at the k-th iteration, and G(k) is the historical best position of the swarm; w(k) is the inertia weight; c_1 and c_2 are constants with value range (0, 4]; r_1 and r_2 are random numbers on the interval (0, 1); and i = 1, 2, …, m;
Step 2.4: check each parameter vector particle's velocity and position; limit particles exceeding the maximum velocity by setting their velocity to the maximum, and return out-of-bounds particles by resetting their positions randomly within the feasible region;
Step 2.5: compute the lane boundary curve confidence F(B) of each parameter vector particle from the edge points and edge directions, and update by comparison the historical best position P_i of each particle and the historical best position G of the swarm;
Step 2.6: compare the search iteration count k with the maximum iteration count Iter_max; if k < Iter_max, go to step 2.3, otherwise go to step 2.7;
Step 2.7: output the lane boundary curve corresponding to the historical best position G of the swarm.
4. The method for detecting the boundary of the lane in which a vehicle is traveling according to claim 1, characterized in that the lane boundary curve confidence F(B) is computed as

$$F(B) = \exp\!\left(-\frac{[(b_{1R}-b_{1L})-\mu_W]^2}{2\sigma_W^2}\right)\sum_{(c,r)\in U}\exp\!\left(-\frac{D^2(c,r)}{2\sigma_D^2}\right)\exp\!\left(-\frac{\varphi^2(c,r)}{2\sigma_\varphi^2}\right)$$

where D(c, r) is the distance from edge point (c, r) to the lane boundary curve, φ(c, r) is the angle between the edge direction at (c, r) and the lane boundary curve, and U is the neighborhood of the lane boundary curve, whose radius has value range [2, 30]; μ_W and σ_W² are the mean and variance of (b_{1R} − b_{1L}), and σ_D² and σ_φ² are the variances of D(c, r) and φ(c, r), with value ranges determined by the lane width W_lane, the physical pixel sizes d_x and d_y in the horizontal and vertical directions, and the height h_c of the on-board camera above the ground.
5. The method for detecting the boundary of the lane in which a vehicle is traveling according to claim 1, characterized in that, in the computation of the lane boundary curve confidence F(B), the distance D(c, r) from edge point (c, r) to the lane boundary curve and the angle φ(c, r) between the edge direction at (c, r) and the lane boundary curve are computed from the nearer of the left and right boundaries:

$$D(c,r) = \min\{D_L(c,r),\,D_R(c,r)\},\qquad \varphi(c,r) = \begin{cases}\theta(c,r)-\psi_L, & D_L(c,r) \le D_R(c,r)\\ \theta(c,r)-\psi_R, & \text{otherwise}\end{cases}$$

where

$$D_L(c,r) = \left|\left[b_{1L}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]\cos\psi_L\right|$$
$$D_R(c,r) = \left|\left[b_{1R}(r-r_H)+b_0+b_{-1}(r-r_H)^{-1}+b_{-2}(r-r_H)^{-2}-c\right]\cos\psi_R\right|$$

and

$$\psi_L = \arctan\!\left[-b_{1L}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
$$\psi_R = \arctan\!\left[-b_{1R}+b_{-1}(r-r_H)^{-2}+2b_{-2}(r-r_H)^{-3}\right]$$
6. The method for detecting the boundary of the lane in which a vehicle is traveling according to claim 1, characterized in that step 3 comprises the following steps:
From the historical best position G of the parameter vector swarm, compute the curvature C_0 of the vehicle's lane and the rate of change C_1 of the lane curvature respectively as:

$$C_0 = \frac{2d_xd_y\cos^3\alpha}{f_c^3 h_c}\left(b_{-1}f_c + \frac{3}{2}b_{-2}d_y\sin 2\alpha\right)$$
$$C_1 = \frac{6b_{-2}d_xd_y^2\cos^5\alpha}{f_c^3 h_c^2}$$

and, from the historical best position G of the parameter vector swarm, compute the deviation angle β of the vehicle in the lane and the distances d_L and d_R from the vehicle to the left and right lane boundary lines respectively as:

$$\beta = -\frac{d_x\cos\alpha}{f_c^3}\left[(b_0-c_O)f_c^2 + b_{-1}f_cd_y\sin 2\alpha + \frac{3}{4}b_{-2}d_y^2\sin^2 2\alpha\right]$$
$$d_L = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1L}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$
$$d_R = -\gamma h_c + \frac{d_xh_c}{f_c^3 d_y\cos\alpha}\left[b_{1R}f_c^3 + \frac{1}{2}(b_0-c_O)f_c^2 d_y\sin 2\alpha + \frac{1}{4}b_{-1}f_c d_y^2\sin^2 2\alpha + \frac{1}{8}b_{-2}d_y^3\sin^3 2\alpha\right]$$

where γ is the roll angle of the on-board camera.
CN 201110094319 2011-04-14 2011-04-14 Method for detecting boundary of lane where vehicle is Expired - Fee Related CN102184535B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110094319 CN102184535B (en) 2011-04-14 2011-04-14 Method for detecting boundary of lane where vehicle is

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110094319 CN102184535B (en) 2011-04-14 2011-04-14 Method for detecting boundary of lane where vehicle is

Publications (2)

Publication Number Publication Date
CN102184535A true CN102184535A (en) 2011-09-14
CN102184535B CN102184535B (en) 2013-08-14

Family

ID=44570705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110094319 Expired - Fee Related CN102184535B (en) 2011-04-14 2011-04-14 Method for detecting boundary of lane where vehicle is

Country Status (1)

Country Link
CN (1) CN102184535B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101405783A (en) * 2006-03-24 2009-04-08 丰田自动车株式会社 Road division line detector
CN101447019A (en) * 2007-11-29 2009-06-03 爱信艾达株式会社 Image recognition apparatuses, methods and programs
CN101608924A (en) * 2009-05-20 2009-12-23 电子科技大学 A kind of method for detecting lane lines based on gray scale estimation and cascade Hough transform
CN101567086A (en) * 2009-06-03 2009-10-28 北京中星微电子有限公司 Method of lane line detection and equipment thereof
CN101620732A (en) * 2009-07-17 2010-01-06 南京航空航天大学 Visual detection method of road driving line

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103196433A (en) * 2012-01-10 2013-07-10 株式会社博思科 Data analysis device, data analysis method and programme
CN102663744A (en) * 2012-03-22 2012-09-12 杭州电子科技大学 Complex road detection method under gradient point pair constraint
CN103381825B (en) * 2012-05-02 2016-06-01 通用汽车环球科技运作有限责任公司 Use the full speed lane sensing of multiple photographic camera
CN103381825A (en) * 2012-05-02 2013-11-06 通用汽车环球科技运作有限责任公司 Full-speed lane sensing using a plurality of cameras
CN104380343B (en) * 2012-06-01 2017-07-14 株式会社电装 Detect the device and its method in the line of demarcation in track
CN102930543A (en) * 2012-11-01 2013-02-13 南京航空航天大学 Fire monitor jet flow track search method based on particle swarm optimization
CN103854277A (en) * 2012-12-02 2014-06-11 西安元朔科技有限公司 Marrow nucleated cell edge detection algorithm
CN105206107A (en) * 2014-06-24 2015-12-30 丰田自动车株式会社 Lane boundary estimation device and lane boundary estimation method
CN104077756B (en) * 2014-07-16 2017-02-08 中电海康集团有限公司 Direction filtering method based on lane line confidence
CN105741605A (en) * 2014-12-26 2016-07-06 爱信精机株式会社 Parking assisting apparatus
CN105741605B (en) * 2014-12-26 2020-01-24 爱信精机株式会社 Parking assist apparatus
CN104751151B (en) * 2015-04-28 2017-12-26 苏州安智汽车零部件有限公司 A kind of identification of multilane in real time and tracking
CN104751151A (en) * 2015-04-28 2015-07-01 苏州安智汽车零部件有限公司 Method for identifying and tracing multiple lanes in real time
CN107021103A (en) * 2015-12-16 2017-08-08 丰田自动车株式会社 Information computing device
CN108885831B (en) * 2016-03-24 2020-04-14 日产自动车株式会社 Travel path detection method and travel path detection device
CN108885831A (en) * 2016-03-24 2018-11-23 日产自动车株式会社 Traveling road detection method and traveling road detection device
CN105702049A (en) * 2016-03-29 2016-06-22 成都理工大学 DSP-based emergency lane monitoring system and realizing method thereof
CN107798855A (en) * 2016-09-07 2018-03-13 高德软件有限公司 A kind of lane width computational methods and device
CN107798855B (en) * 2016-09-07 2020-05-08 高德软件有限公司 Lane width calculation method and device
CN107193888A (en) * 2017-05-02 2017-09-22 东南大学 A kind of urban road network model towards track level navigator fix
CN107193888B (en) * 2017-05-02 2019-09-20 东南大学 A kind of urban road network model towards lane grade navigator fix
CN107273935A (en) * 2017-07-09 2017-10-20 北京北昂科技有限公司 A kind of lane markings group technology based on adaptive K Means
CN110998684B (en) * 2017-08-10 2022-02-01 丰田自动车株式会社 Image collection system, image collection method, image collection device, and recording medium
CN110998684A (en) * 2017-08-10 2020-04-10 丰田自动车株式会社 Image collection system, image collection method, image collection device, recording medium, and vehicle communication device
US10373003B2 (en) * 2017-08-22 2019-08-06 TuSimple Deep module and fitting module system and method for motion-based lane detection with multiple sensors
US11874130B2 (en) 2017-08-22 2024-01-16 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
CN108062512A (en) * 2017-11-22 2018-05-22 北京中科慧眼科技有限公司 A kind of method for detecting lane lines and device
CN108052880B (en) * 2017-11-29 2021-09-28 南京大学 Virtual and real lane line detection method for traffic monitoring scene
CN108052880A (en) * 2017-11-29 2018-05-18 南京大学 Traffic monitoring scene actual situation method for detecting lane lines
CN107845264A (en) * 2017-12-06 2018-03-27 西安市交通信息中心 A kind of volume of traffic acquisition system and method based on video monitoring
CN110969837A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Road information fusion system and method for automatic driving vehicle
CN110969837B (en) * 2018-09-30 2022-03-25 毫末智行科技有限公司 Road information fusion system and method for automatic driving vehicle
CN111311902A (en) * 2018-12-12 2020-06-19 阿里巴巴集团控股有限公司 Data processing method, device, equipment and machine readable medium
CN111247525A (en) * 2019-01-14 2020-06-05 深圳市大疆创新科技有限公司 Lane detection method and device, lane detection equipment and mobile platform
CN110164179A (en) * 2019-06-26 2019-08-23 湖北亿咖通科技有限公司 The lookup method and device of a kind of parking stall of garage free time
CN111091126A (en) * 2019-12-12 2020-05-01 京东数字科技控股有限公司 Certificate image reflection detection method, device, equipment and storage medium
WO2022001366A1 (en) * 2020-07-03 2022-01-06 华为技术有限公司 Lane line detection method and apparatus

Also Published As

Publication number Publication date
CN102184535B (en) 2013-08-14

Similar Documents

Publication Publication Date Title
CN102184535B (en) Method for detecting boundary of lane where vehicle is
US11940290B2 (en) Virtual stop line mapping and navigation
CN111551958B (en) Mining area unmanned high-precision map manufacturing method
CN109186625B (en) Method and system for accurately positioning intelligent vehicle by using hybrid sampling filtering
CN108802785B (en) Vehicle self-positioning method based on high-precision vector map and monocular vision sensor
US12106574B2 (en) Image segmentation
US20230334776A1 (en) Sensor calibration with environment map
CN101975951B (en) Field environment barrier detection method fusing distance and image information
CN101469991B (en) All-day structured road multi-lane line detection method
CN109017780A (en) A kind of Vehicular intelligent driving control method
CN106199558A (en) Barrier method for quick
CN102509067B (en) Detection method for lane boundary and main vehicle position
CN106842231A (en) A kind of road edge identification and tracking
CN102201054A (en) Method for detecting street lines based on robust statistics
CN104077756A (en) Direction filtering method based on lane line confidence
US20220196829A1 (en) Radar Reference Map Generation
CN105426868A (en) Lane detection method based on adaptive region of interest
CN103794050A (en) Real-time transport vehicle detecting and tracking method
Wang et al. A multi-step curved lane detection algorithm based on hyperbola-pair model
Sun et al. Objects detection with 3-D roadside LiDAR under snowy weather
US20240208547A1 (en) Route planning system and method of self-driving vehicle
US12105192B2 (en) Radar reference map generation
CN110531347A (en) Detection method, device and the computer readable storage medium of laser radar
Wu Data processing algorithms and applications of LiDAR-enhanced connected infrastructure sensing
CN109147328B (en) Traffic flow detection method based on video virtual coil

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130814

Termination date: 20150414

EXPY Termination of patent right or utility model