CN103940434A - Real-time lane line detecting system based on monocular vision and inertial navigation unit - Google Patents


Info

Publication number
CN103940434A
CN103940434A (application CN201410129551.8A)
Authority
CN
China
Prior art keywords
lane
line
module
inertial navigation
navigation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410129551.8A
Other languages
Chinese (zh)
Other versions
CN103940434B (en)
Inventor
薛建儒
崔迪潇
沈雅清
宋晔
张耿
杜少毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201410129551.8A priority Critical patent/CN103940434B/en
Publication of CN103940434A publication Critical patent/CN103940434A/en
Application granted granted Critical
Publication of CN103940434B publication Critical patent/CN103940434B/en
Legal status: Active

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26: Navigation specially adapted for navigation in a road network
    • G01C 21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26: Navigation specially adapted for navigation in a road network
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/36: Input/output arrangements for on-board computers
    • G01C 21/3626: Details of the output of route guidance instructions
    • G01C 21/3658: Lane guidance

Abstract

The invention discloses a real-time lane line detection system based on monocular vision and an inertial navigation unit. The system mainly comprises an off-line calibration and on-line projection transformation module, an inertial navigation unit module, a single-frame lane line detection module, and an inertial-navigation-based multi-frame lane line association and verification module. The system accurately detects multi-lane road markings in urban environments, and adapts to varied road conditions and road types, including illumination changes, occlusion by obstacles, large curvature, mixed solid and dashed lines, and multiple lanes, providing stable, long-range lane line detection and perception in diverse environments. Because it uses low-cost, low-power navigation, image acquisition, and computing hardware, the system is widely applicable to fields such as visual navigation for driverless cars and vision-assisted driving for intelligent vehicles.

Description

Real-time lane detection system based on monocular vision and inertial navigation unit
Technical field
The invention belongs to the fields of computer vision and intelligent transportation, and relates to a lane line detection system for urban environments, in particular a real-time urban multi-lane lane line detection system developed around monocular vision and an inertial navigation unit.
Background technology
An intelligent vehicle (Intelligent Vehicle, IV) is an integrated system combining environment perception, dynamic decision-making and planning, and intelligent control and execution, and is an important benchmark of a country's overall research strength and industrial level. Environment perception is one of the three key technologies of intelligent vehicles. At present it relies mainly on active sensors such as laser, radar, and structured light, which have achieved some successful applications, but their high power consumption, large size, and high cost restrict their adoption in intelligent vehicle research and products. Passive visible-light sensing with cameras, by contrast, has significant advantages in power consumption, size, and cost. In recent years, many research teams and institutions have made fruitful progress in traffic scene perception using visible-light sensing, and vision-based lane and obstacle perception has become a research hotspot in this field.
Conventional camera-based lane detection systems, because they rely solely on image information, cannot cope with abrupt illumination changes, close-range occlusion by obstacles, or lane changes: in these situations the camera cannot capture images containing sufficient lane information, leading to missed and false detections. Such systems therefore struggle to run stably over long periods and long distances and are practical only in restricted environments. How to design and realize a system that fully exploits image information, adapts to external environment changes and vehicle attitude changes, runs stably over long periods and distances, and at the same time meets the low-cost, low-power, highly portable demands of driverless driving and driver assistance has thus become a research focus in real-time lane detection.
Summary of the invention
The object of the present invention is to provide a real-time lane line detection system based on monocular vision and an inertial navigation unit.
To achieve this goal, the present invention adopts the following technical scheme:
A real-time urban multi-lane lane line detection system based on monocular vision and an inertial navigation unit, characterized in that it comprises an off-line calibration and on-line projection transformation module M1, an inertial navigation unit module M2, a single-frame lane line detection module M3, and a multi-frame lane line association and verification module M4. Module M1 acquires the image signal and performs a homography projection onto the ground plane ahead of the vehicle, yielding a top view of the scene in front of the vehicle. Module M2 measures the vehicle's attitude change relative to its initial position. Module M3 performs lane line detection within a single image frame. Module M4 associates the current frame with historical lane line data, verifies the current detection result, provides prediction, and outputs accurate multi-lane detection results for the current instant as both a visualization and parametric description equations.
The off-line calibration and on-line projection transformation module M1 comprises an off-line calibration module M11 for the camera intrinsic and extrinsic parameters and a planar projection transformation module M12. Module M11 uses marked feature points to estimate the camera's intrinsic parameters and its extrinsic position relative to the vehicle body coordinate frame. Module M12 acquires RGB images on line and, using the computed intrinsic and extrinsic parameters, transforms each image into a top view via inverse perspective mapping (IPM), then converts the color space to obtain a gray-scale image.
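As a minimal sketch of the on-line projection step, the code below applies a 3x3 ground-plane homography to one pixel; the matrix itself is a placeholder that, in the system, would come from M11's intrinsic/extrinsic calibration, and the function name is ours, not the patent's.

```python
import numpy as np

def ipm_point(H, u, v):
    """Project one image pixel (u, v) onto the ground plane via a 3x3
    homography H, as inverse perspective mapping does pixel by pixel."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # de-homogenize
```

Warping the whole image amounts to evaluating this map (or its inverse) at every output pixel and resampling.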
The inertial navigation unit module M2 reads the vehicle's speed pulses and steering angle at the current instant from a speed sensor and a fiber-optic gyroscope, fuses the speed and heading by filtering, and applies dead reckoning to estimate the vehicle pose in the local coordinate frame defined by the inertial navigation unit (INU), providing a high-precision, smooth, and continuous local pose estimate.
The single-frame lane line detection module M3 extracts candidate lane line pixels based on the basic assumption that lane line pixels are brighter than the surrounding road surface. It then uses the parallelism between multiple lane lines, the association between consecutive images, and the region of interest supplied by historical detection results to reject non-lane-line responses and complete lane detection within the single frame.
The multi-frame lane line association and verification module M4 uses the vehicle pose measurements of the previous and current instants, together with the corresponding lane detection results, to judge whether the two frames' results satisfy, in their spatial relationship, the transformation described by the two pose measurements. Concretely, the historical detection result is transformed to the current instant according to the pose change between the two frames, yielding its parametric form in the current body frame, and is then matched against the current frame's detection result. If they match, the current result is accepted; otherwise the current result is discarded and the transformed historical result is used as the detection for the current instant. This final result is also used to generate the predicted region of interest for the next instant. Meanwhile, using the multi-frame detection results and the three-lane semantic model, the module enriches the semantic information of the result, providing attributes such as the solid/dashed type of each lane line and the lane position relative to the vehicle.
Through these four modules, the real-time lane detection system based on monocular vision and an inertial navigation unit detects road lane lines in urban environments, uses vehicle pose information to associate and verify detection results across consecutive frames, enriches the semantic description of the lanes, and supplies the region of interest for the next instant, achieving continuous and stable multi-lane detection.
The system mainly comprises the off-line calibration and on-line projection transformation module, the inertial navigation unit module, the single-frame lane detection module, and the INU-based multi-frame lane line association and verification module, and performs accurate real-time detection of multi-lane road markings in urban environments. The system adapts to varied road conditions and road types such as illumination changes, occlusion by obstacles, large curvature, mixed solid and dashed lines, and multiple lanes, achieving stable, long-range lane line perception under diverse environments. The hardware comprises the following parts: the visible-light sensor is a Basler Scout scA640-74gc Ethernet digital camera fitted with a Computar 8 mm fixed-focal-length manual-iris megapixel lens; the inertial navigation unit consists of a fiber-optic gyroscope and a speed sensor; system computation runs on a vehicle-mounted cPCI-6910 blade server; and all modules communicate over gigabit Ethernet using UDP datagrams. The system outputs, at a stable 10 Hz, detection results for three lanes (left lane, current lane, right lane) within a range of at least 50 m ahead of the camera, together with the solid/dashed attribute of each lane line, satisfying the system's multi-lane and real-time detection performance requirements. The system is inexpensive, low-power, and highly portable, and is suitable for volume application.
Working principle of the lane detection system: the off-line calibration and on-line projection transformation module calibrates the camera's extrinsic parameters with the three-line calibration method and its intrinsic parameters with the University of California calibration toolbox; road images are captured in real time by the Basler Scout scA640-74gc digital camera and transformed by inverse perspective mapping (IPM) according to the intrinsic and extrinsic parameters, yielding a top view of the road surface. The inertial navigation unit module acquires the fiber-optic gyroscope's angle measurements and the speed sensor's pulse measurements in real time and, using a filtering fusion algorithm and dead reckoning, estimates the current vehicle attitude and position relative to the system's initial instant. The single-frame lane detection module exploits the locally bright appearance of lane lines and a parallelism constraint, together with intra-frame association in image space, to obtain the current frame's lane detection result. The multi-frame association and verification module receives the single-frame results and the consecutive vehicle pose measurements from the INU module, uses the pose measurements corresponding to the historical and current frames to cross-check the two frames' detections, produces the verified final detection result, and uses that result to generate the predicted region for the next instant.
A feature of the present invention is a low-cost, high-precision lane detection system built from a Basler industrial camera, a fiber-optic gyroscope, and a speed sensor, realizing multi-lane detection in urban environments.
Compared with the prior art, the technical effects of the present invention are:
1. Compared with traditional lane detection systems, the present invention makes full use of both image data and vehicle pose data: temporal alignment of pose and image data enables spatial alignment of lane lines across frames, which solves stable detection in extreme situations such as high-speed turning and abrupt illumination changes, where lane lines are briefly lost or unobservable, and greatly improves the system's adaptability.
2. The present invention develops a low-cost, high-precision local vehicle pose measurement system that uses filtering fusion and dead reckoning to reject noisy observations and reduce accumulated error, providing continuous and smooth local vehicle localization.
3. The present invention is built from a Basler industrial camera, a speed sensor, and a fiber-optic gyroscope, and is therefore low-cost.
Brief description of the drawings
Fig. 1 is the hardware connection diagram of the system.
Fig. 2 is the software framework diagram of the system.
Fig. 3 illustrates the three-lane model defined by the invention.
Fig. 4 is the framework of the INU-based local pose measurement system.
Fig. 5 is a schematic of the vehicle model and dead reckoning defined by the invention.
Fig. 6 compares the accuracy of the INU-based local pose measurement system with that of differential GPS.
Fig. 7 shows the relation between the local coordinate frame and the vehicle body frame defined by the invention.
Fig. 8 is the block diagram of the linear-interpolation pose alignment algorithm adopted by the invention.
Fig. 9 is the system framework of the single-frame lane detection module and the multi-frame lane line association and verification module.
Fig. 10 shows the system's lane detection results and the corresponding regions of interest.
Fig. 11 shows intermediate results of single-frame lane detection and multi-frame association and verification.
Fig. 12 explains the lane line parallelism screening rule used in single-frame detection.
Fig. 13 explains the region-of-interest expansion rule.
Detailed description of the embodiments
The present invention is described in detail below with reference to the drawings.
Referring to Fig. 1, the overall hardware connections of the real-time urban multi-lane sensing system based on monocular vision and an inertial navigation unit, as installed on the "Kuafu" intelligent vehicle of Xi'an Jiaotong University, are as shown. The system consists of the following subsystems: 1. Power system: a Yamaha generator provides 220 V AC power, relayed through a UPS; apart from sensors that require their own power modules to complete voltage conversion, every device has its own matching regulator. 2. Sensor system: the monocular camera is a German-made Basler digital camera, model scA640-74gc; the lens is a Japanese-made Computar megapixel fixed-focal-length lens, model M0814-MP; the inertial navigation unit consists of the vehicle speed sensor and a DSP3000 fiber-optic gyroscope. 3. Communication system: the system's inputs are image data, synchronized timestamp data (sent by the planning blade), and synchronized vehicle pose data (sent by the pose blade); its output is the image processing result. Except for image data, all data are connectionless UDP datagrams. To prevent the bulk image traffic from blocking the overall communication links, image data use a dedicated link switched through a D-Link switch, with the Basler camera and its blade server connected on their own network segment; the speed sensor and fiber-optic gyroscope data are sent as UDP packets over serial links to the pose blade server; all other UDP traffic passes through a Huawei S1024R-CN 24-port switch. 4. Computing platform: all computation runs on cPCI-6910 blade servers with dual Core/Xeon-class processors.
The entire system is mounted on a Toyota Highlander SUV prototype vehicle. The monocular camera (scA640-74gc) is mounted on the roof above the cab and fitted with a UV filter and a rain cover; the speed sensor is the Highlander's factory-installed sensor; the DSP3000 fiber-optic gyroscope is mounted in the middle of the front passenger seating area; and the cPCI-6910 blade server is mounted in the integrated cabinet behind the driver's seat.
Referring to Fig. 2, the overall software framework comprises the four main functional modules described above. The off-line calibration and on-line projection transformation module M1 comprises the off-line calibration part M11 for the camera intrinsic and extrinsic parameters, and the planar projection transformation part M12. The off-line calibration part uses marked feature points to estimate the camera's intrinsic parameters, and the three-line calibration method to estimate the camera's extrinsic position relative to the vehicle body frame. The on-line projection part acquires RGB images and, using the computed intrinsic and extrinsic parameters, transforms each image into a top view via inverse perspective mapping (IPM), then converts the color space to obtain a gray-scale image. The inertial navigation unit module M2 takes the vehicle speed pulses and steering angle measured at the current instant by the speed sensor and fiber-optic gyroscope, estimates the vehicle pose in the local frame defined by the inertial navigation unit (INU) through filtering fusion, and combines the local pose with the global pose in GPS coordinates via the ICP (iterative closest point) algorithm, providing a high-precision, smooth, and continuous pose estimate. The single-frame lane detection module M3 extracts candidate lane line pixels under the basic assumption that lane line pixels are brighter than the road surface, then uses the parallelism between lane lines, the association between consecutive images, and the region of interest from historical results to reject non-lane-line responses and complete single-frame detection. The multi-frame lane line association and verification module M4 uses the pose measurements of the previous and current instants and the corresponding detection results to judge whether the two frames' results satisfy the spatial transformation described by the pose change: the historical result is transformed to the current instant according to the two-frame pose change, expressed parametrically in the current body frame, and matched with the current detection; if they match, the current result is accepted, otherwise it is discarded and the transformed historical result is used instead, and the final result generates the predicted region of interest for the next instant. Meanwhile, using the multi-frame results and the three-lane semantic model, the module refines the semantic information of the result, providing attributes such as the solid/dashed type of each lane line and the lane position relative to the vehicle.
Referring to Fig. 3, to simplify lane detection the present invention relies on the following basic assumptions: 1) Flat-road assumption: this guarantees the feasibility of IPM, and reduces the INU's 6D measurement (x, y, z, θ, α, β) to a 3D measurement (x, y, θ), where (x, y, z) is the vehicle position in the world frame defined by GPS, θ is the heading (yaw) angle, α the pitch angle, and β the roll angle. 2) Any two lane lines remain essentially parallel, and lane width changes slowly. 3) The INU module provides accuracy sufficient for lane detection. 4) The lane layout follows the assumed three-lane model: L1 denotes the left line of the lane currently occupied by the vehicle, R1 its right line, with L2 and R2 the next lines outward by analogy; every lane line also carries a solid/dashed attribute.
Referring to Fig. 4, the pose measurement system is organized as shown. The INU-based local pose measurement system is designed around dead reckoning, using real-time speed and heading to propagate the carrier's local pose. Before dead reckoning, the raw speed output of the odometer and the raw heading output of the gyroscope are preprocessed to remove measurement errors in speed and heading, improving the accuracy of the reckoning.
Referring to Fig. 5, the principle of dead reckoning is as shown. Dead reckoning uses the speed and attitude information provided by on-board sensors to compute the current position from the carrier's position at the previous instant. It relies entirely on the outputs of the carrier's internal sensors and is therefore a fully self-contained local positioning system. Assume the vehicle moves in a 2D plane, and that the speed sensor reads the front-wheel rotational speed. The body-frame origin is placed at the center of the rear axle, so that during a turn the rear-wheel speed is the component of the front-wheel speed along the body axis. The front-wheel steering angle δ can be read from the sensor. Taking the steering angle into account, the speed output decomposed in the body frame is:
$$v_D^m = \begin{bmatrix} v_D \sin\delta & v_D \cos\delta \end{bmatrix}^T$$
The real-time vehicle position is obtained by cumulative integration, averaging the speed-sensor readings of consecutive frames together with the real-time gyro heading. Over the interval [k-1, k], the relative displacement is

$$\Delta D_k = \Delta T_{[k-1,k]}\,\big(v_{k-1}\cos\alpha_{k-1} + v_k\cos\alpha_k\big)/2$$

and the heading change is $\Delta\theta_k = \theta_k - \theta_{k-1}$, where $\Delta T_{[k-1,k]}$ is the time difference between the two frames, the heading $\theta$ is provided in real time by the gyroscope, $v_k$ and $v_{k-1}$ by the speed sensor, and the front-wheel slip angles $\alpha_{k-1}$, $\alpha_k$ by the slip-angle sensor. The dead reckoning formula is:
$$X_{k/k-1} = f(X_{k-1}, u_k) = \begin{bmatrix} x_{k-1} + \Delta D_k \cos\!\big(\tfrac{\theta_{k-1}+\theta_k}{2}\big) \\ y_{k-1} + \Delta D_k \sin\!\big(\tfrac{\theta_{k-1}+\theta_k}{2}\big) \\ \theta_k \end{bmatrix}$$
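The dead-reckoning update above can be sketched as follows (a minimal illustration under the flat-road model; variable names are ours, not the patent's):

```python
import math

def dead_reckon_step(x, y, theta_prev, theta_k, v_prev, v_k,
                     alpha_prev, alpha_k, dt):
    """One dead-reckoning update: the displacement is the time step times the
    average body-axis speed of the two frames, integrated along the mean
    heading; the new heading is taken directly from the gyro."""
    d_k = dt * (v_prev * math.cos(alpha_prev) + v_k * math.cos(alpha_k)) / 2.0
    mean_heading = (theta_prev + theta_k) / 2.0
    return (x + d_k * math.cos(mean_heading),
            y + d_k * math.sin(mean_heading),
            theta_k)
```

For example, driving straight (θ = 0) at 10 m/s with zero slip for 0.1 s advances the pose by 1 m along x.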
During dead reckoning, the accuracy of the odometer speed directly affects the computed position. Vehicle speed is provided by the on-board odometer, calibrated against differential GPS. Affected by wheel slip, uneven road surfaces, and the precision of the calibration factor, speed obtained from the odometer alone often cannot meet the accuracy requirements of dead reckoning. The vehicle-mounted MTI silicon micro-inertial measurement unit provides real-time horizontal acceleration; using the MTI acceleration output together with the odometer, a filtering model is built to estimate the real-time speed, which effectively reduces the speed error of odometer-only measurement at low speed and on rough roads and improves the reckoning accuracy. The model's state vector and state equation are:
$$X_k = \begin{bmatrix} v_k & a_k \end{bmatrix}^T$$
$$X_{k+1} = A X_k + W_k$$
where:
v - vehicle speed;
a - vehicle acceleration;
A - state-transition matrix;
W_k - process noise.
Let the sampling period of the system be t. The constant-acceleration model gives the state-transition matrix:
$$A = \begin{bmatrix} 1 & t \\ 0 & 1 \end{bmatrix}$$
Under the constant-acceleration model, the acceleration is constant within one sampling period, and its change over the period is treated as process noise. If the maximum acceleration change within one period is σ, the process noise $W_k$ can be expressed as:
$$W_k = \begin{bmatrix} t \\ 1 \end{bmatrix} \sigma$$
From the above, the process covariance matrix is:
$$Q = E\!\left(W_k W_k^T\right) = \begin{bmatrix} t^2 & t \\ t & 1 \end{bmatrix} \sigma^2$$
The odometer speed and the horizontal acceleration output by the MTI accelerometer serve as the measurement, i.e.:
$$Z_k = H X_k + V_k$$
where $Z_k = \begin{bmatrix} \tilde v_k & \tilde a_k \end{bmatrix}^T$ is the measurement vector of speed and acceleration, $H = I_2$ is the second-order identity measurement matrix, and $V_k = \begin{bmatrix} v_v & v_a \end{bmatrix}^T$ is the measurement noise of speed and acceleration.
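One predict/update cycle of this constant-acceleration speed filter could be sketched as below. This is a hedged illustration: the noise magnitudes `sigma`, `r_v`, and `r_a` are placeholders, not values from the patent.

```python
import numpy as np

def speed_filter_step(x, P, z, t, sigma, r_v, r_a):
    """Kalman predict/update for the state [v, a] with the constant-
    acceleration model: A = [[1, t], [0, 1]], Q = [[t^2, t], [t, 1]]*sigma^2,
    and direct measurement of both components (H = I)."""
    A = np.array([[1.0, t], [0.0, 1.0]])
    Q = np.array([[t * t, t], [t, 1.0]]) * sigma ** 2
    H = np.eye(2)
    R = np.diag([r_v, r_a])                 # odometer / accelerometer noise
    x_pred = A @ x                          # predict
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)   # update with [v_meas, a_meas]
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new
```

When the measurement agrees exactly with the prediction, the state estimate is unchanged by the update, whatever the gain.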
Referring to Fig. 6, the pose measurement accuracy of the inertial navigation unit module is compared with that of differential GPS. The vehicle localization result is smooth and stable, with a maximum measurement error of 3 m over a 100 m driving distance; this accuracy is sufficient for the multi-frame lane detection module's correction of occasional lane line false detections.
Referring to Fig. 7, the change in how the same scene is observed while the vehicle drives is as shown. Suppose a set of points exists in the traffic scene, of which the camera captures part or all during autonomous driving. Because the vehicle itself moves, the observations of these points from the body-mounted camera change. The local coordinate frame is defined by the INU; as the vehicle moves from V to V', two local frames are defined at these two different spatial positions. The motion between them can be represented by a translation T and a rotation matrix R, and two observed lane line points are expressed as P (P') and Q (Q') in the two frames V and V'. The corresponding change can be expressed as
P′=R*(P-T)
Q′=R*(Q-T)
where R is a 2x2 rotation matrix and T a 2-dimensional translation vector; both R and T are obtained directly from the INU measurements.
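The transform P' = R·(P - T) maps directly to code. In this sketch we assume a planar pose change (dx, dy, dθ) from the INU and take R as the rotation by -dθ, so that points end up expressed in the rotated frame; the sign convention is our assumption, not stated in the patent.

```python
import math

def to_new_frame(points, dx, dy, dtheta):
    """Express 2-D points given in the old body frame V in the new frame V',
    where V' is translated by (dx, dy) and rotated by dtheta relative to V:
    P' = R * (P - T)."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    out = []
    for px, py in points:
        qx, qy = px - dx, py - dy        # remove the translation T
        out.append((c * qx + s * qy,     # rotate into the new frame
                    -s * qx + c * qy))
    return out
```

For example, a lane point 10 m ahead, after the vehicle advances 1 m straight, appears 9 m ahead in the new frame.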
Referring to Fig. 8, in the real Kuafu system the INU and the visual lane perception run at different frequencies. The system uses linear interpolation to synchronize the asynchronous sensors in time. Suppose the inertial navigation unit produces pose measurements $p_0$ and $p_1$ at times $t_0$ and $t_1$, and the image is captured at time $t'_1$; the interpolated pose at that instant is:

$$p'_1 = p_1 + (p_1 - p_0) \times (t'_1 - t_1)/(t_1 - t_0)$$

where $p'_1 = (x'_1, y'_1, \theta'_1)$. Similarly, the vehicle pose corresponding to the image at time $t'_2$ is obtained by the same formula from the pose measurements $p_1$ and $p_2$ at times $t_1$ and $t_2$.
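The interpolation formula maps directly to code (a small sketch; note that a real implementation would also need to handle angle wraparound for the heading component, which is omitted here):

```python
def interp_pose(p0, p1, t0, t1, t_img):
    """Linear interpolation/extrapolation of an INU pose (x, y, theta) to the
    image timestamp: p' = p1 + (p1 - p0) * (t_img - t1) / (t1 - t0)."""
    k = (t_img - t1) / (t1 - t0)
    return tuple(b + (b - a) * k for a, b in zip(p0, p1))
```

When the image timestamp coincides with an INU sample, the interpolated pose reduces to that sample.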
Referring to Fig. 9, the algorithm flow of the single-frame lane detection module M3 and the multi-frame lane line association and verification module M4 is shown; together these two parts form the complete lane detection function. The algorithm has three stages. In the first stage, a horizontal filter detects all possible lane line candidates in the IPM image. In the second stage, false feature responses are rejected using parallelism and lane-width constraints with adaptive thresholds. The third stage is verification: the historical detection result, after coordinate transformation, is checked for a match against the current result, and the semantic information of the lane lines is completed. The first and second stages belong to module M3; the third belongs to module M4.
Referring to Figure 10, detection results of the system for urban multi-lane roads are shown. The left side of the figure is the original image captured by the monocular camera, with the actual detection results marked; L1, R1, L2, and R2 have the meanings defined in Fig. 3, the label 'Solid' indicates a solid lane line, and 'Dash' a dashed one. The right side shows the current frame's region of interest, obtained by transforming the historical detection results according to the corresponding pose.
Referring to Figure 11, which shows the intermediate results of the main processing steps of this system's lane detection function: a) feature extraction; b) feature enhancement and binarization; c) piecewise line fitting; d) polyline connection; e) multi-frame lane-line association and verification; f) final detection result and semantic completion.
Referring to Figure 11(a), lane-line feature extraction rests on the following assumption: the gray value of a lane line is higher than that of the pixels on either side of it. If a pixel's gray value is higher than those of the pixels one lane-marking width to its left and right, the pixel is considered a possible lane-line pixel, and the sum of its gray differences with the two side pixels is taken as the response at that point; if a pixel does not satisfy this condition, its response is set to 0.
r(x, y) = d+m(x, y) + d-m(x, y), if d+m(x, y) > 0 and d-m(x, y) > 0; otherwise r(x, y) = 0
where
d+m(x, y) = b(x, y) - b(x, y + m)
d-m(x, y) = b(x, y) - b(x, y - m)
Here r(x, y) is the lane-line feature map being computed; d+m(x, y) is the feature obtained with the lane-line template shifted right and d-m(x, y) with it shifted left; b(x, y) is the original input gray image, and b(x, y ± m) is that image translated right or left by m pixels.
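The feature response described above can be sketched as follows in Python with NumPy; the marking width m in pixels and the border handling (zeroing the m outermost columns, where the shifted comparison is undefined) are illustrative choices:

```python
import numpy as np

def lane_feature(gray, m):
    """Feature response r(x, y): a pixel brighter than the pixels m columns
    to its left and right accumulates both gray differences d+m and d-m;
    otherwise the response is 0."""
    b = gray.astype(np.int32)
    d_plus = b - np.roll(b, -m, axis=1)   # b(x, y) - b(x, y + m)
    d_minus = b - np.roll(b, m, axis=1)   # b(x, y) - b(x, y - m)
    r = np.where((d_plus > 0) & (d_minus > 0), d_plus + d_minus, 0)
    r[:, :m] = 0    # np.roll wraps around; the m outermost columns
    r[:, -m:] = 0   # have no valid comparison, so zero them
    return r
```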
Referring to Figure 11(b), for each extracted lane-line feature pixel, the gray value of the brightest pixel in its 8-neighborhood replaces the pixel's own gray value; this mitigates lane lines made indistinct by shadows and similar effects. The enhanced image is then split into small blocks, each block is thresholded with the global Otsu method, and the blocks are merged into the final binary map. The horizontal and vertical split counts are controlled by the lane width and the average dashed-segment length.
When historical detection information is available, the following operations are performed: 1. the block splitting is completed exactly as in the no-history case; 2. the historical data are transformed and mapped into the image coordinates of the current frame according to the vehicle pose, generating regions of interest, and the pixels of the enhanced image covered by these regions are binarized again.
Here O_i(x, y) denotes the global Otsu segmentation result and ir(x, y) the Otsu segmentation result within the regions of interest; the two are merged to form the final binary map.
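A sketch of the block-wise Otsu binarization in Python with NumPy; the Otsu threshold is computed from the between-class variance, and the tile counts n_rows and n_cols stand in for the split counts derived from the lane width and dashed-segment length:

```python
import numpy as np

def otsu_threshold(block):
    """Otsu's method: the gray level maximizing between-class variance."""
    hist = np.bincount(block.ravel(), minlength=256).astype(float)
    total = hist.sum()
    if total == 0:
        return 0
    w = np.cumsum(hist)                      # class-0 pixel counts
    mu = np.cumsum(hist * np.arange(256))    # class-0 gray sums
    mu_t = mu[-1]
    valid = (w > 0) & (w < total)
    between = np.zeros(256)
    between[valid] = (mu_t * w[valid] - total * mu[valid]) ** 2 / (
        w[valid] * (total - w[valid]))
    return int(np.argmax(between))

def blockwise_otsu(enh, n_rows, n_cols):
    """Split the enhanced feature map into n_rows x n_cols tiles, apply Otsu
    per tile, and merge the results into one binary map."""
    out = np.zeros_like(enh, dtype=np.uint8)
    h, w = enh.shape
    for i in range(n_rows):
        for j in range(n_cols):
            r0, r1 = i * h // n_rows, (i + 1) * h // n_rows
            c0, c1 = j * w // n_cols, (j + 1) * w // n_cols
            tile = enh[r0:r1, c0:c1]
            out[r0:r1, c0:c1] = (tile > otsu_threshold(tile)).astype(np.uint8)
    return out
```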
Referring to Figure 11(c), connected-component detection is performed on the binary image, and each connected component is fitted with a polyline. After all connected components have been piecewise fitted, polylines are connected according to length and angle constraints. If historical data exist, an associated polyline connection is performed: for each candidate polyline, its overlap ratio with each region of interest is accumulated, and all polylines that satisfy the threshold requirement of the same region of interest are considered to belong to that region. Each polyline is assigned a score vector S = [s_L2, s_L1, s_R1, s_R2], whose components represent the likelihood that the polyline overlaps the corresponding predicted region; 0 means the polyline overlaps no predicted region. For each polyline, if the largest nonzero value in this vector exceeds a threshold, the polyline is judged to coincide with that region; all polylines sharing the same overlap region are connected into a single polyline, forming a lane candidate line.
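The region-of-interest assignment can be sketched as follows; here each predicted region is simplified to a hypothetical axis-aligned box, and the 0.5 overlap threshold is an assumed value, not one given in the text:

```python
import numpy as np

def assign_to_roi(polyline_pts, rois, threshold=0.5):
    """Overlap score vector S for one candidate polyline: the fraction of its
    points falling inside each predicted region of interest. The polyline is
    attached to the region with the largest score if that score exceeds the
    threshold; otherwise it is attached to no region."""
    pts = np.asarray(polyline_pts, dtype=float)
    S = []
    for x0, y0, x1, y1 in rois:
        inside = ((pts[:, 0] >= x0) & (pts[:, 0] <= x1) &
                  (pts[:, 1] >= y0) & (pts[:, 1] <= y1))
        S.append(float(inside.mean()))
    best = int(np.argmax(S))
    return S, (best if S[best] > threshold else None)
```

Polylines returning the same region index would then be connected into one lane candidate line.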
Referring to Figure 11(d), the candidate lane lines are screened and filtered using width and parallelism. Filtering is first done by a length threshold t_l. According to the relevant literature, 98% of the longest candidate lines belong to lane lines. All polylines longer than t_l form a reference group, called the baselines, among which the longest lane line is elected as the first group of lane candidates for the parallelism computation; the parallelism of the other candidate lines with respect to the baselines is then calculated. The computation is shown in Figure 12.
For the i-th baseline m_i, its parallelism with the j-th candidate line c_j is computed as follows:
x_start = max(m_i-start(x), c_j-start(x))
x_end = min(m_i-end(x), c_j-end(x))
w = [ Σ_k (Pm_i(y)_k − Pc_j(y)_k) ] / [ (x_end − x_start) / STEP ], where the sum runs over the sample points k with Pm_i(x)_k = Pc_j(x)_k and x_start ≤ x ≤ x_end
Here m_i-start(x) denotes the starting point of the baseline in the X direction (the XY definition is shown in Figure 12) and m_i-end(x) its end point in the X direction, and similarly for the candidate line. Pm_i(y)_k denotes the Y value of the k-th sample point of the baseline, Pc_j(y)_k the Y value of the k-th sample point of the candidate line, w_k the Y-direction distance deviation of the k-th sample pair, and STEP the spatial discretization interval of the sample points. If the mean of the computed deviations satisfies the following formula, and their mean-square deviation is below a given threshold, the two lines are judged parallel.
n × W − dw < w < n × W + dw
Here w is the detected width of the current candidate lane line relative to the reference lane line; W is the known standard lane width, which according to the "CJJ37-2012 Code for Design of Urban Road Engineering" is 3.25 m to 3.75 m for urban environments; n is the number of lanes after normalization by the standard lane width W; and dw is the error introduced by detection and measurement during real operation, so the current reference lane width is W ± dw. The above computation is repeated for all baselines and candidate lines; the group of candidate lines with high evaluation values is the single-frame detection result under the given preconditions, provided that each candidate's length exceeds a certain threshold (chosen as 8 m in this system, determined by the basic shortest-path-planning requirement of the subsequent planning module).
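A sketch of the parallelism and width checks in Python with NumPy; the sampling step, the lane width W = 3.5 m (the midpoint of the CJJ37-2012 range), the tolerance dw, and max_lanes are illustrative values:

```python
import numpy as np

def mean_offset(ref, cand, step=0.5):
    """Mean and standard deviation of the lateral (Y) deviation between a
    baseline and a candidate over their overlapping X range, sampled every
    `step` metres; lines are lists of (x, y) points with x increasing."""
    x_start = max(ref[0][0], cand[0][0])
    x_end = min(ref[-1][0], cand[-1][0])
    xs = np.arange(x_start, x_end + 1e-9, step)
    y_ref = np.interp(xs, [p[0] for p in ref], [p[1] for p in ref])
    y_cand = np.interp(xs, [p[0] for p in cand], [p[1] for p in cand])
    dev = y_ref - y_cand
    return float(dev.mean()), float(dev.std())

def width_is_valid(w, W=3.5, dw=0.4, max_lanes=3):
    """Lane-width constraint n*W - dw < w < n*W + dw: the offset must sit
    near an integer multiple n of the standard lane width W; returns n,
    or 0 if no consistent lane assignment exists."""
    for n in range(1, max_lanes + 1):
        if n * W - dw < w < n * W + dw:
            return n
    return 0
```

A candidate passing both the low-variance deviation check and the width check would be accepted as parallel to the baseline, n lanes away.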
When historical detection results exist, the selection of baselines differs slightly: on top of the length criterion l, two further criteria are added, the lane-line type c and the fitting precision a. Here l is normalized by the maximum look-ahead distance of 50 m; c is the lane-line type flag (1 for solid, 0 for dashed), solid lines being regarded in this invention as having higher confidence and reliability; and a is the lane-fitting precision between lane lines carrying the same label in the current and historical frames. The line with the highest combined score is elected as the candidate line of the current frame.
Referring to Figure 11(e), when no historical information exists, lane verification is carried out on image information alone: by accumulating statistics over the detection results at different times, a lane candidate line that is detected in several consecutive frames is output. When historical detection results exist and association is performed, the system uses the prediction information provided by the INU for verification. The historical detection results are spatio-temporally aligned with the current detection results through the pose transformation (R, T). Once the lane-line numbering correspondence between the two frames is determined, verification proceeds as follows:
e(a) = (1/N) Σ_i e(a)(i), if e(a)(i) = |SC(a)(i) − (SP(a)(i) × R + T)| ≤ t for every sample i;  e(a) = ∞, otherwise
where a ∈ {L1, R1}, SP(a)(i) denotes the i-th sample point of lane line a in the historical detection result, SC(a)(i) the i-th sample point of lane line a in the current detection result, and e(a) the confidence assessment of the current detection result (the line with the minimum value is chosen as the baseline in the next frame). If the current detection result fails to match the historical result for several consecutive frames, the transformed historical result becomes the output of the current frame. The variable w_t indicates whether the transformed result is still valid:
w_t = (SP_end(x) − l_min − V_x,min) / (V_x,max − V_x,min)
where V_x,max and V_x,min define the field of view in the x direction, and l_min is the shortest allowed detection result. When the value of w_t falls below 1, the system considers the currently transformed detection result no longer accurate enough and rejects it.
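The verification score e(a) can be sketched as follows in Python with NumPy; sample points are treated as 2-D row vectors, and the gate t = 0.3 m is an assumed value:

```python
import numpy as np

def verify(sp, sc, R, T, t=0.3):
    """e(a): mean distance between current samples SC and historical samples
    SP transformed into the current frame by (R, T); infinity as soon as any
    single residual exceeds the gate t."""
    sp, sc = np.asarray(sp, dtype=float), np.asarray(sc, dtype=float)
    pred = sp @ R + T                       # SP(a)(i) x R + T
    resid = np.linalg.norm(sc - pred, axis=1)
    return float(resid.mean()) if np.all(resid <= t) else float('inf')
```

The lane line (L1 or R1) with the smaller score would serve as the baseline in the next frame.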
Referring to Figure 11(f), semantic completion based on the 3-lane model is performed after lane verification of the current frame has finished. Certain attributes of the lane lines, such as solid/dashed type, lane number, and left/right lane-line identity, are essential for rational autonomous driving behavior that observes traffic regulations. The present invention determines the solid/dashed type of a lane line by computing the ratio of dashed to solid segments. At initialization, for safety, every lane line is forced to be solid until it has been continuously detected as dashed; the semantic information of the lane is completed accordingly. From this solid/dashed property and the 3-lane model, the system infers the lane-detection region of interest for the next frame; this process is shown in Figure 13. The region-of-interest expansion rules, taking L1 as an example, are: referring to Figure 13a), when L1 is dashed, the region of interest is extended to both sides; referring to Figure 13b), when L1 is solid, the region of interest is extended only to the right. R1 is handled similarly.
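A sketch of the solid/dashed decision and the L1 region-of-interest expansion rule in Python; the gap-ratio cut-off of 0.5 and the five-frame persistence requirement are assumed values standing in for the "continuously detected as dashed" criterion:

```python
def classify_marking(dash_ratio, history, min_consecutive=5):
    """Solid/dashed semantics: a line is reported dashed only after it has
    been classified dashed (gap ratio above the cut-off) for
    min_consecutive frames; otherwise it is forced to 'solid' for safety."""
    history.append('dash' if dash_ratio > 0.5 else 'solid')
    recent = history[-min_consecutive:]
    if len(recent) == min_consecutive and all(c == 'dash' for c in recent):
        return 'dash'
    return 'solid'

def expand_roi(line_type):
    """ROI expansion rule for L1 from Figure 13: expand to both sides when
    the line is dashed (lane changes possible), to the right only when solid."""
    return ('left', 'right') if line_type == 'dash' else ('right',)
```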
The above is a further detailed description of the present invention in conjunction with specific preferred embodiments, but the specific embodiments of the present invention shall not be deemed limited thereto. For a person of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may be made without departing from the concept of the present invention, and all of these shall be regarded as falling within the scope of patent protection determined by the submitted claims.

Claims (5)

1. A real-time lane detection system based on monocular vision and an inertial navigation unit, characterized in that the real-time lane detection system comprises an off-line calibration and inverse-projection transformation module M1, an inertial navigation unit module M2, a single-frame lane detection module M3, and a multi-frame lane-line association and verification module M4;
the off-line calibration and inverse-projection transformation module M1 is used to acquire the image signal and to perform the homography projection transformation with respect to the ground plane in front of the vehicle, obtaining a top view of the image ahead of the vehicle;
the inertial navigation unit module M2 is used to measure the change of the vehicle body's pose relative to its initial position;
the single-frame lane detection module M3 is used to perform lane detection within a single image frame;
the multi-frame lane-line association and verification module M4 is used to associate the lane-line data of the current frame with those of historical frames, verify the current detection result, provide an accurate multilane detection visualization result and parametric description equations for the current moment, and provide a prediction.
2. The real-time lane detection system based on monocular vision and an inertial navigation unit according to claim 1, characterized in that the off-line calibration and inverse-projection transformation module M1 comprises an off-line calibration module M11 for the camera's intrinsic and extrinsic parameters, and a plane projective transformation module M12; the off-line calibration module M11 uses marked feature points to solve for the camera's intrinsic parameters and its extrinsic position parameters relative to the vehicle-body coordinate frame; the plane projective transformation module M12 acquires RGB images online, uses the computed intrinsic and extrinsic camera parameters to transform each image into a top view, and performs a color-space conversion to obtain a gray image.
3. The real-time lane detection system based on monocular vision and an inertial navigation unit according to claim 1, characterized in that the inertial navigation unit module M2 takes the vehicle's velocity pulses and steering angle at the current moment, measured by the speed sensor and the fiber-optic gyroscope, and applies a dead-reckoning algorithm to the speed, heading and track output by fused filtering, estimating a smooth, continuous local vehicle pose in the local coordinate frame defined by the inertial navigation unit.
4. The real-time lane detection system based on monocular vision and an inertial navigation unit according to claim 1, characterized in that the single-frame lane detection module M3 extracts possible lane-line information based on the basic assumption that lane-line image pixels are brighter than the surrounding road-surface region; it then uses the parallelism between multiple lane lines, the association between multiple image frames, and the regions of interest provided by historical-frame detection results to reject non-lane-line information and complete lane detection within a single image frame.
5. The real-time lane detection system based on monocular vision and an inertial navigation unit according to claim 1, characterized in that the multi-frame lane-line association and verification module M4 uses the vehicle pose measurements perceived at the previous moment and the current moment, together with the corresponding lane detection results, to judge whether the spatial relation between the two frames' results conforms to the change described by the two pose measurements: the historical-frame detection result is transformed to the current moment according to the vehicle pose change between the two frames, yielding its parametric form under the current vehicle pose, and is then matched against the current-frame lane detection result; if they match, the current-frame detection result is judged acceptable; otherwise the current-frame result is discarded and the transformed historical-frame result is used as the detection result for the current moment; this final result is used to generate the region of interest predicted for the next moment; at the same time, the multi-frame detection results and the 3-lane semantic model are used to complete the semantic information of the detection result, providing the solid/dashed type of the lane lines and the relative position of the vehicle body.
CN201410129551.8A 2014-04-01 2014-04-01 Real-time lane detection system based on monocular vision and inertial navigation unit Active CN103940434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410129551.8A CN103940434B (en) 2014-04-01 2014-04-01 Real-time lane detection system based on monocular vision and inertial navigation unit


Publications (2)

Publication Number Publication Date
CN103940434A true CN103940434A (en) 2014-07-23
CN103940434B CN103940434B (en) 2017-12-15

Family

ID=51188203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410129551.8A Active CN103940434B (en) 2014-04-01 2014-04-01 Real-time lane detection system based on monocular vision and inertial navigation unit

Country Status (1)

Country Link
CN (1) CN103940434B (en)

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104567872A (en) * 2014-12-08 2015-04-29 中国农业大学 Extraction method and system of agricultural implements leading line
CN105588563A (en) * 2016-01-15 2016-05-18 武汉光庭科技有限公司 Joint calibration method of binocular camera and inertial navigation unit in automatic driving
CN105741635A (en) * 2016-03-01 2016-07-06 武汉理工大学 Multifunctional road experiment vehicle platform
CN106324790A (en) * 2016-08-12 2017-01-11 中国科学院光电技术研究所 Automatic adjustment method for coupling lens based on monocular vision posture measurement
CN106485917A (en) * 2016-09-28 2017-03-08 上海海积信息科技股份有限公司 A kind of method and apparatus of judgement vehicle to change lane
CN106485663A (en) * 2015-08-26 2017-03-08 腾讯科技(深圳)有限公司 A kind of lane line image enchancing method and system
CN106546238A (en) * 2016-10-26 2017-03-29 北京小鸟看看科技有限公司 Wearable device and the method that user's displacement is determined in wearable device
CN106682586A (en) * 2016-12-03 2017-05-17 北京联合大学 Method for real-time lane line detection based on vision under complex lighting conditions
CN106886217A (en) * 2017-02-24 2017-06-23 安科智慧城市技术(中国)有限公司 Automatic navigation control method and apparatus
CN107133600A (en) * 2017-05-11 2017-09-05 南宁市正祥科技有限公司 A kind of real-time lane line detection method based on intra-frame trunk
CN107229063A (en) * 2017-06-26 2017-10-03 奇瑞汽车股份有限公司 A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to
CN107527506A (en) * 2017-09-20 2017-12-29 上海安道雷光波系统工程有限公司 Embedded radar monitors recombination optics and radar monitoring capturing system and method
CN107646114A (en) * 2015-05-22 2018-01-30 大陆-特韦斯贸易合伙股份公司及两合公司 Method for estimating track
CN108764075A (en) * 2018-05-15 2018-11-06 北京主线科技有限公司 The method of container truck location under gantry crane
CN109491364A (en) * 2018-11-19 2019-03-19 长安大学 A kind of drive robot system and control method for vehicle testing
CN110081880A (en) * 2019-04-12 2019-08-02 同济大学 A kind of sweeper local positioning system and method merging vision, wheel speed and inertial navigation
CN110120081A (en) * 2018-02-07 2019-08-13 北京四维图新科技股份有限公司 A kind of method, apparatus and storage equipment of generation electronic map traffic lane line
CN110160542A (en) * 2018-08-20 2019-08-23 腾讯科技(深圳)有限公司 The localization method and device of lane line, storage medium, electronic device
CN110207714A (en) * 2019-06-28 2019-09-06 广州小鹏汽车科技有限公司 A kind of method, onboard system and the vehicle of determining vehicle pose
CN110221616A (en) * 2019-06-25 2019-09-10 清华大学苏州汽车研究院(吴江) A kind of method, apparatus, equipment and medium that map generates
CN110458892A (en) * 2019-07-17 2019-11-15 北京自动化控制设备研究所 Camera calibration method based on inertial navigation
CN110770741A (en) * 2018-10-31 2020-02-07 深圳市大疆创新科技有限公司 Lane line identification method and device and vehicle
CN110909569A (en) * 2018-09-17 2020-03-24 深圳市优必选科技有限公司 Road condition information identification method and terminal equipment
WO2020098286A1 (en) * 2018-11-13 2020-05-22 广州小鹏汽车科技有限公司 Lane line detection method and device
CN111256693A (en) * 2018-12-03 2020-06-09 北京初速度科技有限公司 Pose change calculation method and vehicle-mounted terminal
CN111316284A (en) * 2019-02-13 2020-06-19 深圳市大疆创新科技有限公司 Lane line detection method, device and system, vehicle and storage medium
CN111307176A (en) * 2020-03-02 2020-06-19 北京航空航天大学青岛研究院 Online calibration method for visual inertial odometer in VR head-mounted display equipment
CN111324616A (en) * 2020-02-07 2020-06-23 北京百度网讯科技有限公司 Method, device and equipment for detecting lane line change information
CN111401446A (en) * 2020-03-16 2020-07-10 重庆长安汽车股份有限公司 Single-sensor and multi-sensor lane line rationality detection method and system and vehicle
CN111611830A (en) * 2019-02-26 2020-09-01 阿里巴巴集团控股有限公司 Lane line identification method, device and system and vehicle
WO2020216315A1 (en) * 2019-04-26 2020-10-29 纵目科技(上海)股份有限公司 Method and system for rapid generation of reference driving route, terminal and storage medium
CN112050821A (en) * 2020-09-11 2020-12-08 湖北亿咖通科技有限公司 Lane line polymerization method
WO2020258901A1 (en) * 2019-06-25 2020-12-30 上海商汤智能科技有限公司 Method and apparatus for processing data of sensor, electronic device, and system
CN112249014A (en) * 2020-10-22 2021-01-22 英博超算(南京)科技有限公司 Vehicle lateral control method, vehicle, and computer-readable storage medium
CN112304291A (en) * 2019-07-26 2021-02-02 厦门雅迅网络股份有限公司 HUD-based lane line display method and computer-readable storage medium
CN112590670A (en) * 2020-12-07 2021-04-02 安徽江淮汽车集团股份有限公司 Three-lane environment display method, device, equipment and storage medium
CN112683292A (en) * 2021-01-07 2021-04-20 阿里巴巴集团控股有限公司 Navigation route determining method and device and related product
CN113932796A (en) * 2021-10-15 2022-01-14 北京百度网讯科技有限公司 High-precision map lane line generation method and device and electronic equipment
CN114322786A (en) * 2021-12-20 2022-04-12 重庆长安汽车股份有限公司 Lane line transverse distance measuring system and method thereof
CN115116019A (en) * 2022-07-13 2022-09-27 阿波罗智能技术(北京)有限公司 Lane line processing method, lane line processing device, lane line processing apparatus, and storage medium
WO2023092451A1 (en) * 2021-11-26 2023-06-01 华为技术有限公司 Method and apparatus for predicting drivable lane
CN110770741B (en) * 2018-10-31 2024-05-03 深圳市大疆创新科技有限公司 Lane line identification method and device and vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101617197A (en) * 2007-02-16 2009-12-30 三菱电机株式会社 Road feature measurement mechanism, atural object recognition device, road feature measuring method, road feature process of measurement, measurement mechanism, measuring method, process of measurement, measuring position data, measuring terminals device, measure server unit, make map device, drawing method, plotting program and make diagram data
WO2010048611A8 (en) * 2008-10-24 2010-07-15 The Gray Insurance Company Control and systems for autonomously driven vehicles
CN102156979A (en) * 2010-12-31 2011-08-17 上海电机学院 Method and system for rapid traffic lane detection based on GrowCut


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孙兴 (Sun Xing): "Research and Application of Lane Line Detection Technology Based on Multi-Sensor Fusion", China Master's Theses Full-text Database, Information Science and Technology Series *
肖已达 (Xiao Yida): "Research on an Unmanned Vehicle Platform and Key Technologies for Integrated Urban Environments", China Master's Theses Full-text Database, Engineering Science and Technology II Series *


Also Published As

Publication number Publication date
CN103940434B (en) 2017-12-15

Similar Documents

Publication Publication Date Title
CN103940434A (en) Real-time lane line detecting system based on monocular vision and inertial navigation unit
CN113945206B (en) Positioning method and device based on multi-sensor fusion
CN109341706B (en) Method for manufacturing multi-feature fusion map for unmanned vehicle
EP3016086B1 (en) Negative image for sign placement detection
Pomerleau et al. Long-term 3D map maintenance in dynamic environments
Alonso et al. Accurate global localization using visual odometry and digital maps on urban environments
Song et al. Enhancing GPS with lane-level navigation to facilitate highway driving
Fernández et al. Free space and speed humps detection using lidar and vision for urban autonomous navigation
CN112083725A (en) Structure-shared multi-sensor fusion positioning system for automatic driving vehicle
US20220270358A1 (en) Vehicular sensor system calibration
CN114018248B (en) Mileage metering method and image building method integrating code wheel and laser radar
Pfaff et al. Towards mapping of cities
CN110119138A (en) For the method for self-locating of automatic driving vehicle, system and machine readable media
CN113071518B (en) Automatic unmanned driving method, minibus, electronic equipment and storage medium
US11481579B2 (en) Automatic labeling of objects in sensor data
CN103499337A (en) Vehicle-mounted monocular camera distance and height measuring device based on vertical target
Wu et al. Robust LiDAR-based localization scheme for unmanned ground vehicle via multisensor fusion
Burnett et al. Zeus: A system description of the two‐time winner of the collegiate SAE autodrive competition
CN114323033A (en) Positioning method and device based on lane lines and feature points and automatic driving vehicle
Liu et al. Dloam: Real-time and robust lidar slam system based on cnn in dynamic urban environments
Hara et al. Vehicle localization based on the detection of line segments from multi-camera images
Wu et al. Automatic vehicle tracking with LiDAR-enhanced roadside infrastructure
Hu et al. A small and lightweight autonomous laser mapping system without GPS
Cordes et al. Accuracy evaluation of camera-based vehicle localization
Zhou et al. Road-Pulse from IMU to Enhance HD Map Matching for Intelligent Vehicle Localization

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant