CN103954275B - Lane line detection and GIS map information development-based vision navigation method - Google Patents

Lane line detection and GIS map information development-based vision navigation method

Info

Publication number
CN103954275B
Authority
CN
China
Prior art keywords
gis
lane
lane line
map information
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410127590.4A
Other languages
Chinese (zh)
Other versions
CN103954275A (en)
Inventor
杜少毅
沈雅清
崔迪潇
宋晔
薛建儒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changsha puran Network Technology Co.,Ltd.
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201410127590.4A priority Critical patent/CN103954275B/en
Publication of CN103954275A publication Critical patent/CN103954275A/en
Application granted granted Critical
Publication of CN103954275B publication Critical patent/CN103954275B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3658Lane guidance
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching

Abstract

The invention discloses a vision navigation method based on lane line detection and GIS map information. The method comprises the following steps: acquiring and preprocessing GIS map information, acquiring images in real time and detecting lane lines, generating a GIS local map in real time, matching and verifying the lane line detection result against the GIS local map with an ICP algorithm, and generating a highly credible lane line result from the matching result. The method evaluates the credibility of lane line detection through a geographic information system, corrects false and missed detections when they occur, and broadens the range of scenes the lane detection algorithm can handle; it can also improve map positioning accuracy through visual perception information. The performance of the vision navigation system within the overall intelligent driving and driver assistance system is thereby improved, allowing the system to adapt to more complex environments.

Description

Vision navigation method based on lane line detection and GIS map information
Technical field
The invention belongs to the field of intelligent control for unmanned vehicles. It relates to a vision navigation method for intelligent vehicle driving, and in particular to a vision navigation method that combines lane line detection with Geographic Information System (GIS) map information, used to evaluate and verify in real time the reliability of conventional lane line detection results and to improve the accuracy of vision navigation for intelligent driving.
Background technology
Vision navigation is a key technology in intelligent driving and driver assistance. It processes visual data (usually image data) to extract useful information from the environment and provide a reliable basis for driving decisions. Lane line detection is one of its important components: images collected by a forward-looking camera are processed to obtain road surface information in urban, rural, highway and other scenes, including the number of lane lines, their position and width, merges and splits at intersections, and so on. Lane line detection is generally divided into the following modules: image preprocessing, feature extraction, lane model fitting, temporal association, and transformation between image and world coordinates. Current lane detection systems meet the requirements of basic scenes, but vision-based methods are affected by many factors, such as numerous obstacles on the road, unclear pavement markings, weather, and illumination changes. These factors cause false detections and missed detections, and such systems lack verification of the detection results and evaluation of their credibility. On the other hand, Geographic Information Systems (GIS), GPS and inertial measurement units (IMU) are now widely used for positioning and navigation in automatic and assisted driving, where measurement and positioning accuracy is the key issue. Commercial GPS reaches an accuracy of 5-10 m, which can be improved to 1-2 m when combined with an inertial navigation unit, but lane-level navigation places still higher demands on accuracy. How to verify and evaluate lane line detection, adapt the detection method to different complex environments, strengthen the robustness of the algorithm, improve the credibility of the detection results, and at the same time improve positioning and navigation accuracy has become one of the research hotspots of autonomous and assisted driving.
Content of the invention
The object of the invention is to overcome the above technical deficiencies and provide a vision navigation method based on lane line detection and GIS map information. The method evaluates the credibility of lane line detection results through a geographic information system, corrects false and missed detections when they occur, broadens the range of application of the lane detection algorithm, and improves map positioning accuracy through visual information, thereby improving the performance of the whole vision navigation system.
To achieve the above object, the invention adopts the following technical scheme.
The invention assumes that the vehicle is located within the provided GIS map. After the initialization data is obtained in the initial state, the local map is updated in real time and then matched and verified against the lane line detection result.
To ensure the stability and applicability of the invention, the method is based on the following assumptions: 1. the GIS map information has been preprocessed and provides the road edge points and corresponding intersection points of the road sections the vehicle will travel as prior information; 2. the provided GPS road edge points and intersection points are stored sequentially, corresponding in order to the sections the vehicle travels, and the vehicle is guided along this route; 3. the error of GPS and inertial navigation positioning is within an acceptable range (about half a lane width, roughly 2 m).
The method of the invention mainly comprises the following steps:
1) collect GIS map information in advance and preprocess it;
2) acquire images in real time and detect lane lines;
3) generate the GIS local map in real time;
4) match and verify the lane line detection result against the GIS local map;
5) generate a highly credible lane line result from the matching result.
In step 1), the road edge data of the global map is collected in advance by the pose measurement equipment, and smoothing and filtering are applied to the road edge data to obtain the road edge point set of the global map.
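As an illustration of this preprocessing, the following minimal sketch smooths an ordered sequence of road edge points with a moving-average filter; the patent does not name the filter, so the window size and the function name smooth_road_edge are assumptions.

import numpy as np

def smooth_road_edge(points, window=5):
    # Moving-average smoothing of an ordered road-edge polyline.
    # points: (N, 2) array of (x, y) road-edge samples in a metric frame.
    # window: odd window length; endpoints are padded by repetition.
    points = np.asarray(points, dtype=float)
    pad = window // 2
    padded = np.vstack([np.repeat(points[:1], pad, axis=0),
                        points,
                        np.repeat(points[-1:], pad, axis=0)])
    kernel = np.ones(window) / window
    return np.column_stack([np.convolve(padded[:, 0], kernel, mode="valid"),
                            np.convolve(padded[:, 1], kernel, mode="valid")])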
In step 2), an image acquisition device (including a vehicle-mounted digital camera and a fixed-focal-length lens) is used, and the road surface lane lines are detected in real time by a monocular-vision detection method that performs feature extraction and lane model fitting under temporal association.
In step 3), the pose measurement equipment (including an inertial integrated navigation system, fiber-optic gyroscope, vehicle odometer and front wheel angle sensor) obtains the current vehicle body position in real time, together with its position in the GIS global map, and the local environment map is then generated according to the local map model.
In step 4), combining the results of step 2) and step 3), an Iterative Closest Point (ICP) algorithm at the same scale performs sliding matching and verification between the lane line detection result and the GIS local road edge, obtaining the best matching position and the maximum matching error. Matching is performed group by group according to the lane line model and the local map model, and the best-matching group is selected.
In step 5), according to the matching result obtained in step 4), the maximum matching error at the best matching position is compared with a threshold to determine the credibility of the lane line detection result, and the translation distance and the positioning information are then corrected.
If the detection result is not credible, a virtual lane line is generated from the road edge line.
The correctness and stability of the detection result are improved through the lane line position constraint, and the appropriate scheme is finally selected to generate a highly accurate lane line result.
The beneficial effects of the invention are as follows:
The invention first obtains, through GIS, the road edge information of the global map of the region the vehicle travels in, then obtains the GIS local map information and the locally detected lane line result. The GIS information and the lane line detection result are then matched and verified against each other: the heading and trend of the lane lines are corrected using the GIS information, while the positional relationship between the GIS road edges and the lane lines is corrected using the vision-based lane line detection result; finally a lane line result with high credibility and accuracy is output. The purpose of adding GIS map information is to use the road edges on the map as priors and to correct false and missed lane line detections through the degree to which the curve trends match. In the overall trend of the curve, the GIS map information has higher credibility, so the lane lines can be corrected from it; for positional accuracy, however, the GIS result is less credible than visual perception because of the instability and limited precision of GPS, so the positional accuracy is corrected from the visual detection result.
The invention has the following features:
1. The invention can correct lane line false detections or inaccurate detection results caused by factors such as road markings, lighting or large curvature;
2. The invention can provide accurately positioned virtual lane lines when there is no lane line detection result, ensuring that the vehicle has continuous perception data while driving;
3. The addition of the GIS prior information greatly improves the reliability of the lane line detection result;
4. The system can adapt to more complex environments, such as lighting changes, numerous obstacles, and night driving.
Brief description
Fig. 1 is the system structure diagram of the invention.
Fig. 2 is the outline flowchart of the invention.
Fig. 3 is the detailed design flowchart of the invention.
Fig. 4 is the GIS map road edge model diagram of the invention.
Fig. 5 is the lane line detection flowchart of the invention.
Fig. 6 is the lane basic model diagram of the invention.
Fig. 7 is the vehicle body coordinate schematic diagram of the invention.
Fig. 8 is the lane line and road edge matching algorithm flowchart of the invention.
Specific embodiment
The invention is described in detail below with reference to the accompanying drawings.
Referring to Fig. 1 and Fig. 2, the method mainly comprises the following steps:
1) collect GIS map information in advance and preprocess it;
2) acquire images in real time and detect lane lines;
3) generate the GIS local map in real time;
4) match and verify the lane line detection result against the GIS local map;
5) generate a highly credible lane line result from the matching result.
The GIS map road edge information used by the algorithm is obtained from on-site collection of GPS data and map annotation. During annotation, the left road edge and the right road edge are marked separately, the marked roads are one-way, and the road sections are ordered according to the planned travel route. The map information is given as discrete sample points; the data content is the GPS coordinates of each sample point. The data is finally smoothed and filtered to obtain the GIS global map raw data conforming to the road edge information model; for the model see Fig. 4, where EL and ER denote the left and right edges of the road respectively.
Referring to Fig. 5, the lane line detection process first obtains the intrinsic and extrinsic camera parameters by offline calibration and produces a top view of the image in front of the vehicle through an inverse perspective projection. Based on the basic assumption that lane line image pixels are brighter than the road surface region, possible lane line information is extracted as a binary segmentation image, on which connected components are detected and fitted as polylines. After all connected components have been fitted segment by segment, polylines are connected according to length and angle constraints to obtain lane candidate lines. The candidate lane lines are screened and filtered by width and parallelism, and temporal association across frames is performed using pose data to obtain the final lane line detection result. For the lane basic model see Fig. 6: a three-lane model is used, L1 and R1 denote the left and right lane lines of the current lane, and L2 and R2 denote the second lane line on the left and on the right respectively.
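The patent gives no code for this pipeline. The sketch below illustrates only its front part (top view, binarization, connected-component detection, per-component line fitting) using OpenCV; the precomputed homography, the fixed brightness threshold and the minimum blob area are assumptions.

import cv2
import numpy as np

def detect_lane_candidates(image, homography, top_view_size=(400, 800), thresh=180):
    # Illustrative lane-candidate extraction: top view -> binarization ->
    # connected components -> per-component line fit.
    top = cv2.warpPerspective(image, homography, top_view_size)
    gray = cv2.cvtColor(top, cv2.COLOR_BGR2GRAY)
    # Lane markings are assumed brighter than the surrounding road surface.
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, 8, cv2.CV_32S)
    candidates = []
    for k in range(1, num):                     # label 0 is the background
        if stats[k, cv2.CC_STAT_AREA] < 50:     # drop tiny blobs (assumed area)
            continue
        ys, xs = np.nonzero(labels == k)
        pts = np.column_stack([xs, ys]).astype(np.float32)
        vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        candidates.append((vx, vy, x0, y0))     # direction and a point on the line
    return candidates

Length- and angle-based polyline connection, width and parallelism screening, and the temporal association would follow on these candidates.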
The real-time generation of the GIS local map, the matching and verification between lane lines and GIS information, and the generation of the final result are implemented by the following steps in more detail; see Fig. 3:
1) Initialization of GIS information points and translation distance
The processing of GIS map information is based on the following assumptions: the GIS map information of the region to be traveled is given as road edge points; the GIS map information is expressed as left and right road edge points stored as paired sequential structures; the vehicle starts within the range of the GIS map. The system first loads the GIS map information file and then locates the vehicle position in the map. Initial localization searches all road edge points of the GIS information globally and finds the GIS road edge point nearest to the vehicle pose, i.e. it finds the index i such that

$\min_i \ \mathrm{dis}(P_v, G_i)$

where $P_v$ is the vehicle position coordinate, $G_i$ is the coordinate of a GIS information point, and $\mathrm{dis}(P_v, G_i)$ is the distance between the two points.
Since the local map must be provided in the vehicle body coordinate system, the distance computation first transforms the GIS points into the vehicle body frame. Moreover, on real roads the right road edge often widens or narrows by one lane, which would affect the localization of the vehicle, whereas the left road edge is relatively stable; therefore only the left road edge points are considered during localization, i.e.:

$\min_i \ \mathrm{dis}(P_v, GL_i)$

where $GL_i$ denotes a left road edge point in the GIS information.
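A minimal sketch of this initial localization, assuming the GIS left road edge points have already been transformed into the same frame as the vehicle position; the function name nearest_left_edge_index is illustrative.

import numpy as np

def nearest_left_edge_index(vehicle_xy, left_edge_points):
    # Index of the GIS left road-edge point GL_i closest to the vehicle P_v.
    diffs = np.asarray(left_edge_points, dtype=float) - np.asarray(vehicle_xy, dtype=float)
    return int(np.argmin(np.hypot(diffs[:, 0], diffs[:, 1])))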
After the initial localization, the local map in the vehicle body coordinate system is initialized. The range of the vehicle-body local map is (-10 m, 60 m) around the vehicle position. After localization in the global map, the information within the local map range is loaded in real time according to the sequential storage structure and then converted into the corresponding structure according to the local map model.
After the local map has been created, the translation distance between the lane line and the GIS map road edge is initialized (in practice the vehicle starts from somewhere in the middle of a lane, so that a correct initial translation can be obtained). The initial translation distance defaults to the distance between the left lane line L1 of the current road and the left road edge EL. If a lane line detection result is available initially, the translation distance between the left lane line and the left road edge is computed; if there is no lane line detection result, the vehicle is assumed to be at the middle of the lane, the position of the left lane line L1 is estimated from the lane width, and the translation distance is computed, i.e.:

$\mathrm{InitTransDis} = \begin{cases} \mathrm{dis}(P_v - LW/2,\ EL_{i,x=0}), & \text{if DetectLane.IsOK} = 0 \\ \mathrm{dis}(L1_{x=0},\ EL_{i,x=0}), & \text{if DetectLane.IsOK} = 1 \end{cases}$

where InitTransDis is the initial translation distance, LW is the lane width, the subscript x = 0 denotes the position at x = 0 in the vehicle body coordinate system (i.e. where the vehicle is), and DetectLane.IsOK is the flag indicating whether a lane line detection result is available. See Fig. 7 for the vehicle body coordinate schematic. The initial translation distance provides a verification basis for the following cases: 1. matching verification of the initial frame; 2. cases with no lane line detection result, lost lane line information, or lost historical association data.
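A sketch of the initial translation distance under the assumption that all quantities are lateral (y) offsets at x = 0 in the vehicle body frame, with the vehicle itself at y = 0; the sign convention and the function name are assumptions.

def initial_translation_distance(left_edge_y_at_x0, lane_width, detected_l1_y_at_x0=None):
    # left_edge_y_at_x0:   y of the GIS left road edge EL at x = 0.
    # lane_width:          lane width LW in metres.
    # detected_l1_y_at_x0: y of the detected left lane line L1 at x = 0,
    #                      or None if no detection is available.
    if detected_l1_y_at_x0 is None:
        # DetectLane.IsOK = 0: assume the vehicle sits at the lane centre, so
        # L1 is half a lane width away from it.
        return abs(left_edge_y_at_x0 - lane_width / 2.0)
    # DetectLane.IsOK = 1: distance between detected L1 and the left edge EL.
    return abs(left_edge_y_at_x0 - detected_l1_y_at_x0)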
2) Local GIS information map updating and fitting
After the vehicle has been localized in the initial frame, subsequent frames no longer perform a global search; instead the local map is updated in real time according to the vehicle position. The specific procedure is:
Step 1: transform the historical local map into the current vehicle body coordinate system;
Step 2: remove the points of the historical local map that lie outside the current local map range;
Step 3: read subsequent points from the GIS information map in order until they fall outside the local map range.
In addition, for GIS maps that form a loop (i.e. the start point of the map file coincides with its end point), when the vehicle approaches this position the end of the file must be detected and loading must continue from the beginning of the file; a sketch of the rolling update is given below.
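A sketch of the rolling local map update (Steps 1 to 3 plus the loop-map wrap-around), assuming the global points are stored sequentially and that a caller-supplied transform maps them into the current body frame; the names and the scan guard are illustrative.

import numpy as np

def update_local_map(history_global, global_points, cursor, to_body_frame,
                     x_min=-10.0, x_max=60.0):
    # history_global: points (global frame) currently held in the local map.
    # global_points:  sequentially stored global road-edge points.
    # cursor:         index of the next unread global point.
    # to_body_frame:  callable mapping a global (x, y) point into the body frame.
    n = len(global_points)
    kept = []
    # Steps 1-2: re-express the history in the current body frame and drop
    # points that have left the local map range.
    for p in history_global:
        q = to_body_frame(p)
        if x_min <= q[0] <= x_max:
            kept.append((tuple(p), tuple(q)))
    # Step 3: read further points in order until they leave the range ahead of
    # the vehicle; the modulo handles loop-shaped maps, the guard a full lap.
    scanned = 0
    while scanned < n:
        p = global_points[cursor % n]
        q = to_body_frame(p)
        if q[0] > x_max:
            break
        kept.append((tuple(p), tuple(q)))
        cursor += 1
        scanned += 1
    local_global = np.array([p for p, _ in kept])
    local_body = np.array([q for _, q in kept])
    return local_global, local_body, cursor % n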
For the obtained local map, the GIS map information points in it are represented and fitted according to the lane line model structure, so that the matching verification below can be carried out. In practice the road curve is represented by a polynomial model. During fitting, an adaptive fitting approach is applied to the left and right road edges separately. The specific steps are:
Step 1: fit the road edge points of the local map starting from first order; after obtaining the fitting parameters, sample the fitted curve discretely and compute the error between the sampled points and the actual points. If the error exceeds the threshold, the accuracy requirement is not met at the current fitting order, so the order is increased by one and this step is repeated until the fitting accuracy is met (considering the actual shape of roads, excessively curved curves do not occur, so the fitting order is limited to at most 3);
Step 2: if the accuracy requirement is still not met after a third-order fit, the range of edge points used for fitting is reduced: the farthest sample point is removed from the fitting samples, the fit is performed again from second order, and this step is repeated until the fitting accuracy is met.
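A minimal sketch of this adaptive fitting using numpy.polyfit as the polynomial model; the error threshold value and the exact acceptance rule are assumptions, and the edge points are assumed ordered by increasing forward distance.

import numpy as np

def fit_road_edge(xs, ys, max_order=3, err_thresh=0.2):
    # xs, ys: edge-point coordinates in the body frame, ordered by increasing x.
    # err_thresh: maximum allowed per-point fitting error in metres (assumed).
    xs, ys = np.asarray(xs, dtype=float), np.asarray(ys, dtype=float)
    n = len(xs)
    # Step 1: raise the order from 1 to max_order until the fit is accurate.
    for order in range(1, max_order + 1):
        coeffs = np.polyfit(xs, ys, order)
        if np.max(np.abs(np.polyval(coeffs, xs) - ys)) <= err_thresh:
            return coeffs, n
    # Step 2: still inaccurate after order 3 -> drop the farthest points and
    # retry a second-order fit on the shortened range.
    for m in range(n - 1, 3, -1):
        coeffs = np.polyfit(xs[:m], ys[:m], 2)
        if np.max(np.abs(np.polyval(coeffs, xs[:m]) - ys[:m])) <= err_thresh:
            return coeffs, m
    return np.polyfit(xs, ys, max_order), n   # best-effort fallback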
3) Matching and verification between the lane line detection result and the GIS information
When matching and verifying lane lines, the algorithm only considers the left and right lane lines of the current lane (i.e. L1 and R1). During matching, L1 and R1 are each matched against the fitted left and right road edges of the GIS information (i.e. EL and ER), giving four match groups in total: left lane line with left road edge (L1-EL), left lane line with right road edge (L1-ER), right lane line with left road edge (R1-EL), and right lane line with right road edge (R1-ER); the four groups are abbreviated LL, LR, RL and RR. The best matching result among them is recorded, together with the matching error of the best match in that group. Whether the detected lane line result is correct is judged by comparing the matching error with a preset threshold.
In the matching process, the road edge of the local environment map serves as the reference model point set E, and the detected lane line serves as the shape data point set L. The purpose of matching is to find the optimal transformation T between the two two-dimensional point sets such that, after the transformation, the difference between every point of the shape data point set and its corresponding point of the model point set is minimized, i.e. the similarity measure between the transformed point set T(L) and E is minimal:

$\min_{T,C} J(T(L), E)$

where $J(T(L), E)$ denotes the similarity measure between the two point sets. If the mapping between the two point sets is expressed as $C: L \to E$, the above formula can be rewritten as:

$\min_{T,C} J(T(L), C(L))$
The matching verification uses the discrete solution of the ICP algorithm. T in the formula above denotes a rotation and translation transformation, and the similarity measure J is the minimum squared distance, i.e.:

$J(T(L), E) = \sum_{i=1}^{N_l} \left\| T(\vec{l}_i) - \vec{e}_{j(i)} \right\|_2^2$

where $l$ denotes a point in the shape data point set L, $e$ denotes a point in the model point set E, $j(i)$ denotes the correspondence between the i-th point of L and the j-th point of E, and $N_l$ is the number of points in the shape data point set L.
When matching and verifying lane lines, the trend of the lane line is to be estimated, so rotation is not considered; T(L) therefore contains only a translation, denoted by the translation vector $\vec{t}$. The minimal difference between the lane line and the road edge line is then:

$\min J(\vec{t}\,) = \sum_{i=1}^{N_l} \left\| \vec{l}_i + \vec{t} - \vec{e}_{j(i)} \right\|_2^2$

The objective function takes its minimum at:

$\vec{t} = \frac{1}{N_l} \sum_{i=1}^{N_l} \vec{e}_{j(i)} - \frac{1}{N_l} \sum_{i=1}^{N_l} \vec{l}_i$
That is, the objective function is minimized, and the difference between the two point sets is smallest, when the centroid of the lane line point set is aligned with the centroid of the road edge line point set. The translation at the minimal difference therefore also gives the best matching position. The lane line shape data point set is then shifted by this translation onto the road edge line model point set and the matching error is computed. To judge whether the two point sets satisfy the matching requirement of the practical application, the maximum matching error at the best matching position is needed, i.e.:
$\max_i w_i = \left\| \vec{l}_i + \vec{t} - \vec{e}_{j(i)} \right\|_2^2$

where $w_i$ denotes the single-point matching error of a pair of corresponding points.
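A minimal sketch of this translation-only matching step, assuming the correspondence j(i) has already been fixed (for example by the sliding sampling described below) so that the two point sets are paired one-to-one.

import numpy as np

def translation_match(lane_pts, edge_pts):
    # lane_pts: (N, 2) sampled lane-line points (shape data point set L).
    # edge_pts: (N, 2) corresponding road-edge points (model point set E).
    lane_pts = np.asarray(lane_pts, dtype=float)
    edge_pts = np.asarray(edge_pts, dtype=float)
    t = edge_pts.mean(axis=0) - lane_pts.mean(axis=0)              # centroid alignment
    point_errors = np.sum((lane_pts + t - edge_pts) ** 2, axis=1)  # errors w_i
    return t, float(point_errors.sum()), float(point_errors.max())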
In each of the four match groups (LL, LR, RL, RR), the correspondence and the translation between the lane line point set L and the road edge point set E, together with the maximum single-point matching error, are obtained by alternating iteration; see Fig. 8. To preserve the priority of the accuracy of the vision-detected lane lines, as long as one of the four groups satisfies the matching requirement, the lane line detection result is considered consistent with the local map information, i.e. the detection result is correct and has high credibility. The degrees of match of the four groups are therefore compared. The degree of match is determined by the accumulated error over all corresponding point pairs of the two point sets; the group with the smallest accumulated error is the best match, i.e.:

$\min_{LL,LR,RL,RR} W = \sum_{i=1}^{N_l} w_i = \sum_{i=1}^{N_l} \left\| \vec{l}_i + \vec{t} - \vec{e}_{j(i)} \right\|_2^2$
The maximum single-point matching error under the final best match is $w_i^M$, $M \in \{LL, LR, RL, RR\}$, where the superscript M denotes the best-matching group. The relation of this error to the threshold determines whether the lane line point set and the road edge line point set satisfy the matching requirement, i.e. whether the overall trend of the detected lane line in the local map agrees with the trend of the road edge line.
To execute the above algorithm, the fitted lane line and road edge are first sampled discretely to obtain the shape data point set and the model point set. During discrete sampling, the correspondence between the two point sets is determined by sliding matching, and the translation of the translation transformation, the similarity measure and the maximum distance error over corresponding points are computed. The procedure is executed as follows (see the sketch after the step list):
Step 1: for each of the four match groups, execute Step 2 to Step 6;
Step 2: slide the detected lane line against the GIS road edge line; the sliding distance d takes values in the range [-D, D] with step size Dstep, and Step 3 to Step 5 are executed in a loop over d;
Step 3: sample the lane line at coordinates $x^l$ spaced by distance h from the start to the end of the lane line, and compute each $y^l$ from the lane line fitting parameters as $y_i^l = f_l(x_i^l)$; the points on the detected lane line are denoted $l_i(x_i^l, y_i^l)$. Compute the coordinates $x^e$ of the sample points on the road edge line according to the current sliding distance d, i.e. $x^e = x^l + d$, and compute each $y^e$ from the road edge fitting parameters, i.e. $y_j^e = f_e(x_j^e) = f_e(x_i^l + d)$; the point set of the road edge line is denoted $e_j(x_j^e, y_j^e)$. At the same time count the number of points $N_p$ in the point sets;
Step 4: compute the mean position (i.e. the centroid) of the sample points on the lane line and on the road edge line: accumulate the x and y coordinates of all points in each of the two point sets and divide by the counted number of points $N_p$, obtaining $(\bar{x}^l, \bar{y}^l)$ and $(\bar{x}^e, \bar{y}^e)$;
Step 5: compute the translation $\vec{t} = (\bar{x}^e - \bar{x}^l,\ \bar{y}^e - \bar{y}^l)$ from the two centroids;
Step 6: compute the single-point matching error $w_i = \left\| \vec{l}_i + \vec{t} - \vec{e}_{j(i)} \right\|_2^2$ for every pair of corresponding points of the two point sets, record the maximum single-point matching error MAXw, and at the same time accumulate the total matching error W = W + $w_i$;
Step 7: find the match with the smallest total matching error among the four groups, i.e. the corresponding maximum single-point matching error at the best matching position, and return the MAXw at this point, the translation vector $\vec{t}$, and the matched group (i.e. LL, LR, RL or RR).
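A sketch of the sliding match for one of the four groups, assuming the lane line and the road edge are represented by their fitted polynomial coefficients; the values of D, Dstep and h are illustrative and not taken from the patent.

import numpy as np

def sliding_match(lane_coeffs, edge_coeffs, x_start, x_end, D=10.0, D_step=0.5, h=1.0):
    # lane_coeffs, edge_coeffs: numpy.polyfit coefficients of the fitted curves.
    # Returns (MAXw, t, W) at the best sliding position for this group.
    xl = np.arange(x_start, x_end + 1e-9, h)
    lane = np.column_stack([xl, np.polyval(lane_coeffs, xl)])
    best = (np.inf, None, np.inf)
    for d in np.arange(-D, D + 1e-9, D_step):
        xe = xl + d
        edge = np.column_stack([xe, np.polyval(edge_coeffs, xe)])
        t = edge.mean(axis=0) - lane.mean(axis=0)        # centroid alignment
        w = np.sum((lane + t - edge) ** 2, axis=1)       # single-point errors w_i
        if w.sum() < best[2]:
            best = (float(w.max()), t, float(w.sum()))
    return best

Each of the four groups LL, LR, RL and RR would be matched this way, and the group with the smallest total error W selected as the best match.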
The returned error value MAXw is compared with the threshold. If it is smaller than the threshold, the detected lane line is considered consistent with the GIS road edge trend, which increases the credibility of the lane line, and the lane line detection is considered correct. If it is larger than the threshold, the detected lane line and the road edge differ in trend, the lane line detection is considered erroneous, and a virtual lane line is generated from the road edge line.
4) Generation of virtual lane lines
A virtual lane line must be generated from the GIS information in the following cases: 1. the detected lane line is too short, so the lane line information cannot meet the requirements of the perception data or its credibility is low; 2. the matching error between the detected lane line and the GIS road edge line exceeds the threshold, so the lane line detection result is considered inaccurate; 3. no lane line was detected, in which case a virtual lane line is provided directly by the GIS information to guarantee the continuity of the perception data.
The generation of virtual lane lines is based on the following assumption: step 3) can update the match information and the translation transformation in real time according to the lane line detection result, and this match information and translation transformation remain applicable for a continuous period of time, i.e. within one continuous period the lane line is matched to the road edge with the same translation, until the next update. The system records the match group and translation distance of the last correct match in the history frames. When a virtual lane line needs to be generated, the corresponding left or right road edge fitting line is moved to the lane line position by the recorded translation according to the recorded match group. The fitting parameters of the road edge line are then assigned to the lane line fitting parameters, and the start and end coordinates of the lane line, the road width, the lane line attributes and other information are determined to form the virtual lane line. After generation, the erroneous lane line detection result must be flushed to prevent the temporal association from causing false detections over multiple frames. A sketch of this step follows.
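A sketch of the virtual lane line generation, assuming the recorded road edge is available as polynomial coefficients and the recorded translation is a two-dimensional vector; refitting the shifted curve to keep the same polynomial form is an implementation choice, not taken from the patent.

import numpy as np

def generate_virtual_lane(edge_coeffs, translation, x_start, x_end):
    # edge_coeffs: coefficients of the recorded road edge (EL or ER, chosen
    #              according to the last correct match group).
    # translation: (tx, ty) translation recorded at the last correct match.
    tx, ty = translation
    # Shifting y = f(x) by (tx, ty) gives y = f(x - tx) + ty; evaluate on the
    # needed range and refit so the virtual lane keeps the lane-line form.
    xs = np.linspace(x_start, x_end, 50)
    ys = np.polyval(edge_coeffs, xs - tx) + ty
    lane_coeffs = np.polyfit(xs, ys, len(edge_coeffs) - 1)
    return {"coeffs": lane_coeffs,
            "start": (float(xs[0]), float(ys[0])),
            "end": (float(xs[-1]), float(ys[-1])),
            "virtual": True}    # marks the line as GIS-derived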
In addition, before the final lane line result is sent, the overall position of the lane line result must be further constrained, i.e. it must be checked that all lane line positions lie between the left and right GIS road edge lines, that is, that the following constraint is satisfied:

$ER_Y < \mathrm{Lane}_{L2,L1,R1,R2} < EL_Y$

where $\mathrm{Lane}_{L2,L1,R1,R2}$ denotes the four lane lines L2, L1, R1 and R2 of the lane line model, and $EL_Y$, $ER_Y$ denote the Y-axis coordinates of the left and right road edges in the vehicle body coordinate system. During execution, the start and end coordinates of every lane line are compared with the road edge line positions to judge whether the constraint is satisfied.
If the R2 or L2 lane line lies outside the range bounded by the road edge lines, that lane line is removed; if the R1 or L1 main lane line lies outside the range bounded by the road edge lines, a virtual lane line is generated at the vehicle's current position according to the default LL match group and the initial translation distance InitTransDis. A sketch of this constraint check follows.
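A sketch of this position constraint, assuming each lane line is summarized by the lateral coordinates of its start and end points in the body frame and that ER_Y < EL_Y; the dictionary interface is illustrative.

def constrain_lane_lines(lanes, el_y, er_y):
    # lanes: dict mapping 'L2', 'L1', 'R1', 'R2' to (start_y, end_y) lateral
    #        coordinates of each lane line in the body frame.
    # el_y, er_y: Y coordinates of the left and right road edges (ER_Y < EL_Y).
    kept, need_virtual = {}, False
    for name, (y0, y1) in lanes.items():
        if er_y < y0 < el_y and er_y < y1 < el_y:
            kept[name] = (y0, y1)
        elif name in ("L1", "R1"):
            # A main lane line outside the road edges: fall back to the default
            # LL match group and InitTransDis (handled by the caller).
            need_virtual = True
    return kept, need_virtual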
The final lane line detection result is fused with the other road perception information (including intersections, obstacles, etc.) and sent in the form of UDP (User Datagram Protocol) packets through a switch, delivering the complete visual perception data to the subsequent planning and execution unit.
The method first collects GIS map information and preprocesses it, then acquires images in real time and detects lane lines while generating the GIS local map in real time, matches and verifies the lane line detection result against the GIS local map with an ICP algorithm, and finally generates a highly credible lane line result from the matching result. The method evaluates the credibility of lane line detection results through a geographic information system, corrects false and missed detections when they occur, broadens the range of application of the lane detection algorithm, and improves map positioning accuracy through visual perception information, thereby improving the performance of the vision navigation system in the overall intelligent driving and driver assistance system and allowing the system to adapt to more complex environments.
The above content further describes the invention with reference to specific preferred embodiments, but it cannot be assumed that the specific embodiments of the invention are limited to these descriptions. For a person of ordinary skill in the technical field of the invention, several simple deductions or substitutions can be made without departing from the concept of the invention, and all of these shall be regarded as falling within the scope of patent protection determined by the claims of the invention.

Claims (7)

1. A vision navigation method based on lane line detection and GIS map information, characterized by comprising the following steps:
1) collecting GIS map information in advance and preprocessing it;
2) acquiring images in real time and detecting lane lines;
3) generating a GIS local environment map in real time;
4) matching and verifying the lane line detection result against the GIS local map;
5) generating a highly credible lane line result from the matching result;
wherein in step 4), combining the results of step 2) and step 3), an iterative closest point algorithm at the same scale performs sliding matching and verification between the lane line detection result and the GIS local road edge to obtain the best matching position and the maximum matching error, the matching is performed group by group according to the lane line model and the local map model, and the best-matching group is selected.
2. The vision navigation method based on lane line detection and GIS map information according to claim 1, characterized in that in step 1), the road edge data of the global map is collected in advance by pose measurement equipment, and smoothing and filtering are applied to the road edge data to obtain the road edge point set of the global map.
3. The vision navigation method based on lane line detection and GIS map information according to claim 1, characterized in that in step 2), an image acquisition device is used and the road surface lane lines are detected in real time by a monocular-vision detection method performing feature extraction and lane model fitting under temporal association.
4. The vision navigation method based on lane line detection and GIS map information according to claim 1, characterized in that in step 3), pose measurement equipment obtains the current vehicle body position in real time, together with its position in the GIS global map, and the local environment map is then generated according to the local map model.
5. The vision navigation method based on lane line detection and GIS map information according to claim 1, characterized in that in step 5), according to the matching result obtained in step 4), the maximum matching error at the best matching position is compared with a threshold to determine the credibility of the lane line detection result, and the translation distance and the positioning information are then corrected.
6. The vision navigation method based on lane line detection and GIS map information according to claim 5, characterized in that if the detection result is not credible, a virtual lane line is generated from the road edge line.
7. The vision navigation method based on lane line detection and GIS map information according to claim 5, characterized in that the correctness and stability of the detection result are improved through a lane line position constraint.
CN201410127590.4A 2014-04-01 2014-04-01 Lane line detection and GIS map information development-based vision navigation method Active CN103954275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410127590.4A CN103954275B (en) 2014-04-01 2014-04-01 Lane line detection and GIS map information development-based vision navigation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410127590.4A CN103954275B (en) 2014-04-01 2014-04-01 Lane line detection and GIS map information development-based vision navigation method

Publications (2)

Publication Number Publication Date
CN103954275A CN103954275A (en) 2014-07-30
CN103954275B true CN103954275B (en) 2017-02-08

Family

ID=51331585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410127590.4A Active CN103954275B (en) 2014-04-01 2014-04-01 Lane line detection and GIS map information development-based vision navigation method

Country Status (1)

Country Link
CN (1) CN103954275B (en)


Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112923936B (en) * 2015-02-10 2023-04-18 御眼视觉技术有限公司 Navigation system, method, and computer readable medium for vehicle
CN104819724B (en) * 2015-03-02 2018-04-03 北京理工大学 A kind of autonomous travel assist system of Unmanned Ground Vehicle based on GIS
DE102015214338A1 (en) * 2015-07-29 2017-02-02 Volkswagen Aktiengesellschaft Determining an arrangement information for a vehicle
CN105043403B (en) * 2015-08-13 2017-12-01 武汉光庭信息技术有限公司 High-precision map route planning system and method
CN108139217B (en) * 2015-09-30 2022-04-26 日产自动车株式会社 Travel control method and travel control device
CN105371859B (en) * 2015-11-05 2018-05-08 武汉光庭信息技术股份有限公司 The navigation guide system and method for track rank based on safe driving map
CN105607071B (en) * 2015-12-24 2018-06-08 百度在线网络技术(北京)有限公司 A kind of indoor orientation method and device
CN105698812B (en) * 2016-01-15 2019-04-30 武汉光庭科技有限公司 Lane detection system and method based on safe driving map and two sides camera in a kind of automatic Pilot
CN105718865A (en) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 System and method for road safety detection based on binocular cameras for automatic driving
CN105783936B (en) * 2016-03-08 2019-09-24 武汉中海庭数据技术有限公司 For the road markings drawing and vehicle positioning method and system in automatic Pilot
CN107272727B (en) * 2016-04-01 2022-02-01 松下电器(美国)知识产权公司 Autonomous moving body
CN107643086B (en) * 2016-07-22 2021-04-13 北京四维图新科技股份有限公司 Vehicle positioning method, device and system
CN106500705A (en) * 2016-10-20 2017-03-15 上海智驾电子科技有限公司 Alignment system and localization method based on lane detection
CN106444765B (en) * 2016-10-21 2019-07-09 广东工业大学 A kind of AGV air navigation aid of view-based access control model
CN106525057A (en) * 2016-10-26 2017-03-22 陈曦 Generation system for high-precision road map
CN106778548B (en) * 2016-11-30 2021-04-09 百度在线网络技术(北京)有限公司 Method and apparatus for detecting obstacles
CN108303103B (en) * 2017-02-07 2020-02-07 腾讯科技(深圳)有限公司 Method and device for determining target lane
CN106918342B (en) * 2017-03-10 2020-01-03 广州汽车集团股份有限公司 Method and system for positioning driving path of unmanned vehicle
WO2018236853A1 (en) * 2017-06-22 2018-12-27 Agjunction Llc 3-d image system for vehicle control
CN113822939A (en) 2017-07-06 2021-12-21 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted sensor
WO2019043831A1 (en) 2017-08-30 2019-03-07 日産自動車株式会社 Position correction method for driving-assist vehicle and position error correction device
CN107782321B (en) * 2017-10-10 2020-10-30 武汉迈普时空导航科技有限公司 Combined navigation method based on vision and high-precision map lane line constraint
CN111527377B (en) * 2017-12-27 2023-06-13 宝马股份公司 Verification of digital maps
CN108413971B (en) * 2017-12-29 2021-12-14 驭势科技(北京)有限公司 Vehicle positioning technology based on lane line and application
CN108297866B (en) * 2018-01-03 2019-10-15 西安交通大学 A kind of lane holding control method of vehicle
CN108737955B (en) * 2018-04-28 2021-02-05 交通运输部公路科学研究所 LDW/LKA test evaluation system and method based on virtual lane line
CN108955702B (en) * 2018-05-07 2021-09-07 西安交通大学 Lane-level map creation system based on three-dimensional laser and GPS inertial navigation system
KR102420476B1 (en) * 2018-05-25 2022-07-13 에스케이텔레콤 주식회사 Apparatus and method for estimating location of vehicle and computer recordable medium storing computer program thereof
CN108847120B (en) * 2018-06-19 2021-03-12 品信科技有限公司 Drawing method based on electronic fence map order thermodynamic diagram distribution
WO2020004817A1 (en) * 2018-06-26 2020-01-02 에스케이텔레콤 주식회사 Apparatus and method for detecting lane information, and computer-readable recording medium storing computer program programmed to execute same method
CN108777071A (en) * 2018-07-04 2018-11-09 深圳智达机械技术有限公司 A kind of highway patrol robot
CN110736474B (en) * 2018-07-18 2021-09-24 郑州宇通客车股份有限公司 Map information acquisition method and device for vehicle
CN109115231B (en) * 2018-08-29 2020-09-11 东软睿驰汽车技术(沈阳)有限公司 Vehicle positioning method and device and automatic driving vehicle
CN109186615A (en) * 2018-09-03 2019-01-11 武汉中海庭数据技术有限公司 Lane side linear distance detection method, device and storage medium based on high-precision map
CN110967026B (en) * 2018-09-30 2022-02-22 毫末智行科技有限公司 Lane line fitting method and system
CN110969059A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Lane line identification method and system
CN110967025B (en) * 2018-09-30 2022-05-13 毫末智行科技有限公司 Lane line screening method and system
JP7117394B2 (en) * 2018-11-29 2022-08-12 日立Astemo株式会社 Vehicle control system and server
CN109470255B (en) * 2018-12-03 2022-03-29 禾多科技(北京)有限公司 High-precision map automatic generation method based on high-precision positioning and lane line identification
CN110263607B (en) * 2018-12-07 2022-05-20 电子科技大学 Road-level global environment map generation method for unmanned driving
CN109579863A (en) * 2018-12-13 2019-04-05 北京航空航天大学 Unknown topographical navigation system and method based on image procossing
CN109583418B (en) * 2018-12-13 2021-03-12 武汉光庭信息技术股份有限公司 Lane line deviation self-correction method and device based on parallel relation
CN111326006B (en) * 2018-12-14 2022-09-16 上海博泰悦臻网络技术服务有限公司 Reminding method, reminding system, storage medium and vehicle-mounted terminal for lane navigation
CN109849922B (en) * 2018-12-25 2020-08-04 青岛中汽特种汽车有限公司 Visual information and GIS information fusion-based method for intelligent vehicle
CN111380546A (en) * 2018-12-28 2020-07-07 沈阳美行科技有限公司 Vehicle positioning method and device based on parallel road, electronic equipment and medium
CN109948413B (en) * 2018-12-29 2021-06-04 禾多科技(北京)有限公司 Lane line detection method based on high-precision map fusion
CN111750878B (en) * 2019-03-28 2022-06-24 北京魔门塔科技有限公司 Vehicle pose correction method and device
CN109974739B (en) * 2019-04-15 2020-11-10 西安交通大学 Global navigation system based on high-precision map and navigation information generation method
CN110415330B (en) * 2019-04-29 2020-05-29 当家移动绿色互联网技术集团有限公司 Road generation method, device, storage medium and electronic equipment
CN111912416B (en) * 2019-05-07 2022-07-29 北京市商汤科技开发有限公司 Method, device and equipment for positioning equipment
WO2020232648A1 (en) * 2019-05-22 2020-11-26 深圳市大疆创新科技有限公司 Lane line detection method, electronic device and storage medium
CN110243372B (en) * 2019-06-18 2021-03-30 北京中科原动力科技有限公司 Intelligent agricultural machinery navigation system and method based on machine vision
CN112172810A (en) * 2019-06-18 2021-01-05 广州汽车集团股份有限公司 Lane keeping device, method and system and automobile
CN110530389B (en) * 2019-09-06 2021-07-06 禾多科技(北京)有限公司 Intersection mode identification method and system based on high-precision navigation electronic map
CN110645973B (en) * 2019-09-24 2021-06-25 森思泰克河北科技有限公司 Vehicle positioning method
CN111145576B (en) * 2019-12-09 2022-05-27 博泰车联网科技(上海)股份有限公司 Vehicle searching method and related equipment
CN111307165B (en) * 2020-03-06 2021-11-23 新石器慧通(北京)科技有限公司 Vehicle positioning method and system and unmanned vehicle
CN115427759B (en) * 2020-04-08 2023-08-29 日产自动车株式会社 Map information correction method, driving assistance method, and map information correction device
CN111709322B (en) * 2020-05-28 2022-09-06 武汉中海庭数据技术有限公司 Method and device for calculating lane line confidence
CN111652952B (en) * 2020-06-05 2022-03-18 腾讯科技(深圳)有限公司 Lane line generation method, lane line generation device, computer device, and storage medium
CN112559659B (en) * 2020-12-10 2021-12-24 首都师范大学 Electronic map generation method and device, and driving data acquisition method and device
CN112572432B (en) * 2020-12-17 2022-03-18 东风汽车集团有限公司 LKA lane keeping system and method based on ultrasonic radar detection road edge
CN113942522A (en) * 2021-05-31 2022-01-18 重庆工程职业技术学院 Intelligent driving safety protection system
CN113486755A (en) * 2021-06-29 2021-10-08 上海商汤临港智能科技有限公司 Map matching method and device, electronic equipment and storage medium
CN113705414B (en) * 2021-08-23 2024-02-23 山东恒宇电子有限公司 Lane line identification method and system based on image compensation and color enhancement
CN114587210B (en) * 2021-11-16 2023-06-20 北京石头创新科技有限公司 Cleaning robot control method and control device
CN114323050A (en) * 2022-01-07 2022-04-12 苏州挚途科技有限公司 Vehicle positioning method and device and electronic equipment
CN115240435A (en) * 2022-09-21 2022-10-25 广州市德赛西威智慧交通技术有限公司 AI technology-based vehicle illegal driving detection method and device


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8331623B2 (en) * 2008-12-23 2012-12-11 National Chiao Tung University Method for tracking and processing image
CN102529975A (en) * 2010-12-13 2012-07-04 通用汽车环球科技运作有限责任公司 Systems and methods for precise sub-lane vehicle positioning
CN102842039A (en) * 2012-07-11 2012-12-26 河海大学 Road image detection method based on Sobel operator

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A multi-lane detection method for traffic surveillance scenes; 王镇波 et al.; Computer Engineering and Applications; 2012-12-31; vol. 48, no. 12; pp. 14-18, 23 *
Research on robust algorithms for lane departure warning based on information fusion; 黄海港; China Master's Theses Full-text Database; 2014-03-15; pp. 31, 36, 39 *
Multi-scale hierarchical ICP algorithm for map point sets; 李策 et al.; Computer Applications and Software; 2012-04-30; vol. 29, no. 4; pp. 83-85, 93 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11960295B2 (en) 2022-01-28 2024-04-16 Agjunction Llc 3-D image system for vehicle control

Also Published As

Publication number Publication date
CN103954275A (en) 2014-07-30

Similar Documents

Publication Publication Date Title
CN103954275B (en) Lane line detection and GIS map information development-based vision navigation method
CN106441319B (en) A kind of generation system and method for automatic driving vehicle lane grade navigation map
US20210311490A1 (en) Crowdsourcing a sparse map for autonomous vehicle navigation
US20230237689A1 (en) Determining road location of a target vehicle based on tracked trajectory
US20200247431A1 (en) Systems and methods for vehicle navigation
Alonso et al. Accurate global localization using visual odometry and digital maps on urban environments
CN106840148A (en) Wearable positioning and path guide method based on binocular camera under outdoor work environment
TWI742554B (en) Positioning method, path determination method, robot and storage medium
US20230073897A1 (en) Aligning road information for navigation
CN104732518A (en) PTAM improvement method based on ground characteristics of intelligent robot
Bürki et al. Vizard: Reliable visual localization for autonomous vehicles in urban outdoor environments
Senlet et al. Satellite image based precise robot localization on sidewalks
CN110471422A (en) The detection of obstacles and automatic obstacle avoiding method of intelligent wheel chair
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
US20220227373A1 (en) Systems and methods for detecting an open door
CN107167826A (en) The longitudinal direction of car alignment system and method for Image Feature Detection based on variable grid in a kind of automatic Pilot
CN115564865A (en) Construction method and system of crowdsourcing high-precision map, electronic equipment and vehicle
CN115205391A (en) Target prediction method based on three-dimensional laser radar and vision fusion
WO2022149046A1 (en) Systems and methods for monitoring lane mark quality
CN108107897A (en) Real time sensor control method and device
Hara et al. Vehicle localization based on the detection of line segments from multi-camera images
CN113220818A (en) Automatic mapping and high-precision positioning method for parking lot
WO2021198775A1 (en) Control loop for navigating a vehicle
US20230206608A1 (en) Systems and methods for analyzing and resolving image blockages
Boschenriedter et al. Multi-session visual roadway mapping

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210427

Address after: Room 301, building 7, phase I, Changsha Zhongdian Software Park, 39 Jianshan Road, high tech Development Zone, Changsha, Hunan 410000

Patentee after: Changsha puran Network Technology Co.,Ltd.

Address before: Beilin District Xianning West Road 710049, Shaanxi city of Xi'an province No. 28

Patentee before: XI'AN JIAOTONG University

TR01 Transfer of patent right