CN106767853B - Unmanned vehicle high-precision positioning method based on multi-information fusion - Google Patents
- Publication number: CN106767853B (application No. CN201611261781.5A)
- Authority: CN (China)
- Legal status: Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
Abstract
The invention relates to a high-precision positioning method for an unmanned vehicle based on multi-information fusion, applicable to the environment perception and intelligent decision-making of unmanned vehicles. High-precision real-time positioning is achieved by combining an offline map with online perception information. The offline map records the road traffic information of the unmanned vehicle's driving area; the online perception information comprises lane lines and road boundaries. When the unmanned vehicle travels within the mapped area, its approximate position is first determined from the positioning information given by an inertial integrated navigation system, and a local map around that position is obtained. Vehicle-mounted sensors then detect the lane line ahead of the vehicle and the road boundaries on both sides, determining the relative positions of the vehicle, the lane line, and the road boundaries. These are compared with the vehicle's position in the map, the deviation is calculated, and the positioning error is corrected, achieving high-precision positioning.
Description
Technical Field
The invention belongs to the technical field of unmanned vehicles, and particularly relates to a high-precision positioning method of an unmanned vehicle based on multi-information fusion.
Background
An unmanned vehicle is an intelligent vehicle that can complete driving tasks autonomously. It senses the road environment through vehicle-mounted sensors and adopts an appropriate driving strategy to reach its destination safely and reliably. Unmanned vehicles are a product of advanced computer science, pattern recognition, and intelligent control technology, with broad application prospects in national defense and the national economy.
High-precision positioning is a necessary condition for unmanned driving. With high-precision positioning matched against a high-precision map, an unmanned vehicle can accurately determine its own position, become familiar with the road traffic environment around it, and reduce the demands placed on the perception system for environment detection. High-precision positioning also helps the decision-making system plan the driving path in real time, select a suitable lane, and handle various traffic conditions, effectively improving driving quality and enhancing safety and intelligence.
Conventional positioning usually combines a satellite positioning system (GPS, BeiDou, etc.) with an inertial navigation system (INS). Satellite positioning signals are easily disturbed when blocked by tall buildings or trees, degrading signal quality; and if the INS alone is used to output positioning information, the error grows rapidly over time.
Patent publication No. CN104089619A (application No. CN201410202876.4) provides a GPS navigation map exact-matching system for an unmanned vehicle and an operation method thereof. The system comprises a positioning module, a map module, and a matching module. The positioning module acquires the vehicle's real-time positioning and path-track information; the map module converts this information into a KML text map through a map analysis module and a map loading module; and during driving, the matching module uses the information from the positioning and map modules to match the optimal route for the vehicle in real time. Because the KML map is matched only against the longitude and latitude measured by the GPS in real time, local road environment features are not exploited; with a single information source and deviation between the map and the GPS information, the method has a limited application range and insufficient positioning accuracy.
A positioning scheme based on multi-sensor information fusion combines various sensors. The unmanned vehicle first receives satellite and inertial navigation positioning signals to achieve coarse positioning; the environment perception system then acquires data with sensors such as a vehicle-mounted laser radar and a camera to construct a two-dimensional or three-dimensional map of the scene, extracts environment features, and obtains high-precision positioning in the local environment through map-feature matching. Different sensors have different detection ranges and suit different environments. Multiple information sources enlarge the detection range, increase information redundancy, and effectively enhance the robustness and reliability of the system.
Disclosure of Invention
The technical problem solved by the invention: overcoming the defects of the prior art, the invention provides a high-precision positioning method for an unmanned vehicle based on multi-information fusion.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a high-precision positioning method for an unmanned vehicle based on multi-information fusion, which fuses the information provided by a map module, a camera processing module, and a radar processing module, and matches the vehicle's position in the map to its real position in the environment through lane-line correction and boundary correction. Wherein:
the map module draws a local map, centered on the vehicle and covering a certain range, according to the real-time positioning information; the local map is used by the camera processing module and the radar processing module;
the camera processing module uses a camera to acquire raw road-surface data while the vehicle is driving, and extracts lane lines for map matching and correction;
the radar processing module uses a laser radar to collect information about the vehicle's surroundings, and extracts road boundaries for map matching and correction.
The camera processing module evaluates the availability of the detected lane line before map matching and correction. The evaluation indexes comprise the lane angle angle_camera, the vehicle-head angle angle_car, and the lane-width difference line_width, where:
angle_camera: the included angle between the lane line detected in real time and the lane line in the local map;
angle_car: the included angle between the vehicle-head direction detected in real time and the lane line in the local map;
line_width: the width difference between the lane line detected in real time and the lane line in the local map.
Wherein, a threshold vector is set for the lane-line availability evaluation indexes. If any index exceeds its threshold, the lane line is not usable and cannot be used for map correction.
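The availability check described above can be sketched as follows. The threshold values and the function name are illustrative assumptions, not the patent's actual parameters:

```python
# Sketch of the lane-line availability check: a threshold is set for each
# evaluation index, and exceeding any one of them makes the lane line unusable.
# The threshold values below are illustrative placeholders only.
LANE_THRESHOLDS = {
    "angle_camera": 5.0,   # angle (deg) vs. the map lane line
    "angle_car": 10.0,     # vehicle-head angle (deg) vs. the map lane line
    "line_width": 0.5,     # lane-width difference (m) vs. the map
}

def lane_line_usable(angle_camera, angle_car, line_width):
    """Return True only if every evaluation index stays within its threshold."""
    measured = {"angle_camera": abs(angle_camera),
                "angle_car": abs(angle_car),
                "line_width": abs(line_width)}
    return all(measured[k] <= v for k, v in LANE_THRESHOLDS.items())
```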
The radar processing module evaluates the availability of the detected road boundary before map matching and correction. The evaluation indexes comprise the angle angleDT, the width widthDT, and the line-shape indexes leftDDT and rightDDT, where:
angleDT: the deflection angle of the road in the local map;
widthDT: the average width of the road;
leftDDT, rightDDT: the shape deviations of the left and right boundaries, respectively; the boundary line shape is fitted with a quadratic curve.
Wherein, according to the number of lane-line detection failures and the distance between the vehicle and the actually detected boundary, the availability evaluation of the boundary is divided into three grades: common, strict, and unlimited, each corresponding to a different threshold vector.
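The quadratic-curve fitting of the boundary line shape mentioned above can be sketched as follows. Treating the shape-deviation indexes leftDDT/rightDDT as the RMS residual of the fit is an assumption made here for illustration:

```python
import numpy as np

def boundary_shape_deviation(xs, ys):
    """Fit a quadratic curve y = a*x^2 + b*x + c to detected boundary points
    and return the RMS fit residual as a shape-deviation index
    (an illustrative stand-in for leftDDT / rightDDT)."""
    coeffs = np.polyfit(xs, ys, deg=2)   # quadratic curve fitting
    fitted = np.polyval(coeffs, xs)
    return float(np.sqrt(np.mean((ys - fitted) ** 2)))
```

A boundary whose points lie exactly on a quadratic yields a deviation near zero; a strongly non-quadratic shape yields a large deviation, which the threshold vector can then reject.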
Wherein, the detection distance, i.e. the distance between the vehicle and a lane line or boundary that has passed the evaluation, is calculated, and the corresponding display distance, i.e. the distance between the vehicle and the lane line or boundary in the map, is found. The detection distance is not necessarily equal to the display distance; the difference between the two is the positioning deviation of the local map that needs to be corrected. The deviation correction formula is:
Adjust_x+=offset*cos(rad);
Adjust_y+=offset*sin(rad); (1)
wherein:
Adjust_x: the deviation of the map in the true-east direction;
Adjust_y: the deviation of the map in the true-north direction;
offset: the lane-line or boundary deviation;
rad: the map yaw angle;
cos: the trigonometric cosine function;
sin: the trigonometric sine function.
The correction results Adjust_x and Adjust_y are permanently retained for use when the local map is next drawn.
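As a minimal sketch, equation (1) accumulates the detected deviation into the persistent map offsets by resolving it into true-east and true-north components; the yaw angle is assumed here to be in radians:

```python
import math

def apply_offset_correction(adjust_x, adjust_y, offset, rad):
    """Equation (1): fold a lane-line or boundary deviation `offset`
    into the retained map offsets, using the map yaw angle `rad`."""
    adjust_x += offset * math.cos(rad)   # deviation in the true-east direction
    adjust_y += offset * math.sin(rad)   # deviation in the true-north direction
    return adjust_x, adjust_y
```

With a yaw of 0 the whole deviation lands on the east axis; with a yaw of pi/2 it lands on the north axis.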
Compared with the prior art, the invention has the advantages that:
the main problems existing in the prior art are that the source of the positioning information is single, the stability and the reliability of the information are insufficient, the information is easily influenced by the environment, and the positioning requirement of the driveway level of the unmanned vehicle cannot be met. The invention has the innovativeness that a plurality of devices are used as data sources, the method is suitable for different environments, various road information is extracted and fused, a map matching correction algorithm is designed according to detection conditions in a grading manner, and the target of high-precision positioning is realized.
(1) The invention utilizes various devices including a combined positioning system, a high-precision map, a laser radar, a camera and the like as data sources, combines information provided by multiple data sources, can quickly provide lane-level high-precision positioning, and has wide application range.
(2) The invention designs different deviation-correction thresholds according to the detection condition of the road information, so that a high-precision positioning result can still be output continuously and stably even when lane-line or boundary detection partially fails, improving the robustness and reliability of the system.
(3) The modules of the multi-module information fusion method designed by the invention are mutually independent yet interconnected. Each module is responsible for a relatively independent task and communicates with the others only when necessary, which improves the running speed of the system and satisfies the unmanned vehicle's real-time requirement for high-precision positioning.
Drawings
FIG. 1 is a flow chart of a high-precision positioning method of an unmanned vehicle based on multi-information fusion according to the invention;
FIG. 2 is a flow chart of a lane line correction algorithm;
FIG. 3 is a flow chart of a boundary correction algorithm.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to specific implementation steps and accompanying drawings. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention.
As shown in FIG. 1, the high-precision positioning method based on multi-information fusion of the invention comprises the following steps:
step 1, obtaining information from a plurality of data sources. The method comprises the following steps:
1) The map module extracts a local map of the area where the vehicle is located by combining the high-precision map with the combined positioning system. The map is represented as a grid, each cell covering an area of 20 cm x 20 cm. The local map displays the vehicle position and road information, including road attributes such as road width, road length, road morphology, lane width, number of lanes, and lane type.
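The 20 cm x 20 cm grid representation can be sketched as an index mapping; the 50 m half-extent of the local map and the function name are illustrative assumptions:

```python
CELL_CM = 20  # each grid cell covers 20 cm x 20 cm

def to_cell(east_m, north_m, half_extent_m=50.0):
    """Map a point, given in metres relative to the vehicle at the centre of
    the local map, to (row, col) indices of the grid. Coordinates are
    converted to whole centimetres first so that cell indexing stays
    deterministic despite floating-point rounding."""
    if abs(east_m) > half_extent_m or abs(north_m) > half_extent_m:
        raise ValueError("point outside the local map window")
    col = round((east_m + half_extent_m) * 100) // CELL_CM
    row = round((north_m + half_extent_m) * 100) // CELL_CM
    return row, col
```

With the assumed 50 m half-extent, the vehicle itself falls in cell (250, 250) and each metre of offset moves the index by five cells.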
2) The camera and radar modules acquire road traffic environment information in real time from the vehicle-mounted camera and radar equipment, and extract lane lines and road boundaries. Lane-line and boundary detection use algorithms known in the art.
Step 2: evaluate the availability of the obtained lane lines and road boundaries.
Step 3: after the availability evaluation is passed, calculate the distance between the actually detected lane line or boundary and the vehicle against the local map generated by the map module, compare it with the result displayed in the local map to obtain the positioning deviation, and correct the deviation.
Step 4: retain the correction result for drawing and correcting the local map next time.
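The four steps above can be condensed into one correction cycle; the dictionary interfaces below are illustrative stand-ins for the modules' actual data structures:

```python
def positioning_cycle(local_map, detections, adjust):
    """One cycle of steps 1-4: skip detections that fail the availability
    evaluation (step 2), compare each detection distance with the map's
    display distance (step 3), and fold the mean deviation into the
    persistent correction used for the next local map (step 4)."""
    deviations = []
    for det in detections:
        if not det["usable"]:          # step 2: availability evaluation
            continue
        # step 3: detection distance minus display distance
        deviations.append(det["detected_dist"] - local_map[det["feature"]])
    if deviations:                     # step 4: retain the correction result
        adjust += sum(deviations) / len(deviations)
    return adjust
```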
For step 1, the data sources include:
the high-precision map is used for extracting road attribute information such as the width, the length and the form of a road, the number, the width and the type of lanes and road surface identification information such as a white solid line, a white dotted line, a sidewalk, a road isolation belt, a straight arrow and a left-turn arrow in each road in a driving target area of the unmanned vehicle. The embodiment adopts a user-defined format to manufacture a high-precision map;
and the laser radar is used for detecting the road boundary. In the embodiment, HDL-64E high-precision laser radar produced by Velodyne is adopted to scan the road environment in real time at 360 degrees, a three-dimensional model is constructed, and then road boundary information is extracted from the three-dimensional model;
and the camera is used for detecting the lane line. In the embodiment, a DFK 23G274 industrial camera produced by Mimex is adopted, the resolution is 640 x 480, a lane line detection algorithm is effectively supported, and the accuracy and the robustness of a detection result are ensured;
and the combined positioning system is used for providing satellite positioning information of the vehicle at a certain moment. The integrated navigation positioning system SPAN-CPT produced by NovAtel company is adopted in the embodiment, and the integrated navigation positioning system is a tightly coupled system integrating GPS and INS, can continuously and stably output positioning information and supports the creation of local maps.
For step 2, the evaluation includes lane-line evaluation and road-boundary evaluation:
Lane-line evaluation includes three indexes: the lane angle angle_camera, the vehicle-head angle angle_car, and the lane-width difference line_width. angle_camera represents the included angle between the lane line detected in real time and the lane line in the local map; angle_car represents the included angle between the vehicle-head direction detected in real time and the lane line in the local map; line_width represents the width difference between the lane line detected in real time and the lane line in the local map.
A threshold vector is set for the lane-line availability evaluation indexes. If any index exceeds its threshold, for example if angle_camera is greater than 5, the lane line is not usable and cannot be used for map correction.
The road boundary is fitted with a quadratic curve, and the availability evaluation indexes comprise the angle angleDT, the width widthDT, and the line-shape indexes leftDDT and rightDDT. angleDT represents the yaw angle of the road in the local map, widthDT represents the average width of the road, and leftDDT and rightDDT represent the shape deviations of the left and right boundaries, respectively.
According to the number of lane-line detection failures and the distance between the vehicle and the actually detected boundary, the availability evaluation of the boundary is divided into three grades: common, strict, and unlimited, each corresponding to a different threshold vector. If the camera processing module detects no lane line for 10 consecutive cycles, the strict evaluation and its threshold vector are used; otherwise the common evaluation and its threshold vector are used. Under the strict evaluation condition, if the distance from the left or right boundary of the actually detected road to the vehicle is less than 7, the unlimited evaluation is used.
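The grade selection just described follows directly from the stated rules (10 failed cycles, boundary distance below 7); only the function and constant names are invented here:

```python
STRICT_FAILURE_CYCLES = 10     # consecutive cycles with no lane line detected
UNLIMITED_BOUNDARY_DIST = 7.0  # boundary-to-vehicle distance bound

def boundary_evaluation_grade(lane_failures, left_dist, right_dist):
    """Choose the boundary availability-evaluation grade:
    strict after 10 consecutive lane-line detection failures, dropping to
    unlimited when either detected boundary is closer than 7 to the vehicle,
    and common otherwise."""
    if lane_failures >= STRICT_FAILURE_CYCLES:
        if min(left_dist, right_dist) < UNLIMITED_BOUNDARY_DIST:
            return "unlimited"
        return "strict"
    return "common"
```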
For step 3:
The detection distance, i.e. the distance between the vehicle and a lane line or boundary that has passed the evaluation, is calculated, and the corresponding display distance, i.e. the distance between the vehicle and the lane line or boundary in the map, is found. The two are not necessarily equal; their difference is the positioning deviation of the local map that needs to be corrected. The deviation correction method is as follows: if the lane-line deviation is less than or equal to the threshold 10, or the boundary deviation is greater than or equal to the threshold y, the correction is performed by equation (1) above. The correction result is permanently retained and used when the local map is next drawn. The threshold y takes a different value for each evaluation grade: y = 0 for unlimited evaluation, y = 3 for strict evaluation, and y = 5 for common evaluation.
The map module, the camera processing module, and the radar processing module are parallel to one another; they run in three separate threads and communicate through a public data storage space.
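A minimal sketch of this three-thread arrangement, with a locked dictionary standing in for the public data storage space (class, worker, and key names are all illustrative assumptions):

```python
import threading

class SharedStore:
    """Stand-in for the public data storage space through which the
    map, camera, and radar threads exchange their results."""
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def put(self, key, value):
        with self._lock:
            self._data[key] = value

    def get(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)

def run_modules(store):
    """Run three illustrative module workers in parallel threads,
    each publishing its result to the shared store."""
    def map_worker():
        store.put("local_map", {"lane_width": 3.5})

    def camera_worker():
        store.put("lane_line", "detected")

    def radar_worker():
        store.put("boundary", "detected")

    threads = [threading.Thread(target=w)
               for w in (map_worker, camera_worker, radar_worker)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Each worker owns its own task and touches the store only to publish or read results, mirroring the "communicate only when necessary" design.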
Fig. 2 is a flowchart of a camera processing module lane line correction algorithm, and fig. 3 is a flowchart of a radar processing module boundary correction algorithm.
In summary, the invention provides a high-precision positioning method for an unmanned vehicle based on multi-information fusion, applicable to the environment perception and intelligent decision-making of unmanned vehicles. The method obtains a local map of the vehicle's driving area through satellite positioning, detects road boundaries and lane lines online with a laser radar and a camera, evaluates the availability of the obtained boundary and lane-line information, and, once the actual detection result is confirmed usable, calculates the distance deviation between the actual detection result and the map display result. The deviation is corrected to obtain a new local map, which is sent to the decision system through the network, and the correction result is retained for the next correction. The method makes full use of the advantages of multi-information fusion, meets the unmanned vehicle's requirements for the real-time performance and accuracy of high-precision positioning, is robust, and adapts effectively to changes in the environment.
Aspects of the invention not described in detail are part of the common general knowledge of a person skilled in the art.
The foregoing is a detailed description of the present invention with reference to specific embodiments, but the present invention is not to be considered as limited to the specific embodiments. Numerous modifications and variations may be made thereto by those skilled in the art without departing from the principles and spirit of the invention, the scope of which is defined by the appended claims and their equivalents.
Claims (4)
1. A high-precision positioning method for an unmanned vehicle based on multi-information fusion is characterized by comprising the following steps: fusing information provided by the map module, the camera processing module and the radar processing module, and matching the position of the vehicle in the map with the real position of the vehicle in the environment through lane line correction and boundary correction, wherein:
the map module is used for drawing a local map which takes the vehicle as the center and is in a certain range according to the real-time positioning information, and the local map is used by the camera processing module and the radar processing module;
the camera processing module is used for acquiring original road surface data in the driving process of the vehicle by using a camera, extracting lane lines and using the lane lines for map matching and correction;
the radar processing module is used for collecting vehicle surrounding environment information by using a laser radar, extracting road boundaries and matching and correcting a map;
the radar processing module carries out usability evaluation on the detected road boundary before map matching and correction, and the evaluation indexes comprise: angle angleDT, width widthDT, and line type leftDDT, rightDDT,
angleDT: a deflection angle representing a road in the local map;
widthDT: represents the average width of the road;
leftDDT, rightDDT: respectively representing the shape deviation of the left boundary and the right boundary, and fitting the linear shape of the boundaries by using a quadratic curve;
2. The unmanned vehicle high-precision positioning method based on multi-information fusion according to claim 1, characterized in that: the camera processing module evaluates the availability of the detected lane lines before map matching and correction, the evaluation indexes comprising the lane angle angle_camera, the vehicle-head angle angle_car, and the lane-width difference line_width, wherein:
angle_camera: the included angle between the lane line detected in real time and the lane line in the local map;
angle_car: the included angle between the vehicle-head direction detected in real time and the lane line in the local map;
line_width: the width difference between the lane line detected in real time and the lane line in the local map.
3. The unmanned vehicle high-precision positioning method based on multi-information fusion as claimed in claim 2, characterized in that: a threshold vector is set for the lane-line availability evaluation indexes; if any index exceeds its threshold, the lane line is not usable and cannot be used for map correction.
4. The unmanned vehicle high-precision positioning method based on multi-information fusion, characterized in that: the detection distance, i.e. the distance between the vehicle and a lane line or boundary that has passed the evaluation, is calculated, and the corresponding display distance, i.e. the distance between the vehicle and the lane line or boundary in the map, is found; the detection distance and the display distance are not necessarily equal, and their difference is the positioning deviation of the local map that needs to be corrected, the deviation correction formula being:
Adjust_x+=offset*cos(rad);
Adjust_y+=offset*sin(rad); (1)
wherein:
Adjust_x: the deviation of the map in the true-east direction;
Adjust_y: the deviation of the map in the true-north direction;
offset: the lane-line or boundary deviation;
rad: the map yaw angle;
cos: the trigonometric cosine function;
sin: the trigonometric sine function;
the correction results Adjust_x and Adjust_y are permanently retained for use when the local map is next drawn.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201611261781.5A (CN106767853B) | 2016-12-30 | 2016-12-30 | Unmanned vehicle high-precision positioning method based on multi-information fusion |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN106767853A | 2017-05-31 |
| CN106767853B | 2020-01-21 |
Family ID: 58954866
CN111880527B (en) * | 2020-06-19 | 2022-12-27 | 中国煤炭科工集团太原研究院有限公司 | Robot control method for underground unmanned transport vehicle |
CN111721289B (en) * | 2020-06-28 | 2022-06-03 | 阿波罗智能技术(北京)有限公司 | Vehicle positioning method, device, equipment, storage medium and vehicle in automatic driving |
CN114022860A (en) * | 2020-07-16 | 2022-02-08 | 长沙智能驾驶研究院有限公司 | Target detection method and device and electronic equipment |
CN111932887B (en) * | 2020-08-17 | 2022-04-26 | 武汉四维图新科技有限公司 | Method and equipment for generating lane-level track data |
CN114200916A (en) * | 2020-08-26 | 2022-03-18 | 深圳市杉川机器人有限公司 | Self-moving equipment and method for returning to charging station |
CN114111780A (en) * | 2020-08-26 | 2022-03-01 | 深圳市杉川机器人有限公司 | Positioning error correction method, device, self-moving equipment and system |
CN112309233B (en) * | 2020-10-26 | 2022-09-30 | 北京三快在线科技有限公司 | Road boundary determining and road segmenting method and device |
CN112966059B (en) * | 2021-03-02 | 2023-11-24 | 北京百度网讯科技有限公司 | Data processing method and device for positioning data, electronic equipment and medium |
CN113280822B (en) * | 2021-04-30 | 2023-08-22 | 北京觉非科技有限公司 | Vehicle positioning method and positioning device |
CN114212085A (en) * | 2021-08-27 | 2022-03-22 | 南京林业大学 | Method for positioning lane for vehicle lane change on foggy expressway |
CN114234987B (en) * | 2021-11-05 | 2024-06-11 | 河北汉光重工有限责任公司 | Self-adaptive smooth adjustment method for dynamic track of offline electronic map along with unmanned vehicle |
CN114396959B (en) * | 2022-03-25 | 2022-08-30 | 华砺智行(武汉)科技有限公司 | Lane matching positioning method, device, equipment and medium based on high-precision map |
CN114812581B (en) * | 2022-06-23 | 2022-09-16 | 中国科学院合肥物质科学研究院 | Cross-country environment navigation method based on multi-sensor fusion |
CN115880673B (en) * | 2023-02-22 | 2023-05-26 | 西南石油大学 | Obstacle avoidance method and system based on computer vision |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1837753A (en) * | 2005-03-22 | 2006-09-27 | 赵志弘 | Map-matched automobile navigation method based on multiple information sources |
KR20110065786A (en) * | 2009-12-10 | 2011-06-16 | 이희성 | Information-chip boundary line system |
CN104677361A (en) * | 2015-01-27 | 2015-06-03 | 福州华鹰重工机械有限公司 | Comprehensive positioning method |
CN105489035A (en) * | 2015-12-29 | 2016-04-13 | 大连楼兰科技股份有限公司 | Detection method of traffic lights applied to active drive technology |
CN106096525A (en) * | 2016-06-06 | 2016-11-09 | 重庆邮电大学 | A kind of compound lane recognition system and method |
- 2016-12-30: Application CN201611261781.5A filed in China; granted as CN106767853B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN106767853A (en) | 2017-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106767853B (en) | Unmanned vehicle high-precision positioning method based on multi-information fusion | |
Suhr et al. | Sensor fusion-based low-cost vehicle localization system for complex urban environments | |
CN110631593B (en) | Multi-sensor fusion positioning method for automatic driving scene | |
Rose et al. | An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS | |
Schreiber et al. | Laneloc: Lane marking based localization using highly accurate maps | |
US20190034730A1 (en) | Systems and methods for providing vehicle cognition | |
Tao et al. | Lane marking aided vehicle localization | |
Brenner | Extraction of features from mobile laser scanning data for future driver assistance systems | |
US7970529B2 (en) | Vehicle and lane recognizing device | |
KR101454153B1 (en) | Navigation system for unmanned ground vehicle by sensor fusion with virtual lane | |
CN110530372B (en) | Positioning method, path determining device, robot and storage medium | |
Xiao et al. | Monocular vehicle self-localization method based on compact semantic map | |
Wang et al. | Vehicle localization at an intersection using a traffic light map | |
CN104240536A (en) | Lane monitoring method with electronic horizon | |
CN112904395B (en) | Mining vehicle positioning system and method | |
CN102208013A (en) | Scene matching reference data generation system and position measurement system | |
Shunsuke et al. | GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon | |
JP4596566B2 (en) | Self-vehicle information recognition device and self-vehicle information recognition method | |
CN112346103A (en) | V2X-based intelligent networking automobile dynamic co-location method and device | |
CN111717244A (en) | Train automatic driving sensing method and system | |
Zinoune et al. | Detection of missing roundabouts in maps for driving assistance systems | |
US11287281B2 (en) | Analysis of localization errors in a mobile object | |
CN110095776B (en) | Method for determining the presence and/or the characteristics of an object and surrounding identification device | |
Yuan et al. | Estimation of vehicle pose and position with monocular camera at urban road intersections | |
Youssefi et al. | Visual and light detection and ranging-based simultaneous localization and mapping for self-driving cars |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |