KR101837821B1 - Method for estimating position using multi-structure filter and System thereof - Google Patents
- Publication number
- KR101837821B1, KR1020160017838A
- Authority
- KR
- South Korea
- Prior art keywords
- information
- unmanned vehicle
- navigation
- filter
- generating
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Navigation (AREA)
Abstract
The present invention relates to a position estimation technique, and more particularly, to a method and system for performing position estimation of an autonomous unmanned vehicle more accurately using map information and/or environment recognition information.
Description
The present invention relates to a position estimation technique, and more particularly, to a method and system for performing position estimation of an autonomous unmanned vehicle more accurately using map information and/or environment recognition information.
Generally, a GPS (Global Positioning System), an INS (Inertial Navigation System), a digital compass, and image/terrain information are used for position estimation of autonomous vehicles in a ground environment. This is disclosed in a reference published by Wiley-Interscience, 2007, which is incorporated herein by reference in its entirety.
In addition, a centralized filter or a federated filter is used as an integrated navigation filter to fuse the individual sensor information. This is disclosed in "Multi-Sensor Centralized Fusion without Measurement Noise Covariance by Variational Bayesian Approximation," IEEE Transactions on Aerospace and Electronic Systems, vol. 47, no. 1, 2011, and in Q. T. Gu and J. Fang, "Global Optimality for Generalized Federated Filter," Acta Automatica Sinica, vol. 35, no. 10, pp. 1310-1316, 2009.
These two filter structures generally use the INS as the base navigation and other sensors such as GPS as the correction navigation for compensating the INS error. However, in a ground environment the accuracy of the correction navigation, including GPS, is strongly influenced by the surroundings (for example, signal loss due to GPS jamming, or a lack of features for image matching), so that the navigation error grows rapidly while the INS performs pure navigation.
For position estimation, the navigation information generally used in passenger cars includes position information on road shapes and/or features. Such navigation information cannot be used directly in unmanned vehicles, but much related research has been carried out to extract the necessary information and use it as map information for autonomous unmanned vehicle driving. This is disclosed in Y. J. Wu, Y. H. Wang, and D. L. Q, "A Google-Map-Based Artificial Traffic Information System," Proc. of the 2007 IEEE Intelligent Transportation Systems Conference, pp. 968-973, 2007.
Therefore, there is a need to further improve navigation accuracy, by using such map information and/or environment recognition information, in an environment where the INS performs pure navigation.
The present invention has been proposed to solve the problems of the background art described above, and an object of the present invention is to provide a position estimation method using a multi-structure filter, and a system therefor, which improve the position accuracy of an unmanned vehicle by using a multi-structure filter based on map information and/or environment recognition information.
To this end, the present invention provides a position estimation system using a multi-structure filter that improves the position accuracy of an unmanned vehicle by means of a multi-structure filter based on map information and/or environment recognition information.
The position estimation system comprises:
A navigation sensor mounted on an unmanned vehicle for measuring a plurality of navigation information;
An unmanned vehicle environment recognition information unit for generating unmanned vehicle environment recognition information for the unmanned vehicle;
An unmanned vehicle autonomous driving map unit for generating map information for autonomous driving of the unmanned vehicle according to the unmanned vehicle environment recognition information, and for generating feature position information by matching the unmanned vehicle environment recognition information with the map information; And
A multi-structure filter that generates real-time position information according to real-time integrated navigation obtained by integrating the plurality of navigation information in real time, generates current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generates movement trajectory estimation information using the feature position information.
At this time, the multi-structure filter includes: a global filter for generating the real-time position information according to the real-time integrated navigation obtained by integrating the plurality of navigation information in real time; a forward filter for generating the current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information; and a backward filter for generating the movement trajectory estimation information using the feature position information.
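For illustration only, and not as part of the disclosed embodiments, the following is a minimal Python sketch of how such a three-filter structure could be composed; the class, field, and method names (MultiStructureFilter, global_filter, forward_filter, backward_filter, step) are assumptions of this sketch, not terms taken from the patent.

```python
# Minimal structural sketch (illustrative assumptions only): the three
# sub-filters are held in one container; the global filter runs continuously,
# while the forward and backward filters are (re)started whenever a feature
# position is obtained by matching environment recognition information
# against the map information.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Point = Tuple[float, float]

@dataclass
class MultiStructureFilter:
    global_filter: Callable[[], Point]               # real-time integrated navigation
    forward_filter: Callable[[Point], Point]         # feature position -> current position
    backward_filter: Callable[[Point], List[Point]]  # feature position -> past trajectory

    def step(self, feature_pos: Optional[Point]) -> Point:
        realtime_pos = self.global_filter()
        if feature_pos is None:
            return realtime_pos                      # no recognition: keep integrated solution
        # A feature was matched: re-estimate the current position (forward filter)
        # and the past movement trajectory (backward filter); both are then
        # conceptually re-initialized for the next recognition.
        current_estimate = self.forward_filter(feature_pos)
        _past_trajectory = self.backward_filter(feature_pos)
        return current_estimate

# Usage with trivial stand-ins for the three filters:
msf = MultiStructureFilter(global_filter=lambda: (10.0, 0.0),
                           forward_filter=lambda p: (p[0] + 1.0, p[1]),
                           backward_filter=lambda p: [p, (p[0] + 1.0, p[1])])
print(msf.step(None))          # -> (10.0, 0.0)
print(msf.step((9.2, 0.1)))    # -> (10.2, 0.1)
```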
The forward filter extracts features from recognition information obtained by an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the current estimated position information, that is, the current position of the unmanned vehicle, using navigation sensor information stored in advance in a database from the initial position.
The backward filter likewise extracts features from recognition information obtained by the environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the movement trajectory estimation information for the unmanned vehicle using the navigation sensor information previously stored in the database from the initial position.
In addition, the navigation sensor information may include navigation information and vehicle speed information.
Also, the backward filter or the forward filter is initialized each time the current estimated position information is generated by estimating the current position of the unmanned vehicle, or each time the movement trajectory estimation information is generated using the feature position information.
In addition, the position estimation system may further include a navigation manager that performs position correction by comparing the current estimated position information of the unmanned vehicle, which is generated using the navigation sensor information with the feature position information obtained by matching the unmanned vehicle environment recognition information and the map information as the start position, with the real-time position information.
In addition, the movement trajectory estimation information may be used to estimate the block on the current map in which the unmanned vehicle is located, through matching between the previous movement trajectory of the unmanned vehicle estimated through the backward filter and the map trajectory in the map information for unmanned vehicle autonomous driving.
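Purely as an illustration of this trajectory-to-map matching idea, and not the matching algorithm actually claimed, the sketch below scores an estimated movement trajectory against candidate map-block trajectories by average point-to-point distance; the function names, the distance criterion, and the assumption of equally sampled polylines are all assumptions of the sketch.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def trajectory_distance(traj: List[Point], map_traj: List[Point]) -> float:
    # Compare two equally sampled polylines by average point-to-point distance.
    n = min(len(traj), len(map_traj))
    return sum(math.dist(a, b) for a, b in zip(traj[-n:], map_traj[-n:])) / n

def match_block(traj: List[Point], map_blocks: Dict[str, List[Point]]) -> str:
    # Pick the map block whose stored trajectory best matches the estimated one.
    return min(map_blocks, key=lambda name: trajectory_distance(traj, map_blocks[name]))

# Example: an estimated trajectory matched against two candidate blocks.
estimated = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)]
blocks = {"block_A": [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)],
          "block_B": [(0.0, 5.0), (1.0, 5.0), (2.0, 5.0)]}
print(match_block(estimated, blocks))  # -> block_A
```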
On the other hand, another embodiment of the present invention provides a position estimation method using a multi-structure filter, the method comprising: measuring a plurality of navigation information through a navigation sensor mounted on an unmanned vehicle; generating unmanned vehicle environment recognition information for the unmanned vehicle; generating, by an unmanned vehicle autonomous driving map unit, map information for autonomous driving of the unmanned vehicle according to the unmanned vehicle environment recognition information, and generating feature position information by matching the unmanned vehicle environment recognition information with the map information; generating, by a multi-structure filter, real-time position information according to real-time integrated navigation obtained by integrating the plurality of navigation information in real time; and generating current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generating movement trajectory estimation information using the feature position information.
According to the present invention, when an unmanned vehicle autonomously travels in an environment where GPS (Global Positioning System) signals are blocked, a multi-structure filter is implemented based on the map information and/or environment recognition information created for autonomous unmanned vehicle driving, so that the INS (Inertial Navigation System) navigation error can be corrected and the reliability of autonomous driving can be improved.
Another advantage of the present invention is that the position of the autonomous unmanned vehicle can be estimated more accurately.
FIG. 1 is a conceptual diagram of a multi-filter structure position estimation system according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram of signal processing of the forward filter shown in FIG. 1.
FIG. 3 is a conceptual diagram of signal processing of the backward filter shown in FIG. 1.
FIG. 4 is a conceptual diagram of signal processing of the multi-structure filter shown in FIG. 1.
FIG. 5 is a flowchart illustrating a process of correcting a navigation error using a forward filter according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a process of correcting a navigation error using a backward filter according to an embodiment of the present invention.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It is to be understood, however, that the invention is not to be limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Like reference numerals are used for similar elements in describing each drawing.
The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.
For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component. The term "and / or" includes any combination of a plurality of related listed items or any of a plurality of related listed items.
Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their contextual meanings in the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a method for estimating a position using a multi-structure filter according to an embodiment of the present invention and a system thereof will be described in detail with reference to the accompanying drawings.
FIG. 1 is a conceptual diagram of a multi-filter structure position estimation system 100 according to an embodiment of the present invention. Referring to FIG. 1, the position estimation system 100 includes a navigation sensor 110 mounted on an unmanned vehicle for measuring a plurality of navigation information, a first signal processor 120 for processing the measured navigation information, a multi-structure filter 130, an unmanned vehicle environment recognition information unit 140, and an unmanned vehicle autonomous driving map unit 150.
The multi-structure filter 130 is composed of a global filter 131, a forward filter 132, and a backward filter 133.
The global filter 131 generates real-time position information according to real-time integrated navigation obtained by integrating the plurality of navigation information in real time.
The forward filter 132 generates current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, and the backward filter 133 generates movement trajectory estimation information using the feature position information.
The unmanned vehicle environment recognition information unit 140 generates unmanned vehicle environment recognition information by extracting features from recognition information obtained by an environment recognition sensor.
The unmanned vehicle autonomous driving map unit 150 holds the map information for autonomous driving of the unmanned vehicle and generates the feature position information by matching the unmanned vehicle environment recognition information with the map information.
FIG. 2 is a conceptual diagram of signal processing of the forward filter shown in FIG. 1. Referring to FIG. 2, the unmanned vehicle environment recognition information unit 140 extracts features from the recognition information obtained by the environment recognition sensor to generate the unmanned vehicle environment recognition information, and the feature position information is generated by matching the unmanned vehicle environment recognition information with the map information for autonomous driving.
The forward filter 132 sets the feature position information determined by the matching as the initial position.
At this time, because of the processing time of the algorithm, the initial position corresponds to a previous position of the unmanned vehicle, not its current position. Next, the current position of the unmanned vehicle (or unmanned robot) is estimated using the navigation sensor information (IMU, VMS) stored in the database.
That is, in the navigation sensor information, the position information is generated by the IMU (Inertial Measurement Unit) 221 and the vehicle speed information is generated by the speed sensor 222, and this information is stored in advance in the database.
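The following Python sketch illustrates, under assumed interfaces (a simple in-memory buffer of timestamped heading and speed samples, and a flat 2-D model), how the buffered IMU/VMS data could be replayed from the feature-recognition time to obtain a current position estimate; it is not the implementation disclosed in the patent.

```python
# Illustrative sketch (assumed interfaces): IMU heading and VMS speed samples
# are buffered in a small in-memory "database"; when a feature position is
# matched at time t_feature, the forward filter replays the buffered samples
# from t_feature up to the present time to obtain the current position.
import math
from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp, heading_rad from IMU, speed_mps from VMS)

def replay_forward(feature_pos: Tuple[float, float],
                   t_feature: float,
                   buffer: List[Sample]) -> Tuple[float, float]:
    x, y = feature_pos
    prev_t = t_feature
    for t, heading, speed in buffer:
        if t <= t_feature:
            continue                 # only samples newer than the feature fix
        dt = t - prev_t
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        prev_t = t
    return (x, y)

# Usage: a feature was recognized at t = 10.0 s at (100.0, 50.0); replaying the
# buffered IMU/VMS samples yields the position at the latest sample time.
buffer = [(10.5, 0.0, 2.0), (11.0, 0.0, 2.0), (11.5, math.pi / 2, 2.0)]
print(replay_forward((100.0, 50.0), 10.0, buffer))  # -> (102.0, 51.0)
```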
FIG. 3 is a conceptual diagram of signal processing of the backward filter shown in FIG. 1. Referring to FIG. 3, the backward filter operates similarly to the forward filter. However, in an environment where the INS performs pure navigation, the backward filter uses the information calculated by the IMU (Inertial Measurement Unit), the vehicle speed calculated by the odometer (VMS), and the feature position to estimate the past movement trajectory.
In detail, the backward filter 133 sets the feature position information determined by the matching as the initial position.
At this time, the initial position corresponds to a previous position of the unmanned vehicle (not shown), not its current position, because of the processing time of the algorithm. Next, using the navigation sensor information (IMU, VMS) stored in the database, the past trajectory of the unmanned vehicle is estimated to generate the movement trajectory estimation information.
That is, in the navigation sensor information, the position information is generated by the IMU (Inertial Measurement Unit) 221 and the vehicle speed information is generated by the speed sensor 222, and this information is stored in advance in the database.
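For comparison with the forward-filter sketch above, and again under assumed interfaces rather than the disclosed implementation, the backward-filter counterpart below keeps every intermediate position so that a time-stamped past trajectory is produced instead of only the latest position; this trajectory is what would later be matched against the map trajectory.

```python
# Backward-filter counterpart (illustrative assumptions): the same replay as in
# the forward-filter sketch, but every intermediate pose is kept so that a
# time-stamped past trajectory is produced for later map matching.
import math
from typing import List, Tuple

Sample = Tuple[float, float, float]   # (timestamp, heading_rad from IMU, speed_mps from VMS)
Pose = Tuple[float, float, float]     # (timestamp, x, y)

def replay_trajectory(feature_pos: Tuple[float, float],
                      t_feature: float,
                      buffer: List[Sample]) -> List[Pose]:
    x, y = feature_pos
    prev_t = t_feature
    traj: List[Pose] = [(t_feature, x, y)]
    for t, heading, speed in buffer:
        if t <= t_feature:
            continue                  # only samples newer than the feature fix
        dt = t - prev_t
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        traj.append((t, x, y))
        prev_t = t
    return traj

# The poses of this trajectory are what would be compared against the map
# trajectory of each candidate block.
print(replay_trajectory((100.0, 50.0), 10.0, [(10.5, 0.0, 2.0), (11.0, 0.0, 2.0)]))
```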
FIG. 4 is a conceptual diagram of signal processing of the multi-structure filter shown in FIG. 1. Referring to FIG. 4, when the correction navigation information such as GPS (Global Positioning System), DGPS (Differential GPS), and VMS measurements is in a normal state, integrated navigation is operated based on the global filter 131, which fuses the plurality of navigation information measured by the navigation sensor 110 in real time to generate the real-time position information.
The unit sensor information, such as the outputs of the IMU 221 and the speed sensor 222, is processed by the first signal processor 120 and the second signal processor 212 and stored in the database so that it can later be used by the forward filter 132 and the backward filter 133.
In the case of the forward filter 132, the current estimated position information is generated by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information. In the case of the backward filter 133, the movement trajectory estimation information is generated using the feature position information.
In an environment where the INS performs pure navigation, the current estimated position information generated by the forward filter 132 and the movement trajectory estimation information generated by the backward filter 133 are used to correct the navigation error of the integrated navigation.
Each time the current estimated position information or the movement trajectory estimation information is generated, the forward filter 132 or the backward filter 133 is initialized.
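As a rough illustration of the real-time integrated navigation role of the global filter, the following is a minimal loosely coupled sketch assuming a 2-D position state, dead-reckoning prediction from IMU/VMS velocity, and position-only GPS updates; it is not the patented filter design, and the class name, noise values, and measurement model are assumptions.

```python
# Minimal loosely-coupled fusion sketch (illustrative assumptions): the filter
# predicts by dead reckoning with IMU/VMS velocity and corrects with GPS
# position fixes.  When GPS is blocked, only the prediction step runs, so the
# position uncertainty grows until a correction source becomes available again.
import numpy as np

class SimpleGlobalFilter:
    def __init__(self, x0, p0=1.0, q=0.1, r=4.0):
        self.x = np.asarray(x0, dtype=float)   # position estimate [x, y]
        self.P = np.eye(2) * p0                # estimate covariance
        self.Q = np.eye(2) * q                 # process noise (dead reckoning)
        self.R = np.eye(2) * r                 # GPS measurement noise

    def predict(self, velocity, dt):
        self.x = self.x + np.asarray(velocity, dtype=float) * dt
        self.P = self.P + self.Q * dt

    def update_gps(self, z):
        S = self.P + self.R                    # innovation covariance (H = I)
        K = self.P @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.x)
        self.P = (np.eye(2) - K) @ self.P

f = SimpleGlobalFilter([0.0, 0.0])
f.predict(velocity=[2.0, 0.0], dt=1.0)         # dead reckoning step
f.update_gps([2.3, 0.1])                       # correction when GPS is normal
print(f.x)
```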
FIG. 5 is a flowchart illustrating a process of correcting a navigation error using the forward filter according to an embodiment of the present invention. Referring to FIG. 5, the unmanned vehicle environment recognition information unit 140 extracts features from the recognition information obtained by the environment recognition sensor to generate the unmanned vehicle environment recognition information (step S510).
Thereafter, the initial position of the unmanned vehicle (not shown) is set (step S520). That is, the feature position information is generated by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, and this feature position information is set as the initial position.
Thereafter, the navigation sensor data (IMU, VMS) stored in the database from the time at which the feature was recognized up to the present time is loaded (step S530). Then, IMU/VMS integrated navigation is performed with the feature position information as the start position to calculate the current estimated position information of the unmanned vehicle (steps S540 and S550).
Thereafter, the navigation position estimated by the global filter 131 is compared with the current estimated position information of the unmanned vehicle, and the position is corrected accordingly (steps S560 and S570).
Every time the environment recognition information is updated, steps S510 to S570 are newly performed.
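A minimal sketch of the correction step around steps S560 to S570 follows, assuming that the difference between the forward-filter estimate and the real-time integrated position is applied directly as the position correction; the actual correction logic of the patent may differ, and all names are assumptions.

```python
# Correction step sketch (assumed logic): the forward-filter estimate anchored
# on the matched feature position is compared with the real-time integrated
# navigation position, and the difference is fed back as a position correction.
from typing import Tuple

Point = Tuple[float, float]

def position_correction(forward_estimate: Point, realtime_estimate: Point) -> Point:
    # Compute the correction to apply to the integrated navigation solution.
    return (forward_estimate[0] - realtime_estimate[0],
            forward_estimate[1] - realtime_estimate[1])

# Example: the forward filter says (102.0, 51.0) while real-time integrated
# navigation has drifted to (103.5, 50.2); the correction pulls it back.
dx, dy = position_correction((102.0, 51.0), (103.5, 50.2))
corrected = (103.5 + dx, 50.2 + dy)
print(corrected)   # -> approximately (102.0, 51.0)
```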
FIG. 6 is a flowchart illustrating a process of correcting a navigation error using the backward filter according to an embodiment of the present invention. Referring to FIG. 6, the unmanned vehicle environment recognition information unit 140 extracts features from the recognition information obtained by the environment recognition sensor to generate the unmanned vehicle environment recognition information (step S610).
Thereafter, the initial position of the unmanned vehicle (not shown) is set (step S620). That is, the feature position information is generated by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, and this feature position information is set as the initial position.
Thereafter, the navigation sensor data (IMU, VMS) stored in the database is loaded, covering the interval from the time at which the feature was previously recognized up to the present time (step S630). Then, IMU/VMS integrated navigation is performed with the feature position information as the start position to generate the movement trajectory estimation information of the unmanned vehicle over that interval (steps S640 and S650).
Thereafter, the block on the current map in which the unmanned vehicle is located is estimated through matching between the previous movement trajectory of the unmanned vehicle estimated through the backward filter 133 and the map trajectory in the map information, and the position is corrected accordingly (steps S660 and S670).
In addition, steps S610 to S670 are newly performed every time the environment recognition information is updated.
100: Position estimation system
110: Navigation sensor
120: first signal processor
130: Multi-structure filter
131: Global filter 132: Forward filter
133: Backward filter
140: Unmanned vehicle environment recognition information unit
150: Unmanned vehicle autonomous driving map unit
212: second signal processor
221: Inertial Measurement Unit (IMU)
222: Speed sensor
Claims (9)
A position estimation system using a multi-structure filter, the system comprising:
A navigation sensor mounted on an unmanned vehicle for measuring a plurality of navigation information;
An unmanned vehicle environment recognition information unit for generating unmanned vehicle environment recognition information for the unmanned vehicle;
An unmanned vehicle autonomous driving map unit for generating map information for autonomous driving of the unmanned vehicle according to the unmanned vehicle environment recognition information, and for generating feature position information by matching the unmanned vehicle environment recognition information with the map information; And
A multi-structure filter that generates real-time position information according to real-time integrated navigation obtained by integrating the plurality of navigation information in real time, generates current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generates movement trajectory estimation information using the feature position information,
Wherein the multi-structure filter includes:
A global filter for generating the real-time position information according to the real-time integrated navigation obtained by integrating the plurality of navigation information in real time;
A forward filter for generating the current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information; And
A backward filter for generating the movement trajectory estimation information using the feature position information,
Wherein the forward filter
extracts features from recognition information recognized using an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the current estimated position information, which is the current position of the unmanned vehicle, using navigation sensor information stored in advance in a database from the initial position.
The backward filter extracts features from recognition information recognized using the environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the movement trajectory estimation information for the unmanned vehicle using the navigation sensor information previously stored in the database from the initial position, in the position estimation system using a multi-structure filter.
Wherein the navigation sensor information includes navigation information and vehicle speed information.
Wherein the backward filter or the forward filter is initialized whenever the current estimated position information is generated by estimating the current position of the unmanned vehicle or the movement trajectory estimation information is generated using the feature position information, in the position estimation system using a multi-structure filter.
Further comprising a navigation manager for performing position correction by comparing the current estimated position information of the unmanned vehicle, which is generated using the navigation sensor information with the feature position information obtained by matching the unmanned vehicle environment recognition information and the map information for unmanned vehicle autonomous driving as the start position, with the real-time position information.
Wherein the movement trajectory estimation information is used to estimate the block on the current map in which the unmanned vehicle is located, through matching between the previous movement trajectory of the unmanned vehicle estimated through the backward filter and the map trajectory in the map information for unmanned vehicle autonomous driving, and the position information is generated accordingly, in the position estimation system using a multi-structure filter.
A position estimation method using a multi-structure filter, the method comprising:
Measuring a plurality of navigation information through a navigation sensor mounted on an unmanned vehicle;
Generating unmanned vehicle environment recognition information for the unmanned vehicle;
Generating, by an unmanned vehicle autonomous driving map unit, map information for autonomous driving of the unmanned vehicle according to the unmanned vehicle environment recognition information, and generating feature position information by matching the unmanned vehicle environment recognition information with the map information;
Generating, by a multi-structure filter, real-time position information according to real-time integrated navigation obtained by integrating the plurality of navigation information in real time; And
Generating current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generating movement trajectory estimation information using the feature position information,
Wherein the generating of the real-time position information includes:
Generating, by a global filter, the real-time position information according to the real-time integrated navigation obtained by integrating the plurality of navigation information in real time; And
Estimating, by a forward filter, the current position of the unmanned vehicle based on the real-time position information and the feature position information to generate the current estimated position information,
Wherein the generating of the movement trajectory estimation information includes:
Generating, by a backward filter, the movement trajectory estimation information using the feature position information,
Wherein the forward filter
extracts features from recognition information recognized using an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information with the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the current estimated position information, which is the current position of the unmanned vehicle, using navigation sensor information stored in advance in a database from the initial position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160017838A KR101837821B1 (en) | 2016-02-16 | 2016-02-16 | Method for estimating position using multi-structure filter and System thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160017838A KR101837821B1 (en) | 2016-02-16 | 2016-02-16 | Method for estimating position using multi-structure filter and System thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170096448A KR20170096448A (en) | 2017-08-24 |
KR101837821B1 true KR101837821B1 (en) | 2018-03-12 |
Family
ID=59758186
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160017838A KR101837821B1 (en) | 2016-02-16 | 2016-02-16 | Method for estimating position using multi-structure filter and System thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101837821B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107544504B (en) * | 2017-09-26 | 2020-08-21 | 河南科技学院 | Disaster area rescue robot autonomous detection system and method for complex environment |
TWI724686B (en) * | 2019-12-13 | 2021-04-11 | 國立成功大學 | Positioning and orientation system and positioning and orientation method using high definition maps |
CN116142219A (en) * | 2022-12-21 | 2023-05-23 | 广州沃芽科技有限公司 | Performance evaluation method, device, computer apparatus, storage medium, and program product |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101177374B1 (en) * | 2010-07-27 | 2012-08-29 | 한양대학교 산학협력단 | Method for estimating position of vehicle using Interacting Multiple Model filter |
WO2014076844A1 (en) * | 2012-11-19 | 2014-05-22 | 株式会社日立製作所 | Autonomous movement system and control device |
-
2016
- 2016-02-16 KR KR1020160017838A patent/KR101837821B1/en active IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101177374B1 (en) * | 2010-07-27 | 2012-08-29 | 한양대학교 산학협력단 | Method for estimating position of vehicle using Interacting Multiple Model filter |
WO2014076844A1 (en) * | 2012-11-19 | 2014-05-22 | 株式会社日立製作所 | Autonomous movement system and control device |
Also Published As
Publication number | Publication date |
---|---|
KR20170096448A (en) | 2017-08-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Atia et al. | A low-cost lane-determination system using GNSS/IMU fusion and HMM-based multistage map matching | |
KR102463176B1 (en) | Device and method to estimate position | |
KR20220033477A (en) | Appratus and method for estimating the position of an automated valet parking system | |
US7868821B2 (en) | Method and apparatus to estimate vehicle position and recognized landmark positions using GPS and camera | |
US20210248768A1 (en) | Generation of Structured Map Data from Vehicle Sensors and Camera Arrays | |
CN112304302A (en) | Multi-scene high-precision vehicle positioning method and device and vehicle-mounted terminal | |
KR20180106417A (en) | System and Method for recognizing location of vehicle | |
US20220011118A1 (en) | Systems and Methods for Map Matching | |
JP5028662B2 (en) | Road white line detection method, road white line detection program, and road white line detection device | |
KR101704405B1 (en) | System and method for lane recognition | |
CN102565832A (en) | Method of augmenting GPS or gps/sensor vehicle positioning using additional in-vehicle vision sensors | |
WO2008012997A1 (en) | Positioning device, and navigation system | |
CN113405545B (en) | Positioning method, positioning device, electronic equipment and computer storage medium | |
KR102660497B1 (en) | System for positioning location information of car | |
US20190331496A1 (en) | Locating a vehicle | |
KR20190109645A (en) | Apparatus for determining position of vehicle and method thereof | |
KR101837821B1 (en) | Method for estimating position using multi-structure filter and System thereof | |
CN114264301B (en) | Vehicle-mounted multi-sensor fusion positioning method, device, chip and terminal | |
KR20210073281A (en) | Method and apparatus for estimating motion information | |
CN113405555A (en) | Automatic driving positioning sensing method, system and device | |
KR102336070B1 (en) | Apparatus for determining position of vehicle and method thereof | |
EP3051255B1 (en) | Survey data processing device, survey data processing method, and program therefor | |
KR101397933B1 (en) | Device for creating traveling information of unmanned ground vehicle, unmanned ground vehicle having the same and control method thereof | |
JP2018169207A (en) | Vehicle position detector | |
Verentsov et al. | Bayesian framework for vehicle localization using crowdsourced data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
GRNT | Written decision to grant |