KR101837821B1 - Method for estimating position using multi-structure filter and System thereof - Google Patents

Method for estimating position using multi-structure filter and System thereof

Info

Publication number
KR101837821B1
Authority
KR
South Korea
Prior art keywords
information
unmanned vehicle
navigation
filter
generating
Prior art date
Application number
KR1020160017838A
Other languages
Korean (ko)
Other versions
KR20170096448A (en)
Inventor
최지훈
박용운
주상현
김종희
Original Assignee
국방과학연구소
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 국방과학연구소 filed Critical 국방과학연구소
Priority to KR1020160017838A priority Critical patent/KR101837821B1/en
Publication of KR20170096448A publication Critical patent/KR20170096448A/en
Application granted granted Critical
Publication of KR101837821B1 publication Critical patent/KR101837821B1/en


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to a position estimation technique, and more particularly, to a method and system for more accurately performing position estimation of an autonomous vehicle using map information and/or environment information.

Description

Field of the Invention

The present invention relates to a position estimation technique, and more particularly, to a method and system for more accurately performing position estimation of an autonomous vehicle using map information and/or environment information.

Generally, GPS (Global Positioning System), INS (Inertial Navigation System), digital compass, and image/terrain information are used for position estimation of autonomous vehicles in the ground environment. This is disclosed in M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning Systems, Inertial Navigation and Integration, 2nd edition, New Jersey: Wiley-Interscience, 2007.

In addition, a centralized filter or a federated filter is used as the integrated navigation filter to fuse the individual sensor information. This is disclosed in X. B. Gao, J. G. Chen, D. C. Tao, and X. L. Li, "Multi-Sensor Centralized Fusion without Measurement Noise Covariance by Variational Bayesian Approximation," IEEE Transactions on Aerospace and Electronic Systems, vol. 47, no. 1, pp. 718-272, 2011, and in Q. T. Gu and J. Fang, "Global Optimality for Generalized Federated Filter," ACTA AUTOMATICA SINICA, vol. 35, no. 10, pp. 1310-1316, 2009.

These two filters generally use the INS as the base navigation and other sensors such as GPS as the correction navigation that corrects the INS error. However, since the accuracy of the correction navigation, including GPS, is greatly influenced by the surrounding environment in the terrestrial setting (for example, signal loss due to GPS jamming, or a lack of features for image matching), there is a disadvantage that the navigation error greatly increases while the INS performs pure navigation.
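This weakness can be illustrated with a minimal one-dimensional sketch in Python (all noise values and the simple Kalman-style update are illustrative assumptions, not the filter disclosed herein): the estimation error stays bounded while a correction measurement such as GPS is being fused, but grows once the INS runs alone.

    import random

    # Minimal 1-D illustration (assumed values): INS dead reckoning drifts,
    # while fusing a GPS-like correction keeps the error bounded.
    def simulate(gps_available, steps=200, dt=0.1):
        true_pos, est_pos, var = 0.0, 0.0, 1.0
        q, r = 0.2, 2.0   # assumed process-noise std dev, GPS noise variance
        for _ in range(steps):
            v = 1.0                                      # true velocity
            true_pos += v * dt
            est_pos += (v + random.gauss(0.0, q)) * dt   # drifting INS step
            var += q * q * dt                            # uncertainty grows
            if gps_available:
                gps = true_pos + random.gauss(0.0, r ** 0.5)
                k = var / (var + r)                      # Kalman gain
                est_pos += k * (gps - est_pos)           # correction update
                var *= 1.0 - k                           # uncertainty shrinks
        return abs(true_pos - est_pos)

    print("final error with correction navigation:", simulate(True))
    print("final error with pure INS navigation  :", simulate(False))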

For position estimation, the navigation information generally used in a car includes positional information on road shapes and/or features. This navigation information cannot be used directly in unmanned vehicles, but much related research has been carried out to extract the necessary information and utilize it as map information for autonomous unmanned vehicle driving. This is disclosed in Y. J. Wu, Y. H. Wang, and D. L. Q, "A Google-Map-Based Artificial Traffic Information System," Proc. of the 2007 IEEE Intelligent Transportation Systems Conference, pp. 968-973, 2007.

Therefore, it is necessary to further improve the navigation accuracy, in an environment in which the INS performs pure navigation, by using the map information and/or the environment recognition information.

1. Korean Patent Publication No. 10-2012-0010708
2. Korean Registered Patent No. 10-1390776 (Apr. 24, 2014)

1. M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning Systems, Inertial Navigation and Integration, 2nd edition, New Jersey: Wiley-Interscience, 2007.
2. X. B. Gao, J. G. Chen, D. C. Tao, and X. L. Li, "Multi-Sensor Centralized Fusion without Measurement Noise Covariance by Variational Bayesian Approximation," IEEE Transactions on Aerospace and Electronic Systems, vol. 47, no. 1, pp. 718-272, 2011.
3. Q. T. Gu and J. Fang, "Global Optimality for Generalized Federated Filter," ACTA AUTOMATICA SINICA, vol. 35, no. 10, pp. 1310-1316, 2009.

The present invention has been proposed to solve the problems of the above background art, and it is an object of the present invention to provide a position estimation method using a multi-structure filter, and a system therefor, which improve the position accuracy of an unmanned vehicle by using a multi-structure filter based on map information and/or environment recognition information.

The present invention provides a position estimation system using a multi-structure filter that improves the position accuracy of an unmanned vehicle using a multi-structure filter based on map information and/or environment recognition information.

The position estimation system comprises:

A navigation sensor mounted on an unmanned vehicle for measuring a plurality of navigation information;

An unmanned vehicle environment recognition information unit for generating unmanned vehicle environment recognition information for the unmanned vehicle;

An unmanned vehicle autonomous driving guidance unit for generating unmanned vehicle autonomous driving guidance information according to the unmanned vehicle environment recognition information and generating feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information; And

A multi-structure filter that generates real-time position information according to real-time integrated navigation generated by integrating the plurality of navigation information in real time, estimates the current position of the unmanned vehicle based on the real-time position information and the feature position information to generate current estimated position information, or generates movement trajectory estimation information using the feature position information.

At this time, the multi-structure filter includes: a global filter for generating the real-time position information according to the real-time integrated navigation generated by integrating the plurality of navigation information in real time; a forward filter for generating the current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information; and a backward filter for generating the movement trajectory estimation information using the feature position information.
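The division of labor among the three filters can be summarized schematically (Python; the class and method names are hypothetical and merely mirror the roles described above):

    # Schematic sketch of the three-filter structure; all names are illustrative.
    class MultiStructureFilter:
        def __init__(self, global_filter, forward_filter, backward_filter):
            self.global_filter = global_filter      # real-time integrated navigation
            self.forward_filter = forward_filter    # current position from a feature fix
            self.backward_filter = backward_filter  # past movement trajectory

        def on_navigation_data(self, nav_measurements):
            # Always-on path: fuse the plurality of navigation information.
            return self.global_filter.fuse(nav_measurements)

        def on_feature_recognized(self, feature_position, logged_sensor_data):
            # Event-driven path: a map match yields the feature position
            # information, from which the current position and the past
            # movement trajectory are re-estimated.
            current = self.forward_filter.estimate_current(
                feature_position, logged_sensor_data)
            trajectory = self.backward_filter.estimate_trajectory(
                feature_position, logged_sensor_data)
            return current, trajectory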

The forward filter extracts features from recognition information recognized using the environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information, sets the feature position information as an initial position, and generates the current estimated position information, which is the current position of the unmanned vehicle, using the navigation sensor information stored in advance in the database from the initial position.
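A minimal sketch of this forward procedure (Python; the planar dead-reckoning model and data layout are assumptions for illustration) sets the matched feature position as the initial state and replays the logged IMU/VMS samples up to the present:

    import math

    def forward_estimate(feature_pos, heading, logged_samples):
        # Replay logged (yaw_rate, speed, dt) samples from the feature fix
        # up to the present; a planar dead-reckoning model is assumed.
        x, y = feature_pos
        for yaw_rate, speed, dt in logged_samples:  # IMU yaw rate, VMS speed
            heading += yaw_rate * dt
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
        return x, y  # current estimated position information

    # Example: a lane feature matched 0.3 s ago; three logged samples cover
    # the gap between the recognition time and the current time.
    print(forward_estimate((10.0, 5.0), 0.0,
                           [(0.01, 4.0, 0.1), (0.0, 4.1, 0.1), (-0.01, 4.0, 0.1)]))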

The backward filter extracts features from recognition information recognized using the environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information and the map information for unmanned vehicle autonomous driving, sets the feature position information as an initial position, and generates the movement trajectory estimation information for the unmanned vehicle using the navigation sensor information previously stored in the database from the initial position.

In addition, the navigation sensor information may include navigation information and vehicle speed information.

Also, the backward filter or the forward filter is initialized each time the current estimated position information is generated by estimating the current position of the unmanned vehicle, or the movement trajectory estimation information is generated using the feature position information.

In addition, the position estimation system may further include a navigation manager that performs position correction by comparing the real-time position information with the current estimated position information of the unmanned vehicle, the current estimated position information being generated using the navigation sensor information with, as a start position, the feature position information generated at the time of matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information.

In addition, the movement trajectory estimation information may be used to estimate the block on the current map in which the unmanned vehicle is located, through matching between the past movement trajectory of the unmanned vehicle estimated through the backward filter and the map trajectory in the map information for unmanned vehicle autonomous driving.
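This block estimation admits a simple sketch (Python; the block layout and the mean-squared-distance metric are assumptions): the past trajectory recovered by the backward filter is compared against the stored trajectory of each map block, and the closest match identifies the block.

    def match_block(estimated_trajectory, map_blocks):
        # Pick the map block whose stored trajectory is closest to the past
        # trajectory estimated by the backward filter. Assumed structure:
        # map_blocks maps a block id to (x, y) waypoints aligned with the estimate.
        def mse(a, b):
            return sum((ax - bx) ** 2 + (ay - by) ** 2
                       for (ax, ay), (bx, by) in zip(a, b)) / len(a)
        return min(map_blocks, key=lambda bid: mse(estimated_trajectory,
                                                   map_blocks[bid]))

    blocks = {"B1": [(0, 0), (1, 0), (2, 0)],
              "B2": [(0, 5), (1, 5), (2, 5)]}
    print(match_block([(0.1, 4.9), (1.0, 5.1), (2.0, 5.0)], blocks))  # -> B2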

On the other hand, another embodiment of the present invention provides a position estimation method using a multi-structure filter, comprising: measuring a plurality of navigation information through a navigation sensor mounted on an unmanned vehicle; generating unmanned vehicle environment recognition information for the unmanned vehicle; generating, by an unmanned vehicle autonomous driving guidance unit, unmanned vehicle autonomous driving guidance information according to the unmanned vehicle environment recognition information and generating feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information; generating, by a multi-structure filter, real-time position information according to real-time integrated navigation generated by integrating the plurality of navigation information in real time; and generating current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generating movement trajectory estimation information using the feature position information.

According to the present invention, when an unmanned vehicle autonomously travels in an environment where GPS (Global Positioning System) is blocked, a multi-structure filter is implemented based on the map information and/or environment recognition information created for autonomous unmanned vehicle driving, so that the INS (Inertial Navigation System) navigation error can be corrected and the reliability of autonomous driving can be improved.

Another advantage of the present invention is that the position of the autonomous unmanned vehicle can be estimated more accurately.

FIG. 1 is a conceptual diagram of a multi-filter structure position estimation system according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram of the signal processing of the forward filter shown in FIG. 1.
FIG. 3 is a conceptual diagram of the signal processing of the backward filter shown in FIG. 1.
FIG. 4 is a conceptual diagram of the signal processing of the multi-structure filter shown in FIG. 1.
FIG. 5 is a flowchart illustrating a process of correcting a navigation error using a forward filter according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a process of correcting a navigation error using a backward filter according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It is to be understood, however, that the invention is not to be limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Like reference numerals are used for similar elements in describing each drawing.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.

For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component. The term "and / or" includes any combination of a plurality of related listed items or any of a plurality of related listed items.

Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.

Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, a method for estimating a position using a multi-structure filter according to an embodiment of the present invention, and a system thereof, will be described in detail with reference to the accompanying drawings.

FIG. 1 is a conceptual diagram of a multi-filter structure position estimation system 100 according to an embodiment of the present invention. Referring to FIG. 1, the multi-filter structure position estimation system 100 includes: a navigation sensor 110 mounted on an unmanned vehicle (not shown) for measuring navigation information; a first signal processing unit 120 for processing the navigation information received from the navigation sensor 110; an unmanned vehicle environment recognition information unit 140 for generating environment recognition information of the unmanned vehicle; an unmanned vehicle autonomous driving guidance unit 150 for generating an unmanned vehicle autonomous driving map according to the unmanned vehicle environment recognition information; and a multi-structure filter 130 that generates real-time position information according to the real-time integrated navigation generated by integrating the plurality of navigation information in real time, estimates the current position of the unmanned vehicle based on the real-time position information and the feature position information to generate current estimated position information, or generates movement trajectory estimation information using the feature position information.

The navigation sensor 110 performs a function of measuring navigation information by utilizing functions of an acceleration sensor, a gyro sensor, and the like. The navigation sensor 110 may be an IMU (Inertial Measurement Unit) sensor or the like.

The first signal processing unit 120 processes the navigation information generated by the navigation sensor 110 into a digital signal. Although the first signal processing unit 120 is configured separately in FIG. 1, it may be integrated with the global filter 131 and/or the navigation sensor 110.

The multi-structure filter 130 includes a global filter 131, a forward filter 132, a backward filter 133, and the like. Here, the global filter 131 generates real-time navigation information (commonly used integrated navigation) in which the navigation sensor data is fused using a federated filter or a centralized filter.
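A common way a federated filter of this kind combines its local estimates is information-weighted fusion; a minimal scalar sketch follows (Python; illustrative only, not the disclosed master-filter equations):

    def fuse_local_estimates(estimates):
        # Information-weighted fusion of local-filter outputs.
        # estimates: list of (state, variance) pairs; scalar state assumed.
        info = sum(1.0 / var for _, var in estimates)        # total information
        fused = sum(x / var for x, var in estimates) / info  # weighted mean
        return fused, 1.0 / info                             # fused state, variance

    # Example: INS, GPS and VMS local estimates of one position coordinate.
    print(fuse_local_estimates([(100.2, 4.0), (100.9, 1.0), (99.8, 9.0)]))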

The forward filter 132 and the backward filter 133 generate estimated navigation information based on the estimated feature position, using the map for unmanned vehicle autonomous driving.

The forward filter 132 is a filter that, in an environment where the INS performs pure navigation, estimates the current position using the vehicle speed (VMS) calculated by an IMU (Inertial Measurement Unit), an odometer, and the like, together with the feature position, and generates the current estimated position information.

The backward filter 133 is similar in concept to the forward filter 132, but in an environment where the INS performs pure navigation, it uses the vehicle speed (VMS) calculated by an IMU (Inertial Measurement Unit), an odometer, and the like, together with the feature position, to estimate the past movement trajectory and generate the movement trajectory estimation information.

The unmanned vehicle environment recognition information unit 140 generates environment recognition information of an unmanned vehicle (not shown) using an environment recognition sensor.

The unmanned vehicle autonomous driving guidance unit 150 generates a map for autonomous driving of the unmanned vehicle according to the unmanned vehicle environment recognition information and generates the feature position information accordingly. Although the unmanned vehicle autonomous driving guidance unit 150 is configured separately in FIG. 1, it may be combined with the forward filter 132 and/or the backward filter 133.

FIG. 2 is a conceptual diagram of the signal processing of the forward filter shown in FIG. 1. Referring to FIG. 2, the unmanned vehicle environment recognition information unit 140 includes an environment recognition sensor 211 for sensing the surrounding environment of the unmanned vehicle (not shown) to generate sensing data, and a second signal processing unit 212 for processing the sensing data into environment recognition information.

The forward filter 132 first extracts features from the image input to the environment recognition sensor 211, matches the unmanned vehicle environment recognition information containing the extracted features against the unmanned vehicle autonomous driving map stored in the unmanned vehicle autonomous driving guidance unit 150, estimates the feature position information of the recognized information (for example, lanes), and sets it as an initial position.

At this time, the initial position is the previous position, not the current position of the unmanned vehicle, due to the processing time of the algorithm. Next, the current position of the unmanned vehicle (or unmanned robot) is estimated using the navigation sensor information (IMU, VMS) stored in the database.

That is, for the navigation sensor information, the position information is generated by the IMU (Inertial Measurement Unit) 221, and the vehicle speed information is generated by the speed sensor 222. The data logger 220 collects this position information and vehicle speed information and transmits them to the forward filter 132. Here, the speed sensor 222 may be an odometer or the like.

FIG. 3 is a conceptual diagram of the signal processing of the backward filter shown in FIG. 1. Referring to FIG. 3, the backward filter is similar to the forward filter; however, in an environment where the INS performs pure navigation, the backward filter uses the vehicle speed (VMS) calculated by the IMU (Inertial Measurement Unit), the odometer, and the like, together with the feature position, to estimate past movement trajectories.

In detail, the backward filter 133 extracts features from the image input to the environment recognition sensor 211, matches the unmanned vehicle environment recognition information containing the extracted features against the map information stored in the unmanned vehicle autonomous driving guidance unit 150 (for example, a lane can be identified), and sets the feature position information of the recognized information as the initial position.

At this time, the initial position is the previous position, not the current position of the unmanned vehicle (not shown), due to the processing time of the algorithm. Next, using the navigation sensor information (IMU, VMS) stored in the database, the past trajectory of the unmanned vehicle is estimated to generate the movement trajectory estimation information.

That is, for the navigation sensor information, the position information is generated by the IMU (Inertial Measurement Unit) 221, and the vehicle speed information is generated by the speed sensor 222. The data logger 220 collects this position information and vehicle speed information and transmits them to the backward filter 133. Here, the speed sensor 222 may be an odometer or the like.

FIG. 4 is a conceptual diagram of the signal processing of the multi-structure filter shown in FIG. 1. Referring to FIG. 4, when the correction navigation information such as GPS (Global Positioning System), DGPS (Differential GPS), and VMS is in a normal state, the integrated navigation operates based on the global filter 131; however, when the INS (Inertial Navigation System) 431 performs pure navigation, receiving only the signal data from the IMU 221, the error of the global filter 131 can be corrected based on the forward filter 132 and the backward filter 133.

The IMU 221 generates navigation data at a high rate as the base navigation sensor, and based on this information, the global filter 131 and the auxiliary filters, namely the forward filter 132 and the backward filter 133, correct each other's errors.

The global filter 131 includes local filters 432-1, 432-2, and 432-3 for filtering data from the INS 431, the GPS, and the VMS; FDI (Fault Detection and Isolation) units 433-1 and 433-2 for detecting and isolating faults from the data filtered by the local filters; and a master filter 434 for calculating a navigation correction value.

The unit sensor information is fused with the INS 431 information in the local filter 432-1, the FDI unit 433-1 verifies the validity of the fused result, and the verified result is then passed to the master filter 434.

In the master filter 434, "MASTER TIME PROPAGATION" produces the predicted fusion value, and in "OPTIMAL FUSION UPDATE" the predicted value is corrected by reflecting the result of the FDI unit 433-1.
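The FDI step can be pictured as a residual gate applied before the master-filter update; a minimal sketch follows (Python; the threshold test is an assumption, since the patent does not specify the actual check):

    def fdi_gate(predicted, measured, variance, threshold=9.0):
        # Accept a local result only if its normalized squared residual is
        # below a threshold (an illustrative 3-sigma style gate).
        residual = measured - predicted
        return residual * residual / variance < threshold

    prediction = 50.0  # e.g., the MASTER TIME PROPAGATION output
    for z, var in [(50.4, 1.0), (57.0, 1.0)]:
        verdict = "fused" if fdi_gate(prediction, z, var) else "isolated as a fault"
        print("measurement", z, "->", verdict)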

The backward filter 133 includes a second filter 443 for filtering the data from the data logger 220 and the RDF 442, which holds the map information; an INS 444 that generates navigation information corresponding to the estimated information; a local filter 445 that filters the navigation information generated by the INS 444; and an FDI unit 446 that detects and isolates faults from the filtered data.

The backward filter 133 estimates the position in the second filter 443 using the data logger 220 and the RDF 442, fuses the estimated information with the information of the data logger 220, verifies the validity in the FDI unit 446, and updates the position in the INS 444. That is, it is an algorithm that estimates the past movement trajectory from a specific past time, for use when requested by another algorithm.

The forward filter 132 includes a first filter 453 for filtering data from the data logger 220 and the RDF 452, which holds the map information; an INS 454 that generates navigation information corresponding to the estimated information; a local filter 455 for filtering the navigation information generated by the INS 454; and an FDI unit 456 for detecting and isolating faults from the filtered data.

The forward filter 132 estimates the position in the first filter 453 using the data logger 220 and the RDF 452 information, fuses the estimated information with the information of the data logger 220, verifies the validity of the fused result in the FDI unit 456, and updates the position in the local filter 455. That is, the current position is continuously updated by fusing the information of the data logger 220 from the value estimated at a specific past time.

The navigation manager 410 estimates the error of the data fused by the global filter 131 (that is, against the unmanned vehicle autonomous driving map) and performs correction according to the error estimate. For this purpose, an error estimation unit 413 and an error correction unit 411 for performing the correction are configured. This is described with reference to FIGS. 5 and 6 below.

Each time the environment recognition sensor 211 recognizes feature information such as a lane and the position is estimated through matching against the map, the forward filter 132 and the backward filter 133 are initialized and the navigation is performed anew. As a result, by using the multi-structure filter, the INS error can be effectively corrected when GPS or other correction navigation is unavailable.

FIG. 5 is a flowchart illustrating a process of correcting a navigation error using a forward filter according to an embodiment of the present invention. Referring to FIG. 5, the environment recognition sensor 211 recognizes feature information (step S510). The feature information recognition time is t-Δt, where Δt is the difference between the current time and the recognition time.

Thereafter, the initial position of the unmanned vehicle (not shown) is set (step S520). That is, the feature position information determined by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving map information is generated; this feature position information is denoted p(t-Δt).

Thereafter, the navigation sensor data (IMU, VMS) logged over the interval Δt, from time t-Δt up to the current time t, is loaded (step S530).

Then, IMU/VMS integrated navigation is performed with the feature position information p(t-Δt) as the start position, to calculate the current estimated position information p(t) of the unmanned vehicle (steps S540 and S550).

Thereafter, the navigation position estimated by the global filter 131 and the navigation position estimated by the forward filter 132 are compared to perform position correction (steps S560 and S570).

Each time the environment recognition information from the environment recognition sensor 211 is updated, steps S510 to S570 are performed anew to correct the navigation error.
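Putting these steps together, one pass of the forward correction loop can be sketched schematically (Python; every callable and name below is a hypothetical placeholder for steps S510 to S570, not code from the patent):

    def forward_correction_cycle(recognize, match_to_map, load_log,
                                 integrate, global_position, correct):
        # One pass of the loop, with each step supplied as a callable.
        feature = recognize()              # S510: recognize feature information
        t0, p0 = match_to_map(feature)     # S520: set initial (feature) position
        log = load_log(t0)                 # S530: load logged IMU/VMS data
        p_now = integrate(p0, log)         # S540-S550: IMU/VMS integrated navigation
        error = global_position() - p_now  # S560: compare with the global filter
        correct(error)                     # S570: correct the navigation error
        return p_now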

FIG. 6 is a flowchart illustrating a process of correcting a navigation error using a backward filter according to an embodiment of the present invention. Referring to FIG. 6, the environment recognition sensor 211 recognizes feature information (step S610). The feature information recognition time is t-Δt, where Δt is the difference between the current time and the recognition time.

Thereafter, the initial position of the unmanned vehicle (not shown) is set (step S620). That is, the feature position information determined by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving map information is generated; this feature position information is denoted p(t-Δt).

Thereafter, the navigation sensor data (IMU, VMS) logged from time t-Δt' to time t-Δt is loaded (step S630). Here, t-Δt' is the time at which a feature was recognized before the time t-Δt.

Then, IMU/VMS integrated navigation is performed with the feature position information p(t-Δt) as the start position (integrating from t-Δt back to t-Δt'), to generate the movement trajectory estimation information of the unmanned vehicle over the interval from t-Δt' to t-Δt (steps S640 and S650).

Thereafter, the block in which the unmanned vehicle is located on the current map is estimated through matching between the past movement trajectory of the unmanned vehicle estimated through the backward filter 133 and the map trajectory in the unmanned vehicle autonomous driving guidance information (steps S660 and S670). Here, a block is a unit obtained by dividing the map information for autonomous unmanned vehicle driving into specific sections; that is, the map information for unmanned vehicle autonomous driving is configured in block units.

In addition, steps S610 to S670 are newly performed every time the environment recognition information is updated.

100: Position estimation system
110: Navigation sensor
120: first signal processor
130: Multi-structure filter
131: Global filter 132: Forward filter
133: Backward filter
140: Unmanned vehicle environment recognition information unit
150: Unmanned vehicle autonomous driving guidance unit
212: second signal processor
221: Inertial Measurement Unit (IMU)
222: Speed sensor

Claims (9)

A navigation sensor mounted on an unmanned vehicle for measuring a plurality of navigation information;
An unmanned vehicle environment recognition information unit for generating unmanned vehicle environment recognition information for the unmanned vehicle;
An unmanned vehicle autonomous driving guidance unit for generating unmanned vehicle autonomous driving guidance information according to the unmanned vehicle environment recognition information and generating feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information; And
A multi-structure filter that generates real-time position information according to real-time integrated navigation generated by integrating the plurality of navigation information in real time, estimates the current position of the unmanned vehicle based on the real-time position information and the feature position information to generate current estimated position information, or generates movement trajectory estimation information using the feature position information,
Wherein the multi-structure filter includes:
A global filter for generating the real-time position information according to the real-time integrated navigation generated by integrating the plurality of navigation information in real time;
A forward filter for generating the current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information; And
A backward filter for generating the movement trajectory estimation information using the feature position information,
Wherein the forward filter extracts features from recognition information recognized using an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information, sets the feature position information as an initial position, and generates the current estimated position information, which is the current position of the unmanned vehicle, using the navigation sensor information stored in advance in the database from the initial position.
(Claim 2: deleted)
(Claim 3: deleted)
The method according to claim 1,
Wherein the backward filter extracts features from recognition information recognized using an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information, sets the feature position information as an initial position, and generates the movement trajectory estimation information for the unmanned vehicle using the navigation sensor information previously stored in the database from the initial position.
The method according to claim 1 or 4,
Wherein the navigation sensor information includes navigation information and vehicle speed information.
The method according to claim 1,
Wherein the backward filter or the forward filter is initialized whenever the current estimated position information is generated by estimating the current position of the unmanned vehicle, or the movement trajectory estimation information is generated using the feature position information.
The method according to claim 1,
Further comprising a navigation manager for performing position correction by comparing the real-time position information with the current estimated position information of the unmanned vehicle, the current estimated position information being generated using the navigation sensor information with, as a start position, the feature position information generated at the time of matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information.
The method of claim 4,
Wherein the movement trajectory estimation information is used to estimate the block on the current map in which the unmanned vehicle is located, through matching between the past movement trajectory of the unmanned vehicle estimated through the backward filter and the map trajectory in the unmanned vehicle autonomous driving guidance information.
Measuring a plurality of navigation information mounted on the unmanned vehicle through the navigation sensor;
Generating unmanned vehicle environment recognition information for the unmanned vehicle;
Generating, by an unmanned vehicle autonomous driving guidance unit, unmanned vehicle autonomous driving guidance information according to the unmanned vehicle environment recognition information, and generating feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information;
Generating, by a multi-structure filter, real-time position information according to real-time integrated navigation generated by integrating the plurality of navigation information in real time; And
Generating current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information, or generating movement trajectory estimation information using the feature position information,
Wherein the generating of the real-time position information includes:
Generating, by a global filter, the real-time position information according to the real-time integrated navigation generated by integrating the plurality of navigation information in real time; And
Generating, by a forward filter, the current estimated position information by estimating the current position of the unmanned vehicle based on the real-time position information and the feature position information,
Wherein the generating of the movement trajectory estimation information includes:
Generating, by a backward filter, the movement trajectory estimation information using the feature position information,
Wherein the forward filter extracts features from recognition information recognized using an environment recognition sensor to generate the unmanned vehicle environment recognition information, estimates the feature position information by matching the unmanned vehicle environment recognition information and the unmanned vehicle autonomous driving guidance information, sets the feature position information as an initial position, and generates the current estimated position information, which is the current position of the unmanned vehicle, using the navigation sensor information stored in advance in the database from the initial position.
KR1020160017838A 2016-02-16 2016-02-16 Method for estimating position using multi-structure filter and System thereof KR101837821B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160017838A KR101837821B1 (en) 2016-02-16 2016-02-16 Method for estimating position using multi-structure filter and System thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160017838A KR101837821B1 (en) 2016-02-16 2016-02-16 Method for estimating position using multi-structure filter and System thereof

Publications (2)

Publication Number Publication Date
KR20170096448A KR20170096448A (en) 2017-08-24
KR101837821B1 true KR101837821B1 (en) 2018-03-12

Family

ID=59758186

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160017838A KR101837821B1 (en) 2016-02-16 2016-02-16 Method for estimating position using multi-structure filter and System thereof

Country Status (1)

Country Link
KR (1) KR101837821B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107544504B (en) * 2017-09-26 2020-08-21 河南科技学院 Disaster area rescue robot autonomous detection system and method for complex environment
TWI724686B (en) * 2019-12-13 2021-04-11 國立成功大學 Positioning and orientation system and positioning and orientation method using high definition maps
CN116142219A (en) * 2022-12-21 2023-05-23 广州沃芽科技有限公司 Performance evaluation method, device, computer apparatus, storage medium, and program product

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101177374B1 (en) * 2010-07-27 2012-08-29 한양대학교 산학협력단 Method for estimating position of vehicle using Interacting Multiple Model filter
WO2014076844A1 (en) * 2012-11-19 2014-05-22 株式会社日立製作所 Autonomous movement system and control device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101177374B1 (en) * 2010-07-27 2012-08-29 한양대학교 산학협력단 Method for estimating position of vehicle using Interacting Multiple Model filter
WO2014076844A1 (en) * 2012-11-19 2014-05-22 株式会社日立製作所 Autonomous movement system and control device

Also Published As

Publication number Publication date
KR20170096448A (en) 2017-08-24

Similar Documents

Publication Publication Date Title
Atia et al. A low-cost lane-determination system using GNSS/IMU fusion and HMM-based multistage map matching
KR102463176B1 (en) Device and method to estimate position
KR20220033477A (en) Appratus and method for estimating the position of an automated valet parking system
US7868821B2 (en) Method and apparatus to estimate vehicle position and recognized landmark positions using GPS and camera
US20210248768A1 (en) Generation of Structured Map Data from Vehicle Sensors and Camera Arrays
CN112304302A (en) Multi-scene high-precision vehicle positioning method and device and vehicle-mounted terminal
KR20180106417A (en) System and Method for recognizing location of vehicle
US20220011118A1 (en) Systems and Methods for Map Matching
JP5028662B2 (en) Road white line detection method, road white line detection program, and road white line detection device
KR101704405B1 (en) System and method for lane recognition
CN102565832A (en) Method of augmenting GPS or gps/sensor vehicle positioning using additional in-vehicle vision sensors
WO2008012997A1 (en) Positioning device, and navigation system
CN113405545B (en) Positioning method, positioning device, electronic equipment and computer storage medium
KR102660497B1 (en) System for positioning location information of car
US20190331496A1 (en) Locating a vehicle
KR20190109645A (en) Apparatus for determining position of vehicle and method thereof
KR101837821B1 (en) Method for estimating position using multi-structure filter and System thereof
CN114264301B (en) Vehicle-mounted multi-sensor fusion positioning method, device, chip and terminal
KR20210073281A (en) Method and apparatus for estimating motion information
CN113405555A (en) Automatic driving positioning sensing method, system and device
KR102336070B1 (en) Apparatus for determining position of vehicle and method thereof
EP3051255B1 (en) Survey data processing device, survey data processing method, and program therefor
KR101397933B1 (en) Device for creating traveling information of unmanned ground vehicle, unmanned ground vehicle having the same and control method thereof
JP2018169207A (en) Vehicle position detector
Verentsov et al. Bayesian framework for vehicle localization using crowdsourced data

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
GRNT Written decision to grant