KR101625486B1 - Map-based positioning system and method thereof - Google Patents

Map-based positioning system and method thereof

Info

Publication number
KR101625486B1
Authority
KR
South Korea
Prior art keywords
map
sensor
search window
vehicle
extracting
Prior art date
Application number
KR1020140158761A
Other languages
Korean (ko)
Other versions
KR20160057755A (en)
Inventor
김남혁
박지호
Original Assignee
재단법인대구경북과학기술원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 재단법인대구경북과학기술원 (Daegu Gyeongbuk Institute of Science and Technology)
Priority to KR1020140158761A
Publication of KR20160057755A
Application granted
Publication of KR101625486B1

Abstract

A map-based positioning method using a LIDAR sensor is disclosed. The method includes generating a map of the surroundings of the LIDAR sensor from a point cloud acquired by the LIDAR sensor and setting the generated map as a search window; calculating the current position of the vehicle and the heading azimuth of its traveling direction from a previous coordinate measurement of the vehicle obtained by a GNSS sensor and a previous coordinate change of the vehicle measured by an INS sensor; extracting an electronic map from a map database on the basis of the calculated current position; performing map matching between the extracted electronic map and the search window; and extracting the center coordinates of the map-matched search window together with the position coordinates of the electronic map corresponding to those center coordinates.

Description

Map-based positioning system and method thereof

The present invention relates to a map-based positioning system and a method thereof, and more particularly, to a map-based positioning system and method that can be applied to vehicles equipped with an Advanced Driver Assistance System (ADAS), autonomous vehicles, and the like.

The Global Navigation Satellite System (GNSS), the most widely used positioning technology, determines a three-dimensional position by receiving signals from four or more satellites. Consequently, in environments where satellite visibility is limited, such as dense urban areas and multi-level road sections, the error grows or positioning becomes impossible altogether. To compensate for this, positioning techniques that combine an Inertial Navigation System (INS) with GNSS have been developed.

However, an INS sensor is expensive, and when the GNSS signal is not received for a long time and positioning is performed by the INS alone, the positioning error increases exponentially due to error accumulation.

Therefore, when the satellite signal is blocked for a long time, GNSS/INS positioning eventually becomes impossible, and a system capable of precise positioning is needed for areas where signal reception is poor, such as dense urban areas.

SUMMARY OF THE INVENTION
Accordingly, it is an object of the present invention to provide a map-based positioning system and method capable of precise positioning even in areas where the signal-reception environment is poor, such as a city center.

According to one aspect of the present invention, there is provided a map-based positioning method comprising: generating a map of the surroundings of a LIDAR sensor from a point cloud acquired by the LIDAR sensor and setting the generated map as a search window; calculating the current position of the vehicle and the heading azimuth of its traveling direction from the previous coordinate measurement of the vehicle obtained by the GNSS sensor of a GNSS/INS integrated sensor and the previous coordinate change of the vehicle measured by its INS sensor; extracting an electronic map from a map database on the basis of the calculated current position; performing map matching between the extracted electronic map and the search window; and extracting the center coordinates of the map-matched search window and calculating the position coordinates of the electronic map corresponding to the extracted center coordinates.

According to another aspect of the present invention, there is provided a map-based positioning system comprising: a LIDAR sensor that emits a plurality of laser points to the surroundings and generates a point cloud of the surrounding environment; a GNSS/INS integrated sensor including a GNSS sensor that measures the previous coordinate measurement of the vehicle and an INS sensor that measures the previous coordinate change of the vehicle; a LIDAR sensor data processing unit that generates a map of the surroundings of the LIDAR sensor from the point cloud acquired by the LIDAR sensor and sets the generated map as a search window; an integrated sensor data processing unit that calculates the current position of the vehicle and the heading azimuth of its traveling direction from the previous coordinate measurement and the previous coordinate change measured by the GNSS/INS integrated sensor, and extracts an electronic map from a map database on the basis of the calculated current position; a map matching unit that performs map matching between the extracted electronic map and the search window; and a coordinate extraction unit that extracts the center coordinates of the map-matched search window and the position coordinates of the electronic map corresponding to the extracted center coordinates.

According to the present invention, positioning is performed using sensor-based relative coordinate information in addition to the absolute coordinate information of the GNSS/INS integrated sensor, so that stable and accurate positioning is possible even in environments where satellite signals are poorly received.

FIG. 1 is a view showing photograph data that visualizes the three-dimensional surroundings in real time based on a point cloud obtained by the LIDAR sensor applied to the present invention.
FIG. 2 is a block diagram illustrating a map-based positioning system according to an embodiment of the present invention.
FIG. 3 is a block diagram showing the configuration of the LIDAR sensor data processing unit shown in FIG. 2.
FIG. 4 is a block diagram showing the configuration of the integrated sensor data processing unit shown in FIG. 2.
FIG. 5 is a flowchart showing a map-based positioning method according to an embodiment of the present invention.
FIG. 6 is a diagram for explaining the process of extracting a set of points in the extraction unit shown in FIG. 3.
FIG. 7 is a diagram for explaining the process of generating a map in the map generating unit shown in FIG. 3.
FIG. 8 is a diagram for explaining the process of rotating the electronic map based on the azimuth angle in the electronic map rotating unit shown in FIG. 4.

In the existing positioning system based on a GNSS/INS integrated sensor, when reception of the satellite signal at the GNSS sensor is interrupted, the position is estimated only from the measurements of the INS sensor. Because such a system keeps accumulating the INS measurements (position changes) while satellite reception is blocked and continues positioning from that cumulative result, the error of the estimate grows exponentially as the reception outage becomes longer.

In the present invention, in addition to the GNSS/INS integrated sensor, a map with relative coordinates is generated from a point cloud of the surrounding environment captured by a LIDAR sensor, and the position is determined by matching this point-cloud-based map against an electronic map database. Precise positioning is therefore possible even in environments where reception of the satellite signal is blocked.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The following embodiments assume positioning of a vehicle, but the present invention is not limited to vehicle positioning, as those skilled in the art will readily appreciate.

FIG. 1 is a diagram showing photograph data that visualizes the three-dimensional surroundings in real time based on a point cloud obtained by the LIDAR sensor applied to the present invention, and FIG. 2 is a diagram illustrating a map-based positioning system according to an embodiment of the present invention.

Referring to FIGS. 1 and 2, a map-based positioning system 100 according to an embodiment of the present invention supplements a positioning system that uses GNSS or combined GNSS/INS positioning, improving the stability and precision of positioning.

The map-based positioning system 100 according to an embodiment of the present invention includes a LIDAR sensor 110, a LIDAR sensor data processing unit 120, a GNSS/INS integrated sensor 130, an integrated sensor data processing unit 140, an electronic map database 150, a map matching unit 160, and a coordinate extraction unit 170.

The LIDAR sensor 110 is installed in a vehicle and emits a plurality of laser points to the surroundings, measuring spatial information about the environment (hereinafter referred to as a point cloud). Such a point cloud can be visualized in three dimensions and presented to a user, like the photograph data shown in FIG. 1. For reference, FIG. 1 is photograph data obtained by visualizing a point cloud acquired by a rotating LIDAR sensor. From the point cloud, the LIDAR sensor 110 can obtain relative coordinates referenced to its installation position (that is, the current position of the vehicle).

The LIDAR sensor data processing unit 120 generates a vectorized map of the vehicle's surroundings using the relative coordinates obtainable from the point cloud acquired by the LIDAR sensor, and sets the generated map as a search window. Details are described with reference to FIG. 3.

The GNSS/INS integrated sensor 130 includes a GNSS sensor that measures the coordinates of the vehicle in real time and an INS sensor that measures the amount of change of the vehicle's coordinates. The GNSS sensor receives signals containing time information from four or more satellites and uses them to calculate the vehicle's previous and current coordinate measurements. The INS sensor measures the change of the vehicle's coordinates using an inertial sensor composed of a gyro sensor and an accelerometer.

The integrated sensor data processing unit 140 calculates the current position of the vehicle and the heading azimuth of its traveling direction using the real-time coordinate measurement from the GNSS sensor and the coordinate change measured by the INS sensor, and extracts an electronic map from the electronic map database 150 based on the calculated current position of the vehicle. Details are described with reference to FIG. 4.

The map matching unit 160 performs map matching between the search window set by the LIDAR sensor data processing unit 120 and the electronic map extracted by the integrated sensor data processing unit 140. The map matching unit 160 scans the search window across the extracted electronic map from the upper left to the lower right, stops at the position where the degree of fitting between the electronic map and the search window is highest, and transmits the center coordinates of the search window at that position, together with the corresponding position coordinates of the electronic map, to the coordinate extraction unit 170.

The coordinate extraction unit 170 outputs the center coordinates of the search window and the corresponding position coordinates of the electronic map, thereby determining the vehicle's position. As described below, the position coordinates output by the coordinate extraction unit 170 can be used directly for vehicle positioning, but the accuracy and stability of the result can be further improved by filtering with a Kalman filter.

FIG. 3 is a block diagram showing the configuration of the LIDAR sensor data processing unit shown in FIG. 2.

Referring to FIG. 3, the LIDAR sensor data processing unit 120 includes an extraction unit 122, a map generating unit 124, and a window setting unit 126.

The extraction unit 122 receives the point cloud generated by the LIDAR sensor 110 and extracts a set of points that can be fitted to specific lines corresponding to roads and buildings.

Specifically, the point cloud has relative coordinate values referenced to the installation position of the LIDAR sensor 110 (the current position of the vehicle), as opposed to the absolute coordinate values of the electronic map stored in the electronic map database. The extraction unit 122 extracts, from this point cloud, the set of points lying within a specific distance of the vehicle's current position. For example, the extraction unit 122 may take the plane-coordinate range from x = -50 m, y = 50 m at the upper-left corner to x = 50 m, y = -50 m at the lower-right corner of the point cloud generated by the LIDAR sensor 110, extracting a 100 m * 100 m point set 61 as shown in FIG. 6(b).
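As an illustrative sketch only (not part of the patent), this point-set extraction can be reduced to a simple bounding-box filter on the sensor-relative coordinates; the function name, the NumPy array layout, and the 50 m half-size are assumptions for illustration.

import numpy as np

def extract_local_points(point_cloud, half_size=50.0):
    # point_cloud: (N, 3) array of x, y, z coordinates relative to the LIDAR
    # sensor, i.e. to the vehicle's current position.
    # half_size: half the side of the square window; 50 m yields the
    # 100 m * 100 m region mentioned in the example above.
    x, y = point_cloud[:, 0], point_cloud[:, 1]
    mask = (np.abs(x) <= half_size) & (np.abs(y) <= half_size)
    return point_cloud[mask]

# Usage with a synthetic cloud spread over 400 m * 400 m (purely illustrative):
cloud = (np.random.rand(10000, 3) - 0.5) * np.array([400.0, 400.0, 10.0])
local = extract_local_points(cloud)   # keeps only points inside the 100 m * 100 m window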

As shown in FIG. 7(a), once the point set has been extracted, the map generating unit 124 applies a linear fitting algorithm (FIG. 7(b)) to the extracted point set 61 to obtain points 65 that can be fitted to specific lines corresponding to roads and buildings, in other words, lines that can be represented by mathematical model parameters.

The map generating unit 124 then connects the points 65 fitted to the specific lines to generate a vectorized map, as shown in FIG. 7(c). A RANSAC (Random Sample Consensus) algorithm may be used to extract points that can be represented by mathematical model parameters and to connect the extracted points into a vectorized map.
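For reference, RANSAC stands for Random Sample Consensus. The following is a minimal, hand-rolled sketch of fitting a single 2D line with RANSAC; the map generating unit described above would repeat such a fit (removing inliers each time) to obtain the set of lines forming the vectorized map. Names, iteration counts, and the inlier tolerance are illustrative assumptions, not taken from the patent.

import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=0.2, seed=None):
    # points: (N, 2) array of x, y coordinates.
    # Returns (p0, direction, inlier_mask) of the best line {p0 + t * direction}.
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p0, p1 = points[i], points[j]
        d = p1 - p0
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue                      # degenerate sample, skip
        d = d / norm
        normal = np.array([-d[1], d[0]])  # unit normal of the candidate line
        dist = np.abs((points - p0) @ normal)   # perpendicular point-to-line distances
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (p0, d)
    if best_model is None:
        raise ValueError("no valid line found")
    return best_model[0], best_model[1], best_inliers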

The window setting unit 126 crops the vectorized map generated by the map generating unit 124 to an appropriate size and sets it as a search window 67. The search window 67 thus set is transmitted to the map matching unit 160.

FIG. 4 is a block diagram showing the configuration of the integrated sensor data processing unit shown in FIG. 2.

Referring to FIG. 4, the integrated sensor data processing unit 140 according to an embodiment of the present invention includes a Kalman filter 142, an electronic map extracting unit 144, and an electronic map rotating unit 146.

The Kalman filter 142 receives GNSS data and INS data from the GNSS/INS integrated sensor 130, weights them according to their errors, and calculates an optimal position measurement and an optimal coordinate change.

The electronic map extracting unit 144 calculates the current position of the vehicle and the heading azimuth of its traveling direction from the optimal position measurement and the optimal coordinate change computed by the Kalman filter 142, and extracts from the pre-built electronic map database 150 an electronic map larger than the search window 67 set by the window setting unit 126, based on the calculated current position.

The electronic map rotating unit 146 rotates the electronic map 81 extracted by the electronic map extracting unit 144 based on the calculated azimuth angle, as shown in FIG. 8, and transmits the rotated electronic map 83 to the map matching unit 160.
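A minimal sketch of this rotation step, assuming the extracted electronic map is represented by 2D vertex coordinates; whether the rotation is applied clockwise or counterclockwise depends on the azimuth convention of the map frame, which the text does not specify.

import numpy as np

def rotate_map_points(points, azimuth_deg, center):
    # points: (N, 2) map vertex coordinates; center: (2,) rotation center,
    # e.g. the vehicle's estimated position; azimuth_deg: calculated heading
    # azimuth in degrees (sign convention depends on the map frame).
    theta = np.deg2rad(azimuth_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return (points - np.asarray(center)) @ rot.T + np.asarray(center)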

The map matching unit 160 performs map matching by moving the search window (FIG. 7(c)) set by the window setting unit 126 described with reference to FIG. 3 across the rotated electronic map 83 from the upper left to the lower right.

As shown in FIG. 8, the rotated electronic map 83 retrieved from the electronic map database 150 is compared with the search window 67, scanning is stopped at the position where the degree of fitting is highest, and the coordinates of the vehicle's position are determined from the center coordinate C of the search window 67 at that position and the corresponding position coordinates of the electronic map 83.
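The patent does not specify how the degree of fitting is computed; one plausible realization, shown below as an assumption-laden sketch, rasterizes both the search window and the rotated electronic map into occupancy grids and counts overlapping occupied cells while scanning from the upper left to the lower right, mirroring the scan described above. The returned grid index of the window center would then be converted back into the electronic map's absolute coordinates.

import numpy as np

def match_search_window(map_grid, window_grid):
    # map_grid, window_grid: 2D boolean occupancy grids (map larger than window).
    mh, mw = map_grid.shape
    wh, ww = window_grid.shape
    best_score, best_offset = -1, (0, 0)
    for r in range(mh - wh + 1):          # scan rows, top to bottom
        for c in range(mw - ww + 1):      # scan columns, left to right
            score = int(np.sum(map_grid[r:r + wh, c:c + ww] & window_grid))
            if score > best_score:
                best_score, best_offset = score, (r, c)
    r, c = best_offset
    center = (r + wh // 2, c + ww // 2)   # window center in map-grid indices
    return center, best_score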

FIG. 5 is a flowchart showing a map-based positioning method according to an embodiment of the present invention. Since the entity performing each step is clear from the description of FIGS. 2 to 4 above, the performing entity may not be mentioned for every step below.

Referring to FIG. 5, the map-based positioning method according to an embodiment of the present invention mainly comprises processing the point cloud acquired by the LIDAR sensor (S311 to S317), processing the sensor data acquired by the GNSS/INS integrated sensor (S319 to S325), map matching between the search window and the electronic map generated through these processes (S327), and extracting coordinates according to the map-matching result (S331).

First, the point cloud processing steps (S311 to S317) performed on the LIDAR sensor output are described. When the LIDAR sensor generates a point cloud around its installation position, the points lying within a certain distance of that position are extracted based on the relative coordinate values of the generated point cloud (S311).

Next, points corresponding to roads and buildings are extracted using a linear fitting algorithm or the like (S313), and a vectorized map is created by connecting the points corresponding to the roads and buildings (S315). A RANSAC algorithm or the like may be used to generate such a vectorized map: using RANSAC, points that can be fitted to a specific line are selected from the points corresponding to roads and buildings, and the line that best fits those points is generated to form the vectorized map.

Then, the generated vectorized map is cropped to a preset size and set as a search window (S317).

Next, the sensor data processing steps (S319 to S325) performed on the GNSS/INS integrated sensor output are described. First, the sensor data acquired by the GNSS/INS integrated sensor is filtered through a Kalman filter (S319).

Next, the current position of the vehicle and the azimuth angle of its traveling direction are calculated from the filtered sensor data (S321). For example, if the current time is t, the current position and azimuth of the vehicle can be calculated by comparing the coordinates and coordinate changes at times t-1 and t-2.
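A minimal sketch of this step under the assumption of a planar easting/northing frame and a north-referenced, clockwise azimuth; the patent only states that the current position and azimuth are derived from the previous fixes and the coordinate change, so the exact conventions below are illustrative.

import math

def heading_azimuth(pos_t2, pos_t1):
    # pos_t2, pos_t1: (easting, northing) fixes at times t-2 and t-1.
    de = pos_t1[0] - pos_t2[0]
    dn = pos_t1[1] - pos_t2[1]
    return math.degrees(math.atan2(de, dn)) % 360.0   # degrees clockwise from north

def predict_current_position(pos_t1, ins_delta):
    # Dead-reckon the position at time t by adding the INS coordinate change.
    return (pos_t1[0] + ins_delta[0], pos_t1[1] + ins_delta[1])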

Then, the electronic map database 150 is queried and an electronic map containing the calculated current position of the vehicle is extracted (S323). For example, if the search window is 100 m * 100 m, a 500 m * 500 m electronic map containing the current position of the vehicle is extracted from the electronic map database 150.

Then, the electronic map extracted from the electronic map database 150 is rotated based on the calculated azimuth (S325).

Next, map matching is performed between the search window and the electronic map generated through the preceding processes (S311 to S325) by scanning the search window across the electronic map from the upper left to the lower right (S327). When the best-fitting position is found by comparing the electronic map extracted from the electronic map database 150 with the search window, that is, when map matching succeeds (S329), scanning of the search window stops and the corresponding coordinates are extracted from the electronic map (S331), determining the three-dimensional coordinates for positioning the vehicle. At this point, a Kalman filter may be used to weight the measurements according to the errors of the individual sensors and compute an optimal position measurement: although the GNSS/INS integrated sensor and the LIDAR sensor all produce measurements, their accuracies may differ. The coordinates extracted in S331, the current position measurement from the integrated sensor, and the current INS measurement may therefore be fed back into the Kalman filter to determine the final coordinates, so that weighting according to the accuracy (or performance) of each sensor further improves the accuracy of the final optimal position estimate.
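The filter equations are not given in the text; the sketch below reduces the idea to inverse-variance weighting of independent position sources, which is how a Kalman measurement update weights sensors according to their accuracy. The variance values and coordinates are purely illustrative.

import numpy as np

def fuse_position_estimates(estimates, variances):
    # estimates: (M, D) array, one row per source (e.g. map-matched coordinate,
    # GNSS fix, INS dead-reckoned position); variances: (M,) per-source error
    # variances (smaller = more trusted).
    w = 1.0 / np.asarray(variances, dtype=float)
    w = w / w.sum()
    return (w[:, None] * np.asarray(estimates, dtype=float)).sum(axis=0)

# Illustrative usage: trust the map-matched fix most, the INS estimate least.
fused = fuse_position_estimates(
    estimates=[[352100.2, 4151230.8], [352101.0, 4151229.5], [352103.4, 4151233.0]],
    variances=[0.25, 1.0, 9.0])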

As described above, when map matching succeeds, three-dimensional coordinates can be obtained from the electronic map and combined with the positioning from the existing GNSS/INS integrated sensor to perform navigation. If map matching fails in step S329, for example because of the geographic character of the surroundings (an open area, a curved road surface, a building corner, and so on), the process returns to step S311 and restarts.

In summary, the GNSS sensor outputs three-dimensional coordinates at specific time intervals, while the INS sensor outputs the change of coordinates measured with an acceleration sensor. Even if no GNSS coordinate measurement is output for some time in an area with a poor GNSS signal environment, such as a city center, coordinates can still be computed continuously by adding the INS coordinate changes to the last available coordinate measurement.

However, because the INS sensor provides only changes of coordinates, if no GNSS coordinate fix is received for a long time, any error contained in the coordinate changes accumulates continuously and the reliability of the coordinate estimate drops sharply.

Therefore, in the present invention a LIDAR sensor is used as an additional sensor capable of producing coordinate measurements alongside the GNSS sensor. By map-matching the search window against the electronic map extracted from the electronic map database on the basis of the GNSS/INS integrated sensor, and outputting coordinate values from the matched electronic map, error accumulation is avoided even when no GNSS coordinate fix is received for a long time, and stable positioning becomes possible.

Claims (8)

A map-based positioning method comprising:
generating a map around a LIDAR sensor using a point cloud obtained from the LIDAR sensor, and setting the generated map as a search window;
calculating the current position of the vehicle and the azimuth angle with respect to the traveling direction of the vehicle using a previous coordinate measurement of the vehicle measured by a GNSS sensor included in a GNSS/INS integrated sensor and a previous coordinate change of the vehicle measured by an INS sensor, and extracting an electronic map having a size larger than the search window from a map database based on the calculated current position of the vehicle;
rotating the extracted electronic map based on the calculated azimuth angle and performing map matching between the rotated electronic map and the search window by scanning the search window within the rotated electronic map; and
extracting the center coordinates of the map-matched search window and extracting the position coordinates of the electronic map corresponding to the extracted center coordinates.
The method according to claim 1, further comprising, after extracting the position coordinates of the electronic map,
determining final coordinates by inputting the extracted position coordinates of the electronic map, the current coordinate measurement measured by the GNSS sensor, and the current coordinate change measured by the INS sensor into a Kalman filter.
The method of claim 1, wherein the setting of the search window comprises:
extracting a set of points lying within a certain distance from the point cloud obtained from the LIDAR sensor;
generating a map around the sensor that includes specific lines of road and building areas, representable by mathematical model parameters, in the extracted set of points; and
setting the map including the specific lines as the search window.
The method of claim 3, wherein generating the map around the sensor comprises:
extracting the specific lines corresponding to the road and building areas from the extracted set of points using a linear fitting algorithm.
The method of claim 3, wherein generating the map around the LIDAR sensor comprises:
generating a map around the sensor including specific lines that can be represented by mathematical model parameters in the extracted set of points, using a RANSAC (Random Sample Consensus) algorithm.
delete
A map-based positioning system comprising:
a LIDAR sensor that emits a plurality of laser points to the surroundings to generate a point cloud of the surrounding environment;
a GNSS/INS integrated sensor including a GNSS sensor that measures a previous coordinate measurement of the vehicle and an INS sensor that measures a previous coordinate change of the vehicle;
a LIDAR sensor data processing unit that generates a map around the LIDAR sensor using the point cloud acquired from the LIDAR sensor and sets the generated map as a search window;
an integrated sensor data processing unit that calculates the current position of the vehicle and the heading azimuth with respect to the traveling direction of the vehicle using the previous coordinate measurement and the previous coordinate change measured by the GNSS/INS integrated sensor, and extracts an electronic map having a size larger than the search window from a map database based on the calculated current position;
a map matching unit that rotates the extracted electronic map based on the calculated azimuth angle and performs map matching between the rotated electronic map and the search window by scanning the search window within the rotated electronic map; and
a coordinate extraction unit that extracts the center coordinates of the map-matched search window and extracts the position coordinates of the electronic map corresponding to the extracted center coordinates.
The system of claim 7, wherein the LIDAR sensor data processing unit comprises:
an extraction unit that extracts a set of points existing within a specific distance from the point cloud obtained from the LIDAR sensor and extracts specific lines, representable by mathematical model parameters, for the road and building areas in the extracted point set;
a map generating unit that generates a map around the sensor including the extracted specific lines; and
a window setting unit that sets the map including the specific lines as the search window.
KR1020140158761A 2014-11-14 2014-11-14 Map-based positioning system and method thereof KR101625486B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140158761A KR101625486B1 (en) 2014-11-14 2014-11-14 Map-based positioning system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140158761A KR101625486B1 (en) 2014-11-14 2014-11-14 Map-based positioning system and method thereof

Publications (2)

Publication Number Publication Date
KR20160057755A KR20160057755A (en) 2016-05-24
KR101625486B1 (en) 2016-05-30

Family

ID=56113881

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140158761A KR101625486B1 (en) 2014-11-14 2014-11-14 Map-based positioning system and method thereof

Country Status (1)

Country Link
KR (1) KR101625486B1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101804681B1 (en) 2016-06-09 2017-12-05 재단법인대구경북과학기술원 A human detecting apparatus and method using a low-resolution 2d lidar sensor
DE102016210495A1 (en) * 2016-06-14 2017-12-14 Robert Bosch Gmbh Method and apparatus for creating an optimized location map and method for creating a location map for a vehicle
KR102265376B1 (en) * 2017-03-07 2021-06-16 현대자동차주식회사 Vehicle and controlling method thereof and autonomous driving system
CN109507995B (en) * 2017-09-14 2022-01-04 深圳乐动机器人有限公司 Management system of robot map and robot
KR102427980B1 (en) * 2017-12-20 2022-08-02 현대자동차주식회사 Vehicle and position recognition method of the same
KR102105590B1 (en) * 2018-11-05 2020-04-28 한국교통대학교산학협력단 System and method for improving accuracy of low-cost commercial GNSS Receiver
KR102555916B1 (en) * 2018-12-12 2023-07-17 현대자동차주식회사 Apparatus and method for identificating odm data reliability and vehicle including the same
CN109859562A (en) * 2019-01-31 2019-06-07 南方科技大学 Data creation method, device, server and storage medium
TWI711804B (en) * 2019-05-15 2020-12-01 宜陞有限公司 Vehicle navigation device for self-driving cars
CN112241016A (en) * 2019-07-19 2021-01-19 北京初速度科技有限公司 Method and device for determining geographic coordinates of parking map
KR102083913B1 (en) * 2019-09-03 2020-03-04 주식회사 모빌테크 Apparatus for building map using LiDAR
KR102083911B1 (en) * 2019-09-03 2020-03-04 주식회사 모빌테크 Method for building map including point cloud using LiDAR
CN112455503A (en) * 2019-09-09 2021-03-09 中车株洲电力机车研究所有限公司 Train positioning method and device based on radar
TWI725611B (en) * 2019-11-12 2021-04-21 亞慶股份有限公司 Vehicle navigation switching device for golf course self-driving cars
CN112946606B (en) * 2019-12-11 2024-02-13 北京万集科技股份有限公司 Laser radar calibration method, device, equipment, system and storage medium
KR102312892B1 (en) * 2019-12-20 2021-10-15 재단법인대구경북과학기술원 Apparatus and method for detecting road curb
US11725944B2 (en) * 2020-03-02 2023-08-15 Apollo Intelligent Driving Technology (Beijing) Co, Ltd. Method, apparatus, computing device and computer-readable storage medium for positioning
CN117250647A (en) * 2022-06-09 2023-12-19 腾讯科技(深圳)有限公司 Positioning method, device, equipment and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100217529A1 (en) 2009-02-20 2010-08-26 Matei Nicolai Stroila Determining Travel Path Features Based on Retroreflectivity

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Park, Jae-woong. A study on map-matching-based unmanned autonomous driving robust in off-road environments using a vehicle model and LIDAR. Journal of Institute of Control, Robotics and Systems, Vol. 17, No. 5, pp. 451-459, May 2011. *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108732603A (en) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 Method and apparatus for positioning vehicle
CN108732603B (en) * 2017-04-17 2020-07-10 百度在线网络技术(北京)有限公司 Method and device for locating a vehicle
US11859997B2 (en) 2018-04-04 2024-01-02 Samsung Electronics Co., Ltd. Electronic device for generating map data and operation method thereof
CN110082783A (en) * 2019-05-10 2019-08-02 北京理工大学 A kind of method and device of steep cliff detection

Also Published As

Publication number Publication date
KR20160057755A (en) 2016-05-24

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment
Payment date: 20190325
Year of fee payment: 4