US11568649B2 - Landmark-less simultaneous localization and mapping - Google Patents

Landmark-less simultaneous localization and mapping

Info

Publication number
US11568649B2
Authority
US
United States
Prior art keywords
data points
vehicle
geometric
anchor
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/150,017
Other versions
US20220230016A1
Inventor
Nizar Ahamed
James H. Critchley
Sina Gholamnejad Davani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive Systems Inc
Original Assignee
Continental Automotive Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Systems Inc
Priority to US17/150,017
Assigned to CONTINENTAL AUTOMOTIVE SYSTEMS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Ahamed, Nizar; Critchley, James H.; Gholamnejad Davani, Sina
Priority to EP22703825.4A
Priority to PCT/US2022/070205
Publication of US20220230016A1
Application granted
Publication of US11568649B2
Active legal status
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3602Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211Selection of the most significant subset of features
    • G06F18/2113Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
    • G06K9/0057
    • G06K9/623
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • G06V20/653Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/22Source localisation; Inverse modelling

Abstract

A simultaneous localization and mapping system for a motor vehicle is disclosed and includes a plurality of sensors disposed within a vehicle operable to detect objects proximate the vehicle and generate a plurality of data points representing sensor returns corresponding to the detected objects surrounding the vehicle, and a controller configured to receive the data points representing the sensor returns of the detected objects surrounding the vehicle, to define an occupancy grid based on the data points, and to generate vehicle operating instructions based on the defined occupancy grid, wherein the controller is configured to define at least one geometric anchor from the detected data points and to localize the vehicle based on the at least one geometric anchor.

Description

TECHNICAL FIELD
The present disclosure relates to driver assist and autonomous vehicle systems, and more specifically to a system and method of locating a vehicle relative to objects proximate the vehicle.
BACKGROUND
Vehicles may be equipped with a driver assist and/or autonomous vehicle operation system to operate a vehicle partially and/or fully independent of a vehicle operator.
In automated driving scenarios, it is important to know exactly where the vehicle is located on a map (a process known as localization) and, simultaneously, where things are in the world around the vehicle (a process known as mapping). Localization is often achieved using landmarks; however, situations arise in which a landmark is not available or cannot be classified from the sensor data.
The background description provided herein is for the purpose of generally presenting a context of this disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
SUMMARY
A simultaneous localization and mapping system for a motor vehicle according to one example disclosed embodiment includes, among other possible things, a plurality of sensors disposed within a vehicle operable to detect objects proximate the vehicle and generate a plurality of data points representing sensor returns corresponding to the detected objects surrounding the vehicle, and a controller configured to receive the data points representing the sensor returns of the detected objects surrounding the vehicle, to define an occupancy grid based on the data points, and to generate vehicle operating instructions based on the defined occupancy grid, wherein the controller is configured to define at least one geometric anchor from the detected data points and to localize the vehicle based on the at least one geometric anchor.
In another embodiment of the foregoing simultaneous localization and mapping system, the controller is configured to locate the vehicle relative to the at least one geometric anchor.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the plurality of data points representing sensor returns comprise an initial set of data points and additional sets of data points and the controller is further configured to correlate each of the additional sets of data points to the at least one geometric anchor.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the controller is further configured to estimate a position of the vehicle based on the correlation between additional sets of data points and the at least one geometric anchor.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the controller is further configured for correcting an odometry model of the vehicle based on the estimated position of the vehicle provided by the correlation between the additional sets of data points and the at least one geometric anchor.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the controller is further configured to redefine the at least one geometric anchor in response to at least one of a predefined distance of vehicle travel or a predefined number of additional sets of data points.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the at least one geometric anchor comprises two or more geometric anchors that define boundaries of a path of the vehicle.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the at least one geometric anchor comprises a set of data points on the generated occupancy grid arranged in a straight line.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the at least one geometric anchor comprises a set of data points on the generated occupancy grid arranged in a rectilinear shape or a curvilinear shape.
In another embodiment of any of the foregoing simultaneous localization and mapping systems, the plurality of sensors comprises at least one of a radar device, a lidar device and a camera.
A simultaneous localization and mapping method for a vehicle according to another disclosed example embodiment includes, among other possible things, detecting objects proximate to a vehicle with a plurality of sensors and generating a plurality of data points representing sensor returns corresponding to the detected objects surrounding the vehicle, generating an occupancy grid based on the data points representing sensor returns with a processing device disposed within the vehicle, defining at least one geometric anchor from the detected data points, and localizing the vehicle relative to the at least one geometric anchor.
In another embodiment of the foregoing method of simultaneous localization and mapping, generating the plurality of data points representing sensor returns comprises generating an initial set of data points and additional sets of data points, and generating the occupancy grid further comprises correlating each of the additional sets of data points with the at least one geometric anchor.
Another embodiment of any of the foregoing methods of simultaneous localization and mapping further comprises estimating a position of the vehicle based on the correlation between additional sets of data points and the at least one geometric anchor.
Another embodiment of any of the foregoing methods of simultaneous localization and mapping further comprises correcting an odometry model of the vehicle based on the estimated position of the vehicle provided by the correlation between the additional sets of data points and the at least one geometric anchor.
Another embodiment of any of the foregoing methods of simultaneous localization and mapping further comprises redefining the at least one geometric anchor in response to a predefined distance of vehicle travel.
Another embodiment of any of the foregoing methods of simultaneous localization and mapping further comprises redefining the at least one geometric anchor in response to obtaining a predefined number of additional sets of data points.
In another embodiment of any of the foregoing methods of simultaneous localization and mapping, the at least one geometric anchor comprises two or more geometric anchors that define boundaries of a path of the vehicle.
In another embodiment of any of the foregoing methods of simultaneous localization and mapping, the at least one geometric anchor comprises a set of data points on the generated occupancy grid arranged in a straight line.
In another embodiment of any of the foregoing methods of simultaneous localization and mapping, the at least one geometric anchor comprises a set of data points on the generated occupancy grid arranged in a rectilinear shape.
In another embodiment of any of the foregoing methods of simultaneous localization and mapping, the at least one geometric anchor comprises a set of data points arranged in a curvilinear shape.
Although the different examples have the specific components shown in the illustrations, embodiments of this disclosure are not limited to those particular combinations. It is possible to use some of the components or features from one of the examples in combination with features or components from another one of the examples.
These and other features disclosed herein can be best understood from the following specification and drawings, of which the following is a brief description.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of a vehicle including an example simultaneous localization and mapping system.
FIG. 2 is a schematic view of an example occupancy grid generated from accumulated sensor data points.
FIG. 3 is a schematic view of the example occupancy grid with geometric anchor features defined from the accumulated sensor data points.
FIG. 4 is a schematic view of the example occupancy grid with additional sensor data points correlated to the geometric anchor features.
DETAILED DESCRIPTION
Referring to FIG. 1, a vehicle 20 is schematically shown and includes radar devices 22 disposed at various locations to obtain a 360-degree sensor field of view. The vehicle may also include cameras 32 to provide additional sensor fields of view. The vehicle includes a simultaneous localization and mapping (SLAM) system 25 for generating mapping information that may be utilized by autonomous and assisted driving systems. The example SLAM system 25 utilizes data points obtained from sensor systems disposed on the vehicle, such as the radar devices 22 and cameras 32, to define at least one geometric anchor that may then be utilized to determine a relative position of the vehicle 20 and to correlate subsequently obtained data points.
The SLAM system 25 is an algorithm executed by a controller 24 of the vehicle 20. The controller 24 is schematically shown and includes at least a processing device 26 and a memory device 30. The controller 24 and the processing device 26 may be a hardware device for executing software, particularly software stored in the memory 30. The processing device 26 can be a custom-made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the computing device, a semiconductor-based microprocessor (in the form of a microchip or chip set), or generally any device for executing software instructions.
The memory 30 can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, VRAM, etc.)) and/or nonvolatile memory elements. Moreover, the memory 30 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory can also have a distributed architecture, where various components are situated remotely from one another, but can be accessed by the processor.
The software in the memory 30 may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing disclosed logical functions and operation. A system component embodied as software may also be construed as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When constructed as a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory.
Input/Output devices (not shown) that may be coupled to system I/O Interface(s) may include input devices, for example but not limited to, a keyboard, mouse, scanner, microphone, camera, proximity device, etc. Further, the Input/Output devices may also include output devices, for example but not limited to, a printer, display, etc. Finally, the Input/Output devices may further include devices that communicate both as inputs and outputs, for instance but not limited to, a modulator/demodulator (modem; for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc.
When the SLAM system 25 is in operation, the processor 26 can be configured to execute software stored within the memory 30, to communicate data to and from the memory 30, and to generally control operations of the system 25 pursuant to the software. Software in memory, in whole or in part, is read by the processor, perhaps buffered within the processor, and then executed. The controller 24 may control vehicle systems schematically indicated at 28 to either autonomously control the vehicle 20 or provide driver assist functions to aid an operator of the vehicle 20.
Referring to FIG. 2, with continued reference to FIG. 1, the radar devices 22 and the cameras 32 generate a plurality of sensor return data points 34 that represent points on objects within the environment surrounding the vehicle. The data points 34 are utilized to generate an occupancy grid 35 that provides the vehicle with information regarding objects proximate to the vehicle 20 and the vehicle's location relative to those objects. The data points 34 may be fused and filtered before being added to the occupancy grid 35.
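For illustration only, the following Python sketch shows one way fused sensor return points could be accumulated into a simple 2D occupancy grid of hit counts. The grid size, resolution, and hit-count update rule are assumptions for this sketch and are not specified by the disclosure.

```python
import numpy as np

class OccupancyGrid:
    """Minimal illustrative occupancy grid built from fused sensor return points."""

    def __init__(self, size_m=60.0, resolution_m=0.25):
        self.resolution = resolution_m
        self.cells = int(size_m / resolution_m)
        self.hits = np.zeros((self.cells, self.cells), dtype=np.int32)
        # vehicle assumed at the grid center (illustrative choice)
        self.origin = np.array([size_m / 2.0, size_m / 2.0])

    def add_points(self, points_xy):
        """Add fused/filtered sensor return points (N x 2, meters, vehicle frame)."""
        idx = np.floor((np.asarray(points_xy) + self.origin) / self.resolution).astype(int)
        in_bounds = np.all((idx >= 0) & (idx < self.cells), axis=1)
        for ix, iy in idx[in_bounds]:
            self.hits[iy, ix] += 1  # simple hit-count accumulation

    def occupied_cells(self, min_hits=3):
        """Cells observed often enough to be treated as occupied."""
        return np.argwhere(self.hits >= min_hits)

# usage: grid = OccupancyGrid(); grid.add_points(np.array([[4.2, 1.0], [4.3, 1.1]]))
```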
In this disclosed example, data points are generated for portions of a first home 38, edges of a driveway 36, a second home 42, and a curb 40 across a roadway 46 from the first home 38. The background images of the homes and other features are provided to give real-world context to the locations of the data points 34. The data points 34 are shown in FIG. 2 as dots proximate features and objects surrounding the vehicle 20. The combined data points 34 provide an initial picture of the features surrounding the vehicle. In this example, the data points 34 represent an initial set of data points that provides for the generation of the initial occupancy grid 35.
The vehicle 20 may utilize the occupancy grid 35 to define a path 44 to a desired destination. In this example, the desired destination is a parking spot within a garage of the first home 38. The SLAM system 25 makes multiple passes over the environment such that additional data points are continually added to the occupancy grid 35 to improve the information and understanding of the environment around the vehicle 20.
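As a hedged illustration of how a path such as path 44 could be traced over free cells of the occupancy grid, the sketch below uses a plain breadth-first search on a 4-connected grid. The disclosure does not specify a planning method, so this planner and its inputs are assumptions.

```python
from collections import deque

def grid_path(free, start, goal):
    """free: 2D array/list of booleans (True = traversable cell).
    start, goal: (row, col) tuples. Returns a list of cells or None."""
    rows, cols = len(free), len(free[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and free[nr][nc] and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no traversable route found
```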
Referring to FIG. 3, the accumulation of data points 34 and population of the occupancy grid 35 improve confidence in the occupancy grid 35 over several cycles of data point gathering. Previous data points 34 are compared to and combined with subsequent data points, and geometric features such as lines, curvilinear shapes and rectilinear shapes are recognized. The example SLAM system 25 utilizes the data points 34 and the recognized geometric shapes to improve the occupancy grid 35. The recognized geometric shapes, such as the example lines 48, circle 50 and rectangle 52, are added back to the occupancy grid as defined geometric anchors. The geometric anchors 48, 50 and 52 are clusters defined by a plurality of data points 34. The constructed geometric anchors 48, 50 and 52 are maintained and utilized to verify subsequent data points rather than verifying each subsequent data point against a preceding data point. The example system and method therefore construct the geometric localization features from the accumulated data points and create geometric anchors that are referenced for localization purposes instead of using radar point based data directly. Optionally, the existing radar point based data may also be used in addition to the created geometric anchors.
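A minimal sketch of how a straight-line geometric anchor, such as lines 48, might be constructed from a cluster of accumulated data points. The disclosure does not name a fitting method, so the total-least-squares fit via the cluster's principal direction and the promotion threshold below are assumptions.

```python
import numpy as np

def fit_line_anchor(points_xy):
    """points_xy: N x 2 array of clustered data points.
    Returns (centroid, unit direction, RMS perpendicular error)."""
    pts = np.asarray(points_xy, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    # principal direction of the cluster = dominant eigenvector of the covariance
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, np.argmax(eigvals)]
    # perpendicular residuals indicate how "line-like" the cluster is
    perp = centered - np.outer(centered @ direction, direction)
    rms = float(np.sqrt((perp ** 2).sum(axis=1).mean()))
    return centroid, direction, rms

# a cluster might be promoted to a line anchor only if the fit is tight,
# e.g. fit_line_anchor(cluster)[2] < 0.2  (threshold is an assumption)
```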
As shown in FIG. 3, data points 34 are added to the occupancy grid 35 and accumulated until sufficient data points 34 are present to define the geometric anchors. In this example, lines 48 are defined along the driveway 36, on the side of the second home 42, and across the street along a portion of the curb 40. Accordingly, rather than a plurality of clustered data points 34, the occupancy grid 35 now includes common anchor features that provide a high confidence level for localization of the vehicle. As appreciated, only a few geometric anchor features are shown in FIG. 3 by way of example, but any number of anchor features may be utilized and generated for any occupancy grid 35 within the contemplation of this disclosure.
Referring to FIG. 4 with continued reference to FIG. 3, the generated anchor features 48, 50 and 52 are utilized to correlate new sensor data points 54. Correlating each new data point 54 to the anchor features 48, 50 and 52, rather than to a previous data point 34, significantly reduces the number of calculations required, which in turn reduces the data processing requirements of the overall system.
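The sketch below illustrates that association step under the same assumed line-anchor representation used above: each new data point 54 is gated by its perpendicular distance to the nearest anchor line rather than compared against every stored data point 34. The anchor tuple format and the 0.5 m gate are assumptions.

```python
import numpy as np

def correlate_to_anchors(point_xy, anchors, gate_m=0.5):
    """anchors: list of (centroid, unit direction) tuples describing line anchors.
    Returns (index of best anchor, distance) or (None, None) if outside the gate."""
    p = np.asarray(point_xy, dtype=float)
    best_idx, best_dist = None, gate_m
    for i, (centroid, direction) in enumerate(anchors):
        offset = p - centroid
        # perpendicular distance from the point to the infinite anchor line
        dist = abs(offset[0] * direction[1] - offset[1] * direction[0])
        if dist < best_dist:
            best_idx, best_dist = i, dist
    return best_idx, (best_dist if best_idx is not None else None)
```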
Once the vehicle 20 changes its position, the new sensor data 54 accumulated over multiple measurement cycles is correlated with the existing data in the occupancy grid 35. The occupancy grid 35, as shown in FIG. 4, includes older data points and the geometric anchors 48, 50 and 52. In one disclosed example, the geometric anchors 48, 50 and 52 alone are used to correlate future sensor data to reduce the complexity of the solution while also improving computing efficiency and reducing required processing capacity.
The correlation between the new data points 54 and the geometric anchors 48, 50 and 52 is used to estimate the vehicle position and correct any error in the odometry model of the vehicle 20.
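One possible form of that position estimate, sketched under the assumptions of matched point/line-anchor pairs and a translation-only correction: the offset that best re-aligns the new points with their anchors is solved by least squares and added to the odometry estimate. The actual estimator used by the system is not specified in the disclosure.

```python
import numpy as np

def estimate_translation_correction(points_xy, matched_anchors):
    """points_xy: N x 2 new returns already placed using the odometry pose.
    matched_anchors: list of (centroid, unit direction) matched to each point.
    Returns a 2D correction to add to the odometry position estimate."""
    rows, residuals = [], []
    for p, (centroid, direction) in zip(np.asarray(points_xy, dtype=float), matched_anchors):
        normal = np.array([-direction[1], direction[0]])   # unit normal of the anchor line
        rows.append(normal)
        residuals.append(normal @ (centroid - p))          # signed perpendicular error
    A, b = np.array(rows), np.array(residuals)
    correction, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares translation
    return correction

# corrected_position = odometry_position + estimate_translation_correction(...)
```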
The occupancy grid 35 can then be built around the corrected position of the vehicle 20 with the latest data points 54. In one disclosed example, the geometric anchors 48, 50 and 52 are recomputed after a predefined number of sensor data gathering cycles. In another disclosed embodiment, the geometric anchors 48, 50 and 52 can be recomputed after a predefined distance travelled by the vehicle 20.
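A minimal sketch of the two disclosed refresh triggers; the specific threshold values are assumptions chosen only for illustration.

```python
def should_recompute_anchors(cycles_since_refresh, distance_since_refresh_m,
                             max_cycles=50, max_distance_m=5.0):
    """Recompute anchors after a predefined number of data-gathering cycles
    or a predefined distance travelled, whichever occurs first."""
    return (cycles_since_refresh >= max_cycles
            or distance_since_refresh_m >= max_distance_m)
```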
Accordingly, the example SLAM system 25 utilizes one or more geometric anchors 48, 50 and 52 to correlate subsequently gathered data points for localizing a vehicle. The use of the geometric anchor features for correlating subsequently gathered data points reduces overall computing requirements to improve system efficiency.
Although the different non-limiting embodiments are illustrated as having specific components or steps, the embodiments of this disclosure are not limited to those particular combinations. It is possible to use some of the components or features from any of the non-limiting embodiments in combination with features or components from any of the other non-limiting embodiments.
It should be understood that like reference numerals identify corresponding or similar elements throughout the several drawings. It should be understood that although a particular component arrangement is disclosed and illustrated in these exemplary embodiments, other arrangements could also benefit from the teachings of this disclosure.
The foregoing description shall be interpreted as illustrative and not in any limiting sense. A worker of ordinary skill in the art would understand that certain modifications could come within the scope of this disclosure. For these reasons, the following claims should be studied to determine the true scope and content of this disclosure.

Claims (15)

What is claimed is:
1. A simultaneous localization and mapping system for a motor vehicle comprising:
a plurality of sensors disposed within a vehicle operable to detect objects proximate the vehicle and generate a plurality of data points representing sensor returns corresponding to the detected objects surrounding the vehicle; and
a controller configured to receive the data points representing the sensor returns of the detected objects surrounding the vehicle, to define an occupancy grid based on the data points, to define in the occupancy grid at least one geometric anchor from the detected data points, the at least one geometric anchor comprising at least one cluster of some of the data points and having a two dimensional (2D) or three dimensional (3D) shape, and to localize the vehicle based on the occupancy grid and the at least one geometric anchor therein using a simultaneous localization and mapping algorithm,
wherein the plurality of data points representing sensor returns comprises an initial set of data points and additional sets of data points and the controller is further configured to correlate each of the additional sets of data points to the at least one geometric anchor and to estimate a position of the vehicle based on the correlation between additional sets of data points and the at least one geometric anchor.
2. The simultaneous localization and mapping system as recited in claim 1, wherein the controller is further configured for correcting an odometry model of the vehicle based on the estimated position of the vehicle provided by the correlation between the additional sets of data points and the at least one geometric anchor.
3. The simultaneous localization and mapping system as recited in claim 1, wherein the controller is further configured to redefine the at least one geometric anchor in response to at least one of a predefined distance of vehicle travel or a predefined number of additional sets of data points.
4. The simultaneous localization and mapping system as recited in claim 1, wherein the at least one geometric anchor comprises two or more geometric anchors that define boundaries of a path of the vehicle.
5. The simultaneous localization and mapping system as recited in claim 1, wherein the at least one geometric anchor comprises the at least one cluster of some of the data points on the generated occupancy grid arranged in a straight line.
6. The simultaneous localization and mapping system as recited in claim 1, wherein the at least one geometric anchor comprises the at least one cluster of some of the data points on the generated occupancy grid arranged in a rectilinear shape or a curvilinear shape.
7. The simultaneous localization and mapping system as recited in claim 1, wherein the plurality of sensors comprises at least one of a radar device, a lidar device and a camera.
8. A simultaneous localization and mapping method for a vehicle, comprising:
detecting objects proximate to a vehicle with a plurality of sensors and generating a plurality of data points representing sensor returns corresponding to the detected objects surrounding the vehicle;
generating an occupancy grid based on the data points representing sensor returns with a processing device disposed within the vehicle, defining at least one geometric anchor in the occupancy grid from the detected data points, the at least one geometric anchor comprising at least one cluster of some of the data points and having a two dimensional (2D) or three dimensional (3D) shape, to localize the vehicle relative to the occupancy grid and the at least one geometric anchor therein,
wherein generating the plurality of data points representing sensor returns comprises generating an initial set of data points and additional sets of data points and generating the occupancy grid further comprises correlating each of the additional sets of data points with the at least one geometric anchor,
wherein the method further comprises estimating a position of the vehicle based on the correlation between additional sets of data points and the at least one geometric anchor.
9. The method as recited in claim 8, further comprising correcting an odometry model of the vehicle based on the estimated position of the vehicle provided by the correlation between the additional sets of data points and the at least one geometric anchor.
10. The method as recited in claim 8, further comprising redefining the at least one geometric anchor in response to a predefined distance of vehicle travel.
11. The method as recited in claim 8, further comprising redefining the at least one geometric anchor in response to obtaining a predefined number of additional sets of data points.
12. The method as recited in claim 8, wherein the at least one geometric anchor comprises two or more geometric anchors that define boundaries of a path of the vehicle.
13. The method as recited in claim 8, wherein the at least one geometric anchor comprises the at least one cluster of some of the data points on the generated occupancy grid arranged in a straight line.
14. The method as recited in claim 8, wherein the at least one geometric anchor comprises the at least one cluster of some of the data points on the generated occupancy grid arranged in a rectilinear shape.
15. The method as recited in claim 8, wherein the at least one geometric anchor comprises the at least one cluster of some of the data points arranged in a curvilinear shape.

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/150,017 US11568649B2 (en) 2021-01-15 2021-01-15 Landmark-less simultaneous localization and mapping
EP22703825.4A EP4278153A1 (en) 2021-01-15 2022-01-14 Landmark-less simultaneous localization and mapping technical field
PCT/US2022/070205 WO2022155677A1 (en) 2021-01-15 2022-01-14 Landmark-less simultaneous localization and mapping technical field

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/150,017 US11568649B2 (en) 2021-01-15 2021-01-15 Landmark-less simultaneous localization and mapping

Publications (2)

Publication Number Publication Date
US20220230016A1 2022-07-21
US11568649B2 2023-01-31

Family

ID=80447935

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/150,017 Active US11568649B2 (en) 2021-01-15 2021-01-15 Landmark-less simultaneous localization and mapping

Country Status (3)

Country Link
US (1) US11568649B2 (en)
EP (1) EP4278153A1 (en)
WO (1) WO2022155677A1 (en)


Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7015831B2 (en) * 2002-12-17 2006-03-21 Evolution Robotics, Inc. Systems and methods for incrementally updating a pose of a mobile device calculated by visual simultaneous localization and mapping techniques
US8830091B2 (en) * 2002-12-17 2014-09-09 Irobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
US9110470B2 (en) * 2002-12-17 2015-08-18 Irobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
US8855819B2 (en) 2008-10-09 2014-10-07 Samsung Electronics Co., Ltd. Method and apparatus for simultaneous localization and mapping of robot
US20180299275A1 (en) 2011-09-30 2018-10-18 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9329598B2 (en) 2013-05-23 2016-05-03 Irobot Corporation Simultaneous localization and mapping for a mobile robot
US20170176998A1 (en) 2014-06-05 2017-06-22 Conti Temic Microelectronic Gmbh Method and system for determining the position of a vehicle
US20160139255A1 (en) 2014-11-17 2016-05-19 Volkswagen Aktiengesellschaft Method and device for the localization of a vehicle from a fixed reference map
US20200284587A1 (en) * 2016-08-04 2020-09-10 Reification Inc. Methods for simultaneous localization and mapping (slam) and related apparatus and systems
US10222211B2 (en) * 2016-12-30 2019-03-05 DeepMap Inc. Alignment of data captured by autonomous vehicles to generate high definition maps
US20190384318A1 (en) 2017-01-31 2019-12-19 Arbe Robotics Ltd. Radar-based system and method for real-time simultaneous localization and mapping
CN106840179A (en) 2017-03-07 2017-06-13 中国科学院合肥物质科学研究院 A kind of intelligent vehicle localization method based on multi-sensor information fusion
US10571280B2 (en) * 2017-05-09 2020-02-25 Toyota Research Institute, Inc. Systems and methods for localizing a vehicle using a roadway signature
US20200109954A1 (en) 2017-06-30 2020-04-09 SZ DJI Technology Co., Ltd. Map generation systems and methods
WO2019018315A1 (en) 2017-07-17 2019-01-24 Kaarta, Inc. Aligning measured signal data with slam localization data and uses thereof
US20190113927A1 (en) 2017-10-18 2019-04-18 Luminar Technologies, Inc. Controlling an Autonomous Vehicle Using Cost Maps
US10606274B2 (en) * 2017-10-30 2020-03-31 Nio Usa, Inc. Visual place recognition based self-localization for autonomous vehicles
CN107990899A (en) 2017-11-22 2018-05-04 驭势科技(北京)有限公司 A kind of localization method and system based on SLAM
US20190171224A1 (en) 2017-12-01 2019-06-06 Volkswagen Aktiengesellschaft Method and Device for Self-Positioning a Vehicle
US20200380270A1 (en) 2018-03-23 2020-12-03 NetraDyne, Inc. Traffic Boundary Mapping
US11281228B2 (en) * 2018-06-14 2022-03-22 Volkswagen Aktiengesellschaft Method and device for determining a position of a transportation vehicle
WO2020168667A1 (en) 2019-02-18 2020-08-27 广州小鹏汽车科技有限公司 High-precision localization method and system based on shared slam map
CN109917376A (en) 2019-02-26 2019-06-21 东软睿驰汽车技术(沈阳)有限公司 A kind of localization method and device
US11391578B2 (en) * 2019-07-02 2022-07-19 Nvidia Corporation Using measure of constrainedness in high definition maps for localization of vehicles
US11138465B2 (en) * 2019-12-10 2021-10-05 Toyota Research Institute, Inc. Systems and methods for transforming coordinates between distorted and undistorted coordinate systems
US20210201569A1 (en) * 2019-12-31 2021-07-01 Lyft, Inc. Map Feature Extraction Using Overhead View Images
US20210284198A1 (en) * 2020-03-10 2021-09-16 Seegrid Corporation Self-driving vehicle path adaptation system and method
US20210404814A1 (en) * 2020-06-30 2021-12-30 Lyft, Inc. Map Generation Using Two Sources of Sensor Data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
The International Search Report and the Written Opinion of the International Searching Authority dated May 19, 2022 for the counterpart PCT Application No. PCT/US2022/070205.

Also Published As

Publication number Publication date
EP4278153A1 (en) 2023-11-22
WO2022155677A1 (en) 2022-07-21
US20220230016A1 (en) 2022-07-21

Similar Documents

Publication Publication Date Title
US11530924B2 (en) Apparatus and method for updating high definition map for autonomous driving
CN110312912B (en) Automatic vehicle parking system and method
US9129523B2 (en) Method and system for obstacle detection for vehicles using planar sensor data
CN110867132B (en) Environment sensing method, device, electronic equipment and computer readable storage medium
US8558679B2 (en) Method of analyzing the surroundings of a vehicle
US10325163B2 (en) Vehicle vision
US20220169280A1 (en) Method and Device for Multi-Sensor Data Fusion For Automated and Autonomous Vehicles
KR102547274B1 (en) Moving robot and method for estiating location of moving robot
CN112880694B (en) Method for determining the position of a vehicle
US11150096B2 (en) Method and device for the localization of a vehicle based on a degree of robustness of the localization
KR20180066618A (en) Registration method of distance data and 3D scan data for autonomous vehicle and method thereof
JP4052291B2 (en) Image processing apparatus for vehicle
JP2019512699A5 (en)
CN102436758B (en) Method and apparatus for supporting parking process of vehicle
US20220205804A1 (en) Vehicle localisation
JP2021120255A (en) Distance estimation device and computer program for distance estimation
EP4272185A1 (en) Image semantic segmentation for parking space detection
CN111806421B (en) Vehicle attitude determination system and method
US11568649B2 (en) Landmark-less simultaneous localization and mapping
JP6988873B2 (en) Position estimation device and computer program for position estimation
US20220404506A1 (en) Online validation of lidar-to-lidar alignment and lidar-to-vehicle alignment
JP7409330B2 (en) Self-position estimation accuracy verification method, self-position estimation system
CN113261007A (en) Method and apparatus for multi-sensor data fusion for autonomous and autonomous vehicles
JP5460413B2 (en) Own vehicle position recognition device
CN113795418A (en) Vehicle positioning

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONTINENTAL AUTOMOTIVE SYSTEMS, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AHAMED, NIZAR;CRITCHLEY, JAMES H.;GHOLAMNEJAD DAVANI, SINA;SIGNING DATES FROM 20210107 TO 20210115;REEL/FRAME:054931/0757

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE