EP4260154A1 - Localization based on semantic objects - Google Patents
Localization based on semantic objects
- Publication number
- EP4260154A1 (application EP21904119.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- sensor
- map
- determining
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
Abstract
Techniques for determining a location of a vehicle in an environment using sensors, and for determining calibration information associated with those sensors. A vehicle may use map data to traverse an environment. The map data may include semantic map objects such as traffic lights, lane markings, and the like. The vehicle may use a sensor, such as an image sensor, to capture sensor data. Semantic map objects may be projected into the sensor data and matched against the object(s) detected in the sensor data. Such semantic objects may be represented as a center point and covariance data. A distance or probability associated with the projected semantic map object and the detected object may be optimized to determine a location of the vehicle. Detected objects may be determined to be the same object based on matching against the semantic map object. Epipolar geometry may be used to determine whether sensors are capturing consistent data.
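The matching step described in the abstract, in which a projected semantic map object (center point plus covariance) is compared against a detected object, can be illustrated with a minimal sketch. The following is not the patented method itself but a simplified, hypothetical illustration: a planar rigid transform stands in for full camera projection, objects are matched by index rather than by data association, and candidate poses are scored by summed squared Mahalanobis distance (one common choice of covariance-weighted distance), with the lowest-cost pose selected. All function and variable names are assumptions for illustration.

```python
import numpy as np

def project_to_vehicle(obj_xy, pose):
    """Transform a 2D map-frame object center into the vehicle frame
    for a candidate pose (x, y, yaw). This planar transform is a
    hypothetical stand-in for projecting a map object into sensor data."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, s], [-s, c]])  # world -> vehicle rotation
    return R @ (obj_xy - np.array([x, y]))

def mahalanobis_sq(residual, cov):
    """Squared Mahalanobis distance of a residual under a covariance,
    i.e. r^T * cov^-1 * r, computed via a linear solve."""
    return float(residual @ np.linalg.solve(cov, residual))

def localize(map_objects, detections, covs, candidate_poses):
    """Score each candidate pose by the summed squared Mahalanobis
    distance between projected map objects and their matched detections
    (matched by index here), and return the lowest-cost pose."""
    best_pose, best_cost = None, np.inf
    for pose in candidate_poses:
        cost = sum(
            mahalanobis_sq(project_to_vehicle(m, pose) - d, cov)
            for m, d, cov in zip(map_objects, detections, covs)
        )
        if cost < best_cost:
            best_pose, best_cost = pose, cost
    return best_pose, best_cost
```

In practice the pose would be refined by continuous optimization (e.g. nonlinear least squares over the same covariance-weighted residuals) rather than by scoring a discrete candidate set; the grid search here only keeps the example short.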
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/119,518 US11538185B2 (en) | 2020-12-11 | 2020-12-11 | Localization based on semantic objects |
US17/119,562 US20220185331A1 (en) | 2020-12-11 | 2020-12-11 | Calibration based on semantic objects |
PCT/US2021/060997 WO2022125322A1 (fr) | 2020-12-11 | 2021-11-29 | Localisation basée sur des objets sémantiques |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4260154A1 true EP4260154A1 (fr) | 2023-10-18 |
Family
ID=81974049
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21904119.1A Pending EP4260154A1 (fr) | 2020-12-11 | 2021-11-29 | Localisation basée sur des objets sémantiques |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4260154A1 (fr) |
JP (1) | JP2023553238A (fr) |
CN (1) | CN116324928A (fr) |
WO (1) | WO2022125322A1 (fr) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10809361B2 (en) * | 2017-05-31 | 2020-10-20 | Uatc, Llc | Hybrid-view LIDAR-based object detection |
US11003945B2 (en) * | 2019-05-22 | 2021-05-11 | Zoox, Inc. | Localization using semantically segmented images |
US11170485B2 (en) * | 2019-05-22 | 2021-11-09 | Here Global B.V. | Method, apparatus, and system for automatic quality assessment of cross view feature correspondences using bundle adjustment techniques |
- 2021
- 2021-11-29 WO PCT/US2021/060997 patent/WO2022125322A1/fr active Application Filing
- 2021-11-29 EP EP21904119.1A patent/EP4260154A1/fr active Pending
- 2021-11-29 CN CN202180066759.8A patent/CN116324928A/zh active Pending
- 2021-11-29 JP JP2023517891A patent/JP2023553238A/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2022125322A1 (fr) | 2022-06-16 |
JP2023553238A (ja) | 2023-12-21 |
CN116324928A (zh) | 2023-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10937178B1 (en) | Image-based depth data and bounding boxes | |
US11351991B2 (en) | Prediction based on attributes | |
US11748909B2 (en) | Image-based depth data and localization | |
US11021148B2 (en) | Pedestrian prediction based on attributes | |
US11295161B2 (en) | Localization using semantically segmented images | |
US10984543B1 (en) | Image-based depth data and relative depth data | |
US11386671B2 (en) | Refining depth from an image | |
WO2020236720A1 (fr) | Localisation à l'aide d'images segmentées sémantiquement | |
US11538185B2 (en) | Localization based on semantic objects | |
US20220185331A1 (en) | Calibration based on semantic objects | |
US11614742B2 (en) | Height estimation using sensor data | |
US11829449B2 (en) | Intermediate input for machine learned model | |
US11847831B2 (en) | Multi-resolution top-down prediction | |
WO2021096806A1 (fr) | Apprentissage de modèle de données de profondeur avec suréchantillonnage, pertes et équilibrage de perte | |
WO2022125308A1 (fr) | Détermination d'entrées pour un système de perception | |
US11761780B1 (en) | Determining data for semantic localization | |
US11810370B2 (en) | Techniques for identifying curbs | |
US20220171404A1 (en) | Techniques for authorizing vehicle control systems | |
US20230058731A1 (en) | Determining occupancy using unobstructed sensor emissions | |
EP4260154A1 (fr) | Localisation basée sur des objets sémantiques | |
US11983933B1 (en) | Boundary aware top-down trajectory prediction | |
US11906967B1 (en) | Determining yaw with learned motion model | |
US20230059808A1 (en) | Determining object characteristics using unobstructed sensor emissions | |
WO2022146622A1 (fr) | Entrée intermédiaire pour modèle appris de manière automatisée |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed |
Effective date: 20230120 |
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) |