EP4318397B1 - Method of computer vision based localisation and navigation and system for performing the same - Google Patents
- Publication number
- EP4318397B1 (application EP23217324.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- subject
- environment
- objects
- data
- observed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3863—Structures of map data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/77—Determining position or orientation of objects or cameras using statistical methods
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Multimedia (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Probability & Statistics with Applications (AREA)
- Algebra (AREA)
- Computational Mathematics (AREA)
- Mathematical Analysis (AREA)
- Medical Informatics (AREA)
- Mathematical Optimization (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Geometry (AREA)
- Pure & Applied Mathematics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Claims (13)
- A method of determining the position of a subject, comprising the steps of: retrieving and storing an object dataset containing object data indicative of a plurality of objects within an environment, including an indication of object parameters associated with each object, the object parameters including a location of the object within the environment and a semantic type classification associated with the object; capturing, from a sensor associated with the subject, environment data indicative of a particular region of the environment; determining the presence of a plurality of observed objects within the environment data, using an artificial neural network to detect and classify each observed object, including determining one or more equivalent observed object parameters associated with each observed object, including its respective semantic type classification; and determining the position of the subject using a probabilistic model, including determining a probability distribution across a plurality of potential positions indicating the assessed likelihood of the subject being located at each of those positions, based on a comparison of the observed object parameters of each of the observed objects with the corresponding object parameters of the objects in the object dataset, including a comparison of the respective semantic type classifications.
- A method according to claim 1, wherein the sensor is a camera and the environment data comprises image data.
- A method according to any preceding claim, wherein the step of determining one or more equivalent observed object parameters associated with each observed object comprises using the artificial neural network to map the data indicative of the region of the environment to a probability distribution across the classification types associated with the objects in the object dataset, indicating the assessed likelihood of the observed object having each of those types associated with it.
- A method according to any preceding claim, further comprising the step of providing a direct instruction to control the subject or to take an action based on the determined position of the subject.
- A method according to claim 4, wherein the subject is at least partially autonomous and the instruction to control the subject or to take an action results in an automated response by the subject.
- A method according to any preceding claim, wherein a particle filter algorithm is used to assess the likelihood of the subject being located at each of the plurality of potential positions.
- A method according to any preceding claim, wherein the objects are landmarks, and wherein the landmark type classifications comprise one or more of vegetation, buildings, structures, geographical features, signs, telegraph poles, electricity pylons, street lights, and roads, road edges and road markings.
- A method according to any preceding claim, further comprising measuring an orientation, heading and/or bearing of the subject.
- A system (10) for determining the position of a subject, comprising: a storage device (18) configured to store an object dataset comprising object data indicative of a plurality of objects within an environment, including an indication of object parameters associated with each object, the object parameters including a respective location of the object within the environment and a semantic type classification associated with the object; a sensor (20) configured to capture environment data indicative of a region of the environment; an object recognition module (42) configured to determine the presence of a plurality of observed objects within the environment data using an artificial neural network to detect and classify each observed object, which comprises determining one or more equivalent observed object parameters associated with each observed object, including its respective semantic type classification; and a position recognition module (44) configured to determine the position of the subject using a probabilistic model, including determining a probability distribution across a plurality of potential positions indicating the assessed likelihood of the subject being located at each of those positions, based on a comparison of the observed object parameters of each of the plurality of observed objects with the corresponding object parameters of the objects in the object dataset, including a comparison of the respective semantic type classifications.
- A system (10) according to claim 9, wherein the sensor (20) is a camera and the environment data comprises image data.
- A system (10) according to claim 9 or claim 10, wherein the artificial neural network is configured to map the data indicative of the region of the environment to a probability distribution across the classification types associated with the objects in the object dataset, indicating the assessed likelihood of the observed object having each of those associated types.
- A system (10) according to any of claims 9 to 11, wherein the position recognition module (44) is configured to use a particle filter algorithm to assess the likelihood of the subject being located at each of the plurality of potential positions.
- A vehicle (42) including the system (10) according to any of claims 9 to 12.
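The pipeline the claims describe, a stored landmark dataset with semantic type classifications (claim 1), a classifier that yields a probability distribution over those types (claims 3 and 11), and a particle filter that scores candidate positions by both geometric and semantic agreement (claims 6 and 12), can be illustrated with a minimal sketch. This is not the patent's implementation; all names, coordinates and noise parameters below are illustrative assumptions.

```python
import math
import random

# Illustrative "object dataset" (claim 1): each landmark has a location in
# the environment and a semantic type classification.
LANDMARKS = [
    {"xy": (2.0, 3.0), "type": "tree"},
    {"xy": (8.0, 1.0), "type": "street_light"},
    {"xy": (5.0, 9.0), "type": "sign"},
]

def likelihood(pose, observations, r_sigma=1.0, b_sigma=0.2):
    """Weight a candidate pose by comparing each observed object (range,
    bearing, and the classifier's distribution over semantic types) against
    the mapped landmarks, matching geometry AND semantics."""
    px, py, ph = pose
    weight = 1.0
    for obs in observations:
        best = 1e-12
        for lm in LANDMARKS:
            dx, dy = lm["xy"][0] - px, lm["xy"][1] - py
            exp_range = math.hypot(dx, dy)
            # Bearing expected from this pose, wrapped to [-pi, pi)
            exp_bearing = (math.atan2(dy, dx) - ph + math.pi) % (2 * math.pi) - math.pi
            g = math.exp(-((obs["range"] - exp_range) ** 2) / (2 * r_sigma ** 2))
            g *= math.exp(-((obs["bearing"] - exp_bearing) ** 2) / (2 * b_sigma ** 2))
            # Semantic agreement (claims 1 and 3): the probability the
            # classifier assigned to this landmark's type.
            g *= obs["class_probs"].get(lm["type"], 1e-6)
            best = max(best, g)
        weight *= best
    return weight

def localise(observations, n=5000, bounds=(0.0, 10.0)):
    """Single particle-filter update (claim 6): sample candidate poses,
    weight them with the observation model, return the weighted mean."""
    particles = [
        (random.uniform(*bounds), random.uniform(*bounds),
         random.uniform(-math.pi, math.pi))
        for _ in range(n)
    ]
    weights = [likelihood(p, observations) for p in particles]
    total = sum(weights) or 1.0
    x = sum(w * p[0] for w, p in zip(weights, particles)) / total
    y = sum(w * p[1] for w, p in zip(weights, particles)) / total
    return x, y
```

With detections consistent with a subject near (7, 4), the weighted mean should cluster around that point; a full system would additionally propagate particles with an odometry-driven motion model and resample over successive frames.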
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1718628.9A GB2568286B (en) | 2017-11-10 | 2017-11-10 | Method of computer vision based localisation and navigation and system for performing the same |
| EP18811893.9A EP3707466B1 (de) | 2017-11-10 | 2018-11-07 | Method of computer vision based localisation and navigation and system for performing the same |
| PCT/GB2018/053228 WO2019092418A1 (en) | 2017-11-10 | 2018-11-07 | Method of computer vision based localisation and navigation and system for performing the same |
Related Parent Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18811893.9A Division EP3707466B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | 2017-11-10 | 2018-11-07 |
| EP18811893.9A Division-Into EP3707466B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | 2017-11-10 | 2018-11-07 |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| EP4318397A2 (de) | 2024-02-07 |
| EP4318397A3 (de) | 2024-10-09 |
| EP4318397B1 (de) | 2026-04-22 |
Family
ID=60788323
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18811893.9A Active EP3707466B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | 2017-11-10 | 2018-11-07 |
| EP23217324.5A Active EP4318397B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | 2017-11-10 | 2018-11-07 |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18811893.9A Active EP3707466B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | 2017-11-10 | 2018-11-07 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11393216B2 (de) |
| EP (2) | EP3707466B1 (de) |
| GB (1) | GB2568286B (de) |
| PL (1) | PL3707466T3 (de) |
| WO (1) | WO2019092418A1 (de) |
Families Citing this family (51)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6783742B2 (ja) * | 2017-11-24 | 2020-11-11 | Kddi株式会社 | Parameter identification device and parameter identification method |
| EP3502976A1 (de) * | 2017-12-19 | 2019-06-26 | Veoneer Sweden AB | Zustandsschätzer |
| US11551032B1 (en) * | 2018-03-14 | 2023-01-10 | United States Of America As Represented By The Secretary Of The Navy | Machine learning based automated object recognition for unmanned autonomous vehicles |
| WO2019183393A1 (en) * | 2018-03-21 | 2019-09-26 | Lei Zhou | Controlling movement of an automated guided vehicle |
| WO2020030966A1 (en) * | 2018-07-06 | 2020-02-13 | Verity Studios Ag | Methods and systems for estimating the orientation of an object |
| WO2020075412A1 (ja) * | 2018-10-10 | 2020-04-16 | ソニー株式会社 | Information processing device, mobile device, method, and program |
| US11734472B2 (en) * | 2018-12-07 | 2023-08-22 | Zoox, Inc. | System and method for modeling physical objects in a simulation |
| US11003945B2 (en) | 2019-05-22 | 2021-05-11 | Zoox, Inc. | Localization using semantically segmented images |
| US11295161B2 (en) | 2019-05-22 | 2022-04-05 | Zoox, Inc. | Localization using semantically segmented images |
| CN112116654B (zh) * | 2019-06-20 | 2024-06-07 | 杭州海康威视数字技术股份有限公司 | Vehicle pose determination method and apparatus, and electronic device |
| US10895637B1 (en) * | 2019-07-17 | 2021-01-19 | BGA Technology LLC | Systems and methods for mapping manmade objects buried in subterranean surfaces using an unmanned aerial vehicle integrated with radar sensor equipment |
| US11302033B2 (en) | 2019-07-22 | 2022-04-12 | Adobe Inc. | Classifying colors of objects in digital images |
| US11468550B2 (en) | 2019-07-22 | 2022-10-11 | Adobe Inc. | Utilizing object attribute detection models to automatically select instances of detected objects in images |
| US11631234B2 (en) | 2019-07-22 | 2023-04-18 | Adobe, Inc. | Automatically detecting user-requested objects in images |
| US11107219B2 (en) | 2019-07-22 | 2021-08-31 | Adobe Inc. | Utilizing object attribute detection models to automatically select instances of detected objects in images |
| US20210101616A1 (en) * | 2019-10-08 | 2021-04-08 | Mobileye Vision Technologies Ltd. | Systems and methods for vehicle navigation |
| US20220383541A1 (en) * | 2019-11-13 | 2022-12-01 | Battelle Energy Alliance, Llc | Unmanned vehicle navigation, and associated methods, systems, and computer-readable medium |
| WO2021141666A2 (en) * | 2019-11-13 | 2021-07-15 | Battelle Energy Alliance, Llc | Unmanned vehicle navigation, and associated methods, systems, and computer-readable medium |
| US11043003B2 (en) * | 2019-11-18 | 2021-06-22 | Waymo Llc | Interacted object detection neural network |
| CN110954933B (zh) * | 2019-12-09 | 2023-05-23 | 王相龙 | Scene-DNA-based mobile platform positioning device and method |
| CN114761997B (zh) | 2019-12-12 | 2025-12-19 | Oppo广东移动通信有限公司 | Object detection method, terminal device and medium |
| US11468110B2 (en) | 2020-02-25 | 2022-10-11 | Adobe Inc. | Utilizing natural language processing and multiple object detection models to automatically select objects in images |
| US11852751B2 (en) * | 2020-03-02 | 2023-12-26 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method, apparatus, computing device and computer-readable storage medium for positioning |
| US11055566B1 (en) * | 2020-03-12 | 2021-07-06 | Adobe Inc. | Utilizing a large-scale object detector to automatically select objects in digital images |
| US11808578B2 (en) * | 2020-05-29 | 2023-11-07 | Aurora Flight Sciences Corporation | Global positioning denied navigation |
| US12327175B2 (en) * | 2020-08-06 | 2025-06-10 | Micron Technology, Inc. | Collaborative sensor data processing by deep learning accelerators with integrated random access memory |
| US11550068B2 (en) * | 2020-08-29 | 2023-01-10 | Google Llc | Modeling mutable environmental structures |
| JP2022042630A (ja) * | 2020-09-03 | 2022-03-15 | 本田技研工業株式会社 | Self-position estimation method |
| US11615268B2 (en) * | 2020-09-09 | 2023-03-28 | Toyota Research Institute, Inc. | System and method for optimizing performance of a model performing a downstream task |
| US12045992B2 (en) * | 2020-11-10 | 2024-07-23 | Nec Corporation | Multi-domain semantic segmentation with label shifts |
| EP4016111A1 (de) * | 2020-12-16 | 2022-06-22 | Trimble Inc. | Verfahren zur geospatialen positionierung und tragbaren positionierungsvorrichtungen |
| DE102020134119B3 (de) * | 2020-12-18 | 2021-12-23 | Audi Aktiengesellschaft | Method for localising a motor vehicle in an assistance map, motor vehicle, computer program and electronically readable data carrier |
| CN112762928B (zh) * | 2020-12-23 | 2022-07-15 | 重庆邮电大学 | Mobile robot combining odometry and DM landmarks with laser SLAM, and navigation method |
| US11587234B2 (en) | 2021-01-15 | 2023-02-21 | Adobe Inc. | Generating class-agnostic object masks in digital images |
| US11972569B2 (en) | 2021-01-26 | 2024-04-30 | Adobe Inc. | Segmenting objects in digital images utilizing a multi-object segmentation model framework |
| DE102021107904A1 (de) * | 2021-03-29 | 2022-09-29 | Conti Temic Microelectronic Gmbh | Method and system for determining the ground plane using an artificial neural network |
| RU2769918C1 (ru) | 2021-05-18 | 2022-04-08 | Общество с ограниченной ответственностью "ЭвоКарго" | Способ позиционирования наземного транспортного средства |
| US11609093B2 (en) | 2021-06-07 | 2023-03-21 | Honeywell International Inc. | Position probability density function filter to determine real-time measurement errors for map based, vision navigation systems |
| US12265163B2 (en) * | 2021-06-24 | 2025-04-01 | Science Applications International Corporation | Navigation system |
| CN113537350B (zh) * | 2021-07-16 | 2023-12-22 | 商汤集团有限公司 | Image processing method and apparatus, electronic device and storage medium |
| CN114167468B (zh) * | 2021-12-14 | 2023-06-27 | 四川大学 | Image- and GNSS-based target spatial positioning method |
| IT202100033116A1 (it) * | 2021-12-30 | 2023-06-30 | Prinoth Spa | Tracked vehicle and method for locating a tracked vehicle |
| SE547500C2 (en) * | 2022-06-03 | 2025-10-07 | Tobii Ab | Method of estimating a three-dimensional position of an object |
| US12434740B1 (en) * | 2022-12-22 | 2025-10-07 | Zoox, Inc. | Determining object orientation based on parameter modes |
| CN116543603B (zh) * | 2023-07-07 | 2023-09-29 | 四川大学 | Track completion prediction method and apparatus considering airspace situation and local optimisation |
| CN116817903B (zh) * | 2023-08-24 | 2023-11-21 | 湖南大学 | Prior-vision-guided global localisation method and system for an intelligent robot |
| TR2023016496A1 (tr) * | 2023-12-05 | 2025-06-23 | Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localisation with a portable device camera in environments without global navigation satellite system (GNSS) data |
| KR20250094787A (ko) * | 2023-12-18 | 2025-06-26 | 주식회사 비트센싱 | System and method for processing radar data |
| CN117473398B (zh) * | 2023-12-26 | 2024-03-19 | 四川国蓝中天环境科技集团有限公司 | Urban dust pollution source classification method based on muck-truck activity |
| EP4607309A1 (de) * | 2024-02-23 | 2025-08-27 | Deere & Company | Method for determining the position of a work vehicle in a silo facility |
| CN120198873B (zh) * | 2025-03-14 | 2025-09-26 | 徐州北峪科学技术研究有限公司 | Environment-perception-based adaptive control method and system for an unmanned demolition robot |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5459636A (en) * | 1994-01-14 | 1995-10-17 | Hughes Aircraft Company | Position and orientation estimation neural network system and method |
| US7191056B2 (en) * | 2005-01-04 | 2007-03-13 | The Boeing Company | Precision landmark-aided navigation |
| US8174568B2 (en) | 2006-12-01 | 2012-05-08 | Sri International | Unified framework for precise vision-aided navigation |
| US9476970B1 (en) * | 2012-03-19 | 2016-10-25 | Google Inc. | Camera based localization |
| US8849308B2 (en) * | 2012-11-21 | 2014-09-30 | Apple Inc. | Tiling of map data |
| EP3070430B1 (de) * | 2013-11-13 | 2019-08-14 | Nissan Motor Co., Ltd. | Vorrichtung zur positionsschätzung eines beweglichen körpers und verfahren zur positionsschätzung eines beweglichen körpers |
| US9589355B2 (en) * | 2015-03-16 | 2017-03-07 | Here Global B.V. | Guided geometry extraction for localization of a device |
| US9524435B2 (en) * | 2015-03-20 | 2016-12-20 | Google Inc. | Detecting the location of a mobile device based on semantic indicators |
| JP2019508677A (ja) | 2016-01-08 | 2019-03-28 | インテリジェント テクノロジーズ インターナショナル、インコーポレイテッド | Control of vehicle components using maps |
| US10373019B2 (en) * | 2016-01-13 | 2019-08-06 | Ford Global Technologies, Llc | Low- and high-fidelity classifiers applied to road-scene images |
| US10489691B2 (en) * | 2016-01-15 | 2019-11-26 | Ford Global Technologies, Llc | Fixation generation for machine learning |
| WO2018031678A1 (en) | 2016-08-09 | 2018-02-15 | Nauto Global Limited | System and method for precision localization and mapping |
| EP3290864A1 (de) * | 2016-08-30 | 2018-03-07 | Continental Automotive GmbH | Fahrerassistenzsystem zur bestimmung der position eines fahrzeugs |
| WO2018104563A2 (en) | 2016-12-09 | 2018-06-14 | Tomtom Global Content B.V. | Method and system for video-based positioning and mapping |
| US9953236B1 (en) * | 2017-03-10 | 2018-04-24 | TuSimple | System and method for semantic segmentation using dense upsampling convolution (DUC) |
-
2017
- 2017-11-10 GB GB1718628.9A patent/GB2568286B/en active Active
-
2018
- 2018-11-07 EP EP18811893.9A patent/EP3707466B1/de active Active
- 2018-11-07 WO PCT/GB2018/053228 patent/WO2019092418A1/en not_active Ceased
- 2018-11-07 US US16/761,765 patent/US11393216B2/en active Active
- 2018-11-07 EP EP23217324.5A patent/EP4318397B1/de active Active
- 2018-11-07 PL PL18811893.9T patent/PL3707466T3/pl unknown
Also Published As
| Publication number | Publication date |
|---|---|
| EP4318397A2 (de) | 2024-02-07 |
| PL3707466T4 (pl) | 2025-03-17 |
| GB2568286A (en) | 2019-05-15 |
| EP4318397A3 (de) | 2024-10-09 |
| WO2019092418A1 (en) | 2019-05-16 |
| EP3707466A1 (de) | 2020-09-16 |
| US11393216B2 (en) | 2022-07-19 |
| GB201718628D0 (en) | 2017-12-27 |
| EP3707466B1 (de) | 2024-08-21 |
| PL3707466T3 (pl) | 2025-03-17 |
| EP3707466C0 (de) | 2024-08-21 |
| GB2568286B (en) | 2020-06-10 |
| US20200349362A1 (en) | 2020-11-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP4318397B1 (de) | Method of computer vision based localisation and navigation and system for performing the same | |
| US11175145B2 (en) | System and method for precision localization and mapping | |
| US10365363B2 (en) | Mobile localization using sparse time-of-flight ranges and dead reckoning | |
| US20170023659A1 (en) | Adaptive positioning system | |
| JP2022106924A (ja) | Device and method for autonomous self-localisation | |
| CN113485441A (zh) | Distribution network inspection method combining UAV high-precision positioning and visual tracking | |
| CA3177161A1 (en) | Object detection and tracking for automated operation of vehicles and machinery | |
| US11967091B2 (en) | Detection of environmental changes to delivery zone | |
| Schleiss | Translating aerial images into street-map-like representations for visual self-localization of UAVs | |
| Dumble et al. | Airborne vision-aided navigation using road intersection features | |
| Kealy et al. | Collaborative navigation as a solution for PNT applications in GNSS challenged environments–report on field trials of a joint FIG/IAG working group | |
| WO2016196717A2 (en) | Mobile localization using sparse time-of-flight ranges and dead reckoning | |
| Wang et al. | UGV‐UAV robust cooperative positioning algorithm with object detection | |
| US20250029332A1 (en) | Building modeling method using aerial lidar and computer program recorded on recording medium to execute the same | |
| Berrio et al. | Long-term map maintenance pipeline for autonomous vehicles | |
| Christie et al. | Semantics for UGV Registration in GPS-denied Environments | |
| Soleimani et al. | A disaster invariant feature for localization | |
| Beer et al. | Towards Feature-Based Low-Latency Localization with Rotating LiDARs | |
| US12203772B1 (en) | Map updating method and computer program recorded on recording medium to execute the same | |
| CN121527760A (zh) | UAV visual positioning method based on semantically weighted adaptive particle filtering | |
| Ricciardelli | Visual Place Recognition as a solution for the Kidnapped Robot Problem | |
| WO2025207878A1 (en) | Method and system for map building using perception and motion sensors | |
| CN118225078A (zh) | Vehicle positioning method and apparatus, vehicle, and storage medium | |
| CN119197559A (zh) | Map construction method, map positioning method, and apparatus | |
| Miller | Robotic localization and perception in static terrain and dynamic urban environments |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the application has been published |
| | AC | Divisional application: reference to earlier application | Ref document number: 3707466; Country of ref document: EP; Kind code of ref document: P |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R079; Ref document number: 602018090900; Previous main class: G06T0007700000; Ipc: G01C0021000000 |
| | PUAL | Search report despatched | Original code: 0009013 |
| | AK | Designated contracting states | Kind code of ref document: A3; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | RIC1 | Information provided on IPC code assigned before grant | Ipc: G06T 7/70 20170101ALI20240905BHEP; Ipc: G01S 5/16 20060101ALI20240905BHEP; Ipc: G01C 21/32 20060101ALI20240905BHEP; Ipc: G01C 21/20 20060101ALI20240905BHEP; Ipc: G01C 21/00 20060101AFI20240905BHEP |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: request for examination was made |
| | 17P | Request for examination filed | Effective date: 20250404 |
| | RIC1 | Information provided on IPC code assigned before grant | Ipc: G01C 21/00 20060101AFI20250820BHEP; Ipc: G01C 21/20 20060101ALI20250820BHEP; Ipc: G01C 21/32 20060101ALI20250820BHEP; Ipc: G01S 5/16 20060101ALI20250820BHEP; Ipc: G06T 7/70 20170101ALI20250820BHEP |
| | GRAP | Despatch of communication of intention to grant a patent | Original code: EPIDOSNIGR1 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: grant of patent is intended |
| | INTG | Intention to grant announced | Effective date: 20251107 |
| | GRAS | Grant fee paid | Original code: EPIDOSNIGR3 |
| | GRAA | (expected) grant | Original code: 0009210 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the patent has been granted |
| | AC | Divisional application: reference to earlier application | Ref document number: 3707466; Country of ref document: EP; Kind code of ref document: P |
| | AK | Designated contracting states | Kind code of ref document: B1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: IDV DEFENCE VEHICLES UK LTD |
| | REG | Reference to a national code | Ref country code: CH; Ref legal event code: F10; ST27 status event code: U-0-0-F10-F00 (as provided by the national office); Effective date: 20260422 |