CL2023000206A1 - Drivable Surface Identification Techniques - Google Patents

Drivable Surface Identification Techniques

Info

Publication number
CL2023000206A1
Authority
CL
Chile
Prior art keywords
drivable surface
driveable
identification
sensor data
additional images
Prior art date
Application number
CL2023000206A
Other languages
Spanish (es)
Inventor
Theverapperuma Lalin
Halder Bibhrajit
Original Assignee
Safeai Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Safeai Inc
Publication of CL2023000206A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0025 Planning or execution of driving tasks specially adapted for specific operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461 Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3815 Road data
    • G01C21/3822 Road feature data, e.g. slope data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3826 Terrain data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/2148 Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the process organisation or structure, e.g. boosting cascade
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/01 Probabilistic graphical models, e.g. probabilistic networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454 Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/582 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/70 Labelling scene content, e.g. deriving syntactic or semantic representations
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/15 Road slope, i.e. the inclination of a road segment in the longitudinal direction
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/30 Road curve radius
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/35 Road bumpiness, e.g. potholes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Medical Informatics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

This disclosure generally relates to identifying drivable surfaces in connection with the autonomous performance of various tasks at industrial work sites and, more particularly, to techniques for distinguishing drivable surfaces from non-drivable surfaces based on sensor data. A drivable surface identification framework is provided for an autonomous machine so that it can autonomously detect the presence of a drivable surface and estimate, based on sensor data, attributes of the drivable surface such as road condition, road curvature, degree of incline or decline, and the like. In some embodiments, at least one camera image is processed to extract a set of features, from which surfaces and objects in a physical environment are identified and additional images are generated for further processing. The additional images are combined with a 3D representation derived from LIDAR or radar data to generate an output representation indicating a drivable surface.
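
As a rough illustration of the camera/LIDAR fusion step described in the abstract, the sketch below projects a LIDAR point cloud into the camera frame and labels each point as drivable wherever it lands on a pixel that the image branch has classified as drivable surface. This is an assumed, simplified reconstruction for illustration only, not the patented implementation: the segmentation mask, camera intrinsics, and point cloud are synthetic placeholders, and the feature extraction and additional-image generation described above are not reproduced here.

```python
# Illustrative sketch only (assumed, simplified; not the patented implementation):
# fuse a per-pixel "drivable" segmentation mask from the camera branch with a
# LIDAR point cloud to produce a 3D output labelled drivable / non-drivable.
import numpy as np


def project_to_image(points_cam, K):
    """Project 3D points (N, 3), given in the camera frame, onto the image plane.

    Returns integer pixel coordinates for the points in front of the camera and
    a boolean mask selecting those points.
    """
    in_front = points_cam[:, 2] > 0.1                  # keep points ahead of the camera
    pts = points_cam[in_front]
    uvw = (K @ pts.T).T                                # pinhole projection
    uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    return uv, in_front


def label_drivable(points_cam, drivable_mask, K):
    """Mark each LIDAR point as drivable if it projects onto a 'drivable' pixel."""
    h, w = drivable_mask.shape
    labels = np.zeros(len(points_cam), dtype=bool)
    uv, in_front = project_to_image(points_cam, K)
    u, v = uv[:, 0], uv[:, 1]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)   # stay within image bounds
    idx = np.flatnonzero(in_front)[inside]
    labels[idx] = drivable_mask[v[inside], u[inside]]
    return labels


if __name__ == "__main__":
    # Placeholder intrinsics and a synthetic 100x200 mask in which the lower half
    # of the image (rows 50 and below) was classified as drivable surface.
    K = np.array([[100.0, 0.0, 100.0],
                  [0.0, 100.0, 50.0],
                  [0.0, 0.0, 1.0]])
    mask = np.zeros((100, 200), dtype=bool)
    mask[50:, :] = True

    # Synthetic LIDAR points in the camera frame (x right, y down, z forward).
    points = np.array([[0.0, 1.0, 5.0],     # below the horizon -> drivable
                       [0.0, -1.0, 5.0],    # above the horizon -> not drivable
                       [0.0, 1.0, -2.0]])   # behind the camera -> ignored
    print(label_drivable(points, mask, K))  # expected: [ True False False]
```

The labelled point cloud plays the role of the "output representation indicating a drivable surface" mentioned in the abstract; a planner could, for example, restrict candidate trajectories to regions covered by points labelled drivable.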

CL2023000206A (priority date 2020-07-24, filing date 2023-01-20) Drivable Surface Identification Techniques, published as CL2023000206A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/938,312 US11691648B2 (en) 2020-07-24 2020-07-24 Drivable surface identification techniques

Publications (1)

Publication Number Publication Date
CL2023000206A1 (en) 2023-07-07

Family

ID=79687381

Family Applications (1)

Application Number Title Priority Date Filing Date
CL2023000206A CL2023000206A1 (en) 2023-01-20 Drivable Surface Identification Techniques

Country Status (8)

Country Link
US (1) US11691648B2 (en)
EP (1) EP4186230A4 (en)
JP (1) JP2023536407A (en)
AU (1) AU2021313775A1 (en)
BR (1) BR112023001159A2 (en)
CA (1) CA3189467A1 (en)
CL (1) CL2023000206A1 (en)
WO (1) WO2022020028A1 (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210089896A1 (en) * 2019-08-19 2021-03-25 Savitude, Inc. Automated Image Processing System for Garment Targeting and Generation
US11810364B2 (en) * 2020-08-10 2023-11-07 Volvo Car Corporation Automated road damage detection
US11964691B2 (en) * 2020-08-17 2024-04-23 Magna Electronics Inc. Vehicular control system with autonomous braking
CN111829545B (en) * 2020-09-16 2021-01-08 深圳裹动智驾科技有限公司 Automatic driving vehicle and dynamic planning method and system for motion trail of automatic driving vehicle
US11645779B1 (en) * 2020-09-21 2023-05-09 Ambarella International Lp Using vehicle cameras for automatically determining approach angles onto driveways
US12030501B2 (en) * 2020-10-02 2024-07-09 Magna Electronics Inc. Vehicular control system with enhanced vehicle passing maneuvering
US11521394B2 (en) * 2020-10-09 2022-12-06 Motional Ad Llc Ground plane estimation using LiDAR semantic network
US11623661B2 (en) * 2020-10-12 2023-04-11 Zoox, Inc. Estimating ground height based on lidar data
US20220122363A1 (en) * 2020-10-21 2022-04-21 Motional Ad Llc IDENTIFYING OBJECTS USING LiDAR
TWI768548B (en) * 2020-11-19 2022-06-21 財團法人資訊工業策進會 System and method for generating basic information for positioning and self-positioning determination device
AU2021232767A1 (en) * 2020-12-21 2022-07-07 Commonwealth Scientific And Industrial Research Organisation Vehicle navigation
CN112634343A (en) * 2020-12-23 2021-04-09 北京百度网讯科技有限公司 Training method of image depth estimation model and processing method of image depth information
US11669998B2 (en) * 2021-01-20 2023-06-06 GM Global Technology Operations LLC Method and system for learning a neural network to determine a pose of a vehicle in an environment
US11860641B2 (en) * 2021-01-28 2024-01-02 Caterpillar Inc. Visual overlays for providing perception of depth
JP2022132882A (en) * 2021-03-01 2022-09-13 キヤノン株式会社 Navigation system and navigation method and program
US20220309336A1 (en) * 2021-03-26 2022-09-29 Nvidia Corporation Accessing tensors
US20220377973A1 (en) * 2021-05-25 2022-12-01 Scythe Robotics, Inc. Method and apparatus for modeling an environment proximate an autonomous system
US12006655B2 (en) * 2021-08-02 2024-06-11 Deere & Company Ground engaging tool contact detection system and method
US11821744B2 (en) * 2021-10-08 2023-11-21 Ford Global Technologies, Llc Recommending an alternative off-road track to a driver of a vehicle
US20230127185A1 (en) * 2021-10-22 2023-04-27 Zoox, Inc. Drivable surface map for autonomous vehicle navigation
US12025465B2 (en) 2021-10-22 2024-07-02 Zoox, Inc. Drivable surface map for autonomous vehicle navigation
US20230142305A1 (en) * 2021-11-05 2023-05-11 GM Global Technology Operations LLC Road condition detection systems and methods
US11999352B2 (en) * 2021-12-15 2024-06-04 Industrial Technology Research Institute Method and system for extracting road data and method and system for controlling self-driving car
US11952746B1 (en) 2022-03-10 2024-04-09 AIM Intelligent Machines, Inc. Autonomous control of on-site movement of powered earth-moving construction or mining vehicles
KR20230139560A (en) * 2022-03-28 2023-10-05 현대자동차주식회사 Vehicle and method of controlling the same
DE102022108769A1 (en) * 2022-04-11 2023-10-12 Still Gesellschaft Mit Beschränkter Haftung Environment monitoring system and industrial truck with an environment monitoring system
US11746499B1 (en) 2022-05-10 2023-09-05 AIM Intelligent Machines, Inc. Hardware component configuration for autonomous control of powered earth-moving vehicles
US11746501B1 (en) 2022-08-29 2023-09-05 RIM Intelligent Machines, Inc. Autonomous control of operations of powered earth-moving vehicles using data from on-vehicle perception systems
WO2024054585A1 (en) * 2022-09-09 2024-03-14 Tesla, Inc. Artificial intelligence modeling techniques for vision-based occupancy determination
WO2024073117A1 (en) * 2022-09-30 2024-04-04 Tesla, Inc. Ai inference compiler and runtime tool chain
WO2024097955A1 (en) * 2022-11-04 2024-05-10 Instock, Inc. Modular automated storage and retrieval systems and methods
EP4386684A1 (en) * 2022-12-13 2024-06-19 My Virtual Reality Software AS Method for 3d visualization of sensor data
US11898324B1 (en) 2022-12-19 2024-02-13 AIM Intelligent Machines, Inc. Adaptive control system for autonomous control of powered earth-moving vehicles
US20240208530A1 (en) * 2022-12-21 2024-06-27 Mercedes-Benz Group AG Evaluating integrity of vehicle pose estimates via semantic labels

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8306672B2 (en) * 2009-09-09 2012-11-06 GM Global Technology Operations LLC Vehicular terrain detection system and method
JP6514681B2 (en) 2013-03-15 2019-05-15 ウーバー テクノロジーズ,インコーポレイテッド Method, system and apparatus for multi-perceptive stereo vision for robots
US20150294143A1 (en) 2014-04-10 2015-10-15 GM Global Technology Operations LLC Vision based monitoring system for activity sequency validation
EP3734504A1 (en) 2015-02-10 2020-11-04 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation
US20190176321A1 (en) 2015-09-30 2019-06-13 AI Incorporated Robotic floor-cleaning system manager
US20170147990A1 (en) 2015-11-23 2017-05-25 CSI Holdings I LLC Vehicle transactions using objective vehicle data
WO2018104563A2 (en) * 2016-12-09 2018-06-14 Tomtom Global Content B.V. Method and system for video-based positioning and mapping
US20180190014A1 (en) 2017-01-03 2018-07-05 Honeywell International Inc. Collaborative multi sensor system for site exploitation
US20200294401A1 (en) 2017-09-04 2020-09-17 Nng Software Developing And Commercial Llc. A Method and Apparatus for Collecting and Using Sensor Data from a Vehicle
US20190079526A1 (en) 2017-09-08 2019-03-14 Uber Technologies, Inc. Orientation Determination in Object Detection and Tracking for Autonomous Vehicles
US10970553B2 (en) 2017-11-15 2021-04-06 Uatc, Llc Semantic segmentation of three-dimensional data
US10997433B2 (en) 2018-02-27 2021-05-04 Nvidia Corporation Real-time detection of lanes and boundaries by autonomous vehicles
US11561541B2 (en) 2018-04-09 2023-01-24 SafeAI, Inc. Dynamically controlling sensor behavior
US11308338B2 (en) 2018-12-28 2022-04-19 Nvidia Corporation Distance to obstacle detection in autonomous machine applications
US11927457B2 (en) * 2019-07-10 2024-03-12 Deka Products Limited Partnership System and method for real time control of an autonomous device

Also Published As

Publication number Publication date
AU2021313775A1 (en) 2023-02-23
EP4186230A1 (en) 2023-05-31
WO2022020028A1 (en) 2022-01-27
EP4186230A4 (en) 2024-06-26
CA3189467A1 (en) 2022-01-27
US20220024485A1 (en) 2022-01-27
BR112023001159A2 (en) 2023-04-04
US11691648B2 (en) 2023-07-04
JP2023536407A (en) 2023-08-25

Similar Documents

Publication Publication Date Title
CL2023000206A1 (en) Drivable Surface Identification Techniques
CN109685141B (en) Robot article sorting visual detection method based on deep neural network
MX2018013090A (en) Pallet detection using units of physical length.
US11170220B2 (en) Delegation of object and pose detection
Stein et al. Convexity based object partitioning for robot applications
EP2797032A3 (en) Method and system using two parallel optical character recognition processes
Fan et al. Three-filters-to-normal: An accurate and ultrafast surface normal estimator
WO2017220599A3 (en) Method for generating 3d data relating to an object
CN103824318A (en) Multi-camera-array depth perception method
EP2372641A3 (en) Surface detection in images based on spatial data
CN106372051B8 (en) A kind of method for visualizing and system of patent map
CN107545247B (en) Stereo cognition method based on binocular recognition
CN104460505A (en) Industrial robot relative pose estimation method
Xin et al. A RGBD SLAM algorithm combining ORB with PROSAC for indoor mobile robot
Choe et al. Fast point cloud segmentation for an intelligent vehicle using sweeping 2D laser scanners
CN103337073A (en) Three-dimensional entropy based two-dimensional image threshold segmentation method
Wu et al. Fishery monitoring system with AUV based on YOLO and SGBM
EP2899963A3 (en) Image scanner and image scanning method
Guo et al. Measurement of three-dimensional deformation and load using vision-based tactile sensor
US20230089616A1 (en) Monocular camera activation for localization based on data from depth sensor
US20200200906A1 (en) Tracking objects in lidar point clouds with enhanced template matching
Bhowmick et al. A novel floor segmentation algorithm for mobile robot navigation
Brown et al. Dataset and performance comparison of deep learning architectures for plum detection and robotic harvesting
CN109348122A (en) For imaging image processing method, system, equipment and the storage medium of picture machine
WO2015112194A3 (en) Image processor comprising gesture recognition system with static hand pose recognition based on dynamic warping